Hi David...
Are you loading all 100K records from SQL Server? If so, that's your problem. If you are using remote views, set the cursor's NoDataOnLoad property to .T. before opening them. Or, if you are opening the views with the USE command, be sure to include the NODATA clause.
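For example, either of these keeps the view from dragging all the rows down at startup (the view name here is just a placeholder):

```foxpro
* Open the remote view without fetching any rows yet.
* "v_customers" is a hypothetical remote view name.
USE v_customers NODATA IN 0

* Or, for a view cursor in a form's DataEnvironment, set the
* cursor's NoDataOnLoad property to .T. before the form loads.
```

Either way you get the cursor structure immediately and can REQUERY() later with just the rows you need.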
Simply put, you cannot and should not pump 100K records from the server to the client. Instead, provide some sort of query capability and send only the records that are actually needed. I would also cap the number of records returned; 100 should usually be sufficient.
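One way to do this is a parameterized remote view, optionally capped with the view's MaxRecords property. This is only a sketch; the view, connection, and column names are made up:

```foxpro
* Hypothetical parameterized remote view: only rows matching
* the ?cSearch parameter come back from the server.
CREATE SQL VIEW v_cust_search REMOTE CONNECTION MyConn AS ;
    SELECT * FROM customers WHERE cust_name LIKE ?cSearch

* Cap what the view will ever fetch.
DBSETPROP("v_cust_search", "View", "MaxRecords", 100)

cSearch = "SMITH%"
USE v_cust_search IN 0
```

The user supplies the search value, the WHERE clause runs on the server, and at most 100 rows cross the wire.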
In almost any instance, there is no way you need 100K records on the client. If it is a reporting issue, my suggestion is to create server-side components that fetch the data and run the reports. That way, nothing goes to the client; the client just kicks off the process.
Also, if you are using remote views, my suggestion is to use the views only to present the data. Use SQL pass-through, sharing the remote view's connection, to actually perform the updates. This gives you maximum control over the process.
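The key is grabbing the view's connection handle so the pass-through call rides the same connection. A rough sketch, again with invented names:

```foxpro
* Reuse the open remote view's connection for a pass-through update.
* Assumes v_customers is already open in the current data session.
lnHandle = CURSORGETPROP("ConnectHandle", "v_customers")
lnResult = SQLEXEC(lnHandle, ;
    "UPDATE customers SET credit_limit = ?lnLimit WHERE cust_id = ?lnId")
IF lnResult < 0
    AERROR(laErr)
    * Inspect laErr and handle the failure here.
ENDIF
```

Because you share the handle, you avoid a second login and you stay inside the same transaction context as the view.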
Finally, you could have a Cray supercomputer for a client and you would still need to pump all of those records over the wire. Remember, your app is only as fast as its slowest link.
>I have an application that starts real slow when it loads the data. A couple of tables that are over 100,000 records on a dual processor 400 with 128Mb of ram. Any ideas?
>DLC