Level Extreme Platform
Large data sets
Message
General information
Forum:
Visual FoxPro
Category:
Client/server
Title:
Miscellaneous
Thread ID:
00573202
Message ID:
00573716
Views:
25
Let's examine this logically. Assume you split your VFP tables to accommodate the 2 GB size limit. You then want to join those tables back together to display to the user, say, through a cursor. Unfortunately, a cursor is still a table that resides in the Temp directory and is subject to the same size limit, so that solution wouldn't work.

I'm thinking the solution for you may be to break the transfer into more manageable chunks. I really can't see any user looking through 2 GB of data at one time. I can, however, see that user "paging" through the data much like the search engines you find on the internet. You don't have to limit the retrieval to 25 records per page, either; you could retrieve thousands at a time, whatever makes the most sense.
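As a rough sketch of that paging idea, you could fetch one "page" at a time from the backend via SQL pass-through, keyed on the last row retrieved. The connection name ("MyDSN"), table (BigTable), and key column (KeyId) below are placeholders, not anything from your actual schema:

```
* Hedged sketch: key-based paging via SQL pass-through.
* "MyDSN", BigTable, and KeyId are placeholder names.
lnHandle = SQLCONNECT("MyDSN")
lnLastKey = 0          && highest key fetched so far
lnPageSize = 1000      && rows per "page"; tune as needed

* Fetch the next page: only rows beyond the last key seen.
lcSql = "SELECT TOP " + TRANSFORM(lnPageSize) + " * " + ;
        "FROM BigTable WHERE KeyId > ?lnLastKey ORDER BY KeyId"
IF SQLEXEC(lnHandle, lcSql, "curPage") > 0
    * curPage is a small local cursor, well under the 2 GB limit.
    GO BOTTOM IN curPage
    lnLastKey = curPage.KeyId   && remember where to resume
ENDIF
SQLDISCONNECT(lnHandle)
```

Each call pulls only lnPageSize rows, so the local cursor never approaches the 2 GB limit no matter how big the backend table is.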

Other than that, I can't think of any way you could display that amount of information with any decent speed. If you used ADO, you'd have in excess of 2 GB of data in memory, and that's a lot of RAM. In truth, I don't know if you could manage memory well enough to display that much information at once even in C++; you'd probably end up doing the "paging" thing I mentioned earlier anyway.

Anyway, HTH.

Travis

>I know that VFP tables are limited to 2 gig. But I also mention the possibility of C++. If we use C++, OLE DB/ADO, is the resulting dataset as retrieved by ADO limited to 2 gig? Suppose VFP is only instantiating ADO, can ADO retrieve a dataset larger than 2 gig?
>
>Thanks,
>Steve
>
>
>>Hi Stephen,
>>
>>The size of VFP tables is limited to 2 GB regardless of where the data is coming from.
>>
>>>In time we will need to transfer potential data sets in excess of 2 gig for viewing by an end user, drawn from terabytes of data. We need to know the best connectivity to use for this: ODBC (currently used), ADO, OLE DB, etc. What's the fastest and best for this application? The current application is also written in VFP, and we need to know whether the 2 gig size limit will affect the return of data, or whether VFP will need to be converted to C++ in order to accommodate these larger data sets. Even if C++ is eventually used, we'd still need an answer on the connectivity method to a SQL Server or Oracle backend.
>>>
>>>Thanks,
>>>Steve
Travis Vandersypen