Distributing and using VERY large tables.
Message
From: Nancy Folsom, Pixel Dust Industries, Washington, United States
Date: 04/08/2001 18:36:52

General information
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax
Miscellaneous
Thread ID: 00539842
Message ID: 00539856
Views: 18
>>>I have to create a table of 35 million plus records. The table is to be populated from 850 fixed-length text files. I get about 1/3rd of the way through with the imports and I hit the 2 Gig limit for Visual FoxPro tables.
>>>
>>>I took the 2 Gig dbf file and exported it to a delimited text file to see how much smaller it might be, since there are a lot of empty fields in the table. To my surprise, the 2,079,274 KB table made a 2,066,947 KB delimited text file.
>>
>>Did you use SDF or CSV? I'm finding CSV to be smaller because it removes the spaces (though not for numbers). I don't think it's really going to solve the problem, though.
>
>Nancy,
>
>I had done a COPY TO ... DELIMITED. I was surprised that I didn't get a larger drop in file size. I opened the resulting txt file just to make sure it wasn't accidentally a fixed-length file, but it was a delimited file without the spaces. Even had that worked and reduced the file size to something manageable, I would still have had the problem of how to do the complicated searches against a text file that I would normally do in VFP using indexes and Rushmore.
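
One way around the 2 GB-per-file ceiling the quoted post runs into is to split the import across several DBFs, rolling over to a fresh table before the current one fills up. A minimal sketch follows; the two-column structure, the data*.txt file names, and the 1.8 GB cutoff are placeholders, not details from this thread:

* Sketch: import the 850 fixed-length files into successive DBFs,
* starting a new table well before the 2 GB .dbf limit is reached.
* Field names, file names, and the cutoff are hypothetical.
lnPart = 1
CREATE TABLE part1 (cField1 C(10), cField2 C(20))
FOR lnFile = 1 TO 850
    lcSrc = "data" + PADL(TRANSFORM(lnFile), 3, "0") + ".txt"
    APPEND FROM (lcSrc) SDF
    * HEADER() + RECCOUNT() * RECSIZE() approximates the .dbf size on disk
    IF HEADER() + RECCOUNT() * RECSIZE() > 1800000000
        USE                                  && close the full table
        lnPart = lnPart + 1
        CREATE TABLE ("part" + TRANSFORM(lnPart)) (cField1 C(10), cField2 C(20))
    ENDIF
ENDFOR
USE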

Apparently it's not entirely clear, then, exactly what your question is. Do you want to know how to compress 2 GB down to 650 MB and still have it accessible as a Fox backend?

When the application was designed, was the requirement that it fit on a CD known? Was the data size then estimated, and were backend tools evaluated against it?
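
For that kind of up-front evaluation, a back-of-the-envelope estimate of the finished .dbf size is just record width times row count. The 60-byte width below is invented for illustration; the row count is the 35 million from the original post:

* Sketch: rough size estimate before any data is imported.
lnRecWidth = 60             && hypothetical average record width in bytes
lnRows     = 35000000       && 35 million plus records, per the original post
lnEstMB    = (lnRecWidth * lnRows) / (1024 * 1024)
? lnEstMB, "MB estimated"   && roughly 2,000 MB, several times the 650 MB a CD holds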