Distributing and using VERY large tables.
Message
To: Gerry Schmitz, GHS Automation Inc., Calgary, Alberta, Canada
Date: 04/08/2001 17:16:58
General information
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax
Miscellaneous
Thread ID: 00539842
Message ID: 00539851
Views: 12
>>I have to create a table of 35 million-plus records. The table is to be populated from 850 fixed-length text files. I get about a third of the way through the imports and hit the 2 Gig limit for Visual FoxPro tables.
>>
>>I took the 2 Gig .dbf file and exported it to a delimited text file to see how much smaller it might be, since there are a lot of empty fields in the table. To my surprise, the 2,079,274 KB table made a 2,066,947 KB delimited text file.
>>
>>Other vendors have this data in some sort of file structure that takes up only about 450 Megs. They distribute this data (in .dat format, whatever that is) on a CD-ROM, and their apps access it directly off the CD on the fly, without any major writing to the hard drive. They can get at the data immediately on demand, with no obvious decompression. For me, just trying to zip the 2 Gig text file takes forever and slows the computer to a crawl.
>>
>>What type of data format could they be using that can store what would have been a 6 Gig delimited text file in only 450 Megs?
>>
>>I need to be able to do the same thing. I can’t distribute the data in zip format and expect the user to spend 40 minutes unzipping it, and also expect him to have 6 Gigs of available disk space, not counting space for the indexes that will need to be created.
>>
>>Any ideas or theories about what the file type might be, and whether there is any way I can do the same thing in Visual FoxPro (or VB or C# if I have to)?
>
>How much space do the text files take up? What are the information requirements? You don't give much info to go on. Even Word has facilities to manage/index multiple "documents" ...

Gerry,

The original fixed-length text files take up 487 Megs zipped (there are actually a total of 927 text files in the zip).

Each record has 30 fields, and the total length of each record is 182 bytes. There are some fields that can be discarded, and doubtless the other vendors have done away with some of them. At most I could drop about 50 bytes from each record, so that really doesn't help much: I'd still end up with multiple Gigs.
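Just to put numbers on that, here's a quick sanity check from the VFP Command window, using the figures above:

* 35 million records at 182 bytes each, vs. 132 bytes after trimming
? 35000000 * 182 / 1024^3            && roughly 5.9 Gigs raw
? 35000000 * (182 - 50) / 1024^3     && still roughly 4.3 Gigs trimmed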

I need to run searches on this data: indexes, SQL queries, etc. I'm not sure Word would help get it down to 450 Megs, or help with complex searches, would it?
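If I had to guess, the other vendors' .dat is just fixed-length binary records that their app reads by offset, which would explain the instant access straight off the CD with no decompression. A minimal sketch of that idea using VFP's low-level file functions (the file name and record length here are made up for illustration, not their actual format):

* Hypothetical: fetch record number nTarget straight from a fixed-length .dat
nRecLen = 132                          && assumed record length
nTarget = 1500000                      && record number wanted
nHandle = FOPEN("D:\DATA\BIG.DAT")     && read-only open; returns -1 on failure
IF nHandle >= 0
   =FSEEK(nHandle, (nTarget - 1) * nRecLen)   && jump to the record's offset
   cRecord = FREAD(nHandle, nRecLen)          && read exactly one record
   =FCLOSE(nHandle)
   ? cRecord
ENDIF

Offset access alone doesn't explain the 13-to-1 size difference, though, so they must be packing or encoding the fields somehow as well.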

Ed