Plateforme Level Extreme
Distributing and using VERY large tables.
Message
From
14/08/2001 11:37:04
 
General information
Forum:
Visual FoxPro
Category:
Databases, Tables, Views, Indexing and SQL syntax
Miscellaneous
Thread ID:
00539842
Message ID:
00543558
Views:
22
Edmond,

Just a little suggestion that could work, depending on what kind of queries you will be performing on the data.

Could you split the data into subgroups, maybe by state? Then zip up all the records for, e.g., Texas.
Any search you perform would only need to unzip a subset of the records, which would (hopefully) fit on the user's HDD.
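To make the idea concrete, here is a minimal sketch of that approach in Python (just to illustrate the technique, not tied to VFP). It assumes each fixed-length record carries a state code at a made-up position; one ZIP archive holds a separate compressed member per state, so a search only ever decompresses the one state it needs:

```python
import zipfile

# Hypothetical field layout: the state code occupies the first two
# characters of each fixed-length record. Adjust to the real layout.
STATE_START, STATE_END = 0, 2

def pack_by_state(records, archive_path):
    """Group records by state and write one compressed ZIP member
    per state (e.g. 'TX.txt') into a single archive."""
    groups = {}
    for rec in records:
        state = rec[STATE_START:STATE_END]
        groups.setdefault(state, []).append(rec)
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for state, recs in groups.items():
            zf.writestr(f"{state}.txt", "\n".join(recs))

def load_state(archive_path, state):
    """Decompress only the records for one state; the rest of the
    archive is never touched."""
    with zipfile.ZipFile(archive_path) as zf:
        return zf.read(f"{state}.txt").decode().splitlines()
```

Since the empty fields in fixed-length records compress very well, each per-state member should be far smaller than the raw subset, and the user only ever needs temp space for one state at a time.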

The .DAT followed by .D00, .D01 naming convention you mention sounds like it could be similar to the unPak Pro system (used for downloading large files over the internet).

Also, if anyone remembers Revelation/Advanced Revelation, I think it used a similar convention, but as far as I can remember the files weren't compressed, just variable-length records with hash-table indexes.

HTH
Will

>I have to create a table of 35 million plus records. The table is to be populated from 850 fixed-length text files. I get about a third of the way through the imports and I hit the 2 Gig limit for Visual FoxPro tables.
>
>I took the 2 Gig dbf file and exported it to a delimited text file to see how much smaller it might be, since there are a lot of empty fields in the table. To my surprise, the 2,079,274 KB table made a 2,066,947 KB delimited text file.
>
>Other vendors have this data in some sort of file structure that only takes up about 450 Megs. They distribute this data (in .dat format, whatever that is) via a CD-ROM, and their apps access it directly off the CD on the fly without any major writing to the hard drive. They can access the data immediately on demand without any obvious decompression. For me, just trying to zip the 2 Gig text file takes forever and slows the computer to a crawl.
>
>What type of data format could they be using that can store what would have been a 6 Gig delimited text file in only 450 Megs?
>
>I need to be able to do the same thing. I can't distribute the data in zip format and expect the user to spend 40 minutes unzipping it, or expect him to have 6 Gigs of available disk space, not counting space for the indexes that will need to be created.
>
>Any ideas or theories about what the file type might be, and whether there is any way I can do the same thing via Visual FoxPro (or VB or C# if I have to)?
>
>TIA
Will Jones