Distributing and using VERY large tables.
Message
From
04/08/2001 19:09:51
 
General information
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax, Miscellaneous
Thread ID: 00539842
Message ID: 00539858
Views: 15
Edmond,

I don't know if this is an answer, but I have a product called ZipMagic installed on one of my PCs and it does work some magic... It lets most applications access a ZIPped file as if it were unzipped. It works (at least in the very limited times I've tried) with .DBFs and VFP. It does NOT go through an UNZIP step internally; rather (it claims, and appears to) it handles each record as it is read.
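To picture what that per-record access looks like, here is a minimal Python sketch (the file name, record layout, and record length are all made up; ZipMagic's actual internals aren't public). The point is that a gzip stream can be seeked, so fetching one record decompresses only the bytes up to that offset and never writes a full unzipped copy to disk:

```python
import gzip

RECORD_LEN = 16  # hypothetical fixed record length in bytes

# Build a small gzipped "table" of fixed-length records for the demo.
records = [f"REC{i:04d}".ljust(RECORD_LEN).encode() for i in range(1000)]
with gzip.open("table.gz", "wb") as f:
    for rec in records:
        f.write(rec)

def read_record(path, n):
    """Fetch the n-th fixed-length record without unzipping the file.
    gzip is a stream format: seek() decompresses only up to the offset."""
    with gzip.open(path, "rb") as f:
        f.seek(n * RECORD_LEN)
        return f.read(RECORD_LEN)

print(read_record("table.gz", 500))  # record 500, fetched with no full unzip
```

Sequential seeks like this are cheap near the front of the file and slower near the end, which is presumably why products like ZipMagic keep extra index structures.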

I think it cost me around $39. It does install itself PRESUMPTUOUSLY (in my opinion). For instance, I suddenly found a new box beside the minimize box, some new menu options in various places, and a startup/Systray icon. And I have found that I have to remember to turn it off when I install software (like VFP) that has .ZIP files as part of its install.

It might be worth a try. I'm guessing they have a trial version available. www.mijenix.com.
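On the question of how 6 Gigs of text could fit in 450 Megs: fixed-length records that are mostly blank are extremely repetitive, and dictionary compressors get enormous ratios on repetitive data. A hedged sketch in Python, with an entirely made-up record layout, shows the effect:

```python
import zlib

# Simulate a 512-byte fixed-length record that is mostly empty fields
# (the situation described in the question: lots of blank padding).
record = b"123456789" + b" " * 503
data = record * 10_000            # ~5 MB of such records

compressed = zlib.compress(data, 9)
print(f"raw: {len(data):,} bytes, compressed: {len(compressed):,} bytes")
```

Real data with varying field values won't compress this well, but it suggests a vendor could plausibly pack sparse fixed-length data into a fraction of its raw size.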

Good luck

JimN

>I have to create a table of 35 million-plus records. The table is to be populated from 850 fixed-length text files. I get about a third of the way through the imports and I hit the 2 Gig limit for Visual FoxPro tables.
>
>I took the 2 Gig dbf file and exported it to a delimited text file to see how much smaller it might be since there are a lot of empty fields in the table. To my surprise the 2,079,274 KB table made a 2,066,947 KB delimited text file.
>
>Other vendors have this data in some sort of file structure that only takes up about 450 Megs. They distribute this data (in .dat format, whatever that is) via a CD-ROM and their apps access it directly off the CD on the fly, without any major writing to the hard drive. They can access the data immediately on demand with no obvious decompression. For me, just trying to zip the 2 Gig text file takes forever and slows the computer to a crawl.
>
>What type of data format could they be using that can store what would have been a 6 Gig delimited text file into only 450 Megs?
>
>I need to be able to do the same thing. I can't distribute the data in zip format and expect the user to spend 40 minutes unzipping it, and also expect him to have 6 Gigs of available disk space, not counting space for the indexes that will need to be created.
>
>Any ideas or theories about what the file type might be, and whether there is any way I can do the same thing via Visual FoxPro (or VB or C# if I have to)?
>
>TIA