Level Extreme platform
Distributing and using VERY large tables.
Message
From:
07/08/2001 16:31:16

General information
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax, Miscellaneous
Thread ID: 00539842
Message ID: 00540954
Views: 14
>>Build a table that only has columns for the indexed data, store the rest of the data in a memo field in compressed form. This type of data can be easily compressed using Huffman encoding with a static frequency table. If the table is still too large, break out the columns into individual tables with common primary keys.
>
>Al,
>
>Do you think that this would be sufficient to get 6 Gig of data down to 650 Meg or less?

Assuming few indices, yes.

>If the rest of the data is in compressed format, from your experience, would uncompressing it on the fly degrade the performance very much?

No.

>
>Ed

Read up on Huffman coding: http://www.data-compression.com/lossless.html#huff
Others are Lempel-Ziv (LZ78) http://www.data-compression.com/lempelziv.html and Lempel-Ziv-Huffman (LZH).
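To make the earlier suggestion concrete, here is a minimal sketch of Huffman coding with a static frequency table, in Python rather than VFP for brevity. The idea is that you build the frequency table (and thus the code table) once from a representative sample of your data, then reuse it for every row; none of the names below come from a real library.

```python
import heapq
from collections import Counter

def build_codes(freq):
    """Build a prefix-free Huffman code table {symbol: bitstring} from a frequency map."""
    # Heap entries are (frequency, tiebreak, tree); tree is a symbol or a (left, right) pair.
    heap = [(f, i, s) for i, (s, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least-frequent subtrees.
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"  # degenerate one-symbol alphabet
    walk(heap[0][2], "")
    return codes

def compress(text, codes):
    return "".join(codes[c] for c in text)

def decompress(bits, codes):
    rev = {v: k for k, v in codes.items()}
    out, cur = [], ""
    for b in bits:
        cur += b
        if cur in rev:        # codes are prefix-free, so the first match is the symbol
            out.append(rev[cur])
            cur = ""
    return "".join(out)

# Static frequency table built once from sample data, then reused for all rows.
freq = Counter("abracadabra")
codes = build_codes(freq)
bits = compress("abracadabra", codes)
```

A real implementation would pack the bitstring into bytes before storing it in the memo field; the string of "0"/"1" characters here is just to keep the sketch readable.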

Lempel-Ziv-Welch (LZW) would also work, but the patent would require royalties: http://burks.brighton.ac.uk/burks/foldoc/59/64.htm

Search for libraries that implement these algorithms, such as gzip.
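As a sketch of the library route: zlib implements DEFLATE, the same LZ77-plus-Huffman scheme that gzip uses, so it handles both steps for you. From VFP you would call such a library through a DLL, which is not shown here; the Python below just demonstrates the round trip, and the record contents are hypothetical.

```python
import zlib

# Hypothetical record: the non-indexed columns of one row, serialized to bytes
# before being stored in the memo field.
record = ("Some long descriptive text that is stored in the memo field "
          "and never searched directly. ").encode("latin-1") * 20

packed = zlib.compress(record, 9)      # 9 = highest compression level
restored = zlib.decompress(packed)     # what you'd do when reading the row back

assert restored == record
print(len(record), "->", len(packed), "bytes")
```

Repetitive text like this compresses very well; how close you get to the 6 GB-to-650 MB target depends entirely on how redundant your actual data is.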

I recommend that you do some research on the web to find the best solution for your data.