Distributing and using VERY large tables.
Message
From
07/08/2001 16:31:16

General information
Forum:
Visual FoxPro
Category:
Database, Tables, Views, Indexing and SQL syntax
Miscellaneous
Thread ID:
00539842
Message ID:
00540954
Views:
19
>>Build a table that only has columns for the indexed data, store the rest of the data in a memo field in compressed form. This type of data can be easily compressed using Huffman encoding with a static frequency table. If the table is still too large, break out the columns into individual tables with common primary keys.
>
>Al,
>
>Do you think that this would be sufficient to get 6 Gig of data down to 650 Meg or less?

Assuming few indices, yes.
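To make the "Huffman encoding with a static frequency table" idea concrete, here is a minimal sketch in Python rather than VFP; the frequency table below is invented for illustration (in practice you would count symbol frequencies once over representative data and ship that table with the application).

```python
import heapq

# Assumed static frequency table; a real one would be measured from your data.
FREQ = {'a': 45, 'b': 13, 'c': 12, 'd': 16, 'e': 9, 'f': 5}

def build_codes(freq):
    """Build a Huffman code table (symbol -> bit string) from static frequencies."""
    # Heap entries: (frequency, tiebreaker, tree), where tree is either a
    # leaf symbol or a (left, right) pair of subtrees.
    heap = [(f, i, sym) for i, (sym, f) in enumerate(sorted(freq.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)   # two least-frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + '0')
            walk(tree[1], prefix + '1')
        else:
            codes[tree] = prefix or '0'   # single-symbol edge case
    walk(heap[0][2], '')
    return codes

def encode(text, codes):
    return ''.join(codes[ch] for ch in text)

def decode(bits, codes):
    """Huffman codes are prefix-free, so greedy matching decodes unambiguously."""
    inverse = {v: k for k, v in codes.items()}
    out, cur = [], ''
    for b in bits:
        cur += b
        if cur in inverse:
            out.append(inverse[cur])
            cur = ''
    return ''.join(out)

codes = build_codes(FREQ)
bits = encode('abcdef', codes)
assert decode(bits, codes) == 'abcdef'
```

Because the frequency table is static, both sides already know the code book; you never have to store the tree alongside each compressed memo field.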

>If the rest of the data is in compressed format, would decompressing it on the fly, in your experience, degrade performance very much?

No.

>
>Ed

Read up on Huffman coding: http://www.data-compression.com/lossless.html#huff
Others: Lempel-Ziv (LZ78) http://www.data-compression.com/lempelziv.html, and Lempel-Ziv-Huffman (LZH).
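For a feel of how the dictionary-based Lempel-Ziv family works, here is a toy LZ78 round trip in Python (a sketch only; real implementations pack the pairs into bits and cap the dictionary size):

```python
def lz78_compress(data):
    """LZ78: emit (dictionary index, next char) pairs; index 0 is the empty prefix."""
    dictionary = {'': 0}
    out = []
    w = ''
    for ch in data:
        if w + ch in dictionary:
            w += ch                       # extend the current match
        else:
            out.append((dictionary[w], ch))
            dictionary[w + ch] = len(dictionary)
            w = ''
    if w:
        out.append((dictionary[w], ''))   # flush any trailing prefix
    return out

def lz78_decompress(pairs):
    entries = ['']                        # index 0 = empty prefix
    out = []
    for idx, ch in pairs:
        phrase = entries[idx] + ch
        out.append(phrase)
        entries.append(phrase)            # rebuild the same dictionary
    return ''.join(out)

sample = 'to be or not to be'
assert lz78_decompress(lz78_compress(sample)) == sample
```

The dictionary is rebuilt identically on both sides from the output itself, so, unlike static Huffman, nothing extra has to be shipped with the data.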

Lempel-Ziv-Welch (LZW) would also work, but it is patented and would require royalties: http://burks.brighton.ac.uk/burks/foldoc/59/64.htm

Search for libraries that implement these algorithms, such as gzip.
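Using an existing library is usually the path of least resistance. As a sketch of the compress-on-write, decompress-on-read pattern (Python's zlib stands in here for whatever gzip/zlib library you can call from VFP; the memo-field plumbing is assumed):

```python
import zlib

# Stand-in for the non-indexed portion of one row; repetitive text like this
# compresses very well.
record = b'Non-indexed row data: name, address, notes, notes, notes. ' * 20

blob = zlib.compress(record, 9)   # what you would store in the memo field
assert zlib.decompress(blob) == record
assert len(blob) < len(record)    # repetitive data shrinks substantially
```

Decompression in these libraries is fast enough that doing it per-row on read is rarely the bottleneck, which matches the answer above.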

I recommend that you do some research on the web to find the best solution for your data.