General information
Category:
Databases, Tables, Views, Indexes and SQL syntax
Title:
Millions of records
Hi,
I have the following requirements from a customer:
- import of 2 million records at 150 bytes per record, 20 columns per record.
These records won't be changed after import, apart from records which aren't OK.
150 bytes * 2 million = 300 MB
- the data should be kept for five years.
300 MB * 5 years = 1.5 GB
This shouldn't be a problem with a table size limit of 2 GB.
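For what it's worth, the storage arithmetic above checks out; here is a minimal sketch of the same estimate (using decimal MB/GB, as in the figures above):

```python
# Rough storage estimate for the scenario described above.
record_size_bytes = 150
records_per_year = 2_000_000
years = 5

# Decimal units (1 MB = 1e6 bytes), matching the back-of-envelope figures.
per_year_mb = record_size_bytes * records_per_year / 1e6
total_gb = per_year_mb * years / 1000

print(f"{per_year_mb:.0f} MB per year, {total_gb:.1f} GB over {years} years")
```

Note that 2 GB limits are usually in binary units (GiB), so the real headroom is slightly larger than the decimal figures suggest.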
But now my questions:
- assuming an index on almost every column of the table for Rushmore
optimization, is import performance a problem? (More indexes over the
same key values -> more time to insert each record.)
Does anyone have experience with this amount of data?
- How long could a SQL statement take?
- Would it be better to store the data on SQL Server?
How does the performance of SQL Server compare to FoxPro
with large amounts of data?
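On the index question: every extra index is an extra B-tree that each INSERT must update, which is why a common bulk-load pattern is to load the data first and build the indexes afterwards. Here is a toy illustration of that pattern (using SQLite in Python, not FoxPro or SQL Server, purely to sketch the idea; table and index names are made up):

```python
import sqlite3
import time

def load(rows, index_first):
    """Bulk-insert rows into a 3-column table; build 3 single-column
    indexes either before the load or after it, and time the whole job."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (a INT, b INT, c INT)")

    def build_indexes():
        for col in ("a", "b", "c"):
            con.execute(f"CREATE INDEX ix_{col} ON t({col})")

    if index_first:
        build_indexes()
    t0 = time.perf_counter()
    con.executemany("INSERT INTO t VALUES (?, ?, ?)", rows)
    if not index_first:
        build_indexes()
    con.commit()
    elapsed = time.perf_counter() - t0
    con.close()
    return elapsed

rows = [(i, i % 100, i % 7) for i in range(50_000)]
print("indexes built before load:", load(rows, True))
print("indexes built after load: ", load(rows, False))
```

On most engines the second variant is faster, because building an index in one pass over loaded data is cheaper than maintaining it row by row during the load.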
I know that my questions are general, but I appreciate any input.
Bye,
Andreas