Creating Performance Standards
Message
From: Dragan Nedeljkovich (Online), 23/05/1997 09:10:45
Now officially retired
Zrenjanin, Serbia
To: 21/05/1997 18:34:00
General information
Forum: Visual FoxPro
Category: Contracts & agreements, Miscellaneous
Thread ID: 00033105
Message ID: 00033393
Views: 45
> >But working with 200,000 records and waiting for 5-20 minutes in Visual
> Basic is ridiculous considering regular databases perform those type of
> queries in a minute.
>
> This is the type of input I am looking for. What kind of times make you go
> "that's ridiculous". On an "average" machine, an "average network" when
> does it become "too long"?

My measure is a factor of three: if I know I've seen a similar size of tables, a similar number of joins/selects from selects and so on, then two cases which fall into the same category should not differ by more than a factor of three (two would be the measure for a single-user machine, but when did you last see one doing large data processing?).

I've seen these things happen when I first used SQL in FP2.0: a report totalling some 90,000 records on a 386/33 came down from the previous 3 minutes to 25 seconds.

So, any program which does sequential processing is practically obsolete these days. You'd be surprised to see how many programmers still use "do while !eof()... skip... enddo" on raw data.
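
For illustration, here is a minimal sketch of the two approaches in Visual FoxPro; the "orders" table and its "amount" field are invented names for this example, not anything from the thread. The first block is the record-by-record style dismissed above, the second gets the same total from one SELECT-SQL statement, in the spirit of the 90,000-record report example.

    * Record-by-record totalling, the "do while !eof()" style called obsolete above.
    * The table "orders" and field "amount" are hypothetical, for illustration only.
    USE orders
    nTotal = 0
    DO WHILE !EOF()
        nTotal = nTotal + orders.amount
        SKIP
    ENDDO
    ? nTotal

    * The same total as a single set-based SELECT-SQL statement.
    SELECT SUM(amount) AS nTotal FROM orders INTO CURSOR csTotal
    ? csTotal.nTotal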

back to same old

the first online autobiography, unfinished by design
What, me reckless? I'm full of recks!
Balkans, eh? Count them.