Creating Performance Standards
Message
From:  Dragan Nedeljkovich (Zrenjanin, Serbia), 23/05/1997 09:10:45
To:    21/05/1997 18:34:00
General information
Forum:       Visual FoxPro
Category:    Contracts, agreements and general business / Miscellaneous
Thread ID:   00033105
Message ID:  00033393
Views:       48
>>But working with 200,000 records and waiting for 5-20 minutes in Visual Basic is ridiculous considering regular databases perform those types of queries in a minute.
>
>This is the type of input I am looking for. What kind of times make you go "that's ridiculous"? On an "average" machine, on an "average" network, when does it become "too long"?

My measure is a factor of three: if I know I've seen tables of a similar size, a similar number of joins, selects from selects, etc., then two cases which fall into the same category should not differ by more than a factor of three (two should be the measure for a single-user machine, but when was the last time you saw one doing large data processing?).

I've seen these things happen when I first used SQL in FP2.0: a report totalling some 90,000 records on a 386/33 came down to 25 seconds from the previous 3 minutes. So any program which does sequential processing is practically obsolete these days. You'd be surprised to see how many programmers still use "do while !eof()... skip... enddo" on raw data.
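
A minimal sketch of that contrast in VFP syntax, using a hypothetical free table invoices with a numeric field amount (both names are assumptions, not taken from the thread):

* Record-by-record approach the message calls obsolete: walk the table by hand.
USE invoices
lnTotal = 0
GO TOP
DO WHILE !EOF()
    lnTotal = lnTotal + amount
    SKIP
ENDDO
? lnTotal

* Set-based alternative: let SELECT-SQL produce the same total in one statement.
SELECT SUM(amount) AS nTotal ;
    FROM invoices ;
    INTO CURSOR csrTotal
? csrTotal.nTotal

The point of the message is the second form: one set-based statement instead of a loop that touches every record.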

back to same old

the first online autobiography, unfinished by design
What, me reckless? I'm full of recks!
Balkans, eh? Count them.