Handling large data sets for web apps
Message
Posted: 26/07/2001 15:50:12
In reply to message of: 25/07/2001 22:05:47
General information
Forum: Visual FoxPro
Category: West Wind Web Connection, Miscellaneous
Thread ID: 00535271
Message ID: 00536002
Views: 9
>>>
>>>Are you saying to do a count for each condition of the filter for one field. Based on those conditions build the SQL so that the filter which returns the smallest number of records comes first in the where clause and so on so that the one that gets the highest number is last?
>>>
>>>>Best wishes!
>>>>
>>>> Jay
>>
>>I like the idea; I wonder how this will work in production. My boss wants me to get 99.5% of the hits down to about .1 of a second or less. In one test, a count that returns the majority of the 2.5 million records was coming back in .05. I would be over that limit quickly, but if we can improve performance dramatically using this method, it would definitely be worth looking into.
>
>The large data set you're working with does create some minimum overhead, doesn't it? But the times you're encountering seem fairly high for a fully optimized SELECT alone. Of course, you've got a complex interplay there, too.
>
>I guess I'm not fully convinced that this is a query optimization issue, but I think it's an avenue worth pursuing.
>
>Do you experience about the same performance on a test machine, or is it orders of magnitude less -- as you identify in the post above? In that case, there may be some constraints you're running into with the apartment model multithreading approach of VFP -- where the specific instance of VFP may have a more constricted amount of memory available to it. Rick Strahl would be a good source of info on any limits in this arena.
>
>If your tests run pretty comparably, I would again review the indices (I'm sure you've done it many times already) to ensure that every query is fully optimized. Do all the queries run slowly? Or does a query run more slowly the more conditions it has?
>
>And, if you wouldn't mind sharing the method you use to construct the SQL statement, or identifying the precise SQL statement you're using, that would also help.
>
> Jay
* DCC - Assumes that the D40 table is already open (which can itself take substantial time).  Total number of records: 2,497,224

lnSeconds = Seconds()
Count For Empty( D40CkNo ) To N1 && .1   	2,497,224 records
?"1a", Seconds() - lnSeconds
lnSeconds = Seconds()
Count For Upper( D40Audit ) = "9000" To N2  && .581		2,437,088 records
?"1b", Seconds() - lnSeconds
lnSeconds = Seconds()
Count For Upper( D40DirNo ) = "TC" To N3 && .191		574,785 records
?"1c", Seconds() - lnSeconds
lnSeconds = Seconds()
Count For Between( D40Dte, {05/01/01}, {05/31/01} ) To N4 && .07	210,049
?"1d", Seconds() - lnSeconds
lnSeconds = Seconds()
Count For Left( D40Acct, 2 ) = "22" To N5  && .731		2,312,273
?"1e", Seconds() - lnSeconds
?N1, N2, N3, N4, N5
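
* DCC - The reordering idea discussed above could be sketched roughly like
* DCC - this; the array and variable names below are illustrative, not from
* DCC - the thread. Each condition is paired with its COUNT result, the rows
* DCC - are sorted ascending on the count, and the WHERE clause is built with
* DCC - the most selective condition first.
Local Array laCond[5, 2]
laCond[1, 1] = [Empty( D40CkNo )]
laCond[1, 2] = N1
laCond[2, 1] = [Upper( D40Audit ) = "9000"]
laCond[2, 2] = N2
laCond[3, 1] = [Upper( D40DirNo ) = "TC"]
laCond[3, 2] = N3
laCond[4, 1] = [Between( D40Dte, {05/01/01}, {05/31/01} )]
laCond[4, 2] = N4
laCond[5, 1] = [Left( D40Acct, 2 ) = "22"]
laCond[5, 2] = N5
ASort( laCond, 2 )  && sort rows by the count column, ascending
Local lcWhere, lnI
lcWhere = ""
For lnI = 1 To 5
	lcWhere = lcWhere + Iif( Empty( lcWhere ), "", " And " ) + laCond[lnI, 1]
EndFor
* lcWhere now lists the condition with the smallest result set first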

* DCC - In this case because of the high number of records being returned for 
* DCC - each condition, I don't think that it would work well to use partial 
* DCC - optimization here.
lnSeconds = Seconds()
Select * ;
	From D40 ;
	Where ;
		Upper( D40DirNo ) = "TC" And ;
		Between( D40Dte, {05/01/01}, {05/31/01} ) And ;
		Left( D40Acct, 2 ) = "22" And ;
		Upper( D40Audit ) = "9000" And ;
		Empty( D40CkNo ) ;
	Into Cursor qInvoiceTest NoFilter
?"2", Seconds() - lnSeconds  && .651
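
* DCC - For the SELECT above to be fully Rushmore-optimizable, each WHERE
* DCC - expression needs an index tag whose key matches it exactly. A sketch
* DCC - of the tags that would be needed (tag names are mine):
Use D40 Exclusive
Index On Empty( D40CkNo ) Tag CkNoEmpty
Index On Upper( D40Audit ) Tag Audit
Index On Upper( D40DirNo ) Tag DirNo
Index On D40Dte Tag Dte
Index On Left( D40Acct, 2 ) Tag Acct2
* With those tags in place, each condition can be resolved from index
* bitmaps rather than a table scan; SYS(3054) will show the optimization
* level actually achieved for the query.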
I haven't fully read your post but plan to later.