Handling large data sets for web apps
Message
From: 25/07/2001 17:38:11
To: 25/07/2001 17:08:57

General information
Forum: Visual FoxPro
Category: West Wind Web Connection
Miscellaneous
Thread ID: 00535271
Message ID: 00535402
Views: 18
This message has been marked as having helped with the initial question of the thread.
>>>
>>>We have fully optimized.
>>>
>>>I believe that we have about 1 gig of memory. In most cases it seems that we are CPU bound. We are not running out of memory to the point where it starts paging to disk.
>>>
>>>The web server and the database server are both using VFP.
>>>
>>>Having a separate machine may be the way to go, perhaps even SQL Server, but we have to convince our client that we have run out of options for improving performance. We also need to make it very clear that even 2 seconds is too slow for a web app. It only takes a number of hits that each take 2 seconds for the site to slow to a crawl.
>>
>>RE: the query. I guess I made the assumption that it involved only the large table. Are you performing joins? As the number of join conditions increases above a certain point, it seems that performance degrades exponentially. Breaking the query into separate queries may help.
>>
>>This approach might also work in another way. If you can generate a cursor based on the result set of the most restrictive filter condition, filtering that cursor with the remaining conditions may give you a boost.
>>
>> Jay
>
>Jay -
>
>I like the idea, but it does require quite a bit of analysis, and the answer can change over time. It would be nice to have a process that runs over the weekend to check which filters are going to be the most restrictive.
>
>But ultimately, there are going to be some filters for which I am unable to return a result in a reasonable time. The only way I can think of to deal with those filters is to use some kind of asynchronous solution.

I recognize that this is a stopgap measure as you look for a more permanent and robust solution.

But, to the topic at hand.

Are you generating the SELECT statement dynamically or populating values in a hard-coded statement? Dynamically creating SELECT statements will avoid processing filter conditions which aren't used in the present query.
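
For example, here is a minimal sketch of building the WHERE clause from only the filters the user actually supplied; the variable and field names are hypothetical, just to show the idea:

* Build the WHERE clause from only the filters the user supplied
* (variable and field names below are hypothetical)
lcWhere = ""
IF !EMPTY(lcLastName)
   lcWhere = lcWhere + IIF(EMPTY(lcWhere), "", " AND ") + ;
      "UPPER(LastName) = '" + UPPER(lcLastName) + "'"
ENDIF
IF !EMPTY(lcState)
   lcWhere = lcWhere + IIF(EMPTY(lcWhere), "", " AND ") + ;
      "State = '" + lcState + "'"
ENDIF
lcSelect = "SELECT * FROM BigTable" + ;
   IIF(EMPTY(lcWhere), "", " WHERE " + lcWhere) + ;
   " INTO CURSOR csrResult"
&lcSelect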

In that light, I don't think that a lot of analysis needs to be done. It sounds like you're fully optimized. So, without much of an up-front performance penalty, you could take each condition, as generated by the user for a specific query, and determine exactly how many records it returns for that specific query. Then, you could dynamically generate the two SELECT statements, using the filter that returns the smallest result set for the first.

Pseudo code:
Have an array of filter conditions, each array row holding the condition(s) related to one field in the table. The array has a second, unpopulated column which will hold the record count for that condition.

FOR X = 1 TO nFilterConditions
   * Copy the condition into a simple variable so it can be macro-substituted
   lcCondition = aFilterConditions[X, 1]
   SELECT COUNT(*) FROM BigTable WHERE &lcCondition INTO ARRAY aFC
   aFilterConditions[X, 2] = aFC[1]
ENDFOR

Find Min

Create SQL
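
A hypothetical sketch of the "Find Min" and "Create SQL" steps, continuing from the pseudo code above: run the most restrictive condition against BigTable first, then apply the remaining conditions to that (much smaller) cursor.

* Find the condition with the smallest count (column 2 was filled in by the loop above)
lnBest = 1
FOR X = 2 TO nFilterConditions
   IF aFilterConditions[X, 2] < aFilterConditions[lnBest, 2]
      lnBest = X
   ENDIF
ENDFOR

* Stage 1: query BigTable with the most restrictive condition only
* NOFILTER so the result is a real cursor that can itself be queried
lcFirst = aFilterConditions[lnBest, 1]
SELECT * FROM BigTable WHERE &lcFirst INTO CURSOR csrStage1 NOFILTER

* Stage 2: apply the remaining conditions to the much smaller cursor
lcRest = ""
FOR X = 1 TO nFilterConditions
   IF X <> lnBest
      lcRest = lcRest + IIF(EMPTY(lcRest), "", " AND ") + aFilterConditions[X, 1]
   ENDIF
ENDFOR
IF !EMPTY(lcRest)
   SELECT * FROM csrStage1 WHERE &lcRest INTO CURSOR csrResult
ENDIF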


Best wishes!

Jay