Handling large data sets for web apps
Message
From: 25/07/2001 15:54:01
To: 25/07/2001 15:42:23
General information
Forum: Visual FoxPro
Category: West Wind Web Connection / Miscellaneous
Thread ID: 00535271
Message ID: 00535326
Views: 19
>>>We have a web app which needs to support 100-plus users. We have multiple web sites that all use basically the same code. One of the websites has a table with 2.5 million records, about 350 MB. There is one summary report which, if pre-digested, will run very quickly on the web.
>>>
>>>Unfortunately there are 11 fields they want to be able to filter on. We have tried to get them to reduce the number of filterable fields but they won't budge on this. The report takes somewhere between 60 and 90 seconds to run. If we can limit them to 2 or 3 fields, I can pre-digest it further and then the speed would be reasonable.
>>>
>>>The machine is a dual 550 MHz box, currently running West Wind Web Connection 3.15.
>>>
>>>Is there a solution to this problem?
>>>
>>>TIA
>>
>>Dan --
>>
>>A couple of ideas just to cover the basics.
>>
>>Rushmore optimization. Have you tried analyzing the various indexes for their Rushmore optimizability? SYS(3054) gives information specific to your query.
>>
>>Server load. Have you tried increasing the memory on the server?
>>
>>Configuration. Are you running VFP on a database server distinct from the web server? If so, can you increase the bandwidth between them?
>>
>>
>> Jay
>
>
>We have fully optimized.
>
>I believe that we have about 1 gig of memory. In most cases it seems that we are CPU bound. We are not running out of memory to the point where it starts paging to disk.
>
>The web server and the database server are both using VFP.
>
>Having a separate machine may be the way to go, possibly even SQL Server, but we have to convince our client that we have run out of options for improving performance. We also need to make it very clear that even 2 seconds is too slow for a web app. It only takes a number of simultaneous hits that each take 2 seconds for the site to slow to a crawl.
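
The SYS(3054) check suggested above can be run like this (a sketch; the table and field names are illustrative, not from your app, and the exact message text VFP echoes will vary):

```foxpro
* Turn on Rushmore optimization reporting for subsequent SELECTs.
=SYS(3054, 1)
SELECT * FROM bigtable WHERE region = "WEST" INTO CURSOR csrTest
* VFP prints a message naming the index tag used to optimize the
* filter, or reports that the clause is not optimizable -- any
* non-optimizable clause is a candidate for a new index tag.
=SYS(3054, 0)    && turn reporting back off
```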

RE: the query. I guess I made the assumption that it involved only the large table. Are you performing joins? As the number of join conditions increases above a certain point, it seems that performance degrades exponentially. Breaking the query into separate queries may help.
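
Breaking the query apart might look like this (hypothetical table and field names; assumes index tags exist on the filtered columns):

```foxpro
* Instead of one SELECT that joins everything at once, reduce the
* large table first, then join the much smaller result.
SELECT ord_id, cust_id, amount ;
    FROM orders ;
    WHERE order_date >= {^2001-01-01} ;
    INTO CURSOR csrOrders
SELECT c.name, o.amount ;
    FROM csrOrders o ;
    INNER JOIN customers c ON c.cust_id = o.cust_id ;
    INTO CURSOR csrReport
```

The first SELECT is fully Rushmore-optimizable against the base table; the join then only has to touch the reduced row set.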

This approach might also work in another way: if you can generate a cursor from the result set of the most restrictive filter condition, filtering that cursor with the remaining conditions may give you a boost.
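
A sketch of that staged filtering, assuming `region` is the most selective of the filter fields (all names here are placeholders for your actual columns):

```foxpro
* Stage 1: apply the most restrictive filter to the 2.5M-row table,
* where Rushmore can use an index tag.
SELECT * FROM bigtable ;
    WHERE region = m.lcRegion ;
    INTO CURSOR csrStage
* Stage 2: apply the remaining filter conditions to the small cursor.
* This pass is a sequential scan, but over far fewer rows.
SELECT * FROM csrStage ;
    WHERE status = m.lcStatus AND ord_year = m.lnYear ;
    INTO CURSOR csrResult
```

Since the second pass is unindexed, this only pays off when the first filter cuts the row count dramatically; picking the most selective condition for stage 1 is the whole trick.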

Jay