Handling large data sets for web apps
Message posted 25/07/2001 15:37:13

General information
Forum: Visual FoxPro
Category: West Wind Web Connection (Miscellaneous)
Thread ID: 00535271
Message ID: 00535315
Views: 9
Well, I would work with SYS(3050) to make sure that the indexing instance isn't using up too much memory (otherwise VFP will grab all the memory it's allowed). Other than that, if you're looping anywhere, place a Sleep(0) call in the loop so you don't tie up the processor. If neither of these helps, you should have a good case for going to asynchronous processing...
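To make those two suggestions concrete, here is a minimal VFP sketch; the buffer sizes and the table name bigtable are illustrative assumptions, not values from this thread:

*-- Cap the foreground and background memory buffers so an indexing
*-- instance cannot grab all available RAM (values in bytes, illustrative).
SYS(3050, 1, 64 * 1024 * 1024)   && foreground buffer: 64 MB
SYS(3050, 2, 32 * 1024 * 1024)   && background buffer: 32 MB

*-- Declare the Win32 Sleep API so long loops can yield the processor.
DECLARE Sleep IN kernel32 INTEGER nMilliseconds

SELECT bigtable
SCAN
    *-- ... per-record work ...
    Sleep(0)    && give up the rest of this time slice to other threads
ENDSCAN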
>We have not been able to get our client to agree to any asynchronous processing yet. We have been trying for about a year. There are a number of tricks we can do by pre-digesting the data, which has handled all our problems up to now.
>
>One thing about setting it up as a background process. Let's say that one process we are running is indexing a huge file. Even though it is running in the background, we have noticed that once it starts to index, the web server becomes sluggish. I'm assuming that an individual item or line of code will wait until the CPU is idle, but once that process (such as building an index) starts, the code it is executing is not running as a background process and we have no control over that.
>
>We are running as a COM server. Right now we are running 8 servers, which is higher than Strahl recommends, but it seems to work for us.
>
>
>>You've probably already thought about/tried this - asynchronous processing so that it runs in the background. You could probably also set it up so that an online user can kick it off and then have an email (with the link to download) sent to them automatically when it completes. Also, I'm assuming that you're either running COM or multiple instances of File_Based. Running one instance of File_Based is a sure way to slow things down...
>
>>>We have a web app which needs to support 100-plus users. We have multiple web sites that all use basically the same code. One of the websites has a table with 2.5 million records, about 350 MB. There is one summary report which, if pre-digested, will run very quickly on the web.
>>>
>>>Unfortunately there are 11 fields they want to be able to filter on. We have tried to get them to reduce the number of filterable fields but they won't budge on this. The report takes somewhere between 60 and 90 seconds to run. If we can limit them to 2 or 3 fields, I can pre-digest it further and then the speed would be reasonable.
>>>
>>>The machine is a dual 550 MHz machine currently running West Wind Web Connection 3.15.
>>>
>>>Is there a solution to this problem?
>>>
>>>TIA
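As a rough illustration of the pre-digesting mentioned above: the idea is to aggregate the large detail table into a much smaller summary table offline (e.g. in a nightly task) so the web report only scans the summary. The table and field names below are hypothetical stand-ins, not from the poster's application:

*-- Run offline, not per web request.
SELECT region, product, ;
       SUM(amount) AS tot_amount, ;
       COUNT(*)    AS rec_count ;
    FROM detail ;
    GROUP BY region, product ;
    INTO TABLE summary

*-- Index the summary on the filterable fields so the web report
*-- can locate rows quickly instead of scanning 2.5 million records.
SELECT summary
INDEX ON region  TAG region
INDEX ON product TAG product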