Are you doing server-side processing for your work, or are you bringing back 1 million records to VFP, crunching there, posting back, getting more, etc.?
I'd try to do the processing in a T-SQL cursor (not known for speed). You may still win because you're not dragging all those transactions across the wire; with a SQL Server back end, that alone may be the time issue right there.
This could all be done in a Stored Procedure on the server.
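As a rough sketch of what that stored procedure could look like (table and column names here, `Txn`, `TxnID`, `Amount`, are made up; your per-record statistical step replaces the placeholder UPDATE):

```sql
-- Sketch only: a forward-only T-SQL cursor inside a stored procedure,
-- so the row-by-row loop runs on the server instead of across the wire.
CREATE PROCEDURE dbo.usp_ProcessTxns
AS
BEGIN
    DECLARE @TxnID int, @Amount money

    DECLARE cTxn CURSOR FAST_FORWARD FOR
        SELECT TxnID, Amount FROM dbo.Txn

    OPEN cTxn
    FETCH NEXT FROM cTxn INTO @TxnID, @Amount
    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- your record-by-record statistical step would go here;
        -- this UPDATE is just a stand-in
        UPDATE dbo.Txn SET Amount = @Amount * 1.0 WHERE TxnID = @TxnID

        FETCH NEXT FROM cTxn INTO @TxnID, @Amount
    END
    CLOSE cTxn
    DEALLOCATE cTxn
END
```

FAST_FORWARD keeps the cursor read-only and forward-only, which is about as cheap as T-SQL cursors get.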
The benefit could be that you process 4+ jobs at the same time on the server if you have a robust box, instead of doing one job on your desktop, waiting, then doing the next.
You have to learn to work with the back end.
__Stephen
>Hello Simon
>
>Thanks for your answer.
>
>The problem we are facing is that we cannot avoid record-by-record processing in order to run our long statistical algorithms (very specialist processing).
>
>
>To run such an algorithm on 100,000 records in a SQL Server stored procedure you need 3 minutes (optimised with indexes, file groups on different physical devices, a simple logging strategy, a 4-processor server, RAID 5, 2 GB of RAM, etc.). In VFP you need 1 minute.
>
>Imagine we have to run such an algorithm 2,000 times or more!
>
>
>One of the reasons is that SQL Server is very slow at record-by-record processing.
>
>
>Using extended stored procedures written in C is a solution, but we don't have experience with the language.
>
>Any feedback will be very useful.
>
>Thanks again
>Petros