Which is best for Desktop Apps VFP?.NET
From:       Walter Meester (Hoogkarspel, Netherlands), 02/02/2004 05:00:09
To:         (reply to message of 01/02/2004 17:27:12)
Forum:      Visual FoxPro
Category:   Other, Miscellaneous
Thread ID:  00860600
Message ID: 00872875
Views:      68
Hi Chris,

>I am highly confident that in your described circumstance of pulling data from 12 tables and merging it into one and doing calculations on it, it may have performed even better by using a stored procedure

Actually, this is not really a suitable alternative, since we have to post-process that data anyway. A few of the tables are static tables that are not found on the server (e.g. language translations) and have to be merged in anyway.
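Just to give an idea of what that merge step looks like, roughly something like this (the cursor, table and field names are only examples, not our actual schema): the orders come down from the server, the translation table lives only on the client.

* Sketch only: merging a downloaded result set with a local static
* lookup table. c_orders is assumed to be the cursor fetched from
* the server; languages.dbf is a local, rarely changing table.
SELECT ord.order_id, ord.status_id, lang.description ;
	FROM c_orders ord ;
	LEFT JOIN languages lang ON lang.status_id = ord.status_id ;
	INTO CURSOR c_report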

>We had a routine that we had written in VFP that had to get data from 7 or 8 tables and perform calculations and do other stuff, but the fastest we could get the code to run (no matter VFP or SQL data) was about 2-3 hours on a P4 with 512 MB RAM (this was processing about 100k - 200k records in the largest table) - we tried everything, INDEXSEEKing, SETing ORDER, LOCATE FORs, I mean EVERYTHING VFP had we tried but still it ran only taking at least 2-3 hours (even DAYS in some circumstances). Then we tried it in a Stored proc on SQL Server - exact same result, but takes only 15-20 minutes - so now the VFP backend version of our app uses the old code but the SQL backend version uses the stored proc. So it might be something to look into in your current situation?

It's exactly these cases that attract my attention. Almost always there is a bottleneck that has not been identified. Of course, it all depends on the algorithm you used. Without seeing the code of your VFP application I really cannot comment on your findings, but the problems might be related to analyzing the data in a multi-user environment, relying heavily on resource-intensive commands like SQL SELECT or xBase commands that kick in Rushmore, or even using the wrong indexes (like an index on DELETED()).
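The DELETED() case is the classic one. Just as an illustration (the table, tag and field names are made up): with SET DELETED ON, Rushmore will happily use a tag on DELETED() if it finds one, and on a big table where hardly anything is deleted, reading that tag can cost more than it saves.

* Illustration of the index-on-DELETED() trap (names are made up).
USE orders
INDEX ON DELETED() TAG _deleted    && tag that Rushmore may pick up
SET DELETED ON
SELECT * FROM orders WHERE cust_id = "C1001" INTO CURSOR c_test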

A first strategy might be to get the data local and process it in cursors. VFP's DML is better equipped than SQL in many cases, so you should be able to outperform SQL in the actual data-munging part. If the time needed for the data munging far exceeds the time needed to get the data local, there might be something wrong in your data-munging routine.
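A rough way to see where the time goes (table and cursor names are just an example) is to time the fetch and the munging separately:

lnStart = SECONDS()
SELECT * FROM orders INTO CURSOR c_raw     && getting the data local
lnFetch = SECONDS() - lnStart

lnStart = SECONDS()
SELECT c_raw
SCAN                                       && the actual data munging
	* ... calculations per record go here ...
ENDSCAN
lnMunge = SECONDS() - lnStart
? "fetch:", lnFetch, "  munge:", lnMunge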

>I also noticed in a previous post of yours in this conversation that you were worried about server performance issues which is why you try to do as much data processing on the client as possible, but in our experience (we have a number of these monster stored procs on our SQL backend) a SQL (or Oracle) database server is usually so overpowered that even the most demanding of stored procs or queries still only makes a tiny dent in the available performance of the server. Another benefit of using Stored Procs is that all the processing code we have written in the SQL backend is still perfectly compatible with any new frontend we write (we wrote a .Net client program that makes use of some of these stored procs through ADO.Net without trouble)

My problem with SPT and views is that it was taking much more time to get the data to the client than with VFP's xBase commands. Given the nature of the process, I won't solve that with SPT, because the data munging itself is blazingly fast (only a few milliseconds). The download of each table took tens of milliseconds on a 10 Mb network, so I identified the network bandwidth as the bottleneck here. Since the volume of the processed table is not significantly smaller than the result sets downloaded to the client, processing it on the server does not gain me a thing.

Handling it the xBase way, with SETting ORDER, SEEKing and SCAN WHILE, revealed that there was a lot of client-side caching going on. Tables are not refreshed immediately (see the SET REFRESH command), and the file system read-buffers tables to which no data has been written yet. (This is especially handy for relatively static tables that only change a few times a year: once they're open and the contents have been read, the data stays on the client, where it can be used by many xBase commands.) I suspect the SQL SELECT command is more restrictive, in the sense that it always checks the server for the actual contents.
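In outline it looks something like this (the table, tag and key value are made up, and it assumes a tag on status_id exists):

lnStatus = 3                  && hypothetical key value to look up
SET REFRESH TO 5, 5           && accept buffered data up to 5 seconds old
USE translations IN 0 SHARED
SELECT translations
SET ORDER TO TAG status_id
SEEK lnStatus
SCAN WHILE status_id = lnStatus
	* after the first pass most of these reads come from the local
	* read buffer instead of the network
ENDSCAN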

Anyway, accepting the risk of not having the most recent data for reporting (the data might be a few seconds old because of refreshing), I decided, for the VFP table version of the algorithm, to take advantage of the read buffering. It made my routine about five times as fast.

For the SQL Server version I still have to decide what to do. I think SPT for downloading the raw data is the best choice. Since I do not need all fields, I can reduce the network traffic by selecting only the fields I need (which is not possible with VFP's own SQL commands, since those are processed locally and the full records still have to cross the network).
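Something along these lines (the DSN, credentials, table and field names are all made up), so that only the columns that are actually needed cross the network:

dStart = DATE() - 30          && hypothetical cut-off date
lnHandle = SQLCONNECT("MyDSN", "user", "password")
IF lnHandle > 0
	SQLEXEC(lnHandle, ;
		"SELECT order_id, status_id, amount FROM orders " + ;
		"WHERE order_date >= ?dStart", "c_orders")
	SQLDISCONNECT(lnHandle)
ENDIF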

Walter,