>We're getting ready to do a major upgrade here (from NT 4.0 Server to 2000 Server and Win 95 Clients to XP Prof. clients).
>
>We, of course, use a VFP app that requires a lot of data entry/retrieval throughout the day - as always - speed is essential. The app resides on the client while the data is on the server. We're not using SQL server - just plain old VFP. We're approaching 1 GB of data (861 MB to be exact).
>
>It would be great if some of you would offer suggestions as to how our network should be set up to achieve the best performance possible. Things like:
>
>How much benefit will we see (from VFP) with a two-processor server compared to a single processor?
>How much RAM in the server?
>Would a separate machine that would share the VFP data provide any benefit?
>Most of the activity is on the third floor; would placing an additional server there help?
>If we could have a separate server to handle MS Exchange - would that help?
>
>Any other ideas would be appreciated. Thanks for your help!
I have a few suggestions that are not completely network-related.
Don't forget that VFP is limited to 2 GB per file. I assume that 861 MB is the total size, so that should be no problem. But keep an eye on your largest file (DBF, FPT memo, or CDX).
Use views to access and edit small subsets of large tables. Be sure they are optimized. Read my FAQ, and other articles, on query optimization.
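For example, a parameterized local view lets the client pull only the rows it needs across the wire. A minimal sketch - the table and field names (orders, cust_id) are hypothetical, adjust to your own schema:

```foxpro
* Parameterized local view: only rows matching ?pcCustId
* travel over the network, not the whole 861 MB table.
CREATE SQL VIEW vOrders AS ;
	SELECT * FROM orders WHERE orders.cust_id = ?pcCustId

pcCustId = "C1001"
USE vOrders		&& retrieves just that customer's rows
* ...edit locally, then TABLEUPDATE() to write changes back
```

The WHERE condition is Rushmore-optimizable as long as cust_id has a matching index tag on the base table.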
Also be sure that queries for reports are optimized.
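One quick way to check is VFP's ShowPlan output via SYS(3054), which reports the Rushmore optimization level of each query (the query itself is a hypothetical example):

```foxpro
SYS(3054, 1)	&& turn ShowPlan on (available since VFP 5)
SELECT * FROM orders WHERE cust_id = "C1001" INTO CURSOR crTest
SYS(3054, 0)	&& turn ShowPlan off again
* Aim for "fully optimized" in the output; "partially optimized"
* usually means a matching index tag is missing.
```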
For a start, be sure NOT to include an index on DELETED() on your tables - this will often degrade performance, especially on large tables and over the network.
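If such a tag already exists, it can simply be dropped - the table and tag names below are assumptions, substitute your own:

```foxpro
* Remove a hypothetical index tag on DELETED().
* Requires exclusive access to the table.
USE orders EXCLUSIVE
DELETE TAG DelTag
USE
```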
HTH, Hilmar.
Difference in opinions hath cost many millions of lives: for instance, whether flesh be bread, or bread be flesh; whether whistling be a vice or a virtue; whether it be better to kiss a post, or throw it into the fire... (from Gulliver's Travels)