Would a 1.7 Gig, 11 million record table be big enough for you?
A properly designed application will almost entirely eliminate data corruption. Use views with buffering for updates, and plain SQL queries for reporting.
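A minimal sketch of the view-plus-buffering approach (the view name, parameter, and field names here are hypothetical, just to show the shape):

```
* Parameterized local view so each user pulls only the rows they need
CREATE SQL VIEW lv_customer AS ;
    SELECT * FROM customer WHERE custid = ?vp_custid

vp_custid = 12345
USE lv_customer
CURSORSETPROP("Buffering", 5)   && 5 = optimistic table buffering

* ... user edits the buffered view cursor here ...

IF !TABLEUPDATE(.T., .F.)       && commit all buffered rows
    TABLEREVERT(.T.)            && back out on failure
ENDIF
```

Because the edits sit in the buffer until TABLEUPDATE(), a crash mid-edit leaves the backing table untouched, which is most of the corruption protection right there.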
The biggest issue you have is non-indexed queries. You can still speed those up, for example by retrieving a subset using an indexed field first, then running the non-indexed query against the intermediate cursor.
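Something like this (table and field names are hypothetical):

```
* Step 1: Rushmore-optimized pull on an indexed column
SELECT * FROM orderform ;
    WHERE custid = m.lnCustId ;          && custid is indexed
    INTO CURSOR csrSubset

* Step 2: run the non-optimizable condition against the small cursor
SELECT * FROM csrSubset ;
    WHERE "RUSH" $ UPPER(shipnote) ;     && no index on shipnote
    INTO CURSOR csrResult
```

The expensive scan now touches a few hundred rows in a local cursor instead of 700,000 rows across the LAN.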
>A client wants to run a VFP system with
>Three Tables...
> Customer table up to 50,000 records
> Person table, linked to above, up to 300,000 records
> Order Form Table, linked to Customer, up to 700,000 records
>
>Under LAN I am afraid of data corruption and speed issues, as the software does many different queries that are not indexed.
>
>Under SQL I am not concerned with data corruption, but am still concerned with speed, especially if 30 people query the Order Form Table simultaneously.
>
>Has anyone ever dealt with tables of these sizes and this many users? ALL input would be helpful.
>
>Thanks,
>Glenn
Craig Berntson
MCSD, Microsoft .Net MVP, Grape City Community Influencer