Hi Mark,
>We have a VFP 5.0 application that processes a table with 1.2 million records, and it runs far more slowly than it does with 500,000 records.
>The process is a simple matter of grouping each record by one of three fields; i.e., find a matching record on field1; if no match, find a matching record on field2; if no match, find a matching record on field3. Each of these fields has an index for this purpose.
>We are using a new 450MHz PII with 128MB RAM. We have tried several things, such as SYS(3050), and numerous coding methods, but basically, when the table reaches a certain size, the process grinds almost to a halt. As I mentioned, 500,000 records can be done in about 20 minutes, whereas 1.2 million is only 13% complete after 2 days!
How do you process the table? Are you sure the query is 100% Rushmore optimizable? That is: is there an index on DELETED() (needed when SET DELETED is ON), do the index expressions match EXACTLY the expressions used in the filter or FOR clause, and does the COLLATE sequence match the one the indexes were built with? A partially optimized query can force a full table scan, which would explain why the slowdown is far worse than linear in the record count.
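As a sketch of what I mean (table and field names here are placeholders, not from your application), the tags would need to look something like this, and SYS(3054) will report the optimization level of a subsequent SELECT:

```foxpro
* Sketch only -- "mytable", "field1" etc. are placeholder names.
* Rushmore can only use a tag whose expression matches the
* filter expression exactly, character for character.
USE mytable
INDEX ON field1 TAG field1
INDEX ON field2 TAG field2
INDEX ON field3 TAG field3
INDEX ON DELETED() TAG deleted   && needed when SET DELETED is ON

* Turn on SQL ShowPlan to see whether the query is fully,
* partially, or not optimized:
SYS(3054, 1)
SELECT * FROM mytable WHERE field1 = m.lcKey
SYS(3054, 0)
```

If ShowPlan reports anything less than full optimization, check the index expressions and the collation first.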
Sometimes, when you modify a lot of records in a loop, it helps to close all tables and open them again.
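Something along these lines between large batches (again, the table name is a placeholder):

```foxpro
* Sketch: after a large batch of updates, close everything and
* reopen the work table to release buffers before continuing.
CLOSE TABLES ALL
USE mytable ORDER TAG field1
```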
Christof