>Since the expression was Rushmore-optimizable, I used a SCAN loop to pre-process some data for a report (it was too tricky to do with SQL) and found that there were tremendous delays in processing a relative handful of records from a fairly large table (about 100,000 records). So, I ended up SEEKing followed by a DO WHILE loop, which worked great.
>
>Is SCAN something to avoid with large tables, or did I just miss something? (My FOR expression was definitely optimizable.)
I have seen in the past that SCAN may be slow on large tables because the table being scanned for a certain expression has an index open at the time. It could also be a memory problem: a scan seemed to go fairly quickly after a reboot, but as I used my computer through the day and resources were consumed, the scan would slow down. Just my guess, but I use SCAN...ENDSCAN with a FOR condition and don't seem to have too many slowness problems. Make sure the table is indexed on exactly what is in your FOR clause.
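For what it's worth, here is a minimal sketch of the two approaches being compared. The table name CUSTOMER, the STATE field, and the tag name are all made up for illustration; the point is that the FOR expression matches the index tag exactly so Rushmore can use it:

```
* Sketch only -- assumes CUSTOMER.DBF with an index tag on STATE.
USE customer ORDER TAG state

* Rushmore-optimizable form: the FOR expression matches the index expression.
SCAN FOR state = "TX"
    * ... process the record ...
ENDSCAN

* The SEEK + DO WHILE alternative the original poster fell back to.
* SEEK positions on the first match, then the loop walks only matching rows.
SEEK "TX"
DO WHILE NOT EOF() AND state = "TX"
    * ... process the record ...
    SKIP
ENDDO
```

If the two differ badly in speed on the same data, that usually points at something outside the loop itself (index not matching the FOR expression, or resources as noted above).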
Bret Hobbs
"We'd have been called juvenile delinquents only our neighborhood couldn't afford a sociologist." Bob Hope