Two loop or not to loop?
General information
Forum: Visual FoxPro
Category: Coding, syntax & commands, Miscellaneous
Thread ID: 00938414
Message ID: 00938665
Posted: 02/09/2004 04:55:50, in reply to a message of 01/09/2004 13:36:53
Hi All

Thanks for the feedback. Here is more info:


TG) I suspect your speed issue is disk access. How about looping through the records and creating a cursor of the data that needs to go into the 9 files? Then, use APPEND FROM or INSERT INTO ... SELECT to add the records in bulk.

Tamar, you are right; the HDD light flashes continually when the process is slow.
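For reference, Tamar's suggestion could be sketched roughly as below. The cursor, table, and field names (c_stage, file1, cKey, nValue) and the source array laSrc are illustrative, not from the original thread; note that APPEND FROM DBF("cursor") relies on the cursor being backed by a temp file on disk.

```foxpro
* Sketch: stage the rows in a cursor, then bulk-append to one target.
* All names here are illustrative.
CREATE CURSOR c_stage (cKey C(10), nValue N(10,2))

FOR i = 1 TO ALEN(laSrc, 1)
    INSERT INTO c_stage (cKey, nValue) VALUES (laSrc[i,1], laSrc[i,2])
ENDFOR

* One bulk write per target table instead of record-by-record I/O.
SELECT file1
APPEND FROM DBF("c_stage")
```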


JN) Are you in a position to open the updateable tables EXCL? If so, that itself should make it significantly faster.

No, I can't, Jim. The files might be in use by other users.


JN) Do you have some constraint on VFP's memory usage in effect? If so, that likely is hurting you in THIS case. Since you are not seeing significant CPU use, it follows in my head that you are doing real I/O, probably more or less as it happens. The idea is to get as much data as possible held continuously in cache so that the process becomes more CPU intensive (by virtue of limiting the I/O).

Yes, I had set a memory usage limit. I am now experimenting with no limit at all; I will need to run this for a while to see the effects.
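In VFP the buffer memory cap is usually adjusted with SYS(3050); a minimal sketch follows. The 128 MB figure is illustrative, not a recommendation from the thread.

```foxpro
* Raise VFP's foreground buffer memory cap (value in bytes).
* The figure is illustrative; SYS(3050, 1, 0) restores the default.
= SYS(3050, 1, 128 * 1024 * 1024)
? SYS(3050, 1)    && query the current foreground setting
```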


TG) I believe processing would be faster if you write to each file in turn.
Reason: the index file has to be updated for each record, and more of the index could be cached.

Thomas, this was my thought too. I was hoping for a more definitive answer, but it looks like I will need to program the alternate-loop option and measure the effect.
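The alternate loop would look roughly like this: an outer pass per target table, so each table's .cdx stays hot in the cache, with the inner loop over the staged rows. The array laFiles and alias "target" are illustrative names, and the per-row work is left as a placeholder.

```foxpro
* Sketch of the alternate loop: one target table per outer pass,
* instead of hopping between all 9 tables for every record.
* laFiles and "target" are illustrative names.
DIMENSION laFiles[9]
* laFiles holds the 9 table names

FOR nFile = 1 TO 9
    SELECT 0
    USE (laFiles[nFile]) SHARED ALIAS target
    FOR i = 1 TO ALEN(laData, 1)
        * update-or-append each array row into "target" here
    ENDFOR
    USE IN target
ENDFOR
```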


TG) I think the size of the tables and the .cdx is the main factor here, since the machine seems well equipped. OTOH, since there is possible index degradation after ordered inserts, this might be less of a factor than I assume.

What I forgot to mention, and now realize is crucial, is that sometimes existing records need to be updated rather than only new records added, and that this update can affect both of the indexed fields, including the PK. The indexes are not large, BTW.


MY) I believe it would be far better to do one table at a time.

Mike, I cannot do a straight APPEND FROM because, as I omitted to say, sometimes existing records need to be updated rather than only new records appended. I need to check the values of each array row and, depending on certain conditions, either update an existing record (which requires a SEEK) or append a new one.
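That update-or-append decision per row is the classic SEEK pattern; a minimal sketch, assuming an open alias "target" with an index tag "pk" on the key, and illustrative field names cKey and nValue:

```foxpro
* Update-or-append each array row; all names here are illustrative.
SELECT target
SET ORDER TO TAG pk
FOR i = 1 TO ALEN(laData, 1)
    IF SEEK(laData[i,1])
        * key exists: update the record in place
        REPLACE nValue WITH laData[i,2]
    ELSE
        * key not found: append a new record
        INSERT INTO target (cKey, nValue) ;
            VALUES (laData[i,1], laData[i,2])
    ENDIF
ENDFOR
```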


WS) I would stick with a 'temp' file for the 1000 records you are processing

Bill, I don't think the array is the problem. If I skip the 9 "other" files and work only with the 1 main file, the loop is very fast. So I don't think array access is the bottleneck, but rather a case of (a) Tamar's point re HDD access and (b) Jim's point re caching the CDX and DBF.


It seems that I will need to program the alternate loop and test. Thanks, all.
In the End, we will remember not the words of our enemies, but the silence of our friends - Martin Luther King, Jr.