Level Extreme platform
TABLEUPDATE() a large number of records
25/08/2004 10:34:28
General information
Forum:
Visual FoxPro
Category:
Coding, syntax & commands
Miscellaneous
Thread ID:
00936296
Message ID:
00936313
Views:
28
I remember your post from yesterday. Now, how about something totally radical: why not install an instance of MSDE or SQL Server and archive the data to that? MSDE, I think, has a table size limit, but if it does, it is greater than VFP's.
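If you go that route, the archiving step could use VFP's SQL pass-through functions. This is only a rough sketch under assumptions: the "ArchiveDSN" ODBC data source, the Archive table, and its columns are all hypothetical names, and real code would need proper error handling.

```foxpro
* Sketch: push rows from a local "source" table to an Archive
* table on MSDE/SQL Server via SQL pass-through.
* "ArchiveDSN", "Archive", and the column names are assumptions.
LOCAL lnHandle
lnHandle = SQLCONNECT("ArchiveDSN")
IF lnHandle > 0
	SELECT source
	SCAN
		* ?source.id / ?source.info are VFP pass-through parameters
		IF SQLEXEC(lnHandle, ;
			"INSERT INTO Archive (id, info) VALUES (?source.id, ?source.info)") < 1
			* Error on the back end; stop and report
			AERROR(laErr)
			EXIT
		ENDIF
	ENDSCAN
	SQLDISCONNECT(lnHandle)
ENDIF
```

Wrapping the inserts in a server-side transaction (SQLSETPROP(lnHandle, "Transactions", 2)) would also let you commit in batches there, much like the buffered approach below.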

The other developer may have been committing buffered records that frequently because of network instability. Either that, or the perceived speed may have been better committing every 100th record instead of waiting until the end and then committing them all at once, where the wait could have been substantial. Of course, the overall time would still be the same. Maybe he also checked the table size after every 100 records to see if the 2 GB limit had been reached, instead of using a calculation.
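A rough sketch of that every-100th-record pattern, assuming table buffering is already on for a hypothetical "archive" alias being filled from a "source" alias (the alias names and field handling are illustrative, not your actual code):

```foxpro
* Sketch: commit buffered appends in batches of 100 and check the
* 2 GB limit by calculation rather than by re-reading the file size.
* Assumes CURSORSETPROP("Buffering", 5, "archive") was done earlier.
LOCAL lnCount
lnCount = 0
SELECT source
SCAN
	SCATTER MEMVAR
	SELECT archive
	APPEND BLANK
	GATHER MEMVAR
	lnCount = lnCount + 1
	IF MOD(lnCount, 100) = 0
		* .T., .T. = force-save all buffered rows in the alias
		IF !TABLEUPDATE(.T., .T., "archive")
			* Conflict or failure; handle/log it here
			EXIT
		ENDIF
		* Approximate file size: header + records * record width
		IF HEADER("archive") + RECCOUNT("archive") * RECSIZE("archive") ;
				> 2147483647 - RECSIZE("archive")
			* Near the 2 GB limit; stop or roll to a new table
			EXIT
		ENDIF
	ENDIF
	SELECT source
ENDSCAN
* Flush whatever is left in the buffer
TABLEUPDATE(.T., .T., "archive")
```

The batching does not make the total work smaller, but it shortens any single wait and limits how much is lost if a commit fails partway through.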

>Hi,
>
>I'm working on a mechanism that is going to archive a couple of large tables. (You may have read my question from yesterday.) A tremendous number of records would be TABLEUPDATEd at the same time. I saw somewhere else in our software that a developer was committing the buffers every 100th record. That developer no longer works here, so I can't ask him about it. Would it be a good idea for me to do something similar? Also, why would it (or not) be a good idea to do it that way? We are using VFP 6 with the VMP (version 4) framework.
>
>Thanks,
>Chris
Mark McCasland
Midlothian, TX USA