Tableupdate() a large amount of records
25/08/2004 11:49:11
General information
Forum:
Visual FoxPro
Category:
Coding, syntax & commands
Miscellaneous
Thread ID:
00936296
Message ID:
00936574
Views:
17
I guess I don't get it... I still see no need for buffering. The concept of buffering is that you get a local copy of data that already exists, work on it, update it, and then commit the changes to the underlying table. It is not meant for large insert operations.

If you create memory variables and INSERT INTO ... FROM MEMVAR, you are essentially waiting until the data is the way you want it before you create your archive set. Buffering is just overkill.

The process you need to employ sounds like this...

1. Loop through the existing master table and identify records to archive.
2. For each record, summarize the parent and child records into memory (via SELECT and/or m. assignments).
3. Insert the parent/child summary into the archive (INSERT INTO __archive FROM MEMVAR).
4. Mark the master table record as archived.

If the process fails anywhere - you just restart it and it picks up where it left off.
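A minimal VFP sketch of those steps, assuming hypothetical table and field names (hits, hits_child, hits_archive, an archived flag, and a dCutoff date, none of which are from the original post):

```foxpro
* Sketch only: all table and field names here are placeholders.
USE hits IN 0 ALIAS master
SELECT master
SCAN FOR NOT archived AND hit_date < dCutoff
    * Copy the parent record's fields into m. variables
    SCATTER MEMVAR
    * Summarize the child records for this parent
    SELECT SUM(hit_count) FROM hits_child ;
        WHERE parent_id = m.id ;
        INTO ARRAY laTotal
    m.child_total = laTotal[1]
    * Write the combined summary row to the archive table
    INSERT INTO hits_archive FROM MEMVAR
    * Flag the source record so a rerun skips it
    SELECT master
    REPLACE archived WITH .T.
ENDSCAN
```

Because each record is flagged only after its summary row is written, rerunning the loop after a failure simply skips the rows already done.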





>Hi Wayne,
>
>Thanks for your reply. I was actually wondering the same thing. It's actually two tables I'm archiving (a parent table plus a child table with a one-to-many relationship). The archived versions of the tables are VFP tables, but in a different DBC than the app's (so reindexing won't touch the archived tables, and the app won't depend on the archived tables being present or not).
>
>I decided to go with buffering because I am actually summarizing the data to a table in the main DBC as it goes. If something should go wrong, I want the summarized table to match what's in the archive tables. That way they could just run it again without our intervention.
>
>I do have one question though. The data to be archived are actually hit counts imported from a web add-on. Currently, when the data is being imported from the web there is no buffering, because that's the only place (until the archiving idea) that the records get updated. If I set buffering on the tables during the archiving process, would it cause a problem if someone tried to import the data (not buffered) at the same time? The records never get edited; they are just added via the import process and deleted via the archiving process.
>
>Thanks,
>Chris
Wayne Myers, MCSD
Senior Consultant
Forte' Incorporated
"The only things you can take to heaven are those which you give away" Author Unknown