>Hey Dragan,
>
>Currently we simply use Windows Explorer to copy the files. I'm not sure about the FileToStr solution, as we have 300 tables with associated memo and index files, and some of the larger files are about 1 GB in size. It seems like memory usage would blow up. Also, I'm not sure how you can keep related DBF, CDX and FPT files in sync unless you have an exclusive lock on the table, or scan through all the records and RLock as you go. I wonder what 24/7 businesses who use native tables do to keep their development and production data straight.
You run such a big 24/7 operation and still hope to be able to copy your files over?
This would definitely call for a timestamp-based update scheme - and I think you could apply some locking mechanism to preserve the integrity of related tables while updating. In my scheme, I've developed a set of custom objects: the header table updates record by record, and each related detail table updates in sets of records tied to the parent. Here's the relevant bit of code:
For i = 1 To This.nKids
    * Get the current parent record's primary key value
    luKey = Evaluate("oRec." + This.PKey)
    * Have the i-th child object export its set of detail rows for this key
    This.oKids[i].Export(luKey)
Endfor
This works on several levels, if need be - there can be grandkids etc.
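If it helps to see the shape of the scheme outside of VFP, here's a minimal sketch in Python. The table names, the "updated_at" timestamp column, and the lists of dicts standing in for DBFs are all my own illustrative assumptions, not your actual classes or schema:

```python
from datetime import datetime

class Exporter:
    """Sketch of a timestamp-driven header/detail export.

    The header level exports its own changed rows record by record,
    then asks each child Exporter to export the set of detail rows
    belonging to that parent key. Kids can have kids, so grandkids
    and deeper levels come for free via the recursion."""

    def __init__(self, rows, pkey, fkey=None, kids=None):
        self.rows = rows          # list of dicts standing in for a DBF
        self.pkey = pkey          # primary key field name
        self.fkey = fkey          # field linking detail rows to the parent
        self.kids = kids or []    # child Exporter objects
        self.exported = []        # collects what we "copied"

    def export_changed(self, since):
        # header level: record by record, filtered by timestamp
        for rec in self.rows:
            if rec["updated_at"] >= since:
                self.exported.append(rec)
                for kid in self.kids:
                    kid.export_set(rec[self.pkey])

    def export_set(self, parent_key):
        # detail level: export the whole set of rows for this parent,
        # then recurse into grandkids keyed on each detail row
        for rec in self.rows:
            if rec[self.fkey] == parent_key:
                self.exported.append(rec)
                for kid in self.kids:
                    kid.export_set(rec[self.pkey])

# tiny demo with made-up tables: one header (orders), one detail (items)
items = Exporter(rows=[{"item_id": 10, "order_id": 1}],
                 pkey="item_id", fkey="order_id")
orders = Exporter(rows=[{"id": 1, "updated_at": datetime(2024, 1, 2)},
                        {"id": 2, "updated_at": datetime(2023, 6, 1)}],
                  pkey="id", kids=[items])
orders.export_changed(since=datetime(2024, 1, 1))
# order 1 and its one item get exported; order 2 is untouched
```

The detail level deliberately moves whole parent-keyed sets rather than timestamp-filtering each row, which is one way to keep related tables consistent per parent.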
I think something like this could work for you - of course, it would depend on your tables and how they are related.