Hello Ben
How about using
USE BIGTABLE EXCLUSIVE
COPY TO BIGCOPY FOR NOT DELETED() WITH CDX
USE  && CLOSE THE TABLE BEFORE RENAMING
DELETE FILE BIGTABLE.*  && RENAME FAILS IF THE TARGET FILES EXIST
RENAME BIGCOPY.* TO BIGTABLE.*
OR
USE BIGTABLE
COPY STRUCTURE TO BIGCOPY WITH CDX
USE BIGCOPY EXCLUSIVE  && APPEND TARGET MUST BE THE CURRENT TABLE
APPEND FROM BIGTABLE FOR NOT DELETED()
USE  && CLOSE BEFORE RENAMING
DELETE FILE BIGTABLE.*  && RENAME FAILS IF THE TARGET FILES EXIST
RENAME BIGCOPY.* TO BIGTABLE.*
* REQUIRES EXCLUSIVE USE OF THE TABLE
* WITH SET DELETED ON YOU CAN DROP THE FOR NOT DELETED() CLAUSES
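If you want a little insurance before throwing the original away, here's a rough sketch along the same lines. It assumes BIGTABLE is a free table (not in a DBC), nobody else has it open, and the BIGCOPY.* names are free to use; adjust the file list to whatever BIGTABLE actually has on disk:

```foxpro
* SKETCH ONLY - VERIFY THE COPY BEFORE DELETING THE ORIGINAL
USE BIGTABLE EXCLUSIVE
COUNT FOR NOT DELETED() TO lnExpected
COPY TO BIGCOPY FOR NOT DELETED() WITH CDX
USE BIGCOPY EXCLUSIVE
IF RECCOUNT() = lnExpected
    USE  && CLOSE BEFORE RENAMING
    DELETE FILE BIGTABLE.DBF
    RENAME BIGCOPY.DBF TO BIGTABLE.DBF
    DELETE FILE BIGTABLE.CDX
    RENAME BIGCOPY.CDX TO BIGTABLE.CDX
    IF FILE("BIGTABLE.FPT")  && MEMO FILE, IF THE TABLE HAS ONE
        DELETE FILE BIGTABLE.FPT
        RENAME BIGCOPY.FPT TO BIGTABLE.FPT
    ENDIF
ELSE
    USE
    ? "RECORD COUNT MISMATCH - ORIGINAL LEFT UNTOUCHED"
ENDIF
```

The COUNT/RECCOUNT() check is just a sanity test; a backup of the original files first is still the safest bet.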
BOB
>Hi there,
>
>I have a very large VFP table (1.3 to 1.7 gigs) that I archive records from and pack on a monthly basis. Because of the size of the table, the pack only works about half the time; when it fails, it gets a "Not enough memory for file mapping" error.
>
>Through trial and error, I've found a better success rate on the pack by doing it on the server with the most free disk space, the most RAM (2 gigs), and the fastest single processor (3.2 GHz). That particular server runs Windows 2000 Server. We have other, newer servers running Windows Server 2003 with dual 2.0 GHz processors, but the pack repeatedly fails on them.
>
>My question to everyone is this: Is there a technique I can use to improve my chances of a successful pack on such a large table? Is there an alternative to PACK that gives the same result without the error? I'm open to any and all suggestions.
>
>Thanks in advance,
>
>Ben Holton
'If the people lead, the leaders will follow'
'War does not determine who is RIGHT, just who is LEFT'