For a quick band-aid you can move the large right-padded char fields to memo fields - the memo data lives in the FPT, so the DBF and FPT each get their own 2GB and you roughly double your headroom. You can also split the >2GB csv with scripting host calls into two files VFP can APPEND FROM (rough sketch below), or ZAP the table before the second append - dunno if your pruning routine needs the full set of data in the import table at once. Should buy you enough time to figure out your next steps...
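
Something like this for the split - untested sketch, all paths and names (import.csv, part1.csv, part2.csv, imp_table) are placeholders, and it assumes your csv has a header row (APPEND FROM ... TYPE CSV skips the first line, so the header gets written into both halves):

loFSO = CREATEOBJECT("Scripting.FileSystemObject")
loIn  = loFSO.OpenTextFile("c:\data\import.csv", 1)    && 1 = ForReading
lcHdr = loIn.ReadLine()            && header row, repeated in both halves
loOut = loFSO.CreateTextFile("c:\data\part1.csv", .T.)
loOut.WriteLine(lcHdr)
lnLine = 0
DO WHILE !loIn.AtEndOfStream
    loOut.WriteLine(loIn.ReadLine())
    lnLine = lnLine + 1
    IF lnLine = 5000000            && pick a count near your file's halfway point
        loOut.Close()
        loOut = loFSO.CreateTextFile("c:\data\part2.csv", .T.)
        loOut.WriteLine(lcHdr)
    ENDIF
ENDDO
loIn.Close()
loOut.Close()

USE imp_table EXCLUSIVE            && ZAP needs exclusive use
APPEND FROM c:\data\part1.csv TYPE CSV
* ...pruning pass on the first half here...
ZAP                                && only if pruning doesn't need both halves at once
APPEND FROM c:\data\part2.csv TYPE CSV

And the memo conversion is one ALTER TABLE per wide char column, e.g. (assuming a column called notes):

ALTER TABLE imp_table ALTER COLUMN notes M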
>Thanks, Sergey.
>
>>The 2GB limit comes from the 32-bit Windows API that VFP uses, so it affects the size of any file in VFP.
>>
>>>I have a csv file that I have been processing for years. The first step uses an APPEND statement to get the first set of records I need into a VFP table; other processing follows after that.
>>>
>>>This file grows with every new version (once a week). Last week the file finally reached 2GB in size. The append statement generated an error saying file too large.
>>>
>>>I take it the 2GB limit does not just apply to tables, but to other data file types as well?
>>>
>>>I am trying to rewrite some of the routines in SQL Server, but I'm finding the processing from a raw csv file to be slower than what VFP was able to do...
>>>
>>>Thanks,
>>>KP
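
PS - re the SQL Server rewrite mentioned above: if the csv parsing is the slow part, BULK INSERT straight from the file is usually the fast path. Hedged sketch driven from VFP - the connection string, table, and path are placeholders, the file must be readable by the SQL Server service account, and a plain FIELDTERMINATOR = ',' won't cope with quoted fields that contain commas:

lnH = SQLSTRINGCONNECT("Driver={SQL Server};Server=myserver;Database=mydb;Trusted_Connection=Yes")
IF lnH > 0
    lcSql = "BULK INSERT dbo.ImportTable FROM 'c:\data\import.csv' " + ;
            "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2)"
    IF SQLEXEC(lnH, lcSql) < 1
        AERROR(laErr)
        ? laErr[2]                 && show what the server complained about
    ENDIF
    SQLDISCONNECT(lnH)
ENDIF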