General information
Title:
Table Size Limitations
I have six text files, roughly 2GB each, with about 20 million records per file, that we want to import into separate tables and then union into one table. I wrote an import program that scans each file with FGETS(), parses each line into about 9 fields, and inserts the result into my table. After about 15.8 million records I get error 1150, "Not enough memory for file map," and cannot continue. The table size is now about 2GB. I thought FoxPro's table restriction was a billion records and hard drive space. I added one record to the table and then closed it. Now when I try to open it, FoxPro tells me that it is not a DBF file. Any thoughts?
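For context, the import loop described above might look something like the following minimal FoxPro sketch. The file name, delimiter, table name, and field names are all assumptions for illustration; they are not from the original post.

```foxpro
* Sketch of the described import loop. "datafile1.txt", the comma
* delimiter, and table/field names (mytable, f1..f9) are hypothetical.
lnHandle = FOPEN("datafile1.txt")
IF lnHandle < 0
    ? "Could not open file"
    RETURN
ENDIF

DO WHILE NOT FEOF(lnHandle)
    * Read one line (up to 8192 characters) from the text file
    lcLine = FGETS(lnHandle, 8192)

    * Parse the line into 9 fields, assuming comma-delimited data
    DIMENSION laFields(9)
    FOR lnI = 1 TO 9
        laFields(lnI) = GETWORDNUM(lcLine, lnI, ",")
    ENDFOR

    * Insert one record per line read
    INSERT INTO mytable (f1, f2, f3, f4, f5, f6, f7, f8, f9) ;
        VALUES (laFields(1), laFields(2), laFields(3), ;
                laFields(4), laFields(5), laFields(6), ;
                laFields(7), laFields(8), laFields(9))
ENDDO

FCLOSE(lnHandle)
```

Note that a .dbf file (plus any memo file) cannot exceed 2GB regardless of the record-count limit, which is consistent with the failure appearing once the table reached about 2GB.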