Bulk loading of text data (CSV files)
From: 24/12/1998 10:28:44
To: 24/12/1998 02:42:05
Forum: Visual FoxPro
Category: Coding, syntax & commands (Miscellaneous)
Thread ID: 00169303
Message ID: 00170263
Views: 33
>Josh and David,
>
>Thanks for your reply. My original tests were showing that it would take about 10 hours to process 100,000 records using fopen(), fgets(), etc. I felt that something was wrong here, so I "played" some more. My test was simple, so it should have been fast. I was printing the record to the screen (using ?) for each record read in. Guess what? That little old "print" statement (there for debugging purposes only) was the killer. Removing the print statement made the low-level file handling statements work the way I originally expected: 15 to 20 seconds total to read 120,000 records (300-400 bytes each), not 10 hours!
>
>Actually, I am quite familiar with "append from". The "append from" command takes about 50 seconds to read in my 120,000 records, and of course that includes the extra overhead of parsing the text string and populating a DBF. It too is quite fast.
>
>My whole issue, though, is dealing with embedded double quotes and embedded commas in the data, which the "append from" command cannot handle. Commas are not a real issue, but the double quotes are. FoxPro simply cannot deal with them. Excel and Access can handle them very nicely. Even if I "escape" the double quote with another double quote (e.g. "Bandage 4"" Square"), FoxPro fails. Other apps import this with no problems.
>
>I am in control of the creation of the original text files, but a requirement is that the text file be written in a format that can be read into Access if the customer wants to (why? only God knows!). The embedded commas and quotes are going to get me on this one. I can make it all work nicely if I use a different separator such as ASCII 11 (vertical tab); I'm told by our C programmers that it is a commonly used delimiter. But I can't get Excel or Access to recognize anything but the standard comma, tab, space, or some other "typeable" character.
>
>So I am left with the low-level commands to read in each record and parse the data, and the parsing is taking too long. Well, I've talked too much! But now that you know the whole story, if anyone has run across similar problems, let me know.
>
>Thanks
>Dean

Hi Dean,
I have dealt with text files in several of my programs, both with APPEND FROM and using fgets() and the other low-level file commands. We recently ran into a situation similar to yours on a new project. I am the only Fox programmer in a VB house, but the VB guys say they have had great success with schemas in VB for importing these files. You may want to look at those. They say that once you get one set up for your file layout, they are extremely fast. Hope this helps.
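
If the "schemas" the VB guys are using turn out to be the Schema.ini files that the Jet/ODBC text driver reads, the idea is a small layout description that sits in the same directory as the text file. Just as a rough sketch (the file name and column names below are placeholders, not your actual layout), it would look something like this:

[ORDERS.CSV]
ColNameHeader=False
Format=CSVDelimited
MaxScanRows=0
CharacterSet=ANSI
Col1=PartName Text Width 50
Col2=Qty Integer
Col3=Price Float

If that is what they mean, the useful part is that Format=CSVDelimited is supposed to treat a comma inside a quoted field as data and "" as an escaped quote, so the file could stay plain comma-delimited and still be readable by Access without switching to an odd separator like ASCII 11.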
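
And if you end up staying with the low-level functions, here is a rough sketch of the kind of quote-aware splitter I mean - treat it as a starting point only. DATA.TXT, MYTABLE, PARTNAME and QTY are just placeholder names, and the 25-element array is an arbitrary guess at a maximum column count:

* Read a comma-delimited file with FOPEN()/FGETS() and split each
* record on commas, honoring double-quoted fields and "" as an
* escaped quote inside a field.
LOCAL lnHandle, lcLine, lnCount
DIMENSION laFields[25]
lnHandle = FOPEN("DATA.TXT")
IF lnHandle < 0
    WAIT WINDOW "Could not open DATA.TXT"
    RETURN
ENDIF
DO WHILE !FEOF(lnHandle)
    lcLine  = FGETS(lnHandle, 512)      && records run 300-400 bytes
    lnCount = ParseCsvLine(lcLine, @laFields)
    INSERT INTO mytable (partname, qty) ;
        VALUES (laFields[1], VAL(laFields[2]))
ENDDO
=FCLOSE(lnHandle)
RETURN

FUNCTION ParseCsvLine
* Splits tcLine into the array passed by reference and returns the
* number of fields found.
LPARAMETERS tcLine, taFields
LOCAL lnField, lcField, lnPos, lcChar, llInQuotes
lnField    = 1
lcField    = ""
llInQuotes = .F.
lnPos      = 1
DO WHILE lnPos <= LEN(tcLine)
    lcChar = SUBSTR(tcLine, lnPos, 1)
    DO CASE
    CASE lcChar == ["] AND llInQuotes AND SUBSTR(tcLine, lnPos + 1, 1) == ["]
        * An escaped quote ("") inside a quoted field - keep one quote
        lcField = lcField + ["]
        lnPos   = lnPos + 1
    CASE lcChar == ["]
        * Opening or closing quote - flip the state, do not store it
        llInQuotes = !llInQuotes
    CASE lcChar == "," AND !llInQuotes
        * An unquoted comma ends the current field
        taFields[lnField] = lcField
        lnField = lnField + 1
        lcField = ""
    OTHERWISE
        lcField = lcField + lcChar
    ENDCASE
    lnPos = lnPos + 1
ENDDO
taFields[lnField] = lcField             && last field has no trailing comma
RETURN lnField

The only real work is the state flag: inside quotes a comma is data, outside quotes it ends a field, and that is the part APPEND FROM will not do for you.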

Merry Christmas!