Low-level read-write optimization for a very big file
Forum: Visual FoxPro
Category: Coding, syntax & commands, Miscellaneous
Thread ID: 00733923
Message ID: 00733987
>>Gerard,
>
>GéraLd :-)

Sorry about that typo.

>Fox was supposed to be able to handle a maximum # of characters per character string or memory variable of 16,777,184 bytes...
>But at what price?

It can indeed, but no language is going to be very fast at appending a single character to a huge string over and over.
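
To put a number on it, here's a quick timing sketch (mine, not from the thread). On any implementation where each concatenation has to copy the whole buffer, doubling the loop count roughly quadruples the elapsed time:

* Timing sketch: build a string one character at a time.
* If each concatenation re-copies the buffer, total work grows
* quadratically with the final length; time it on your VFP version.
LOCAL lcBig, lnI, lnStart
lcBig = ""
lnStart = SECONDS()
FOR lnI = 1 TO 100000
   lcBig = lcBig + "x"
ENDFOR
? "100,000 one-character appends:", SECONDS() - lnStart, "seconds"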

>The program needs to determine the format of ANY comma-delimited file that is given to it.
>I know that Fox is able to guess the structure of the data in the file, but the client wants the file parsed and validated.
>After the parsing, a table is created based on the array values.
>The output file is then appended to the table.

Ok, I'd make this a multipass process (there's a rough sketch of the whole thing after the steps):

1) read the first line of the file to see how many commas there are; that + 1 tells you how many columns (n) the cursor needs.

2) create an array laMaxSize of that many elements, initialized to 0

3) fgets the file one line at a time, measuring every value:

   n = alines( laRow, lcLine, .f., ',' )
   for i = 1 to n
      laMaxSize[i] = max( laMaxSize[i], len( laRow[i] ) )
   endfor

4) now build a cursor temp1 of n columns, each one type C with a width of laMaxSize[i]

5) append from thefile.csv type csv (note: type csv treats the first line as field names and skips it; use type delimited if the file has no header row)

6) scan temp1 to check for real field types

7) create the real table

8) scan temp1 again and insert the rows that validate into the real table

This way there's no need to build huge strings; the whole task is a few linear passes through the file and the cursor.
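
Put together, the pipeline looks roughly like this. It's only a sketch under a few assumptions of mine: the file name, the colN field names, and a minimum width of 1 (VFP won't create a C(0) field) aren't from the steps above, and the type detection/validation in the final scan is left as a stub for the client's real rules.

LOCAL lcFile, lnHandle, lcLine, lnCols, lnCount, lnI, lcFields
LOCAL laRow[1], laMaxSize[1]
lcFile = "thefile.csv"

* step 1: column count = commas in the first line + 1
lnHandle = FOPEN(lcFile)
lcLine = FGETS(lnHandle, 8192)  && fgets reads at most 8192 bytes per line
lnCols = OCCURS(",", lcLine) + 1

* step 2: one max-width slot per column, minimum width 1
DIMENSION laMaxSize[lnCols]
STORE 1 TO laMaxSize

* step 3: walk the whole file, tracking the widest value per column
FSEEK(lnHandle, 0)
DO WHILE !FEOF(lnHandle)
   lcLine = FGETS(lnHandle, 8192)
   lnCount = ALINES(laRow, lcLine, .F., ",")
   FOR lnI = 1 TO MIN(lnCount, lnCols)  && guard against ragged rows
      laMaxSize[lnI] = MAX(laMaxSize[lnI], LEN(laRow[lnI]))
   ENDFOR
ENDDO
FCLOSE(lnHandle)

* step 4: an all-character cursor sized from laMaxSize
lcFields = ""
FOR lnI = 1 TO lnCols
   lcFields = lcFields + IIF(lnI = 1, "", ", ") + ;
      "col" + TRANSFORM(lnI) + " C(" + TRANSFORM(laMaxSize[lnI]) + ")"
ENDFOR
CREATE CURSOR temp1 (&lcFields)

* step 5: type csv skips the header line; use type delimited if none
SELECT temp1
APPEND FROM (lcFile) TYPE CSV

* steps 6-8: inspect each column to pick real types (all digits ->
* numeric, CTOD() parses -> date, and so on), CREATE TABLE from what
* the scan finds, then scan temp1 again inserting the rows that validate
SCAN
   * ...type detection / validation rules go here...
ENDSCAN

The fgets loop and the append are the only trips through the file itself; everything after that works on the cursor, so you never come anywhere near the 16,777,184 byte string limit.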
df (was a 10-time MVP)

df FoxPro website
FoxPro Wiki site online, editable knowledgebase