Low level read-write optimization for very big file
Message
General information
Forum:
Visual FoxPro
Category:
Coding, syntax & commands
Miscellaneous
Thread ID:
00733923
Message ID:
00733984
Views:
19
Gerard,

Reply order is kinda swapped around below.

>Anything wrong in this?
>:-)

Yeah, it takes too long. *L*

>Assuming that there is 6 fields by row.
>After 100 000 rows parsed from the file, the array contain 6 strings of 100 000 chars.
>When the entire file was read, a table can be created base on the information in array.

So for every line in the file you are appending a single character to 6 different array elements. This is your non-constant-time operation.
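To illustrate the point (a hedged sketch, not code from the thread): if I recall correctly, VFP 7 and later optimize `cVar = cVar + cMore` for plain memory variables so the append happens in place, but that optimization does not apply to array elements, where each concatenation copies the entire growing string.

```
* Illustrative sketch only. The variable names are made up, and the
* claim about the in-place append optimization applying to memory
* variables but not array elements is from memory, so verify it.
LOCAL lcTypes, lnRow
lcTypes = ""
FOR lnRow = 1 TO 100000
    lcTypes = lcTypes + "C"        && fast: appended in place (VFP 7+)
ENDFOR

DIMENSION laTypes[6]
laTypes[1] = ""
FOR lnRow = 1 TO 100000
    laTypes[1] = laTypes[1] + "C"  && slow: copies the whole string each pass
ENDFOR
```

If that is the culprit, accumulating into 6 scalar variables and only assigning them into the array at the end would avoid the quadratic cost.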

Is it really necessary to know, for every line of the file, what type of data was in each field? Why do you need 100,000 Ts in one of the array elements?

Why don't you give us a better description, in words, of what you are really trying to accomplish here?

Can you also show us an example of 2-3 input lines and the 2-3 output lines they'd generate?

>Step 1: Read one line
>
>Step 2: The second step was the validation of all chars in the string.
> I have the replace every chars < chr(32) by a "?" in the string.
> After this validation, the string was put in an array using ALINES.
> Each rows of the array was tested to determine the type of data (date,datetime,numeric and so on).
> The result was put in another array. If there is 6 "fields" by row, the array have 6 elements.
> For each row read from the file, 1 character was added to each of the array's elements (C for char, T for datetime, etc...).
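For reference, the quoted steps for a single line might be sketched roughly like this (the sample line, field separator, and variable names are my assumptions, not from the thread; the parse-character parameter of ALINES() needs VFP 8 or 9):

```
* Hedged sketch of the quoted steps for one line. Everything here
* except CHRTRAN(), REPLICATE(), ALINES(), CTOD() is an assumption.
LOCAL lcLine, lcClean, lcCtrl, lcField, lnFields, lnI
lcLine = "123,Smith,12/01/2002,Y"          && one line read from the file

* Build the list of chars below CHR(32) once, then map them all to "?"
lcCtrl = ""
FOR lnI = 0 TO 31
    lcCtrl = lcCtrl + CHR(lnI)
ENDFOR
lcClean = CHRTRAN(lcLine, lcCtrl, REPLICATE("?", 32))

* Split the validated line into its fields
lnFields = ALINES(laFields, lcClean, 1, ",")

* Determine a type character for each field
FOR lnI = 1 TO lnFields
    lcField = ALLTRIM(laFields[lnI])
    DO CASE
    CASE NOT EMPTY(CTOD(lcField))          && parses as a date
        ? "D"
    CASE ISDIGIT(LEFT(lcField, 1))         && crude numeric check
        ? "N"
    OTHERWISE
        ? "C"
    ENDCASE
ENDFOR
```

This only covers one pass over one line; the accumulation of those type characters across 100,000 rows is where the cost question above comes in.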
df (was a 10 time MVP)

df FoxPro website
FoxPro Wiki site online, editable knowledgebase