Low-level read-write optimization for a very big file
Message
From: Hilmar Zonneveld, Independent Consultant, Cochabamba, Bolivia
Date: 17/12/2002 19:04:16

General information
Forum: Visual FoxPro
Category: Coding, syntax & commands (Miscellaneous)
Thread ID: 00733923
Message ID: 00733932
Views: 25
This message has been marked as one that helped answer the initial question of the thread.
>(VFP 7 SP1 on Windows 2000 or Windows XP Pro)
>
>I have to perform 3 operations on a text file:
>
>1 - Read a line from the text file.
>2 - Perform some operations on the string.
>3 - Write the line to another file.
>
>The text file is really big (950 MB).
>
>The program runs fast for the first ~50 MB, but after that it starts to slow down (crawl).
>The computer has 1 GB of memory and 100 GB of available disk space.
>From the Task Manager I can see that only around 200 MB is used during the operation.
>
>What is the best way to open the files? Buffered or unbuffered?
>Should I flush when I reach a certain point in the file? Every 10 MB?
>How can I optimize the process to use all available memory and speed things up? Is there some SYS() function to use?
>
>TIA :-)

I have never heard of such a problem before.

How about re-opening the file every 1 MB or so? Before closing, you would have to save the current position, and then restore it later.
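
To make that concrete, here is a rough, untested sketch using VFP's low-level file functions (FOPEN(), FGETS(), FPUTS(), FSEEK(), FEOF(), FCLOSE()). The file names bigfile.txt and result.txt and the UPPER() call are just placeholders for the real files and string operations, and error checking is omitted:

* Sketch of the re-open-every-1-MB approach (no error checking).
#DEFINE CHUNK_SIZE 1048576          && re-open roughly every 1 MB

LOCAL lnIn, lnOut, lnInPos, lnOutPos, lcLine, llDone

lnInPos  = 0
lnOutPos = 0
llDone   = .F.

lnOut = FCREATE("result.txt")       && create the destination file once
FCLOSE(lnOut)

DO WHILE NOT llDone
    * (Re)open both files and restore the saved positions.
    lnIn  = FOPEN("bigfile.txt")    && read-only
    lnOut = FOPEN("result.txt", 2)  && read/write
    FSEEK(lnIn,  lnInPos)
    FSEEK(lnOut, lnOutPos)

    * Process lines until roughly CHUNK_SIZE bytes of input have been read.
    DO WHILE NOT FEOF(lnIn) AND (FSEEK(lnIn, 0, 1) - lnInPos) < CHUNK_SIZE
        lcLine = FGETS(lnIn, 8192)
        lcLine = UPPER(lcLine)      && placeholder for the real string operations
        FPUTS(lnOut, lcLine)
    ENDDO

    * Save the current positions, then close and release both handles.
    lnInPos  = FSEEK(lnIn, 0, 1)
    lnOutPos = FSEEK(lnOut, 0, 1)
    llDone   = FEOF(lnIn)
    FCLOSE(lnIn)
    FCLOSE(lnOut)
ENDDO

FSEEK(handle, 0, 1) returns the current position without moving the pointer, so the positions saved just before FCLOSE() let the next pass pick up exactly where the previous one left off.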

HTH, Hilmar.
Difference in opinions hath cost many millions of lives: for instance, whether flesh be bread, or bread be flesh; whether whistling be a vice or a virtue; whether it be better to kiss a post, or throw it into the fire... (from Gulliver's Travels)