Processing large volumes of data
From: David Fluker (NGIT - Centers For Disease Control, Decatur, Georgia, United States)
Date: 22/09/2000 09:05:02
General information
Forum: Visual FoxPro
Category: Coding, syntax & commands - Miscellaneous
Thread ID: 00419465
Message ID: 00419672
>I have a mortgage application that needs to load 200,000 transactions from an ASCII file at the end of every month. This process is taking 3-4 hours on the client's PC (350 MHz, 64 MB RAM). The client wants to be able to load up to 1 million transactions. I am looking for the most efficient way of loading, processing, and saving these transactions.
>
>Some ideas:
>1. Split the load file between multiple computers
>2. Sort the load file and the table by loan number and use scan/skip
>3. Try to make code more efficient (difficult)
>4. Get the client to use a faster PC.
>5. What else can I do?

Shane,
That seems like a long time to load 200,000 transactions. If you are not doing validation or processing on each record as it is entered, you might delete all of the index tags on the transaction table(s) before you begin importing the records and reindex afterwards. That way VFP isn't updating the .CDX after every record as it is imported, and the import becomes a straight sequential write.
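
As a rough sketch, the drop-tags-then-reindex approach might look like the following. Table, file, and field names here (transactions, loadfile.txt, loan_no, trans_date) are made up for illustration; substitute your own, and rebuild only the tags your application actually uses:

```foxpro
* Open the table exclusively - DELETE TAG and INDEX ON require it.
USE transactions EXCLUSIVE

* Drop every tag from the structural .CDX so the append
* is not maintaining indexes record by record.
DELETE TAG ALL

* Bulk-load the ASCII file (SDF shown; use DELIMITED if appropriate).
APPEND FROM loadfile.txt TYPE SDF

* Rebuild the tags once, after all records are in.
INDEX ON loan_no TAG loan_no
INDEX ON trans_date TAG trandate

USE
```

Rebuilding the tags once over the full table is one pass per index, which is far cheaper than a million incremental .CDX updates during the append.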
David.