Importing large files
Message
To: Al Doman, M3 Enterprises Inc., North Vancouver, British Columbia, Canada
Date: 01/06/2002 19:24:10

General information
Forum: Visual FoxPro
Category: Coding, syntax & commands / Miscellaneous
Thread ID: 00663825
Message ID: 00663833
Views: 19
We've thought about that. The problem is that there is considerable pre-processing between this cursor and the final five SQL Server tables the data is distributed into.

Donna

>>We need to import large text files (1.5 million rows) regularly (weekly). We create a cursor and APPEND FROM text_file TYPE SDF. Very fast so far. The problem is, there is a description in this file that is up to 1000 characters wide. We import it into 4 char(250) fields and then, after the import, issue a REPLACE statement to fill in the memo field.
>>
>>REPLACE fmitemdesc WITH ALLTRIM(fcitem1)+ALLTRIM(fcitem2)+ALLTRIM(fcitem3)+ALLTRIM(fcitem4) ALL
>>
>>The above replace statement is EXTREMELY slow - about 100 records every 3 seconds!! It takes forever to do these large files.
>>
>>Is there a different way to import from a text file and get the info into a single memo field? It is going from the cursor into SQL Server in a varchar(1000) field.
>
>Just a couple of general ideas you could try:
>
>- if the final target is SQL Server, maybe you could open an ADO connection to SQL Server and update the varchar column in a Recordset directly from the 4 VFP cursor fields, rather than needing to do the intermediate memo step
>
>- I understand there is a Bulk Copy (bcp) command in SQL Server which is fast for importing large amounts of data.
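
For concreteness, the import described in the quoted message amounts to roughly the following sketch. Only fcitem1..fcitem4 and fmitemdesc come from the original post; the cursor name, the id field, and the file path are invented placeholders.

* Rough reconstruction of the described import; names other than
* fcitem1..fcitem4 and fmitemdesc are invented placeholders.
CREATE CURSOR curImport ( ;
    fcitemid C(20), ;
    fcitem1 C(250), fcitem2 C(250), fcitem3 C(250), fcitem4 C(250), ;
    fmitemdesc M )
APPEND FROM c:\data\items.txt TYPE SDF     && fast; an SDF append skips the memo field
REPLACE fmitemdesc WITH ALLTRIM(fcitem1) + ALLTRIM(fcitem2) + ;
    ALLTRIM(fcitem3) + ALLTRIM(fcitem4) ALL     && the slow step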
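
The ADO suggestion could look roughly like the sketch below: open an updatable Recordset on the SQL Server table and build the varchar(1000) value straight from the four char(250) fields, skipping the intermediate memo REPLACE. The connection string, the table ItemImport, the columns item_id / item_desc, and the cursor curImport are all hypothetical, not names from this thread.

* Sketch only; server, database, table, column, and cursor names are placeholders.
LOCAL loConn, loRS
loConn = CREATEOBJECT("ADODB.Connection")
loConn.Open("Provider=SQLOLEDB;Data Source=MyServer;" + ;
    "Initial Catalog=MyDB;Integrated Security=SSPI")
loRS = CREATEOBJECT("ADODB.Recordset")
* Open an empty, updatable Recordset on the target table
loRS.Open("SELECT item_id, item_desc FROM ItemImport WHERE 1=0", ;
    loConn, 1, 3)     && 1 = adOpenKeyset, 3 = adLockOptimistic
SELECT curImport
SCAN
    loRS.AddNew()
    loRS.Fields("item_id").Value = curImport.fcitemid
    * Concatenate the four char(250) pieces directly into the varchar column
    loRS.Fields("item_desc").Value = ALLTRIM(fcitem1) + ALLTRIM(fcitem2) + ;
        ALLTRIM(fcitem3) + ALLTRIM(fcitem4)
    loRS.Update()
ENDSCAN
loRS.Close()
loConn.Close()

Row-by-row updates over ADO may still not be quick for 1.5 million rows, so it is worth timing this against the bulk copy route sketched next.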
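
The bulk copy route, very roughly: bcp is SQL Server's command-line bulk loader (BULK INSERT is the T-SQL counterpart) and is generally the fastest way to load this many rows. It expects a delimited file, or a format file describing fixed-width data, so one option is to first write a delimited file from VFP with the description already concatenated and then shell out to bcp. Every name below (MyServer, MyDB, ItemImport, the file path) is made up for illustration.

* Step 1 (VFP): write a pipe-delimited file with the description already
* concatenated, e.g. SELECT the concatenated expression INTO CURSOR and
* then COPY TO ... DELIMITED, or use a low-level FPUTS() loop.
* Step 2: load it with bcp (all names here are placeholders):
lcCmd = 'bcp MyDB.dbo.ItemImport in c:\data\items.txt -c -t "|" -S MyServer -T'
RUN /N &lcCmd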
Donna D. Osburn
www.software-plus.net
Time flies whether you are having fun or not!