>We need to import large text files (1.5 million rows) regularly (weekly). We create a cursor and APPEND FROM text_file TYPE SDF. Very fast so far. The problem is that there is a description in this file that is up to 1000 characters wide. We import it into 4 char(250) fields and then, after the import, issue a REPLACE statement to fill in the memo field.
>
>REPLACE fmitemdesc with ALLTRIM(fcitem1)+ALLTRIM(fcitem2)+ALLTRIM(fcitem3)+ALLTRIM(fcitem4) ALL
>
>The above REPLACE statement is extremely slow: about 100 records every 3 seconds! It takes forever to process these large files.
>
>Is there a different way to import from a text file and get the information into a single memo field? From the cursor it goes into SQL Server, into a varchar(1000) field.
Just a couple of general ideas you could try:
- If the final target is SQL Server, you could open an ADO connection to SQL Server and update the varchar column in a Recordset directly from the 4 VFP cursor fields, rather than doing the intermediate memo step at all.
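A rough sketch of that ADO idea (the connection string, SQL Server table name, and column names here are placeholders I made up, not from your setup):

```foxpro
* Sketch only: connection string, table and field names are assumptions.
* ADO constants: adOpenKeyset = 1, adLockOptimistic = 3.
loConn = CREATEOBJECT("ADODB.Connection")
loConn.Open("Provider=SQLOLEDB;Data Source=MyServer;" + ;
            "Initial Catalog=MyDB;Integrated Security=SSPI")

loRS = CREATEOBJECT("ADODB.Recordset")
loRS.Open("SELECT ItemDesc FROM ItemImport", loConn, 1, 3)

SELECT myCursor
SCAN
    loRS.AddNew()
    * Concatenate the four char(250) pieces straight into the
    * varchar(1000) column, skipping the VFP memo field entirely.
    loRS.Fields("ItemDesc").Value = ALLTRIM(fcitem1) + ALLTRIM(fcitem2) + ;
                                    ALLTRIM(fcitem3) + ALLTRIM(fcitem4)
    loRS.Update()
ENDSCAN

loRS.Close()
loConn.Close()
```

VFP's native SQL pass-through (SQLSTRINGCONNECT/SQLEXEC) with a parameterized INSERT would be a similar approach if you'd rather avoid ADO.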
- I understand SQL Server has a bulk copy utility (bcp) which is fast for importing large amounts of data.
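For reference, a bcp invocation might look something like the line below; every name and path here is a placeholder, and for a fixed-width (SDF) file you would need a format file describing the column layout:

```cmd
rem Sketch only: server, database, table, and file names are placeholders.
rem "in" loads the file into the table; -S names the server, -T uses
rem Windows authentication, and -f supplies a format file describing the
rem fixed-width layout, including the full 1000-character description column.
bcp MyDB.dbo.ItemImport in C:\imports\weekly.txt -S MyServer -T -f C:\imports\weekly.fmt
```

The T-SQL BULK INSERT statement is a related option that runs server-side from the same kind of file and format file.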
Regards. Al
"Violence is the last refuge of the incompetent." -- Isaac Asimov
"Never let your sense of morals prevent you from doing what is right." -- Isaac Asimov
Neither a despot, nor a doormat, be
Every app wants to be a database app when it grows up