>>>>Hi,
>>>>
>>>>I need to generate 20 million records in a database table in order to set up some realistic testing scenarios. I wrote some VFP code that generated about 4.5 million records, but that took about 5 hours. I am now trying some SQL that SELECTs the TOP 1000 records and inserts them back into the table (with new GUID primary keys), looping many times, but this is going pretty slowly too.
>>>>
>>>>What would you all suggest? Would selecting all 4.5 million records and inserting them back into the table be faster? And then maybe doing that again a few times?
>>>
>>>Did you consider BULK INSERT from a disk file?
>>
>>No, because I don't have a text file with the data, although I guess I could export the existing records to a file. Do you think exporting 4,000,000 records to a text file and then importing them would be faster than the other technique?
>
>VFP can build a file for you of that size in a snap.
>
>I believe the BULK-based commands and utilities are designed for efficiency (among other requirements). For a comparison of the efficiency of different insert methods, see:
>https://www.simple-talk.com/sql/performance/comparing-multiple-rows-insert-vs-single-row-insert-with-three-data-load-methods/

Thanks, I'm trying to use a bcp command to create the format file I need, but it's not working:
bcp transactions format nul -c -x -f transactionformat.xml -t, -T
gives me: "incorrect syntax near format".
Any idea what I am doing wrong? I need a format file because I need to skip a couple of columns (assuming the default values will get populated by the bulk insert).
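
For reference, the "copy the table back into itself" approach discussed earlier in the thread can be sketched in T-SQL. The table name transactions comes from the bcp command above, but the column names (id, col1, col2) and the uniqueidentifier primary key are assumptions; this is a minimal sketch, not the poster's actual code. Because each pass doubles the row count, 4.5 million rows would pass 20 million in about three passes.

```sql
-- Minimal sketch, assuming dbo.transactions has a uniqueidentifier
-- primary key column id; col1/col2 stand in for the remaining columns.
-- Each pass re-inserts every existing row with a fresh GUID, doubling
-- the table, so the loop runs only a handful of times.
WHILE (SELECT COUNT_BIG(*) FROM dbo.transactions) < 20000000
BEGIN
    INSERT INTO dbo.transactions (id, col1, col2)
    SELECT NEWID(), col1, col2
    FROM dbo.transactions;
END
```

Doubling in a few large passes avoids the per-iteration overhead of the TOP 1000 loop, though each pass is one large logged insert, so log growth is worth watching.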
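
Once a format file is in hand, the load itself might look like the sketch below. The file paths are placeholders, and the assumption (as stated above) is that the format file maps only the fields actually being loaded, leaving the skipped columns to their defaults.

```sql
-- Minimal sketch, assuming the exported data sits in C:\load\transactions.txt
-- and that transactionformat.xml maps only the imported fields
-- (both paths are placeholders, not taken from the thread).
BULK INSERT dbo.transactions
FROM 'C:\load\transactions.txt'
WITH (
    FORMATFILE = 'C:\load\transactionformat.xml',
    TABLOCK,            -- table lock, a prerequisite for minimal logging
    BATCHSIZE = 100000  -- commit in chunks instead of one huge transaction
);
```

TABLOCK and a sensible BATCHSIZE are the usual knobs for keeping a multi-million-row load fast and the transaction log manageable.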