>>BTW, back to the original problem. Currently the code takes 20 minutes to run for an 8K file. I added statistics to see how many records were added:
>
>At least add a sec/record entry on the lines "Started to commit changes." and "Committed all changes for Load New Trans." Reason: if small batches show roughly the same time per record as huge batches, you can be confident doing most of your testing with small batches and only setting up a run with a large batch before going home. It will be finished by next morning and serves as a check that you haven't introduced any data-size-dependent exponential slowdown. Depending on how you break things up into smaller routines, I'd perhaps add a separate counter for the time the whole processing of each table took, including seeks, error-handling code and so on. It depends on how different the timings look in coverage. This is also a stopgap against code that works fine with current data loads but might be non-optimal if a specific table grew hugely.
>
>my 0.02 EUR
>
>thomas
>
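A rough sketch of the kind of per-commit and per-table timing thomas describes, assuming a load routine that commits records in batches. All names here are hypothetical illustrations, not from the actual code under discussion:

```python
import time

class BatchTimer:
    """Hypothetical helper: sec/record per commit, plus a per-table total."""

    def __init__(self):
        # Seconds spent per table, including seeks and error handling.
        self.table_totals = {}

    def time_table(self, table, fn, *args):
        # Wrap the whole processing of one table so the counter catches
        # everything, not just the happy path.
        start = time.perf_counter()
        result = fn(*args)
        elapsed = time.perf_counter() - start
        self.table_totals[table] = self.table_totals.get(table, 0.0) + elapsed
        return result

    def report_commit(self, label, records, elapsed):
        # Log sec/record so small and large batches can be compared directly.
        per_rec = elapsed / records if records else 0.0
        print(f"{label}: {records} records in {elapsed:.1f}s "
              f"({per_rec:.4f} sec/record)")
        return per_rec
```

The point of normalizing to sec/record is exactly the check suggested above: if a 100-record batch and a 100,000-record batch show similar per-record times, the process scales roughly linearly and small batches are safe for day-to-day testing.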
Not sure what you're suggesting. But now we're facing a bigger problem. The current speed for Load New Trans is almost acceptable, but yesterday I ran LoadNewTrans + ProcessNewTrans, and the latter took more than 5 hours to finish! After everything I've done to speed it up :(
If it's not broken, fix it until it is.