>Bank transactions are downloaded as a text file that includes the account balance after each transaction. They are moved to a temporary DBF file and consolidated into a historical file, checking that there are no duplicates and that the balances are correct.
>
>The check for duplicates is done before appending to the historical file. The first cut was to compare a concatenation of all fields in the new transactions against the same concatenation of fields in the historical file:
>
>
>SELECT * ;
> FROM BanMov ;
> WHERE DTOS(dTraBan)+cRefBan+cTipTraBan+cDescBan+ ;
> TRANSFORM(nDebitBan,'9999999.99')+TRANSFORM(nCreditBan,'9999999.99') ;
> NOT IN (SELECT DTOS(dTraBan)+cRefBan+cTipTraBan+cDescBan ;
> +TRANSFORM(nDebitBan,'9999999.99')+TRANSFORM(nCreditBan,'9999999.99') ;
> FROM DacBanMov) ;
> INTO TABLE NoDups
>
>
>This is *very* slow.
>
>Any suggestions for an efficient way to guarantee no duplicates?
>
>TIA,
>
>Alex
Do you have an index with that exact expression on both tables? You could also change NOT IN to NOT EXISTS.
If you don't have an index, maybe you can compare each field individually (with NOT EXISTS).
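A rough sketch of both ideas, reusing the table and field names from the original post (untested, and assuming those names are accurate):


* Index the historical table on the same key expression used in the query,
* so Rushmore can optimize the lookup instead of scanning the whole table
SELECT DacBanMov
INDEX ON DTOS(dTraBan)+cRefBan+cTipTraBan+cDescBan ;
 +TRANSFORM(nDebitBan,'9999999.99')+TRANSFORM(nCreditBan,'9999999.99') ;
 TAG TranKey

* Correlated NOT EXISTS comparing each field individually,
* which avoids building the long concatenated string per row
SELECT * ;
 FROM BanMov ;
 WHERE NOT EXISTS (SELECT * FROM DacBanMov ;
  WHERE DacBanMov.dTraBan = BanMov.dTraBan ;
  AND DacBanMov.cRefBan = BanMov.cRefBan ;
  AND DacBanMov.cTipTraBan = BanMov.cTipTraBan ;
  AND DacBanMov.cDescBan = BanMov.cDescBan ;
  AND DacBanMov.nDebitBan = BanMov.nDebitBan ;
  AND DacBanMov.nCreditBan = BanMov.nCreditBan) ;
 INTO TABLE NoDups


For the field-by-field version, simple index tags on the individual fields (at least on dTraBan and cRefBan) are what would let the subquery be optimized; the compound tag above matches the original concatenation approach.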
If it's not broken, fix it until it is.