>>Bank transactions are downloaded as a text file that includes the account balance after each transaction. They are imported into a temporary DBF file and then consolidated into a historical file, checking that there are no duplicates and that the balances are correct.
>>
>>The check for duplicates is done before appending to the historical file. The first cut was to compare a concatenation of all the fields in each new transaction against the same concatenation of fields in the historical file:
>>
>>
SELECT * ;
>>   FROM BanMov ;
>>   WHERE DTOS(dTraBan) + cRefBan + cTipTraBan + cDescBan + ;
>>         TRANSFORM(nDebitBan, '9999999.99') + TRANSFORM(nCreditBan, '9999999.99') ;
>>      NOT IN (SELECT DTOS(dTraBan) + cRefBan + cTipTraBan + cDescBan + ;
>>              TRANSFORM(nDebitBan, '9999999.99') + TRANSFORM(nCreditBan, '9999999.99') ;
>>         FROM DacBanMov) ;
>>   INTO TABLE NoDups
>>
>>
>>This is *very* slow.
>>
>>Any suggestions for an efficient way to guarantee no duplicates?
>
>Perhaps with the selfsame command, but adding an index on the key expression - an index on
DTOS(dTraBan) + cRefBan..., especially on the historical file.
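For Rushmore to optimize the lookup, the index expression on the historical file has to match the WHERE expression character for character. A sketch of the tag creation (the tag name TraKey is made up here):

   USE DacBanMov EXCLUSIVE
   * Index on exactly the expression used in the WHERE clause,
   * so the NOT IN comparison can be optimized.
   INDEX ON DTOS(dTraBan) + cRefBan + cTipTraBan + cDescBan + ;
      TRANSFORM(nDebitBan, '9999999.99') + TRANSFORM(nCreditBan, '9999999.99') ;
      TAG TraKey
   USE

One caveat: a CDX tag key is limited to 240 characters, so if the concatenated fields exceed that, you would have to index on a shorter leading part (DTOS(dTraBan) + cRefBan, say) and compare the remaining fields separately.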
Thanks Hilmar. Tried and didn't seem to help much.
Alex