Category: Coding, syntax & commands
>You guys have been a tremendous help, believe me. Your answers reinforced that I was doing it right. Thanks.
>
>We hinted at the real problem a little earlier. I get in real-world data. In theory, I should never get an item in from a bank with a duplicate ORGTRANUM. Well, that just ain't true. That was what I was using as a primary key. I then decided to use a combination of fields for the primary key, including the bank name, since I deal with several banks. Somehow, some way, a bank will end up sending in duplicate data that is indeed a real transaction over the course of time, even with the same date.
>
>Something just hit me: maybe I should put the banks in their own table. Right now I have them all in one table. That was fine in the beginning, when we had about 1,000 records coming in a day. Now we have 10,000+! I am still thinking that sooner or later they will send in a dup record. I do have code that looks at the file you are trying to load and tells you when that file has already been loaded. The problem is that the bank will screw up and send in a file with the current day's info and some info from the previous day.
>
>The life we lead! :-)
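
A minimal sketch of the composite-key idea described in the quote, assuming a SQLite-backed table; the trans table, its columns, and the load_row helper are illustrative, not the actual schema:

import sqlite3

con = sqlite3.connect("transactions.db")
con.execute("""
    CREATE TABLE IF NOT EXISTS trans (
        bank      TEXT NOT NULL,
        orgtranum TEXT NOT NULL,
        trandate  TEXT NOT NULL,
        amount    REAL NOT NULL,
        UNIQUE (bank, orgtranum, trandate)  -- the composite key
    )
""")

def load_row(bank, orgtranum, trandate, amount):
    # INSERT OR IGNORE skips rows whose composite key already exists:
    # good for a resent file, but it would also silently drop the rare
    # legitimate transaction that repeats all three values.
    con.execute("INSERT OR IGNORE INTO trans VALUES (?, ?, ?, ?)",
                (bank, orgtranum, trandate, amount))
    con.commit()

Which is exactly the bind described above: once the banks start repeating real transactions, the unique key alone can't tell a resent file from a genuine repeat.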
Randall --
Sounds like you're doing some "data cleansing". As you suggest, "quarantining" the data and testing it by batch or by bank may be the ticket. I recognize that a batch could duplicate previously sent records. But, keeping a temporary barrier between sent data and queried data may give you the best of both worlds: clean data and easily queryable data.
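
Something along these lines is what I have in mind; a minimal sketch, again assuming SQLite, with a staging table as the quarantine and illustrative names throughout:

import sqlite3

con = sqlite3.connect("transactions.db")
con.executescript("""
    CREATE TABLE IF NOT EXISTS trans (
        bank TEXT, orgtranum TEXT, trandate TEXT, amount REAL
    );
    CREATE TABLE IF NOT EXISTS staging (
        bank TEXT, orgtranum TEXT, trandate TEXT, amount REAL
    );
""")

def load_batch(rows):
    """Quarantine an incoming bank file in the staging table."""
    con.execute("DELETE FROM staging")
    con.executemany("INSERT INTO staging VALUES (?, ?, ?, ?)", rows)

    # Rows already present in the main table are suspects: maybe a
    # resent file, maybe (rarely) a genuine repeat transaction.
    suspects = con.execute("""
        SELECT s.* FROM staging s
        JOIN trans t ON  s.bank = t.bank
                     AND s.orgtranum = t.orgtranum
                     AND s.trandate = t.trandate
    """).fetchall()

    # Promote only the never-before-seen rows; the suspects wait
    # behind the barrier for a human (or a smarter rule) to decide.
    con.execute("""
        INSERT INTO trans
        SELECT s.* FROM staging s
        WHERE NOT EXISTS (
            SELECT 1 FROM trans t
            WHERE t.bank = s.bank
              AND t.orgtranum = s.orgtranum
              AND t.trandate = s.trandate
        )
    """)
    con.commit()
    return suspects

The suspects list is the quarantine at work: a resent file surfaces there instead of silently double-posting, and the rare genuine repeat can be waved through by hand.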
But, I wouldn't complain if you credited my account twice with my latest deposit<g>
Best wishes!

Jay