Non-data-bearing primary keys
Message
General information
Forum:
Visual FoxPro
Category:
Other
Miscellaneous
Thread ID:
00106667
Message ID:
00107218
Views:
32
> This sounds like a very complicated process and probably open to many 'exceptions'. But the short answer would be, implement the rule where you can and where it makes sense. Reporting situations is usually where you run into the most valid cases for violating just about every normalization rule! (snip)

Thank you for the feedback, Chad! It seems for once I stuck my head up out of my box and didn't get run over by the "there was a better way to do this" truck! lol :)

> I was wondering though, the bulk of your process would seem to be inserting new records and that would take the most time. Why do you feel creating surrogate keys as you go would create more overhead?

Well, actually it's not INSERTing in the syntax sense, but rather APPENDing. Here's a test I did fairly early into my implementation, after my first sleepless night of "Man, I should have done this differently":

I took two tables, Accounts and Equipment.
Each table gets fed 7 different text files (with a DELETE TAG ALL before, and a system identifier field called ref_num filled in after each append).
At the time of the test (10/96), Accounts had approx. 140,000 records and Equipment had almost 4,000,000 records.
With my implementation, the PK for Accounts is ref_num+account; the PK for Equipment is ref_num+handle, with ref_num+account as its FK.
Filling both tables with new data and generating a report of equipment listed by account took 3.5 hours during peak network hours. (Note: it now takes just over an hour, after Craig Bernstein helped me find the SET ANSI trick.)
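For anyone following along, the load cycle above looks roughly like this. This is just a sketch: the feed file name, tag names, and the "01" batch stamp are placeholders, not the actual code from my app.

```foxpro
* Drop index tags so APPEND FROM runs against an unindexed table,
* then stamp ref_num and rebuild the compound-key tags afterward.
USE Equipment EXCLUSIVE
DELETE TAG ALL
APPEND FROM feed1.txt TYPE SDF
REPLACE ALL ref_num WITH "01" FOR EMPTY(ref_num)   && system identifier
INDEX ON ref_num + handle  TAG pk_equip            && compound PK
INDEX ON ref_num + account TAG fk_acct             && compound FK

* The SET ANSI trick: with ANSI ON, = compares the full strings
* instead of stopping at the end of the shorter one, so the join
* on the concatenated keys behaves the way the index expects.
SET ANSI ON
SELECT a.account, e.handle ;
   FROM Accounts a JOIN Equipment e ;
     ON a.ref_num + a.account = e.ref_num + e.account ;
   ORDER BY a.account ;
   INTO CURSOR rpt
```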

Filling both tables with new data, running a UDF to generate a unique id after each append, and then generating the report took 37 minutes longer when done after hours (minimal network traffic) and 51 minutes longer during peak hours.
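The surrogate-key alternative I timed was along these lines; the function name, the one-row counter table, and the equip_id field are hypothetical stand-ins for my actual code. The per-row UDF call and counter-table update after each append is where the extra time went.

```foxpro
* Hypothetical surrogate-key generator: a one-row, one-field
* counter table hands out the next id and increments itself.
FUNCTION NextId
   LOCAL lnId
   SELECT keycount
   lnId = keycount.next_id
   REPLACE next_id WITH lnId + 1
   RETURN lnId
ENDFUNC

* After each APPEND, walk only the newly added rows and stamp
* the surrogate key one record at a time.
SELECT Equipment
APPEND FROM feed1.txt TYPE SDF
SCAN FOR EMPTY(equip_id)
   REPLACE equip_id WITH NextId()
ENDSCAN
```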

Now, the report used in this test is a monster on paper, but it is fairly simple in processing compared to the majority, since it only joins two tables. Once I saw these results, I decided to stick with my compound-key implementation, with the exception of a few small lookup tables maintained by my apps and used to keep the transient data clean and relatable.
Roxanne M. Seibert
Independent Consultant, VFP MCP

Code Monkey Like Fritos
