Data Normalization
Date: 04/12/1998 04:13:04
Forum: Visual FoxPro
Category: Third party products, Miscellaneous
Thread ID: 00163994
Message ID: 00164046
>I am beginning a job for a client who wants to enter several years' worth of purchases for a grocery store and, when finished, wants to be able to generate all sorts of reports (I use Foxfire). The fields are standard: Inv. #, Date, Vendor, Category, Product, Qty., Cost. My problem, an age-old one, is how to set up the tables so that the reports can be generated as quickly as possible:
>
>I create the following tables: Category, Product, Vendor, Invoice Header, Invoice Detail. If I take the simple but ugly path, every Invoice Detail record contains its date, its Category Name, and its Product Name. Now reporting is easy, but the data is horribly unnormalized.
>
>If I normalize, each of the records in all tables except the Invoice Detail has a program-generated primary key. Then, for example, in the Invoice Detail, the user selects the Product Name and I plug in the associated primary key from the Product table. Now let's say that the user wants an alphabetical listing of products within date parameters. What kind of indexing should I have in place in order to optimize the joins? Foxfire has to filter the Invoice Header table for dates, but the alphabetizing has to come from the Products table. The SQL statement for such a report gets rather complex--I'm not sure if I could write it.
>
>Since we will be dealing with thousands and thousands of entries, speed is of the essence. Should I throw normalization and disk conservation out the window in favor of simplicity (and speed) of reporting? Or should I normalize the data and just let Foxfire worry about the joins? Or is there a middle ground??? Or a better way than Foxfire? Any suggestions would be appreciated.
>
>John
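The report John describes (filter on invoice dates, alphabetize by product name) is a fairly standard two-join query once the surrogate keys are in place. A minimal sketch in SQLite, with hypothetical table and column names rather than John's actual FoxPro schema, including the indexes that support the date filter and both joins:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical normalized schema -- surrogate keys in the lookup
# tables, foreign keys in the detail table.
cur.executescript("""
CREATE TABLE product    (prod_id INTEGER PRIMARY KEY, prod_name TEXT);
CREATE TABLE inv_head   (inv_id  INTEGER PRIMARY KEY, inv_date TEXT);
CREATE TABLE inv_detail (inv_id  INTEGER, prod_id INTEGER,
                         qty INTEGER, cost REAL);

-- Indexes that let the date filter and the two joins be optimized:
CREATE INDEX ix_head_date ON inv_head(inv_date);
CREATE INDEX ix_det_inv   ON inv_detail(inv_id);
CREATE INDEX ix_det_prod  ON inv_detail(prod_id);
""")

cur.executemany("INSERT INTO product VALUES (?, ?)",
                [(1, "Milk"), (2, "Apples"), (3, "Bread")])
cur.executemany("INSERT INTO inv_head VALUES (?, ?)",
                [(10, "1998-03-01"), (11, "1998-06-15")])
cur.executemany("INSERT INTO inv_detail VALUES (?, ?, ?, ?)",
                [(10, 1, 2, 1.99), (10, 3, 1, 0.89), (11, 2, 5, 2.49)])

# Alphabetical product listing within date parameters:
rows = cur.execute("""
    SELECT p.prod_name, h.inv_date, d.qty, d.cost
      FROM inv_detail d
      JOIN inv_head h ON h.inv_id  = d.inv_id
      JOIN product  p ON p.prod_id = d.prod_id
     WHERE h.inv_date BETWEEN '1998-01-01' AND '1998-03-31'
     ORDER BY p.prod_name
""").fetchall()
print(rows)
# -> [('Bread', '1998-03-01', 1, 0.89), ('Milk', '1998-03-01', 2, 1.99)]
```

The filtering and the alphabetizing live in different tables, as John notes, but the optimizer handles that as long as the date column and both foreign-key columns are indexed.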

If this is a one-shot deal, I would go for simplicity. If it is something they will keep using in the future, I would go for normalization.

For FoxFire to work, you either have to have all the data in the same file or you have to normalize completely. I inherited a semi-normalized system and had to backtrack [redesign tables] to get FoxFire to work. One of the tables had 3 fields for identifying each record with up to 3 accounting codes. The code descriptions were in a different table, but in FoxFire only one of the fields could be linked to the lookup table. So I had to normalize by adding a new child table for the accounting codes, then join that child table to the lookup table.
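That redesign can be sketched in SQLite (the table and column names here are illustrative, not the actual accounting system's): the three code fields move out of the main table into one child row per code, so every code reaches the lookup through the same single link.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Before: main(rec_id, code1, code2, code3) -- only one of the three
# code fields could be linked to the lookup in FoxFire.
# After: one child row per code, all reachable through a single join.
cur.executescript("""
CREATE TABLE main       (rec_id INTEGER PRIMARY KEY, descr TEXT);
CREATE TABLE main_codes (rec_id INTEGER, code TEXT);
CREATE TABLE code_lkup  (code TEXT PRIMARY KEY, code_desc TEXT);
""")

cur.execute("INSERT INTO main VALUES (1, 'Some transaction')")
cur.executemany("INSERT INTO main_codes VALUES (?, ?)",
                [(1, 'A'), (1, 'B'), (1, 'C')])
cur.executemany("INSERT INTO code_lkup VALUES (?, ?)",
                [('A', 'Payroll'), ('B', 'Travel'), ('C', 'Supplies')])

# All three code descriptions now come back through one join path:
descs = [r[0] for r in cur.execute("""
    SELECT l.code_desc
      FROM main m
      JOIN main_codes mc ON mc.rec_id = m.rec_id
      JOIN code_lkup  l  ON l.code    = mc.code
     WHERE m.rec_id = 1
     ORDER BY l.code
""")]
print(descs)  # -> ['Payroll', 'Travel', 'Supplies']
```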

My other option was to have 3 identical lookup tables [one for each code field in the original table]. I took the extra time [about 4 hours, including data conversion] to do it right.
Mark McCasland
Midlothian, TX USA