Level Extreme platform
Data Normalization
04/12/1998 04:13:04
General information
Forum:
Visual FoxPro
Category:
Third party products
Miscellaneous
Thread ID:
00163994
Message ID:
00164380
Views:
15
>I am beginning a job for a client who wants to enter several years' worth of purchases for a grocery store and, when finished, wants to be able to generate all sorts of reports (I use Foxfire). The fields are standard: Inv. #, Date, Vendor, Category, Product, Qty., Cost. My problem, age old, is how to set up the tables so as to create the reports as quickly as possible:
>
>I create the following tables: Category, Product, Vendor, Invoice Header, Invoice Detail. If I take the simple, but ugly path, every Invoice Detail record contains its date, its Category Name, and its Product Name. Now reporting is easy, but the data is horribly un-normalized.
>
>If I normalize, each of the records in all tables except the Invoice Detail has a program-generated primary key. Then, for example, in the Invoice Detail, the user selects the Product Name and I plug in the associated primary key from the Product table. Now let's say that the user wants an alphabetical listing of products within date parameters. What kind of indexing should I have in place in order to optimize the joins? Foxfire has to filter the Invoice Header table for dates, but the alphabetizing has to come from the Products table. The SQL statement for such a report gets rather complex--I'm not sure if I could write it.
>
>Since we will be dealing with thousands and thousands of entries, speed is of the essence. Should I throw normalization and disk conservation out the window in favor of simplicity (and speed) of reporting? Or should I normalize the data and just let Foxfire worry about the joins? Or is there a middle ground??? Or a better way than Foxfire? Any suggestions would be appreciated.
>
>John

John, how about loading the supporting (lookup) tables into arrays?

Then call UDFs to scan the arrays and plug the values into the actual report form. I used this approach for a customer and it was VERY FAST.

In short: normalize the data, load the lookup tables into arrays, and have the report call UDFs that scan the arrays.
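To make the idea concrete, here is a minimal sketch of the array-lookup technique in Python. The original advice targets Visual FoxPro arrays and UDFs; the table names, field names, and sample values below are invented for illustration only.

```python
# Normalized lookup tables, loaded once into memory (the "arrays").
products = {1: "Apples", 2: "Bread", 3: "Milk"}
vendors = {10: "Acme Foods", 11: "Best Produce"}

# UDF-style lookups: resolve a surrogate key to its name at report time.
def product_name(pid):
    return products.get(pid, "<unknown>")

def vendor_name(vid):
    return vendors.get(vid, "<unknown>")

# Invoice detail rows stay normalized and store only the keys...
detail = [
    {"inv": 100, "product_id": 2, "vendor_id": 10, "qty": 3},
    {"inv": 101, "product_id": 1, "vendor_id": 11, "qty": 5},
]

# ...and the "report" plugs the names in via the lookup functions,
# so no join against the lookup tables is needed per row.
report = [
    (row["inv"], product_name(row["product_id"]),
     vendor_name(row["vendor_id"]), row["qty"])
    for row in detail
]
# report[0] -> (100, "Bread", "Acme Foods", 3)
```

The data stays normalized on disk; only the small, stable lookup tables are copied into memory, which is why the per-row lookups are so fast compared with repeated joins.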
Carl R. Perkins
NJ5J Software Corp. http://www.nj5j.com