Very Large Application in Visual FoxPro v6.0
Message
General information
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax (Miscellaneous)
Thread ID: 00325080
Message ID: 00325518
Views: 21
Doug,

Would building some summary tables (a data mart, or whatever the current buzzword is) help your application? In the first example you gave, perhaps at the end of each day you could create a record containing the detail you need for each location/transaction-type combination. So if you had 2 locations and 3 transaction types, you would end up with six records per day, and you could then run your graphing and reporting from this table. I'm not sure what level of detail you need to show in your grid, though; if you are showing each transaction that meets the criteria, then this technique may not work.
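As a rough sketch of that end-of-day rollup (the table and field names here, trans990601, location, trantype, amount, and summary, are just placeholders for whatever you actually have):

```foxpro
* End-of-day rollup: one record per location/transaction-type combination.
SELECT location, trantype, ;
       COUNT(*) AS trancount, ;
       SUM(amount) AS tranamt, ;
       {^1999/06/01} AS trandate ;
   FROM trans990601 ;
   GROUP BY location, trantype ;
   INTO CURSOR daysumm

* Append the day's rollup to the permanent summary table.
SELECT 0
USE summary
APPEND FROM DBF("daysumm")
USE
```

With two locations and three transaction types, the cursor holds six rows, and the reports and graphs read the small summary table instead of scanning the big daily tables.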

>which will then contain the table name. The first step of my query scans this table for the data date contained in the file name, and when a table is reached which meets the date criteria, it opens that file and performs the SELECT, which is sent into a table ("mytable.dbf"), and then scans for the next transaction table. In order to keep the SQL from overwriting the data collected previously, I use an "append from" into another temp table, "mytemp.dbf".
>
Just another way to skin a cat: when I do this sort of thing I usually begin by creating a cursor (named holder) to hold the data, then do each SELECT into an array, and then INSERT INTO holder FROM ARRAY. I also use ADIR() to get my list of tables, as I think someone else already mentioned to you.
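Something along these lines (field names and the WHERE clause are made up for the example; substitute your real filter):

```foxpro
* Cursor to collect the matching rows from every daily table.
CREATE CURSOR holder (trandate D, location C(10), trantype C(3), amount N(10,2))

* ADIR() fills the array with one row per matching file name.
nCount = ADIR(aTables, "*.DBF")
FOR i = 1 TO nCount
   * SELECT into an array so nothing overwrites the rows already collected.
   SELECT trandate, location, trantype, amount ;
      FROM (aTables[i, 1]) ;
      WHERE location = "01" AND trantype = "A" ;
      INTO ARRAY aResult
   IF _TALLY > 0
      INSERT INTO holder FROM ARRAY aResult
   ENDIF
   USE IN (JUSTSTEM(aTables[i, 1]))
ENDFOR
SELECT holder
```

Because each SELECT goes into an array and the rows are inserted into the cursor, there is no need for the intermediate mytable.dbf/mytemp.dbf pair or the APPEND FROM step, and ADIR() replaces the DIR-to-temp-file-to-table dance.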

HTH
Bill


>I have developed a very large application in VFP6.0 which has 360 identically structured tables (a naming convention based on the transaction date is used, i.e. xxx990601.dbf); each table contains 24 hours of transaction data (approximately
>300 to 800K records). The total size of one year's worth of transaction tables is around 4.6 GIGS, but each table is around 2.5 MEGS. One program I am developing requires selecting certain transactions (I am using SQL SELECT)
>using a lengthy filter, first by date, then location, then transaction type. Since I must scan all of the transaction tables,
>in order to get some speed I do a DIR *.DBF command to a temp file and then import the table names into a single-field table
>which will then contain the table name. The first step of my query scans this table for the data date contained in the file name, and when a table is reached which meets the date criteria, it opens that file and performs the SELECT, which is sent into a table ("mytable.dbf"), and then scans for the next transaction table. In order to keep the SQL from overwriting the data collected previously, I use an "append from" into another temp table, "mytemp.dbf". After the scan finishes, I open a grid in which to
>view the results, with an incremental search ability, plus graphing and reporting. When exiting this view, I give the user the option of saving the view to a file for later perusal.
>Another program beats each transaction against a series of joined tables (88 tables totalling 4.6 GIGS)
>to retrieve identification and other statistics for later analysis.
>Currently, jumping through these loops, the SQL will produce a view in approximately 1.5 minutes of processing on a stand-alone PC.
>Any comments on better handling of this application?