Very Large Application in Visual FoxPro v6.0
Message
From: 31/01/2000 12:35:04

General information
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax; Miscellaneous
Thread ID: 00325080
Message ID: 00325114
Views: 26
Hi Doug,

Not to sound facetious, but yes. The better approach is to move the data to a single SQL Server 7.0 table. That way you can get rid of the gazillion little DBFs, and SQL Server has data storage capacity in the exabyte (roughly a million terabytes) range.
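
Something along these lines, as a rough sketch (the table and column names here are invented for illustration; adjust to your actual transaction structure):

   CREATE TABLE Transactions (
      TranDate  datetime  NOT NULL,  -- replaces the date embedded in each DBF name
      Location  char(10)  NOT NULL,
      TranType  char(4)   NOT NULL,
      Amount    money     NULL
      -- ...the rest of your transaction columns
   )

   CREATE CLUSTERED INDEX ix_Trans
      ON Transactions (TranDate, Location, TranType)

With one clustered index on (date, location, type), the server can seek straight to the rows you want instead of you opening 360 files one at a time.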

You can then create some queries in T-SQL as stored procedures to send back limited result sets of data as required. That should be much more efficient and, I dare say, may be much faster.
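For instance (again, the procedure, DSN, and parameter names are made up; plug in your own):

   CREATE PROCEDURE GetTransactions
      @FromDate datetime,
      @ToDate   datetime,
      @Location char(10),
      @TranType char(4)
   AS
      SELECT *
        FROM Transactions
       WHERE TranDate BETWEEN @FromDate AND @ToDate
         AND Location = @Location
         AND TranType = @TranType
   GO

And from VFP you would pull back just the matching rows through SQL pass-through:

   lnHandle = SQLCONNECT("MyDSN", "myuser", "mypassword")  && DSN is hypothetical
   SQLEXEC(lnHandle, ;
      "EXEC GetTransactions '19990601', '19990602', 'STORE01', 'SALE'", ;
      "crsTrans")
   BROWSE  && the crsTrans cursor holds only the requested result set
   SQLDISCONNECT(lnHandle)

The filtering happens on the server, so only the rows you actually need ever cross the wire to the workstation.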


>I have developed a very large application in VFP 6.0 which has 360 identically structured tables (a naming convention based on the transaction date is used, i.e. xxx990601.dbf); each table contains 24 hours of transaction data (approximately
>300 to 800K records). The total size of one year's worth of transaction tables is around 4.6 GIGS, but each table is around 2.5 MEGS. One program I am developing requires selecting certain transactions (I am using SQL SELECT)
>using a lengthy filter, first by date, then location, then transaction type. Since I must scan all of the transaction tables,
>in order to get some speed I do a DIR *.DBF command to a temp file and then import the table names into a single-field table
>which will then contain the table names. The first step of my query scans this table for the date contained in each file name; when a table is reached which meets the date criteria, it opens that file and performs the SELECT, whose result is sent into a table ("mytable.dbf"), and then scans for the next transaction table. In order to keep the SQL from overwriting the data collected previously, I use an APPEND FROM into another temp table, "mytemp.dbf". After the scan finishes, I open a grid in which to
>view the results, with an incremental search ability, plus graphing and reporting. When exiting this view, I give the user the option of saving the view to a file for later perusal.
>Another program beats each transaction against a series of joined tables (88 tables totalling 4.6 GIGS)
>to retrieve identification and other statistics for later analysis.
>Currently, this jump-through-hoops SQL will produce a view in approximately 1.5 minutes of processing on a stand-alone PC.
>Any comments on better handling of this application?
------------------------------------------------
John Koziol, ex-MVP, ex-MS, ex-FoxTeam. Just call me "X"
"When the going gets weird, the weird turn pro" - Hunter Thompson (Gonzo) RIP 2/19/05