General information
Category:
Databases, Tables, Views, Indexing and SQL syntax
I think your best option is to move all your data into a single SQL Server table.
SQL Server doesn't have VFP's 2 GB table size limit and will work well, but for peak performance I suggest you try the KDB data engine. KDB easily handles gigabytes of data, has time-series analysis features, and you can get an evaluation demo at kx.com. It works through ODBC, but it can also read a DBF directly.
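As a rough sketch of the single-table approach (the Transactions table, its columns, and the filter variables below are assumed names for illustration, not anything from your app): once every daily DBF has been appended into one table that carries the transaction date as a column, the whole multi-table scan collapses into a single query against indexed columns:

```foxpro
* Assumed schema: one Transactions table holding all the daily rows,
* with TranDate, Location and TranType columns (names are illustrative).
* Index TranDate (and optionally Location and TranType) for speed.
SELECT * ;
   FROM Transactions ;
   WHERE TranDate BETWEEN ldFrom AND ldTo ;
     AND Location = lcLocation ;
     AND TranType = lcType ;
   INTO CURSOR Results
```

The date filter then uses an index instead of a file-name scan, and adding a year of data is just more rows, not more tables.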
>I have developed a VSL application in VFP 6.0 which has 360 identically structured tables (a naming convention based on the transaction date is used, i.e. xxx990601.dbf). Each table contains 24 hours of transaction data (approximately
>300 to 800K records). The total size of one year's worth of transaction tables is around 4.6 GIGS, but each table is around 2.5 MEGS. One program I am developing requires selecting certain transactions (I am using SQL SELECT)
>using a lengthy filter, first by date, then location, then transaction type. Since I must scan all of the transaction tables,
>in order to get some speed I do a DIR *.DBF command to a temp file and then import the file names into a single-field table.
>The first step of my query scans this table for the data date contained in each file name; when a table is reached which meets the date criteria, it opens that file, performs the SELECT, sends the result into a table ("mytable.dbf"), and then scans for the next transaction table. In order to keep the SQL from overwriting the data collected previously, I use an APPEND FROM into another temp table, "mytemp.dbf". After the scan finishes, I open a grid in which to
>view the results, with an incremental search ability, plus graphing and reporting. When exiting this view, I give the user the option of saving the view to a file for later perusal.
>Another program beats each transaction against a series of joined tables (88 tables totalling 4.6 GIGS)
>to retrieve identification and other statistics for later analysis.
>Currently, this jump-through-hoops SQL produces a view in approximately 1.5 minutes of processing on a stand-alone PC.
>Any comments on better handling of this application?
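For what it's worth, the per-file scan you describe can also be done without the DIR-to-temp-file step: ADIR() fills an array with the matching file names directly. This is only a sketch of that idea; all variable and field names below are assumptions, not your actual ones:

```foxpro
* Sketch only: fill an array with the daily table names instead of
* shelling out to DIR and importing a temp file.
lnCount = ADIR(laFiles, "xxx*.dbf")
FOR lnI = 1 TO lnCount
   lcFile  = laFiles[lnI, 1]
   * File names follow xxxYYMMDD.DBF, so the date stamp sits at chars 4-9
   lcStamp = SUBSTR(lcFile, 4, 6)
   IF BETWEEN(lcStamp, lcFromStamp, lcToStamp)
      SELECT * FROM (lcFile) ;
         WHERE Location = lcLocation AND TranType = lcType ;
         INTO TABLE mytable
      SELECT mytemp            && accumulator table, as in your flow
      APPEND FROM mytable      && append rather than overwrite
   ENDIF
ENDFOR
```

ADIR() removes the temp file and the import step; the rest mirrors the SELECT-then-APPEND flow you already have, so it should drop in without restructuring the program.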