Very Large Application in Visual FoxPro v6.0
Message
From: 31/01/2000 13:23:39
To: 31/01/2000 13:15:43
General information
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax; Miscellaneous
Thread ID: 00325080
Message ID: 00325155
Views: 19
You can get an evaluation copy of this product at www.kx.com.

Don't worry about the download time: the demo is only 200K!

At http://www.kx.com/technical/tech_kdb.html, you can get some introductory papers.

>Intriguing, Rodolfo.
>
>Personally, I have never heard of this product. Can you post more information about it?
>
>
>>I think your best choice is to move all your data to a single SQL Server table.
>>SQL Server doesn't have the VFP 2 GB table size limit and will work well, but for peak performance I suggest you try the KDB data engine. KDB easily handles gigabytes of data, has time-series analysis features, and you can get a demo for evaluation at kx.com. It works with ODBC, but it can also read a DBF directly.
>>
>>>I have developed a very large application in VFP 6.0 that has 360 identically structured tables (a naming convention based on the transaction date is used, e.g. xxx990601.dbf). Each table contains 24 hours of transaction data (approximately 300 to 800K records). The total size of one year's worth of transaction tables is around 4.6 GB, though each table is around 2.5 MB.
>>>One program I am developing selects certain transactions (I am using SQL SELECT) with a lengthy filter: first by date, then location, then transaction type. Since I must scan all of the transaction tables, to get some speed I run a DIR *.DBF command to a temp file and then import the file names into a single-field table. The first step of my query scans this table for the data date embedded in the file name; when a name is reached that meets the date criteria, the program opens that file, performs the SELECT into a table ("mytable.dbf"), and then scans for the next transaction table. To keep the SQL from overwriting the data collected previously, I use an APPEND FROM into another temp table, "mytemp.dbf". After the scan finishes, I open a grid in which to view the results, with an incremental search ability, plus graphing and reporting. When exiting this view, I give the user the option of saving the view to a file for later perusal.
>>>Another program beats each transaction against a series of joined tables (88 tables totalling 4.6 GB) to retrieve identification and other statistics for later analysis.
>>>Currently this jump-through-hoops SQL produces a view in approximately 1.5 minutes of processing on a stand-alone PC.
>>>Any comments on better handling of this application?
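The scan-and-append pass described above can be sketched directly in VFP. ADIR() collects the matching file names in one call, which avoids the DIR-to-temp-file and import steps; the prefix, date range, filter variables, and the assumption that MYTEMP.DBF is already open with the same structure are all illustrative:

```foxpro
* Sketch of the date-partitioned scan-and-append pass; the prefix,
* date range, filter values and MYTEMP layout are assumptions.
lcPrefix = "XXX"               && file-name prefix before the YYMMDD stamp
lcFrom   = "990601"            && inclusive date range, as YYMMDD strings
lcTo     = "990630"
lcLoc    = "001"               && hypothetical location code
lcType   = "SALE"              && hypothetical transaction type

* ADIR() fills an array with the matching .DBF names.
lnCount = ADIR(laFiles, lcPrefix + "*.DBF")

FOR lnI = 1 TO lnCount
    * Recover the YYMMDD stamp embedded in the file name.
    lcStamp = SUBSTR(laFiles[lnI, 1], LEN(lcPrefix) + 1, 6)
    IF BETWEEN(lcStamp, lcFrom, lcTo)   && string compare works for YYMMDD
        SELECT * FROM (laFiles[lnI, 1]) ;
            WHERE location = lcLoc AND trans_type = lcType ;
            INTO CURSOR curHit NOFILTER  && NOFILTER forces a real cursor
        SELECT mytemp                    && assumed open, same structure
        APPEND FROM DBF("curHit")
        USE IN curHit
    ENDIF
ENDFOR
```

Because the tables are only ever touched when their file-name date passes the filter, this keeps the same early-skip behaviour as the temp-table approach while dropping one intermediate step.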
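If the data were consolidated into a single SQL Server table as suggested earlier in the thread, the same filter could run through VFP SQL pass-through; the DSN name, credentials, table, and column names below are assumptions:

```foxpro
* Sketch of the same filter against one consolidated SQL Server table.
ldFrom = {^1999/06/01}
ldTo   = {^1999/06/30}
lcLoc  = "001"
lcType = "SALE"

lnHandle = SQLCONNECT("TransDSN", "user", "password")
IF lnHandle > 0
    * ?var sends the VFP variable as a query parameter.
    lnOK = SQLEXEC(lnHandle, ;
        "SELECT * FROM transactions" + ;
        " WHERE trans_date BETWEEN ?ldFrom AND ?ldTo" + ;
        " AND location = ?lcLoc AND trans_type = ?lcType", "curHit")
    IF lnOK > 0
        SELECT curHit
        BROWSE    && or feed the existing grid / graphing / reporting code
    ENDIF
    SQLDISCONNECT(lnHandle)
ENDIF
```

This pushes the date/location/type filtering to the server, so only the matching rows cross the wire, and it sidesteps the 2 GB .DBF limit entirely.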