Very Large Application in Visual FoxPro v6.0
Forum: Visual FoxPro
Category: Databases, Tables, Views, Indexing and SQL syntax / Miscellaneous
Thread ID: 00325080
Message ID: 00326163
Views: 26
>
>>You could run into big performance problems with SQL Server, too. I know we've seen that here with very large DBs. Another approach might be to combine your tables into monthly groupings, so you have 12 tables rather than 360 or so, whether you stick with VFP or move to SQL Server, if that's possible... and the same with the 88: combining them into only a few larger tables should help performance.
>
>Thanks to everyone who has taken the trouble to reply and comment. Bruce, you are the closest to what I have already discovered with this application.
>Actually, the daily tables (I receive the data hourly, but combine it into daily tables) are the safeguard that keeps me away from the 2 GB limitation (not a FoxPro limit, but a Windows NT one). Placing all the records into a SQL Server database, or any other type of database, causes a very real performance hit. Queries against large SQL Server (or Sybase/Oracle) databases are just too slow: scanning all records in a query takes much longer than scanning only the daily tables that fall within the date range of the query. My users (DOD) do not wish to wait for results. The method I am using, a temp table holding the file names rather than a temp cursor or an array, is about the fastest of them all. Populating a 366-record, single-field table takes less than a couple of seconds; however, creating a temp cursor or using an array is so close in performance that there is virtually no difference in return time. Comparing the performance of the SQL statements with a similar application running on a mainframe, the FoxPro application is getting the job done in about 1/15 of the time it takes the mainframe to query the database (Sybase), do the necessary field conversions, and then deliver the data in a format for reporting.
>
>In my original post I typo'd the daily table size; the tables are actually around 20 to 300 MB each.
>
>I have a streaming input of a text file once each hour, containing the previous hour's records; I poll that directory and parse/load those records into the daily tables as they arrive. This parsing is done on a continuous basis on an adjacent machine. After parsing, I auto-archive the text files, moving them to CAB files for later backup to CD-ROM. Another program identifies the transactions and produces a variety of reports.
>
>If I find no more magic for making the application even faster, I will start writing a paper describing the application in more detail and the functions it performs. I have not found any other application, including the Microsoft TerraData project, that performs as well as this one does without a big investment in processing power and/or fiber-optic network connections.

Doug,
Have you thought about using arrays? Depending on how big your result sets are, you might be able to pull it off that way. I suspect that the slowest operation in your process is the APPEND FROM into the table that gathers all the results.
Send your SELECTs INTO ARRAY and put the arrays together to populate your grid. Now, populating the grid from an array, that's another problem :)
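For what it's worth, here is a rough sketch of the idea: loop over the daily tables for the requested date range and pull each day's rows INTO ARRAY instead of doing an APPEND FROM into one big results table. The table naming convention (trans_YYYYMMDD.dbf), the field list, and the WHERE clause below are made-up placeholders, not your actual structures:

    * Rough sketch only -- table names, fields, and filter are hypothetical
    LOCAL ldFrom, ldTo, ldDay, lcTable, lnRows, i
    LOCAL laDay(1)
    ldFrom = {^2001-03-01}
    ldTo   = {^2001-03-07}

    * Cursor that gathers the rows from every daily table in the range
    CREATE CURSOR curResults (cAcct C(10), nAmount N(12,2), dTrans D)

    ldDay = ldFrom
    DO WHILE ldDay <= ldTo
        lcTable = "c:\data\trans_" + DTOS(ldDay) + ".dbf"
        IF FILE(lcTable)
            SELECT cAcct, nAmount, dTrans ;
                FROM &lcTable ;
                WHERE nAmount > 1000 ;
                INTO ARRAY laDay
            lnRows = _TALLY    && 0 when nothing matched, so the loop is skipped
            FOR i = 1 TO lnRows
                INSERT INTO curResults VALUES (laDay(i,1), laDay(i,2), laDay(i,3))
            ENDFOR
        ENDIF
        ldDay = ldDay + 1
    ENDDO

    * curResults (or the arrays themselves) can then feed the grid

If the result sets get too large, the same loop works with APPEND FROM into the cursor instead, since the arrays have to fit in memory.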
