Date: 23/03/1997 14:15:36
Forum: Visual FoxPro
Category: Third party products
Title: Miscellaneous
Thread ID: 00022381
Message ID: 00025496
Views: 54
>I'm planning to use SQL Server as a back end to a client server app with these LARGE (2G+) tables using FoxFire! - I figure if I send the query to SQL Server, it should handle it better than it can in fox- don't you think?

If the tables really are larger than 2 gig, then VFP becomes problematic: a VFP table tops out at 2 GB. But heed fair warning: SQL Server is a monolithic, cumbersome sonabitch to use for analytics when compared to VFP. Just getting data into SQL Server is a pain.

Some suggestions:
1. Use native SQL pass-through and send optimized Transact-SQL queries.
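A minimal pass-through sketch from the VFP side, assuming a hypothetical ODBC data source named "MyServer" and a login of your own; the Transact-SQL string goes to the server untouched and the results come back as a local cursor:

```foxpro
* Connect via an ODBC data source (name and login are assumptions).
lnHandle = SQLCONNECT("MyServer", "sa", "")
IF lnHandle > 0
   * The query text is executed by SQL Server, not by VFP's engine;
   * results land in the local cursor csrResult.
   SQLEXEC(lnHandle, "SELECT fldField1, fldField2 FROM tblData", "csrResult")
   SQLDISCONNECT(lnHandle)
ENDIF
```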

2. Use optimization hints in your Transact-SQL code... ESPECIALLY for specifying an index to use on a query. SQL Server's query optimizer loves to choose long, long table scans instead of using an index when the query becomes even marginally complicated. Here's an example. Note the optimizer hint in the FROM clause:
SELECT fldField1, fldField2, fldField3, etc...
INTO tempdb..MyTempTable
FROM tblData (INDEX = IDXtblIndex1)
WHERE fldICD9DX1 BETWEEN '885' AND '8877'
   OR fldICD9DX1 BETWEEN '8951' AND '8977'
   OR fldICD9DX1 IN ('878', '9059')
blah...blah

3. You might want to experiment with the SELECT ... INTO technique (as used above): select into a temporary table, build your view on that temporary table, and let the ODBC driver return its contents to a VFP view where you can manipulate the data locally. This assumes that you will not be updating any data in the original table.
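The second half of that round trip can be sketched like so, assuming an open pass-through handle lnHandle and the temp table built by the query above; pulling from the small temp table is far cheaper than re-running the big query:

```foxpro
* Fetch the pre-computed temp table into a local VFP cursor.
SQLEXEC(lnHandle, "SELECT * FROM tempdb..MyTempTable", "csrLocal")
* csrLocal can now be browsed, indexed, and reported on locally.
```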

4. At the end of a query run, you can issue a DROP TABLE command to SQL Server via SQL pass-through to immediately remove any temporary tables that you explicitly created.
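Cleanup is one more pass-through call, again assuming the handle and temp table names from the earlier sketches; dropping the table yourself frees tempdb space right away instead of waiting for the connection to close:

```foxpro
* Explicitly drop the temp table we created with SELECT ... INTO.
SQLEXEC(lnHandle, "DROP TABLE tempdb..MyTempTable")
SQLDISCONNECT(lnHandle)
```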