Any Experience with SQL Server 2000 speed?
Message

General information
Forum: Visual FoxPro
Category: Client/server, Miscellaneous
Thread ID: 00812241
Message ID: 00819258
Views: 12
Petros:

SQL Server has DTS.
DTS uses VBScript, so you can write whatever transformation script you need
and call any DLL's statistical functions.
VBScript itself is SLOW at doing anything, though. Hey, it's VB, right?

Anyway: DTS has a Bulk Insert task (or BCP, ugh).
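
For reference, the DTS task is basically wrapping the T-SQL BULK INSERT statement. A rough sketch; the file path, staging table, and delimiters here are made up for illustration:

-------------------------
-- Load a delimited export file into a staging table (names and path are hypothetical)
BULK INSERT dbo.pay_data_stage
FROM 'c:\imports\pay_data.txt'
WITH
(
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n',
    TABLOCK            -- take a table lock for a faster bulk load
)
GO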

The question: does the data need to be scrubbed before it's stuck
into SQL tables?

Microsoft wrote DTS to ETL (scrub) incoming data from other formats.
I tried writing VFP code to cook the incoming records
and then importing them into SQL via ODBC... forget it.

If you can do simple data-integrity and bounds checks on the data
before SQL Server gets it, writing the T-SQL should be no different
than the VFP code. It depends on the stats processing, right?
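
A bounds check of that sort is a one-liner in T-SQL once the rows are staged. A sketch, using the hypothetical staging table from the BULK INSERT example above:

-------------------------
-- Throw out rows that fail simple integrity/bounds checks (sketch; columns mirror pay_data)
DELETE FROM dbo.pay_data_stage
WHERE total_dollars < 0
   OR total_dollars > 999999999.99
   OR ISDATE(timestop) = 0
GO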

My solution was to import the cleaned data via BULK INSERT into
SQL temp tables, then process it via T-SQL using non-procedural, set-based code:
NO LOOPING!!!!
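
Here is what "no looping" means in practice: once the file is in the staging table, moving it into the real table is one INSERT...SELECT, not a fetch loop. Table names are the same hypothetical ones as above:

-------------------------
-- Move scrubbed rows into the permanent table in a single set-based statement
INSERT INTO dbo.pay_data (position_id, timestop, total_dollars)
SELECT position_id, timestop, COALESCE(total_dollars, 0)
FROM dbo.pay_data_stage
WHERE total_dollars >= 0
GO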

I optimized a dupe-removal routine this way:
using Query Analyzer, I went from thousands of T-SQL stored-procedure
recompiles to a few elegant set-based T-SQL commands.
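
The general set-based pattern on SQL 2000 (no ROW_NUMBER yet) looks roughly like this: keep one row per key and delete the rest in a single statement. The id column and the choice of key columns here are assumptions:

-------------------------
-- Set-based dupe removal (sketch): keep the lowest id for each (position_id, timestop) pair
DELETE FROM dbo.pay_data
WHERE id NOT IN
(
    SELECT MIN(id)
    FROM dbo.pay_data
    GROUP BY position_id, timestop
)
GO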

Also check out table variables:
they smoke cursors:

-------------------------
declare @mySE_ID int                      -- position being averaged; a parameter in the original proc

declare @temp1 table
(
    [ts] datetime      not null,
    [td] numeric(12,2) not null
)

/* DO DA MATH */
insert into @temp1 (ts, td)
select top 8 timestop, coalesce(total_dollars, 0)
from pay_data
where position_id = @mySE_ID
  and total_dollars > 0
order by timestop desc                    -- TOP needs an ORDER BY to pick a deterministic 8 rows

select avg(td) as avg_dollars from @temp1
GO
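
For contrast, here is the same average done with a cursor. This is just a sketch against the same columns, and since @mySE_ID went out of scope at the GO above it is re-declared here; the row-by-row fetch loop is exactly the overhead the table variable version avoids:

-------------------------
-- Cursor version of the same average (sketch): row-by-row fetches, far more overhead
declare @mySE_ID int                      -- same position id as above
declare @dollars numeric(12,2), @sum numeric(12,2), @n int
set @sum = 0
set @n = 0

declare pay_cur cursor local fast_forward for
    select top 8 coalesce(total_dollars, 0)
    from pay_data
    where position_id = @mySE_ID
      and total_dollars > 0
    order by timestop desc

open pay_cur
fetch next from pay_cur into @dollars
while @@fetch_status = 0
begin
    set @sum = @sum + @dollars
    set @n = @n + 1
    fetch next from pay_cur into @dollars
end
close pay_cur
deallocate pay_cur

select case when @n > 0 then @sum / @n end as avg_dollars
GO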