Best? way to process a huge result set over a network
Message
General information
Forum: ASP.NET
Category: Other
Environment versions
Environment: C# 1.1
Database: MS SQL Server
Miscellaneous
Thread ID: 00965818
Message ID: 00965879
Views: 8
>>I am working on a business report winform app. It needs a ton of data from our Microsoft SQL Server, which it then processes and transforms into various output tables.
>>
>>When I pull the data table using an SP, it is very big, about 300,000 rows with a large number of columns, but it comes from the DB quickly enough. The rows can be processed one at a time. The client PC that is doing all the data crunching has about 512MB of RAM, and its hard drive is spinning constantly.
>
>Don't go to a client machine. Keep it on the server via an SP for the gross crunch.
>
So far we are doing as much preprocessing as we can in the SP, but we have been trying to keep math and string manipulation out of it and doing that work in C# instead. Our DBA, and what I have read, both suggest avoiding those kinds of operations in SPs.

>>When I test with a small dataset, the data is processed quickly, but with larger result sets the processing time increases disproportionately. .NET seems to handle a large table inefficiently, even when just iterating through its rows, and the machine constantly swaps between RAM and the hard disk.
>
>How are you moving through the ds?
>
We are moving through the DataSet using foreach loops.
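For what it's worth, one alternative we are weighing is to drop the DataSet entirely and stream the rows with a SqlDataReader, which keeps only the current row in memory. A rough sketch; the connection string, SP name, and column ordinals are all placeholders:

using System;
using System.Data;
using System.Data.SqlClient;

class ReaderSketch
{
    static void Main()
    {
        // Placeholder connection string and SP name.
        string connStr = "Server=myServer;Database=myDb;Integrated Security=SSPI;";
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            SqlCommand cmd = new SqlCommand("usp_GetReportRows", conn);
            cmd.CommandType = CommandType.StoredProcedure;
            conn.Open();

            // Unlike DataAdapter.Fill, which buffers all 300,000 rows in a
            // DataSet, the reader holds only the current row in memory.
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    int key = reader.GetInt32(0);       // typed access by ordinal
                    string field = reader.GetString(1);
                    // ... crunch this row, then let it go ...
                }
            }
        }
    }
}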

>>These are the options that I am considering. Any additional ideas or criticisms?
>>2. Rewrite the stored procedure to only return the "TOP @batchSize" rows, reapplying all the selection criteria and table joins every time to keep the result set small.
>
>BINGO!!!
>
I wasn't sure whether I would save work on the server by avoiding reapplying a lot of complicated selection criteria, or whether creating and consuming the rows in a temp table would use more DB resources. I assume additional overhead is incurred every time the server receives a query, updates temp tables, and sends out results.
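To make the batching concrete, here is a rough sketch of the client-side loop I have in mind. The SP name usp_GetReportBatch, its parameters, and the key column are all hypothetical; it would return the next @batchSize rows with key > @lastKey, ordered by key. (If this is SQL Server 2000, where TOP does not take a variable, SET ROWCOUNT @batchSize inside the SP can stand in for it.)

using System;
using System.Data;
using System.Data.SqlClient;

class BatchSketch
{
    static void Main()
    {
        // Placeholder connection string.
        string connStr = "Server=myServer;Database=myDb;Integrated Security=SSPI;";
        int batchSize = 5000;
        int lastKey = 0;     // highest key value processed so far
        bool more = true;

        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            while (more)
            {
                // Hypothetical SP: returns the next @batchSize rows with
                // key > @lastKey, ordered by key.
                SqlCommand cmd = new SqlCommand("usp_GetReportBatch", conn);
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.Add("@lastKey", SqlDbType.Int).Value = lastKey;
                cmd.Parameters.Add("@batchSize", SqlDbType.Int).Value = batchSize;

                int rows = 0;
                using (SqlDataReader reader = cmd.ExecuteReader())
                {
                    while (reader.Read())
                    {
                        lastKey = reader.GetInt32(0);  // key column is ordinal 0
                        // ... crunch this row ...
                        rows++;
                    }
                }
                // A short batch means the result set is exhausted.
                more = (rows == batchSize);
            }
        }
    }
}

Seeking past the last key this way keeps each batch cheap, rather than making the server skip ever more rows on every call.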

>>4. Pull all the data onto the client, write it to an XML file, then consume the records from a stream, but I think this will really burn the client CPU.
>
>Could you have a processing SQL Server for batch work handy? If so, truncate the work tables, and get an XML file from the OLAP server for real data. Crunch away on your process on that other server. I know of a local cotton company that does this with 15 gig XML files. Yes, GIG!

We could look into running another SQL Server on or near the client. I will suggest it and find out what the licensing would cost and whether it would be worth it for this report. There are certainly other reports, but the client is a desktop PC, so I don't know how well it would perform.

>>How efficiently does .NET serialize and deserialize XML?
>Really fast when you do it right.
If you have a short list of DOs and DON'Ts, I would appreciate them.
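In the meantime, the main DO I keep seeing for .NET 1.1 is to stream large files with XmlTextReader instead of loading them into an XmlDocument. A minimal sketch, assuming a hypothetical file of repeated <row> elements:

using System;
using System.Xml;

class XmlStreamSketch
{
    static void Main()
    {
        // Placeholder file and element names.
        XmlTextReader reader = new XmlTextReader("report.xml");
        try
        {
            while (reader.Read())
            {
                // React to each <row> element as it streams past; unlike
                // XmlDocument.Load, nothing else is kept in memory.
                if (reader.NodeType == XmlNodeType.Element && reader.Name == "row")
                {
                    string id = reader.GetAttribute("id");
                    // ... process one record ...
                }
            }
        }
        finally
        {
            reader.Close();
        }
    }
}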

Thanks for your suggestions.
David S. Alexander
Kettley Publishing
20271 SW Birch Street, 2nd Floor
Newport Beach, CA 92660-1752