Level Extreme Platform
Optimizing massive update
Message
From
01/04/2015 13:32:59
 
General information
Forum: Microsoft SQL Server
Category: Other
Environment versions
SQL Server: SQL Server 2012
Application: Web
Miscellaneous
Thread ID: 01617566
Message ID: 01617653
Views: 37
>I am not sure I can help, since you haven't given many details, but I can share my experience with some big updates. The background: a process creates very compact binary files that I need to parse and store in SQL Server. Each file has 2 million records, and we get a new one every half hour. The first version of the program parsed the binary file record by record and inserted the records into the SQL database, but the process took more than half an hour, which of course made it nonviable, as it would fall behind. I found that the parsing was very fast; the bottleneck was the inserts. So I changed the program to use SQL bulk copy (System.Data.SqlClient.SqlBulkCopy), and now the whole process takes about a minute. HTH
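The speedup described above comes from sending the whole batch of rows to the server in one call instead of issuing one INSERT per record. SqlBulkCopy itself is .NET- and SQL Server-specific; as a hedged, portable sketch of the same idea, here is a Python/sqlite3 analogue (the table and column names are invented for illustration):

```python
# Sketch of the batched-insert idea. Assumption: the original used
# System.Data.SqlClient.SqlBulkCopy in .NET; sqlite3 stands in here
# only to keep the example self-contained and runnable.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT)")

rows = [(i, f"record-{i}") for i in range(10_000)]

# Slow pattern: one statement (and one round trip) per record.
# for r in rows:
#     conn.execute("INSERT INTO records VALUES (?, ?)", r)

# Fast pattern: hand the driver the whole batch at once, analogous to
# SqlBulkCopy.WriteToServer streaming a DataTable/IDataReader to SQL Server.
conn.executemany("INSERT INTO records VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM records").fetchone()[0]
print(count)  # 10000
```

The same principle applies regardless of driver: the per-row cost (network round trip, statement parse, log flush) dominates, so amortizing it over a large batch is what turns half an hour into a minute.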

Yes, I have an InsertInBatch class for bulk inserts. In this case, however, it is an update. I am adjusting the framework to support bulk updates, and I am still fine-tuning some of that.
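A common way to get bulk-insert speed for an update is to bulk-load the incoming values into a staging table and then apply them in one set-based UPDATE joined against the target (on SQL Server: SqlBulkCopy into the staging table, then `UPDATE ... FROM`). As a hedged sketch of that pattern, again using sqlite3 with invented table names so the example runs as-is:

```python
# Staging-table bulk update: batch-load the changes, then one set-based
# UPDATE instead of thousands of per-row UPDATE statements. On SQL Server
# the load step would be SqlBulkCopy and the update step UPDATE ... FROM;
# sqlite3 is used here only to keep the sketch self-contained.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO target VALUES (?, ?)",
                 [(i, "old") for i in range(5)])

# Stage the incoming changes with one batched call.
conn.execute("CREATE TEMP TABLE staging (id INTEGER PRIMARY KEY, val TEXT)")
conn.executemany("INSERT INTO staging VALUES (?, ?)",
                 [(1, "new"), (3, "new")])

# One set-based statement updates every matched row.
conn.execute("""
    UPDATE target
    SET val = (SELECT s.val FROM staging s WHERE s.id = target.id)
    WHERE id IN (SELECT id FROM staging)
""")
conn.commit()

result = [row for row in conn.execute("SELECT id, val FROM target ORDER BY id")]
print(result)  # [(0, 'old'), (1, 'new'), (2, 'old'), (3, 'new'), (4, 'old')]
```

The staging table also makes the change set easy to validate or roll back before the single UPDATE commits it.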
Michel Fournier
Level Extreme Inc.
Designer, architect, owner of the Level Extreme Platform
Subscribe to the site at https://www.levelextreme.com/Home/DataEntry?Activator=55&NoStore=303
Subscription benefits https://www.levelextreme.com/Home/ViewPage?Activator=7&ID=52