Optimizing massive update
Message
From: 04/04/2015 20:53:16
To: 04/04/2015 13:00:30
General information
Forum: Microsoft SQL Server
Category: Other
Environment versions
SQL Server: SQL Server 2012
Application: Web
Miscellaneous
Thread ID: 01617566
Message ID: 01617789
Views: 36
>Our scenario was different in that we were doing mostly inserts and deletes. If your job is truly unique updates, I'm not sure how to implement that as a batch process. Could you create datatables or Temp tables of your modified data, then use MERGE to update the target tables?

The closest thing I have found would be to create a temporary table containing all the fields. The unchanged fields would simply keep their current values, while the updated ones would hold the new values. A single set-based statement would then replace every field in the target table.

But, since that replace would touch all the fields, I am not sure it would be faster.
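As a rough sketch of the temp-table + MERGE idea discussed above (the table `dbo.Customers`, its columns, and the `#CustomerChanges` temp table are all hypothetical names for illustration; the real schema would differ):

```sql
-- Stage the modified rows: every column is present, so unchanged
-- columns carry their current values and changed ones the new values.
CREATE TABLE #CustomerChanges
(
    CustomerID INT NOT NULL PRIMARY KEY,
    Name       NVARCHAR(100),
    Email      NVARCHAR(255)
);

-- ... bulk-load #CustomerChanges here (e.g. BULK INSERT or a
-- table-valued parameter from the web application) ...

-- One set-based pass over the target instead of row-by-row updates.
MERGE dbo.Customers AS target
USING #CustomerChanges AS source
    ON target.CustomerID = source.CustomerID
WHEN MATCHED THEN
    UPDATE SET target.Name  = source.Name,
               target.Email = source.Email;

DROP TABLE #CustomerChanges;
```

Whether this beats individual UPDATEs depends on the workload; adding a `WHEN MATCHED AND (target.Name <> source.Name OR target.Email <> source.Email)` guard can skip rows whose values did not actually change, at the cost of comparing every column.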
Michel Fournier
Level Extreme Inc.
Designer, architect, owner of the Level Extreme Platform
Subscribe to the site at https://www.levelextreme.com/Home/DataEntry?Activator=55&NoStore=303
Subscription benefits https://www.levelextreme.com/Home/ViewPage?Activator=7&ID=52