Optimizing massive update
Message
From: 05/04/2015 14:08:51
To: 04/04/2015 20:53:16
General information
Forum: Microsoft SQL Server
Category: Other
Environment versions
SQL Server: SQL Server 2012
Application: Web
Miscellaneous
Thread ID: 01617566
Message ID: 01617792
Views: 36
>>Our scenario was different in that we were doing mostly inserts and deletes. If your job is truly unique updates, I'm not sure how to implement that as a batch process. Could you create datatables or Temp tables of your modified data, then use MERGE to update the target tables?
>
>The closest thing I have found would have been to create a temporary table with all the fields. The unchanged fields would simply keep their existing values, while the updated fields would carry the new values. Then a single overall replace of all the fields would apply.
>
>But since this would touch all the fields, I am not sure it would have been faster.

If there is a way for you to easily test this approach, I would try it. Set-based updates can be vastly quicker than row-by-row processing.
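A minimal sketch of the temp-table-plus-MERGE idea discussed above, assuming a hypothetical target table `dbo.Customers` (all table and column names here are illustrative, not from the thread):

```sql
-- 1. Stage the modified rows (all columns, changed or not) in a temp table.
CREATE TABLE #ModifiedCustomers
(
    CustomerID int PRIMARY KEY,
    Name       nvarchar(100),
    Email      nvarchar(256)
);

-- ...bulk-load #ModifiedCustomers here (e.g. BULK INSERT or a table-valued parameter)...

-- 2. One set-based MERGE instead of thousands of single-row UPDATEs.
MERGE dbo.Customers AS tgt
USING #ModifiedCustomers AS src
      ON tgt.CustomerID = src.CustomerID
WHEN MATCHED THEN
    UPDATE SET tgt.Name  = src.Name,
               tgt.Email = src.Email;
```

Since the job described here is pure updates (no inserts or deletes), a plain `UPDATE` joined to the temp table would do the same work and avoids MERGE's extra matching semantics; either way, the win comes from replacing many single-row statements with one set-based operation.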