Optimizing massive update
Message
From: 03/04/2015 12:17:46
To: 31/03/2015 20:34:39
General information
Forum: Microsoft SQL Server
Category: Other
Environment versions
SQL Server: SQL Server 2012
Application: Web
Miscellaneous
Thread ID: 01617566
Message ID: 01617769
Views: 59
Thinking more about it, I am looking into whether using a temporary table might help in this case.

If I create a temporary table like this:
CREATE TABLE #Temp (PrimaryKey Int, Field Char(50), Value Char(50))
...and populate the values like this:
INSERT INTO #Temp (PrimaryKey,Field,Value) VALUES (1,'CustomerFirstName','Michel Fournier')
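To avoid filling #Temp one row at a time (which would just move the round-trip problem), the inserts could be batched with a multi-row VALUES clause; SQL Server accepts up to 1,000 rows per VALUES list. A quick sketch with hypothetical values:
INSERT INTO #Temp (PrimaryKey, Field, Value)
VALUES (1, 'CustomerFirstName', 'Michel Fournier'),
       (2, 'CustomerFirstName', 'John Smith'),
       (3, 'CustomerFirstName', 'Jane Doe')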
I could then use an UPDATE with an INNER JOIN to update the main table in one statement. The problem so far seems to be that sending 10,000 UPDATE commands in a single batch is not any faster. But if I can issue only one UPDATE, using an INNER JOIN, I believe it should be faster.
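A minimal sketch of that single UPDATE, assuming a hypothetical main table named Customer keyed on PrimaryKey, and using the Field column to target one column at a time:
UPDATE c
SET c.CustomerFirstName = t.Value
FROM Customer AS c
INNER JOIN #Temp AS t
        ON t.PrimaryKey = c.PrimaryKey
WHERE t.Field = 'CustomerFirstName'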

The problem here is that I have about 40 fields of various types. I could create a temporary table for each field type, which would mean about five temporary tables, and therefore five updates instead of one. Or maybe someone knows how I could build a single temporary table that holds all the data, like this:
CREATE TABLE #Temp (PrimaryKey Int, Field Char(50), ValueCharacter Char(50), ValueInteger Int, ValueBoolean Bit, ValueDouble Numeric(7,3))
...and write some kind of single UPDATE that can deal with all of it.
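One shape I can imagine for that single UPDATE, assuming hypothetical column names such as Customer, CustomerFirstName and OrderCount, is to pivot #Temp per PrimaryKey with MAX(CASE ...) and join the result once; COALESCE keeps the current value for columns a given row does not touch:
UPDATE c
SET c.CustomerFirstName = COALESCE(p.CustomerFirstName, c.CustomerFirstName),
    c.OrderCount        = COALESCE(p.OrderCount, c.OrderCount)
FROM Customer AS c
INNER JOIN (
    SELECT PrimaryKey,
           MAX(CASE WHEN Field = 'CustomerFirstName' THEN ValueCharacter END) AS CustomerFirstName,
           MAX(CASE WHEN Field = 'OrderCount'        THEN ValueInteger   END) AS OrderCount
    FROM #Temp
    GROUP BY PrimaryKey
) AS p ON p.PrimaryKey = c.PrimaryKey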

Any ideas or suggestions on the topic?
Michel Fournier
Level Extreme Inc.
Designer, architect, owner of the Level Extreme Platform
Subscribe to the site at https://www.levelextreme.com/Home/DataEntry?Activator=55&NoStore=303
Subscription benefits https://www.levelextreme.com/Home/ViewPage?Activator=7&ID=52