A conversation came up about which is more expensive for the database: a computed column or a trigger.
The example was simple: subtract col 4 from col 3, both non-nullable.
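For concreteness, here's a minimal T-SQL sketch of the two approaches being compared. Table, column, and trigger names are made up for illustration; the computed-column version derives the value automatically, while the trigger version maintains a regular column after each insert or update.

```sql
-- Option 1: computed column (evaluated at query time;
-- add PERSISTED to store it and compute only on writes)
CREATE TABLE dbo.ExampleCC (
    Id   INT IDENTITY(1,1) PRIMARY KEY,
    Col3 INT NOT NULL,
    Col4 INT NOT NULL,
    Diff AS (Col3 - Col4)
);

-- Option 2: regular column kept in sync by an AFTER trigger
CREATE TABLE dbo.ExampleTrig (
    Id   INT IDENTITY(1,1) PRIMARY KEY,
    Col3 INT NOT NULL,
    Col4 INT NOT NULL,
    Diff INT NULL
);
GO
CREATE TRIGGER dbo.trg_ExampleTrig_Diff
ON dbo.ExampleTrig
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;
    -- The trigger fires once per statement and must join back
    -- to the inserted rows, which is extra work on every write.
    UPDATE t
    SET Diff = t.Col3 - t.Col4
    FROM dbo.ExampleTrig AS t
    JOIN inserted AS i ON i.Id = t.Id;
END;
```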
So, is a trigger more expensive to run than a computed column?
My WAG (wild guess) was the computed column.
Any takers with some proof? My Google searches only turned up Oracle and DB2 references :(
Hi, Stephen,
I don't have any proof or websites. My guess is that the trigger is more expensive; I'd be rather surprised if it weren't. (If you don't get any proof here, you may want to post the question on the Microsoft public newsgroup for SQL Server.)
Kevin