Obtaining a higher precision in timestamp
Message
From: 22/03/2014 12:13:42
To: 21/03/2014 12:57:24
General information
Forum: Microsoft SQL Server
Category: Other
Environment versions
SQL Server: SQL Server 2008
Application: Web
Miscellaneous
Thread ID: 01597037
Message ID: 01597117
Views: 40
>>Michel, why doesn't that apply? I realize it might mean changing application code (to avoid having the application layer do it), but beyond that, why wouldn't it apply? Just curious.
>
>I have in my data dictionary a field for a default value, but in the code this is converted as is. By data dictionary, I mean my own set of tables which controls basically everything; it is not related to the backend. So, in the end, the code ends up generating a line such as Set ModDate = DateTime.Now, to give a pseudo-code example. So, the SysDateTime functionality of SQL Server is not applicable from my code. It would work if I were in SSMS, for example, doing a direct data insertion or updating a record.

I realize you've taken a different approach. But generally speaking, the recommended practice is for the database engine to handle these as default values.
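
For example (just a sketch - the table and constraint names here are made up), a default constraint on a datetime2 column lets the engine stamp the value at insert time:

CREATE TABLE dbo.Orders
(
    OrderID int IDENTITY(1,1) PRIMARY KEY,
    -- the engine fills this in whenever the application omits the column
    ModDate datetime2(7) NOT NULL
        CONSTRAINT DF_Orders_ModDate DEFAULT (SYSDATETIME())
);

-- the insert never mentions ModDate, so the value is taken at write time
INSERT INTO dbo.Orders DEFAULT VALUES;

Keep in mind a default only fires on INSERT; keeping ModDate current on UPDATE would still need a trigger or an explicit SYSDATETIME() in the UPDATE statement.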

Let me ask this - since you elected to go with the more precise datetime2 (down to 100 nanoseconds), is there an argument to be made that you're losing a tiny level of precision by assigning the mod date in the application layer with the VB DateTime.Now?

It seems to me that any gain from going to datetime2 is potentially lost if there's any latency (however slight) between the time the application layer assigns the value and the time the row is actually written. Maybe there's an explanation for this - but so far, I haven't seen any reason against using database default values. Generally, that's what they're there for.
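
To illustrate what I mean (again just a sketch against the made-up table above), you can compare the fractional seconds the server itself reports, and stamp the value inside the UPDATE so it reflects the actual write:

-- GETDATE() returns datetime (rounded to roughly 3 ms increments);
-- SYSDATETIME() returns datetime2(7), with fractional seconds down to 100 ns
SELECT GETDATE() AS old_style, SYSDATETIME() AS new_style;

-- the timestamp is evaluated by the engine when the statement runs,
-- so no application-to-server latency is baked into the stored value
UPDATE dbo.Orders
SET ModDate = SYSDATETIME()
WHERE OrderID = 1;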