I want to know whether there is a compelling reason to choose one over the other. What is the difference between the money data type and the decimal data type? Most SQL Server samples (e.g.
the AdventureWorks database) use money, not decimal, for things like price information.
Should I just continue to use the money data type, or is there a benefit to using decimal instead?
In this article, we look at the differences between storing monetary values in SQL Server using the money data type versus the decimal data type. Converting money data: when you convert to money from integer data types, units are assumed to be monetary units. For example, the integer value 4 is converted to the money equivalent of 4 monetary units. The following example converts smallmoney and money values to varchar and decimal data types, respectively.
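A minimal sketch of those conversions (the variable names and literal values are illustrative, not from the original sample):

    -- Integer to money: the integer 4 becomes 4 monetary units (4.00).
    SELECT CAST(4 AS money) AS int_to_money;

    -- smallmoney to varchar and money to decimal, respectively.
    DECLARE @mymoney_sm smallmoney = 3148.29,
            @mymoney    money      = 3148.29;

    SELECT CAST(@mymoney_sm AS varchar(20))   AS sm_money_varchar,
           CAST(@mymoney    AS decimal(10,2)) AS money_decimal;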
So the major difference is that the money data types are smaller but have a fixed size. Decimal, on the other hand, takes up more space, and when you multiply decimal values, the result's precision and scale (and therefore its storage size) are derived from the operands and can change. Personally, that means I'm going to store invoice amounts (for example) as money (or even smallmoney where I can) and tax rates as decimal. For financial calculations, the critical differences between SQL Server's money and decimal data types come down to precision loss, performance gains, and best practices for storage versus computation.
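A commonly cited way to see that precision loss (the literals here are illustrative): money keeps only four decimal places in the intermediate quotient, while decimal carries more precision through the calculation.

    DECLARE @mon1 money = 100, @mon2 money = 339,
            @mon3 money = 10000, @mon4 money,
            @num1 decimal(19,4) = 100, @num2 decimal(19,4) = 339,
            @num3 decimal(19,4) = 10000, @num4 decimal(19,4);

    -- money truncates the intermediate quotient 100/339 to 0.2949 ...
    SET @mon4 = @mon1 / @mon2 * @mon3;   -- 2949.0000
    -- ... while decimal's derived intermediate type keeps more digits.
    SET @num4 = @num1 / @num2 * @num3;   -- 2949.8525

    SELECT @mon4 AS money_result, @num4 AS decimal_result;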
Decimal(x, y) is generally the recommended approach for storing and calculating currency values in SQL Server, as it grants finer control over precision, delivers consistent arithmetic, and aligns better with modern database design best practices.
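As a sketch of that recommendation (the table and column names are hypothetical), decimal(19,4) covers money's range at the same scale while keeping precision explicit:

    CREATE TABLE dbo.Invoice (
        InvoiceID int IDENTITY(1,1) PRIMARY KEY,
        Amount    decimal(19,4) NOT NULL,  -- same scale as money, explicit precision
        TaxRate   decimal(9,6)  NOT NULL   -- rates usually need more scale than amounts
    );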
Money vs. decimal in SQL Server comes down to accuracy and precision. Choosing the appropriate numeric data type is crucial for database design, especially when dealing with financial data, measurements, or any other form of precise numeric storage. The money data type in SQL Server is shrouded in mystery: when should you avoid it, and what advantages does it bring?
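Its chief advantage is a small, fixed storage size. A quick probe of that (a sketch; DATALENGTH reports the bytes each value occupies):

    SELECT DATALENGTH(CAST(1.23 AS smallmoney))    AS smallmoney_bytes, -- 4
           DATALENGTH(CAST(1.23 AS money))         AS money_bytes,      -- 8
           DATALENGTH(CAST(1.23 AS decimal(19,4))) AS decimal_bytes;    -- 9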