I was testing a typeclass instance for `BigDecimal` in my personal FP library using Hedgehog, and found an issue with `BigDecimal` in Scala 2.13.
```
[info] - just.fp.MonoidSpec.testBigDecimalMonoidLaw: Falsified after 3 passed tests
[info] > a1: A: 1.0000000000000002E+36
[info] > a2: A: -9223372036854775808
[info] > a3: A: -9223372036835934692
[info] > monoidLaw.associativity
```
It says the associativity law test failed, which means `((a1 |+| a2) |+| a3)` is not equal to `(a1 |+| (a2 |+| a3))`.
That was a `Monoid` typeclass instance for `BigDecimal`, but it also means the following code returns `false`.
```scala
val a = BigDecimal("1.0000000000000002E+36")
val b = BigDecimal("-9223372036854775808")
val c = BigDecimal("-9223372036835934692")

((a + b) + c) == (a + (b + c)) // returns false in 2.13
```
Before Scala 2.13, it was `true`, and in Java 11 it is still `true`.
I found that it had already been reported as a bug, though it looks more like a deliberate change to `BigDecimal` in Scala 2.13.
Reported issue: Regression: BigDecimal is broken on 2.13.0, works on 2.12.8
`BigDecimal` carries a `MathContext`, which holds a precision and a `roundingMode`. The precision is the number of significant digits used for an operation, so it determines how accurate the calculated result is.
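To see what the precision does, here is a small illustration against `java.math.BigDecimal` directly (the values are arbitrary, chosen just for this post):

```scala
import java.math.{BigDecimal => JBigDecimal, MathContext}

// A MathContext with precision 5: every result is rounded
// to 5 significant digits (HALF_UP by default).
val mc = new MathContext(5)

// 123456 + 1 = 123457 exactly, but only 5 digits are kept.
val sum = new JBigDecimal("123456").add(new JBigDecimal("1"), mc)
println(sum) // 1.2346E+5
```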
The default `MathContext` has been `MathContext.DECIMAL128`, but before 2.13 it was ignored by some arithmetic operations.
For instance, in Scala 2.12.10, the `+` operation is implemented this way.

```scala
def + (that: BigDecimal): BigDecimal =
  new BigDecimal(this.bigDecimal add that.bigDecimal, mc)
```
As you can see, it passes `mc` (the `MathContext`) to the newly created `BigDecimal`, but not to the `add` method. The `add` method without a `MathContext` parameter comes from Java's `BigDecimal` and operates with unlimited precision, so it is the same as using `MathContext.UNLIMITED`.
In Scala 2.13, it's

```scala
def + (that: BigDecimal): BigDecimal =
  new BigDecimal(this.bigDecimal.add(that.bigDecimal, mc), mc)
```
This `add` takes both a `BigDecimal` and a `MathContext` and uses the given `MathContext` for the `add` operation.
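The difference is easy to reproduce directly against `java.math.BigDecimal` (which is what both versions delegate to), using the values from the failing test:

```scala
import java.math.{BigDecimal => JBigDecimal, MathContext}

val a = new JBigDecimal("1.0000000000000002E+36")
val b = new JBigDecimal("-9223372036854775808")
val c = new JBigDecimal("-9223372036835934692")

// 2.12-style: the one-arg add is exact, so addition stays associative.
val exactAssoc = a.add(b).add(c).compareTo(a.add(b.add(c))) == 0
println(exactAssoc) // true

// 2.13-style: each add rounds to DECIMAL128's 34 significant digits,
// and the intermediate rounding breaks associativity.
val mc = MathContext.DECIMAL128
val roundedAssoc = a.add(b, mc).add(c, mc).compareTo(a.add(b.add(c, mc), mc)) == 0
println(roundedAssoc) // false
```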
So that means Scala prior to 2.13 actually had a bug in `BigDecimal`, and thus the wrong behaviour. It is finally fixed in Scala 2.13, but then what about all the existing applications using `BigDecimal` and relying on that bug?
All those applications creating `BigDecimal` with its primary constructor might break as soon as they upgrade Scala to 2.13. It can be fixed by passing a `MathContext` when creating a `BigDecimal` instance (e.g. `BigDecimal(n, MathContext.UNLIMITED)`). Still, even though it was a bug, if the bug is used everywhere and there are applications relying on it, I don't think it should be fixed.
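For example, constructing with an explicit `MathContext.UNLIMITED` restores the old exact behaviour on 2.13, since `+` uses the left operand's `MathContext` (shown here with the values from the failing test):

```scala
import java.math.MathContext

// Passing MathContext.UNLIMITED explicitly keeps + exact on 2.13,
// because the operation uses the operand's own MathContext.
val a = BigDecimal("1.0000000000000002E+36", MathContext.UNLIMITED)
val b = BigDecimal("-9223372036854775808", MathContext.UNLIMITED)
val c = BigDecimal("-9223372036835934692", MathContext.UNLIMITED)

println(((a + b) + c) == (a + (b + c))) // true, even on 2.13
```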
There are three things to learn from this issue.
- Default values suck. Using default parameters can be dangerous too.
- Once a behaviour is established and used everywhere, it should probably not be changed.
- Property-based testing rocks! I couldn't have found this issue if I hadn't done property-based testing. I recommend Scala Hedgehog for property-based testing, or just unit testing as well.
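To sketch the idea without the Hedgehog dependency (the generator below is my own assumption, not Hedgehog's API), a hand-rolled version of the associativity property could look like this:

```scala
import scala.util.Random

// Hand-rolled associativity property check (a sketch, not the Hedgehog API):
// generate random BigDecimals at varying magnitudes and collect counterexamples.
def genBigDecimal(rng: Random): BigDecimal =
  BigDecimal(rng.nextLong()) * BigDecimal(10).pow(rng.nextInt(20))

val rng = new Random(42)
val counterexamples = (1 to 1000).flatMap { _ =>
  val (a, b, c) = (genBigDecimal(rng), genBigDecimal(rng), genBigDecimal(rng))
  if (((a + b) + c) == (a + (b + c))) None else Some((a, b, c))
}

// On 2.13, with the default DECIMAL128 context, this finds counterexamples;
// on 2.12 the list stays empty.
println(s"counterexamples: ${counterexamples.size}")
```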
I need to point out one thing, though. Regarding the second point, Scala's `BigDecimal` case is tricky because the only operations that ignore the internal `MathContext` are `+` and `-`; `*` and `/` do pass the `MathContext`. So the arithmetic operations were inconsistent.
```scala
/* Arithmetic operations in BigDecimal from Scala 2.12.10 */

def + (that: BigDecimal): BigDecimal =
  // it does not pass mc to add()
  new BigDecimal(this.bigDecimal add that.bigDecimal, mc)

def - (that: BigDecimal): BigDecimal =
  // it does not pass mc to subtract()
  new BigDecimal(this.bigDecimal subtract that.bigDecimal, mc)

def * (that: BigDecimal): BigDecimal =
  // it passes mc to multiply()
  new BigDecimal(this.bigDecimal.multiply(that.bigDecimal, mc), mc)

def / (that: BigDecimal): BigDecimal =
  // it passes mc to divide()
  new BigDecimal(this.bigDecimal.divide(that.bigDecimal, mc), mc)
```
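Since those 2.12 wrappers delegate to `java.math.BigDecimal`, the asymmetry can be reproduced directly against the Java API (a sketch; the values are arbitrary):

```scala
import java.math.{BigDecimal => JBigDecimal, MathContext}

val x = new JBigDecimal("1.0000000000000002E+36")
val y = new JBigDecimal("9223372036854775808")

// What 2.12's + effectively did: the one-arg add is exact.
val added = x.add(y)
println(added.precision) // 37 — all digits survive

// What 2.12's * effectively did: multiply rounds to DECIMAL128's 34 digits.
val multiplied = x.multiply(y, MathContext.DECIMAL128)
println(multiplied.precision) // 34
```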
So Scala 2.13 has fixed the old bug, but it may also make some applications' calculations incorrect or cause unexpected results, since the bug had been there for years.
In my opinion, this should have been caught by testing, so it should never have reached the public. So my third point makes perfect sense.
"Property-based testing rocks!"
It could have been caught and fixed before the release if property-based testing had been done.