Closed parnoldx closed 4 years ago
Thanks for mentioning that. We had some benchmarks very early on by @desruisseaux but I believe that is no longer there or vanished with the JSR 275 codebase. If you say 2.0 is not performant, do you see any different results trying this with 1.0?
The better performance of primitive types is not new, but they are primitive: you would not want to sacrifice data integrity, or worse, as the Mars Climate Orbiter showed (although they probably did not use Java directly on the spacecraft ;-)
The performance penalty is known, although your example is not the best comparison, since you combine parsing/formatting with the most primitive numerical operations, even using BigDecimal. If you wanted a fairer comparison, why not use DecimalFormat or similar with BigDecimal and another NumberFormat with the primitive types?
However, you will always need more space to deal with quantities than primitive numbers.
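The formatting suggestion above can be sketched with plain JDK classes. A minimal example of what a like-for-like comparison might look like (names and pattern are illustrative, not taken from the original snippet):

```java
import java.math.BigDecimal;
import java.text.DecimalFormat;
import java.text.DecimalFormatSymbols;
import java.text.NumberFormat;
import java.text.ParseException;
import java.util.Locale;

public class FormatComparison {
    public static void main(String[] args) throws ParseException {
        // DecimalFormat can both format and parse BigDecimal directly
        DecimalFormat df = new DecimalFormat("0.0#",
                DecimalFormatSymbols.getInstance(Locale.ROOT));
        df.setParseBigDecimal(true); // parse() now returns a BigDecimal
        BigDecimal parsed = (BigDecimal) df.parse("5.2");
        System.out.println(df.format(parsed)); // prints 5.2

        // A plain NumberFormat for the primitive counterpart
        NumberFormat nf = NumberFormat.getInstance(Locale.ROOT);
        double d = nf.parse("5.2").doubleValue();
        System.out.println(nf.format(d)); // prints 5.2
    }
}
```

This keeps parsing and formatting costs symmetrical on both sides, so the remaining difference is closer to the arithmetic itself.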
There is a compromise, because UnitConverter works with plain numbers (including primitives like double) and you don't need to instantiate a Quantity, so it is faster for mass operations. Converting thousands of data points per minute or so should work with UnitConverter and low-level primitive numbers, while it could be challenging if you create a Quantity instance for each of them.
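The pattern being described, one converter obtained once and applied to many primitive values instead of one Quantity object per data point, looks roughly like this. This is a plain-Java sketch of the idea, not the actual javax.measure API (there, the converter would come from something like Unit.getConverterTo and expose convert(double)):

```java
import java.util.Arrays;
import java.util.function.DoubleUnaryOperator;

public class BulkConvert {
    public static void main(String[] args) {
        // Stand-in for a UnitConverter obtained once, e.g. km -> m
        DoubleUnaryOperator kmToMetre = v -> v * 1000.0;

        // Thousands of data points stay primitive doubles; no Quantity
        // instance is allocated per element.
        double[] readings = {1.5, 2.25, 3.0};
        for (int i = 0; i < readings.length; i++) {
            readings[i] = kmToMetre.applyAsDouble(readings[i]);
        }
        System.out.println(Arrays.toString(readings)); // [1500.0, 2250.0, 3000.0]
    }
}
```

The point is allocation behavior: the loop body does arithmetic on primitives only, so there is no per-element object churn.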
HTH. I don't think the snippet works as a real benchmark, but you are more than welcome to contribute one if you can.
The result for 1.0 is
primitives 7
bigdecimal 472
quantities 340
Not sure what you mean by "real benchmark"; sure, this is artificial, to highlight the problem?! But the use case is to replace all numbers with quantities to ensure correct calculation according to the assigned units, in a system that does a lot of math and data manipulation. I "tweaked" 1.0 a bit for my use case and got the penalty down to a factor of ~10, so for the example above I get ~80 ms for the quantities part. That is acceptable for not sacrificing data integrity, but not the penalty in 2.0, IMO.
Did you use Apache Stopwatch, or which stopwatch is that? By "real benchmark" I mean a JMH benchmark; it has been part of the JDK sources for a while now (since Java 12).
Parsing should not be used in only one of the cases; instantiating a new Quantity with the Units constant seems more appropriate here. Do you have some code snippets including Apache Stopwatch we could try out?
I created #299 for a true benchmark.
Indriya 2.0.x does calculations with higher precision than 1.x. Also note that you can speed up the calculations above by being more careful when picking operand values for comparison (it might be 2 orders of magnitude faster) ...
// instead of
BigDecimal b = new BigDecimal(5.2);
// use
BigDecimal b = new BigDecimal("5.2");
// instead of
Quantities.getQuantity(5.2, Units.METRE);
// use
Quantities.getQuantity(RationalNumber.of(52, 10), Units.METRE);
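The reason the String constructor matters: new BigDecimal(5.2) starts from the double value, which cannot represent 5.2 exactly in binary, so the BigDecimal inherits dozens of noise digits and every subsequent operation pays for them. A quick demonstration:

```java
import java.math.BigDecimal;

public class ConstructorPitfall {
    public static void main(String[] args) {
        // The double constructor captures the exact binary value of 5.2
        System.out.println(new BigDecimal(5.2));
        // prints 5.20000000000000017763568394002504646778106689453125

        // The String constructor keeps the intended decimal value
        System.out.println(new BigDecimal("5.2")); // prints 5.2
    }
}
```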
Thanks for the input @andi-huber. Could you maybe, with help from @parnold-x if he wishes and others like @desruisseaux if he has time, think about a benchmark along the lines of https://github.com/JavaMoney/javamoneyjmh? I forked that based on something a contributor did earlier. We don't have to use Gradle, since Maven is currently used everywhere else, but the idea would be to create some benchmarks that also show how using RationalNumber instead of BigDecimal can impact performance. This should not go here, but either under https://github.com/unitsofmeasurement/uom-tools, where checkstyle rules are also parked among other tools, or in a separate repo.
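For readers unfamiliar with why a rational representation helps: a value like 5.2 is stored exactly as 52/10, so addition and multiplication stay exact with a compact representation. A self-contained conceptual sketch of the idea (this is not Indriya's RationalNumber API):

```java
import java.math.BigInteger;

public class RationalSketch {
    // Minimal exact rational number kept in lowest terms
    record Rational(BigInteger num, BigInteger den) {
        static Rational of(BigInteger n, BigInteger d) {
            BigInteger g = n.gcd(d);
            return new Rational(n.divide(g), d.divide(g));
        }
        static Rational of(long n, long d) {
            return of(BigInteger.valueOf(n), BigInteger.valueOf(d));
        }
        Rational add(Rational o) {
            // a/b + c/d = (a*d + c*b) / (b*d), then reduce
            return of(num.multiply(o.den).add(o.num.multiply(den)),
                      den.multiply(o.den));
        }
        @Override public String toString() { return num + "/" + den; }
    }

    public static void main(String[] args) {
        // 5.2 + 1.3 = 52/10 + 13/10 = 13/2, with no rounding anywhere
        System.out.println(Rational.of(52, 10).add(Rational.of(13, 10)));
        // prints 13/2
    }
}
```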
I've created a gist; not a best-practice benchmark, but it gives some idea of what to look for. It tests 3 aspects of quantity calculus:
https://gist.github.com/andi-huber/92ab34634560f22647adab584041144c
Showing
-- ADD
bigdecimal 21 ms
quantities 361 ms
-- MUL
bigdecimal 14 ms
quantities 328 ms
-- SCALAR MUL
bigdecimal 12 ms
quantities 273 ms
Roughly speaking, Indriya is 20 times slower than using BigDecimal directly. However, that sounds pretty reasonable to me.
I think if the test itself does not take too long, we could add it to the JUnit tests. If it is very slow, it should be excluded, like some LocalUnitFormat tests, because they occasionally cause resource overflows during the full build (something we might like to explore, but LocalUnitFormat is also not a core feature; it is mostly for GUI rendering, not unit exchange between systems).
I added it under https://github.com/unitsofmeasurement/indriya/blob/master/src/test/java/tech/units/indriya/IndriyaPerformanceTest.java, with a few tweaks like log output instead of System.out. So it can be run with custom logging properties, while by default it won't pollute regular builds.
Hi, not sure if this is an issue for you, but it seems that the 2.0 implementation is somehow not really performant. For the simple addition test below, I see these numbers in ms:
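The snippet referred to is not included above; a rough reconstruction of that kind of comparison might look like the following, assuming a simple add-in-a-loop shape (this is a wall-clock sketch, not a proper JMH benchmark, and the quantities case would need Indriya on the classpath):

```java
import java.math.BigDecimal;

public class AdditionComparison {
    public static void main(String[] args) {
        final int N = 1_000_000;

        // Primitive double addition
        long t0 = System.nanoTime();
        double d = 0.0;
        for (int i = 0; i < N; i++) d += 5.2;
        long t1 = System.nanoTime();

        // BigDecimal addition with the exact decimal value
        BigDecimal b = BigDecimal.ZERO;
        BigDecimal inc = new BigDecimal("5.2");
        for (int i = 0; i < N; i++) b = b.add(inc);
        long t2 = System.nanoTime();

        System.out.println("primitives " + (t1 - t0) / 1_000_000 + " ms");
        System.out.println("bigdecimal " + (t2 - t1) / 1_000_000 + " ms");
        // A third loop would wrap each value via Quantities.getQuantity(...)
        // and Quantity.add(...) when the Indriya library is available.
    }
}
```

Times from a loop like this are noisy (no warmup, no dead-code protection), which is exactly why the thread moves toward a JMH-based benchmark.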