Closed (BarionLP closed this issue 1 month ago)
Same on my Surface Pro X (ARM64, 128-bit SIMD) and an old HP ProBook (Intel Core i7, x64, 256-bit SIMD).
cc: @tannergooding
It's something in the vectorized implementation of Exp; for example, this:
using System;
using System.Numerics.Tensors;

for (int length = 1; length <= 2; length++)
{
    Console.WriteLine($"Length: {length}");
    double[] values = new double[length];
    double[] results = new double[values.Length];
    values.AsSpan().Fill(-709.7);

    // Scalar path for a single element, vectorized path once there are at least two doubles.
    TensorPrimitives.Exp<double>(values, results);

    Console.WriteLine(Math.Exp(values[0])); // scalar reference
    Console.WriteLine(results[0]);          // TensorPrimitives result
    Console.WriteLine();
}
which outputs this (vectorization kicks in once there are at least two doubles):
Length: 1
6.04235438695844E-309
6.04235438695844E-309
Length: 2
6.04235438695844E-309
-∞
I would appreciate it if you could explain what happened here. I love low-level performance and would like to learn more about it.
I would appreciate it if you could explain what happened here.
The code has a fallback path for edge-case values, which these are. The vectorized implementation of that fallback produces different answers than the scalar implementation.
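To make that concrete, here is a rough, hypothetical sketch of the pattern, not the actual TensorPrimitives source: the main vector kernel handles the common input range, and lanes outside it go through a separate edge-case path. A bug in that separate path only shows up for inputs like -709.7, and only when the call is long enough to take the vectorized route. The -700.0 threshold and the structure below are made up for illustration.

using System;
using System.Runtime.Intrinsics;

// All lanes should print as roughly 6.04E-309, because the edge-case path in this sketch is correct.
Console.WriteLine(ExpSketch(Vector256.Create(-709.7)));

// Illustrative only: shows where a vector kernel and its edge-case fallback can diverge.
static Vector256<double> ExpSketch(Vector256<double> x)
{
    // Common range: fast vectorized polynomial kernel (Math.Exp stands in for it here).
    if (!Vector256.LessThanAny(x, Vector256.Create(-700.0)))
    {
        return Vector256.Create(
            Math.Exp(x.GetElement(0)), Math.Exp(x.GetElement(1)),
            Math.Exp(x.GetElement(2)), Math.Exp(x.GetElement(3)));
    }

    // Edge-case path: separate code, so a bug here never shows up in the scalar API
    // and never for ordinary inputs, which is exactly the symptom in this issue.
    Span<double> lanes = stackalloc double[Vector256<double>.Count];
    for (int i = 0; i < lanes.Length; i++)
    {
        lanes[i] = Math.Exp(x.GetElement(i));
    }
    return Vector256.Create(lanes);
}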
Can confirm the fix in .NET 9 RC 2.
Description
TensorPrimitives.Exp returns negative infinity instead of a value close to 0 for doubles between approximately -709.5 and -709.7.
Math.Exp() correctly returns a tiny subnormal value for the same inputs (about 6.04E-309 for -709.7).
When using float, everything is fine.
I'm using System.Numerics.Tensors version 9.0.0-rc.1.24431.7
Reproduction Steps
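A sketch of the repro; the exact input values below are assumed, but any doubles in the affected range show the problem as long as the span is long enough to take the vectorized path.

using System;
using System.Numerics.Tensors;

// Assumed sample inputs across the reported range of roughly -709.5 to -709.7.
double[] values = { -709.5, -709.55, -709.6, -709.65, -709.7 };
double[] results = new double[values.Length];

TensorPrimitives.Exp<double>(values, results);

for (int i = 0; i < values.Length; i++)
{
    // Math.Exp gives tiny subnormal values (e.g. 6.04E-309 for -709.7);
    // on affected builds the TensorPrimitives result is -Infinity instead.
    Console.WriteLine($"{Math.Exp(values[i])}  vs  {results[i]}");
}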
Expected behavior
The code above should output values close to 0 (about 6.04E-309 for an input of -709.7), not negative infinity.
Actual behavior
Output on my machine: -Infinity -Infinity -Infinity -Infinity -Infinity
Regression?
I haven't tried the preview versions.
Known Workarounds
No response
Configuration
.NET 9 RC1
Windows 10, x64 (AMD CPU)
SIMD uses 256 bits (4 doubles)
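For reference, a quick way to see which vector widths the runtime reports as hardware accelerated (a standard intrinsics query, not part of the issue template):

using System;
using System.Runtime.Intrinsics;

// Prints which fixed-width vector types are accelerated and how many doubles each holds.
Console.WriteLine($"Vector128: {Vector128.IsHardwareAccelerated} ({Vector128<double>.Count} doubles)");
Console.WriteLine($"Vector256: {Vector256.IsHardwareAccelerated} ({Vector256<double>.Count} doubles)");
Console.WriteLine($"Vector512: {Vector512.IsHardwareAccelerated} ({Vector512<double>.Count} doubles)");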
Other information
No response