Open Steve73 opened 3 years ago
When you use precision: 'unsigned' on the Apple M1 GPU (in both Safari and Chrome), kernel arguments are read inaccurately.

```js
const gpu = new GPU({ mode: 'gpu' });

// Kernel that returns its argument r.
const createTexture1 = gpu.createKernel(function(r) {
  return r;
}, { output: [1], precision: 'unsigned' });
var t1 = createTexture1(2);
console.log(t1);

// Kernel that returns the constant 2.
const createTexture2 = gpu.createKernel(function() {
  return 2;
}, { output: [1], precision: 'unsigned' });
var t2 = createTexture2();
console.log(t2);
```

This produces the following output:

```
Float32Array [2.015624761581421]
Float32Array [2]
```

Both lines should show the same value (2).
What is wrong?
When you use precision 'unsigned' on the Apple M1 GPU (with Safari and Chrome), kernel arguments are read inaccurately.
Where does it happen?
On the Apple M1 GPU, in both Safari and Chrome, whenever a kernel is created with precision: 'unsigned'.
How do we replicate the issue?
See code example above.
How important is this (1-5)?
Not sure to what extent such inaccuracies are tolerated.
Expected behavior (i.e. solution)
Both output lines should show the same value (2).
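For context: with precision: 'unsigned', gpu.js packs each 32-bit float result into the four 8-bit channels of an RGBA texture and unpacks it after readback. The sketch below is plain JavaScript, my own illustration rather than gpu.js's actual packing code; it round-trips a float through an equivalent 4-byte encoding to show that such an encoding can be lossless in itself. If gpu.js's packing behaves the same way, the ~0.0156 error seen above would have to be introduced on the GPU side (e.g. when the M1 driver writes or reads the texture), not by the byte packing as such.

```js
// Sketch: store a float32 in 4 bytes (as one RGBA8 texel could hold it)
// and read it back. Uses DataView; not gpu.js internals.
function packFloat(value) {
  const buf = new ArrayBuffer(4);
  new DataView(buf).setFloat32(0, value);
  return new Uint8Array(buf); // the four "channel" bytes
}

function unpackFloat(bytes) {
  const buf = new ArrayBuffer(4);
  new Uint8Array(buf).set(bytes);
  return new DataView(buf).getFloat32(0);
}

console.log(unpackFloat(packFloat(2)));                 // → 2, exactly
console.log(unpackFloat(packFloat(2.015624761581421))); // round-trips unchanged
```

The round trip is exact for any float32, so an error like 2 → 2.015624761581421 cannot come from a faithful 4-byte encode/decode alone.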