Update: replace the argument "scale" with a constant, e.g.:
```js
const test1 = gpu.createKernel(function () {
  // in this expression, the constant 1.77 is treated as an integer
  return (this.thread.x * 1.77);
})
  .setOutput([10])
  .setCanvas(canvas1)
  .setContext(ctx1);
```
This also produces the unexpected integer behavior.
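For comparison, a minimal sketch of the workaround, assuming the same gpu, canvas1, and ctx1 setup as test1: with the float constant on the left, GPU.js treats the multiplication as float math.

```js
// Workaround sketch (same assumed setup as test1): the float constant
// comes first, so the output steps by multiples of 1.77.
const test2 = gpu.createKernel(function () {
  return (1.77 * this.thread.x);
})
  .setOutput([10])
  .setCanvas(canvas1)
  .setContext(ctx1);
```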
What is wrong?
Multiplying this.thread.x (and this.thread.y) by a float scale behaves differently depending on operand order: GPU.js interprets the scale as an integer in the form `this.thread.x * scale`, but in the form `scale * this.thread.x` the scale is interpreted as a float.
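A minimal sketch isolating the two orderings (assuming a bare `const gpu = new GPU()` instance, no canvas):

```js
const gpu = new GPU();

// Same kernel body, operands swapped; per this report the typing differs.
const intLike = gpu.createKernel(function (scale) {
  return this.thread.x * scale; // scale treated as an integer
}).setOutput([4]);

const floatLike = gpu.createKernel(function (scale) {
  return scale * this.thread.x; // scale treated as a float
}).setOutput([4]);

console.log(intLike(1.77));   // integer-like steps, per the report
console.log(floatLike(1.77)); // expected: [0, 1.77, 3.54, 5.31]
```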
Where does it happen?
Latest version of GPU.js in GPU mode, on macOS, in the Chrome browser.
How do we replicate the issue?
Run the kernel in the update above: with the float constant on the right of this.thread.x, the output is scaled as if 1.77 were an integer.
How important is this (1-5)?
There's an easy workaround (put the float operand first), but it might help to include the workaround in the documentation.
Expected behavior (i.e. solution)
Javascripters are trained (aka brainwashed) to believe that "everything is a number, there's no such thing as an int". This is why it took me a day to find this problem -- the expectation is that regardless of the ordering of the expression, any scalar input to a kernel should be interpreted as a float.
Would this behavior also manifest with other sources of ints in the kernel, e.g. when we extract a cell from a Uint8Array argument?
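For instance, a hedged probe (the kernel name and test values are mine, not from the issue):

```js
// Hypothetical probe: does a value read from a Uint8Array argument
// also multiply as an integer when it sits on the left of the float?
const probe = gpu.createKernel(function (bytes) {
  return bytes[this.thread.x] * 1.77; // same operand order as the bug
}).setOutput([10]);

console.log(probe(new Uint8Array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])));
```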
Is this a bug that should ultimately be fixed, or a necessary side-effect of the transpilation process?
Other Comments
GPU.js is an extraordinarily useful and clever mechanism for producing high-performance code in desktop and mobile applications! Fantastic work!