Closed: renderappco closed this issue 1 year ago
A bit of an update, as I now have a much better understanding of what I'm doing.
1) That one's optional, but it's weird to me that I can't access _rgbImages.length (a possible workaround with kernel constants is sketched after the error below).
2) That's the main issue. Everything seems to be working fine except for this section. Any help on that would be more than appreciated!
const { GPU } = require('gpu.js');
const gpu = new GPU();
const rgbImages = [
// Red Image
[ [[255, 0, 0],[255, 0, 0]], [[255, 0, 0],[255, 0, 0]] ],
// Green Image
[ [[0, 255, 0],[0, 255, 0]], [[0, 255, 0],[0, 255, 0]] ],
// Blue Image
[ [[0, 0, 255],[0, 0, 255]], [[0, 0, 255],[0, 0, 255]] ]
]
const alphaMultipliers = [
// Keep Red
[ [1,1], [1,1] ],
// Hide Green
[ [0,0], [0,0] ],
// Hide Blue
[ [0,0], [0,0] ]
]
const multiplyBuffers = gpu.createKernel(function(_rgbImages, _alphaMultipliers, _imageCount){
let resultImage = [0, 0, 0];
// 1) I would assume all of this code is executed within the same thread,
// so I would expect _rgbImages.length to work, but weirdly it doesn't.
// That's why I had to add an extra variable, _imageCount, and pass it as an argument.
for(let i = 0; i < _imageCount; i++){
let rgbPixel = _rgbImages[i][this.thread.y][this.thread.x]
let alphaPixel = _alphaMultipliers[i][this.thread.y][this.thread.x]
// 2) This is what crashes it. What I would expect is for each rgbPixel
// to be an array like [255, 0, 0] and each alphaPixel to be an integer
// value like 1. I could then multiply each channel by the alphaPixel
// and add the result to the corresponding channel of resultImage
// before returning it.
resultImage[0] += rgbPixel[0] * alphaPixel;
resultImage[1] += rgbPixel[1] * alphaPixel;
resultImage[2] += rgbPixel[2] * alphaPixel;
}
return resultImage;
}).setOutput([2,2])
const out = multiplyBuffers(rgbImages, alphaMultipliers, rgbImages.length)
out.forEach(element => {
console.log(element[0], element[1])
});
The error:
Uncaught Error Error: Error compiling fragment shader: ERROR: 0:511: 'user_rgbPixelSize' : undeclared identifier
ERROR: 0:511: 'user_rgbPixelDim' : undeclared identifier
ERROR: 0:511: 'getMemoryOptimized32' : no matching overloaded function found
ERROR: 0:511: 'getMemoryOptimized32' : no matching overloaded function found
ERROR: 0:511: 'getMemoryOptimized32' : no matching overloaded function found
at build (c:\Users\sinok\Desktop\ZMerge\gpujs\node_modules\gpu.js\src\backend\web-gl\kernel.js:529:13)
at build (c:\Users\sinok\Desktop\ZMerge\gpujs\node_modules\gpu.js\src\backend\headless-gl\kernel.js:103:17)
at run (c:\Users\sinok\Desktop\ZMerge\gpujs\node_modules\gpu.js\src\kernel-run-shortcut.js:10:18)
at shortcut (c:\Users\sinok\Desktop\ZMerge\gpujs\node_modules\gpu.js\src\kernel-run-shortcut.js:30:16)
at <anonymous> (c:\Users\sinok\Desktop\ZMerge\gpujs\app.js:46:13)
at Module._compile (<node_internals>/internal/modules/cjs/loader.js:1063:30)
at Module._extensions..js (<node_internals>/internal/modules/cjs/loader.js:1092:10)
at Module.load (<node_internals>/internal/modules/cjs/loader.js:928:32)
at Module._load (<node_internals>/internal/modules/cjs/loader.js:769:14)
at executeUserEntryPoint (<node_internals>/internal/modules/run_main.js:72:12)
at <anonymous> (<node_internals>/internal/main/run_main_module.js:17:47)
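On point 1): as far as I know, gpu.js kernels cannot read .length on array arguments, so passing the count in explicitly is the usual approach. An alternative that avoids the extra argument is to bake the count in as a kernel constant, using the documented constants kernel setting. A minimal, untested sketch (sumImages is a made-up name; the shapes match the integer example below):

const { GPU } = require('gpu.js');
const gpu = new GPU();

const sumImages = gpu.createKernel(function(_images, _alphas){
    let sum = 0;
    // this.constants.imageCount is fixed at kernel-compile time,
    // so the shader compiler knows the loop bound
    for(let i = 0; i < this.constants.imageCount; i++){
        sum += _images[i][this.thread.y][this.thread.x] * _alphas[i][this.thread.y][this.thread.x];
    }
    return sum;
}, {
    constants: { imageCount: 3 },
    output: [2, 2]
});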
So apparently it just doesn't like the array type. I did an experiment with plain integers and it worked as expected. Below is the example for anyone interested in the future, followed by a sketch of how the same trick could extend back to RGB.
const { GPU } = require('gpu.js');
const gpu = new GPU();
const rgbImages = [
// Red Image
[
[1,1],
[1,1]
],
// Green Image
[
[2,2],
[2,2]
],
// Blue Image
[
[3,3],
[3,3]
],
]
const alphaMultipliers = [
// Keep Red
[
[1,1],
[1,1]
],
// Partially hide Green (bottom row masked)
[
[1,1],
[0,0]
],
// Partially hide Blue (left column masked)
[
[0,1],
[0,1]
]
]
const multiplyBuffers = gpu.createKernel(function(_rgbImages, _alphaMultipliers, _imageCount){
let resultImage = 0
for(let i = 0; i < _imageCount; i++){
const rgbPixel = _rgbImages[i][this.thread.y][this.thread.x]
const alphaPixel = _alphaMultipliers[i][this.thread.y][this.thread.x]
resultImage += rgbPixel * alphaPixel
}
return resultImage;
}).setOutput([2,2])
const out = multiplyBuffers(rgbImages, alphaMultipliers, rgbImages.length)
out.forEach(element => { console.log(element) });
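Following up on point 2), here is a possible way to get RGB back out without the 4-D nested-pixel input: split the images into one [image][y][x] array per colour channel, so every input stays 3-D, and run the integer-style kernel once per channel. This is my own untested sketch; the channel arrays below just restate the red/green/blue test images from the first snippet.

const { GPU } = require('gpu.js');
const gpu = new GPU();

// One 3-D array per colour channel: [image][y][x]
const redChannels   = [ [[255,255],[255,255]], [[0,0],[0,0]], [[0,0],[0,0]] ]
const greenChannels = [ [[0,0],[0,0]], [[255,255],[255,255]], [[0,0],[0,0]] ]
const blueChannels  = [ [[0,0],[0,0]], [[0,0],[0,0]], [[255,255],[255,255]] ]
const alphaMultipliers = [ [[1,1],[1,1]], [[0,0],[0,0]], [[0,0],[0,0]] ]

const blendChannel = gpu.createKernel(function(_channels, _alphas, _imageCount){
    let sum = 0;
    for(let i = 0; i < _imageCount; i++){
        sum += _channels[i][this.thread.y][this.thread.x] * _alphas[i][this.thread.y][this.thread.x];
    }
    return sum;
}).setOutput([2,2])

const red   = blendChannel(redChannels,   alphaMultipliers, 3)
const green = blendChannel(greenChannels, alphaMultipliers, 3)
const blue  = blendChannel(blueChannels,  alphaMultipliers, 3)
// red[y][x], green[y][x] and blue[y][x] can then be recombined into [r,g,b] pixels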
Hi all,
I want to develop something with the GPU, images and Node.js, and I'm trying to understand if it can be done with this module.
In this small example, I have an array with three "images" with pixel dimensions 2x2: a red, a green and a blue "image". I also have a second array with multiplier values, which are supposed to multiply each pixel channel by 1 or 0 to show or hide an image. In the kernel, the idea is to do the multiplication, add up all the channel values into the final image, and return it.
So, with the current setup, I would expect to get a red image back, but something is not working. I'm not sure if I'm doing something wrong, or if what I'm trying to do is not supported.
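For reference, this is the blend the paragraph above describes, written out on the CPU in plain JS (my own sketch, no gpu.js involved), using the same rgbImages / alphaMultipliers shapes as in the snippets above; with the keep-red multipliers every pixel comes out as [255, 0, 0]:

// CPU reference of the intended blend: for each output pixel, sum each
// image's channels scaled by that image's multiplier (0 hides, 1 shows)
function blendCPU(rgbImages, alphaMultipliers, width, height) {
    const out = [];
    for (let y = 0; y < height; y++) {
        const row = [];
        for (let x = 0; x < width; x++) {
            const pixel = [0, 0, 0];
            for (let i = 0; i < rgbImages.length; i++) {
                const alpha = alphaMultipliers[i][y][x];
                pixel[0] += rgbImages[i][y][x][0] * alpha;
                pixel[1] += rgbImages[i][y][x][1] * alpha;
                pixel[2] += rgbImages[i][y][x][2] * alpha;
            }
            row.push(pixel);
        }
        out.push(row);
    }
    return out;
}
// e.g. blendCPU(rgbImages, alphaMultipliers, 2, 2)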
Here's the code (it's the snippet in the update at the top of this thread).
ps: The next step after that would be to do that with real images 🙈
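On the ps: with real images in Node.js, the pixel data typically arrives as a flat RGBA buffer from whatever decoder is used. Below is a sketch of reshaping such a buffer into the nested [y][x][r,g,b] layout used above; data, width and height are assumed to come from the decoder:

// Reshape a flat RGBA buffer (4 bytes per pixel, row-major) into the
// nested [y][x][r,g,b] layout used in the snippets above
function toNestedRGB(data, width, height) {
    const image = [];
    for (let y = 0; y < height; y++) {
        const row = [];
        for (let x = 0; x < width; x++) {
            const i = (y * width + x) * 4; // offset of pixel (x, y) in the buffer
            row.push([data[i], data[i + 1], data[i + 2]]); // alpha byte is dropped
        }
        image.push(row);
    }
    return image;
}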