gpujs / gpu.js

GPU Accelerated JavaScript
https://gpu.rocks
MIT License

How to determine the maximum grid size? #817

Open · MrOlegus opened this issue 1 year ago

MrOlegus commented 1 year ago

What is wrong?

// a and b: 512x512 input matrices (their definitions are omitted in the report)
const gpu = new GPU();
const multiplyMatrix = gpu.createKernel(function(a, b) {
    let sum = 0;
    for (let i = 0; i < 512; i++) {
        sum += a[this.thread.y][i] * b[i][this.thread.x];
    }
    return sum;
}).setOutput([512, 512]);

const c = multiplyMatrix(a, b);

I am using your example for matrix multiplication. For 512x512 everything works, but at 1024x1024 I get the error shown in the attached screenshot.

Where does it happen?

All the time.

How do we replicate the issue?

const gpu = new GPU();
const multiplyMatrix = gpu.createKernel(function(a, b) {
    let sum = 0;
    for (let i = 0; i < 1000000; i++) {
        sum += a[this.thread.y][i] * b[i][this.thread.x];
    }
    return sum;
}).setOutput([1000000, 1000000]); // far beyond what a GPU render target can hold

const c = multiplyMatrix(a, b);

How important is this (1-5)?

3

Expected behavior (i.e. solution)

Throwing an out-of-memory error instead of trying to execute.
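
Pending a built-in check, a user-land guard can produce that error up front. Below is a minimal sketch, assuming the kernel output must fit in a single WebGL texture; the createCheckedKernel helper is hypothetical and not part of the gpu.js API:

// Hypothetical helper, not part of gpu.js: fail fast with a clear error
// when the requested output exceeds the device's texture size limit.
function createCheckedKernel(gpu, kernelFn, width, height) {
    const canvas = document.createElement('canvas');
    const gl = canvas.getContext('webgl2') || canvas.getContext('webgl');
    const max = gl.getParameter(gl.MAX_TEXTURE_SIZE);
    if (width > max || height > max) {
        throw new Error(
            'Output ' + width + 'x' + height +
            ' exceeds the GPU texture limit of ' + max + 'x' + max
        );
    }
    return gpu.createKernel(kernelFn).setOutput([width, height]);
}

With this guard, createCheckedKernel(gpu, fn, 1000000, 1000000) would throw immediately on typical hardware, where MAX_TEXTURE_SIZE is commonly 8192 or 16384, instead of attempting to allocate the output.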

Other Comments

jacobbogers commented 6 months ago

You can query the limits from WebGL, for example:

var maxTextures = gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS); // texture units available to a fragment shader

Of course, a high-end card is going to have a higher limit than, say, a feature phone.
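
For the maximum grid size specifically, MAX_TEXTURE_SIZE and MAX_VIEWPORT_DIMS are probably more relevant than the texture unit count, since a kernel's output has to fit in a WebGL render target. A minimal sketch of querying them in a browser:

// Query the WebGL limits that bound how large a kernel output can be.
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl2') || canvas.getContext('webgl');

console.log('max texture size :', gl.getParameter(gl.MAX_TEXTURE_SIZE));        // e.g. 16384
console.log('max viewport dims:', gl.getParameter(gl.MAX_VIEWPORT_DIMS));       // e.g. Int32Array [16384, 16384]
console.log('max texture units:', gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS)); // e.g. 16

Checking the requested output dimensions against these values before calling setOutput is one way to answer the original question of how big a grid the current device can handle.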