gpujs / gpu.js

GPU Accelerated JavaScript
https://gpu.rocks
MIT License

Example of parallel reduction? #769

Open jhurliman opened 2 years ago

jhurliman commented 2 years ago

What is wrong?

I'm having trouble implementing parallel reduction algorithms using gpu.js and would benefit from example starting code.

Where does it happen?

My various implementation attempts in node.js on top of gpu.js.

How do we replicate the issue?

I have an arbitrarily sized Float32Array that I want to run various reduction algorithms (min, max, sum) on. gpu.js looks like a good fit, but I don't have a strong enough grasp of `immutable`, `pipeline`, how to reuse texture outputs, and when to delete them to make everything come together. Example code demonstrating how to use gpu.js for reduction would be very helpful.

How important is this (1-5)?

5 for me personally since that is my interest in using this library. More generally, this is probably a 2 since it doesn't involve a bug report.

Expected behavior (i.e. solution)

Example code provided somewhere in the documentation.

benrayfield commented 1 year ago

```js
const gpu = new GPU();
const doMins = gpu.createKernel(function (array) {
  let id = this.thread.x;
  let min = 10000000000000.0;
  for (let i = 0; i < 3000; i++) {
    min = Math.min(min, array[id * 3000 + i]);
  }
  return min;
}).setOutput([1000]);

const arr = new Float32Array(3000 * 1000);
for (let i = 0; i < arr.length; i++) arr[i] = Math.random();

const theMins = doMins(arr);
// theMins is now a Float32Array(1000) whose entries are each the min of 3000 numbers.
```

WARNING: min, max, and sum are IO-bottlenecked. The GPU is only useful for calculations that are heavily compute-bound, i.e. that do a lot more compute than IO. You should do min, max, and sum on the CPU.