littledivy / shaderplay

Live WGSL shader playground in Deno
MIT License

GLSL translation not working, w/ solution #2

Open wolfspider opened 18 hours ago

wolfspider commented 18 hours ago

For some reason GLSL translation was not working, and a closer look revealed that the node-devel build did not work on my system. I'm running Ubuntu 22 on Vulkan, with Wayland as the window manager. It would either not run at all, or, when I tried to force the module to load outside of an async function, it would peg the CPU. The first thing I did was switch it over to web-devel:

[image: glslang-pkgjson]

Afterwards I got an error about charCodeAt() and went ahead and patched it:

[image: glslang-update]

Since this package has not been updated for a while I figured I would share this if it helps anyone else.

On line 17 of glslang.js, var f=a.charCodeAt(g); just needs to be changed to var f=a.toString().charCodeAt(g);.
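To illustrate why this one-liner fixes it (a minimal repro, not the actual glslang.js code): charCodeAt only exists on strings, so if the value reaching that line is a number or other non-string, the call throws, while toString() coerces it first:

```javascript
// Hypothetical repro: charAtRaw mirrors the original line, charAtPatched the fix.
function charAtRaw(a, g) {
  return a.charCodeAt(g); // TypeError when a is a non-string, e.g. 104
}
function charAtPatched(a, g) {
  return a.toString().charCodeAt(g); // coerce to string first, as in the patch
}

let threw = false;
try { charAtRaw(104, 0); } catch (e) { threw = e instanceof TypeError; }
console.log(threw);                 // true: numbers have no charCodeAt
console.log(charAtPatched(104, 0)); // 49: "104".charCodeAt(0), i.e. "1"
```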

These files are located in the cache folder so on my system they are at: ~/.cache/deno/npm/registry.npmjs.org/@webgpu/glslang/0.0.15

I am not sure what the good solution would be in this case since it is happening outside of ShaderPlay.

I've been really enjoying ShaderPlay and all the other additions to Deno as well, thank you!

I'm also including the shader which I have been using to test this with:


// aether.glsl
// Based on:
// Ether by nimitz 2014 (twitter: @stormoid)
// https://www.shadertoy.com/view/MsjSW3
// License Creative Commons Attribution-NonCommercial-ShareAlike 3.0 Unported License
// Contact the author for other licensing options
// ported to ShaderPlay by WolfSpider 2024-09-29

#version 450

layout(binding = 0) uniform Uforms {
    vec2 mouse;
    float clicked;
    float frame;
};

mat2 m(float a) {
    float c = cos(a), s = sin(a);
    return mat2(c, -s, s, c);
}
float map(vec3 p) {
    float t = frame / 100.0;
    p.xz *= m(t * 0.4);
    p.xy *= m(t * 0.3);
    vec3 q = p * 2. + t;
    return length(p + vec3(sin(t * 0.7))) * log(length(p) + 1.) + sin(q.x + sin(q.z + sin(q.y))) * 0.5 - 1.;
}

layout(location = 0) out vec4 outColor;

void main() {
    // The 256.0 divisor should be 1/2 the window width; adjust accordingly
    vec2 p = gl_FragCoord.xy / 256.0 - vec2(1.1, 1.0);
    vec3 cl = vec3(0.);
    float d = 2.5;
    for(int i = 0; i <= 5; i++) {
        vec3 p = vec3(0, 0, 5.) + normalize(vec3(p, -1.)) * d;
        float rz = map(p);
        float f = clamp((rz - map(p + .1)) * 0.5, -.1, 1.);
        vec3 l = vec3(0.1, 0.3, .4) + vec3(5., 2.5, 3.) * f;
        cl = cl * l + smoothstep(2.5, .0, rz) * .7 * l;
        d += min(rz, 1.);
    }
    outColor = vec4(cl, 1.);
}
littledivy commented 10 hours ago

Thanks for the research! I was able to run aether.glsl with this diff but never hit the charCodeAt error 🤔

diff --git a/shaderplay.ts b/shaderplay.ts
index 92b79db..38d4a71 100644
--- a/shaderplay.ts
+++ b/shaderplay.ts
@@ -1,5 +1,5 @@
 import { EventType, WindowBuilder } from "jsr:@divy/sdl2@0.13.0";
-import loadGlslang from "npm:@webgpu/glslang@0.0.15";
+import loadGlslang from "npm:@webgpu/glslang@0.0.15/dist/web-devel/glslang.js";
 import "./vendor/cdn.babylonjs.com/twgsl/twgsl.js";

 const twgsl = await (globalThis as any).twgsl(
$ deno --version
deno 2.0.0-rc.7+183130f (canary, release, aarch64-apple-darwin)