nicolasgere opened this issue 3 years ago
@nicolasgere you can test it right now by installing it via docker. You can find the published containers here. The package should be listed as @nvidia/webgl, although #313 will update it to @rapidsai/webgl. Let us know if you face any issues!
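For reference, pulling one of the published images looks roughly like this (the tag below is only an example taken from a later comment in this thread; check the container registry for current tags):

# Pull a published node-rapids image; the tag is illustrative, check the registry for current ones
docker pull ghcr.io/rapidsai/node:22.02.00-runtime-node16.13.2-cuda11.6.0-ubuntu20.04-demo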
Do you mean that if I npm install @nvidia/webgl inside the container, it will install the package correctly into my node_modules?
No actually, the package should come pre-installed in that docker image. You would just use it directly!
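For instance, a quick sanity check from inside the container would be to confirm the module resolves (a minimal sketch, assuming the package name above):

# Inside the container: print where the pre-installed package resolves from
node -e "console.log(require.resolve('@nvidia/webgl'))"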
@nicolasgere we added more information on how to use the docker containers and how to install individual packages on bare-metal instances as well.
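For a bare-metal setup, the rough shape would be the following, assuming the package is published to the public npm registry under the new scope (a CUDA-capable GPU and driver are still required):

# Bare-metal install sketch; assumes the package is available on the public npm registry
npm install @rapidsai/webgl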
Where can I find a demo for webgl? I am facing the error "2.1.0 Missing GL version".
No update here, @AjayThorve?
@nicolasgere are you running the WebGL demo from inside a docker container? If so, are you passing the --runtime=nvidia flag (and do you have the nvidia-container-toolkit installed)?
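A quick way to confirm the NVIDIA runtime is wired up (the CUDA image tag below is just an example) is to run nvidia-smi through Docker:

# If this prints your GPU, --runtime=nvidia and the nvidia-container-toolkit are working
docker run --rm --runtime=nvidia nvidia/cuda:11.6.0-base-ubuntu20.04 nvidia-smi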
Our dev and runtime docker-compose files are configured to run GL apps inside the container but render in X11 on the host. You can pass the same configuration via docker run. Save the following as a file called run-node-rapids-demo.sh:
#!/usr/bin/env bash
envvars="\
-e NODE_NO_WARNINGS=1 \
`# Colorize the terminal in the container if possible` \
-e TERM=${TERM:-} \
`# Use the host's X11 display` \
-e DISPLAY=${DISPLAY:-} \
-e XAUTHORITY=${XAUTHORITY:-} \
-e NVIDIA_DRIVER_CAPABILITIES=all \
-e XDG_SESSION_TYPE=${XDG_SESSION_TYPE:-} \
-e XDG_RUNTIME_DIR=${XDG_RUNTIME_DIR:-/run/user/$UID} \
-e DBUS_SESSION_BUS_ADDRESS=${DBUS_SESSION_BUS_ADDRESS:-unix:path=/run/user/$UID/bus} \
"
volumes="\
-v /etc/fonts:/etc/fonts:ro \
-v /tmp/.X11-unix:/tmp/.X11-unix:rw \
-v /usr/share/fonts:/usr/share/fonts:ro \
-v /usr/share/icons:/usr/share/icons:ro \
-v /etc/timezone:/etc/timezone:ro \
-v /etc/localtime:/etc/localtime:ro \
-v /run/dbus/system_bus_socket:/run/dbus/system_bus_socket \
-v ${XDG_RUNTIME_DIR:-/run/user/$UID}:${XDG_RUNTIME_DIR:-/run/user/$UID} \
"
# Default to the "luma.gl-lessons/01" demo if none was passed in
cmd="${@:-npx @rapidsai/demo-luma.gl-lessons 01}"
exec docker run --rm -it --runtime=nvidia \
${envvars} \
${volumes} \
ghcr.io/rapidsai/node:22.02.00-runtime-node16.13.2-cuda11.6.0-ubuntu20.04-demo \
`# Run cmd inside the container` \
${cmd}
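Then make the script executable and run it. With no arguments it launches the default luma.gl lesson; you can pass a different command to run another demo (the last invocation below is only an example, and other lesson numbers may or may not be available):

chmod +x run-node-rapids-demo.sh
# You may also need to allow local connections to your X server first, e.g. `xhost +local:`
./run-node-rapids-demo.sh
# Example of passing a different demo command (illustrative only)
./run-node-rapids-demo.sh npx @rapidsai/demo-luma.gl-lessons 02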
Hi there, thanks for the amazing work! We are very interested in using the webgl package; we are on headless-gl for now, but it does not work with the GPU. Do you have an ETA on when we could start to test? Thank you!