NVIDIAGameWorks / kaolin-wisp

NVIDIA Kaolin Wisp is a PyTorch library powered by NVIDIA Kaolin Core to work with neural fields (including NeRFs, NGLOD, instant-ngp and VQAD).

Devcontainer #61

Closed nkyriazis closed 1 year ago

nkyriazis commented 1 year ago

The Docker images are very useful, especially for development: it's great to be able to jump into the container quickly and start working. However, since wisp comes bundled into the image, it's hard to develop wisp itself from within the container.

Is there an idiomatic way to incorporate changes to the original wisp Python code while inside the container, persistently and without rebuilding the image? E.g., mounting the Python part of wisp as a volume.

Caenorst commented 1 year ago

Hi @nkyriazis , you can mount a volume using the `-v` optional argument of `docker run`.
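For example, a minimal sketch of the suggestion (the image tag `wisp` and the `/workspace` mount point are assumptions, following the compose file below):

```shell
# Run the wisp image with the local checkout bind-mounted into the container.
# --gpus all exposes the NVIDIA runtime; adjust to your Docker setup.
docker run --rm -it --gpus all \
  -v "$(pwd)":/workspace \
  -e DISPLAY="$DISPLAY" \
  wisp /bin/bash
```

Changes made to the mounted checkout on the host are then visible inside the container immediately, without rebuilding the image.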

nkyriazis commented 1 year ago

@Caenorst thanks for taking the time to answer.

I was hoping for something less hacky than the following.

docker-compose.yml:

version: '3.9'

services:
  # Make sure to have followed the instructions in
  # INSTALL.md and arrive at an image with the tag `wisp`
  app:
    image: wisp
    runtime: nvidia
    environment:
      DISPLAY: ${DISPLAY}
      # make sure mounted wisp is found before builtin wisp
      PYTHONPATH: /workspace
    volumes:
      # mount wisp root as workspace
      - .:/workspace

It is important to set `PYTHONPATH` so that the mounted wisp takes priority over the bundled wisp. Even so, the compiled extension (`_C.so`) still needs to be copied from the bundled, already-built wisp into the mounted checkout.
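The copy step described above might look like this inside the container (the site-packages path is an assumption and depends on the image's Python environment; locate the built extension first):

```shell
# Find where the bundled, built extension lives (path varies per image).
find / -name '_C*.so' -path '*wisp*' 2>/dev/null

# Copy it into the mounted checkout so `wisp._C` imports resolve there too.
# The source path below is hypothetical; substitute the result of the find above.
cp /usr/local/lib/python3.*/site-packages/wisp/_C*.so /workspace/wisp/
```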

Caenorst commented 1 year ago

Actually, re-executing `python setup.py develop` should set the path for you, and it should be very fast: only compiling the C++ / CUDA source files is slow, and those shouldn't rebuild unless you modify a file.
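In other words, instead of juggling `PYTHONPATH` and copying `_C.so`, an editable install from the mounted checkout can be re-run inside the container (assuming the checkout is mounted at `/workspace` as in the compose file above):

```shell
# Inside the container: register the mounted checkout as the active wisp
# install (editable mode). setuptools only recompiles the C++/CUDA extension
# if its sources changed, so repeated runs are fast.
cd /workspace
python setup.py develop
```

This puts the mounted source directory on `sys.path` directly, so Python edits take effect on the next import with no copying step.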

nkyriazis commented 1 year ago

Thanks for the info. Closing.