ayaanzhaque / instruct-nerf2nerf

Instruct-NeRF2NeRF: Editing 3D Scenes with Instructions (ICCV 2023)
https://instruct-nerf2nerf.github.io/
MIT License
792 stars 70 forks

RuntimeError: Invalid device string: 'cuda:None' with ns-render #5

Closed: Pashtetickus closed this issue 1 year ago

Pashtetickus commented 1 year ago

I was trying to render a scene with the trex from the original NeRF dataset, but got this error. Using nerfstudio v0.1.19 in Docker, I installed in2n with:

git clone https://github.com/ayaanzhaque/instruct-nerf2nerf.git
cd instruct-nerf2nerf
python3.10 -m pip install --upgrade pip setuptools
python3.10 -m pip install -e .

Then I checked the install with ns-train -h and started training. I did not notice any changes matching my prompt ("decorate a trex like a christmas tree") in real time, so I decided to render the scene.

Then this error appears in

/home/user/.local/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py:431 in enable_model_cpu_offload

RuntimeError: Invalid device string: 'cuda:None'

I hardcoded gpu_id to 0 and it worked, but that doesn't seem like a good solution.

ayaanzhaque commented 1 year ago

Hi,

It seems that at render time, the CUDA device does not have a device index (which is weird). I think hard-coding it to 0 is fine, so I've added an if statement to the code to fix this. Since we don't override any of the rendering code, this seems like the best option for now. You can pull the newest version of the code and it should fix this, or you can just copy in the few lines I've added.

https://github.com/ayaanzhaque/instruct-nerf2nerf/commit/bb7deff481064d3c68db8849c46384ab457b521e
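
For reference, here is a minimal sketch of the kind of guard this describes; the variable and function names are illustrative, not the actual ones from the commit:

import torch

def resolve_device(gpu_id):
    # At render time gpu_id can arrive as None, which yields the invalid
    # device string "cuda:None". Fall back to GPU 0 in that case.
    if gpu_id is None:
        gpu_id = 0
    return torch.device(f"cuda:{gpu_id}")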

Let me know if this fixes your issue.

Also, when you rendered your edit, did you see a change? If not, I'm happy to help you troubleshoot!

ayaanzhaque commented 1 year ago

Please note that training does take many iterations. If your underlying scene was trained with nerfacto to 30k iters, I'd at least give the edited scene 5-7k iters to see if an edit appears.
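
For example, you can raise the iteration budget with nerfstudio's --max-num-iterations flag (assuming the in2n config respects it; the paths below are placeholders):

ns-train in2n --data {DATA_DIR} --load-dir {outputs/.../nerfstudio_models/} --pipeline.prompt "decorate a trex like a christmas tree" --max-num-iterations 7000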

Pashtetickus commented 1 year ago

Hi,

Thank you for your response. If all is well, I'll close the issue.

Pashtetickus commented 1 year ago

> Please note that training does take many iterations. If your underlying scene was trained with nerfacto to 30k iters, I'd at least give the edited scene 5-7k iters to see if an edit appears.

Yeah, I've trained it for about 10 hours on the trex scene (55 images) and it only changed the overall color a little - it got warmer. My colleague trained the basic poster scene and got the same result. I'll try tweaking the guidance scales, but it would be great to have your sample dataset and a step-by-step guide to reproduce some of the results.

ayaanzhaque commented 1 year ago

Here is the data for our method: https://drive.google.com/drive/folders/1v4MLNoSwxvSlWb26xvjxeoHpgjhi_s-s

You can train the farm-small scene. Train up a nerfacto model first, and then run the following command:

ns-train in2n --data data/nerfstudio/farm-small/ --load-dir {outputs/farm-small/nerfacto/.../nerfstudio_models/} --pipeline.prompt "make it sunset" --pipeline.guidance-scale 7.5 --pipeline.image-guidance-scale 1.5
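
For the preliminary nerfacto step, the standard nerfstudio command on the same data should work, e.g.:

ns-train nerfacto --data data/nerfstudio/farm-small/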

Let me know if this works and if you get results similar to the one in our paper.

Some scenes are more challenging to edit, and some edits are hard to get to work. Do you have some before/after examples of the trex edit along with your prompt?

Pashtetickus commented 1 year ago

Thank you! I will try to repeat the training with your suggestions and data, and come back with feedback. Is it OK to continue the conversation in this issue?

My camera work is a bit rough, but it's enough to see that the trex didn't change structurally; there was only a slight shift toward warmer colors across the whole scene (again, the prompt was "decorate a trex like a christmas tree"). There is another example with the poster scene and a "disco party" prompt, with no effect.

I'll try to reinstall in2n and train with your data.

ayaanzhaque commented 1 year ago

I see. I'm not really surprised that the "disco party" edit didn't work, but I would expect the Christmas tree one to at least partially work. What you can do is try the edit on a single frame in 2D using InstructPix2Pix (via the Hugging Face demo). Send me an example of what the 2D edits look like.
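
If it helps, here is a minimal single-frame test along those lines, using the diffusers StableDiffusionInstructPix2PixPipeline (the same pipeline the traceback above passes through); the frame filename and guidance values are placeholders:

import torch
from PIL import Image
from diffusers import StableDiffusionInstructPix2PixPipeline

# Load the public InstructPix2Pix checkpoint.
pipe = StableDiffusionInstructPix2PixPipeline.from_pretrained(
    "timbrooks/instruct-pix2pix", torch_dtype=torch.float16
).to("cuda")

# Edit one dataset frame with the same prompt and guidance scales you plan
# to use for the 3D edit.
image = Image.open("trex_frame.png").convert("RGB")
edited = pipe(
    "decorate a trex like a christmas tree",
    image=image,
    guidance_scale=7.5,
    image_guidance_scale=1.5,
).images[0]
edited.save("trex_frame_edited.png")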