AiuniAI / Unique3D

[NeurIPS 2024] Unique3D: High-Quality and Efficient 3D Mesh Generation from a Single Image
https://wukailu.github.io/Unique3D/
MIT License

Docker for building project #34

Closed · Voveka98 closed this 5 months ago

Voveka98 commented 5 months ago

Hi! Thanks for your work. I'm having some trouble with the installation inside the docker image nvidia/cuda:12.1.0-devel-ubuntu22.04. I run docker with the following command:

docker run --gpus all -it  -v /Unique3D/:/workspace/ --net=host --shm-size 5g  --name unique3d  nvidia/cuda:12.1.0-devel-ubuntu22.04

Then I install all the requirements as in README.md, and when I try to launch the Gradio demo I get the following error:

[F glutil.cpp:338] eglInitialize() failed
Aborted (core dumped)

So I'd like to know whether you have a solution for this problem or are familiar with it. Thanks in advance!

Lektro9 commented 5 months ago

I had the same issue yesterday. Googling it suggests that OpenGL is not supported by the NVIDIA docker images (https://github.com/NVIDIA/nvidia-docker/issues/328).

What worked for me was replacing dr.RasterizeGLContext with dr.RasterizeCudaContext in the codebase. Just make sure to also remove the output_db=False argument that is passed first, since the CUDA context doesn't accept it. A rough sketch of the change is below.
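For reference, the swap looks roughly like this (a minimal sketch; the exact file and variable names in the Unique3D code may differ):

import nvdiffrast.torch as dr

# Before: OpenGL rasterizer, which needs a working EGL setup inside the container
# glctx = dr.RasterizeGLContext(output_db=False)

# After: CUDA rasterizer; RasterizeCudaContext takes no output_db argument,
# so that argument has to be dropped
glctx = dr.RasterizeCudaContext()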

I am not smart enough to know whether the output quality suffers from this change, but it works for me and I get excellent models (compared to other 3D mesh generators).

Voveka98 commented 5 months ago

@Lektro9 Thanks for the advice. I had seen this solution but couldn't get it to work because I hadn't removed output_db=False. I'll go try this!

jtydhr88 commented 5 months ago

I committed a Dockerfile: https://github.com/AiuniAI/Unique3D/tree/main/docker

Voveka98 commented 5 months ago

@jtydhr88 Thanks a lot! I will try this. FYI: I ran into a problem with GPU inference in the docker setup from the issue's opening message:

2024-06-24 10:58:35.423958997 [E:onnxruntime:Default, provider_bridge_ort.cc:1730 TryGetProviderInfo_TensorRT] /onnxruntime_src/onnxruntime/core/session/provider_bridge_ort.cc:1426 onnxruntime::Provider& onnxruntime::ProviderLibrary::Get() [ONNXRuntimeError] : 1 : FAIL : Failed to load library libonnxruntime_providers_tensorrt.so with error: libnvinfer.so.10: cannot open shared object file: No such file or directory

but it was due to an incorrect TensorRT installation. I fixed it with apt-get install tensorrt and now it works nicely!
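For anyone else debugging this, here is a quick check I'd use to confirm the TensorRT execution provider actually loads (my own sketch, not part of the repo; it assumes the onnx and onnxruntime-gpu packages are installed):

import numpy as np
import onnx
from onnx import helper, TensorProto
import onnxruntime as ort

# Build a trivial Identity model in memory, just to create a session with.
x = helper.make_tensor_value_info("x", TensorProto.FLOAT, [1, 3])
y = helper.make_tensor_value_info("y", TensorProto.FLOAT, [1, 3])
graph = helper.make_graph([helper.make_node("Identity", ["x"], ["y"])], "check", [x], [y])
model = helper.make_model(graph, opset_imports=[helper.make_opsetid("", 13)])
model.ir_version = 8  # keep the IR version modest for older onnxruntime builds

# Ask for TensorRT first; if libnvinfer is missing, onnxruntime logs an error
# like the one above and falls back to the CUDA/CPU providers.
sess = ort.InferenceSession(
    model.SerializeToString(),
    providers=["TensorrtExecutionProvider", "CUDAExecutionProvider", "CPUExecutionProvider"],
)
print(sess.get_providers())  # TensorrtExecutionProvider should appear first
print(sess.run(None, {"x": np.zeros((1, 3), dtype=np.float32)}))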

jtydhr88 commented 5 months ago

Yeah, I do the same thing in my Dockerfile.

jtydhr88 commented 5 months ago

I think this issue can be closed.