threestudio-project / threestudio

A unified framework for 3D content generation.
Apache License 2.0

xformers version #380

Open zimingzhong opened 9 months ago

zimingzhong commented 9 months ago

We have to use an older xformers version such as 0.0.12 to satisfy torch==1.12.1+cu113; installing a newer xformers automatically pulls in torch==2.1.2.

However, with the older xformers, xformers.ops.memory_efficient_attention hits a version problem, and I have no idea how to fix it:

4dfy/extern/MVDream/mvdream/ldm/modules/diffusionmodules/model.py", line 258, in forward

 out = xformers.ops.memory_efficient_attention(q, k, v, attn_bias=None, op=self.attention_op)

TypeError: memory_efficient_attention() got an unexpected keyword argument 'attn_bias'
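The TypeError suggests that older xformers releases (around 0.0.12) did not yet accept an attn_bias keyword in memory_efficient_attention. One possible workaround, sketched below, is a small wrapper that inspects the callable's signature and forwards only the keyword arguments it actually declares. call_mea is a hypothetical helper name, not part of MVDream or xformers.

```python
import inspect

def call_mea(mea_fn, q, k, v, attn_bias=None, op=None):
    """Call memory_efficient_attention across xformers versions.

    Older releases did not accept the `attn_bias` keyword, so we
    forward only the keyword arguments the function declares.
    (A sketch, assuming signature inspection is sufficient; it is
    not an official xformers compatibility shim.)
    """
    params = inspect.signature(mea_fn).parameters
    kwargs = {}
    if "attn_bias" in params:
        kwargs["attn_bias"] = attn_bias
    if "op" in params and op is not None:
        kwargs["op"] = op
    return mea_fn(q, k, v, **kwargs)
```

In model.py one would then replace the direct call with call_mea(xformers.ops.memory_efficient_attention, q, k, v, attn_bias=None, op=self.attention_op), so the same line works on both old and new releases.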
Eecornwell commented 9 months ago

xformers enforces the PyTorch version through each release's pinned dependency. I had to pick one xformers version to stick with and map the rest of the dependencies from there. For example, on Ubuntu 22.04 I went with pip install xformers==0.0.22, which pins PyTorch at 2.0.1 and CUDA at 11.8. I then made sure tinycudann built against the correct version by running pip install git+https://github.com/NVlabs/tiny-cuda-nn.git#subdirectory=bindings/torch rather than using their pre-built wheels.
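The "pick one xformers version and map dependencies from there" approach can be captured in a small lookup when scripting environment setup. The table below lists only the combinations reported in this thread; it is not an authoritative compatibility matrix (check the xformers release notes for that), and torch_pin_for is an illustrative name.

```python
# Known-compatible pins mentioned in this thread only -- not an
# exhaustive or authoritative table.
XFORMERS_TORCH_PINS = {
    "0.0.12": "1.12.1",  # OP's working combination (cu113)
    "0.0.22": "2.0.1",   # Eecornwell's combination (cu118)
}

def torch_pin_for(xformers_version):
    """Return the torch version pinned by a given xformers release,
    or None if it is not in the table above."""
    return XFORMERS_TORCH_PINS.get(xformers_version)
```

A setup script could call torch_pin_for("0.0.22") and fail fast if the installed torch.__version__ does not match, instead of letting pip silently upgrade torch.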

ivanpuhachov commented 6 months ago

Had a similar warning; fixed it with:

pip3 install -U xformers==0.0.23 --index-url https://download.pytorch.org/whl/cu118

See xformers repo (https://github.com/facebookresearch/xformers)
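After reinstalling, it is worth confirming which torch/xformers pair actually ended up in the environment, since pip may have resolved something unexpected. The snippet below is a hypothetical sanity-check helper, not part of threestudio; it degrades gracefully when a package is missing.

```python
def report_versions():
    """Report installed versions of the packages this thread pins.

    Returns one line per package, e.g. "torch 2.0.1" or
    "xformers not installed" (sketch; assumes both packages
    expose __version__ when importable).
    """
    lines = []
    for name in ("torch", "xformers"):
        try:
            mod = __import__(name)
            lines.append(f"{name} {mod.__version__}")
        except ImportError:
            lines.append(f"{name} not installed")
    return lines
```

Running print("\n".join(report_versions())) right after the pip install makes a mismatched pin visible immediately.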