MeterPreter57 closed this issue 1 year ago.
Can you make sure your notebook is up to date? I know the xformers builder used to build the minimized version without the forward attention operators.
Otherwise, something funky might be happening with torch versions: PyTorch 1.12.0+cu116 with CUDA 1102 (you have 1.13.1+cu117)
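To see what the mismatch actually is on your side, it's enough to print the local torch build and the installed xformers wheel (plain pip/python, nothing notebook-specific assumed here):

# torch version and the CUDA toolkit it was compiled against
python -c "import torch; print(torch.__version__, torch.version.cuda)"
# which xformers wheel (if any) is currently installed
pip show xformers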
I'm having the same problem although my error says:
WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for:
PyTorch 2.0.0+cu118 with CUDA 1108 (you have 1.13.1+cu117)
I tried updating the launch.py file to:
torch_command = os.environ.get('TORCH_COMMAND', "pip install torch==2.0.0+cu118 torchvision==0.14.1+cu118 --extra-index-url https://download.pytorch.org/whl/cu118")
I also set the requirements_versions.txt to torch==2.0.0
That didn't work, just created a bunch of errors when I ran the notebook so I changed it back. I also tried reinstalling the stable-diffusion-webui folder to no avail.
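Looking at it again, part of why that pin failed may be the pairing itself: torchvision==0.14.1 is the build that goes with torch 1.13.x, while torch 2.0.0 pairs with torchvision 0.15.x. If anyone retries this route, something like the following should at least be internally consistent (the pins are from memory, so double-check them against the PyTorch install matrix before relying on this):

# candidate contents for TORCH_COMMAND; torchvision 0.15.1 is the release paired with torch 2.0.0
pip install torch==2.0.0+cu118 torchvision==0.15.1+cu118 --extra-index-url https://download.pytorch.org/whl/cu118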
My notebook is up to date. Not a huge deal, but hopefully someone can get to the bottom of this.
I solved the issue. The problem was that I did not run the "Install requirements and download repositories" block before building xFormers. I updated the notebook, reset storage, and built xFormers again. The old errors are gone, but I got a new one:
TypeError: Descriptors cannot not be created directly.
If this call came from a _pb2.py file, your generated code is out of date and must be regenerated with protoc >= 3.19.0.
If you cannot immediately regenerate your protos, some other possible workarounds are:
1. Downgrade the protobuf package to 3.20.x or lower.
2. Set PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python (but this will use pure-Python parsing and will be much slower).
I downgraded the protobuf package to 3.20.3. Now everything works without errors, but I don't think xFormers is actually doing anything, because generating an image takes the same amount of time as it did before I installed it.
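For anyone hitting the same thing, the commands are just the ones the error itself suggests (3.20.3 is simply the 3.20.x release I picked):

# option 1: downgrade protobuf
pip install protobuf==3.20.3
# option 2: keep protobuf but force the slower pure-Python parser
export PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python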
I had that error, but I don't remember how I fixed it. I might have re-built xformers...
I basically went through the exact steps you described here, and now everything works, so thanks. Weird that xFormers isn't giving you any speed benefit, though; it seems to be working fine for me.
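If you want to check whether xFormers is actually being used rather than just installed: xformers ships a small diagnostic module, and the webui only applies it when started with the --xformers command-line flag (check how your notebook passes launch arguments):

# prints the xformers build info and which attention ops are usable on this GPU
python -m xformers.info
# on the webui side, add --xformers to the launch arguments (e.g. COMMANDLINE_ARGS)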
@personxyz what GPU are you using?
A6000.
I rebuilt xFormers again and got the same result. It seems the Quadro M4000 is just too old to benefit from xFormers.
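For reference, torch can report the card's compute capability; the M4000 is Maxwell (5.2), and as far as I understand the fast memory-efficient attention kernels mainly target much newer architectures:

# print the GPU name and the CUDA compute capability torch sees
python -c "import torch; print(torch.cuda.get_device_name(0), torch.cuda.get_device_capability(0))"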
WARNING[XFORMERS]: xFormers can't load C++/CUDA extensions. xFormers was built for: PyTorch 2.0.0+cu118 with CUDA 1108 (you have 1.13.1+cu117)
When I ran into this problem, my fix was as follows. The "(you have 1.13.1+cu117)" part is the version installed on your own system, so reinstall xFormers to match it:
1. Open a command-line interface: navigate to x:\xxx\xxx\python (the python folder inside your SD directory) and type CMD in the address bar there.
2. In the console, uninstall the currently installed xFormers (if it is installed): pip uninstall xformers
3. Once that finishes, open the launcher (mine is 秋叶's 绘世 launcher), go to Advanced Options; next to the one-click launch button there is an Environment Maintenance section. Open Configure PyTorch, select the version that matches your "(you have 1.13.1+cu117)", and click install. That resolves the problem.
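The manual (non-launcher) equivalent would be roughly this; the xformers version pin is an assumption from memory (0.0.16 was, I believe, the release built against torch 1.13.1), so check the xformers release notes for the wheel that matches your local torch before installing:

# remove the mismatched build
pip uninstall -y xformers
# reinstall a wheel built for the local torch (1.13.1+cu117 here); version pin is an assumption, verify first
pip install xformers==0.0.16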
Hi, I have an issue with xFormers. I built xFormers using the cell in the Tools section.
Then I reran the "Install requirements and download repositories" cell. Then I launched the WebUI and got this:
Now when I try to generate an image I get this:
python: 3.9.13 torch: 1.13.1+cu117 xformers: 0.0.18+da27862.d20230331 gradio: 3.16.2 commit: a9fed7c3 checkpoint: 7f16bbcd80