mindspore-lab / mindone

one for all, Optimal generator with No Exception
https://mindspore-lab.github.io/mindone/
Apache License 2.0

ModuleNotFoundError: No module named 'te' #293

Open songtianhui opened 5 months ago

songtianhui commented 5 months ago

I ran sh examples/pangu_draw_v3/run_sampling.sh, and the FlashAttention module at https://github.com/mindspore-lab/mindone/blob/3ef47e4176b193845c628fd3059cae2d4cff6559/examples/pangu_draw_v3/gm/modules/attention.py#L18 could not be imported. After removing the try logic, the error is ModuleNotFoundError: No module named 'te'. Traceback:

Traceback (most recent call last):
  File "/data/songtianhui.sth/projects/mindone/examples/pangu_draw_v3/pangu_sampling.py", line 7, in <module>
    from gm.helpers import SD_XL_BASE_RATIOS, VERSION2SPECS, create_model, init_sampling, load_img, perform_save_locally
  File "/data/songtianhui.sth/projects/mindone/examples/pangu_draw_v3/gm/helpers.py", line 12, in <module>
    from gm.modules.diffusionmodules.discretizer import Img2ImgDiscretizationWrapper, Txt2NoisyDiscretizationWrapper
  File "/data/songtianhui.sth/projects/mindone/examples/pangu_draw_v3/gm/modules/__init__.py", line 1, in <module>
    from .embedders.modules import GeneralConditioner
  File "/data/songtianhui.sth/projects/mindone/examples/pangu_draw_v3/gm/modules/embedders/__init__.py", line 1, in <module>
    from .modules import FrozenCLIPEmbedder, FrozenOpenCLIPEmbedder2, GeneralConditioner
  File "/data/songtianhui.sth/projects/mindone/examples/pangu_draw_v3/gm/modules/embedders/modules.py", line 6, in <module>
    from gm.modules.diffusionmodules.openaimodel import Timestep
  File "/data/songtianhui.sth/projects/mindone/examples/pangu_draw_v3/gm/modules/diffusionmodules/openaimodel.py", line 6, in <module>
    from gm.modules.attention import SpatialTransformer
  File "/data/songtianhui.sth/projects/mindone/examples/pangu_draw_v3/gm/modules/attention.py", line 13, in <module>
    from mindspore.nn.layer.flash_attention import FlashAttention
  File "/data/songtianhui.sth/apps/miniconda3/envs/pangu/lib/python3.9/site-packages/mindspore/nn/layer/flash_attention.py", line 24, in <module>
    from mindspore.ops._op_impl._custom_op.flash_attention.flash_attention_impl import get_flash_attention
  File "/data/songtianhui.sth/apps/miniconda3/envs/pangu/lib/python3.9/site-packages/mindspore/ops/_op_impl/_custom_op/__init__.py", line 17, in <module>
    from mindspore.ops._op_impl._custom_op.dsd_impl import dsd_matmul
  File "/data/songtianhui.sth/apps/miniconda3/envs/pangu/lib/python3.9/site-packages/mindspore/ops/_op_impl/_custom_op/dsd_impl.py", line 17, in <module>
    from te import tik
ModuleNotFoundError: No module named 'te'

How do I install this te package, or is there a problem with my MindSpore installation? I installed MindSpore with pip install https://ms-release.obs.cn-north-4.myhuaweicloud.com/2.2.10/MindSpore/unified/x86_64/mindspore-2.2.10-cp39-cp39-linux_x86_64.whl --trusted-host ms-release.obs.cn-north-4.myhuaweicloud.com -i https://pypi.tuna.tsinghua.edu.cn/simple. The GPU is a V100.
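
A quick way to confirm whether this is just a missing Ascend dependency rather than a broken MindSpore install is the minimal sketch below (my own diagnostic, not code from the repo; it assumes te is the Tensor Engine package that ships with Huawei's Ascend CANN toolkit and is therefore normally absent on a GPU-only machine):

# Minimal diagnostic sketch (not from the repo): check whether te is importable at all.
# Assumption: te (Tensor Engine) comes from the Ascend CANN toolkit, not the MindSpore wheel.
import importlib.util

for name in ("mindspore", "te"):
    spec = importlib.util.find_spec(name)
    print(f"{name}: {'found' if spec is not None else 'NOT importable'}")

# If te prints NOT importable, mindspore.nn.layer.flash_attention cannot be imported either,
# which matches the ModuleNotFoundError in the traceback above.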

songtianhui commented 5 months ago

Is this library only supported on Ascend devices, not on GPU?

vigo999 commented 5 months ago

GPU is not supported yet.

vigo999 commented 5 months ago

The GPU adaptation is planned to be completed in March.

songtianhui commented 5 months ago

Thanks for the reply! Looking forward to the adaptation!

ultranationalism commented 5 months ago

Is this library only supported on Ascend devices, not on GPU?

If you just want it to run, you can do without flash attention: remove the flash attention part of attention.py and set
FLASH_IS_AVAILABLE = False directly, then in model.py change from gm.modules.attention import FLASH_IS_AVAILABLE, FlashAttention, LinearAttention to from gm.modules.attention import FLASH_IS_AVAILABLE, LinearAttention (a minimal sketch of this change follows below). Tested on an A40 so far, sampling only works with --ms_mode set to 0; 512×512 uses 26 GB of GPU memory and 1024×1024 uses 33 GB.
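
A minimal sketch of the change described above (the import path is taken from the traceback in this issue; the exact layout of attention.py and model.py may differ):

# examples/pangu_draw_v3/gm/modules/attention.py -- guard the Ascend-only import so it
# degrades to plain attention instead of crashing when the te package is missing.
try:
    from mindspore.nn.layer.flash_attention import FlashAttention  # needs Ascend CANN (te)
    FLASH_IS_AVAILABLE = True
except ImportError:
    # GPU machines (V100, A40, ...) have no te module, so fall back to vanilla attention.
    FLASH_IS_AVAILABLE = False

# model.py -- drop FlashAttention from the import so the name no longer has to exist:
#   before: from gm.modules.attention import FLASH_IS_AVAILABLE, FlashAttention, LinearAttention
#   after:  from gm.modules.attention import FLASH_IS_AVAILABLE, LinearAttention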

Yuezeyi commented 5 months ago

Is this library only supported on Ascend devices, not on GPU?

If you just want it to run, you can do without flash attention: remove the flash attention part of attention.py and set FLASH_IS_AVAILABLE = False directly, then in model.py change from gm.modules.attention import FLASH_IS_AVAILABLE, FlashAttention, LinearAttention to from gm.modules.attention import FLASH_IS_AVAILABLE, LinearAttention. Tested on an A40 so far, sampling only works with --ms_mode set to 0; 512×512 uses 26 GB of GPU memory and 1024×1024 uses 33 GB.

How do the images you generate with PanGu on GPU look? The ones I get look quite different from the examples, and I am not sure whether I ran something wrong...

songtianhui commented 2 months ago

The GPU adaptation is planned to be completed in March.

Is GPU supported now?