mkshing / e4t-diffusion

Implementation of Encoder-based Domain Tuning for Fast Personalization of Text-to-Image Models
https://arxiv.org/abs/2302.12228
MIT License

Can you specify versions in requirements.txt? #5

Closed · cutlass90 closed this issue 1 year ago

cutlass90 commented 1 year ago

Thank you for publishing this code. I tried to run it but got an error I could not resolve; it may be because I have different library versions. The error traceback is here:

Traceback (most recent call last):
  File "pretrain_e4t.py", line 20, in <module>
    from diffusers import DDPMScheduler, AutoencoderKL
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/diffusers/__init__.py", line 35, in <module>
    from .models import (
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/diffusers/models/__init__.py", line 19, in <module>
    from .autoencoder_kl import AutoencoderKL
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/diffusers/models/autoencoder_kl.py", line 23, in <module>
    from .vae import Decoder, DecoderOutput, DiagonalGaussianDistribution, Encoder
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/diffusers/models/vae.py", line 22, in <module>
    from .unet_2d_blocks import UNetMidBlock2D, get_down_block, get_up_block
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/diffusers/models/unet_2d_blocks.py", line 20, in <module>
    from .attention import AdaGroupNorm, AttentionBlock
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/diffusers/models/attention.py", line 22, in <module>
    from .cross_attention import CrossAttention
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/diffusers/models/cross_attention.py", line 29, in <module>
    import xformers.ops
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/xformers/ops/__init__.py", line 8, in <module>
    from .fmha import (
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/xformers/ops/fmha/__init__.py", line 10, in <module>
    from . import cutlass, flash, small_k, triton
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/xformers/ops/fmha/triton.py", line 15, in <module>
    if TYPE_CHECKING or _is_triton_available():
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/xformers/__init__.py", line 33, in func_wrapper
    value = func()
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/xformers/__init__.py", line 44, in _is_triton_available
    from xformers.triton.softmax import softmax as triton_softmax  # noqa
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/xformers/triton/__init__.py", line 12, in <module>
    from .dropout import FusedDropoutBias, dropout  # noqa
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/xformers/triton/dropout.py", line 13, in <module>
    import triton
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/triton/__init__.py", line 20, in <module>
    from .runtime import (
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/triton/runtime/__init__.py", line 1, in <module>
    from .autotuner import Config, Heuristics, autotune, heuristics
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/triton/runtime/autotuner.py", line 7, in <module>
    from ..compiler import OutOfResources
  File "/home/jupyter/e4t-diffusion/nenv/lib/python3.7/site-packages/triton/compiler.py", line 1047, in <module>
    @functools.lru_cache
  File "/opt/conda/lib/python3.7/functools.py", line 490, in lru_cache
    raise TypeError('Expected maxsize to be an integer or None')
TypeError: Expected maxsize to be an integer or None
mkshing commented 1 year ago

Apparently, you are using Python < 3.8; see https://github.com/NVIDIA/DeepLearningExamples/issues/1016. You need Python >= 3.8 instead.
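
For context on why the interpreter version matters: triton applies functools.lru_cache as a bare decorator (triton/compiler.py, line 1047 in the traceback above), a form Python only accepts since 3.8. A minimal sketch reproducing the difference, independent of this repo:

import functools

# On Python >= 3.8, lru_cache may be used as a bare decorator;
# the wrapped function is passed to it directly.
@functools.lru_cache
def double(x):
    return x * 2

# On Python 3.7, the bare form above raises
# "TypeError: Expected maxsize to be an integer or None",
# because 3.7's lru_cache expects maxsize as its first argument.
# The 3.7-compatible spelling calls the decorator factory:
@functools.lru_cache(maxsize=None)
def double_compat(x):
    return x * 2

print(double(21), double_compat(21))  # 42 42 on Python >= 3.8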

cutlass90 commented 1 year ago

> Apparently, you are using Python < 3.8; see NVIDIA/DeepLearningExamples#1016. You need Python >= 3.8 instead.

Indeed, switching to Python 3.8 fixed the problem. Thanks!
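
As an aside, a guard at the top of pretrain_e4t.py could surface this requirement directly instead of triton's confusing TypeError. A minimal sketch (hypothetical; not part of the current repo code):

import sys

# xformers pulls in triton, which uses functools.lru_cache as a
# bare decorator and therefore needs Python >= 3.8.
if sys.version_info < (3, 8):
    raise RuntimeError(
        "Python >= 3.8 is required, found " + sys.version.split()[0]
    )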