unslothai / unsloth

Finetune Llama 3, Mistral, Phi & Gemma LLMs 2-5x faster with 80% less memory
https://unsloth.ai
Apache License 2.0

[FIXED] NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs #400

Open rantianhua opened 2 months ago

rantianhua commented 2 months ago

I'm a beginner trying out unsloth. I ran the free Llama 3 (8B) notebook and got the following error:

(screenshot of the traceback: NotImplementedError: No operator found for `memory_efficient_attention_forward` with inputs)

I also encountered the following error during the first installing step:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
xformers 0.0.26.post1 requires torch==2.3.0, but you have torch 2.2.1+cu121 which is incompatible.
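
The mismatch pip complains about can be confirmed by printing the versions that actually got installed; a minimal check in a notebook cell (package names as in the error above):

import torch
import xformers

# xformers pins an exact torch release, so any mismatch between these two
# reproduces the resolver error above.
print("torch:   ", torch.__version__)     # e.g. 2.2.1+cu121
print("xformers:", xformers.__version__)  # e.g. 0.0.26.post1 (needs torch 2.3.0)
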
tariksghiouri commented 2 months ago

Facing the same issue, and I have no idea why. The cell was running fine just a couple of days ago.

jamiehughes5926 commented 2 months ago

Facing the same issue.

danielhanchen commented 2 months ago

@rantianhua @tariksghiouri @jamiehughes5926 Apologies for the issue! I should have written it here - please update the first cell's installation instructions from

%%capture
import torch
major_version, minor_version = torch.cuda.get_device_capability()
# Must install separately since Colab has torch 2.2.1, which breaks packages
!pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
if major_version >= 8:
    # Use this for new GPUs like Ampere, Hopper GPUs (RTX 30xx, RTX 40xx, A100, H100, L40)
    !pip install --no-deps packaging ninja einops flash-attn xformers trl peft accelerate bitsandbytes
else:
    # Use this for older GPUs (V100, Tesla T4, RTX 20xx)
    !pip install --no-deps xformers trl peft accelerate bitsandbytes
pass

to

%%capture
# Installs Unsloth, Xformers (Flash Attention) and all other packages!
!pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
!pip install --no-deps "xformers<0.0.26" trl peft accelerate bitsandbytes
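
After re-running the fixed cell, it is worth verifying that a compatible xformers build is actually active; a minimal check (assuming a Colab-style notebook cell):

# List the attention backends the installed xformers wheel provides; the
# original error means none of them were usable for this GPU/torch combo.
!python -m xformers.info
# Confirm the pinned xformers release matches the preinstalled torch.
!python -c "import torch, xformers; print(torch.__version__, xformers.__version__)"
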
tariksghiouri commented 2 months ago

thank you Daniel :)

Kishore-bluekyte commented 1 month ago

The suggested change is not working on my AWS notebook instance with 1 A10G GPU. The error persists.

tariksghiouri commented 1 month ago

> The suggested change is not working on my AWS notebook instance with 1 A10G GPU. The error persists.

In my case, just following Daniel's suggestion solved the issue.

danielhanchen commented 1 month ago

@Kishore-bluekyte Oh, A10G, hmm :( OK, that's very weird - what's your PyTorch version?
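
For anyone hitting this, both the PyTorch version and the GPU's compute capability can be printed from a notebook cell:

import torch

print(torch.__version__)                   # e.g. 2.1.0+cu118
print(torch.version.cuda)                  # CUDA version torch was built against
print(torch.cuda.get_device_capability())  # an A10G reports (8, 6), i.e. Ampere
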

michaelchang328 commented 2 weeks ago

> @Kishore-bluekyte Oh, A10G, hmm :( OK, that's very weird - what's your PyTorch version?

I am also facing the same issue with an A10G on an AWS notebook; my PyTorch version is currently 2.1.0.

sarthakforwet commented 1 day ago

I am also facing the same issue on my Kaggle notebook. My PyTorch version is 2.1.2, and I am following the installation suggestion mentioned by @danielhanchen.
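
For completeness, a minimal smoke test of the failing operator itself can show whether the installed build works at all; a sketch with made-up tensor shapes (most xformers attention kernels expect fp16/bf16 inputs on a CUDA device):

import torch
import xformers.ops

# Tiny (batch, seq_len, heads, head_dim) query/key/value tensors.
q = torch.randn(1, 8, 4, 64, device="cuda", dtype=torch.float16)

# If this raises NotImplementedError, the installed xformers wheel has no
# kernel built for this GPU/torch combination, and reinstalling with the
# pinned version from the fixed cell above is the suggested remedy.
out = xformers.ops.memory_efficient_attention(q, q, q)
print(out.shape)  # torch.Size([1, 8, 4, 64])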