unslothai / unsloth

Finetune Llama 3.2, Mistral, Phi, Qwen 2.5 & Gemma LLMs 2-5x faster with 80% less memory
https://unsloth.ai
Apache License 2.0
18.37k stars 1.28k forks

`{% if add_generation_prompt %}` [FIXED] #1284

Open giuliabaldini opened 1 week ago

giuliabaldini commented 1 week ago

Hi there,

if I run my usual code after the Qwen 2.5 commit, I get multiple errors. The first one is the following:

jinja2.exceptions.TemplateSyntaxError: Encountered unknown tag 'endfor'. Jinja was looking for the following tags: 'elif' or 'else' or 'endif'. The innermost block that needs to be closed is 'if'.

which is probably caused by the change in this line. Once I fix that, I still get:

RuntimeError: Unsloth: The tokenizer `OpenMeditron/Meditron3-8B`
does not have a {% if add_generation_prompt %} for generation purposes.
Please file a bug report immediately - thanks!

Any ideas?

Best, Giulia

xizhangmable commented 1 week ago

I have a similar issue when running:

train_ds = train_ds.map(lambda x: {"training_prompt": tokenizer.apply_chat_template(x["chat"], tokenize=False, add_generation_prompt=False)})

TemplateSyntaxError: Encountered unknown tag 'endfor'. Jinja was looking for the following tags: 'elif' or 'else' or 'endif'. The innermost block that needs to be closed is 'if'.
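For what it's worth, a chat template can be sanity-checked before mapping the whole dataset. This is a minimal sketch (assuming `jinja2` is installed, which it is wherever `apply_chat_template` works, since transformers renders chat templates with it); the mismatched-block example reproduces the same class of error reported above:

```python
from jinja2 import Environment
from jinja2.exceptions import TemplateSyntaxError

def template_parses(template: str) -> bool:
    """Return True if the Jinja chat template parses cleanly."""
    try:
        Environment().parse(template)
        return True
    except TemplateSyntaxError as e:
        print(f"TemplateSyntaxError: {e.message}")
        return False

# A {% for %} / {% if %} pair closed in the wrong order triggers the
# "Encountered unknown tag 'endfor'" error from the reports above:
broken = "{% for m in messages %}{% if m %}{{ m }}{% endfor %}{% endif %}"
template_parses(broken)            # False, prints the syntax error
template_parses("{{ messages }}")  # True
```

Running `template_parses(tokenizer.chat_template)` right after loading a model would surface a broken template before training starts.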

danielhanchen commented 1 week ago

Apologies, just fixed @giuliabaldini @xizhangmable - thanks for reporting! Please update Unsloth on local machines via pip install --upgrade --no-cache-dir --no-deps unsloth

For Colab and Kaggle, just refresh!
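To double-check that the upgrade actually landed, the installed version can be read without importing the package. This is a generic sketch, not an Unsloth-specific API:

```python
from importlib.metadata import version, PackageNotFoundError

def installed_version(package: str):
    """Return the installed version string, or None if the package is absent."""
    try:
        return version(package)
    except PackageNotFoundError:
        return None

# e.g. compare installed_version("unsloth") before and after the upgrade
print(installed_version("pip"))
```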

GreenBogDes commented 5 days ago

I am still getting this error in Google Colab in a new session. Rolling back to a2f8db3e7341f983af5814a2c56f54fa29ee548d fixed it for me.

---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
in ()
     17 # ] # More models at https://huggingface.co/unsloth
     18
---> 19 model, tokenizer = FastLanguageModel.from_pretrained(
     20     model_name = "mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated",
     21     max_seq_length = max_seq_length,

3 frames
/usr/local/lib/python3.10/dist-packages/unsloth/tokenizer_utils.py in fix_chat_template(tokenizer)
    656 if "{% if add_generation_prompt %}" not in new_chat_template and \
    657    "{%- if add_generation_prompt %}" not in new_chat_template:
--> 658     raise RuntimeError(
    659         f"Unsloth: The tokenizer `{tokenizer.name_or_path}`\n"\
    660         "does not have a {% if add_generation_prompt %} for generation purposes.\n"\

RuntimeError: Unsloth: The tokenizer `mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated`
does not have a {% if add_generation_prompt %} for generation purposes.
Please file a bug report immediately - thanks!

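The raise comes from a plain substring check on the chat template, as the frame above shows. Roughly (a simplified sketch of that guard, not the full fix_chat_template function):

```python
def has_generation_prompt_branch(chat_template: str) -> bool:
    """Mirror of the guard in unsloth/tokenizer_utils.py shown above:
    both the '{%' and the whitespace-trimming '{%-' spellings are accepted."""
    return ("{% if add_generation_prompt %}" in chat_template
            or "{%- if add_generation_prompt %}" in chat_template)

# A template with neither spelling trips the RuntimeError above:
has_generation_prompt_branch("{% for m in messages %}{{ m }}{% endfor %}")  # False
```

So any tokenizer whose template uses a different spelling (or lacks the branch entirely) hits this error, regardless of whether the template itself is valid.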
scoliono commented 2 days ago

Getting this issue on kaggle with the same Meta-Llama-3.1-8B-Instruct-abliterated model.

@GreenBogDes mind sharing how you installed? I've tried pip install git+https://github.com/unslothai/unsloth.git@a2f8db3e7341f983af5814a2c56f54fa29ee548d (and several variations) but then I get errors when trying to import unsloth

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
File /opt/conda/lib/python3.10/site-packages/unsloth/__init__.py:32
     31 try:
---> 32     import unsloth_zoo
     33 except:

File /opt/conda/lib/python3.10/site-packages/unsloth_zoo/__init__.py:27
     26 if not ("UNSLOTH_IS_PRESENT" in os.environ):
---> 27     raise ImportError("Please install Unsloth via `pip install unsloth`!")
     28 pass

ImportError: Please install Unsloth via `pip install unsloth`!

During handling of the above exception, another exception occurred:

ImportError                               Traceback (most recent call last)
Cell In[3], line 1
----> 1 from unsloth import FastLanguageModel
      2 import torch
      3 max_seq_length = 2048 # Choose any! We auto support RoPE Scaling internally!

File /opt/conda/lib/python3.10/site-packages/unsloth/__init__.py:34
     32     import unsloth_zoo
     33 except:
---> 34     raise ImportError("Unsloth: Please install unsloth_zoo via `pip install unsloth-zoo`")
     35 pass
     37 # Unsloth currently does not work on multi GPU setups - sadly we are a 2 brother team so
     38 # enabling it will require much more work, so we have to prioritize. Please understand!
     39 # We do have a beta version, which you can contact us about!
     40 # Thank you for your understanding and we appreciate it immensely!

ImportError: Unsloth: Please install unsloth_zoo via `pip install unsloth-zoo`

GreenBogDes commented 2 days ago

> Getting this issue on kaggle with the same Meta-Llama-3.1-8B-Instruct-abliterated model.
>
> @GreenBogDes mind sharing how you installed? I've tried pip install git+https://github.com/unslothai/unsloth.git@a2f8db3e7341f983af5814a2c56f54fa29ee548d (and several variations) but then I get errors when trying to import unsloth

What helped me was adding these lines:

!pip install git+https://github.com/unslothai/unsloth-zoo.git
import os
os.environ["UNSLOTH_IS_PRESENT"] = "1"

But it still doesn't work in the new version. Here's my notebook: https://colab.research.google.com/drive/1MGwKUq1O46IFkR5R6MmPn1ZtonaDhSN9?usp=sharing
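For context, the env-var trick works because of the guard visible in the unsloth_zoo traceback earlier in this thread. A minimal sketch of that gate:

```python
import os

def unsloth_zoo_gate() -> None:
    """Sketch of the guard in unsloth_zoo/__init__.py quoted above:
    importing unsloth_zoo without going through unsloth raises ImportError."""
    if "UNSLOTH_IS_PRESENT" not in os.environ:
        raise ImportError("Please install Unsloth via `pip install unsloth`!")

# Setting the variable first (as in the workaround) disarms the guard:
os.environ["UNSLOTH_IS_PRESENT"] = "1"
unsloth_zoo_gate()  # no exception
```

Normally unsloth sets this variable itself before importing unsloth_zoo, which is why the error only appears when unsloth_zoo is installed or imported on its own.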

scoliono commented 2 days ago

This actually worked for me. It's very hacky, but here's what I have so far, in case it helps anyone else. Some commands are taken from the Colab starter notebook.

%%capture
!pip install pip3-autoremove
!pip-autoremove torch torchvision torchaudio -y
!pip install torch torchvision torchaudio xformers --index-url https://download.pytorch.org/whl/cu121
!pip install unsloth[kaggle-new]
!pip uninstall unsloth -y && pip install git+https://github.com/unslothai/unsloth.git@a2f8db3e7341f983af5814a2c56f54fa29ee548d
!pip install git+https://github.com/unslothai/unsloth-zoo.git
import os
os.environ["UNSLOTH_IS_PRESENT"] = "1"

Then, after loading the model:

Unsloth: We successfully patched the tokenizer to add a {% if add_generation_prompt %} to the chat_template. This is not a bug, but please notify the Unsloth maintainers - thanks!
mlabonne/Meta-Llama-3.1-8B-Instruct-abliterated does not have a padding token! Will use pad_token = <|finetune_right_pad_id|>.

It appears to be training now.