vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

[Misc]: Cohere models are not working due to an outdated transformers library? #3728

Open · Playerrrrr opened 3 months ago

Playerrrrr commented 3 months ago

Anything you want to discuss about vllm.

Got this error:

ValueError: The checkpoint you are trying to load has model type cohere but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
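
For context, support for the "cohere" model type landed in transformers 4.39.0, so any older release fails with exactly this ValueError. A minimal version check (just a sketch) looks like:

import transformers
from packaging import version  # packaging is already a transformers dependency

# The "cohere" architecture was registered in transformers 4.39.0;
# anything older raises the ValueError quoted above.
if version.parse(transformers.__version__) < version.parse("4.39.0"):
    raise RuntimeError(
        "transformers is too old for Cohere models; "
        "run: pip install -U 'transformers>=4.39.1'"
    )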

hliuca commented 3 months ago

use 4.39.1?

ganeshkamath89 commented 2 months ago

I am getting the error with 4.40.0 as well; is that expected? Is there any alternative way to work around this error?

Vindhya-Singh commented 2 months ago

Is there any solution to this? I get the following error:

ModuleNotFoundError: No module named 'transformers.models.cohere.configuration_cohere'

My code reads as:

import torch  # needed for torch.float16 below

from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    BitsAndBytesConfig,
    HfArgumentParser,
    TrainingArguments,
    pipeline,
    logging,
)

model_name = "Orkhan/llama-2-7b-absa"

base_model = AutoModelForCausalLM.from_pretrained(
    model_name,
    low_cpu_mem_usage=True,
    return_dict=True,
    torch_dtype=torch.float16,  # fp16 weights to halve memory use
    device_map={"": 0},         # place the whole model on GPU 0
)
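
A quick way to check whether the installed transformers actually ships the Cohere module this traceback is looking for (a minimal sketch):

import importlib.util

# On transformers < 4.39 this prints None, which is why importing
# transformers.models.cohere.configuration_cohere fails above.
print(importlib.util.find_spec("transformers.models.cohere"))
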
senthilscouser commented 1 month ago

Getting the same import error. I checked transformers==4.40.2 and found 'transformers.configuration_cohere', but I could not find 'transformers.models.cohere.configuration_cohere'.

senthilscouser commented 1 month ago

"use 4.39.1?"

No use, I tried it.

sriramanush105 commented 3 weeks ago

Using transformers version 4.39.1 resolved the error for me.
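
For anyone landing here later: once a recent enough transformers is installed, a Cohere checkpoint should load in vLLM as usual. A minimal sketch (the model ID below is only an example, and Command-R needs substantial GPU memory):

from vllm import LLM, SamplingParams

# Any checkpoint whose config has model_type "cohere" applies here;
# requires transformers >= 4.39 alongside vLLM.
llm = LLM(model="CohereForAI/c4ai-command-r-v01")
outputs = llm.generate(["Hello, world!"], SamplingParams(max_tokens=16))
print(outputs[0].outputs[0].text)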