mistralai / mistral-inference

Official inference library for Mistral models
https://mistral.ai/
Apache License 2.0

BUG: ModuleNotFoundError: No module named 'mistral_inference.transformer' #202

Open yafangwang9 opened 2 months ago

yafangwang9 commented 2 months ago

Python -VV

from mistral_inference.transformer import Transformer
from mistral_inference.generate import generate

from mistral_common.tokens.tokenizers.mistral import MistralTokenizer
from mistral_common.protocol.instruct.messages import UserMessage
from mistral_common.protocol.instruct.request import ChatCompletionRequest

tokenizer = MistralTokenizer.from_file(f"{mistral_models_path}/tekken.json")
model = Transformer.from_folder(mistral_models_path)

prompt = "How expensive would it be to ask a window cleaner to clean all windows in Paris. Make a reasonable guess in US Dollar."

completion_request = ChatCompletionRequest(messages=[UserMessage(content=prompt)])

tokens = tokenizer.encode_chat_completion(completion_request).tokens

out_tokens, _ = generate([tokens], model, max_tokens=64, temperature=0.35, eos_id=tokenizer.instruct_tokenizer.tokenizer.eos_id)
result = tokenizer.decode(out_tokens[0])

print(result)
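
For reference, the snippet above assumes mistral_models_path already points to a local folder containing params.json, consolidated.safetensors and tekken.json. A minimal sketch of how that folder might be prepared is below; the repo id and file list follow the Mistral-Nemo model card, the target path is just an assumption, and the repository may require accepting its license on Hugging Face first.

from pathlib import Path
from huggingface_hub import snapshot_download

# Assumed target folder; any writable local path works.
mistral_models_path = Path.home().joinpath('mistral_models', 'Nemo-Instruct')
mistral_models_path.mkdir(parents=True, exist_ok=True)

# Download only the files mistral-inference needs.
snapshot_download(
    repo_id="mistralai/Mistral-Nemo-Instruct-2407",
    allow_patterns=["params.json", "consolidated.safetensors", "tekken.json"],
    local_dir=mistral_models_path,
)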

Pip Freeze

---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
Input In [3], in <cell line: 1>()
----> 1 from mistral_inference.transformer import Transformer
      2 from mistral_inference.generate import generate
      4 from mistral_common.tokens.tokenizers.mistral import MistralTokenizer

ModuleNotFoundError: No module named 'mistral_inference.transformer'

Reproduction Steps

I used mistral-inference for Mistral-Nemo and got this issue.

Expected Behavior

https://huggingface.co/mistralai/Mistral-Nemo-Instruct-2407

Additional Context

No response

Suggested Solutions

No response

bhargavyagnik commented 2 months ago

Did you build mistral-inference from source or install it via pip? Can you run pip freeze to find the details?
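
As a quick alternative to scanning the full pip freeze output, a small in-Python check (a sketch, not part of the original exchange) shows which mistral-inference version and install location the running interpreter actually resolves:

import importlib.metadata
import importlib.util

# Installed distribution version, if the package is visible to this interpreter.
try:
    print("version:", importlib.metadata.version("mistral-inference"))
except importlib.metadata.PackageNotFoundError:
    print("mistral-inference is not installed in this environment")

# Where the mistral_inference package would be imported from.
spec = importlib.util.find_spec("mistral_inference")
print("location:", spec.origin if spec else "not importable")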

yafangwang9 commented 2 months ago

Thank you for the reply 🥰

Hi~ I followed the official instructions to install mistral-inference via pip, and here are the results from pip freeze. I am running the installation on a cloud service provider in China.

(I don't see the mistral-inference package that I installed with pip install in the output.)

kiwisolver==1.3.2
lightgbm==2.3.1
Markdown==3.3.6
MarkupSafe==2.1.0
matplotlib==3.5.1
matplotlib-inline==0.1.3
mistune==0.8.4
mmcv-full==1.4.8
mmdet==2.23.0
msgpack==1.0.3
multidict==6.0.2
nbclassic==0.3.7

bhargavyagnik commented 2 months ago

You can try to install mistral-inference again in this cloud environment: run pip install mistral-inference there. Check which -a pip to see which pip you are actually using; the package might be getting installed into a different environment.
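
On the Python side, a companion check (again just a sketch) is to confirm which interpreter and package directories the notebook kernel is using, and compare that with where pip reports it installs to:

import sys
import site

# Interpreter and environment the current kernel runs in.
print("interpreter:", sys.executable)
print("prefix:", sys.prefix)

# Directories this interpreter searches for installed packages.
print("site-packages:", site.getsitepackages())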

yafangwang9 commented 2 months ago

Hi~ I have investigated the issue: the version of mistral-inference that I installed with pip is lower than required; version 1.3.0 is needed. Thanks very much!
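
For anyone hitting the same error, pip install --upgrade mistral-inference is the straightforward fix. If upgrading is not immediately possible, a fallback import along these lines may work, assuming older releases expose Transformer from mistral_inference.model (the pre-1.3 layout):

# Compatibility sketch: prefer the >=1.3.0 module path, fall back to the
# assumed pre-1.3 location if it is missing.
try:
    from mistral_inference.transformer import Transformer
except ModuleNotFoundError:
    from mistral_inference.model import Transformer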

ShadyPi commented 2 months ago

I ran into the same issue and found that mistral-inference needs poetry to install properly. I verified this on 3 different servers: if you install poetry first and then pip install mistral-inference, it works perfectly.

shulin16 commented 2 months ago

Since the mistral_inference/ package sits under the src/ folder, you can also append that path to the system path:

import sys
sys.path.append('$YOUR_REPO_DIR/src/')
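
Note that appending src/ only helps when running from a cloned copy of the repository; with a pip-installed mistral-inference at version 1.3.0 or newer, the import should work without touching sys.path.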