Run this code using llama-index-llms-mistralai version 0.1.5:
from llama_index.core import Document, VectorStoreIndex, Settings
from llama_index.core.llms import ChatMessage
from llama_index.llms.mistralai import MistralAI
# LLM setup
llm = MistralAI(model="mistral-tiny")
Settings.llm = llm
# Index
document = Document.example()
index = VectorStoreIndex([document])
engine = index.as_chat_engine(chat_mode="context")
# Test case
query = "tell me a joke"
response = engine.chat(message=query)
print(response)
Expected: the script completes and prints a response.
Observed: the script does not run; it fails with the following error:
Traceback (most recent call last):
File "c:\projects\mistralexample\issue_mistralai.py", line 18, in <module>
response = engine.chat(message=query)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\my_username\AppData\Local\Programs\Python\Python312\Lib\site-packages\llama_index\core\callbacks\utils.py", line 41, in wrapper
return func(self, *args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\my_username\AppData\Local\Programs\Python\Python312\Lib\site-packages\llama_index\core\chat_engine\context.py", line 172, in chat
chat_response = self._llm.chat(all_messages)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\my_username\AppData\Local\Programs\Python\Python312\Lib\site-packages\llama_index\core\llms\callbacks.py", line 93, in wrapped_llm_chat
f_return_val = f(_self, messages, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "c:\Users\my_username\AppData\Local\Programs\Python\Python312\Lib\site-packages\llama_index\llms\mistralai\base.py", line 176, in chat
from mistralai.client import ChatMessage as mistral_chatmessage
ImportError: cannot import name 'ChatMessage' from 'mistralai.client' (c:\Users\my_username\AppData\Local\Programs\Python\Python312\Lib\site-packages\mistralai\client.py)
Bug Description
After upgrading llama-index-llms-mistralai to 0.1.5, the simplest chat example for Mistral models no longer works.

Version
llama-index-core 0.10.19, llama-index-llms-mistralai 0.1.5
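The versions listed above can be double-checked from Python using only the standard library (a diagnostic sketch; the package names are taken from this report):

```python
from importlib import metadata

def installed_version(package: str) -> str:
    """Return the installed version of a distribution, or 'not installed'."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return "not installed"

# Print the versions relevant to this report.
for pkg in ("llama-index-core", "llama-index-llms-mistralai", "mistralai"):
    print(pkg, installed_version(pkg))
```

This is equivalent to grepping the output of pip list, but can be pasted into the failing script to capture the exact environment it runs in.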
Steps to Reproduce
Run the code at the top of this report using llama-index-llms-mistralai version 0.1.5.
Expected: the script completes and prints output.
Observed: the script does not run; it fails with the ImportError shown above.
Output of pip list:

Relevant Logs/Tracebacks
See the traceback above.
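The traceback shows the import of ChatMessage from mistralai.client failing, which suggests the class moved within the mistralai package. A stdlib-only probe can report where (if anywhere) ChatMessage is importable in the installed mistralai; the second candidate path below is an assumption about where newer mistralai releases may expose it, not something taken from the traceback:

```python
import importlib
import importlib.util

# Candidate module paths: the first appears in the traceback; the second is a
# guess at where newer mistralai releases may have moved ChatMessage.
CANDIDATES = ("mistralai.client", "mistralai.models.chat_completion")

def chat_message_location():
    """Return the first candidate module exposing ChatMessage, or None."""
    if importlib.util.find_spec("mistralai") is None:
        return None  # mistralai is not installed at all
    for mod_name in CANDIDATES:
        try:
            mod = importlib.import_module(mod_name)
        except ImportError:
            continue
        if hasattr(mod, "ChatMessage"):
            return mod_name
    return None

print(chat_message_location())
```

If this prints a module other than mistralai.client, the hard-coded import in llama_index/llms/mistralai/base.py is out of date for the installed mistralai version.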