Closed mw19930312 closed 9 months ago
@mw19930312 please start from a fresh venv. Any remnants of the previous install will cause some issues due to the change to namespaced packages
In a new terminal:

```shell
pip uninstall llama-index  # just to confirm it's not installed globally
python -m venv venv
source venv/bin/activate
pip install llama-index ....
```
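If the error persists after reinstalling, it can help to check which path Python actually resolves `llama_index.core` from: with the move to namespaced packages, a stale copy in another site-packages directory (e.g. a global Homebrew or conda install) can shadow the fresh venv install. A minimal diagnostic sketch (`module_location` is a hypothetical helper, not part of llama-index):

```python
import importlib.util

def module_location(name):
    """Return the file path Python would import `name` from, or None if unresolved."""
    try:
        spec = importlib.util.find_spec(name)
    except ModuleNotFoundError:  # parent package missing entirely
        return None
    return getattr(spec, "origin", None) if spec else None

# A path outside your venv's site-packages (e.g. under /opt/homebrew/...)
# suggests a stale remnant is shadowing the fresh install.
print(module_location("llama_index.core"))
```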
Same issue here; after migrating to 0.10.6 the issue persists in the new environment as well.

```
    from .token_counting import TokenCountingHandler
  File "/opt/homebrew/lib/python3.11/site-packages/llama_index/core/callbacks/token_counting.py", line 6, in <module>
    from llama_index.core.utilities.token_counting import TokenCounter
  File "/opt/homebrew/lib/python3.11/site-packages/llama_index/core/utilities/token_counting.py", line 6, in <module>
    from llama_index.core.llms import ChatMessage, MessageRole
ImportError: cannot import name 'ChatMessage' from 'llama_index.core.llms' (unknown location)
```
P.S. It helped to uninstall llama-index-core from both the (base) and the specific env in conda, and to reinstall inside the new env.
@juicesharp try creating a new venv, as I described above :) Should help. For example, this all works in a fresh google colab, which indicates just some env tweaking/setup is needed
@logan-markewich Creating a new venv works. Thanks!
I did a fresh install in my conda env and I am still seeing this error:
ImportError: cannot import name 'BasePromptTemplate' from partially initialized module 'llama_index.core.prompts' (most likely due to a circular import) (/opt/miniconda3/envs/proj/lib/python3.10/site-packages/llama_index/core/prompts/__init__.py)
@cmosguy yea that's a slightly different issue. Update all llama-index-* packages you have installed and it should be working
@cmosguy I fixed that as well today on another machine ... the issue was that llama-index-core was at version 0.10.8 instead of 0.10.7. Run pip list and check the version, then remove llama-index-core and install 0.10.7 specifically.
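Along those lines, one way to verify the pinned version before running anything is a quick check with the standard library's importlib.metadata (a sketch; `core_version_ok` is a hypothetical helper, and the expected version is whatever your other llama-index-* packages require):

```python
from importlib.metadata import version, PackageNotFoundError

def core_version_ok(expected, pkg="llama-index-core"):
    """Return True only if `pkg` is installed at exactly the expected version."""
    try:
        return version(pkg) == expected
    except PackageNotFoundError:
        return False

# e.g. check core_version_ok("0.10.7") at the top of a notebook
# before importing anything from llama_index.
```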
I also had the same issue; it got resolved by following https://docs.llamaindex.ai/en/stable/getting_started/starter_example/
Looks like something changed with the updates.
Bug Description
I'm following the example provided in graph search engine. https://docs.llamaindex.ai/en/stable/examples/query_engine/knowledge_graph_rag_query_engine.html
However, I cannot even run
from llama_index.llms.openai import OpenAI
because I'm encountering the error: ImportError: cannot import name 'ChatMessage' from 'llama_index.core.llms' (unknown location)
Version
0.10.6
Steps to Reproduce
```shell
%pip install llama-index-llms-azure-openai
%pip install llama-index-graph-stores-nebula
%pip install llama-index-llms-openai
%pip install llama-index-embeddings-azure-openai
pip install llama-index==0.10.6
```
Code
```python
import os

os.environ["OPENAI_API_KEY"] = "sk-"

import logging
import sys

logging.basicConfig(
    stream=sys.stdout, level=logging.INFO
)  # logging.DEBUG for more verbose output

from llama_index.llms.openai import OpenAI
from llama_index.core import Settings

Settings.llm = OpenAI(temperature=0, model="gpt-3.5-turbo")
Settings.chunk_size = 512
```
Relevant Logs/Tracebacks