run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: OpenAI broken in 0.10.6 #10982

Closed mw19930312 closed 9 months ago

mw19930312 commented 9 months ago

Bug Description

I'm following the Knowledge Graph RAG query engine example at https://docs.llamaindex.ai/en/stable/examples/query_engine/knowledge_graph_rag_query_engine.html

However, I cannot even run from llama_index.llms.openai import OpenAI because it fails with ImportError: cannot import name 'ChatMessage' from 'llama_index.core.llms' (unknown location)

Version

0.10.6

Steps to Reproduce

%pip install llama-index-llms-azure-openai
%pip install llama-index-graph-stores-nebula
%pip install llama-index-llms-openai
%pip install llama-index-embeddings-azure-openai
pip install llama-index==0.10.6

Code

import os

os.environ["OPENAI_API_KEY"] = "sk-"

import logging
import sys

logging.basicConfig(
    stream=sys.stdout, level=logging.INFO
)  # logging.DEBUG for more verbose output

from llama_index.llms.openai import OpenAI
from llama_index.core import Settings

Settings.llm = OpenAI(temperature=0, model="gpt-3.5-turbo")
Settings.chunk_size = 512

Relevant Logs/Tracebacks

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
Cell In[3], line 16
     10 logging.basicConfig(
     11     stream=sys.stdout, level=logging.INFO
     12 )  # logging.DEBUG for more verbose output
     15 # define LLM
---> 16 from llama_index.llms.openai import OpenAI
     17 from llama_index.core import Settings
     19 Settings.llm = OpenAI(temperature=0, model="gpt-3.5-turbo")

File /opt/homebrew/lib/python3.11/site-packages/llama_index/llms/openai/__init__.py:1
----> 1 from llama_index.llms.openai.base import AsyncOpenAI, OpenAI, SyncOpenAI, Tokenizer
      3 __all__ = ["OpenAI", "Tokenizer", "SyncOpenAI", "AsyncOpenAI"]

File /opt/homebrew/lib/python3.11/site-packages/llama_index/llms/openai/base.py:28
     16 from llama_index.core.base.llms.types import (
     17     ChatMessage,
     18     ChatResponse,
   (...)
     25     MessageRole,
     26 )
     27 from llama_index.core.bridge.pydantic import Field, PrivateAttr
---> 28 from llama_index.core.callbacks import CallbackManager
     29 from llama_index.core.constants import (
     30     DEFAULT_TEMPERATURE,
     31 )
     32 from llama_index.core.llms.callbacks import (
     33     llm_chat_callback,
     34     llm_completion_callback,
     35 )

File /opt/homebrew/lib/python3.11/site-packages/llama_index/core/callbacks/__init__.py:4
      2 from .llama_debug import LlamaDebugHandler
      3 from .schema import CBEvent, CBEventType, EventPayload
----> 4 from .token_counting import TokenCountingHandler
      5 from .utils import trace_method
      7 __all__ = [
      8     "CallbackManager",
      9     "CBEvent",
   (...)
     14     "trace_method",
     15 ]

File /opt/homebrew/lib/python3.11/site-packages/llama_index/core/callbacks/token_counting.py:6
      4 from llama_index.core.callbacks.base_handler import BaseCallbackHandler
      5 from llama_index.core.callbacks.schema import CBEventType, EventPayload
----> 6 from llama_index.core.utilities.token_counting import TokenCounter
      7 from llama_index.core.utils import get_tokenizer
     10 @dataclass
     11 class TokenCountingEvent:

File /opt/homebrew/lib/python3.11/site-packages/llama_index/core/utilities/token_counting.py:6
      1 # Modified from:
      2 # https://github.com/nyno-ai/openai-token-counter
      4 from typing import Any, Callable, Dict, List, Optional
----> 6 from llama_index.core.llms import ChatMessage, MessageRole
      7 from llama_index.core.utils import get_tokenizer
     10 class TokenCounter:

ImportError: cannot import name 'ChatMessage' from 'llama_index.core.llms' (unknown location)

logan-markewich commented 9 months ago

@mw19930312 please start from a fresh venv. Any remnants of the previous install will cause some issues due to the change to namespaced packages

In a new terminal:

pip uninstall llama-index  # just to confirm it's not installed globally
python -m venv venv
source venv/bin/activate
pip install llama-index ....
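For reference, here is a minimal diagnostic sketch (standard library only, not an official llama-index utility; it just assumes the 0.10.x namespaced-package layout) that looks for stray llama_index directories left behind by an older install, which is the usual cause of the "(unknown location)" error:

# Hypothetical check: run inside the environment that raises the ImportError.
# It prints every llama_index directory visible on sys.path and whether it
# actually contains core/llms/__init__.py. Multiple hits, or a hit without
# that file, suggest remnants of a pre-0.10 install shadowing the new
# namespaced packages.
import sys
import pathlib

for entry in sys.path:
    candidate = pathlib.Path(entry) / "llama_index"
    if candidate.is_dir():
        has_llms = (candidate / "core" / "llms" / "__init__.py").exists()
        print(f"{candidate}  core.llms present: {has_llms}")

A fresh venv sidesteps all of this because there is nothing left over to shadow the new packages.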
juicesharp commented 9 months ago

Same issue here. After migrating to 0.10.6, the issue persists in a new environment as well:

    from .token_counting import TokenCountingHandler
  File "/opt/homebrew/lib/python3.11/site-packages/llama_index/core/callbacks/token_counting.py", line 6, in <module>
    from llama_index.core.utilities.token_counting import TokenCounter
  File "/opt/homebrew/lib/python3.11/site-packages/llama_index/core/utilities/token_counting.py", line 6, in <module>
    from llama_index.core.llms import ChatMessage, MessageRole
ImportError: cannot import name 'ChatMessage' from 'llama_index.core.llms' (unknown location)

P.S. What helped was uninstalling llama-index-core from both the (base) and the specific env in conda, then reinstalling inside the new env.

logan-markewich commented 9 months ago

@juicesharp try creating a new venv, as I described above :) It should help. For example, this all works in a fresh Google Colab, which indicates that only some env tweaking/setup is needed.

mw19930312 commented 9 months ago

@logan-markewich Creating a new venv works. Thanks!

cmosguy commented 9 months ago

I did a fresh install in my conda env and I am still seeing this error:

ImportError: cannot import name 'BasePromptTemplate' from partially initialized module 'llama_index.core.prompts' (most likely due to a circular import) (/opt/miniconda3/envs/proj/lib/python3.10/site-packages/llama_index/core/prompts/__init__.py)

logan-markewich commented 9 months ago

@cmosguy yeah, that's a slightly different issue. Update all the llama-index-* packages you have installed and it should work.
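If it helps, a small sketch (standard library only, assuming Python 3.8+ for importlib.metadata; my own addition, not an official tool) to list every installed llama-index-* distribution and its version, so a stale or mismatched package is easy to spot before upgrading:

# Hypothetical helper: print name==version for every installed llama-index-* package.
from importlib.metadata import distributions

for dist in sorted(distributions(), key=lambda d: (d.metadata["Name"] or "").lower()):
    name = dist.metadata["Name"] or ""
    if name.startswith("llama-index"):
        print(f"{name}=={dist.version}")

Anything that prints an older version here is a candidate for an upgrade.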

juicesharp commented 9 months ago

@cmosguy I fixed that as well today on another machine ... the issue was that llama-index-core was version 0.10.8 instead of 0.10.7. Run pip list and check the version, then remove llama-index-core and install 0.10.7 specifically.

harshit0209 commented 7 months ago

I also had the same issue; it was resolved by following https://docs.llamaindex.ai/en/stable/getting_started/starter_example/
Looks like there has been some change with the recent updates.