run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: Breaking changes makes it hard to work with llama_index #10747

Closed jobergum closed 8 months ago

jobergum commented 8 months ago

Bug Description

A few months ago we wrote a notebook demonstrating how to connect Vespa with LlamaIndex, and lately it started to fail because there has been a major rewrite of the project.

I tried following the migration guide/tooling to update the example by running

pip3.11 install -U llama-index llama-index-core

and then running the converter

llamaindex-cli upgrade-file scaling-personal-ai-assistants-with-streaming-mode-cloud.ipynb
Traceback (most recent call last):
  File "/opt/homebrew/bin/llamaindex-cli", line 5, in <module>
    from llama_index.core.command_line.command_line import main
  File "/opt/homebrew/lib/python3.11/site-packages/llama_index/core/command_line/command_line.py", line 4, in <module>
    from llama_index.core.command_line.rag import RagCLI, default_ragcli_persist_dir
  File "/opt/homebrew/lib/python3.11/site-packages/llama_index/core/command_line/rag.py", line 9, in <module>
    from llama_index.core import (
ImportError: cannot import name 'Response' from 'llama_index.core' (unknown location)

Help with this would be appreciated.
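For context, the upgrade tool's job is essentially a mechanical import rewrite. A minimal sketch of that idea (the mapping entries below are illustrative assumptions, not the CLI's actual table):

```python
# Hypothetical sketch (not the actual llamaindex-cli implementation) of the
# kind of rewrite the upgrade tool performs: map old monolithic imports to
# the new v0.10 namespaced packages.
REWRITES = {
    "from llama_index import": "from llama_index.core import",
    "from llama_index.llms import OpenAI": "from llama_index.llms.openai import OpenAI",
}

def upgrade_line(line: str) -> str:
    # Apply the first matching prefix rewrite; leave other lines untouched.
    for old, new in REWRITES.items():
        if line.startswith(old):
            return line.replace(old, new, 1)
    return line

print(upgrade_line("from llama_index import VectorStoreIndex"))
```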

pip3.11 show llama-index llama-index-core
Name: llama-index
Version: 0.10.4
Summary: Interface between LLMs and your data
Home-page: https://llamaindex.ai
Author: Jerry Liu
Author-email: jerry@llamaindex.ai
License: MIT
Location: /opt/homebrew/lib/python3.11/site-packages
Requires: llama-index-agent-openai, llama-index-core, llama-index-embeddings-openai, llama-index-legacy, llama-index-llms-openai, llama-index-multi-modal-llms-openai, llama-index-program-openai, llama-index-question-gen-openai, llama-index-readers-file
Required-by: RAGatouille
---
Name: llama-index-core
Version: 0.10.3
Summary: Interface between LLMs and your data
Home-page: https://llamaindex.ai
Author: Jerry Liu
Author-email: jerry@llamaindex.ai
License: MIT
Location: /opt/homebrew/lib/python3.11/site-packages
Requires: aiohttp, dataclasses-json, deprecated, dirtyjson, fsspec, httpx, nest-asyncio, networkx, nltk, numpy, openai, pandas, pillow, PyYAML, requests, SQLAlchemy, tenacity, tiktoken, tqdm, typing-extensions, typing-inspect
Required-by: llama-index, llama-index-agent-openai, llama-index-embeddings-openai, llama-index-llms-openai, llama-index-multi-modal-llms-openai, llama-index-program-openai, llama-index-question-gen-openai, llama-index-readers-file

Version

0.10.4

Steps to Reproduce

https://github.com/vespa-engine/pyvespa/blob/master/docs/sphinx/source/examples/scaling-personal-ai-assistants-with-streaming-mode-cloud.ipynb

Relevant Logs/Tracebacks

No response

dosubot[bot] commented 8 months ago

🤖

Hey @jobergum, great to see you back here! Hope you're doing well.

Based on the traceback you provided, it seems like the issue is with the import statement for the 'Response' class. In the recent versions of LlamaIndex, the import statement for the 'Response' class has been changed. You should update your import statement to:

from llama_index.core.response.schema import Response

This change was discussed in a similar issue: [Bug]: cannot import name 'Response' from 'llama_index.core' (unknown location) when running upgrade script.

If the issue persists after correcting the import statement, it might be due to conflicts with other packages or leftover packages from previous installations. A fully fresh install seemed to work for another user who faced a similar issue. You can try uninstalling and then reinstalling the packages:

pip3.11 uninstall llama-index llama-index-core
pip3.11 install llama-index llama-index-core
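Because the new layout splits the project into many llama-index-* sub-packages, uninstalling only the two top-level names can leave stale remnants behind. A hedged cleanup sketch, assuming a POSIX shell and that every relevant package name starts with the llama prefix:

```shell
# Hypothetical cleanup: collect the names of every installed llama-* package,
# uninstall them all, then reinstall fresh. Guarded so it is a no-op when
# pip3.11 is not on PATH.
if command -v pip3.11 >/dev/null 2>&1; then
  pkgs=$(pip3.11 list --format=freeze 2>/dev/null | grep '^llama' | cut -d= -f1)
  for p in $pkgs; do pip3.11 uninstall -y "$p"; done
  pip3.11 install llama-index llama-index-core
fi
```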

Also, please ensure that you have the correct versions of other packages installed. There was an issue with version conflicts between 'index-storage-index-store-mongodb' and 'llama-index-storage-docstore-mongodb' which has been fixed now.

After resolving the installation issues, you may need to refactor some parts of your code. There was a side note in the same issue regarding the 'StreamingResponse' class not being transformed correctly.

If you're still facing issues, could you please provide more details about your environment? Specifically, the Python version, operating system, and any other relevant details would be helpful.


About Dosu

This response is meant to be useful and save you time. It is **not meant to be a precise solution**, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
logan-markewich commented 8 months ago

@jobergum typically import errors like this are solved by starting with a fresh venv. I see you are installing packages globally, which is generally not recommended for Python projects.

For example, I would do

python3 -m venv venv
source venv/bin/activate
pip install llama-index

(Or you could use poetry or conda for env management as well)

Due to switching to namespaced packages, any remnants of the previous llama-index install will cause an issue.
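One way to check what the current environment actually resolves, without assuming llama-index is installed at all, is a small diagnostic sketch (not an official tool; the labels are illustrative):

```python
# Hypothetical diagnostic: probe which llama-index layout (if any) the
# current Python environment resolves, without importing the package eagerly.
import importlib.util

def detect_layout() -> str:
    if importlib.util.find_spec("llama_index") is None:
        return "llama-index is not installed"
    try:
        core = importlib.util.find_spec("llama_index.core")
    except ModuleNotFoundError:
        core = None
    if core is not None:
        return "v0.10+ namespaced layout"
    return "pre-0.10 monolithic layout (leftover files may conflict)"

print(detect_layout())
```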

Thank you for understanding 🙏🏻

jobergum commented 8 months ago

The core problem here is all the breaking changes for a single simple abstraction (retriever). The blog posts are outdated, notebooks are outdated (we at least notice that the notebook started to fail). Pinning to a specific 0.x version also is not very attractive.

logan-markewich commented 8 months ago

@jobergum the choice was made to refactor the package into several namespaced packages. Yes, this is breaking, and I realize it's frustrating, but going forward it is:

a) more scalable (maintaining 400+ integrations in a single package is not the way to go)
b) sane dependency management (since each integration can declare its deps vs. hidden imports)
c) smaller installations
d) proper unit tests for each integration

The notebooks in our repo have been updated. If you notice ones that don't work please do let me know, and I can update them.

Did the above venv recommendation help?

logan-markewich commented 8 months ago

Also, if you need complete backwards compatibility, you can replace llama_index with llama_index.legacy in every import (which will point to a legacy version pinned at v0.9.48)

i.e.

from llama_index.legacy import VectorStoreIndex, SimpleDirectoryReader
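For a larger codebase, that rename can be done mechanically. A hypothetical helper (not part of llamaindex-cli) that rewrites import lines, assuming old code never references the legacy or core namespaces:

```python
# Hypothetical helper: rewrite "llama_index" to "llama_index.legacy" in
# import statements only, leaving llama_index.legacy / llama_index.core
# references untouched.
import re

def to_legacy(source: str) -> str:
    return re.sub(
        r"^(\s*(?:from|import)\s+)llama_index(?!\.(?:legacy|core))\b",
        r"\1llama_index.legacy",
        source,
        flags=re.MULTILINE,
    )

print(to_legacy("from llama_index import VectorStoreIndex, SimpleDirectoryReader"))
```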
jobergum commented 8 months ago

The notebooks in our repo have been updated.

Yeah, but not ours. Also, this is the second breakage of the retriever abstraction since we published the notebook. I changed to the legacy imports and will consider removing the integration entirely.


logan-markewich commented 8 months ago

@jobergum we've provided all the tools we can to help others migrate. Keep in mind this library is pre-v1.0 and has been around for less than a year.

This was a monumental task to refactor, and I wouldn't expect something like this to happen again, if I have anything to say about it. This lays the groundwork for a more scalable and maintainable project.

As always, we are around to help with any migrations or other issues you encounter 👍🏻