run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Bug]: I got a No module named 'llama_index.core.llms.generic_utils' #14639

Closed · JoseGHdz closed this 2 weeks ago

JoseGHdz commented 2 weeks ago

Bug Description

I wrote my code on my local machine and it works there with no issues. Now I am moving it over to a Google Colab notebook. After installing my packages, I ran my imports and implementation files and got: No module named 'llama_index.core.llms.generic_utils'.

It looks like something is trying to import from a package that might not be installed.
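
For reference, a quick cell like this (just a diagnostic sketch, not part of my project) lists which llama-index packages the Colab runtime actually has:

```python
# Colab cell: list the llama-index distributions actually installed in this runtime.
!pip list 2>/dev/null | grep -i llama-index
```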

Version

llama-index 0.10.4

Steps to Reproduce

import os

!pip install python-dotenv==1.0.0
!pip install requests==2.31.0
!pip install 'llama-index==0.10.4'
!pip install 'openai==1.8.0'
!pip install 'llama-index-agent-openai==0.1.1'
!python3 -m pip install 'llama-index-core==0.10.3'
!python3 -m pip install 'llama-index-embeddings-adapter==0.1.0'
!python3 -m pip install 'llama-index-embeddings-openai==0.1.1'
!python3 -m pip install 'llama-index-finetuning==0.1.0'
!python3 -m pip install 'llama-index-legacy==0.9.48'
!python3 -m pip install 'llama-index-llms-gradient==0.1.0'
!python3 -m pip install 'llama-index-llms-openai==0.1.1'
!python3 -m pip install 'llama-index-multi-modal-llms-openai==0.1.1'
!python3 -m pip install 'llama-index-postprocessor-cohere-rerank==0.1.0'
!python3 -m pip install 'llama-index-program-openai==0.1.1'
!python3 -m pip install 'llama-index-question-gen-openai==0.1.1'
!python3 -m pip install 'llama-index-readers-file==0.1.3'

from dotenv import load_dotenv
load_dotenv()

import uuid
import json
import service_j1_export as j1s
from Modules.service_description_generator import description_generator
from rag import rag_variables


Modules/service_description_generator.py file:

import os
import json
from dotenv import load_dotenv
from pathlib import Path
from llama_index.core import download_loader
from llama_index.core import VectorStoreIndex, ServiceContext
from rag import main

load_dotenv()

Relevant Logs/Tracebacks

ModuleNotFoundError                       Traceback (most recent call last)
<ipython-input-23-853673ddc8a1> in <cell line: 4>()
      2 import json
      3 import service_j1_export as j1s
----> 4 from Modules.service_description_generator import description_generator
      5 from rag import rag_variables

3 frames
/usr/local/lib/python3.10/dist-packages/llama_index/llms/openai/base.py in <module>
     34     llm_completion_callback,
     35 )
---> 36 from llama_index.core.llms.generic_utils import (
     37     achat_to_completion_decorator,
     38     acompletion_to_chat_decorator,

ModuleNotFoundError: No module named 'llama_index.core.llms.generic_utils'
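
A small probe like the following (only a sketch; the second path is an assumption about where newer core releases seem to keep this helper) shows which module location the installed llama-index-core actually exposes:

```python
# Sketch: probe which location of generic_utils the installed llama-index-core provides.
# llama-index-llms-openai 0.1.1 imports the first path; newer core releases appear to
# keep the helper under llama_index.core.base.llms instead (assumption, not verified).
import importlib

for path in ("llama_index.core.llms.generic_utils",
             "llama_index.core.base.llms.generic_utils"):
    try:
        importlib.import_module(path)
        print("available:", path)
    except ImportError:
        print("missing:  ", path)
```
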
logan-markewich commented 2 weeks ago

Need a newer version of llama-index-llms-openai installed
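
Roughly something like this in a Colab cell (a sketch, not a verified fix), followed by a runtime restart:

```python
# Colab cell: pull a newer llama-index-llms-openai instead of pinning 0.1.1.
!pip install -U llama-index-llms-openai
# Possibly also upgrade core if the pinned 0.10.3 is too old (assumption, not verified):
# !pip install -U llama-index-core
# Then restart the Colab runtime so the already-imported modules are reloaded.
```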

JoseGHdz commented 2 weeks ago

> Need a newer version of llama-index-llms-openai installed

Awesome! I'll try that out

JoseGHdz commented 2 weeks ago

> Need a newer version of llama-index-llms-openai installed

I updated to a newer version and got the same error. I also updated the other packages to see if that changed anything, but it did not.

I think the issue is Google Colab itself.

Error:

ModuleNotFoundError                       Traceback (most recent call last)
File /usr/local/lib/python3.11/dist-packages/llama_index/embeddings/openai/utils.py:5
      2 import os
      3 from typing import Any, Callable, Optional, Tuple, Union
----> 5 from llama_index.core.llms.generic_utils import get_from_param_or_env
      6 from tenacity import (
      7     before_sleep_log,
      8     retry,
   (...)
     13     wait_random_exponential,
     14 )
     15 from tenacity.stop import stop_base

ModuleNotFoundError: No module named 'llama_index.core.llms.generic_utils'
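
For what it's worth, a cell like this (just a sketch) prints the versions of the packages involved in the failing import, to confirm the upgrades actually took effect in this runtime:

```python
# Sketch: print the installed versions of the packages involved in the failing import.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("llama-index-core", "llama-index-llms-openai", "llama-index-embeddings-openai"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")
```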

logan-markewich commented 2 weeks ago

Did you restart after updating? I'm unable to replicate. Seems like the issue is with the openai embeddings now, not the LLM (so one bug fixed).

Maybe pip install -U llama-index-embeddings-openai?

In a fresh Colab install, it works for me: https://colab.research.google.com/drive/1uiOF7gr6OeTp-vRnhszKoGQz4X0EbwRf?usp=sharing
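
In other words, something roughly like this unpinned install (a sketch, not the exact contents of the linked notebook):

```python
# Sketch of a fresh, unpinned install: let pip resolve mutually compatible
# llama-index releases instead of the 0.1.x / 0.10.3 pins.
!pip install -U llama-index llama-index-llms-openai llama-index-embeddings-openai
# Restart the Colab runtime afterwards, then re-run the imports.
```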

JoseGHdz commented 2 weeks ago

> Did you restart after updating? I'm unable to replicate. Seems like the issue is with the openai embeddings now, not the LLM (so one bug fixed).
>
> Maybe pip install -U llama-index-embeddings-openai?
>
> In a fresh Colab install, it works for me: https://colab.research.google.com/drive/1uiOF7gr6OeTp-vRnhszKoGQz4X0EbwRf?usp=sharing

Yeah, I restarted after updating, but it kept giving me issues, so I went ahead and opened a session in AWS Cloud9, which seems to be working better. The only issue so far is that it takes very long to load the query engine tools.