run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Question]: ModuleNotFoundError: No module named 'llama_index.core.llms.function_calling' #15999

Open li-hhhh opened 5 days ago

li-hhhh commented 5 days ago

Question Validation

Question

When using Ollama, an error is raised saying there is no llama_index.core.llms.function_calling module.

Code:

    from llama_index.llms.ollama import Ollama

    llm = Ollama(model="llama3.1:latest", request_timeout=60.0)

    response = llm.complete("hello?")
    print(response)

Error:

    Traceback (most recent call last):
      File "/home/huidao/KnowledgeProject/pythonProject/Trimming/Trimming_Embedding/test_codes/test_ollama.py", line 1, in <module>
        from llama_index.llms.ollama import Ollama
      File "/home/huidao/KnowledgeProject/pythonProject/.venv/lib/python3.12/site-packages/llama_index/llms/ollama/__init__.py", line 1, in <module>
        from llama_index.llms.ollama.base import Ollama
      File "/home/huidao/KnowledgeProject/pythonProject/.venv/lib/python3.12/site-packages/llama_index/llms/ollama/base.py", line 18, in <module>
        from llama_index.core.llms.function_calling import FunctionCallingLLM
    ModuleNotFoundError: No module named 'llama_index.core.llms.function_calling'

According to the official documentation, the code should run normally after pip install llama-index-llms-ollama. I followed the documented steps exactly and still get this error, so how should this problem be solved?

dosubot[bot] commented 5 days ago

I found a similar issue related to the ModuleNotFoundError: No module named 'llama_index.core.llms.function_calling' when using the Ollama module in llama_index. Here are the steps to resolve this issue:

  1. Check Installation: Ensure that you have installed the llama_index package correctly. You can reinstall it using pip:

         pip install llama-index

  2. Verify Module Path: Ensure that the module path llama_index.core.llms.function_calling is correct and exists in your installed package. You can check the package contents in your environment.

  3. Update Package: If the module path is correct and the issue persists, try updating the llama_index package to the latest version:

         pip install --upgrade llama-index

  4. Check for Typos: Ensure there are no typos in the import statement. The correct import statement should be:

         from llama_index.core.llms.function_calling import FunctionCallingLLM
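As a quick way to check steps 1–3 programmatically, here is a minimal diagnostic sketch using only the standard library (the package and module names are taken from the traceback above; this is an illustrative helper, not part of llama_index itself):

```python
import importlib.metadata
import importlib.util

def check_packages(packages):
    """Return a mapping of distribution name -> installed version (or None if missing)."""
    versions = {}
    for pkg in packages:
        try:
            versions[pkg] = importlib.metadata.version(pkg)
        except importlib.metadata.PackageNotFoundError:
            versions[pkg] = None
    return versions

def module_resolves(module_name):
    """True if the import system can locate the module without importing it."""
    try:
        return importlib.util.find_spec(module_name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package of the dotted path is not importable
        return False

# Names taken from the traceback in this issue
print(check_packages(["llama-index-core", "llama-index-llms-ollama"]))
print(module_resolves("llama_index.core.llms.function_calling"))
```

If the module resolves but the versions are far apart, upgrading llama-index-core (step 3) is the likely fix.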

If the above steps do not resolve the issue, it is possible that there is a problem with the package itself or its version compatibility. In such a case, you may need to check the package documentation or contact the package maintainers for further assistance [1].

Additionally, ensure that you have the specific dependencies required for the llama_index.llms.ollama module. These dependencies are documented in the pyproject.toml file under [tool.poetry.dependencies] [2][3].
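For reference, a [tool.poetry.dependencies] section for an integration package like this typically has the following shape (the version pins below are illustrative placeholders; check the package's own pyproject.toml for the real constraints):

```toml
[tool.poetry.dependencies]
# Hypothetical pins -- consult the published pyproject.toml for actual values
python = ">=3.8.1,<4.0"
ollama = ">=0.3.0"
llama-index-core = "^0.11.0"
```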


logan-markewich commented 4 days ago

@li-hhhh What version of llama-index do you have? As dosu is suggesting, make sure you have the latest version of llama-index-core.