crewAIInc / crewAI-examples


Trying to use a local LLM #46

Closed: dirvine closed this issue 7 months ago

dirvine commented 7 months ago

In agents.py of the game example, I set up Ollama like this:

```python
from langchain_community.llms import Ollama

llm_general = Ollama(model="mixtral")
```

In each agent I set llm=llm_general.
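
For reference, a minimal sketch of that wiring, assuming the crewai Agent API from around that time (the role/goal/backstory strings below are placeholders, not taken from the example):

```python
from crewai import Agent
from langchain_community.llms import Ollama

llm_general = Ollama(model="mixtral")  # talks to a locally running Ollama server

designer = Agent(
    role="Game Designer",                                # placeholder role
    goal="Design the core gameplay loop",                # placeholder goal
    backstory="A designer with a knack for simple, fun mechanics.",  # placeholder
    llm=llm_general,  # override the default OpenAI model with the local LLM
)
```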

However, I get this error message:


  File "/Users/davidirvine/Documents/Documents - Mac/Devel/crewAI-examples/game-builder-crew/main.py", line 1, in <module>
    from crewai import Crew
  File "/opt/homebrew/anaconda3/lib/python3.11/site-packages/crewai/__init__.py", line 1, in <module>
    from crewai.agent import Agent
  File "/opt/homebrew/anaconda3/lib/python3.11/site-packages/crewai/agent.py", line 9, in <module>
    from pydantic import (
ImportError: cannot import name 'InstanceOf' from 'pydantic' (/opt/homebrew/anaconda3/lib/python3.11/site-packages/pydantic/__init__.cpython-311-darwin.so)```
dirvine commented 7 months ago

Solved with a new conda env. Sorry for the noise.

jaideep11061982 commented 4 days ago

@dirvine do you know how we can call an LLM deployed on a company-internal server? Today I use requests.post.
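
If the internal server exposes an OpenAI-compatible endpoint (as self-hosted stacks like vLLM do, and as Ollama does under /v1), one option is to point a ChatOpenAI client at its base URL instead of hand-rolling requests.post calls. A minimal sketch; the host, port, key, and model name are placeholders:

```python
from langchain_openai import ChatOpenAI

# Placeholder URL and model; substitute your company's internal endpoint.
# Assumes the server speaks the OpenAI chat-completions protocol (e.g. vLLM).
internal_llm = ChatOpenAI(
    base_url="http://llm.internal.example.com:8000/v1",
    api_key="not-needed-internally",  # many internal servers ignore the key
    model="mixtral",
)
```

The resulting object can then be passed to an agent via llm=internal_llm, the same way as the local Ollama instance above.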