AbubakrChan / crewai-UI-business-product-launch

Streamlit UI for crewai | crewai ui

Where to define llm model? #1

Closed · PiotrEsse closed this 6 months ago

PiotrEsse commented 8 months ago

Hi, where do I define the LLM model? I can't find a place where we can define the LLM.

AbubakrChan commented 8 months ago

Hi there!

Thank you for creating this issue.

We don't actually need to define the LLM anywhere manually; just make sure you have `OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx` in your environment variables.
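For example, one common way to do that from Python is to load the key from a local file before the app starts (a minimal sketch; the python-dotenv package and the .env filename are assumptions for illustration, not part of this repo's instructions):

# contents of a .env file in the project root (hypothetical example):
# OPENAI_API_KEY=sk-xxxxxxxxxxxxxxxxxxxxxxxx

# assumes the python-dotenv package is installed (pip install python-dotenv)
from dotenv import load_dotenv

# reads .env and copies its entries into os.environ for crewai to pick up
load_dotenv()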

Please, let me know if you have any other questions.

AbubakrChan commented 8 months ago

Also, I've added more detail regarding this issue in the readme.md file.

Thanks

PiotrEsse commented 8 months ago

Thank you. I am using Ollama as a local LLM provider; that's why I'm asking.

AbubakrChan commented 8 months ago

Oh okay, this might help then: https://docs.crewai.com/how-to/LLM-Connections/#crewai-agent-overview

sjdthree commented 7 months ago

In #2, I've added Ollama support. Add these lines at the top of main.py:

# NOTE: to find which model names you have installed, use the CLI tool: `ollama list`
from langchain_openai import ChatOpenAI  # assumes the langchain-openai package is installed

llm = ChatOpenAI(
    model="llama2",
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="NA",  # Ollama ignores the key, but the client requires a non-empty value
)

Then, at the end of each Agent() constructor call, add:

        allow_delegation=True,
        llm=llm,
    )
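For context, a complete Agent() call would then look something like this (a minimal sketch; the role, goal, and backstory values are illustrative placeholders, not taken from this repo):

from crewai import Agent

researcher = Agent(
    role="Market Researcher",                      # illustrative placeholder
    goal="Research the product's target market",   # illustrative placeholder
    backstory="An experienced market analyst.",    # illustrative placeholder
    allow_delegation=True,
    llm=llm,  # the Ollama-backed model defined at the top of main.py
)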

If it doesn't work, make sure you ran `ollama serve` in a terminal. You can also open http://localhost:11434/ in a browser; it should say "Ollama is running" to confirm. Hope it helps.
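If you'd rather check from Python, a quick sanity check might look like this (a minimal sketch assuming Ollama is on its default port):

import urllib.request

# should print "Ollama is running" if the server is up on the default port
with urllib.request.urlopen("http://localhost:11434/") as resp:
    print(resp.read().decode())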