crewAIInc / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com

[BUG] Deepseek Integration #1359

Closed. angelarias2014 closed this issue 6 days ago.

angelarias2014 commented 1 week ago

Description

Hi, has anybody managed to connect CrewAI with DeepSeek?

I'm trying with the LangChain and LiteLLM components and cannot connect to this model.

I updated to the new version 0.6.3.1 with the LLM class, but when I configure it with DeepSeek it shows an error and cannot connect to the model.

Steps to Reproduce

The code example:

# ...
from crewai import LLM
# ...

# Model integration
deepseek_llm = LLM(
    model="deepseek-chat",
    temperature=1.3,
    max_tokens=150,
    base_url="https://api.deepseek.com",
    api_key=DEEPSEEK_API_KEY
)
# ...
agent = Agent(
    role=role,
    goal=goal,
    backstory=backstory,
    verbose=True,
    allow_delegation=True,
    tools=selected_tools,
    llm=deepseek_llm
)

Expected behavior

The configuration shown in the docs:

from crewai import Agent, LLM

llm = LLM(
    model="gpt-4",
    temperature=0.7,
    base_url="https://api.openai.com/v1",
    api_key="your-api-key-here"
)

agent = Agent(
    role='Customized LLM Expert',
    goal='Provide tailored responses',
    backstory="An AI assistant with custom LLM settings.",
    llm=llm
)

In my configuration:

from crewai import Agent, LLM

deepseek_llm = LLM(
    model="deepseek-chat",
    temperature=1.3,
    max_tokens=150,
    base_url="https://api.deepseek.com",
    api_key=DEEPSEEK_API_KEY
)

agent = Agent(
    role=role,
    goal=goal,
    backstory=backstory,
    verbose=True,
    allow_delegation=True,
    tools=selected_tools,
    llm=deepseek_llm
)

Screenshots/Code snippets

error output:

Provider List: https://docs.litellm.ai/docs/providers
(printed six times)

error in console:

2024-09-26 17:23:29,861 - 140617412761152 - llm.py-llm:104 - ERROR: Failed to get supported params: argument of type 'NoneType' is not iterable
(the same error is logged six times, with timestamps from 17:23:29,861 to 17:23:30,182)

Operating System

Ubuntu 20.04

Python Version

3.10

crewAI Version

0.6.3.1

crewAI Tools Version

-

Virtual Environment

Venv

Evidence

Same console error log as shown above.

Possible Solution

None

Additional context

I've tried several configurations to connect CrewAI to this DeepSeek model.

I've tried with the LangChain library and with the LiteLLM library, but it always shows me an error, either from them or from the OpenAI library.

Can someone please help me?

Thank you.

joaomdmoura commented 6 days ago

Version 0.64.0 is out and fixes this.

from crewai import LLM
from crewai_tools import FileReadTool  # import for the FileReadTool used below
#...
return Agent(
    config=self.agents_config['researcher'],
    tools=[FileReadTool()],
    llm=LLM(model="deepseek/deepseek-chat"),
    verbose=True
)
#...

I also set the DEEPSEEK_API_KEY environment variable.
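
For reference, a minimal self-contained sketch of that setup (my own illustration, assuming CrewAI >= 0.64.0, DEEPSEEK_API_KEY exported in the shell, and made-up role/goal/task strings):

# minimal sketch, not the snippet from the thread
import os
from crewai import Agent, Task, Crew, LLM

# LiteLLM picks the key up from the environment; normally you would export it
# in the shell instead of setting it in code.
os.environ.setdefault("DEEPSEEK_API_KEY", "your-api-key-here")

deepseek_llm = LLM(
    model="deepseek/deepseek-chat",  # provider-prefixed name so LiteLLM can resolve DeepSeek
    temperature=1.3,
    max_tokens=150,
)

researcher = Agent(
    role="Researcher",
    goal="Answer questions concisely",
    backstory="An assistant backed by DeepSeek.",
    llm=deepseek_llm,
    verbose=True,
)

task = Task(
    description="Say hello and name the model you are running on.",
    expected_output="A short greeting.",
    agent=researcher,
)

Crew(agents=[researcher], tasks=[task]).kickoff()

The deepseek/ prefix appears to be the key difference from the original configuration: it lets LiteLLM map the model name to its provider, which is consistent with the bare deepseek-chat name failing the provider lookup and producing the "argument of type 'NoneType' is not iterable" errors above.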

joaomdmoura commented 6 days ago

This was a great catch btw, sorry about the inconvenience 😅

angelarias2014 commented 6 days ago

Thank you very much, Joao. You are doing an incredible job with CrewAI.