crewAIInc / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com
MIT License

API Connection Error :- CrewAgentExecutor chain does not generate any output #1050

Closed pravincoder closed 1 month ago

pravincoder commented 1 month ago

I just built a small Stock Analysis and Investment Bot. It was working fine, but now I only see empty chains. Output: error1

Then I used LangSmith to inspect the underlying crew and LLM calls, and found this error:

APIConnectionError('Connection error.')

Traceback (most recent call last):

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpx\_transports\default.py", line 69, in map_httpcore_exceptions
    yield

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpx\_transports\default.py", line 233, in handle_request
    resp = self._pool.handle_request(req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpcore\_sync\connection_pool.py", line 216, in handle_request
    raise exc from None

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpcore\_sync\connection_pool.py", line 196, in handle_request
    response = connection.handle_request(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpcore\_sync\connection.py", line 99, in handle_request
    raise exc

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpcore\_sync\connection.py", line 76, in handle_request
    stream = self._connect(request)
             ^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpcore\_sync\connection.py", line 122, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpcore\_backends\sync.py", line 205, in connect_tcp
    with map_exceptions(exc_map):

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpcore\_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc

httpcore.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it

The above exception was the direct cause of the following exception:

Traceback (most recent call last):

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\openai\_base_client.py", line 978, in _request
    response = self._client.send(
               ^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpx\_client.py", line 914, in send
    response = self._send_handling_auth(
               ^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpx\_client.py", line 942, in _send_handling_auth
    response = self._send_handling_redirects(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpx\_client.py", line 979, in _send_handling_redirects
    response = self._send_single_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpx\_client.py", line 1015, in _send_single_request
    response = transport.handle_request(request)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpx\_transports\default.py", line 232, in handle_request
    with map_httpcore_exceptions():

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\contextlib.py", line 158, in __exit__
    self.gen.throw(typ, value, traceback)

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\httpx\_transports\default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc

httpx.ConnectError: [WinError 10061] No connection could be made because the target machine actively refused it

The above exception was the direct cause of the following exception:

Traceback (most recent call last):

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain\chains\base.py", line 156, in invoke
    self._call(inputs, run_manager=run_manager)

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\crewai\agents\executor.py", line 70, in _call
    next_step_output = self._take_next_step(
                       ^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain\agents\agent.py", line 1318, in _take_next_step
    [

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain\agents\agent.py", line 1318, in <listcomp>
    [

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\crewai\agents\executor.py", line 134, in _iter_next_step
    output = self.agent.plan(  # type: ignore #  Incompatible types in assignment (expression has type "AgentAction | AgentFinish | list[AgentAction]", variable has type "AgentAction")
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain\agents\agent.py", line 463, in plan
    for chunk in self.runnable.stream(inputs, config={"callbacks": callbacks}):

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain_core\runnables\base.py", line 3253, in stream
    yield from self.transform(iter([input]), config, **kwargs)

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain_core\runnables\base.py", line 3240, in transform
    yield from self._transform_stream_with_config(

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain_core\runnables\base.py", line 2053, in _transform_stream_with_config
    chunk: Output = context.run(next, iterator)  # type: ignore
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain_core\runnables\base.py", line 3202, in _transform
    for output in final_pipeline:

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain_core\runnables\base.py", line 1271, in transform
    for ichunk in input:

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain_core\runnables\base.py", line 5267, in transform
    yield from self.bound.transform(

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain_core\runnables\base.py", line 1289, in transform
    yield from self.stream(final, config, **kwargs)

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain_core\language_models\chat_models.py", line 373, in stream
    raise e

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain_core\language_models\chat_models.py", line 353, in stream
    for chunk in self._stream(messages, stop=stop, **kwargs):

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\langchain_openai\chat_models\base.py", line 480, in _stream
    with self.client.create(messages=message_dicts, **params) as response:
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\openai\_utils\_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\openai\resources\chat\completions.py", line 646, in create
    return self._post(
           ^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\openai\_base_client.py", line 1266, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\openai\_base_client.py", line 942, in request
    return self._request(
           ^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\openai\_base_client.py", line 1002, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\openai\_base_client.py", line 1079, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\openai\_base_client.py", line 1002, in _request
    return self._retry_request(
           ^^^^^^^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\openai\_base_client.py", line 1079, in _retry_request
    return self._request(
           ^^^^^^^^^^^^^^

  File "C:\Users\WIN10\anaconda3\envs\Crew_env\Lib\site-packages\openai\_base_client.py", line 1012, in _request
    raise APIConnectionError(request=request) from err

openai.APIConnectionError: Connection error.

For this project I am using the Ollama (mistral) model; the usage code is below:

llm = ChatOpenAI(
    model='mistral:latest',
    base_url='http://localhost:11434/v1',
    api_key='NA'
)
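WinError 10061 ("the target machine actively refused it") usually means nothing is listening on localhost:11434, i.e. the Ollama server is not running. A quick way to check connectivity before kicking off the crew (a minimal sketch; the host and port assume Ollama's default endpoint):

```python
import socket

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Covers refused connections, timeouts, and DNS failures.
        return False

# Ollama's default endpoint; adjust if you changed OLLAMA_HOST.
if not is_port_open("127.0.0.1", 11434):
    print("Ollama is not reachable; start it with `ollama serve` "
          "before running crew.kickoff().")
```

If the port is closed, starting the server (`ollama serve`) or fixing a firewall/proxy rule is the first thing to try.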

I also tested it with Groq, and the output was exactly the same: the chains weren't generating any output.

lorenzejay commented 1 month ago

It's in the error message. Possibly you are using crew_output.result; we introduced a breaking change there. Remove .result or use .raw instead.
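The breaking change in one line, using a hypothetical stand-in for crewai's CrewOutput class (a sketch only; the real class is defined in crewai and carries more fields):

```python
from dataclasses import dataclass

@dataclass
class CrewOutput:
    """Hypothetical stand-in for the object returned by crew.kickoff()."""
    raw: str  # the final textual output of the crew

output = CrewOutput(raw="final analysis text")
print(output.raw)        # new accessor: works
# print(output.result)   # old accessor: removed, raises AttributeError
```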

pravincoder commented 1 month ago

@lorenzejay

I am just using crew.kickoff() in my code. Here is my main.py file. Note: this file does not include the Task, Agent, and tool definitions, as each lives in a separate file.

main.py

import logging
from dotenv import load_dotenv
from crewai import Crew
from task import Stock_bot
from agents import Stock_bot_agents

def main():
    load_dotenv()

    print("#### -----WELCOME TO STOCK ANALYSIS BOT -----####")
    print("------_____________________________________------")
    print("Enter the stock name you want to analyze:")

    stock = input()
    print(f"Stock name entered: {stock}")

    tasks = Stock_bot()
    agent = Stock_bot_agents()

    # Create agents
    stock_analysis = agent.stock_anaylsis()
    investment_analysis = agent.investment_analysis()

    # Create tasks
    stock_analysis_task = tasks.stock_analysis(stock_analysis, stock)
    investment_analysis_task = tasks.investment_analysis(investment_analysis, stock)

    # Create and run the crew
    print("Creating Crew instance >>>>")
    crew = Crew(
        agents=[
            stock_analysis,
            investment_analysis
        ],
        tasks=[
            stock_analysis_task,
            investment_analysis_task
        ],
    )

    result = None
    try:
        result = crew.kickoff()
        logging.info("Crew kickoff executed successfully.")
        print(f"Result: {result}")
    except Exception as e:
        logging.error(f"Error during crew kickoff: {e}")

    # Save the result to a text file (only if kickoff produced one,
    # otherwise `result` would be undefined here)
    if result is not None:
        try:
            with open('result.txt', 'w') as file:
                file.write(str(result))
            logging.info("Result saved to result.txt successfully.")
        except Exception as e:
            logging.error(f"Error saving result to file: {e}")

if __name__ == '__main__':
    main()

The error message mentions "result" only because I use a variable named 'result' to save the output generated by the crew, not because I call crew_output.result.