microsoft / FLAML

A fast library for AutoML and tuning. Join our Discord: https://discord.gg/Cppx2vSPVP.
https://microsoft.github.io/FLAML/
MIT License

Provide initial values to `exitcode`, `exitcode2str` and `logs` #1268

Open marcos-venicius opened 5 months ago

marcos-venicius commented 5 months ago

When using gpt-3.5-turbo-1106, I got the error below:

User_Proxy (to chat_manager):

Find a latest paper about gpt-4 on arxiv and find its potential applications in software

--------------------------------------------------------------------------------

>>>>>>>> USING AUTO REPLY...
Traceback (most recent call last):
  File "/home/marcos_souza/Projects/auto-gen/2/./main.py", line 46, in <module>
    user_proxy.initiate_chat(
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 521, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 324, in send
    recipient.receive(message, self, request_reply, silent)
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 452, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 767, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/groupchat.py", line 118, in run_chat
    reply = speaker.generate_reply(sender=self)
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 767, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/home/marcos_souza/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 635, in generate_code_execution_reply
    return True, f"exitcode: {exitcode} ({exitcode2str})\nCode output: {logs}"
UnboundLocalError: local variable 'exitcode' referenced before assignment

This happens because `logs`, `exitcode` and `exitcode2str` have no initial values. When there are no messages, no code is executed, so these variables are never assigned, and building the reply string fails with the `UnboundLocalError` shown above.
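A minimal sketch of the proposed fix, assuming the reply is built the way the traceback shows. `run_code` and the code-block check here are hypothetical stand-ins, not FLAML's actual API; the point is only that assigning defaults before the loop guarantees the f-string always has values, even when no code runs.

```python
def generate_code_execution_reply(messages):
    """Sketch: build a code-execution reply that tolerates empty input."""
    # Initial values prevent UnboundLocalError when `messages` is empty
    # or contains no code blocks.
    exitcode, exitcode2str, logs = 0, "execution succeeded", ""
    for message in messages:
        if message.startswith("```"):           # naive stand-in for code detection
            exitcode, logs = run_code(message)  # hypothetical executor
            exitcode2str = "execution succeeded" if exitcode == 0 else "execution failed"
    return True, f"exitcode: {exitcode} ({exitcode2str})\nCode output: {logs}"

def run_code(block):
    # Hypothetical executor stub: pretend the code ran and succeeded.
    return 0, "ok"
```

With no messages, the function now returns a well-formed reply instead of raising.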

Why are these changes needed?

Related issue number

Checks

marcos-venicius commented 5 months ago

@marcos-venicius please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.

@microsoft-github-policy-service agree [company="{your company}"]

Options:

  • (default - no company specified) I have sole ownership of intellectual property rights to my Submissions and I am not making Submissions in the course of work for my employer.
@microsoft-github-policy-service agree
  • (when company given) I am making Submissions in the course of work for my employer (or my employer has intellectual property rights in my Submissions by contract or applicable law). I have permission from my employer to make Submissions and enter into this Agreement on behalf of my employer. By signing below, the defined term “You” includes me and my employer.
@microsoft-github-policy-service agree company="Microsoft"

Contributor License Agreement

@microsoft-github-policy-service agree

sonichi commented 5 months ago

Could you move the discussion to https://github.com/microsoft/autogen ? AutoGen has been developed in its own repo since October. And please try the latest version of pyautogen. This error shouldn't happen in my understanding.