I'm trying to execute the code below.
import os
from autogen import AssistantAgent, UserProxyAgent, ConversableAgent
from dotenv import load_dotenv
load_dotenv()
model = "gpt-3.5-turbo"
llm_config = {"model": model, "api_key": os.environ.get("OPENAI_API_KEY")}

agent = ConversableAgent(
    llm_config=llm_config,
    name="chatbot",
    code_execution_config=False,
    human_input_mode="NEVER",
)
response = agent.generate_reply(messages=[{"role": "user", "content": "Tell me a funny joke"}])
print(response)
I'm getting the error below. It was working fine earlier. Any idea how to fix this?
self._throttle_api_calls(i)
File "D:\UserData\z004kxab\05_Learning\autogen\venv1\Lib\site-packages\autogen\oai\client.py", line 1072, in _throttle_api_calls
if self._rate_limiters[idx]:
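For reference, here is a minimal sketch of the same setup written with the config_list form of llm_config (as I understand the documented autogen configuration format; the model, API key, and agent options are copied from the snippet above, and I have not confirmed whether it changes the throttling behaviour):

import os

from autogen import ConversableAgent
from dotenv import load_dotenv

load_dotenv()

# Same model and API key as above, wrapped in a config_list entry.
# This is a comparison sketch, not a confirmed fix.
llm_config = {
    "config_list": [
        {
            "model": "gpt-3.5-turbo",
            "api_key": os.environ.get("OPENAI_API_KEY"),
        }
    ]
}

agent = ConversableAgent(
    name="chatbot",
    llm_config=llm_config,
    code_execution_config=False,
    human_input_mode="NEVER",
)

response = agent.generate_reply(
    messages=[{"role": "user", "content": "Tell me a funny joke"}]
)
print(response)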