Chainlit / cookbook

Chainlit's cookbook repo
https://github.com/Chainlit/chainlit

Chainlit Autogen example with open-source LLM Qwen: openai.PermissionDeniedError #65

Open · Haxeebraja opened this issue 6 months ago

Haxeebraja commented 6 months ago

I have started the Qwen API locally using openai_api.py: https://github.com/QwenLM/Qwen/blob/main/openai_api.py

I tried both examples in: https://github.com/Chainlit/cookbook/tree/main/pyautogen

The following works fine using OpenAIWrapper, but initiate_chat with agents, as in the examples, does not:

```python
config_list = [
    {
        "model": "Qwen",
        "base_url": "http://localhost:8787/v1",
        "api_key": "NULL",
    }
]

client = OpenAIWrapper(config_list=config_list)
response = client.create(messages=[{"role": "user", "content": "2+9="}])
print(client.extract_text_or_completion_object(response))
```
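
For context, the failing agent-based flow looks roughly like this (a minimal sketch following the cookbook pattern, not the exact app5.py code; the agent names, human_input_mode, and message are illustrative):

```python
from autogen import AssistantAgent, UserProxyAgent

# Same config_list as above (illustrative).
config_list = [
    {
        "model": "Qwen",
        "base_url": "http://localhost:8787/v1",
        "api_key": "NULL",
    }
]

# The config_list is handed to the assistant via llm_config.
assistant = AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
)
user_proxy = UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    code_execution_config=False,
)

# In the reported setup, this call ends with openai.PermissionDeniedError
# (see traceback below).
user_proxy.initiate_chat(assistant, message="hello.")
```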

Error:

```
openai.PermissionDeniedError: <!DOCTYPE html>
<html xmlns="http://www.w3.org/1999/xhtml">
<body>
</body>
</html>
```

```
2024-01-21 10:35:13 - Loaded .env file
2024-01-21 10:35:15 - Your app is available at http://localhost:8081
user_proxy (to assistant):

hello.

2024-01-21 10:35:18 - <!DOCTYPE html>

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/chainlit/utils.py", line 39, in wrapper
    return await user_function(**params_values)
  File "/workspace/chainlit-autogen/app5.py", line 108, in on_chat_start
    await cl.make_async(user_proxy.initiate_chat)(
  File "/usr/local/lib/python3.10/dist-packages/asyncer/_main.py", line 358, in wrapper
    return await anyio.to_thread.run_sync(
  File "/usr/local/lib/python3.10/dist-packages/anyio/to_thread.py", line 33, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 877, in run_sync_in_worker_thread
    return await future
  File "/usr/lib/python3.10/asyncio/futures.py", line 285, in __await__
    yield self  # This tells Task to wait for completion.
  File "/usr/lib/python3.10/asyncio/tasks.py", line 304, in __wakeup
    future.result()
  File "/usr/lib/python3.10/asyncio/futures.py", line 201, in result
    raise self._exception.with_traceback(self._exception_tb)
  File "/usr/local/lib/python3.10/dist-packages/anyio/_backends/_asyncio.py", line 807, in run
    result = context.run(func, *args)
  File "/usr/local/lib/python3.10/dist-packages/autogen/agentchat/conversable_agent.py", line 672, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "/workspace/chainlit-autogen/app5.py", line 86, in send
    super(ChainlitUserProxyAgent, self).send(
  File "/usr/local/lib/python3.10/dist-packages/autogen/agentchat/conversable_agent.py", line 420, in send
    recipient.receive(message, self, request_reply, silent)
  File "/usr/local/lib/python3.10/dist-packages/autogen/agentchat/conversable_agent.py", line 578, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "/usr/local/lib/python3.10/dist-packages/autogen/agentchat/conversable_agent.py", line 1241, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/usr/local/lib/python3.10/dist-packages/autogen/agentchat/conversable_agent.py", line 761, in generate_oai_reply
    response = client.create(
  File "/usr/local/lib/python3.10/dist-packages/autogen/oai/client.py", line 266, in create
    response = self._completions_create(client, params)
  File "/usr/local/lib/python3.10/dist-packages/autogen/oai/client.py", line 531, in _completions_create
    response = completions.create(**params)
  File "/usr/local/lib/python3.10/dist-packages/openai/_utils/_utils.py", line 271, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/openai/resources/chat/completions.py", line 648, in create
    return self._post(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 1167, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 856, in request
    return self._request(
  File "/usr/local/lib/python3.10/dist-packages/openai/_base_client.py", line 947, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.PermissionDeniedError: <!DOCTYPE html>
```

willydouhard commented 6 months ago

This looks like an OpenAI error. From the openai documentation:

| Error | Cause / Solution |
| -- | -- |
| PermissionDeniedError | Cause: You don't have access to the requested resource. Solution: Ensure you are using the correct API key, organization ID, and resource ID. |
Haxeebraja commented 6 months ago

> This looks like an OpenAI error. From the openai documentation:
>
> | Error | Cause / Solution |
> | -- | -- |
> | PermissionDeniedError | Cause: You don't have access to the requested resource. Solution: Ensure you are using the correct API key, organization ID, and resource ID. |

Yes, but why does OpenAIWrapper work, as shared above, while initiate_chat with agents (as in the examples) does not? I am using the same config_list for both.
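
One way to narrow this down (a suggestion not from the thread; the port, model name, and api_key below simply mirror the config_list above) is to call the local endpoint directly with the openai client that autogen uses internally, and compare its behavior with the working OpenAIWrapper call:

```python
from openai import OpenAI

# Hit the local Qwen endpoint directly, bypassing autogen's agent layer.
client = OpenAI(base_url="http://localhost:8787/v1", api_key="NULL")

try:
    response = client.chat.completions.create(
        model="Qwen",
        messages=[{"role": "user", "content": "2+9="}],
    )
    print(response.choices[0].message.content)
except Exception as exc:
    # A PermissionDeniedError here would point at the endpoint or a proxy
    # in front of it rather than at the agent code.
    print(type(exc).__name__, exc)
```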