qnguyen3 / chat-with-mlx

An all-in-one LLMs Chat UI for Apple Silicon Mac using MLX Framework.
https://twitter.com/stablequan
MIT License
1.41k stars 132 forks

I ran it on GitHub Codespaces and got the error shown below #23

Closed younggggger closed 4 months ago

younggggger commented 4 months ago

```
██▉| 4.26G/4.26G [00:56<00:00, 77.3MB/s]

Traceback (most recent call last):
  File "/home/codespace/.local/lib/python3.10/site-packages/httpcore/_exceptions.py", line 10, in map_exceptions
    yield
  File "/home/codespace/.local/lib/python3.10/site-packages/httpcore/_backends/sync.py", line 206, in connect_tcp
    sock = socket.create_connection(
  File "/usr/local/python/3.10.13/lib/python3.10/socket.py", line 845, in create_connection
    raise err
  File "/usr/local/python/3.10.13/lib/python3.10/socket.py", line 833, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused
```

The above exception was the direct cause of the following exception:

```
Traceback (most recent call last):
  File "/home/codespace/.local/lib/python3.10/site-packages/httpx/_transports/default.py", line 67, in map_httpcore_exceptions
    yield
  File "/home/codespace/.local/lib/python3.10/site-packages/httpx/_transports/default.py", line 231, in handle_request
    resp = self._pool.handle_request(req)
  File "/home/codespace/.local/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 268, in handle_request
    raise exc
  File "/home/codespace/.local/lib/python3.10/site-packages/httpcore/_sync/connection_pool.py", line 251, in handle_request
    response = connection.handle_request(request)
  File "/home/codespace/.local/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 99, in handle_request
    raise exc
  File "/home/codespace/.local/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 76, in handle_request
    stream = self._connect(request)
  File "/home/codespace/.local/lib/python3.10/site-packages/httpcore/_sync/connection.py", line 124, in _connect
    stream = self._network_backend.connect_tcp(**kwargs)
  File "/home/codespace/.local/lib/python3.10/site-packages/httpcore/_backends/sync.py", line 205, in connect_tcp
    with map_exceptions(exc_map):
  File "/usr/local/python/3.10.13/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/home/codespace/.local/lib/python3.10/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: [Errno 111] Connection refused
```

younggggger commented 4 months ago

```python
client = OpenAI(api_key='EMPTY', base_url=openai_api_base)
```

Should I replace "EMPTY" with my OpenAI key?

qnguyen3 commented 4 months ago

Hi. This needs to be run natively on a MacBook. Codespaces won't work.

jiangyang118 commented 4 months ago

With `api_key='EMPTY'` I get the following output:

```
You try to use a model that was created with version 2.4.0.dev0, however, your version is 2.4.0. This might cause unexpected behavior or errors. In that case, try to update to the latest version.

0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
Running on local URL: http://127.0.0.1:7860
To create a public link, set `share=True` in `launch()`.
[{'role': 'system', 'content': "A chat between a curious user and an artificial intelligence assistant. The assistant gives helpful, detailed, and polite answers to the user's questions."}, {'role': 'user', 'content': '测'}]
/chat/completions
Traceback (most recent call last):
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/queueing.py", line 495, in call_prediction
    output = await route_utils.call_process_api(
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/route_utils.py", line 235, in call_process_api
    output = await app.get_blocks().process_api(
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/blocks.py", line 1627, in process_api
    result = await self.call_function(
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/blocks.py", line 1185, in call_function
    prediction = await utils.async_iteration(iterator)
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/utils.py", line 514, in async_iteration
    return await iterator.__anext__()
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/utils.py", line 640, in asyncgen_wrapper
    response = await iterator.__anext__()
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/chat_interface.py", line 490, in _stream_fn
    first_response = await async_iteration(generator)
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/utils.py", line 514, in async_iteration
    return await iterator.__anext__()
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/utils.py", line 507, in __anext__
    return await anyio.to_thread.run_sync(
  File "/Users/jack/Library/Python/3.11/lib/python/site-packages/anyio/to_thread.py", line 31, in run_sync
    return await get_asynclib().run_sync_in_worker_thread(
  File "/Users/jack/Library/Python/3.11/lib/python/site-packages/anyio/_backends/_asyncio.py", line 937, in run_sync_in_worker_thread
    return await future
  File "/Users/jack/Library/Python/3.11/lib/python/site-packages/anyio/_backends/_asyncio.py", line 867, in run
    result = context.run(func, *args)
  File "/opt/homebrew/lib/python3.11/site-packages/gradio/utils.py", line 490, in run_sync_iterator_async
    return next(iterator)
  File "/Users/jack/code/099-github/chat-with-mlx/chat_with_mlx/app.py", line 168, in chatbot
    response = client.chat.completions.create(
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "/opt/homebrew/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 663, in create
    return self._post(
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 1201, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 889, in request
    return self._request(
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 965, in _request
    return self._retry_request(
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _retry_request
    return self._request(
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 965, in _request
    return self._retry_request(
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _retry_request
    return self._request(
  File "/opt/homebrew/lib/python3.11/site-packages/openai/_base_client.py", line 980, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.InternalServerError: Error code: 502
```
jiangyang118 commented 4 months ago

> Hi. This needs to be run natively on a MacBook. Codespaces won't work.

The errors above occurred at runtime, and this is on a Mac with an M1 chip. Do I need to change `openai_api_base` and the `api_key`?
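Before changing `openai_api_base` or the key, it may help to confirm that the local model server is actually accepting connections: both the `Connection refused` and the 502 errors are consistent with nothing (or a broken process) listening on the port the client targets. A minimal stdlib sketch for that check, assuming a hypothetical port (substitute whatever `openai_api_base` points at):

```python
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical port -- replace with the port from openai_api_base.
print(port_open("127.0.0.1", 8080))
```

If this prints `False`, the Gradio UI is up but the backing server is not, so any request the OpenAI client makes will fail regardless of what the key is set to.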