qnguyen3 / chat-with-mlx

An all-in-one LLMs Chat UI for Apple Silicon Mac using MLX Framework.
https://twitter.com/stablequan
MIT License

I suppose this should be local only? #53

Open darrenxyli opened 6 months ago

darrenxyli commented 6 months ago
  File "/opt/homebrew/lib/python3.10/site-packages/chat_with_mlx/app.py", line 166, in chatbot
    response = client.chat.completions.create(
  File "/opt/homebrew/lib/python3.10/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "/opt/homebrew/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 663, in create
    return self._post(
  File "/opt/homebrew/lib/python3.10/site-packages/openai/_base_client.py", line 1200, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "/opt/homebrew/lib/python3.10/site-packages/openai/_base_client.py", line 889, in request
    return self._request(
  File "/opt/homebrew/lib/python3.10/site-packages/openai/_base_client.py", line 942, in _request
    return self._retry_request(
  File "/opt/homebrew/lib/python3.10/site-packages/openai/_base_client.py", line 1013, in _retry_request
    return self._request(
  File "/opt/homebrew/lib/python3.10/site-packages/openai/_base_client.py", line 942, in _request
    return self._retry_request(
  File "/opt/homebrew/lib/python3.10/site-packages/openai/_base_client.py", line 1013, in _retry_request
    return self._request(
  File "/opt/homebrew/lib/python3.10/site-packages/openai/_base_client.py", line 952, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
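For context, `openai.APIConnectionError` is raised by the OpenAI Python client when nothing is reachable at the `base_url` it was given. A minimal sketch of the failing call, assuming chat-with-mlx points the client at a locally served MLX model (the host, port, and model name below are illustrative assumptions, not the app's actual configuration):

```python
from openai import OpenAI, APIConnectionError

client = OpenAI(
    base_url="http://127.0.0.1:8080/v1",  # assumed local server endpoint
    api_key="not-needed",                 # local servers typically ignore the key
)

try:
    response = client.chat.completions.create(
        model="local-model",  # hypothetical model identifier
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)
except APIConnectionError:
    # Raised when nothing is listening at base_url, e.g. the local
    # inference server never started or has already exited.
    print("Could not reach the local model server.")
```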
qnguyen3 commented 6 months ago

You need to download the embedding model the first time you use it; after you have downloaded the embedding model + LLM, it can run entirely offline.
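A sketch of that one-time pre-fetch, so later runs can stay offline (the repo IDs are hypothetical placeholders, not the models chat-with-mlx actually ships with):

```python
# One-time download; populates the local Hugging Face cache so that
# subsequent runs need no network access.
from huggingface_hub import snapshot_download

snapshot_download("your-org/your-embedding-model")  # placeholder embedding model
snapshot_download("your-org/your-mlx-llm")          # placeholder LLM weights

# Once the caches are populated, offline mode can be forced with:
#   export HF_HUB_OFFLINE=1
```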

taozhiyuai commented 6 months ago

> You need to download the embedding model the first time you use it; after you have downloaded the embedding model + LLM, it can run entirely offline.

Please save all LLMs and the embedding model under the app directory without symlinks, @qnguyen3.
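For reference, `huggingface_hub` can already materialize real file copies in a chosen directory instead of symlinking into the shared cache, which is what this request amounts to (repo ID and target path below are hypothetical placeholders):

```python
# Download model files as real copies under the app directory,
# bypassing the symlink-based Hugging Face cache layout.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="your-org/your-embedding-model",           # placeholder
    local_dir="./chat_with_mlx/models/embedding",      # placeholder path
    local_dir_use_symlinks=False,  # store real files, not cache symlinks
)
```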