Closed: ryan-serpico closed this issue 1 year ago
Same issue here on macOS Sonoma, M2 Max MBP. I'm not using Chroma, but something similar, as my requirements.txt is:
openai~=0.27.2
pydantic~=1.8.2
pydantic[dotenv]
uvicorn~=0.17.6
fastapi~=0.78.0
pandas~=1.4.2
tqdm~=4.64.0
requests>=2.27.1
beautifulsoup4~=4.11.1
transformers~=4.19.1
numpy~=1.24.1
scipy~=1.8.0
sentence_transformers==2.2.2
pyyaml~=6.0
scikit-learn~=1.1.0
unidecode~=1.3.4
tiktoken~=0.3.3
streamlit~=1.20.0
trafilatura>=1.6.0
backoff~=2.2.1
pydub~=0.25.1
simple_salesforce~=1.12.5
selenium~=4.10.0
and I'm getting:
84 INFO: Use pytorch device: cpu
Insert data to database: 100%|██████████| 25/25 [00:00<00:00, 485451.85it/s]
Batches: 0%| | 0/1 [00:00<?, ?it/s]
xxxx-api-service-1 exited with code 139
What's interesting is that I re-cloned my repo and the container worked once, but any attempt to use it again leads to the same exit code.
@Yemeen Whew, OK, I thought I was the only one experiencing this. So your script was functional before the Sonoma update as well?
Not immediately. I've been on the Sonoma beta for about 6 weeks now, so I'm not sure why it started now.
@Yemeen I figured it out last night, at least for me. I added torch==2.0.1 to my requirements.txt, and boom, everything works. It seems like whatever PyTorch changed in the update they released on Oct. 4 broke everything on M1 chips.
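For anyone else hitting this, the change amounts to one extra line in requirements.txt so that pip stops resolving to the newest PyTorch wheel (presumably the 2.1.0 release from Oct. 4). A minimal sketch of the relevant lines, assuming the rest of the file stays as posted above:

# pin PyTorch explicitly instead of letting sentence_transformers pull in the latest release
torch==2.0.1
sentence_transformers==2.2.2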
You're a lifesaver!!
Closing the issue, as this was solved by pinning PyTorch.
Amazing! Had the same issue, and pinning to 2.0.1 solved it for me.
Description
Hey y'all,
Ever since upgrading my M1 Pro MacBook Pro to macOS Sonoma, I haven't been able to generate any embeddings using Chroma in my Docker container. Every time I run
docker-compose up --build embedding-test
with the basic code below, the container exits with code 139 (128 + SIGSEGV, i.e. a segmentation fault). What I don't understand is that I can run this same Docker container on another Mac running Monterey and on an EC2 server without any issue. I can also get the same script running on my main machine if I run it outside of a Docker container, in a simple venv.
This is only the second GitHub issue I've ever submitted (the first being earlier this morning), so apologies in advance if this posting is misplaced or incomplete. I'm also attaching my Dockerfile. If I can supply any other information that would help, let me know. I would appreciate any help on this issue.
I believe this may have something to do with https://github.com/docker/for-mac/issues/7006
embedding_test.py
Dockerfile
requirements.txt
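For context, since embedding_test.py is attached rather than inlined, here is a minimal sketch of the kind of script that exercises this path. It is an assumption for illustration, not the author's actual code: the collection name, documents, and the all-MiniLM-L6-v2 model name are placeholders, though the model choice matches the sentence-transformers downloads shown under Additional Info.

# embedding_test.py (hypothetical stand-in, not the attached script)
import chromadb
from chromadb.utils import embedding_functions

# sentence-transformers embedding function; model name assumed for illustration
ef = embedding_functions.SentenceTransformerEmbeddingFunction(model_name="all-MiniLM-L6-v2")

client = chromadb.Client()  # in-memory client, enough to trigger embedding generation
collection = client.create_collection(name="embedding-test", embedding_function=ef)

# a handful of toy documents to embed and insert
docs = [f"document number {i}" for i in range(25)]
collection.add(ids=[str(i) for i in range(25)], documents=docs)

print(collection.query(query_texts=["a test query"], n_results=3))

Run inside the container, the crash reportedly happens while the embeddings are being computed, before the query returns.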
Versions
Chroma v0.4.13, macOS 14.0 Sonoma, Docker Desktop v4.24.0, python:3.11.4-slim-bookworm image
Reproduce
docker-compose up --build embedding-test
Expected behavior
docker-compose up --build embedding-test
runs and generates embeddings successfully.
docker version
docker info
Diagnostics ID
78FCD04C-5FB1-457D-97C2-F7F899584238/20231006183318
Additional Info
docker logs 8c38e53b2186
Downloading (…)a20e8/.gitattributes: 100%|██████████| 1.48k/1.48k [00:00<00:00, 16.0MB/s]
Downloading (…)_Pooling/config.json: 100%|██████████| 200/200 [00:00<00:00, 2.78MB/s]
Downloading (…)16616a20e8/README.md: 100%|██████████| 67.6k/67.6k [00:00<00:00, 12.3MB/s]
Downloading (…)616a20e8/config.json: 100%|██████████| 650/650 [00:00<00:00, 10.2MB/s]
Downloading model.safetensors: 100%|██████████| 438M/438M [00:51<00:00, 8.58MB/s]
Downloading (…)0e8/onnx/config.json: 100%|██████████| 632/632 [00:00<00:00, 929kB/s]
Downloading model.onnx: 100%|██████████| 436M/436M [00:50<00:00, 8.55MB/s]
Downloading (…)cial_tokens_map.json: 100%|██████████| 125/125 [00:00<00:00, 223kB/s]
Downloading (…)/onnx/tokenizer.json: 100%|██████████| 711k/711k [00:00<00:00, 3.54MB/s]
Downloading (…)okenizer_config.json: 100%|██████████| 314/314 [00:00<00:00, 1.29MB/s]
Downloading (…)a20e8/onnx/vocab.txt: 100%|██████████| 232k/232k [00:00<00:00, 3.20MB/s]
Downloading pytorch_model.bin: 100%|██████████| 438M/438M [00:51<00:00, 8.56MB/s]
Downloading (…)nce_bert_config.json: 100%|██████████| 57.0/57.0 [00:00<00:00, 74.2kB/s]
Downloading (…)cial_tokens_map.json: 100%|██████████| 125/125 [00:00<00:00, 579kB/s]
Downloading (…)a20e8/tokenizer.json: 100%|██████████| 711k/711k [00:00<00:00, 9.27MB/s]
Downloading (…)okenizer_config.json: 100%|██████████| 314/314 [00:00<00:00, 1.38MB/s]
Downloading (…)16616a20e8/vocab.txt: 100%|██████████| 232k/232k [00:00<00:00, 3.39MB/s]
Downloading (…)16a20e8/modules.json: 100%|██████████| 387/387 [00:00<00:00, 1.87MB/s]