jina-ai / clip-as-service

🏄 Scalable embedding, reasoning, ranking for images and sentences with CLIP
https://clip-as-service.jina.ai

fix: install pytorch cu116 for server docker image #882

Closed · ZiniuYu closed this 1 year ago

ZiniuYu commented 1 year ago

This PR fixes the error in the linked issue.
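
For context, the likely shape of the fix is installing PyTorch from the CUDA 11.6 (cu116) wheel index inside the server Dockerfile, e.g. `pip install torch --extra-index-url https://download.pytorch.org/whl/cu116`, so the torch build matches the CUDA runtime shipped in the image; the exact Dockerfile change is not shown in this thread. A minimal Python-side check of the built image, with the version strings below being illustrative assumptions, might look like:

```python
# Illustrative post-build check (not part of this PR): confirm that the torch
# wheel inside the image was built against CUDA 11.6, i.e. comes from the
# "cu116" wheel index rather than a default build.
import torch

ok = "cu116" in torch.__version__ or torch.version.cuda == "11.6"
assert ok, f"unexpected torch build: {torch.__version__} (CUDA {torch.version.cuda})"
print("torch", torch.__version__, "built for CUDA", torch.version.cuda)
```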

jemmyshin commented 1 year ago

Please rebuild the image and reply to the community.

codecov[bot] commented 1 year ago

Codecov Report

Merging #882 (a3d8796) into main (0b293ec) will decrease coverage by 11.27%. The diff coverage is 100.00%.

```diff
@@             Coverage Diff             @@
##             main     #882       +/-   ##
===========================================
- Coverage   83.06%   71.78%   -11.28%
===========================================
  Files          22       22
  Lines        1529     1531        +2
===========================================
- Hits         1270     1099      -171
- Misses        259      432      +173
```

| Flag | Coverage Δ |
|------|------------|
| cas | 71.78% <100.00%> (-11.28%) ⬇️ |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage Δ |
|----------------|------------|
| server/clip_server/model/model.py | 75.37% <100.00%> (+0.18%) ⬆️ |
| server/clip_server/executors/clip_tensorrt.py | 0.00% <0.00%> (-94.60%) ⬇️ |
| server/clip_server/model/clip_trt.py | 0.00% <0.00%> (-85.72%) ⬇️ |
| server/clip_server/model/trt_utils.py | 0.00% <0.00%> (-83.52%) ⬇️ |
| server/clip_server/executors/clip_onnx.py | 87.91% <0.00%> (+1.09%) ⬆️ |
| server/clip_server/model/clip_onnx.py | 87.30% <0.00%> (+22.22%) ⬆️ |


numb3r3 commented 1 year ago

@ZiniuYu Does the hub executor also suffer from this issue?

ZiniuYu commented 1 year ago

> @ZiniuYu Does the hub executor also suffer from this issue?

I just tested both the Torch and ONNX executors; they are both fine.
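
For reference, a minimal sketch of the kind of executor smoke test described above, assuming a clip_server instance (Torch or ONNX flow) is already listening on the default gRPC port 51000; the address and the 512-dimensional output of the default ViT-B/32 model are assumptions taken from the project docs, not from this thread:

```python
# Rough smoke test: encode a couple of sentences against a running clip_server
# and check that embeddings come back with the expected shape.
from clip_client import Client

client = Client("grpc://0.0.0.0:51000")  # assumed default server address
embeddings = client.encode(["hello world", "a photo of a cat"])
print(embeddings.shape)  # expect (2, 512) for the default ViT-B/32 model
```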

numb3r3 commented 1 year ago

> I just tested both the Torch and ONNX executors; they are both fine.

Then I doubt whether this issue really comes from CUDA. I would not recommend pinning the CUDA version; our work should run on all stable CUDA versions.

numb3r3 commented 1 year ago

So, please try to make it work with cuda-11.6.0 rather than freezing the CUDA version.
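
As a rough illustration of "works on any stable CUDA version" rather than a pinned one, a runtime check inside the container could exercise the GPU directly instead of asserting a specific version string; this is a sketch, not part of the PR:

```python
# Minimal GPU smoke test: run a tiny op on the GPU to verify that the torch
# build and whatever CUDA runtime the base image ships actually work together.
import torch

print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    x = torch.ones(2, 2, device="cuda")
    print("device:", torch.cuda.get_device_name(0), "sum:", x.sum().item())
```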