xorbitsai / inference

Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you're empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.
https://inference.readthedocs.io
Apache License 2.0

Download llama_cpp_python-0.2.75.tar.gz failure #1516

Open · Alchemistqqqq opened this issue 1 month ago

Alchemistqqqq commented 1 month ago

Describe the bug

pip install "xinference[all]" , Downloading llama_cpp_python-0.2.75.tar.gz fail

To Reproduce

To help us reproduce this bug, please provide the information below:

  1. Your Python version.
  2. The version of xinference you use.
  3. Versions of crucial packages.
  4. Full stack of the error.
  5. Minimized code to reproduce the error.

Expected behavior

A clear and concise description of what you expected to happen.

Additional context

(screenshots attached) When I downloaded this file, the download failed because of a network timeout, and the download speed was very slow. It is most likely a problem with my campus network, but is there a solution?
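A possible workaround, not mentioned in the original report and assuming PyPI is reachable at all from the campus network, is to raise pip's socket timeout and fetch the failing sdist on its own before running the full install (the ./pkgs directory below is just an arbitrary local folder):

    # retry only the failing package, with a longer timeout, into a local folder
    pip download llama_cpp_python==0.2.75 --no-deps --default-timeout=120 -d ./pkgs

    # install the locally downloaded sdist, then continue with the full install
    pip install ./pkgs/llama_cpp_python-0.2.75.tar.gz
    pip install "xinference[all]"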

codingl2k1 commented 1 month ago

You can try the Tsinghua mirror: https://mirrors.tuna.tsinghua.edu.cn/help/pypi/

pip install -i https://pypi.tuna.tsinghua.edu.cn/simple "xinference[all]"
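If the mirror works, it can also be set as the default index so later pip runs pick it up automatically; this is standard pip configuration, nothing specific to xinference:

    # make the Tsinghua mirror the default index for this user
    pip config set global.index-url https://pypi.tuna.tsinghua.edu.cn/simple

    # subsequent installs use the mirror without the -i flag
    pip install "xinference[all]"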

Alchemistqqqq commented 1 month ago

Collecting vllm-nccl-cu12<2.19,>=2.18 (from vllm<0.4.2,>=0.2.6->xinference[all])
Downloading https://pypi.tuna.tsinghua.edu.cn/packages/41/07/c1be8f4ffdc257646dda26470b803487150c732aa5c9f532dd789f186a54/vllm_nccl_cu12-2.18.1.0.4.0.tar.gz (6.2 kB)

I have tried the method you provided many times, but every time it gets stuck right here, with no progress bar at all. Have you encountered a similar problem?
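One way to narrow this down, offered here as an assumption rather than something confirmed in the thread, is to install just the dependency that hangs, with verbose output and a longer timeout, which shows whether pip is stuck on the download or on the build step:

    # install only the hanging dependency, with verbose logging and a longer timeout
    pip install -v --default-timeout=120 -i https://pypi.tuna.tsinghua.edu.cn/simple "vllm-nccl-cu12>=2.18,<2.19"

Since the sdist itself is only 6.2 kB, a hang at this point may come from the package's build or install step rather than from the index download, in which case switching PyPI mirrors would not help.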

codingl2k1 commented 1 month ago

This download link works fine for me, so your network may have some issues. You can try some other PyPI mirrors, e.g. http://mirrors.aliyun.com/pypi/simple/
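For reference, a one-off install against that mirror could look like the line below; the --trusted-host flag is needed because the URL above is plain http, which pip otherwise rejects as an untrusted index:

    pip install -i http://mirrors.aliyun.com/pypi/simple/ --trusted-host mirrors.aliyun.com "xinference[all]"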