pypi / support

Issue tracker for support requests related to using https://pypi.org

File Limit Request: vllm - 400 MiB #3792

Status: Closed (youkaichao closed this 2 months ago)

youkaichao commented 3 months ago

Project URL

https://pypi.org/project/vllm/

Does this project already exist?

Yes, it exists.

New Limit

400

Update issue title

Which indexes

PyPI

About the project

vLLM is a fast and easy-to-use library for LLM inference and serving.

The project plans to ship nvidia-nccl-cu12==2.18.3 within the package.

Reasons for the request

We identified a bug in nccl>=2.19 that greatly increases GPU memory overhead, so we have to pin the nccl version and ship it ourselves.

We cannot simply pip install nvidia-nccl-cu12==2.18.3 because we depend on torch, which has a binary dependency on nvidia-nccl-cu12==2.19.5. We are therefore in dependency hell, and we have to bundle an nccl library ourselves.
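For context, a library bundled this way is typically loaded at runtime with ctypes, bypassing pip's dependency resolution entirely. A minimal sketch, assuming a hypothetical libnccl.so path inside the package (this is not vLLM's actual code):

```python
# Minimal sketch (not vLLM's actual code): load a pinned NCCL shared
# library bundled inside the package, instead of the copy that torch
# pulls in via its nvidia-nccl-cu12 dependency.
import ctypes
from pathlib import Path

# Hypothetical location of the bundled library inside the installed wheel.
_BUNDLED_NCCL = Path(__file__).parent / "nccl" / "libnccl.so.2.18.3"

def load_bundled_nccl() -> ctypes.CDLL:
    """Load the pinned NCCL 2.18.3 rather than the system/torch copy."""
    return ctypes.CDLL(str(_BUNDLED_NCCL))
```

Because the .so is shipped as package data rather than installed as a pip dependency, it counts toward the wheel's size, which is what drives this limit request.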

vllm is a popular library for LLM inference, and it is used by many tech companies. Shipping a pinned nccl with vllm improves its throughput and the quality of LLM serving; the downside is that the package wheel becomes much larger. So we have come here for support, to ask for a larger file size limit.

Code of Conduct

I agree to follow the PSF Code of Conduct.

youkaichao commented 3 months ago

bump up πŸ‘€

youkaichao commented 3 months ago

bump up πŸ‘€

mgoin commented 2 months ago

+1, it would be great to have this!

agt commented 2 months ago

From README.md:

> Large (more than 200MiB) upload limits are generally granted for the following reasons:
>
> - project contains large compiled binaries to maintain platform/architecture/GPU support

Project maintainers are having to limit or cut architecture/GPU/format support in order to fit under 100 MB: vllm-project/vllm#4290 vllm-project/vllm#4304

zhuohan123 commented 2 months ago

Kindly cc @cmaureir for visibility. vLLM is the most popular open-source LLM serving engine in the world right now. A larger package limit would help us support more types of hardware and help democratize LLMs for the vast majority of developers.

WoosukKwon commented 2 months ago

Hi @cmaureir, I'm also a maintainer of vLLM. We do make our best effort to keep the binary size small, but it's increasingly difficult to meet the current limit since vLLM is rapidly growing with new features and optimizations that require new GPU kernels (binaries). Increasing the limit would be very helpful for the development of vLLM.

cmaureir commented 2 months ago

Hello @youkaichao :wave: I have set the new upload limit for vllm to 400 MiB, mainly to unlock your release process, but I'm making a note that your project will very probably reach the total project size limit soon because it bundles an additional package. This is neither encouraged nor recommended.

Additionally, I see you publish one wheel per Python version, which heavily increases the total release size. I recommend looking into the Python Limited API in order to provide one wheel per platform: https://docs.python.org/3/c-api/stable.html
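To make the size difference concrete, here is a small illustration (my own sketch, not from the thread; the filenames are made up) of how wheel tags distinguish per-interpreter wheels from a single Limited-API (abi3) wheel:

```python
# My own illustration (filenames are hypothetical): compare the tags of
# a per-interpreter wheel with those of a single Limited-API (abi3) wheel.
from packaging.utils import parse_wheel_filename

per_version = "vllm-0.4.0-cp310-cp310-manylinux1_x86_64.whl"
limited_api = "vllm-0.4.0-cp38-abi3-manylinux1_x86_64.whl"

for filename in (per_version, limited_api):
    _name, _version, _build, tags = parse_wheel_filename(filename)
    print(filename, "->", sorted(str(t) for t in tags))

# The cp310-cp310 wheel installs only on CPython 3.10, whereas the
# cp38-abi3 wheel installs on CPython 3.8 and every later release,
# so one abi3 wheel can replace a whole row of per-version wheels.
```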

Have a nice rest of the week :rocket:

youkaichao commented 2 months ago

@cmaureir thanks for your support! We will try to see if we can build just one wheel for all python versions.

youkaichao commented 2 months ago

@cmaureir is it possible to build one wheel for all supported Python versions when we have extensions? I find the wheel name always contains the Python version, and I'm not sure how to build a Python-agnostic wheel.

youkaichao commented 2 months ago

I did a quick investigation:

To use the Python Limited API and provide one wheel per platform:

  1. add a flag when building the wheel: python3 setup.py bdist_wheel --dist-dir=dist --py-limited-api=cp38
  2. add a macro during compilation: #define Py_LIMITED_API 0x03080000 (or set it via extension arguments, c.f. https://stackoverflow.com/a/69073115/9191338 )
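Putting the two steps together, a minimal setup.py might look like the following sketch (my own illustration with a hypothetical C extension named _ops, not vLLM's actual build script):

```python
# Minimal sketch of steps 1 and 2, assuming a hypothetical C extension
# named "_ops" (not vLLM's actual build setup).
from setuptools import Extension, setup

ext = Extension(
    "_ops",
    sources=["ops.c"],
    # Step 2: compile against the stable ABI of CPython 3.8+.
    define_macros=[("Py_LIMITED_API", "0x03080000")],
    py_limited_api=True,
)

setup(name="example", version="0.1", ext_modules=[ext])
```

Built with the step-1 command (python3 setup.py bdist_wheel --py-limited-api=cp38), this should produce a single cp38-abi3 wheel that installs on CPython 3.8 and all later versions.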

I tried it; however, since we use pybind11, which does not support the Python Limited API (c.f. https://github.com/pybind/pybind11/issues/1755 ), we still have to build one wheel for each Python version.

Sorry for the trouble :(