threestudio-project / threestudio

A unified framework for 3D content generation.
Apache License 2.0

`pip install -r requirements.txt` takes 10+ minutes re-building wheels #188

Closed · claforte closed this issue 1 year ago

claforte commented 1 year ago

Every time we add a new package to requirements.txt, I run the usual command:

pip install -r requirements.txt

Lately I've noticed that doing so takes tens of minutes to rebuild the wheels for nerfacc, tinycudann, etc., with ptxas consuming 100% CPU for that whole period.

I'm wondering if this problem is specific to our cluster and/or to my use of venv for the virtual environment. Does anyone else have this problem? If so, I can investigate whether we can make that step optional.
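For reference, the slow rebuilds come from the git-based entries in requirements.txt; they look roughly like the lines below (URLs and refs are illustrative, not copied from the actual file; these are the packages whose CUDA extensions ptxas has to recompile):

# git-based dependencies are fetched and built from source on every install
git+https://github.com/NVlabs/tiny-cuda-nn/#subdirectory=bindings/torch
git+https://github.com/KAIR-BAIR/nerfacc.git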

bennyguo commented 1 year ago

Hi! It seems that `pip install git+...` always rebuilds and reinstalls the package, even if it's already installed. I wonder if there's a better way to manage these git-based dependencies.

DSaurus commented 1 year ago

Is it possible for us to prebuild these wheels and upload them to PyPI?

claforte commented 1 year ago

I think I found a reasonably simple solution: specify a commit hash for each of the GitHub packages, so that the wheels that get built are cached and reused. I'm trying it out right now and will create a PR shortly.
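For illustration, pinning a git dependency to a full commit hash would look roughly like this in requirements.txt (the SHA placeholder and URL are examples, not the actual pinned commit):

# pin to an immutable commit so pip can cache and reuse the built wheel
git+https://github.com/NVlabs/tiny-cuda-nn@<full-commit-sha>#subdirectory=bindings/torch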

claforte commented 1 year ago

Turns out there's a much simpler solution:

# Newer pip versions, e.g. pip-23.x, can be much faster than old versions, e.g. pip-20.x
python3 -m pip install --upgrade pip 

The newer version automatically resolves the hash of the latest commit of each git package, caches the built wheel, and reuses it, so there's no need to specify the hashes manually. I'll prepare a PR.
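A minimal sketch of the workflow (the `pip cache` subcommands assume pip >= 20.1 and are just for inspecting the wheel cache):

# upgrade pip so wheels built from git requirements are cached and reused
python3 -m pip install --upgrade pip
# reinstall; previously built wheels (tinycudann, nerfacc, ...) come from the cache
pip install -r requirements.txt
# optional: inspect the wheel cache
pip cache dir
pip cache list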