claforte closed this issue 1 year ago
Hi! It seems that pip install git+something will always reinstall the package even if it's already installed ... I wonder if there's a better way to manage these git-based dependencies.
Is it possible for us to package these wheels and upload them to PyPI?
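For context, the kind of unpinned git entries being discussed look roughly like this (the repository URLs below are illustrative assumptions, not necessarily the project's exact requirements.txt lines):

```text
# requirements.txt (illustration only; URLs are assumed, not the project's actual entries)
# With a mutable ref (branch/HEAD), pip cannot treat the build as reproducible,
# so it does not reuse a cached wheel and rebuilds the package on every install.
git+https://github.com/nerfstudio-project/nerfacc.git
git+https://github.com/NVlabs/tiny-cuda-nn.git#subdirectory=bindings/torch
```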
I think I found a reasonably simple solution: specify the commit hash for each of the GitHub packages, so that the wheels that get built are cached and reused. I'm trying it out right now and will create a PR shortly.
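A minimal sketch of what that pinning could look like (the URLs and the placeholder commit SHAs are assumptions for illustration, not the actual pins from any PR):

```text
# requirements.txt (illustration only; replace <full-commit-sha> with a real commit)
# An immutable commit hash lets pip cache the wheel it builds and reuse it
# on later runs of `pip install -r requirements.txt`.
nerfacc @ git+https://github.com/nerfstudio-project/nerfacc.git@<full-commit-sha>
tinycudann @ git+https://github.com/NVlabs/tiny-cuda-nn.git@<full-commit-sha>#subdirectory=bindings/torch
```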
Turns out there's a much simpler solution:
```sh
# Newer pip versions, e.g. pip 23.x, can be much faster than older ones, e.g. pip 20.x
python3 -m pip install --upgrade pip
```
The newer version automatically resolves the hash of the latest commit of each git package, caches the built wheel, and reuses it, so there's no need to specify the hash manually. I'll prepare a PR.
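One way to sanity-check that the wheels are actually being cached is pip's cache subcommand, available in recent pip releases (the package name below is just an example):

```sh
# Show where pip keeps its cache and summary stats
python3 -m pip cache dir
python3 -m pip cache info

# List cached wheels matching a name (example package name)
python3 -m pip cache list nerfacc
```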
Every time we add a new package to requirements.txt, I run the usual command:

```sh
pip install -r requirements.txt
```
Lately I've noticed that doing so takes tens of minutes to rebuild the wheels for nerfacc, tinycudann, etc. ptxas seems to be consuming 100% CPU for that whole period. I'm wondering if this problem is specific to our cluster and/or to my use of venv for the virtual environment. Does anyone else have this problem? If so, I can try to investigate whether we can make that step optional.