Deleting poetry's cache directory (https://python-poetry.org/docs/configuration/#cache-dir) seems to have fixed the issue.
So between the first time I tested conda-lock and now, I did install and run poetry. My guess is that there is some difference between how poetry 1.3 (current version) handles its cache and how the vendored poetry handles it.
In order to avoid issues like this in the future, I think it would be best if the vendored poetry pointed its cache to a different directory. Any thoughts?
Generally, for complex dependencies like deltalake that pull in lots of things, you may want to inspect their requirements manually to see whether any of them are already available on conda.
In this case:
$ grayskull pypi deltalake
...
Build requirements:
<none>
Host requirements:
- python
- pip
Run requirements:
- python
- pyarrow >=7
- typing-extensions # [py<38]
So adding pyarrow as a conda dep would fix your problem.
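As an illustrative sketch only (not your actual spec; the channel, Python version, and pins below are assumptions), the environment file could pull pyarrow from conda and leave only deltalake to pip:

```yaml
# Hypothetical example spec: pyarrow comes from conda so pip only
# needs to resolve deltalake itself.
channels:
  - conda-forge
dependencies:
  - python=3.10
  - pyarrow>=7
  - pip
  - pip:
      - deltalake
```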
Actually, in my original use case I included PyArrow as a conda dep. That did not change anything.
@srilman, I'm trying to get caught up here, and I noticed:
In order to avoid issues like this in the future, I think it would be best if the vendored poetry pointed its cache to a different directory.
If you think this would help, it makes sense to me. I really want to get that other stuff merged, but perhaps we could also think about a PR for this? (Note: maintenance will be much easier if you can avoid touching any of the vendored poetry code.)
@maresb I think we should. I found a couple of other situations where this ended up being the root problem, and I see that someone else had a similar situation too.
Any thoughts on a potential fix? The crude approach is to just modify the POETRY_CACHE_DIR environment variable at the top of the main script, but it might be better if there were some way to pass it in directly.
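For example, the crude version could be just a couple of lines near the top of the entry point, something like the sketch below (the exact placement and cache path are assumptions, and whether the vendored copy actually honors this variable would need to be verified):

```python
# Sketch only: steer the vendored Poetry's cache away from the cache directory
# used by a real Poetry installation, before any vendored Poetry code runs.
# The directory name chosen here is an assumption for illustration.
import os
from pathlib import Path

os.environ.setdefault(
    "POETRY_CACHE_DIR",
    str(Path.home() / ".cache" / "pypoetry-conda-lock"),
)
```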
Ya, it's definitely best to find a way to pass it in directly, if possible. But I suspect that we can't.
After a quick glance, it looks like the cache directory is set in _vendor/poetry/locations.py, and it doesn't look like there's a good way to configure it.
It seems that pypoetry is Poetry's universal prefix, so perhaps the best approach would be to add a substitution in pyproject.toml for "pypoetry" → "pypoetry-conda-lock". Then we should re-vendor Poetry... but unfortunately this is a very involved process which I hope to get to within the next month or two. (There are many subtle details to check, and there are complications because Poetry itself vendors several other packages.) As an interim solution, we could carry out the substitution by hand.
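For what it's worth, a rough sketch of that by-hand interim substitution might look like this (the vendor directory path is an assumption, and it only touches the Python sources):

```python
# Sketch: rewrite the "pypoetry" prefix inside the vendored Poetry sources so
# its cache can no longer collide with a real Poetry installation's cache.
from pathlib import Path

VENDOR_DIR = Path("conda_lock/_vendor/poetry")  # assumed vendoring location

for source_file in VENDOR_DIR.rglob("*.py"):
    text = source_file.read_text()
    if '"pypoetry"' in text:
        # The cache path defined in locations.py would then use the new prefix.
        source_file.write_text(text.replace('"pypoetry"', '"pypoetry-conda-lock"'))
```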
What do you think?
Sounds like a good approach in general. It'd be best to avoid other potential conflicts anyways.
I have noticed that recently, conda-lock seems to hang on my local machine when building a lockfile for a specification that includes pip dependencies. I do not see any hangs when building conda-only lockfiles, and when I tried conda-lock a couple of months ago, I did not see this problem.
For example, I tested building a lockfile for the following environment file:
I ran it using the following command:
I also tested the main branch version by running:
When I tested on a Docker container, building the lockfile took under a minute. When testing on the base machine (which is a Mac with Python 3.10 installed from Homebrew), it was still running even after 10 minutes. This was the last log output: