Open gatesn opened 4 years ago
I think the plan is we use Grayskull to update recipes, which will (or does?) contain logic to update requirements. This probably eliminates the need for linting specifically.
cc @ocefpaf @marcelotrevisani (who know more)
Grayskull looks more like what we want, rather than `pip check`, which is helpful but not sufficient.
You can already use grayskull as a package to get the dependencies for projects on PyPI. For example (with grayskull installed):

```python
from grayskull.pypi import PyPi

recipe = PyPi(name="pytest", version="5.4.1")
print(recipe["requirements"]["host"].values)
```
It will print:

```
[
    RecipeItem(position=0, value=pip, selector=),
    RecipeItem(position=1, value=python, selector=),
    RecipeItem(position=2, value=setuptools >=40.0, selector=),
    RecipeItem(position=3, value=setuptools_scm, selector=)
]
```
If I try to print the run requirements, it will be:

```
[
    RecipeItem(position=0, value=atomicwrites >=1.0, selector=win),
    RecipeItem(position=1, value=attrs >=17.4.0, selector=),
    RecipeItem(position=2, value=colorama, selector=win),
    RecipeItem(position=3, value=importlib-metadata >=0.12, selector=py<38),
    RecipeItem(position=4, value=more-itertools >=4.0.0, selector=),
    RecipeItem(position=5, value=packaging, selector=),
    RecipeItem(position=6, value=pathlib2 >=2.2.0, selector=py<36),
    RecipeItem(position=7, value=pluggy <1.0,>=0.12, selector=),
    RecipeItem(position=8, value=py >=1.5.0, selector=),
    RecipeItem(position=9, value=python, selector=),
    RecipeItem(position=10, value=wcwidth, selector=)
]
```
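Given requirement lists like the one above, the missing-dependency check discussed in this issue could be sketched as a simple name comparison. This is a hypothetical illustration, not grayskull's API: the helper names and the plain-string requirement format are assumptions.

```python
# Hypothetical sketch: compare PyPI-derived requirement specs against a
# recipe's run requirements and report anything the recipe is missing.

def dep_name(spec: str) -> str:
    """Extract the normalized package name from a spec like 'attrs >=17.4.0'."""
    return spec.split()[0].lower().replace("_", "-")

def missing_deps(pypi_reqs, recipe_reqs):
    """Return PyPI requirements whose package name is absent from the recipe."""
    recipe_names = {dep_name(r) for r in recipe_reqs}
    return [r for r in pypi_reqs if dep_name(r) not in recipe_names]

pypi_run = ["attrs >=17.4.0", "packaging", "py >=1.5.0", "python"]
recipe_run = ["attrs", "python"]
print(missing_deps(pypi_run, recipe_run))  # ['packaging', 'py >=1.5.0']
```

Real requirement specs would also need selector and marker handling (e.g. `py<38`, `win`), which this sketch ignores.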
I still need to disable the CLI when using grayskull as a module, and there are a lot of improvements we can make. I was also planning for version 1.0.0 to support updating the recipe from the CLI. I am a bit busy these days because of some personal matters, so development is a bit slower than usual for now, sorry. However, I am still developing grayskull and will continue to.
Any contribution is more than welcome :)
Thanks for sharing! This is already great to have 😄
Hope things are going ok 🙂
Since we now have:
`@conda-grayskull show requirements`
this can perhaps be closed as resolved?
Issue: it is common to find dependencies declared in setup.py missing from the Conda recipe. This results in runtime failures that could have been avoided.
Given that 99% of Conda packages share the same name as the corresponding PyPI package, could we lint that meta.yaml includes the same version constraints as setup.py? Perhaps with an explicit comment to opt out if the package name differs or the Conda recipe is known to be more correct.
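The proposed lint could be sketched as a constraint comparison between the two requirement lists. This is a minimal illustration under stated assumptions: requirements from setup.py and meta.yaml are available as plain strings, names map 1:1 between PyPI and Conda, and constraint strings are compared textually rather than semantically. None of the names here belong to any real linter.

```python
# Hypothetical sketch of the proposed lint: flag dependencies that are
# missing from the recipe or carry a different version constraint.
import re

def parse(spec):
    """Split 'pluggy <1.0,>=0.12' into a (name, constraint) pair."""
    m = re.match(r"\s*([A-Za-z0-9._-]+)\s*(.*)", spec)
    return m.group(1).lower().replace("_", "-"), m.group(2).replace(" ", "")

def lint(setup_reqs, recipe_reqs):
    recipe = dict(parse(r) for r in recipe_reqs)
    problems = []
    for r in setup_reqs:
        name, constraint = parse(r)
        if name not in recipe:
            problems.append(f"missing from recipe: {name}")
        elif recipe[name] != constraint:
            problems.append(f"constraint mismatch for {name}: "
                            f"setup.py has {constraint!r}, recipe has {recipe[name]!r}")
    return problems

print(lint(["pluggy <1.0,>=0.12", "attrs >=17.4.0"],
           ["pluggy >=0.12", "attrs >=17.4.0"]))
```

A textual comparison is deliberately strict; a real implementation would likely parse the constraints (e.g. with the `packaging` library) so that equivalent orderings like `>=0.12,<1.0` and `<1.0,>=0.12` compare equal, and would honor an opt-out comment in the recipe.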