cnstt closed this issue 5 months ago
I'm not an expert at all, but I just activated dependabot on my plugin repository and it detected 3 updates for the GitHub Actions (and I had only just freshly cloned the plugin template). For python packages you might indeed be right; I just activated it and will see if it is useful. But you are probably right that for hard-pinned dependencies, such dependency updates might just break things...
Have a look into it. If you can find a configuration option that restricts the scope to GitHub Actions deps or development dependencies (but probably not python deps), that could be interesting.
That's totally possible! In your configuration file you need to specify which "package-ecosystem" you want to update. You can specify only "github-actions", for example.
@tlambert03 Here is the PartSeg configuration, where dependabot covers the GitHub Actions and the PyInstaller requirements (for a reproducible bundle build). By removing the first entry you get only GitHub Actions.
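For reference, a minimal sketch of what such a two-entry config might look like in `.github/dependabot.yml` (illustrative only, not the actual PartSeg file; the requirements directory and the weekly schedule are assumptions):

```yaml
# .github/dependabot.yml
version: 2
updates:
  # Hard-pinned Python requirements used for the PyInstaller bundle
  # (the directory path here is an assumption)
  - package-ecosystem: "pip"
    directory: "/requirements"
    schedule:
      interval: "weekly"
  # Keep the GitHub Actions referenced in workflow files up to date
  - package-ecosystem: "github-actions"
    directory: "/"
    schedule:
      interval: "weekly"
```

Dropping the `pip` entry leaves only the GitHub Actions updates.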
what do you think @Czaki, should we add something here? other @napari/core-devs?
Seems useful! If we add it, it should probably be optional though.
yeah... but one bit of feedback we've heard multiple times here is that it's too complicated already. it's tempting to want to set this up as a general "fully loaded python repository bootstrapper that also works for napari" ... but that adds cognitive overload, and I think we do need to be somewhat careful about every seemingly small thing we add.
there's no good answer: if you add things, new users will say "this is too complicated, can't you just provide me a simple napari plugin template?"

this is why I (personally) don't really use this cookiecutter anymore! :joy: (I have https://github.com/tlambert03/pyrepo-cookiecutter). That's not to say we shouldn't keep this carefully maintained and considered... it's just to say that we shouldn't expect it to be all the things for all the people

dependabot does not change your code, it only creates pull requests.
Do we want to simplify entry, or make it a bit more complicated but increase code quality?
Today I felt confused reading this article: https://www.frontiersin.org/articles/10.3389/fcomp.2022.931939/full where they mention napari with the ZELDA plugin (which I encountered for the first time), whose metadata declares that it is Python 3.7 only, but whose readme suggests that it works with Python 3.8.
So if we do not put enough requirements at the entry point, we may end up with many unmaintained/low-quality plugins that create the impression that you need to be an expert to use napari. Addressing this may require creating another team that curates napari plugins and verifies them, to tell users whether they should try a given plugin.
As a workaround, maybe we could hide pre-commit and dependabot under one question?
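A hedged sketch of how that single question could work with cookiecutter: one choice in `cookiecutter.json` (the `install_dev_tooling` key below is hypothetical, not something in the current template) plus a post-generation hook that removes both files when the user opts out:

```python
# hooks/post_gen_project.py (sketch; assumes a hypothetical
# "install_dev_tooling": ["y", "n"] question in cookiecutter.json)
from pathlib import Path

# cookiecutter renders this template string with the user's answer
if "{{ cookiecutter.install_dev_tooling }}" == "n":
    for name in (".pre-commit-config.yaml", ".github/dependabot.yml"):
        path = Path(name)
        if path.exists():
            path.unlink()  # drop the tooling the user opted out of
```

That way a single yes/no covers both pre-commit and dependabot without adding two more prompts.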
Thanks for enumerating, it's tough! I do like napari as a vehicle for discovering this tooling; it was my introduction to testing and deployment on GitHub Actions. But I also agree we want the on-ramp to be as gradual as possible.
Let's err on the side of simple and point at cool shiny things in the docs :-)
Hi @cnstt, would you like to add a little context here? My first thought is "probably: no ... but curious to hear your thoughts". For me, dependabot has been very useful in app-like settings, where no one depends on me, I'm hard-pinning my requirements, and I would like to know a) whether an update is generally available and b) whether there's a security risk in one of my pinned dependencies. However, in a python-based ecosystem (with a flat package tree, where only one version of a given package is installed at any given time), having plugins hard-pin dependencies, or even give very narrow compatibility ranges, is very problematic, since one plugin can very easily break another.
thoughts?
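To make the "one plugin can break another" point concrete, a toy example with made-up plugins and pins:

```
# hypothetical requirements of plugin-a (hard-pinned)
numpy==1.21.6

# hypothetical requirements of plugin-b (hard-pinned)
numpy==1.24.0
```

Installed into the same flat environment, pip cannot satisfy both pins at once: installing the second plugin either fails to resolve or forces a numpy version the first plugin forbids. Looser ranges (e.g. `numpy>=1.21`) avoid that.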