Closed ppfeufer closed 1 year ago
It makes it impossible to handle repositories like brew too.
Same for the Gentoo ebuild I'm maintaining ...
Same here on AUR
I'm worried there's some sort of mental health issue going on. There have been 20 "stable" releases in less than 24 hours. Some just minutes apart.
I'm not a package maintainer, but this is clogging up my GitHub Notifications view :slightly_frowning_face:
Hi, we are currently switching to a new CI system, this issue should be temporary. Sorry for the inconvenience.
cc @ab77 for more context
But even when switching to a new CI system, there doesn't have to be a release for every single change being made ... This looks suspiciously like a misconfiguration somewhere ...
And looking at this (https://github.com/balena-io/etcher/issues/3835), it will potentially be going on for quite a while ... please fix it.
Folks, for some context, Etcher's CI pipeline has been broken for years. On Monday, Etcher was moved to our new GitHub Actions pipeline (Flowzone) and wired up to our Renovate config. Renovate runs roughly every hour, and if there is a dependency update, it will build, test and publish a new Etcher release and mark it latest.
This is by design, enjoy new Etcher releases without critical vulnerabilities, etc.
Thank you for the clarification @ab77. Speaking as a Homebrew maintainer, we have decided to discontinue support for future updates of Etcher due to this new release cadence. It is too difficult to keep up with. I wish the other package maintainers luck in finding a solution that works for them.
@ab77 thanks for the clarification. I am maintaining etcher-bin on AUR and thinking about possible solutions for handling the situation. I can automate the process, but I think it will become frustrating for users to see an Etcher update every time they check for updates.
I don't think it's really necessary to have hourly releases to be "secure"? I don't think Etcher is more of a target than web browsers or the Linux kernel, and they don't release hourly!
I really hope you reconsider, but I understand if you don't want to do anything about the issue since it's "by design".
Speaking as an AUR user, even if @molaeiali does manage to get some sort of automation going, most people are not going to want to spend network bandwidth and cache space redownloading the whole binary every time they update. Ultimately, someone on the AUR is going to create a pinned version of Etcher that updates less frequently, which will worsen the problem of installs with out-of-date dependencies. I know that eventually the bot will catch up and updates will become slightly less frequent, but it still hampers any sort of distribution.
Folks, re-opening this to avoid duplicates being logged.
While I understand all the concerns expressed in this issue, let me add some more context around the events leading up to this point. While being a popular piece of open source software, Etcher has often been neglected at balena, more often than not due to a lack of developers willing to work on it (because, reasons), reluctance from maintainers to keep software dependencies up to date (also because, reasons), and the fact that the CI system used to build it was effectively broken for years.
This has led to a situation where we now find ourselves having to catch up on years of technical debt, having migrated to a working CI pipeline with automated dependency management.
I am personally leaning towards letting the system run and do its thing for as long as it takes, until the backlog is cleared and the software is up to date. There is of course no obligation to install every new version that comes out, nor to manually update downstream package managers, though the latter should always be automated to be practical. Etcher updates can also be toggled off entirely in settings.
It is not practical to simply grab all the updates and bundle them into a single pull request/release, because some of these introduce breaking changes, which would break the build and/or fail the unit tests for the whole PR.
If someone in the open source community is willing to spend time sifting through out-of-date dependencies and coming up with a single PR containing a maximal working/passing combination, then, as the saying goes: "PRs welcome".
I am not an expert on this, but does the CI pipeline have to end with a release tag for each run? Or is it possible to get the backlog cleared and do a single release after instead?
I'm pretty sure that's a configuration thing.
@ab77 :: The issue is not with (finally) updating all the outdated dependencies. That's absolutely fine and we appreciate it. The issue is with publishing a new release after every single dependency update. The better approach would be, as @ulab already said, to let the CI work through the dependency backlog, building and testing as it goes, but without publishing a release for every single change. Let it finish, check the end product, and then release it.
Automated and probably unsupervised releases are not always the best approach. In fact, it's pretty rare that they are a good idea, because I doubt any of you actually sit there and check the releases.
For now, I am suspending the Gentoo ebuild due to this in favour of the rpi-imager.
The current CI pipeline is designed to cut a new version/release with every PR and move the latest pointer. There are no current plans to change this behaviour.
Instead, we've disabled renovate for now on this repository, so it won't get any updates (unless these are done manually).
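For reference, Renovate can also be configured to batch updates rather than being disabled outright, which would address the release-per-dependency concern while still clearing the backlog. The option names below are standard Renovate config fields, but the values are purely illustrative and not Etcher's actual configuration:

```json
{
  "extends": ["config:base"],
  "schedule": ["before 5am on monday"],
  "packageRules": [
    {
      "matchPackagePatterns": ["*"],
      "matchUpdateTypes": ["minor", "patch"],
      "groupName": "all non-major dependencies"
    }
  ]
}
```

With grouping like this, all matching updates land in a single combined PR on the given schedule, so the pipeline would cut one release per batch instead of one per dependency (at the cost of harder bisection when the combined PR fails, which is the trade-off @ab77 described above).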
Will this new CI include the macOS ARM version or do I need to keep building for this?
@thedocbwarren the CI is GitHub Actions, which currently only supports x86_64 architecture (hosted runners). Which platforms are you currently building ARM releases for and are you using native hardware or emulated?
@ab77 :: The issue is not with (finally) updating all the outdated dependencies. That's absolutely fine and we appreciate it. The issue is with publishing a new release after every single dependency update. The better approach would be, as @ulab already said, to let the CI work through the dependency backlog, building and testing as it goes, but without publishing a release for every single change. Let it finish, check the end product, and then release it.
Automated and probably unsupervised releases are not always the best approach. In fact, it's pretty rare that they are a good idea, because I doubt any of you actually sit there and check the releases.
I'd agree on this. The balena Etcher website currently only links to 1.7.9, not latest. Switching that to latest would be a bad idea, because you'd have end users downloading an untested automated build. We provide a PortableApps.com Format package of balena Etcher, but I'm not sure there's any way to maintain it with the current auto-build system. Sending out a dozen releases a day means that, for all intents and purposes, there is no 'stable' release anymore.
@thedocbwarren the CI is GitHub Actions, which currently only supports x86_64 architecture (hosted runners). Which platforms are you currently building ARM releases for and are you using native hardware or emulated?
@ab77 I build for macOS ARM, and as needed, Ubuntu ARM. No emulated hardware.
This is by design, enjoy new Etcher releases without critical vulnerabilities, etc.
HAHAHAHAHAHA, you must be joking! You have the audacity to claim that critical vulnerabilities have been fixed, yet you ship an Electron version from 2021 (your largest dependency, biggest vulnerability surface, and a known exploitable target) ...
Renovate will run ~ every hour and if there is a dependency update, it will build, test and publish a new Etcher release and make it latest.
Also, FYI, there is such a thing as a pre-release on GitHub. Here is the official description:
To notify users that the release is not ready for production and may be unstable, select This is a pre-release.
This is exactly what your automation produces. You should be marking all of these releases as pre-releases and only bumping the minor version. Major version bumps and regular release status would then only happen manually, at maintainer discretion.
Packaging maintainers could then decide whether to update to the latest pre-release (untested code) or wait until a major version bump occurs and a regular release is made (fully tested and stable).
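The suggestion above maps cleanly onto GitHub's release tooling. As a hypothetical sketch of a workflow release step (the action and values here are illustrative, using the widely-used softprops/action-gh-release action, and are not what Flowzone actually does):

```yaml
# Hypothetical release step: publish automated builds as pre-releases
# so "latest" keeps pointing at the last manually vetted release.
- name: Publish pre-release
  uses: softprops/action-gh-release@v1
  with:
    tag_name: ${{ github.ref_name }}
    prerelease: true
```

Marking a release as a pre-release keeps it off the repository's "latest release" pointer, which is exactly the behaviour package maintainers are asking for here.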
Pre-releases have been implemented in https://github.com/balena-io/etcher/pull/3868, and Renovate has been (re)enabled in https://github.com/balena-io/renovate-config/pull/271.
Are you guys trying to set a new record? Looks like your CI is running amok.