It appears as if the issue has been resolved, but I am still open to suggestions for other file hosting solutions, bearing in mind the constraints I mention above.
> As most of you know, one of the reasons why we continue to use SourceForge for file releases is that SourceForge allows file deployment and management via SSH, so I am able to automatically push and update releases using rsync and manage YUM repositories on SourceForge's file release server.
Note that I don't use RPM distros myself, but I am making the suggestion anyway.
The above can be done with GitHub if we use a GitHub repository as a YUM repository and point it at GitHub release URLs. It's a matter of using REST APIs instead of SSH in this case.
Some references for automatically building RPMs on GitHub Actions (you don't need to use these, though): https://github.com/naveenrajm7/rpmbuild https://github.com/xmidt-org/rpm-package-action
Technically, it should be possible to (optionally) build assets using GitHub Actions, then use https://github.com/softprops/action-gh-release to upload the assets to a release, then point the YUM repository to the release URL format. The release URLs have a fixed format, and SourceForge release URLs also redirect, just like GitHub's.
Even if you opt to build your packages on your own machines and then upload them, GitHub provides an API to automatically upload assets to a draft release (which is not yet visible to the public; the draft release can be created automatically and then published automatically, too).
https://docs.github.com/en/rest/releases/assets https://docs.github.com/en/rest/releases/releases
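For example, a minimal sketch of the whole flow, using only cURL and jq from the command line (untested; it assumes a personal access token with repo scope in the GITHUB_TOKEN environment variable):

```bash
# Minimal sketch (untested): create a draft release, upload an asset, publish.
OWNER=VirtualGL REPO=VirtualGL TAG=3.1 FILE=VirtualGL-3.1.x86_64.rpm

# 1. Create a draft release (not visible to the public until published)
RELEASE_ID=$(curl -fsSL -X POST \
  -H "Authorization: Bearer ${GITHUB_TOKEN}" \
  -H "Accept: application/vnd.github+json" \
  -d "{\"tag_name\": \"${TAG}\", \"name\": \"${TAG}\", \"draft\": true}" \
  "https://api.github.com/repos/${OWNER}/${REPO}/releases" | jq -r '.id')

# 2. Upload the asset (note the separate uploads.github.com host)
curl -fsSL -X POST \
  -H "Authorization: Bearer ${GITHUB_TOKEN}" \
  -H "Content-Type: application/octet-stream" \
  --data-binary "@${FILE}" \
  "https://uploads.github.com/repos/${OWNER}/${REPO}/releases/${RELEASE_ID}/assets?name=${FILE}"

# 3. Publish the draft release
curl -fsSL -X PATCH \
  -H "Authorization: Bearer ${GITHUB_TOKEN}" \
  -d '{"draft": false}' \
  "https://api.github.com/repos/${OWNER}/${REPO}/releases/${RELEASE_ID}"
```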
Then, an example of a YUM repository hosted on GitHub (a release URL on GitHub is not fundamentally different from a SourceForge one; both redirect, so the repository can be pointed at a URL format just as you do on SourceForge):
https://github.com/riboseinc/yum (I do not recommend committing RPMs to Git repositories like this, though)
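To make the YUM idea concrete, a hypothetical sketch (the VirtualGL/yum repository name and key path are made up): only the locally generated repodata is committed to a Git repository, and clients install a .repo file whose baseurl points at the raw files:

```bash
# Hypothetical sketch: the repodata/ directory of the yum repository lives in a
# made-up Git repository (VirtualGL/yum); the RPMs themselves stay in release assets.
sudo tee /etc/yum.repos.d/virtualgl.repo >/dev/null <<'EOF'
[virtualgl]
name=VirtualGL (repodata hosted on GitHub)
baseurl=https://raw.githubusercontent.com/VirtualGL/yum/main/
enabled=1
gpgcheck=1
gpgkey=https://raw.githubusercontent.com/VirtualGL/yum/main/RPM-GPG-KEY-virtualgl
EOF
```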
Below is an example of how the GitHub API works. (By the way, please don't mix ESR releases into the main releases; the /releases/latest approach below breaks because of that.)
```bash
# Automatically fetch the latest selkies-gstreamer version and install the components
# (assumes UBUNTU_RELEASE is set, e.g. 22.04, and that curl, jq, and pip3 are installed)
SELKIES_VERSION=$(curl -fsSL "https://api.github.com/repos/selkies-project/selkies-gstreamer/releases/latest" | jq -r '.tag_name' | sed 's/[^0-9\.\-]*//g') && \
curl -fsSL "https://github.com/selkies-project/selkies-gstreamer/releases/download/v${SELKIES_VERSION}/selkies-gstreamer-v${SELKIES_VERSION}-ubuntu${UBUNTU_RELEASE}.tgz" | tar -zxf - && \
curl -O -fsSL "https://github.com/selkies-project/selkies-gstreamer/releases/download/v${SELKIES_VERSION}/selkies_gstreamer-${SELKIES_VERSION}-py3-none-any.whl" && pip3 install "selkies_gstreamer-${SELKIES_VERSION}-py3-none-any.whl" && rm -f "selkies_gstreamer-${SELKIES_VERSION}-py3-none-any.whl" && \
curl -fsSL "https://github.com/selkies-project/selkies-gstreamer/releases/download/v${SELKIES_VERSION}/selkies-gstreamer-web-v${SELKIES_VERSION}.tgz" | tar -zxf - && \
LUTRIS_VERSION=$(curl -fsSL "https://api.github.com/repos/lutris/lutris/releases/latest" | jq -r '.tag_name' | sed 's/[^0-9\.\-]*//g') && \
curl -fsSL -O "https://github.com/lutris/lutris/releases/download/v${LUTRIS_VERSION}/lutris_${LUTRIS_VERSION}_all.deb" && \
apt-get install --no-install-recommends -y "./lutris_${LUTRIS_VERSION}_all.deb" && rm -f "./lutris_${LUTRIS_VERSION}_all.deb"
```
I do things like this for downloading from GitHub, but for VirtualGL, the yum repository can be set up in a separate repository on GitHub and can point to URLs like https://github.com/VirtualGL/VirtualGL/releases/download/${VIRTUALGL_VERSION}/VirtualGL-${VIRTUALGL_VERSION}.x86_64.rpm for installation, and so on.
Why am I suggesting this? Because I, too, have used SourceForge since before anybody knew about GitHub, and I have experienced many times since then why SourceForge shouldn't be a single point of failure (especially for a production project like this!).
GitHub might break their whole website from time to time, but at least it breaks for everyone at once, and they later publish a comprehensive report on how they will prevent a recurrence. They don't partially break things all the time.
@ehfd That is not a good fit for our current release workflow. I have to sign our official release packages using a GPG key on Linux, an Apple Developer certificate on macOS, and a code signing certificate on Windows. I do not trust any cloud-based CI system to keep the associated private keys secure. (GitHub Actions has never had a data breach that exposed encrypted secrets, but Travis CI had one last year, and CircleCI had one this month.) Thus, I use physical security to protect the private keys. They are stored on my local build machines in an encrypted volume, and official releases are built only on those machines.
I have been using SourceForge since 2004, so I do not need you to explain its tradeoffs to me. My projects do not rely on SourceForge for anything but file hosting. What I would need in order to consider using GitHub for file hosting instead:

1. A reliable way to push files from my local build machines to a particular release on GitHub, using only the command line. Bonus points if I can also update existing release files, as I can currently do via rsync/SSH with SourceForge.
2. A way to host the YUM repository. I need to be able to easily update the repository using an automated script whenever a new release drops, as I currently do using `createrepo` via SSH on SourceForge.

At the moment, I have very little time and funding to look into migrating our file releases, and I certainly do not have time to learn how to use the GitHub API. Furthermore, there are other nice things about using SourceForge for file hosting, such as automatic virus scanning, multiple mirrors, fine-grained statistics, etc. It's funny how many people beg me to use GitHub for everything because they consider SourceForge to be a "single point of failure." You see the irony in that, right? I've been around long enough to have seen these sites come and go, and I don't really think that trusting Microsoft's current benevolence toward the open source community is a sounder strategy than trusting Slashdot Media's benevolence. In both cases, the companies are running the sites primarily to promote their other businesses.
> A reliable way to push files from my local build machines to a particular release on GitHub, using only the command line. Bonus points if I can also update existing release files, as I can currently do via rsync/SSH with SourceForge.
I've been talking about this the whole time. It will work the way you want it to. And you don't need GitHub Actions (that's only required if you want to build the packages on GitHub, which you don't). You can build everything on your own node and use cURL with the REST API to upload the assets to GitHub releases. All tokens and sensitive security information stay on your own node. You can also use the REST API from your own node to create the releases automatically.
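For the bonus points: an existing release asset can be replaced from the command line by deleting it and re-uploading it. A rough sketch (untested; same GITHUB_TOKEN assumption as the sketch earlier in this thread):

```bash
# Rough sketch (untested): replace an existing asset on a published release.
OWNER=VirtualGL REPO=VirtualGL TAG=3.1 FILE=VirtualGL-3.1.x86_64.rpm
AUTH="Authorization: Bearer ${GITHUB_TOKEN}"

# Look up the release by tag and the asset by name
RELEASE_ID=$(curl -fsSL -H "${AUTH}" \
  "https://api.github.com/repos/${OWNER}/${REPO}/releases/tags/${TAG}" | jq -r '.id')
ASSET_ID=$(curl -fsSL -H "${AUTH}" \
  "https://api.github.com/repos/${OWNER}/${REPO}/releases/${RELEASE_ID}/assets" | \
  jq -r ".[] | select(.name == \"${FILE}\") | .id")

# Delete the old asset, then upload the new file under the same name
curl -fsSL -X DELETE -H "${AUTH}" \
  "https://api.github.com/repos/${OWNER}/${REPO}/releases/assets/${ASSET_ID}"
curl -fsSL -X POST -H "${AUTH}" -H "Content-Type: application/octet-stream" \
  --data-binary "@${FILE}" \
  "https://uploads.github.com/repos/${OWNER}/${REPO}/releases/${RELEASE_ID}/assets?name=${FILE}"
```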
> I need to be able to easily update the repository using an automated script whenever a new release drops.
Yes, it's possible, and perhaps even easier, with the GitHub REST API from your own node.
I understand that you lack the time to do things differently, but I still think what you listed is not a strong enough argument to stay on SourceForge only. Maybe try using both GitHub and SourceForge for files when you eventually have the time to learn the GitHub REST APIs, and then choose what you like (or keep both).
Basically, if you don't want to use GitHub Actions CI/CD and want to build everything on your own node, the links below are all you need to study (and for YUM, you already know what you need).
https://docs.github.com/en/rest/releases/assets https://docs.github.com/en/rest/releases/releases
> > A reliable way to push files from my local build machines to a particular release on GitHub, using only the command line. Bonus points if I can also update existing release files, as I can currently do via rsync/SSH with SourceForge.
>
> I've been talking about this the whole time. It will work the way you want it to. And you don't need GitHub Actions (that's only required if you want to build the packages on GitHub, which you don't). You can build everything on your own node and use cURL with the REST API to upload the assets to GitHub releases. All tokens and sensitive security information stay on your own node. You can also use the REST API from your own node to create the releases automatically.
Great, so show me how to do it. Your examples above appear to only demonstrate how to download assets, not push or update them.
> > I need to be able to easily update the repository using an automated script whenever a new release drops.
>
> Yes, it's possible, and perhaps even easier, with the GitHub REST API from your own node.
>
> I understand that you lack the time to do things differently, but I still think what you listed is not a strong enough argument to stay on SourceForge only. Maybe try using both GitHub and SourceForge for files when you eventually have the time to learn the GitHub REST APIs, and then choose what you like (or keep both).
I don't have to convince you. You have to convince me. You haven't spent the last 19 years maintaining this project (the last 14 of which have been as an independent developer working for very little money).
I don't know you, and I don't know your background, and you have given me no reason to believe that your opinion is more informed than my own. I have demonstrated time and time again my willingness to change project management platforms and strategies when there is a compelling enough reason to do so. However, any proposal that requires me to spend more time than I am already spending is a non-starter. (Corollary: maintaining two copies of every release is a non-starter.)
> Basically, if you don't want to use GitHub Actions CI/CD and want to build everything on your own node, the links below are all you need to study (and for YUM, you already know what you need).
>
> https://docs.github.com/en/rest/releases/assets https://docs.github.com/en/rest/releases/releases
The API seems straightforward, but it will still take some time (that I don't currently have) to design and test appropriate release management scripts, re-deploy all current releases to GitHub, etc. Then I have to endure the inevitable migration pain: fielding support queries from people who didn't bother to read the announcement and are still looking on SourceForge, people who are still trying to access the old YUM repository and don't understand why the latest release isn't there, etc. (I still have users trying to subscribe to the archived project mailing lists on SourceForge, even though they have to click through a screen that clearly says "this list has moved to Google Groups" in order to do so. People don't read, so every change is more painful than it should be.)
I don't actually know how to create a YUM repository using only external URLs. I currently use the default `createrepo` script, which walks a local directory, finds the RPM files therein, and creates the repository in that same directory. That wouldn't work for GitHub, to the best of my understanding. So you could speed this process along by helping me understand how to create such a YUM repository. Your documentation above was not clear.
I will try these things and help you along the way. And yes, I know that I am the one who should convince you, so I will try to give you a proof of concept.
@ehfd SourceForge SSH access has been flaky lately, so I'm ready to move everything to GitHub if someone can figure out the YUM hosting problem.
https://copr.fedorainfracloud.org/
https://docs.pagure.org/copr.copr/how_to_enable_repo.html#how-to-enable-repo
An alternative option for hosting RPMs that integrates tightly with GitHub. I believe COPR is what you are looking for.
Other options: https://en.opensuse.org/openSUSE:Build_Service
https://rpmfusion.org/Contributors#Submitting_a_new_package http://ghettoforge.org/index.php/Contributing
I believe dedicated FOSS YUM/DNF providers are best.
@dcommander What I suggest is to preserve the SourceForge repository, but in addition upload all release assets that go to SourceForge to each GitHub release as well, using the release assets API (https://docs.github.com/en/rest/releases/assets and https://docs.github.com/en/rest/releases/releases), and use COPR for the YUM repositories.
Then if people gradually stop using SF, it can be phased out.
Uploading to three places simply isn't going to happen. I am only willing to host files on GitHub if it can fully replace SF, which means figuring out the YUM repository problem.
Fedora COPR automatically builds RPM packages on its own infrastructure, from information obtained from the GitHub repository or from a provided `src.rpm`, and automatically provides a ready-to-use repository. That plus GitHub release assets is sufficient, if you are not willing to upload to three locations and are thus willing to move off SourceForge quickly.
https://www.reddit.com/r/Fedora/comments/l5bzy0/need_help_creating_a_package_in_copr/
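For example, a rough sketch of the COPR command-line workflow (untested; the project name is made up, and it assumes an API token from the COPR web UI has been saved to ~/.config/copr):

```bash
# Rough sketch (untested): create a COPR project and build a src.rpm in it.
# copr-cli is packaged in Fedora/EPEL (dnf install copr-cli).
copr-cli create virtualgl --chroot fedora-38-x86_64 --chroot epel-9-x86_64
copr-cli build virtualgl VirtualGL-3.1-1.src.rpm
```

COPR then generates and hosts the repodata itself, so no createrepo step is needed on your end.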
In a `*-primary.xml` file (reference: https://blog.packagecloud.io/yum-repository-internals/):
```xml
<package type="rpm">
  <name>VirtualGL</name>
  <arch>src</arch>
  <version epoch="0" ver="2.1.1" rel="20081203"/>
  <checksum type="sha256" pkgid="YES">1b73eaedf0862e4eb16bcddad1c3bf9f8839312c7dcd8df4f83b037b1b6810c0</checksum>
  <summary>A toolkit for displaying OpenGL applications to thin clients</summary>
  <description>VirtualGL is a library which allows most Linux OpenGL applications to be remotely displayed to a thin client without the need to alter the applications in any way. VGL inserts itself into an application at run time and intercepts a handful of GLX calls, which it reroutes to the server's display (which presumably has a 3D accelerator attached.) This causes all 3D rendering to occur on the server's display. As each frame is rendered by the server, VirtualGL reads back the pixels from the server's framebuffer and sends them to the client for re-compositing into the appropriate X Window. VirtualGL can be used to give hardware-accelerated 3D capabilities to VNC or other remote display environments that lack GLX support. In a LAN environment, it can also be used with its built-in motion-JPEG video delivery system to remotely display full-screen 3D applications at 20+ frames/second. VirtualGL is based upon ideas presented in various academic papers on this topic, including "A Generic Solution for Hardware-Accelerated Remote Visualization" (Stegmaier, Magallon, Ertl 2002) and "A Framework for Interactive Hardware Accelerated Remote 3D-Visualization" (Engel, Sommer, Ertl 2000.)</description>
  <packager/>
  <url>http://www.virtualgl.org</url>
  <time file="1228776708" build="1228305252"/>
  <size package="3310737" installed="3314645" archive="3315036"/>
  <location href="2.1.1/VirtualGL-2.1.1.src.rpm"/>
</package>
```
It is possible to set the `<location href=...>` entry to a GitHub release asset URL (e.g., https://github.com/VirtualGL/VirtualGL/releases/download/3.1/VirtualGL-3.1.x86_64.rpm).
So it is theoretically possible. But since GitHub release assets do not provide a browsable relative directory, all of this XML would seemingly have to be written by hand, so it is not trivial.
I really believe that Fedora COPR can remove this headache, and you wouldn't even need to run `createrepo` for each release.
Then uploading to GitHub releases (the deb and rpm files previously uploaded to SourceForge would now be uploaded to GitHub; only the yum repo and rpm assets would be managed by COPR) is even more trivial.
https://docs.pagure.org/copr.copr/user_documentation.html#webhooks
Anyway, if the yum repo must be hosted on GitHub, I need to investigate further, but I think there is a way around it, although it is very non-trivial. `-n`/`--includepkg` in `createrepo` might work; see the sketch below.
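For instance, a rough sketch (untested) of generating the metadata locally with `createrepo --baseurl`, so that the package locations resolve to GitHub release URLs:

```bash
# Rough sketch (untested): generate yum metadata locally so that package
# locations resolve to GitHub release asset URLs, then publish only repodata/.
mkdir -p yumrepo && cp VirtualGL-3.1.x86_64.rpm yumrepo/
createrepo --baseurl "https://github.com/VirtualGL/VirtualGL/releases/download/3.1" yumrepo/
# yumrepo/repodata/ can now be committed to a Git repository and served as the
# baseurl of a .repo file; the RPMs themselves stay in the release assets.
# Caveat: each release tag has its own URL prefix, so a repository spanning
# multiple versions would need per-version metadata or hand-edited XML.
```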
Again, COPR automatically solves these problems.
External building of RPMs is a non-starter. Our official packages are signed with a private key that is, for security reasons, stored only on a local machine and not in the cloud. This is one of the differences between a "community" project and a project, such as ours, that is trying to be as enterprise-friendly as possible (because being as enterprise-friendly as possible is how I attract funded development, which keeps the project alive.)
Why couldn't the generation of the XML file be automated?
> External building of RPMs is a non-starter. Our official packages are signed with a private key that is, for security reasons, stored only on a local machine and not in the cloud.
I suspected this would be an issue, and I understand. I'll find a way to handle the RPM repository, or a way for you to upload RPM assets to a hosted repository without building them remotely. It might take some time, because I am again busy with other work.
> Why couldn't the generation of the XML file be automated?
Because I believe that the link to the package has to be manually updated in the XML file in order to accommodate GitHub release URLs.
https://blog.packagecloud.io/packagecloud-loves-oss/
They could accommodate VirtualGL/TurboVNC. They have a CLI that allows uploading custom RPMs, but it is a paid service; OSS projects can be exempted from payment, though. This way, DEB files can also be installed using apt-get.
RabbitMQ (https://www.rabbitmq.com/install-rpm.html) and other large projects use PackageCloud.
Also free for FOSS: 1) https://www.cloudrepo.io/docs/raw-repositories.html 2) https://fury.co/pricing (free for public repositories)
PackageCloud and GemFury look like safe choices: both support DEB and RPM, are free for FOSS, and both can accept pushes of locally built RPMs.
So, what changes from moving away from SourceForge in this proposal?
GitHub release assets:

- The direct download link moves from https://sourceforge.net/projects/virtualgl/files/3.1/virtualgl_3.1_amd64.deb to https://github.com/VirtualGL/VirtualGL/releases/download/3.1/virtualgl_3.1_amd64.deb
- Uses the GitHub REST API instead of SourceForge SSH.

PackageCloud (https://blog.packagecloud.io/packagecloud-loves-oss/) RPM and DEB packages:

- Uses the PackageCloud CLI to upload locally built RPM and DEB packages (see the sketch below).
- Thus also allows DEB packages to be downloaded and updated natively using apt-get, whereas SourceForge only made the RPM repository work. repomd.xml is managed automatically.
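For example, a rough sketch of the PackageCloud upload step (untested; the virtualgl/stable repository name is made up):

```bash
# Rough sketch (untested): push locally built, locally signed packages to a
# hypothetical PackageCloud repository named virtualgl/stable.
gem install package_cloud
package_cloud push virtualgl/stable/el/9 VirtualGL-3.1.x86_64.rpm
package_cloud push virtualgl/stable/ubuntu/jammy virtualgl_3.1_amd64.deb
```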
Let me be more clear:
The reason why we started discussing this issue in the first place was a lack of trust in the stability and longevity of SourceForge. Now you're proposing that I move YUM repositories to another service that is a complete unknown? Six or seven years ago, people begged me to move to BinTray. Now it's gone, but the packages I host on SF are still there. I am only interested in a solution that involves services with a higher trust profile than SF. At the moment, that probably means GitHub only.
Yes, you're right. I guess they can go out of business. But that doesn't mean SF is doing well...
I will check how an RPM repository would be possible on GitHub, but it will take time.
> At the moment, that probably means GitHub only.
I still doubt this. There is one more set of options: hosting services from the distros themselves (Launchpad for DEB, openSUSE Build Service for DEB/RPM, if somehow there is a way to upload pre-built packages).
https://en.opensuse.org/openSUSE:Build_Service_comparison and https://copr.fedorainfracloud.org - if uploading is possible
Example: https://lutris.net/downloads https://software.opensuse.org/download.html?project=home%3Astrycore&package=lutris
Anyway, that was enough discussion for now; I will come up with something later.
A very simple solution might be just to change the baseurl in repomd.xml, but that is untested.
Let me be even more clear:
Hosting packages on multiple sites is a non-starter. I will host packages on one site only. I've already explained why. If you want me to host packages on GitHub, then I will not host packages elsewhere. If you want me to host packages elsewhere, then I will not host packages on GitHub.
My interest in moving from SF to GitHub is based on a belief that GitHub has better stability and potential longevity than SF. If you want me to host packages elsewhere, then you need to convince me that the site has better stability and potential longevity than SF or GitHub. So far, I doubt that that is the case for any site you have suggested. I have seen distros come and go many times over the last two decades, but SF is still here. (SUSE, in particular, has changed hands a lot more often than SF has.)
Created a new issue (#240) to track this.
As of this writing, VirtualGL 3.0 beta1 and later do not appear at https://sourceforge.net/projects/virtualgl/files
The files aren't actually gone. I can see them from an SSH shell. There is apparently a temporary issue with the SourceForge web interface that is preventing them from being displayed, and numerous projects are reporting the same issue: https://sourceforge.net/p/forge/site-support/search/?q=status%3Aopen
As most of you know, one of the reasons why we continue to use SourceForge for file releases is that SourceForge allows file deployment and management via SSH, so I am able to automatically push and update releases using rsync and manage YUM repositories on SourceForge's file release server. If GitHub had that same functionality, then I assure you that I would use it. Barring that, I am open to other suggestions.
If this outage lasts more than a day, then I will upload the missing releases to GitHub as a stopgap solution.