Open jonesmz opened 7 years ago
I'm on Windows, and I apologize for my lack of understanding about the different kinds of Linux packages. Is it normal for open-source projects to house these platform-dependent files in the repository? Wouldn't users normally do something like `sudo apt-get install <some package>` on their own? From what you stated, I guess an ebuild is a custom script that does the same thing, but for Gentoo.
I'm also preparing to move my repository over to CMake completely, so it will be a bit more cleaned up in the near future, and that should help with any organizing efforts. I'd like to keep the repository relatively slim, in any case.
I'm also preparing to move my repository over to CMake completely, so it will be a bit more cleaned up in the near future, and that should help with any organizing efforts. I'd like to keep the repository relatively slim, in any case.
Gentoo is an entirely different distribution from Ubuntu (where you'd use apt-get, as you mentioned); on Gentoo, all 'packages' are installed by downloading the source code and compiling it on the spot. This allows for highly customized systems, because you can specify various options at compile time so you only end up with packages that implement what you want to use (which is what the ebuild describes).
Just like with Ubuntu, though, someone needs to publish a compatible version of the game in the repositories or provide it via a Gentoo overlay, at which point someone can use the emerge command (or whatever package manager they use) to download and compile OpenTESArena.
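For illustration, the Gentoo side of that workflow might look roughly like this; the overlay name and package atom below are hypothetical placeholders, since no overlay actually carries the game yet:

```shell
# Hypothetical: pull the ebuild in via an overlay, then build from source.
# "someones-overlay" and "games-engines/opentesarena" are placeholder names.
eselect repository enable someones-overlay   # register the overlay (eselect-repository module)
emerge --sync                                # fetch the overlay's ebuilds
emerge games-engines/opentesarena            # download, compile, and install in one step
```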
As an aside, the prebuilt Linux binaries might at least work in the interim until an ebuild is produced, assuming you get lucky with linkages. Then again, if you're on Gentoo, you probably know how to build the game yourself anyway.
@afritz1 Normally you would be right that most projects wouldn't host a package file (such as a .deb or .rpm) in their repository. However, a very large number of projects do keep a "packaging" folder containing a collection of scripts used to create those package files. For Debian/Ubuntu/Mint, this would be the collection of files used by the Debian tools; for Fedora, a .spec file that Fedora/Red Hat's build tools use to compile and create a package.
For all of the Linux package management systems that I'm personally aware of, there are really three steps: 1) get the source, 2) compile, 3) install.
The main difference between a Gentoo ebuild and Ubuntu .deb or Fedora .rpm is that the Ubuntu and Fedora packages get compiled by someone else, whereas on Gentoo, it gets compiled on your system.
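Done by hand for a CMake project, those three steps are roughly the following (this assumes the usual GitHub location for the project and CMake 3.15+ for the `-B`/`--install` style; flags are illustrative, not the project's official build instructions):

```shell
# 1) Get the source
git clone https://github.com/afritz1/OpenTESArena.git
cd OpenTESArena

# 2) Compile
cmake -B build -DCMAKE_BUILD_TYPE=Release   # configure into ./build
cmake --build build                         # build everything

# 3) Install
sudo cmake --install build                  # copy the built files into the system prefix
```

A binary package (.deb/.rpm) is essentially steps 1 and 2 done on someone else's machine, with step 3 replayed on yours; an ebuild runs all three locally.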
@Ragora Right you are. If someone's using Gentoo without knowing how to compile something by hand, they have excellent luck!
Either way, I'm hoping that I'll find time to put together a package script for OpenTESArena. Being able to store the script in the OpenTESArena repository will make it much easier for people to find in the window between when it's added and when Gentoo (or some Gentoo overlay) picks it up, but it's up to you whether you want to include it. Given that OpenTESArena isn't quite at the release candidate stage, I'm skeptical that many distributions will want to make packages for it yet, but the process has to start somewhere, shrug.
As for moving to CMake and re-organizing the repository, might I suggest including a "packaging" folder?

    OpenTESArena/packaging/debian/
    OpenTESArena/packaging/redhat/
    OpenTESArena/packaging/arch/
    OpenTESArena/packaging/gentoo/

and so on? Possibly with a text file in there asking for pull requests for packaging scripts?
The repository has been moved over to using CMake for a little while now, and I added a preliminary packages folder in commit 5fb011ba1b281cba5fdcbcdbce85d094410ce070, so the build scripts could start going in there as needed.
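As an aside on the CMake move: CPack (bundled with CMake) can generate .deb and .rpm files from the same build, which might be a way to seed that packages folder. A minimal sketch, with placeholder metadata (the version and maintainer values are made up, and a real setup would want more fields):

```cmake
# Appended to the top-level CMakeLists.txt after the install() rules are defined.
set(CPACK_PACKAGE_NAME "opentesarena")
set(CPACK_PACKAGE_VERSION "0.2.0")                            # placeholder version
set(CPACK_GENERATOR "DEB;RPM")                                # build both package types
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "maintainer@example.com") # required by the DEB generator
include(CPack)   # enables the `cpack` / `make package` targets
```

This only covers the "build a binary package" half; distro-specific metadata files (Debian changelogs, .spec files, ebuilds) would still live in the packaging folders.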
I still don't understand certain things such as the lifetime of these scripts. Are they needed forever, or just until a downstream distribution has accepted the repository? Also, would there ever be multiple scripts for the same distribution, or does one script fit all? Ideally, the repository would be completely platform-independent, save for a couple switches in the CMake files.
Perhaps this can also get the ball rolling for Debian/Ubuntu packages, @psi29a? It is still a little early I guess, but I am less shy now about having it in a package than I was 6 months ago, and I'm feeling better about it for a v0.2.0 release soon as well. If it's not considered a "release candidate" until something like v0.10.0, that's fine with me. Overall, this whole packaging process is just to make it easier for Linux developers to join the project, right? It's not for myself, but for those that want to work with my code!
I can get us a ITP (intent to package) which creates the initial bug report for a package in Debian. From there it is a question of getting all the bits together for a proper release.
The packages are for end users (downstream), not really for developers (upstream). Developers will always download from source, not from Debian/Ubuntu/whatever.
Oh, okay. So the build scripts are just there to do what the Linux binaries in the release folder can't do (with linking, like Ragora said)? The Windows executables basically work on any Windows 7/8/10 computer (I think), but Linux users normally need more customized executables, right? I think I'm starting to understand.
TL;DR: Personally I think package scripts should be kept around, even if just as an archive of what once happened, as long as the scripts in question are updated by someone somewhere at least once a year.
I recommend assuming that all distributions might choose to use multiple files to properly package the code.
End TL;DR:
My way of thinking about packaging scripts is as follows (The following is just my personal opinion, take several grains of salt and please understand that my views may differ from others, and that's totally expected and copacetic):
Many open source developers (myself included) view their projects as scratching an itch that they want scratched. It solves some problem that the developer has. The open sourcing part of it can be some combination of (or none of these, and something entirely different) charity, professionalism, teamwork, altruism, pursuit of a business model, resume building, or thousands of other reasons. My take on the possible reasons of charity / altruism / teamwork, and so on, is that it's great to solve the problem for yourself, but if you had the problem, a lot of other people probably did too.
From that view, I find myself wanting to organize my code in such a way as to make it easier for people to quickly download the code for the very first time and start making the changes they want. I think that, a lot of the time, if someone's downloading the code (by hand), it's because they want to use it, but even more so because they want to make changes/improvements!
From that perspective, even if a developer would normally want to grab the source code directly, that doesn't mean they want to always compile and then run directly out of the git clone's folder. A lot of the time, the person downloading the code just wants to make a small change/patch and then just apply that patch to subsequent releases until / unless the patch is integrated into the main codebase.
All that being said, even after a package has been integrated into a distribution, that doesn't mean that you should remove such packaging scripts from your codebase. You certainly can, don't get me wrong, but the same packaging scripts that Debian / Ubuntu can use can easily be modified for use in dozens of other minor distributions. If they're kept relatively up to date compared to what Debian / Ubuntu are using to compile the package, then keeping the same scripts in the codebase allows others to quickly and easily get everything they need in a "one stop shop" style to package your program for their own system.
From the perspective of a Gentoo user, it gets even more interesting. Contrasting with Debian / Ubuntu, where the vast majority of packages are compiled on some server farm and then made available for download, Gentoo computers download the source code and compile it on the system directly. This gives Gentoo a lot of flexibility with regard to how package files are made use of. Gentoo has the concept of an "overlay", which is similar to a PPA inasmuch as it's an "additional" place to get packages, but overlays are set up so users can very trivially and very quickly make quick hacked-up changes to the packages they're bringing in. Including an "ebuild" file directly in your repository will give someone who wants to work with OpenTESArena a huge boost in their efforts.
As for whether to remove packaging scripts after they've been integrated into a distribution, that's really up to the person/group who manages the repository. Personally, I lean toward keeping them for archival and/or "If you want it, here it is" reasons, but as long as the scripts aren't purged from the internet in general, it's not a big deal.
To address your question of "One or many files per distribution", it's definitely a per-distribution thing. I'd recommend assuming each distribution is going to have multiple files, just to be on the safe side.
IMO, packaging should be left to downstream; upstream (this project) shouldn't bother itself with that. If someone wants an ebuild, then it should be created by someone interested in maintaining it downstream for Gentoo, or at least as an overlay. I say this as a former Gentoo developer.
@jonesmz should make the ebuild, of course upstream can help where they can, but downstream shouldn't be the concern of upstream unless they happen to also be involved.
> but Linux users normally need more customized executables, right? I think I'm starting to understand.
It depends on what versions of everything each distro has. Incompatible versions of libc will cause problems, but that shouldn't be a prominent issue unless people are trying to run the binaries on sufficiently old boxes or the distros simply ship different versions of libc. Incompatible ABIs will also cause problems, but I purposely built the Linux binaries with GCC 4 to make sure they at least run on Ubuntu 14.04 and 16.04. It also depends on what versions of all the dependent packages each distro's package manager provides. Runtime dependencies could be dealt with via the package manager, but I set up the release packages to come with most of the libraries necessary to run (found via LD_LIBRARY_PATH), so it mostly comes down to whether the linkages for those bundled dependencies resolve correctly on a given distro. They usually should, unless symbols of system libraries (like libc) changed between the system the game was built and linked on and the system it's trying to run on (incompatible versions).
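You can inspect exactly that linkage question with `ldd`, which prints the shared libraries a binary will resolve at load time; prepending a bundled lib directory via LD_LIBRARY_PATH changes what it finds. A quick illustration, using /bin/sh as a stand-in for the game binary (the `./libs` path is a hypothetical bundle directory):

```shell
# Show which shared libraries a binary resolves, and from where.
ldd /bin/sh

# A release bundle's launcher would prepend its own lib dir before running the game:
LD_LIBRARY_PATH="./libs:$LD_LIBRARY_PATH" ldd /bin/sh
```

If a bundled library's own symbols don't match the system libc it was linked against, the second command is where the "linkage" failures described above would show up.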
There's also the fact that Linux runs on a ton of different architectures and thus would need binaries for each one, but that doesn't really relate to the question at hand, since it applies to any operating system that supports multiple processor architectures.
> Oh, okay. So the build scripts are just there to do what the Linux binaries in the release folder can't do (with linking, like Ragora said)?
Partially. It would partly be to build the binaries in each respective environment (via a chroot or some such) to ensure linkages resolve correctly (library names may have changed, symbols may differ between distros running different library versions, etc., as mentioned above). But it would mostly be to build a convenient package for that distro that installs easily, handles dependencies intelligently, and can later be uninstalled if desired.
> I still don't understand certain things such as the lifetime of these scripts. Are they needed forever, or just until a downstream distribution has accepted the repository?
You would need some way to build updated packages for each distro for as long as your project is being developed, so someone has to maintain these build routines effectively indefinitely.
> The Windows executables basically work on any Windows 7/8/10 computer (I think)
They should, if you built with any recent version of Visual Studio. Each machine would just need the version of the VC++ Redistributable that your Visual Studio version links against installed, unless you changed the compiler toolchain in your project settings (an option added since VS12, I believe).
> @jonesmz should make the ebuild, of course upstream can help where they can, but downstream shouldn't be the concern of upstream unless they happen to also be involved.
I'd just say that as long as whatever scripts are used to package for a given distro are easily accessible somewhere, so a packaging environment can be spooled up, it should be fine.
I'm opening this issue to request an ebuild (a package-manager file) for Gentoo Linux.
My intention is, if I have the time, to write such a file myself.
However, it's possible others may have the same desire as I do, so this issue can serve as a coordination area for doing that.
Ebuilds are different from a "normal" package file for Linux, such as Debian .debs or Fedora RPMs, in the sense that an ebuild is a file that describes how to download a project's source code, compile it right then and there, and install the result.
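To make that concrete, a skeleton of the kind of file in question might look roughly like this. This is a non-working sketch: the filename, category, license, and dependency atoms are placeholders, and a real ebuild would need correct metadata, KEYWORDS, and a vetted dependency list.

```shell
# opentesarena-9999.ebuild -- hypothetical live-ebuild skeleton, not a working file
EAPI=8

inherit cmake git-r3

DESCRIPTION="Open-source re-implementation of The Elder Scrolls: Arena"
HOMEPAGE="https://github.com/afritz1/OpenTESArena"
EGIT_REPO_URI="https://github.com/afritz1/OpenTESArena.git"

LICENSE="MIT"        # placeholder; must match the project's actual license
SLOT="0"
KEYWORDS=""          # live ebuilds carry no keywords

# Placeholder dependency atoms; the real list comes from the build requirements
DEPEND="media-libs/libsdl2
        media-libs/openal"
RDEPEND="${DEPEND}"
```

The `cmake` and `git-r3` eclasses handle the configure/compile/install phases and the source checkout, which is why such a small file can describe the whole download-compile-install process.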