Closed stsp closed 4 years ago
Do you still need this?
I think I am not using that in my build system, no. But it was good for testing when one has a local cvs tree with patches. I think it would make sense to have something like this, but it doesn't have to be compatible with my implementation (as it's not used by my build system).
Do you want me to pull them in on djgpp-cvs? We could make that repo an unofficial dev branch of sorts.
I am not sure how exactly. LP does not allow fetching anything from the internet, so the only possibility is to supply the source archives and the patches directly in the same repo where the build system resides. So while I have nothing against treating your branch as an upstream (though you seem to disappear for quite too long, etc.; treating someone as an upstream should be justified, but I don't mind trying), I currently don't see how this can be done. The only possibility I can think of is for you to work on my code, or deploy something else on LP. I won't mind changing the URL at LP to your project if you come up with a deployment solution.
The changes I pushed just now include a --no-download switch which you could use. But I see that's only a tiny part of it, and I don't want to clutter this repo with large tar files that most people won't need (and can download themselves). So as I said, it's probably best to keep PPA related changes on your branch.
Still I could merge your djgpp patches in djgpp-cvs, so users of this script can get those patches too.
Merging the patches is fine with me of course. :) But its not universally true that the large files need to exist in the same repo (even if I said that above myself). LP can handle deb-style dependencies. It is possible to have 2 projects at LP, one with binaries and one with a build system, each fetching either from different github repos, or even from different branches of the same repo (I think different repos is better).
Not sure; it seems from https://help.launchpad.net/Packaging/PPA/Uploading that you can only upload source packages. It says nothing about uploading binaries. I suppose otherwise people would use it as a file share. So my guess is, yes, it should be compiled on Launchpad. And overall, Launchpad can build the package for any Ubuntu release you ask it to. You'll have problems doing that on GitHub.
I think I can make this work.
The idea is to have github download all the necessary files (there's a new option --only-download that does this) and create a source package, and upload that package to Launchpad. Then LP builds the binaries. I can set this up so that this action triggers whenever I push to a specific branch.
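A minimal sketch of how such a workflow could look; the file name, script name, and step details here are assumptions, not the repo's actual setup:

```yaml
# .github/workflows/ppa.yml -- hypothetical sketch, not the real workflow
name: upload-to-ppa
on:
  push:
    branches: [ppa]            # only trigger on the dedicated PPA branch
jobs:
  source-package:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: ./build.sh --only-download    # fetch all source tarballs first
      - run: debuild -S -sa                # build a signed source package
      - run: dput ppa:jwt27/djgpp-toolchain ../*_source.changes
```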
Now I'll need to figure out how the debian packaging system works. I want to see if it's possible to have a single source package that builds multiple binary debs: djgpp-gcc, djgpp-gcc-doc, djgpp-gdb, etc.
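For reference, one source package can declare several binary packages in debian/control, one stanza per binary. An abridged, hypothetical sketch (required fields such as Maintainer, Standards-Version, and Depends omitted):

```
Source: djgpp-toolchain
Section: devel
Priority: optional
Build-Depends: debhelper (>= 11)

Package: djgpp-gcc
Architecture: any
Description: GCC cross-compiler for the DJGPP target

Package: djgpp-gcc-doc
Architecture: all
Description: documentation for the DJGPP GCC cross-compiler

Package: djgpp-gdb
Architecture: any
Description: GDB cross-debugger for the DJGPP target
```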
The idea is to have github download all the necessary files (there's a new option --only-download that does this) and create a source package, and upload that package to Launchpad.
Maybe this is the possible way around indeed. However, you'll eat quite a lot of traffic on a poor github. :) But maybe you can find some way to not re-download every time...
single source package that builds multiple binary debs. djgpp-gcc, djgpp-gcc-doc, djgpp-gdb, etc.
Yes, this is of course possible, and it's the right thing to do.
Thanks for your commitment and please make a good upstream for djgpp. :) I am obviously quite unhappy with its current upstream.
Now I'll need to figure out how the debian packaging system works.
I'd suggest first inheriting my work, then proceeding with your idea of getting rid of the large blobs in the repo. That way you can defer studying the debuild stuff for quite a while, and will be able to accomplish most of the task without any fiddling with debuild. Splitting the package can be the final step.
However, you'll eat quite a lot of traffic on a poor github. :) But maybe you can find some way to not re-download every time...
Yeah it's probably not the nicest thing to do. There is a cache option, I could look into that: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/caching-dependencies-to-speed-up-workflows However for the test builds I do want to make sure that the scripts work on a clean checkout. There don't seem to be any restrictions on bandwidth, so I think what I'm doing is allowed.
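The caching step from that page could look roughly like this; the path and cache-key expression are guesses, not the repo's actual configuration:

```yaml
# Hypothetical caching step; 'download' and the key expression are assumptions.
- uses: actions/cache@v1
  with:
    path: download
    key: djgpp-sources-${{ hashFiles('**/*.sh') }}
```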
I'd suggest first inheriting my work, then proceeding with your idea of getting rid of the large blobs in the repo. That way you can defer studying the debuild stuff for quite a while, and will be able to accomplish most of the task without any fiddling with debuild. Splitting the package can be the final step.
I ran dh_make and it created a bunch of stuff in the debian/ directory. I'm both looking at how you did it and filling in the example files it made. There's a lot in there that seems unnecessary, but I'm not sure yet what it all does.
It seems they also want you to check the license of each source package and add special cases for those in the copyright file. Makes sense I guess but that's a lot to go through.
Okay, we have DESTDIR support now. I found a way to do it without having to build gcc twice. It's a bit of a hack, but it does the trick.
A 2-stage build is a very common approach; avoiding it was not among my goals. Whether to apply a hack like that instead is up to you, and I won't care as long as things work. On the gcc bugzilla, sysroot tricks were suggested instead, but I gave up without trying. So maybe you could try the sysroot trick instead.
PPA is live now: https://launchpad.net/~jwt27/+archive/ubuntu/djgpp-toolchain
And the upload process is fully automatic :) https://github.com/jwt27/build-gcc/runs/443889013
Only for bionic?
I didn't see a way to specify multiple versions. Does it not work on later releases?
Not sure, it hasn't built yet. But at least Xenial should also be covered, see https://code.launchpad.net/~stsp-0/+recipe/djgpp-daily
Can I do this with s/bionic/xenial/ (and others) in debian/changelog and just reupload the same source? Does each release need a new version string?
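If Launchpad accepts it, the substitution itself is trivial. A self-contained sketch with a fabricated changelog entry (package name and version are placeholders):

```shell
#!/bin/sh
# Rewrite the target series in the first debian/changelog line.
# The changelog content below is made up purely for demonstration.
set -e
mkdir -p /tmp/chlog-demo/debian && cd /tmp/chlog-demo
printf 'djgpp-toolchain (9.2.0-1) bionic; urgency=medium\n' > debian/changelog
for series in xenial eoan; do
    sed "1s/bionic/$series/" debian/changelog > "debian/changelog.$series"
done
head -1 debian/changelog.xenial
```

Note that Launchpad may still reject two uploads that share the exact same version string, in which case a per-series suffix (e.g. 9.2.0-1~xenial) would be needed.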
In your log I can see:
find download/*/* ! -wholename '*/.git/*' -delete
find: ‘download/*/*’: No such file or directory
Makefile:14: recipe for target 'clean' failed
Can I do this with s/bionic/xenial/ (and others) in debian/changelog
I don't think so, as I never changed that info myself. I requested the needed builds via the GUI.
That line has a - in the makefile so it's ignored :)
-find download/*/* ! -wholename '*/.git/*' -delete
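For context, a leading '-' on a recipe line tells make to run the command but ignore its exit status, so a failing find doesn't abort the target:

```makefile
clean:
	# the '-' prefix: ignore find's nonzero exit when download/ is empty
	-find download/*/* ! -wholename '*/.git/*' -delete
```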
Also I found this in launchpad, is that what you mean?
Yes, it seems copying the source package is the right way to trigger multiple builds. Whether or not this can be done by an automatic daily recipe is what I don't know.
I will try just uploading the same sources for each release now: https://github.com/jwt27/build-gcc/actions/runs/38847751 / 77ca466 If they get rejected then I'll have to change the version string for each release.
All rejected. New attempt: 37a3eed edit: I don't know how but I managed to break something again.
New attempt is accepted and building, except on xenial:
The following packages have unmet dependencies:
sbuild-build-depends-djgpp-toolchain-dummy : Depends: debhelper (>= 11) but it is not going to be installed
E: Unable to correct problems, you have held broken packages.
apt-get failed.
Package installation failed
debhelper 11 is required because the debian/*.install files don't work on earlier versions.
Is debian/install not enough?
The .install files (and .manpages, *.info) split the archive into multiple binaries: djgpp, gcc-djgpp, binutils-djgpp, etc. On earlier versions they have a different format, and debhelper only looks for files in the current dir (not DESTDIR).
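As an illustration, a debhelper .install file is just a list of glob patterns selecting which staged files go into that binary package. These paths are hypothetical, not the actual package contents:

```
# debian/gcc-djgpp.install -- hypothetical paths
usr/bin/i386-pc-msdosdjgpp-gcc*
usr/bin/i386-pc-msdosdjgpp-g++*
usr/lib/gcc/i386-pc-msdosdjgpp/
```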
I think for now the easiest option is to use the copy function on launchpad and copy the binaries from bionic to xenial. We will have to wait for the bionic binaries to be published first.
Deprecating xenial is an option too. I already have problems with Bionic, and even with Disco, not to speak of xenial.
I think that's a much better option :+1:
All packages have been published now. Let me know if everything works for you.
No, it doesn't.
LC_ALL=C i386-pc-msdosdjgpp-gcc -Wall -O2 -finline-functions -Wmissing-declarations -c command.c -o command.o 2>&1 |more
/tmp/ccsXc9io.s: Assembler messages:
/tmp/ccsXc9io.s:24: Error: invalid instruction suffix for `push'
/tmp/ccsXc9io.s:65: Error: invalid instruction suffix for `pop'
/tmp/ccsXc9io.s:71: Error: invalid instruction suffix for `push'
...
I installed gcc-djgpp and it took binutils-djgpp via deps, but I suspect it ends up using the wrong assembler.
And if I would have to guess, I'd say it may be because of the 1-stage hack you applied.
$ i386<tab>
i386 i386-pc-msdosdjgpp-gcc-9.2.0
i386-pc-msdosdjgpp-c++ i386-pc-msdosdjgpp-gcc-ar
i386-pc-msdosdjgpp-cpp i386-pc-msdosdjgpp-gcc-nm
i386-pc-msdosdjgpp-djasm i386-pc-msdosdjgpp-gcc-ranlib
i386-pc-msdosdjgpp-dxe3gen i386-pc-msdosdjgpp-gcov
i386-pc-msdosdjgpp-dxe3res i386-pc-msdosdjgpp-gcov-dump
i386-pc-msdosdjgpp-dxegen i386-pc-msdosdjgpp-gcov-tool
i386-pc-msdosdjgpp-exe2coff i386-pc-msdosdjgpp-setenv
i386-pc-msdosdjgpp-g++ i386-pc-msdosdjgpp-stubedit
i386-pc-msdosdjgpp-g++-9.2.0 i386-pc-msdosdjgpp-stubify
i386-pc-msdosdjgpp-gcc
Isn't there supposed to be *-as?
Okay that's weird. It looks like binutils-djgpp didn't get installed at all? It works on my end:
What ubuntu version are you on? Although I don't think that should make any difference...
Do you have other binutils files? eg /usr/i386-pc-msdosdjgpp/lib/ldscripts/*
You need to get back to i586 it seems.
I really dunno what was the reason to provide binutils-djgpp but not djgpp itself though. We can ask @skitt about that.
Huh. Why does that exist if there is no gcc-djgpp or libc?
You need to get back to i586 it seems.
I can rename my packages to djgpp-binutils or binutils-djgpp-i386 or something.
You need to rename it, and also add a "Conflicts" control field. This is weird though.
Or you can go back to i586 and work with the existing tools.
I really dunno what was the reason to provide binutils-djgpp but not djgpp itself though. We can ask @skitt about that.
The plan is to also provide GCC and the C library, but I haven’t finished the bootstrapping sequence necessary to get the packages into Debian yet.
I think I could add Conflicts: binutils-djgpp (>2). However, I was considering making separate source packages eventually, so each has its own version number and can be updated individually. That would then still conflict with binutils-djgpp in focal.
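In debian/control that would be one extra field per binary stanza; a hypothetical sketch (the version bound is chosen arbitrarily, and Replaces is usually paired with Conflicts when files overlap):

```
Package: binutils-djgpp-i386
Architecture: any
Conflicts: binutils-djgpp (>= 2)
Replaces: binutils-djgpp (>= 2)
Description: GNU binutils cross-built for the i386-pc-msdosdjgpp target
```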
Initially I did name the packages djgpp-*, but then I realized that other cross-tools are all of the form program-target, so I switched it around. Maybe I should ignore that and rename them back to djgpp-*.
i586 is not happening. We need i386 code for this target.
Hi @skitt. Do you think it's possible to change your binutils-djgpp to target=i386-pc-msdosdjgpp? That's what we're doing now, with i586-* symlinks for compatibility with other distributions.
The plan is to also provide GCC and the C library, but I haven’t finished the bootstrapping sequence necessary to get the packages into Debian yet.
But will they go into ubuntu focal when you are done? Because it seems not.
Hi @skitt. Do you think it's possible to change your binutils-djgpp to target=i386-pc-msdosdjgpp?
I can't speak for @skitt, but my guess is that the route via debian to ubuntu will be long, if at all possible within Focal.
Hi @skitt. Do you think it's possible to change your binutils-djgpp to target=i386-pc-msdosdjgpp? That's what we're doing now, with i586-* symlinks for compatibility with other distributions.
Yes, that’s definitely possible, I’ll do that ASAP.
Hi @skitt. Do you think it's possible to change your binutils-djgpp to target=i386-pc-msdosdjgpp?
I can't speak for @skitt, but my guess is that the route via debian to ubuntu will be long, if at all possible within Focal.
The Debian import freeze for 20.04 is on February 27, I doubt I’ll make it...
The Debian import freeze for 20.04 is on February 27, I doubt I’ll make it...
Is removing a package a faster option?
The Debian import freeze for 20.04 is on February 27, I doubt I’ll make it...
Is removing a package a faster option?
Yes, I can do that after the freeze (to avoid the package getting back in automatically). Couldn’t @jwt27’s packages work with the Debian package instead?
Couldn’t @jwt27’s packages work with the Debian package instead?
He uses i386 and you use i586, so not w/o hacks and symlinks.
Couldn’t @jwt27’s packages work with the Debian package instead?
If binutils-djgpp from Debian is compiled to target i386, then yes. In the meantime I could add a dummy package with symlinks to i586 binutils.
Alternatively if the whole toolchain ends up in focal (or the release after that) then my ppa only needs to support eoan and earlier, and can be deprecated after that.
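Such a dummy compatibility package would essentially just ship symlinks. The sketch below fakes the layout under /tmp/demo with stub files; the tool list and paths are assumptions, not the real package contents:

```shell
#!/bin/sh
# Create i586-* compatibility symlinks pointing at the i386-targeted tools.
# Everything is staged under /tmp/demo; the tool names are illustrative.
set -e
bindir=/tmp/demo/usr/bin
mkdir -p "$bindir"
for tool in as ld ar ranlib strip objcopy; do
    touch "$bindir/i386-pc-msdosdjgpp-$tool"   # stub standing in for the real tool
    ln -sf "i386-pc-msdosdjgpp-$tool" "$bindir/i586-pc-msdosdjgpp-$tool"
done
ls "$bindir"
```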
Couldn’t @jwt27’s packages work with the Debian package instead?
If binutils-djgpp from Debian is compiled to target i386, then yes. In the meantime I could add a dummy package with symlinks to i586 binutils. Alternatively if the whole toolchain ends up in focal (or the release after that) then my ppa only needs to support eoan and earlier, and can be deprecated after that.
The updated package is on its way to the Debian archive, it should be in Focal tomorrow or Saturday.
Do you still need this? I see you made more changes on your branch too. Maybe it's better to keep ubuntu PPA related changes on a separate branch.
I have some local changes here too that I should have committed months ago. This includes global command line parameters for things like --prefix= and --target=. That will probably cause merge conflicts on your branch. Also, I see your djgpp patches still haven't been merged into CVS. Do you want me to pull them in on djgpp-cvs? We could make that repo an unofficial dev branch of sorts.