Stichting-MINIX-Research-Foundation / minix

Official MINIX sources - Automatically replicated from gerrit.minix3.org

"Configuration i386-elf32-minix not supported" for port gcc6 #232

Open martinvahi opened 7 years ago

martinvahi commented 7 years ago

Version: 3.4.0 rc6

The detailed error message is shown in a screenshot (archival copy).

Reproduction

step_1)

After a fresh install

pkgin update
pkgin install git-base
pkgin install binutils
pkgin install clang
pkgin install bmake
pkgin install digest
pkgin_sets  

cd /usr
make pkgsrc 

step_2)

cd /usr
make pkgsrc-update

step_3)

cd /usr/pkgsrc/lang/gcc6
bmake install

At that point, the error message appears.
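Judging by the wording, the rejection looks like it comes from GCC's own target-selection (configure) logic rather than from pkgsrc itself. One rough way to locate the failing check after the sources have been extracted and patched (the grep pattern below is just a guess on my part):

cd /usr/pkgsrc/lang/gcc6
bmake patch                 # fetch, extract and patch the GCC sources only
grep -rn "not supported" work/ | grep -i configuration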

martinvahi commented 7 years ago

It seems to me that this is a general "show stopper". I tried to install Octave and ran into a similar issue (archival copy).

martinvahi commented 7 years ago

A similar error appears with the gcc5 port.

I remember a video stating that GCC is available on MINIX. Could someone please explain how to install GCC on MINIX? Thank you.

martinvahi commented 7 years ago

I found the following two references, which explain how GCC support for MINIX3 got dropped.

http://wiki.minix3.org/doku.php?id=usersguide:gnudevelopmenttools

https://groups.google.com/forum/#!topic/minix3/v9Z_Vx1j7wM

martinvahi commented 7 years ago

I tried "bmake install" for gcc5 in a freshly cloned

https://github.com/sambuc/pkgsrc-ng

but ended up seeing a similar error message.

sambuc commented 7 years ago

GCC in PKGSRC is not patched to support MINIX.

If you want it, you will have to port the patches we have in the source tree to the version(s) of your choice, and import them into PKGSRC.

I do not have the time to maintain 2 toolchains (LLVM & GCC) in basically two different environments (base system & PKGSRC) over multiple versions of each.

Patches are welcome.
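Very roughly, the pkgsrc side of that work would look something like the sketch below. The location of the MINIX GCC patches in the base source tree is only a placeholder here, not the exact path, and the patches themselves would still need to be adapted to the pkgsrc GCC version:

cd /usr/pkgsrc/lang/gcc6
# Copy the adapted MINIX-specific patches into the package
# (placeholder path; the real location in the MINIX source tree differs)
cp /usr/src/external/gpl3/gcc/patches/patch-* patches/
# Record the new patch checksums in distinfo
bmake makepatchsum
# Rebuild with the patches applied
bmake clean
bmake install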

sambuc commented 7 years ago

I do not see how not having GCC is a show stopper. Please provide more evidence as to why you have a hard requirement on GCC.

We have a C and C++ compiler (clang) readily available; it now comes bundled with the system, so you don't even need to install it using pkgin anymore.
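For example, on a stock 3.4.0 image something like this should already work, without installing anything through pkgin first (the file name is just an illustration):

cat > hello.c <<'EOF'
#include <stdio.h>
int main(void) { printf("hello from clang on MINIX\n"); return 0; }
EOF
clang -o hello hello.c    # cc should invoke the same compiler on a stock install
./hello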

martinvahi commented 7 years ago

Thank you for the answers, including the 2017_08_15 one at https://groups.google.com/forum/#!topic/minix3/YxG1V_Nihnw

Regarding the patches, I'll keep that in mind, but I probably won't be able to work on them before 2019 (about two years from now), because I really need to earn a living somehow, and as a freelancer I happen to specialize in serving people who pay for userland software, not operating system development. As of 2017, the entire operating system and distribution development side is a nasty burden for me, something I don't get paid for at all but have to look into because of quality issues, mainly reliability topics, in userland. In my view, security is a sub-topic of reliability.

Regarding the question of how a lack of GCC can be a show stopper, I think this question of yours vividly illustrates the 2017_08_14 statement by the person called Rune at

https://groups.google.com/forum/#!topic/minix3/YxG1V_Nihnw (archival copy)

where he or she (I cannot derive the gender from the name "Rune") points out that, quoting Rune's post:

Consider also that "Minix" is the kernel. Kernel developers do not usually live (figuratively) in the same world as the deployers, which makes it harder to understand each other. Naturally, nobody can clearly see the needs and challenges which one does not experience oneself.

In other words,

You (Sambuc) claim: "There's no need for GCC, because LLVM is sufficient."

I (Martin Vahi) claim: "As it turns out, many userland applications use
                       GCC-specific features, the GCC dialect of C/C++, or
                       something GCC-specific in their build system, and
                       upgrading GCC instead of upgrading a whole
                       plethora of userland components/packages/ports that
                       depend on GCC is a smaller amount of work than
                       upgrading that plethora of userland components."

Regarding the argument that

"people should not use compiler implementation specific features"

I generally agree, but some people DO use compiler-specific features, and others, who intentionally avoid them, do not test that their code compiles and works with other compilers. Dropping GCC is therefore a mild analogue of a case where an idealistic academic declares that, since Pascal is a more elegant systems programming language than C/C++, support for C/C++ compilers is dropped and people are asked to "port" their C/C++ applications to Pascal. That is to say, as a person who has spent years writing speed-optimized C++ (as of 2017 my C++ skills are rusty), I can indeed sympathize with the wish to use Pascal, apart from a few corner cases, but ordering masses of people to substantially rewrite software that works just fine with some other, less fancy, tool/compiler/library is clearly too much to ask.
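To give one concrete example of what I mean by a GCC dialect feature: nested function definitions are a GNU C extension that GCC accepts and clang rejects, so a package relying on them cannot simply be rebuilt with clang. A minimal illustration (the file name is arbitrary):

cat > nested.c <<'EOF'
int main(void) {
    int x = 1;
    /* Nested function definitions are a GNU C extension. */
    int add(int y) { return x + y; }
    return add(2) == 3 ? 0 : 1;
}
EOF
gcc -c nested.c      # GCC accepts the GNU C extension
clang -c nested.c    # clang reports an error for the nested function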

So, that's my "hard requirement on GCC". If there is not enough manpower to get GCC working on MINIX3, not to mention that I myself am totally useless in this context right now, then I think it is smarter to just add GCC to some TODO list at a high-priority position and admit that, as of 2017_08, in that narrow context the MINIX3 project is in a ditch, has a serious handicap, and is incapable of delivering to the majority of userland users, but maybe one day it will be fixed.

Regarding the "fire and forget" aspect of software maintenance, the statement that as operating systems and compilers evolve they have to be continuously upgraded and that this upgrade process is an everlasting battle, my answer is that "fire and forget" is the only thing I am able to afford. Actually, "fire and forget" is THE ONLY SOLUTION that a team of N developers can afford, because the alternative means that as the number of packages/ports increases, the number of ports/packages that need upgrading increases as well, and if the number of developers stays constant at N, then at some point the ports/packages that need continuous upgrades will be dropped.

The design pattern for implementing the "fire and forget" solution in userland, in library development, is that libraries are not released that often, but when they are released, ALL PUBLIC VERSIONS ARE AVAILABLE IN PARALLEL, allowing the client code of a library to migrate from one version to another AT ITS OWN PACE. At some point there can be a bloat-reduction effort, where many old versions of the libraries are thrown out of the library collection, but that can be done with a "migration period". In the Java world it was done by marking old components deprecated, which is a warning that at some point the component will be removed from the Java standard library. In the case of MINIX3 and GCC, the GCC port should just use a specific version of the MINIX3 POSIX library.
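As far as I can tell, pkgsrc already supports this kind of parallel availability for compilers: each lang/gccN package installs into its own sub-prefix, so several GCC versions can coexist and client packages can migrate at their own pace. Roughly (the exact prefixes are an assumption on my part, and on MINIX3 this of course presupposes that the builds work, which is exactly what this issue is about):

cd /usr/pkgsrc/lang/gcc5 && bmake install
cd /usr/pkgsrc/lang/gcc6 && bmake install
# Each version lands in its own prefix, for example:
#   /usr/pkg/gcc5/bin/gcc
#   /usr/pkg/gcc6/bin/gcc
/usr/pkg/gcc6/bin/gcc --version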

Regarding future-proofing the GCC porting work, i.e. reducing the number of upgrades to the GCC package/port: in the VERY LONG RUN the activity of customizing compilers to CPU types and operating systems will be ELIMINATED TOTALLY, because CPU manufacturers need a formal specification of their CPU anyway, usually in the form of tests, including formal-verification-based tests, and in the form of a hardware description language, which is "compiled" to FPGA bitstreams, semiconductor foundry masks, etc.

With the exception of "Java CPUs" and the Forth-related GreenArrays microcontrollers, which are stack machines, all CPUs have been RAM machines. Basically, they are all a set of memory locations, be it directly in RAM or in some special region of "RAM", the CPU registers, with some operations defined between the memory locations. Sometimes a memory region is a bit wider, like the MMX CPU registers and the like; sometimes it is smaller, like the 1-byte CPU registers in 8-bit microcontrollers. From a theory-of-computation point of view, one might skip the traditional on-chip CPU registers altogether and allocate some regions of RAM to play the role that CPU registers play in 2017. As a matter of fact, the Raspberry Pi has the CPU chip and the RAM chip literally stacked on top of each other, so the physical distance from the CPU die to the RAM die is less than 1 mm, excluding the "horizontal" wires.

As the use of FPGAs becomes more frequent, the CPU architecture, the specific set of registers, etc. might become even less relevant, and the CPU manufacturer could accompany its CPUs with some JSON/XML/YAML description that is loaded by the operating system build system and by the compiler. CPU manufacturers would probably save tons of money by investing in such auto-porting capability, not to mention that "time to market" would decrease to basically zero: the moment the chips are shipped, all major compilers and operating systems are already ported without any effort from the operating system developers and compiler developers. Less human effort, fewer bugs. Porting an operating system and porting a compiler (GCC/LLVM) might come down to telling the compiler and the operating system build system the path of the file that describes the other components: the CPU, the OS, whatever else. No more manual fiddling.

Secondly, as hardware becomes cheaper than food for human slaves, it is perfectly acceptable to just have 1024 CPUs do some extra work. If the CPU cores, the RAM machines, are a bit outdated or nonoptimal, then it's OK to just add another batch/chip of 1024 CPUs to the mix. That is what has been going on in mobile devices for the past 20 years, where my ~16 MHz Palm V offered basically all of the applications that a 2017 "smart"-phone with four 1 GHz 64-bit cores offers, except a color screen, GPS, video, web surfing, an internet connection, taking phone calls, and sending/receiving SMS/mail/tweets/etc.

As a matter of fact, I want to do most of my systems programming in ParaSail. So far my only/main packaging effort has been repackaging the ParaSail compiler, but I have not done any serious work on porting it. It just so happens that, due to the ParaSail effort, I know that the ParaSail compiler is written in Ada, which uses the GCC-based GNAT as its compiler. Therefore, no GCC means no Ada, which means no ParaSail. Yet another "hard requirement" for GCC. I haven't studied or tested how things are with the OCaml language, but I do remember that some compiler, maybe OCaml, maybe Go, needed an older version of itself to bootstrap. Maybe it was the same case with Java. The circular dependency can include build tools, i.e. the build tools of Java might use Java, which requires its build tools, and so on. (I'm not sure it was Java, I may be mistaken, but I certainly gave up compiling the Java SDK back in the day when I was interested in Java.)

Thank you for your answers and for reading my comments.

martinvahi commented 7 years ago

Actually, philosophically speaking, one huge advantage that GCC has over LLVM, despite the technical superiority of LLVM, is that GCC has survived financial winters, whereas LLVM stays alive largely due to quite huge flows of money from Apple and others.

The idea is that, as a freelancer, I cannot afford to redo my work, which means that my work needs to be USABLE for a long period of time, which in turn means that the dependencies of the software I create need to be available at least as long as my software is in use, which is long-term. Even very well funded projects lose their funding at some point, and my conclusion is that only projects that are developed with ZERO BUDGET survive long-term. LLVM has a bit "too much money", a bit like the Microsoft-funded C#/Mono has.

The death of VRML demonstrates that an open standard combined with relatively good financing from multiple commercial entities is not enough for a technology to survive. VRML was not even technically superseded by anything else. The modern options, WebGL and x3dom.org, offer practically nothing substantially new or revolutionary that VRML or 3D Java applets did not already offer. On the other hand, projects like GCC and Vim have survived just fine.

I emphasize that the popularity of an open source project alone will not save the project either. An example of a popular project that was/is open source but is practically dead the moment paid developers stop developing it is the NetBeans IDE. There was a period when NetBeans was probably the best IDE on planet Earth, yet when Oracle acquired Sun Microsystems' intellectual property, MySQL got forked/saved, but nobody wanted to fork the NetBeans IDE. My guess is that the popularity of the NetBeans IDE came mostly from application programmers who did not work on programming language development and who were passive users from an IDE development point of view.

I have written more about my subjective preferences for my technology stack in one of my blog posts (archival copy).