gabriel-laddel closed this issue 9 years ago
I tried building SBCL-1.2.5, cutting out the SB-POSIX tests until I can figure out something better.
This is SBCL 1.2.5.53-c787e53-dirty, an implementation of ANSI Common Lisp.
More information about SBCL is available at <http://www.sbcl.org/>.
SBCL is free software, provided as is, with absolutely no warranty.
It is mostly in the public domain; some portions are provided under
BSD-style licenses. See the CREDITS and COPYING files in the
distribution for more information.
fatal error encountered in SBCL pid 30324(tid 140737354086144):
can't load .core for different runtime, sorry
Welcome to LDB, a low-level debugger for the Lisp runtime environment.
ldb>
I'm betting that this is NixOS-related. There were issues trying to run an XULRunner binary too, for whatever it's worth.
Some notes:
1) For the overall design of the language and concepts, Eelco's papers are (for better or worse) the best place to look, I think. They describe the language, its motivations, and NixOS rather thoroughly. Some good starting points are Nix: A Safe and Policy-Free System for Software Deployment, and NixOS: A Purely Functional Linux Distribution.
It's been known for a while that unfortunately documentation is lacking on a few of these fronts. It's absolutely something we need to fix (I too think the NixOS on-boarding and beginner material is fairly sub-par).
2) Related to the above, there is no true BNF describing the Nix parser, unfortunately. This was a question raised recently in an attempt to write another Nix implementation in Haskell. A proper BNF is something that should be easy to add to the documentation somewhere.
3) I am confused by your inquiry about libraries being available. libx11, libxau, etc. are all readily available in Nixpkgs (if they weren't, many other things would not be either). Your query of ls /nix/store does not tell you anything useful or meaningful. You must query the actual repository, which is described using Nix. For example:
$ nix-env -qa | egrep -i '(libx11|libxau)'
libX11-1.6.2
libXau-1.0.8
In your paste of the Emacs default.nix file, one dependency is specified as x11. This is not the same as libX11 - indeed, x11 is actually a meta-package which encompasses the modular Xorg framework and many libraries, including libXext, libXft, libX11, etc. You may determine this by looking at what the name x11 is bound to in pkgs/top-level/all-packages.nix and seeing what other packages it incorporates.
That said, I'm afraid I cannot help you with SBCL or StumpWM. I use neither. As it stands today, the NixOS community is structured such that all Nix users are, in essence, de-facto Nix developers. This community is still so small that this is almost a requirement if you are to use it regularly. In return, most of us believe the benefits are massive enough to deal with that. But it is not true of everyone. That is an understandably painful reality, but it's a reality we cannot yet change. Hopefully in the future we can have enough users and dedicated maintainers to lift this burden from the userbase, and StumpWM and SBCL and X number of other things will just work for anyone who wants them, with no effort.
If a package is broken or outdated or unmaintained (which happens with any distro), it almost always falls to a user who wishes to use that software to maintain it, I'm afraid. Have you tried asking on the #nixos channel on freenode? I've found many helpful people there in my time who may be willing to assist you in getting a more recent SBCL and StumpWM working.
Oh, and one final thing about your 'editor's note' (which I'll quote to ensure everyone can read it for the future):
# Fix the tests [editor's note: 'fix' the tests by removing them? fuck you.]
If you wish to behave this way, I'm afraid my impression is that your mind about how to conduct yourself in public spaces is fairly made up - and no amount of honest actors will change that behavior or attitude.
I can understand frustration at using new software and being perplexed by it. But your tone in here is pretty clear. And I imagine those aren't attitudes or tone we want anyway - besides, nothing brought you here beyond your own desires.
So, if that is the case, maybe it would be best for you to find a community that does appreciate that sort of attitude - perhaps the Common Lisp community does. Or the Xbox Live community, perhaps.
1) For the overall design of the language and concepts, Eelco's papers are (for
better or worse) the best place to look, I think. They describe the language,
its motivations, and NixOS rather thoroughly. Some good starting points are Nix:
A Safe and Policy-Free System for Software Deployment, and NixOS: A Purely
Functional Linux Distribution.
For worse.
A proper BNF is something that should be /easy/ to add to the documentation somewhere.
I would use the term "tedious bullshit" to describe this task, but whatever, moving on.
3) I am confused by your inquiry about libraries being available. libx11,
libxau, etc are all readily available in Nixpkgs (if they weren't, many other
things would not be either). Your query of ls /nix/store does not tell you
anything useful or meaningful. You must query the actual repository, which is
described using Nix. For example:
$ nix-env -qa | egrep -i '(libx11|libxau)'
libX11-1.6.2
libXau-1.0.8
I'm an idiot, thank you.
In your paste of the Emacs default.nix file, one dependency is specified as
x11. This is not the same as libX11 - indeed, x11 is actually a meta-package
which encompasses the modular Xorg framework and many libraries, including
libXext, libXft, libX11, etc etc. You may determine this by looking at what the
name x11 is bound to in pkgs/top-level/all-packages.nix and seeing what other
packages it incorporates.
I'll look into this now.
That said, I'm afraid I cannot help you with SBCL or StumpWM. I use neither. As
it stands today, the NixOS community is structured such that all Nix users are,
in essence, de-facto Nix developers. This community is still so small that this
is almost a requirement if you are to use it regularly. In return, most of us
believe the benefits are massive enough to deal with that. But it is not true of
everyone. That is an understandably painful reality, but it's a reality we
cannot yet change. Hopefully in the future we can have enough users and
dedicated maintainers to lift this burden from the userbase, and StumpWM and
SBCL and X number of other things will just work for anyone who wants them,
with no effort. If a package is broken or outdated or unmaintained (which
happens with any distro), it almost always falls to a user who wishes to use
that software to maintain it, I'm afraid.
WIP, K.
Have you tried asking on the #nixos channel on freenode? I've found many helpful
people there in my time who may be willing to assist you in getting a more
recent SBCL and StumpWM working.
curl http://nixos.org/irc/logs/log.20141112 | grep gabriel_laddel
Oh, and one final thing about your 'editor's note' (which I'll quote to ensure
everyone can read it for the future):
# Fix the tests [editor's note: 'fix' the tests by removing them? fuck you.]
If you wish to behave this way, I'm afraid my impression is that your mind about
how to conduct yourself in public spaces is fairly made up - and no amount of
honest actors will change that behavior or attitude.
(dolist (k '("define remove" "define fix" "define test")) (google k))
I can understand frustration at using new software and being perplexed by it.
(google "define perplexed") ;=>
per·plexed/pərˈplekst/
adjective
completely baffled; very puzzled.
per·plex/pərˈpleks/
verb
(of something complicated or unaccountable) cause (someone) to feel
completely baffled.
There is nothing perplexing about Nix.
But your tone in here is pretty clear. And I imagine those aren't attitudes we
want anyway
A person with the handle thoughtpolice doesn't want anyone to use mean words or to be critical of a project he is involved with? Color me shocked.
besides, nothing brought you here beyond your own desires.
...
So, if that is the case, you should find a community that does appreciate that
sort of attitude - perhaps the Common Lisp community does. Or the Xbox Live
community, perhaps.
sizzle
Thanks for the help.
A person with the handle thoughtpolice doesn't want anyone to use mean words or to be critical of a project he is involved with?
No, actually. It's because I'm a developer who works on other open source projects (several for my job), and realistically I have better things to do than deal with annoying people telling me and other people in the projects I work on to go fuck themselves - which substantially decreases my motivation to deal with said project in the first place. Indeed, my respect lies with the NixOS developers, and not you. So you'll have to forgive me if I find you annoying, but that's life sometimes.
This should be pretty plainly obvious (as opposed to your quite rude assumption of my motivations and presumably willful misinterpretation of it, as if it would help your position). But apparently it needs to be spelled out to some people who are seemingly incapable of understanding these aspects of social interaction and why their behaviors might be considered harmful. Which I believe are pretty clearly alluded to in my original posting.
Of course, I'm not about to succumb to such belittling tactics by people like you - that would only prove such tactics work to drive people away, ultimately harming everyone. So I'm afraid the feared Orwellian Thought Police of NixOS™ is here to stay -- much to your great despair, I imagine. Be afraid - be very afraid!
That said, this bug is a bit convoluted. Really there are several things going on here, based on your original description:
1) The emacs package is missing several things, including:
PDF/postscript viewing doesn't work.
2) Dependencies for Emacs are unclear, although as I explained before, we should have all of these readily available. They merely follow different conventions for naming/meta-packaging. If something is missing, it can certainly be upstreamed.
3) SBCL tests should not be nerfed; furthermore they should likely be integrated into the build process on Hydra. How/if this can be fixed depends on the upstream project and the tests in question; while many projects do enable tests (by specifying doCheck = true; in their default.nix), this isn't always easily doable. I'm afraid I don't know enough about the specific tests in question here to comment.
4) SBCL dynamic dependencies/cffi libraries need to be properly managed in some way. There are a slew of other packages that manage their own subpackages in a similar way; Emacs is one of them, in fact. The OCaml and Haskell package sets are others. In short, we tend to write environment wrappers which properly 'set up' the environment for a specific package.
For example, when we use GCC and specify a library like libxml2 in the buildInputs of an executable, this is actually translated into a set of paths in which GCC will look for libraries - this way, #include directives work properly, as do flags like -lxml2. These paths are provided via environment variables - and GCC is actually a shell script that wraps the real GCC with the proper flags.
This explains why we have packages like clang-wrapper and so on. We would probably need something like an sbcl-wrapper tool in the same vein, which when invoked will properly set up an environment in which the real SBCL executable is able to dlopen() the right libraries (assuming of course there's some flag/option to control the base directory where the dlopen occurs and where SBCL searches for shared objects in the first place).
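The wrapper pattern can be sketched as a pair of tiny scripts. This is a toy illustration only: the store path, file names, and `sbcl.real` naming below are all made up for the demo, and real Nixpkgs wrappers are generated by the build machinery rather than written by hand.

```shell
# Stand-in for the real SBCL binary: just reports what search path it sees.
mkdir -p /tmp/demo/bin
cat > /tmp/demo/bin/sbcl.real <<'EOF'
#!/bin/sh
echo "search path: $LD_LIBRARY_PATH"
EOF
chmod +x /tmp/demo/bin/sbcl.real

# The wrapper: pin the dynamic-loader search path to a known (fake) store
# path, so dlopen() would find store-resident libraries, then hand off all
# arguments to the real executable.
cat > /tmp/demo/bin/sbcl <<'EOF'
#!/bin/sh
LD_LIBRARY_PATH="/nix/store/example-libfixposix/lib"
export LD_LIBRARY_PATH
exec "$(dirname "$0")/sbcl.real" "$@"
EOF
chmod +x /tmp/demo/bin/sbcl

/tmp/demo/bin/sbcl   # prints: search path: /nix/store/example-libfixposix/lib
```

A real wrapper would append to any existing search path and point at actual store paths; the point is only that the executable the user invokes is a script that fixes up the environment before exec'ing the real binary.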
Then, individual SBCL packages would go into their own namespace, such as nixpkgs.sbclPackages, where people could select the sub-libraries they want. We have quite a few toolchains that follow this exact scenario.
5) StumpWM and SBCL are outdated it seems, and could do with a working update.
Really though, these are all separate issues in a way, so I would suggest you file them each in a ticket so they can be triaged and more easily tracked by developers.
Having a meta-ticket is fine in general AFAIK (maybe 'overhaul SBCL support' which most of the issues would fall under), although keeping separate issues for the separate subtasks makes it easier to manage them and address individual concerns.
Of course, I'm not going to do this for you, but I'd suggest that it's probably the easiest way to at least have your complaints looked at (as opposed to posting a very long Emacs buffer - the syntax highlighting actually makes it rather painful to read).
Thanks for the help.
Don't worry about it - although you won't be getting any from me in the future. Although I will certainly return and be sure to call you out on shitty behavior should I see it. After all, what kind of Thought Police™ would I be if I didn't? I'd be letting you down, what with the expectations you set for me!
This issue is fascinating, I've never seen a bug report written in LISP before. I'm also a little perplexed by the general tone and sense of entitlement, but anyway... I'd like to address some of your points, @gabriel-laddel.
I see that you try to build Emacs from the command line. That's not how it is done in NixOS, as that approach is stateful and not repeatable. Either you clone nixpkgs and change the Nix expressions already there, or you use packageOverrides in your ~/.nixpkgs/config.nix (see http://lethalman.blogspot.it/2014/11/nix-pill-17-nixpkgs-overriding-packages.html).
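A packageOverrides stanza in ~/.nixpkgs/config.nix looks roughly like this. A hedged sketch only: `packageOverrides` and `.override` are real Nixpkgs mechanisms, but the particular argument shown (`withX`) is an assumption for illustration, not necessarily one the Emacs derivation of that era accepts.

```nix
{
  packageOverrides = pkgs: {
    # Replace the stock Emacs with one built with different arguments.
    # 'withX' is a hypothetical flag; check the derivation's actual
    # arguments in pkgs/top-level/all-packages.nix before using it.
    emacs = pkgs.emacs.override {
      withX = true;
    };
  };
}
```

After this, nix-env -i emacs (or a nixos-rebuild, if set system-wide) builds the overridden derivation instead of editing anything by hand from the command line.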
You asked 1 question on #nixos and then logged off when you got no replies? Perhaps the nix-dev mailing list would be more suited to your IRC style.
In general I would say there is a passion for code quality and statelessness in NixOS; just go look at all the comments on issues asking to change this or that. You seem to have gotten the reverse impression from the learning curve and Emacs. Nothing I can do about that, just interesting. If you like the Nix concepts you could implement StumpWM the way you like it, or you could just walk away.
I don't know what causes your SBCL error. The guys in the LISP channel are correct, when you build something for Nix you fix all inputs, and if something doesn't work like that you wrap it so it only sees the part of the world that you want.
Oh, those sb-posix test failures are also perplexing. Expecting 13, getting /? Wow. Of course, just turning them off was not the right thing to do.
Anyway.
Oh and hats off to @thoughtpolice for his detailed and gracious answers.
Oh, and I will leave one mention about the IRC channel: I am not sure in what timezone the OP resides, but realistically from my time contributing I've noticed NixOS has a very large European userbase and a much smaller "not European" userbase. Given the assumption the logs are timestamped to a European timezone, it's perhaps not surprising most people weren't available in the early morning hours.
So knowing nothing about OP, I imagine this could possibly have some impact on the availability of people to respond. As an American myself, this too has caught me off guard. So yes, the nix-dev mailing list may actually be a better place to formulate longer discussions, depending on your availability.
Also, currently NixPkgs has a working StumpWM 0.9.8 package (previous release) compiled using SBCL-1.2.5 (which is also in NixPkgs).
Some of SBCL tests make assumptions about the base system; the others are run during the SBCL build: http://hydra.nixos.org/build/16738664/log/raw (look for ACLREPL-TESTS)
@gabriel-laddel:
eg, where is the BNF form for the Nix language?
There's an SDF representation of the syntax in @edolstra's PhD thesis starting on page 64, though I'm not sure if that would be entirely up to date. It can also be seen as acting as a design document, and it lays out the thought and central metaphors behind Nix/OS pretty clearly.
@Shados actually, you could say that https://github.com/NixOS/nix/blob/b6809608cc467925db44b1eb435095c37e433255/src/libexpr/parser.y#L298-L523 (Bison configuration) is a very strict, machine-readable BNF form... I wonder if there's a tool that takes Bison input and generates a human-readable grammar.
I've seen everyone's responses and will be responding in due time.
Relevant: apparently you can get an almost-BNF out of bison with -v: http://stackoverflow.com/questions/19477103/extract-bnf-grammar-rules-from-yacc-file
FWIW, here's the grammar. The tokens are defined in https://github.com/NixOS/nix/blob/master/src/libexpr/lexer.l#L80.
0 $accept: start $end
1 start: expr
2 expr: expr_function
3 expr_function: ID ':' expr_function
4 | '{' formals '}' ':' expr_function
5 | '{' formals '}' '@' ID ':' expr_function
6 | ID '@' '{' formals '}' ':' expr_function
7 | ASSERT expr ';' expr_function
8 | WITH expr ';' expr_function
9 | LET binds IN expr_function
10 | expr_if
11 expr_if: IF expr THEN expr ELSE expr
12 | expr_op
13 expr_op: '!' expr_op
14 | '-' expr_op
15 | expr_op EQ expr_op
16 | expr_op NEQ expr_op
17 | expr_op '<' expr_op
18 | expr_op LEQ expr_op
19 | expr_op '>' expr_op
20 | expr_op GEQ expr_op
21 | expr_op AND expr_op
22 | expr_op OR expr_op
23 | expr_op IMPL expr_op
24 | expr_op UPDATE expr_op
25 | expr_op '?' attrpath
26 | expr_op '+' expr_op
27 | expr_op '-' expr_op
28 | expr_op '*' expr_op
29 | expr_op '/' expr_op
30 | expr_op CONCAT expr_op
31 | expr_app
32 expr_app: expr_app expr_select
33 | expr_select
34 expr_select: expr_simple '.' attrpath
35 | expr_simple '.' attrpath OR_KW expr_select
36 | expr_simple OR_KW
37 | expr_simple
38 expr_simple: ID
39 | INT
40 | '"' string_parts '"'
41 | IND_STRING_OPEN ind_string_parts IND_STRING_CLOSE
42 | PATH
43 | SPATH
44 | URI
45 | '(' expr ')'
46 | LET '{' binds '}'
47 | REC '{' binds '}'
48 | '{' binds '}'
49 | '[' expr_list ']'
50 string_parts: STR
51 | string_parts_interpolated
52 | %empty
53 string_parts_interpolated: string_parts_interpolated STR
54 | string_parts_interpolated DOLLAR_CURLY expr '}'
55 | STR DOLLAR_CURLY expr '}'
56 | DOLLAR_CURLY expr '}'
57 ind_string_parts: ind_string_parts IND_STR
58 | ind_string_parts DOLLAR_CURLY expr '}'
59 | %empty
60 binds: binds attrpath '=' expr ';'
61 | binds INHERIT attrs ';'
62 | binds INHERIT '(' expr ')' attrs ';'
63 | %empty
64 attrs: attrs attr
65 | attrs string_attr
66 | %empty
67 attrpath: attrpath '.' attr
68 | attrpath '.' string_attr
69 | attr
70 | string_attr
71 attr: ID
72 | OR_KW
73 string_attr: '"' string_parts '"'
74 | DOLLAR_CURLY expr '}'
75 expr_list: expr_list expr_select
76 | %empty
77 formals: formal ',' formals
78 | formal
79 | %empty
80 | ELLIPSIS
81 formal: ID
82 | ID '?' expr
There has been no activity for a couple of weeks, so I'll close the issue. If there are actionable items hidden in this thread somewhere, I guess it would be better to create separate, new tickets for those.
I'll note that I've drafted my response to all unanswered messages but am busy and don't know when I'll have time to finish it off.
Closing the issue is fine with me.
@gabriel-laddel just paste your draft? Otherwise you'll never come back to it...
I don't publish drafts. I'm in the middle of something at the moment and will respond over the weekend.
:-) reminds me of classical Cathedral vs. Bazaar dilemma.
Wasn't able to finish over the weekend and am busy for the next two days. Best guess is wed/thurs.
Note1: I've read NixOS: A Purely Functional Linux Distribution and Nix: A Safe and Policy-Free System for Software Deployment, but not the associated PhD thesis. The contents listing didn't indicate that it'd introduce me to any new ideas.
Note2: I've ignored Forth and APL derivatives in this comment. Anyone familiar with them will understand why upon reading it.
Note3: I posted and deleted this approximately ~5min ago - sorry, using ``` as a blockquote didn't work as expected.
Fundamentally, package management is composed of graph traversals and transformations. The goal that separates NixOS from other package managers, reproducible builds, can be rendered as "the ability to save and restore unix's dependency graph". This is a neat idea, but the implementation merely trades one set of problems for another. Many crucial design decisions that should have been thoroughly discussed were completely ignored. For example, nowhere in the design documents is it stated that every(?) programming language already has its own package manager(s) with its own idiosyncrasies that for the most part don't share concepts or code with the others. There are many good questions that originate from this fact, were one to acknowledge it. Examples: at what point is it appropriate to leave package management up to a language's ecosystem? Is there ever a reason to? Can maintainers be convinced that the 'nixos way' is a good one and that packaging their code with a flag of some sort that produces a nix-friendly build script is a reasonable request? Are we confident enough in Nix to ask them to spend their time on it? Given that the NixOS plan is to modify build scripts and sources to its liking, what is the proper way to go about doing source-to-source transformations? Are there commercial offerings that solve this problem? The failure to discuss these issues is prototypical of the Nix design documents and documentation.
From NixOS: A Purely Functional Linux Distribution,
Runtime dependencies In Nix, we generally try to fix runtime dependencies at build time. This means that while a program may execute other programs or load dynamic libraries at runtime, the paths to those dependencies are hard-coded into the program at build time. For instance, for ELF executables, we set the RPATH in the executable such that it will find a statically determined set of library dependencies at runtime, rather than using a dynamic mechanism such as the LD_LIBRARY_PATH environment variable to look up libraries. This is important, because the use of such dynamic mechanisms makes it harder to run applications with conflicting dependencies at the same time (e.g., we might need Firefox linked against GTK 2.8 and Thunderbird linked against GTK 2.10). It also enhances determinism: a program will not suddenly behave differently on another system or under another user account because environment variables happen to be different. However, there is one case in NixOS and Nixpkgs of a library dependency that /must/ be overridable at runtime and cannot be fixed statically: the implementation of OpenGL to be used at runtime (libGL.so), which is hardware specific. We build applications that need OpenGL against Mesa, but add the impure (stateful) path /var/run/opengl-driver to the RPATH. The activation script symlinks that path to the actual OpenGL implementation selected by the configuration (e.g., nvidiaDriver) to allow programs to use it.
[...]
Build actions The Nix /model/ is that derivations are pure, that is, two builds of an identical derivation should produce the same result in the Nix store. However, in contemporary operating systems, there is no way to actually enforce this model. Builders can use any impure source of information to produce the output, such as the system time, data downloaded from the network, or the current number of processes in the system as seen in /proc. It is trivial to construct a contrived builder that does such things. But build processes generally do not, and instead are fairly deterministic; impure influences such as the system time generally do not affect the runtime behavior of the package in question.
There are however frequent exceptions. First, many build processes are greatly affected by environment variables, such as PATH or CFLAGS. Therefore we clear the environment before starting a build (except for the attributes specified in the derivation), and set variables such as HOME to a non-existent directory, because some derivations (such as Qt) try to read settings from the user's home directory.
Second, almost all packages look for dependencies in impure locations such as /usr/bin and /usr/include. Indeed, the undeclared dependencies caused by this behavior are what motivated Nix in the first place: by storing packages in isolation from each other, we prevent undeclared build-time dependencies. In five years we haven't had a single instance of a package having an undeclared build-time dependency on another package /in the Nix store/, or having a runtime dependency in the Nix store not detected by the reference scanner. However, with Nix under other Linux distributions or operating systems, there have been numerous instances of packages affected by paths outside the Nix store. We prevent most of those impurities through a wrapper script around GCC and ld that ignores or fails on paths outside of the store. However, this cannot prevent undeclared dependencies such as direct calls to other programs, e.g., a Makefile running /usr/bin/yacc.
Since NixOS has no /bin, /usr and /lib, the effect of such impurities is greatly reduced. However, even in NixOS such impurities can occur. For instance, we recently encountered a problem with the build of the dbus package, which failed when /var/run/dbus didn't exist.
As a final example of impurity, some packages try to install files under a location other than $out. Nix causes such packages to /fail deterministically/ by executing builders under unprivileged UIDs that do not have write permissions to store paths other than $out, let alone paths such as /bin. Such packages must then be patched to make them well-behaved.
To ascertain how well these measures work in preventing impurities in NixOS, we performed two builds of the Nixpkgs collection on two different NixOS machines. This consisted of building 485 non-fetchurl derivations. The output consisted of 165927 files and directories. Of these, there was only one /file name/ that differed between the two builds, namely in mono-1.1.4: a directory gac/IBM.Data.DB2/1.0.3008.37160_7c307b91aa13d208 versus 1.0.3008.40191_7c307b91aa13d280. The differing number is likely derived from the system time.
We then compared each file. There were differences in 5059 files, or 3.4% of all regular files. We inspected the nature of the differences: almost all were caused by timestamps being encoded in files, such as in Unix object file archives or compiled Python code. 1048 Emacs Lisp files differed because the hostname of the build machine was stored in the output. Filtering out these and other file types known to contain timestamps (such as a build process inserting the build time into a C string) suggests the remaining differences are timestamp-related as well. This hypothesis is strongly supported by the fact that of those, only 42 (or 0.03%) had different file sizes. None of these content differences have ever caused an observable difference in behavior.
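The comparison described above amounts to hashing both output trees and diffing the results. A minimal sketch of the method, using stand-in files rather than real build outputs:

```shell
# Two pretend build outputs: one file identical across builds, one
# carrying a timestamp-style impurity.
mkdir -p /tmp/build1 /tmp/build2
printf 'same contents\n' > /tmp/build1/a
printf 'same contents\n' > /tmp/build2/a
printf 'built at 12:00\n' > /tmp/build1/b   # differs between builds
printf 'built at 12:05\n' > /tmp/build2/b

# Hash each tree, then list the file names whose hashes differ.
(cd /tmp/build1 && sha256sum a b > /tmp/sums1)
(cd /tmp/build2 && sha256sum a b > /tmp/sums2)
diff /tmp/sums1 /tmp/sums2 | grep '^[<>]' | awk '{print $3}' | sort -u
# prints: b
```

The paper's experiment is this idea scaled up to 165927 files, plus manual inspection of why each differing file differed.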
How exactly does the reference scanner detect runtime dependencies? Regex and sed hackery? Or the correct way, parsing each language into its abstract syntax tree (ast), walking that and finding references to unix level dependencies (both static and dynamic)? I assume the former, as the latter is a great deal of work and would have been mentioned. Also, the only complete C99 parser (i.e., one that takes into account all the GCC extensions used in the linux kernel) I'm aware of is the haskell package Language.C.AST - and nix doesn't use haskell.
The problem with anything other than traversing a language's ast is that you're forever stuck with an approximation of the input language, resulting in false positives etc. This is not new information.
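For concreteness, the string-level approach under discussion can be sketched in a few lines of shell. This is an illustration of the technique, not Nix's actual scanner code; the file, the hash, and the package name are all fabricated for the demo.

```shell
# Fake build output with an embedded store path (think: an RPATH entry
# baked into an ELF binary).
printf 'ELF junk /nix/store/9acrn3p1zyn3wp2cpnarmjg9jmqs11wb-glibc-2.19/lib more junk' \
  > /tmp/fake-output.bin

# The "scanner": any /nix/store/<32-char-hash>-name substring in the raw
# bytes counts as a reference, regardless of what language or file format
# those bytes actually are.
grep -ao '/nix/store/[a-z0-9]\{32\}-[^/ ]*' /tmp/fake-output.bin
# prints: /nix/store/9acrn3p1zyn3wp2cpnarmjg9jmqs11wb-glibc-2.19
```

This finds references in any file format without parsing it - which is precisely the 90% trade-off at issue: a byte-level scan cannot distinguish a real dlopen() target from an incidental string, and it misses any reference not spelled out as a literal store path.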
Much of the New Jersey approach is about getting away with less than is necessary to get the /complete/ job done. E.g., perl is all about doing as little as possible that can approximate the full solution, sort of the entertainment industry's special effects and make-believe works, which for all practical purposes /is/ the real thing. Regular expressions is a pretty good approximation to actually parsing the implicit language of the input, too, but the rub with all these 90% solutions is that you have /no/ idea when they return the wrong value because the approximation destroys any ability to determine correctness. Most of the time, however, the error is large enough to cause a crash of some sort, but there is no way to do transactions, either, so a crash usually causes a debugging and rescue session to recover the state prior to the crash. This is deemed acceptable in the New Jersey approach. The reason they think this also /should/ be acceptable is that they believe that getting it exactly right is more expensive than fixing things after crashes. Therefore, the whole language must be optimized for getting the first approximations run fast.
-- http://www.xach.com/naggum/articles/3244575386963745@naggum.no.html
Notice that the paper says,
by storing packages in isolation from each other, we prevent undeclared build-time dependencies. In five years we haven't had a single instance of a package having an undeclared build-time dependency on another package /in the Nix store/, or having a runtime dependency in the Nix store not detected by the reference scanner.
but fails to mention how many people used NixOS over those 5 years, how many times undeclared build-time dependencies /outside/ of the Nix store were encountered, or how many packages were in the Nix store over this time period. Is this not relevant information? As Naggum stated above: "the rub with these 90% solutions is that you have /no/ idea when they return the wrong value because the approximation destroys any ability to determine correctness." If say, most of the NixOS users are NixOS developers (which they are) and the reference scanner doesn't walk the ast of the programs in question (which it doesn't), no reported failures means exactly nothing, because there will be many code paths not encountered by its users due to the sheer complexity of the software being interfaced with (e.g., how many NixOS +developers+ users use emacs docview? Not many, apparently).
There are other issues with these papers I'm going to skip over, such as the clearing of environment variables (the proper thing to do imo is to gensym them to prevent unwanted interactions and study how they're being used - filter for 'functional' builds...), failure to discuss why a new project is needed in the first place (why can't you just perform this computation via an extension to an existing project?), total wtf issues with the papers (who are they written for? One who needs the concept of "file system" explained is going to have nfi what emacs lisp or ~/what/all/the/slashes/signify) etc.
The heart of the matter is that unix (ostensibly[0]) plays host to many different philosophies of how to go about programming. Languages differ in their approach to build systems, package management, syntax and foreign function interfaces. To change the way many software packages are built in a semi-automated, non-ast-walking fashion is insane. Not to discuss this in the design documents is insane. NixOS has 46 contributors and the documentation is sufficiently general as to give each contributor his own wildly different interpretation of the scope of the project. As far as I can tell, were one to extrapolate from the given information to a set of concrete requirements we see that NixOS plans to rewrite the build scripts for every version of every project on unix. Again, this is insane. The correct thing to do in this situation is to realize the utter impossibility of the task that has been set forth and re-evaluate one's approach.[1]
Let's examine the reference scanner. What is the correct way to approach the problem? We know regular expressions cannot respect the semantics of a language and thus are (unless used as a part of a larger parser) inappropriate for meta-programming (source-to-source transformations). Thus, we must work by manipulating the ast of various languages, and then pretty printing the altered code into the language's concrete syntax.
Most languages don't offer the ability to do this, and if one examines e.g., Clang, he will see all sorts of nonsense about concrete/surface syntax, context free grammars and compiler technologies. This certainly can't all be necessary (hint: it isn't).
Consider the following mathematical expression:
(3 + 2) * 8 / 3 * 3^6
Fully parenthesizing yields:
(((3 + 2) * 8) / (3 * 3^6))
When computers execute programs, or humans mathematics, the order of operations must be taken into account. One way to notate the previous expression to remove such ambiguities present in mathematical notation is to move each operator to the front of its parenthesized list, with the remaining elements as its arguments. This is known as fully-parenthesized prefix notation.
(/ (* (+ 3 2) 8) (* 3 (^ 3 6)))
This notational scheme has a direct mapping to the program's ast, which can be rendered as
http://i.imgur.com/8TO5VcK.png
See the pattern? An s-expression (fully parenthesized prefix notation) is merely a serialization of the ast. Thus, manipulations of the sexpr correspond to manipulations of the ast. Languages with the vocabulary to manipulate the s-expression structure can manipulate themselves (their ast) just as easily as any other thusly structured information. It follows from this that all software 'tooling' and translation tasks are transparently tree traversals, e.g., find me all things that call this function, reference this variable, modify all build scripts to take into account the new &optional argument in the updated library, etc.[2]
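A minimal sketch of the claim, here in Python so it is self-contained (nested tuples standing in for the s-expression): evaluating the tree and querying it for uses of an operator are the same kind of traversal.

```python
from fractions import Fraction
import operator

# The s-expression (/ (* (+ 3 2) 8) (* 3 (^ 3 6))) as nested tuples.
ast_tree = ('/', ('*', ('+', 3, 2), 8), ('*', 3, ('^', 3, 6)))

OPS = {'+': operator.add, '*': operator.mul,
       '/': lambda a, b: Fraction(a) / b, '^': operator.pow}

def evaluate(node):
    """Recursively evaluate a (op, arg1, arg2) tree."""
    if not isinstance(node, tuple):
        return node
    op, *args = node
    return OPS[op](*map(evaluate, args))

def occurrences(node, op):
    """A 'find all uses of op' query is just another tree traversal."""
    if not isinstance(node, tuple):
        return 0
    return (node[0] == op) + sum(occurrences(child, op) for child in node[1:])

print(evaluate(ast_tree))          # 40/2187
print(occurrences(ast_tree, '*'))  # 2
```

Both functions are four-line walks over the same structure; nothing about "tooling" required a parser beyond the reader.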
For whatever reason, many find s-expressions distasteful and spend much of their lives attempting to add the same meta-programming facilities to ALGOL derived languages. They all have failed. Not because of any sort of mechanical failing of the computer, but the human's inability to fully comprehend the parsing and syntactical schemes they're able to create (Ruby's parser is 10k lines of C, Clang's is >100k loc. Note that the entirety of SBCL is ~300k loc, and has much fat that could be trimmed off - http://www.cliki.net/Lisp%20-%20Next%20Generation).
Scala is a notable failure in this regard. Watch this video:
http://www.youtube.com/watch?v=TS1lpKBMkgg
Pay attention to 37:39-42:50 and you'll get to see Paul Phillips flipping out over ir/asts (same thing!). He even states his plan for the next 25 years - attempt to solve a problem solved 50+ years ago (http://c2.com/cgi/wiki?LispOnePointFive).
In particular, I found these quotes quite pertinent.
"I want to programmatically generate asts and feed those" "Even though this is what everybody does it's kinda nuts, why is the canonical code representation a STRING?!"
(not everyone does this, just algol derivatives)
"The ast is going to be designed along side the VM" "I need a tight feedback loop on the thing that I'm working on right now"
Wait, like every common lisp compiler ever? 30+ years behind the times yo.
"the code that you look at, that ought to be a reflection of the AST. The canonical thing ought to be the tree, the code is a view of it.... It's trees that are fundamental, that's what we work with"
Lol. Gotcha.
"something not offered by our tremendously entangled compiler, which doesn't even have a clean parse tree. It's comical. Try to get back to the source from what you get out of the scala parser. To me, the minimum test of a parser is that it parses!"
Lol. 'Comical' is definitely the right word.
"modifiability is paramount. If it isn't straightforward to modify, it will never be any good. It will never be fast. It will never be correct. And it will eventually be replaced by something modifiable... after consuming as many hours as you feed it."
Again, 30+ yrs behind the times[3]: http://article.gmane.org/gmane.comp.java.clojure.user/34272
So what does all this have to do with NixOS?
Much of the complexity in NixOS originates in the failure to address why it isn't a trivial modification on an existing project. Were they to have honestly sized up the problem they'd end up at something like: people suck at both reading & programming, also asts are fundamental. They'd then realize that at some point you have to choose some languages and leave others, because not everyone has something to offer and writing fully compliant parsers for all the languages involved is about as entertaining as building pyramids. When one has decided on a particular scheme, one could set about implementing "the ability to save and restore parts of unix's dependency graph".
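That ability is, at its core, a transitive-closure traversal over the dependency graph. A toy sketch (the package names and dependency table are invented for illustration):

```python
# Hypothetical package -> direct dependencies table.
DEPS = {
    'firefox': ['gtk', 'glibc'],
    'gtk': ['glib', 'glibc'],
    'glib': ['glibc'],
    'glibc': [],
}

def closure(pkg, deps):
    """'Saving' a package means capturing its transitive dependency
    closure -- a plain depth-first graph traversal."""
    seen = []
    def visit(p):
        if p not in seen:
            seen.append(p)
            for d in deps[p]:
                visit(d)
    visit(pkg)
    return seen

print(closure('firefox', DEPS))  # ['firefox', 'gtk', 'glib', 'glibc']
```

Restoring is the same traversal in reverse: realize each element of the closure before anything that depends on it.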
But by failing to address this complexity NixOS inherits it, and in practice, adds to it by the 'invention' of /yet another/ language created for no particular reason. 800+ existing languages[4] and /none of them/ have the required properties? Nonsense. Allow me to anticipate your retorts:
"I need to know that my nix code is functional"
;; let's assume, for the sake of argument, that you only want the symbols
;; `functional-lol', `design-antipattern', and `the-type-gods-will-save-me',
;; plus keywords and strings, to occur in your 'language'. You know that the
;; functions these symbols describe are functional and thus, a valid program
;; is functional.
;;
;; here is how you validate a 'nix' ast.
(defun flatten (tree)
"Traverses the tree in order, collecting non-null leaves into a list."
(let (list)
(labels ((traverse (subtree)
(when subtree
(if (consp subtree)
(progn
(traverse (car subtree))
(traverse (cdr subtree)))
(push subtree list)))))
(traverse tree))
(nreverse list)))
(defvar myast '(functional-lol (design-antipattern "death")
(the-type-gods-wont-save-me "destruction and chaos! also, bunnies")))
(defun validate (ast)
  (every (lambda (k)
           (or (keywordp k)
               (and (symbolp k)
                    (member k '(functional-lol design-antipattern
                                the-type-gods-will-save-me)
                            :test 'eq))
               (stringp k)))
         (flatten ast)))
(validate myast) ; => nil
"I don't want to require a whole compiler!"
By failing to make use of existing programs NixOS not only adds a language (compiler/interpreter), but the requirement for an Emacs mode (+1 more for every other editor!), custom syntax highlighting, extra crap for auto-completion etc. etc. One should instead make use of what already exists. As it stands, SLOCCOUNT reports 26.3k loc divided across 7 languages being used to introduce another for the Nix project. This number of languages alone is enough to guarantee that no one will ever understand the whole codebase on a rolling basis and have enough time to do much else with their life.
An apt analogy to the natural languages springs forth: if we created them for the same reason so-called professional programmers create programming languages, each new subculture would require a whole new alphabet/glyph scheme, phonetic system, grammatical structures, etc. Though some surface features may be shared across languages, the underlying semantics they represent would change in unpredictable and often incompatible ways.[5]
"So, you want us to write lisp? What if something better comes along?"
You are faced with a choice - either learn from those who came before you, or spend the rest of your life fighting monstrosities of your own creation, attempting to eke out some order in all the chaos, be it the SDF/BNF format of a language's syntax or otherwise. In any case, if something better comes along, transforming your s-expression code into $POWERFUL_NEW_NOTATION will be a straightforward intern-level task[6].
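As a sketch of such a notation transform (Python again, purely for a self-contained illustration): turning the prefix tree from earlier back into fully parenthesized infix is a few-line tree walk.

```python
def to_infix(node):
    """Print a prefix-notation tree back out in a different concrete
    syntax (fully parenthesized infix) -- a small tree-to-string walk."""
    if not isinstance(node, tuple):
        return str(node)
    op, left, right = node
    return f"({to_infix(left)} {op} {to_infix(right)})"

tree = ('/', ('*', ('+', 3, 2), 8), ('*', 3, ('^', 3, 6)))
print(to_infix(tree))  # (((3 + 2) * 8) / (3 * (3 ^ 6)))
```

Swapping in any other target notation only changes the format string, not the traversal.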
Now to address some unanswered questions.
If you like the Nix concepts you could implement StumpWM the way you like it, or you could just walk away.
You don't say.
I'm also a little perplexed by the general tone and sense of entitlement but anyway... I'd like to address some of your points @gabriel-laddel .
You're in Mexico city and ask a bystander to direct you to the airport (your phone died, you lack a map and you're going to miss your plane unless you make it to the airport soon). He gives very specific instructions, which you write down and follow exactly. You reach the last instruction somewhere near the slums, missing your flight. As such, you swear loudly.
Is anyone entitled to correct directions? Of course not. I however, expect the /bare minimum/ of people giving correct directions and accurate descriptions of their projects, much like I expect my friends to brush their teeth regularly.
There's an SDF representation of the syntax in @edolstra's PhD thesis starting on page 64, though I'm not sure if that would be entirely up to date. It can also be seen as a design document, and lays out the thought and central metaphors behind Nix/OS pretty clearly.
If I say "jump in front of the bus on my mark, ready? 1, 2, 3 - JUMP!" it is clear what I'm asking/ordering, what is less clear is why I'd be asking/ordering someone to jump in front of a bus.
:-) reminds me of classical Cathedral vs. Bazaar dilemma.
There are very few classics in the field of computing. We are at the very beginning of this journey and it hasn't even begun to get interesting yet. IMHO, the Cathedral vs. Bazaar isn't in any way a dilemma or a classic. Linux was a failure 20 years ago, it is a failure today and any posturing otherwise is just that. See The UNIX-HATERS Handbook[0] for more information.
I find the contrast between these quotes accurately depicts the computing situation circa 2015.
The process of constructing instruction tables should be very fascinating. There need be no real danger of it ever becoming a drudge, for any processes that are quite mechanical may be turned over to the machine itself.
-- Turing, A. M., 1946, Proposed electronic calculator, report for National Physical Laboratory, Teddington
for the present-day darkness in the programmer's world the programmers themselves are responsible and nobody else.
-- Dijkstra
- Footnotes
[0] In actuality unix is mostly a pile of stupid, see: web.mit.edu/~simsong/www/ugh.pdf
[1]
Suppose you're trying to find the best way to structure your description of something. (Examples: choosing the structure for a computer program to perform some task; or choosing the structure for a theory of physics.)
What you hope to find is the natural structure of what you're describing — a structure that affords a really beautiful, simple description. When you strike the natural structure, a sort of resonance occurs, in which various subsidiary problems you may have had with your description just melt away, and the description practically writes itself.
But here's a problem that often occurs: You've got a structure that affords a pleasing approximate description. But as you try to tweak the description for greater precision, instead of the description getting simpler, as it should if you were really fine-tuning near the natural structure, instead the description gets more and more complicated. What has happened, I suggest, is that you've got a local optimum in solution space, instead of the global optimum of the natural structure: small changes in the structure won't work as well as the earlier approximation, and may not work at all, so fundamental improvement would require a large change to the structure.
-- http://fexpr.blogspot.com/2012/12/metaclassical-physics.html
[2] This scheme and the surrounding ideas are so fundamental that they've sometimes been referred to as "Maxwell's equations of software". You can read more about Lisp and s-expressions elsewhere so I'll not repeat it here.
http://www.defmacro.org/ramblings/lisp.html http://www.michaelnielsen.org/ddi/lisp-as-the-maxwells-equations-of-software/ http://norvig.com/lispy.html http://norvig.com/lispy2.html
[3] This information re. Scala is interesting beyond the technicalities. Typesafe, the Scala company (with Martin Odersky as Chairman and Chief Scientist), has received 31MM in funding. Coursera (who built everything in Scala) has received 85MM. I predict they'll eventually fail as they're out-competed by more intelligent adversaries, though both will (due to the amount of funding and high-profile people involved) probably limp along for years to come.
Last I checked, Coursera already had fungus growing on it: something about, "usg-sponsored studies find that coursera is better than starving in Africa"
[4] http://en.wikipedia.org/wiki/List_of_programming_languages
[5]
The corresponding lisp narrative feels 'organic'. Consider the SBCL (steel bank common lisp) compiler. It is descended from CMUCL python compiler and uses some of its code, changing it as circumstances require (and of course, any transformations that needed automation were but a tree traversal away). Naggum also had some related thoughts:
There is a simple and elegant answer to this question: Just learn Common Lisp well first. New languages are exciting to people who know mostly new languages, so learn an old language before you learn new ones and get out of the maelstrom that will drown you in ever new languages that add nothing at all except some miniscule additional feature from another language that someone needed to make a whole new language to implement because he did not know (Common) Lisp to begin with. A "new" language that differs from the rest of the crop by one or a couple features is proof positive that both what it came from and what it has become are mutations about to die. There are tens if not hundreds of thousands of such "languages" that people have invented over the years, for all sorts of weird purposes where they just could not use whatever language they were already using, could not extend it, and could not fathom how to modify its tools without making a whole new language. They never stopped to think about how horribly wasteful this is, they just went on to create yet another language called Dodo, the Titanic, Edsel, Kyoto-agreement...
-- Erik Naggum, http://www.xach.com/naggum/articles/3206985430398054@naggum.net.html
[6] This could be largely automated, but that is a discussion for another day.
Holy wall of text @gabriel-laddel!
I understand that you fully wanted to underline your points but I would have been happy with
Those are your points right, or am I misunderstanding them/missing some?
Package management was indeed not built in from the beginning. The current approach is to convert instructions to npm and friends into nix package specifications. AFAIK this is in place for Haskell, Ruby, NodeJS, Python, Go and maybe Perl, not sure. This works okay but still has problems when semantics clash. The build flow is something like "generate nix expressions for dependencies, store in nixpkgs tree next to package and create nix expression for the package". Because of idempotency it doesn't matter if dependencies are mentioned many times, if they result in the same inputs hash they will be reused.
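A sketch of the idempotency argument (simplified; real Nix derives store paths from a more elaborate hash over the full derivation, so names and hashing details here are illustrative only):

```python
import hashlib
import json

def drv_hash(name, inputs):
    """A derivation's identity is a hash over all of its inputs, so
    identical inputs always map to the identical store path.
    (Toy model; Nix's actual hashing scheme is more involved.)"""
    payload = json.dumps({'name': name, 'inputs': sorted(inputs)})
    return hashlib.sha256(payload.encode()).hexdigest()[:32]

# Two expression trees that both pull in the same dependency description:
a = drv_hash('libfoo', ['gcc-4.9', 'glibc-2.20'])
b = drv_hash('libfoo', ['glibc-2.20', 'gcc-4.9'])  # order doesn't matter
print(a == b)  # True -- the existing build is reused, not rebuilt
```

This is why mentioning a dependency many times across generated expressions costs nothing: equal inputs collapse to one store entry.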
You can think of this as a very lazy static evaluation of needed packages out of hundreds of thousands of third-party packages. Considering the alternative of putting Nix expressions in other package management systems, I think this works pretty well.
Regarding the hash checking, the observation of no problems with it still stands. Dependency detection is quite robust and has low impact if incorrect.
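For reference, my understanding is that the scanner works by searching build outputs for the hash components of candidate store paths, with no language parsing involved. A toy model (store paths invented for illustration):

```python
# Sketch of the reference-scanner idea: runtime dependencies are detected
# by scanning output bytes for the hash part of each candidate store path.
candidates = {
    'abc123': '/nix/store/abc123-glibc-2.20',
    'def456': '/nix/store/def456-openssl-1.0',
}

def scan_references(output: bytes):
    """Return every candidate store path whose hash occurs in the output."""
    return sorted(path for h, path in candidates.items()
                  if h.encode() in output)

binary = b'\x7fELF...rpath=/nix/store/abc123-glibc-2.20/lib...'
print(scan_references(binary))  # ['/nix/store/abc123-glibc-2.20']
```

The hashes are long random-looking strings, which is what makes false positives rare in practice even though the scan is purely textual.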
So yes, both those things are only 99.9% solutions, but they are gaining 9s each year. The 100% solution seems infeasible to me in both cases.
For the last point, I think http://sandervanderburg.blogspot.com/2012/11/on-nix-and-gnu-guix.html articulates the issues nicely. Personally I prefer the advantages Nix gives over what Guix gets via Scheme.
Thank you for posting your thoughts, I still believe that NixOS and ecosystem are awesome and in a good position going forward.
On Sun Feb 08 2015 at 6:51:59 PM Gabriel Laddel notifications@github.com wrote:
but fails to mention how many people used NixOS over those 5 years, how many times undeclared build time dependencies /outside/ of the Nix store were encountered or how many packages were in the Nix store over this time period. Is this not relevant information? As Naggum stated above: "the rub with these 90% solutions is that you have /no/ idea when they return the wrong value becuase the approximation destroys any ability to determine correctness." If say, most of the NixOS users are NixOS developers (which they are) and the reference scanner doesn't walk the ast of the programs in question (which it doesn't), no reported failures means exactly nothing, because there will be many code paths not encountered by its users due to the sheer complexity of the software being interfaced with (e.g., how many NixOS +developers+ users use emacs docview? Not many apparently).
There are other issues with these papers I'm going to skip over, such as the clearing of environment variables (the proper thing to do imo is to gensym them to prevent unwanted interactions and study how they're being used - filter for 'functional' builds...), failure to discuss why a new project is needed in the first place (why can't you just perform this computation via an extension to an existing project?), total wtf issues with the papers (who are they written for? One who needs the concept of "file system" explained is going to have nfi what emacs lisp or ~/what/all/the/slashes/signify) etc.
The heart of the matter is that unix (ostensibly[0]) plays host to many different philosophies of how to go about programming. Languages differ in their approach to build systems, package management, syntax and foreign function interfaces. To change the way many software packages are built in a semi-automated, non-ast-walking fashion is insane. Not to discuss this in the design documents is insane. NixOS has 46 contributors and the documentation is sufficiently general as to give each contributor his own wildly different interpretation of the scope of the project. As far as I can tell, were one to extrapolate from the given information to a set of concrete requirements we see that NixOS plans to rewrite the build scripts for every version of every project on unix. Again, this is insane. The correct thing to do in this situation is to realize the utter impossibility of the task that has been set forth and re-evaluate one's approach.[1]
Let's examine the reference scanner. What is the correct way to approach the problem? We know regular expressions cannot respect the semantics of a language and thus are (unless used as a part of a larger parser) inappropriate for meta-programming (source-to-source transformations). Thus, we must work by manipulating the ast of various languages, and then pretty printing the altered code into the language's concrete syntax.
Most languages don't offer the ability to do this, and one examines into e.g., Clang, he will see all sorts of nonsense about concrete/surface syntax, context free grammars and compiler technologies. This certainly can't all be necessary (hint: it isn't).
Consider the following mathematical expression:
(3 + 2) * 8 / 3 * 3^6
Fully parenthesizing yields:
(((3 + 2) * 8) / (3 * 3^6))
When computers execute programs, or humans mathematics, the order of operations must be taken into account. one way to notate the previous expression to remove such ambiguities present in mathematical notation is to move operators to the front of each parenthesized list, passing the remaining elements as its arguments. This is known as fully-parenthesized prefix notation.
(/ (* (+ 3 2) 8) (* 3 (^ 3 6)))
This notational scheme has a direct mapping to the program's ast, which can be rendered as
http://i.imgur.com/8TO5VcK.png
See the pattern? An s-expression (fully parenthesized prefix notation) is merely a serialization for the ast. Thus, manipulations of the sexpr correspond to manipulations of the ast. Languages with the vocabulary to manipulate the s-expression structure can manipulate themselves (their ast) just as easily as any other thusly structured information. It follows from this that all software 'tooling' and translation tasks, transparently become trees e.g., find me all things that call this function, reference this variable, modify all build scripts to take into account the new &optional argument in the updated library, etc. are viewed quite transparently (and correctly) as tree traversals.[2]
For whatever reason, many find s-expressions distasteful and spend much of their lives attempting to add the same meta-programming facilities to ALGOL derived languages. They all have failed. Not because of any sort of mechanical failing of the computer, but the human's inability to fully comprehend the parsing and syntactical schemes they're able to create (Ruby's parser is 10k lines of C, Clang's is >100k loc. Note that the entirety of SBCL is ~300k loc, and has much fat that could be trimmed off - http://www.cliki.net/Lisp%20-%20Next%20Generation).
Scala is a notable failure in this regard. Watch this video:
http://www.youtube.com/watch?v=TS1lpKBMkgg
Pay attention to 37:39-42:50 and you'll get to see Paul Phillips flipping out over ir/asts (same thing!). He even states his plan for the next 25 years - attempt to solve a problem solved 50+ years ago ( http://c2.com/cgi/wiki?LispOnePointFive).
In particular, I found these quotes quite pertinent.
"I want to programmatically generate asts and feed those" "Even though this is what everybody does it's kinda nuts, why is the canonical code representation a STRING?!"
(not everyone does this, just algol derivatives)
"The ast is going to be designed along side the VM" "I need a tight feedback loop on the thing that I'm working on right now"
Wait, like every common lisp compiler ever? 30+ years behind the times yo.
"the code that you look at, that ought to be a reflection of the AST. The canonical thing ought to be the tree, the code is a view of it.... It's trees that are fundamental, that's what we work with"
Lol. Gotcha.
"something not offered by our tremendously entangled compiler, which doesn't even have a clean parse tree. It's comical. Try to get back to the source from what you get out of the scala parser. To me, the minimum test of a parser is that it parses!"
Lol. 'Comical' is definitely the right word.
"modifiability is paramount. If it isn't straightforward to modify, it will never be any good. It will never be fast. It will never be correct. And it will eventually be replaced by something modifiable... after consuming as many hours as you feed it."
Again, 30+ yrs behind the times[3]: http://article.gmane.org/gmane.comp.java.clojure.user/34272
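The workflow Phillips asks for, programmatically generated ASTs fed straight to the compiler with a tight feedback loop, is routine in any Common Lisp; a trivial illustration:

```lisp
;; Build the AST for (defun double (x) (* 2 x)) as plain list data,
;; then hand it to the evaluator and the native compiler.
(defvar *ast* (list 'defun 'double '(x) (list '* 2 'x)))

(eval *ast*)       ; defines DOUBLE from the tree
(compile 'double)  ; native-compiles it
(double 21)        ; => 42
```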
So what does all this have to do with NixOS?
Much of the complexity in NixOS originates in its failure to address why it isn't a trivial modification of an existing project. Had its authors honestly sized up the problem, they would have ended up at something like: people are bad at both reading and programming, and ASTs are fundamental. They would then have realized that at some point you have to choose some languages and leave others, because not everyone has something to offer, and writing fully compliant parsers for all the languages involved is about as entertaining as building pyramids. Having decided on a particular scheme, one could then set about implementing "the ability to save and restore parts of unix's dependency graph".
But by failing to address this complexity NixOS inherits it, and in practice adds to it by the 'invention' of /yet another/ language, created for no particular reason. 800+ existing languages[4] and /none of them/ have the required properties? Nonsense. Allow me to anticipate your retorts:
"I need to know that my nix code is functional"
```lisp
;; Let's assume, for the sake of argument, that you only want the symbols
;; FUNCTIONAL-LOL, DESIGN-ANTIPATTERN and THE-TYPE-GODS-WILL-SAVE-ME, plus
;; keywords and strings, to occur in your 'language'.  You know that the
;; functions these symbols describe are functional and thus that a valid
;; program is functional.
;;
;; Here is how you validate a 'nix' AST.

(defun flatten (tree)
  "Traverses the tree in order, collecting non-null leaves into a list."
  (let (list)
    (labels ((traverse (subtree)
               (when subtree
                 (if (consp subtree)
                     (progn (traverse (car subtree))
                            (traverse (cdr subtree)))
                     (push subtree list)))))
      (traverse tree))
    (nreverse list)))

(defvar myast
  '(functional-lol
    (design-antipattern "death")
    (the-type-gods-wont-save-me "destruction and chaos! also, bunnies")))

(defun validate (ast)
  (every (lambda (k)
           (or (keywordp k)
               (and (symbolp k)
                    (member k '(functional-lol
                                design-antipattern
                                the-type-gods-will-save-me)
                            :test 'eq))
               (stringp k)))
         (flatten ast)))

(validate myast) ; => NIL, since THE-TYPE-GODS-WONT-SAVE-ME is not whitelisted
```
"I don't want to require a whole compiler!"
By failing to make use of existing programs, NixOS not only adds a language (compiler/interpreter), but also the requirement for an Emacs mode (+1 more for every other editor!), custom syntax highlighting, extra machinery for auto-completion, etc. etc. One should instead make use of what already exists. As it stands, SLOCCOUNT reports 26.3k loc, divided across 7 languages, being used to introduce yet another for the nix project. The number of languages alone is enough to guarantee that no one will ever understand the whole codebase on a rolling basis and still have time to do much else with their life.
An apt analogy to the natural languages springs forth: if we created them for the same reason so-called professional programmers create programming languages, each new subculture would require a whole new alphabet/glyph scheme, phonetic system, grammatical structures etc. Though some surface features might be shared across languages, the underlying semantics they represent would change in unpredictable and often incompatible ways.[5]
"So, you want us to write lisp? What if something better comes along?"
You are faced with a choice: either learn from those who came before you, or spend the rest of your life fighting monstrosities of your own creation, attempting to eke out some order in all the chaos, be it the SDF/BNF description of a language's syntax or otherwise. In any case, if something better comes along, transforming your s-expression code into it will be a straightforward, intern-level task[6]. Now to address some unanswered questions.
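To make that claim concrete, here is a minimal sketch (the target syntax is invented purely for illustration): a single tree walk re-renders an s-expression AST in a hypothetical brace-and-comma surface form, and a real transformer is just more cases in the same walk.

```lisp
(defun render (form)
  "Render s-expression FORM in a hypothetical brace/comma surface syntax:
atoms print as themselves, and (HEAD A B ...) prints as head{a, b, ...}."
  (if (atom form)
      (format nil "~(~a~)" form)
      (format nil "~(~a~){~{~a~^, ~}}"
              (first form)
              (mapcar #'render (rest form)))))

(render '(let-in (x 1) (plus x 2)))
;; => "let-in{x{1}, plus{x, 2}}"
```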
If you like the Nix concepts you could implement StumpWM the way you like it, or you could just walk away.
You don't say.
I'm also a little perplexed by the general tone and sense of entitlement, but anyway... I'd like to address some of your points @gabriel-laddel .
You're in Mexico City and ask a bystander to direct you to the airport (your phone died, you lack a map, and you're going to miss your plane unless you reach the airport soon). He gives very specific directions, which you write down and follow exactly. You reach the last instruction somewhere near the slums, having missed your flight. As such, you swear loudly.
Is anyone entitled to correct directions? Of course not. I do, however, expect the /bare minimum/ of people: giving correct directions and accurate descriptions of their projects, much as I expect my friends to brush their teeth regularly.
There's an SDF representation of the syntax in @edolstra's PhD thesis, starting on page 64, though I'm not sure it's entirely up to date. The thesis also acts as a design document, and lays out the thinking and central metaphors behind Nix/OS pretty clearly.
If I say "jump in front of the bus on my mark, ready? 1, 2, 3 - JUMP!" it is clear what I'm asking/ordering, what is less clear is why I'd be asking/ordering someone to jump in front of a bus.
:-) reminds me of classical Cathedral vs. Bazaar dilemma.
There are very few classics in the field of computing. We are at the very beginning of this journey and it hasn't even begun to get interesting yet. IMHO, Cathedral vs. Bazaar is neither a dilemma nor a classic. Linux was a failure 20 years ago, it is a failure today, and any posturing otherwise is just that. See The UNIX-HATERS Handbook[0] for more information.
I find the contrast between these quotes accurately depicts the computing situation circa 2015.
The process of constructing instruction tables should be very fascinating. There need be no real danger of it ever becoming a drudge, for any processes that are quite mechanical may be turned over to the machine itself.
-- Turing, A. M., 1946, Proposed electronic calculator, report for National Physical Laboratory, Teddington
for the present-day darkness in the programmer's world the programmers themselves are responsible and nobody else.
-- Dijkstra
- Footnotes
[0] In actuality unix is mostly a pile of stupid, see: web.mit.edu/~simsong/www/ugh.pdf
[1]
Suppose you're trying to find the best way to structure your description of something. (Examples: choosing the structure for a computer program to perform some task; or choosing the structure for a theory of physics.)
What you hope to find is the natural structure of what you're describing — a structure that affords a really beautiful, simple description. When you strike the natural structure, a sort of resonance occurs, in which various subsidiary problems you may have had with your description just melt away, and the description practically writes itself.
But here's a problem that often occurs: You've got a structure that affords a pleasing approximate description. But as you try to tweak the description for greater precision, instead of the description getting simpler, as it should if you were really fine-tuning near the natural structure, instead the description gets more and more complicated. What has happened, I suggest, is that you've got a local optimum in solution space, instead of the global optimum of the natural structure: small changes in the structure won't work as well as the earlier approximation, and may not work at all, so fundamental improvement would require a large change to the structure.
-- http://fexpr.blogspot.com/2012/12/metaclassical-physics.html
[2] This scheme and the surrounding ideas are so fundamental that they're sometimes referred to as the "Maxwell's equations of software". You can read more about Lisp and s-expressions elsewhere, so I'll not repeat it here:
http://www.defmacro.org/ramblings/lisp.html
http://www.michaelnielsen.org/ddi/lisp-as-the-maxwells-equations-of-software/
http://norvig.com/lispy.html
http://norvig.com/lispy2.html
[3] This information re: Scala is interesting beyond the technicalities. Typesafe, the Scala company (with Martin Odersky as Chairman and Chief Scientist), has received 31MM in funding. Coursera (who built everything in Scala) has received 85MM. I predict they'll eventually fail as they're out-competed by more intelligent adversaries, though both will (due to the amount of funding and the high-profile people involved) probably limp along for years to come.
Last I checked, Coursera already had fungus growing on it: something about, "usg-sponsored studies find that coursera is better than starving in Africa"
[4] http://en.wikipedia.org/wiki/List_of_programming_languages
[5]
The corresponding Lisp narrative feels 'organic'. Consider the SBCL (Steel Bank Common Lisp) compiler. It descends from CMUCL's Python compiler and uses some of its code, changing it as circumstances require (and of course, any transformation that needed automation was but a tree traversal away). Naggum also had some related thoughts:
There is a simple and elegant answer to this question: Just learn Common Lisp well first. New languages are exciting to people who know mostly new languages, so learn an old language before you learn new ones and get out of the maelstrom that will drown you in ever new languages that add nothing at all except some minuscule additional feature from another language that someone needed to make a whole new language to implement because he did not know (Common) Lisp to begin with. A "new" language that differs from the rest of the crop by one or a couple of features is proof positive that both what it came from and what it has become are mutations about to die. There are tens if not hundreds of thousands of such "languages" that people have invented over the years, for all sorts of weird purposes where they just could not use whatever language they were already using, could not extend it, and could not fathom how to modify its tools without making a whole new language. They never stopped to think about how horribly wasteful this is; they just went on to create yet another language called Dodo, the Titanic, Edsel, Kyoto-agreement...
-- Erik Naggum, http://www.xach.com/naggum/articles/3206985430398054@naggum.net.html
[6] This could be largely automated, but that is a discussion for another day.
I understand that you fully wanted to underline your points but I would have been happy with
- you didn't take existing source package management like CPAN into account 10 years ago
- you just scan for strings in the build output, so you cannot be 100% sure that the dependency graph is correct
- you invented Nix instead of using something like LISP
Uh... yeah, that's totally all the points I made.
Toodles.
NB: I'm a Lisp guy who uses and likes NixOS, but isn't a big expert in NixOS.
Minor remarks: uiop:run-program has a now reliably portable :directory clause. If you only care about Unix, then you can use ; to separate commands in a single string. (If you want to support windows, & might work, but not all implementations can successfully use CMD.EXE on Windows, and /bin/sh isn't reliably present.)
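Concretely, those two remarks look like this (a Unix-only sketch; the commands and paths are placeholders):

```lisp
(require :asdf) ; uiop ships with asdf

;; The :directory clause sets the child process's working directory
;; portably across implementations.
(uiop:run-program "pwd"
                  :directory #p"/tmp/"
                  :output '(:string :stripped t))

;; A string command goes through /bin/sh on Unix, so `;' sequences
;; several commands in a single invocation.
(uiop:run-program "mkdir -p /tmp/demo; ls /tmp"
                  :output :string)
```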
NixOS isn't perfect, but is accepting patches.
If you're interested in static linking of libfixposix into sbcl images, I have recipes that need to be put together into actual code, that you may be interested in. If you want a NixOS package with dynamic linking the solution is to either enhance cffi so you can statically specify the search path, or have a wrapper that initializes the search environment around the binary.
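For the dynamic-linking option, the cffi half of that suggestion might look like the following sketch. This is search-path configuration only, not a working package: the store path is a placeholder (real Nix store paths contain a hash), and the final load will fail unless libfixposix is actually present.

```lisp
;; Statically specify where cffi should look before loading: prepend the
;; library's directory to cffi's search path.  Path below is hypothetical.
(pushnew #p"/nix/store/...-libfixposix/lib/"
         cffi:*foreign-library-directories*
         :test #'equal)

(cffi:define-foreign-library libfixposix
  (t (:default "libfixposix")))

(cffi:load-foreign-library 'libfixposix)
```

The wrapper alternative amounts to doing the same thing from outside, by exporting the search environment (e.g. LD_LIBRARY_PATH) around the binary instead of patching it in Lisp.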
Also, the solution to false negatives is that a package breaks if it's missing a dependency, so whoever is writing the package specification will notice and fix it — and thereafter it will build deterministically.
Thanks for the info Francois. FTR, NixOS is fundamentally braindamaged. I found it was easier to create my own distribution than to hack around their crud. Details here:
http://gabriel-laddel.github.io/system.html http://gabriel-laddel.github.io/arsttep.html
@gabriel-laddel :open_mouth:
I'm leaving this here in hopes that it'll be useful to someone. I didn't see anything in the NixOS manual indicating that I'd run into this class of problems while using it. (Granted, I've not read it in detail, but it's largely unreadable, and overall the whole codebase is poorly documented; attempting to discern the intent of the programmers isn't worth my time anymore. E.g., where is the BNF for the Nix language? What was so wrong with existing languages that a new one had to be invented? Where are the design documents? Perhaps I'm just not looking hard enough... in any case, looking hasn't proved useful, and reading the sources has revealed only a complete and utter disregard for quality.)
Cheers.