dundalek opened this issue 4 years ago
Absence of evidence to the contrary is not evidence.
There are some comments in https://www.reddit.com/r/lisp/comments/cznrhg/the_lisp_curse_submitted_8_years_ago_still/ that describe reasons why the article isn't true.
I also suggest http://metamodular.com/Essays/wrong.html which has its own reasons that Lisp may not be "popular"; and jumping from ALGOL-lookalike to ALGOL-lookalike is easier than jumping from ALGOL-lookalike to not-ALGOL-lookalike (the one and only Lisp syntax joke is a good manifestation of the tendency to avoid the latter).
My main goal in seeking out these reasons is so that we can try to address and improve upon them. Therefore I have more appreciation for arguments that try to present stronger cases to make existing points less relevant, instead of only trying to weaken existing points by just dismissing them without providing an alternative.
There are some comments in https://www.reddit.com/r/lisp/comments/cznrhg/the_lisp_curse_submitted_8_years_ago_still/ that describe reasons why the article isn't true.
Could you enumerate which arguments presented in the reddit thread you consider substantial?
I also suggest http://metamodular.com/Essays/wrong.html which has its own reasons that Lisp may not be "popular"; and jumping from ALGOL-lookalike to ALGOL-lookalike is easier than jumping from ALGOL-lookalike to not-ALGOL-lookalike (the one and only Lisp syntax joke is a good manifestation of the tendency to avoid the latter).
Yeah, perhaps it all comes down to a combination of hardware not being powerful enough in the past and the strangeness of prefix notation with parentheses. Although that seems too superficial and feels a bit hopeless. I am not ready to accept that just yet, so I am still trying to search for other explanations :)
Therefore I have more appreciation for arguments that try to present stronger cases to make existing points less relevant, instead of only trying to weaken existing points by just dismissing them without providing an alternative.
My understanding is that the way these kinds of debates work is that there have to be arguments supporting the Lisp "curse" article before one can claim that the absence of a refutation makes it valid. I do have some other guesses as to why Lisp may not be popular, which I will state, but I also have problems with the article itself.
Then your proposed solution is...weird. More power causes the tendency for people to create incomplete solutions by themselves, so using another platform will lead them away from that tendency? Are the platforms on this list less powerful, then?
Could you enumerate which arguments presented in the reddit thread you consider substantial?
kazkylheku suggested that the hardware requirements of a Lisp system were well above average for quite some time. I wasn't around in those times, and I don't really know how those machines did their thing, but now I would say that the "performance" of a Lisp system usually sits in a weird place between the "fast" languages like C and the "scripting" languages like Python. People also have baseless concerns about efficiency these days, and so do not like having resident compilers and debuggers. Of course, the time they spend suffering without those tools costs more than the CPU time saved by not having them around, but they aren't great at doing cost analysis.
defunkydrummer believes that Lisp might be better suited for advanced programmers. Some features could be tricky to understand for a while, but in the general case I think Lisp is easier than average, as there's less syntax and a nice interactive environment to hack in. Still, the cool things like macros and interactive debugging take some getting used to.
itmuckel states that you could substitute the names of some other languages and big projects written in those into the Lisp "curse" article and it would still check out, naming JavaScript as an example. (npm is still the poster-child of library management problems.) They also mention that Common Lisp is a (Common) unification of many incompatible Lisp systems, so collaborative work is indeed possible in Lisp, and also possible after divergence has already taken place.
I did come up with another few ideas though:
Although that seems too superficial and feels a bit hopeless.
I would say making a nice-sounding lie is easier than finding a nice-sounding truth, but I'm no good at philosophy.
To summarise, I'm not convinced I have to come up with my own alternate hypothesis to refute that article, but I have a few, these people on Reddit have a few, and the conclusion made in the introduction to the list doesn't make sense.
IMHO the fundamental problem is that Lisp is a big-idea language. Most programmers have a strong aversion and even a contempt for thinking in new ways. Haskell, Forth and APL have the same problem.
Lisp was popular for a while because its new ideas were the only game in town for productivity gains. Then new languages came along that gave about 50-80% of Lisp's benefits in a familiar form that doesn't require thinking in new ways. Their success was guaranteed, and it begat so many libraries that their productivity has now far exceeded Lisp's.
Lisp used to be like calculus: something obscure that was essential in creating practical things. Now Lisp is like philosophy: something obscure that people cannot figure out how to use for practical gains.
If someone came up with a way to build cars and airplanes without calculus, using an easier kind of math, calculus would become a historical relic like Lisp. Most people do not enjoy thinking for long periods and would gladly do 2-5x the amount of work to avoid it.
So any big-idea language fundamentally has to settle for a small subset of programmers unless it provides a boost akin to switching from horses to cars. A road trip across the US is 4-6 days; by horse it's 4-6 months, so a 30x improvement. GC and dynamic typing probably gave a boost in that ballpark, at least for big programs. Lisp doesn't have any more 10x boosts up its sleeve. S-expressions, macros and reflection probably don't add up to 10x.
Since there are millions of programmers, even a small subset of them can still make for a big and prosperous community. This is where the Lisp curse kicks in. Big-idea languages seem to have intrinsic social problems because they attract people who are more interested in ideas than practical results. It's very difficult to be equally interested in both since one usually requires compromising a lot on the other.
Standard advice of the form "people should just do X" generally does not work, since people generally ended up doing what they do because of who they are (absent major environmental constraints). Telling a thinker to be more practical, or an individualist to collaborate, or an elegance junkie to ship something 20% polished, is not simple - it's a lifelong struggle at best.
The way to win here is to figure out a way to make the missing ingredient the easiest thing to do. Lisp's problem (the one we can solve without watering down its essence) is too much individualism, i.e. missing collaboration, as the Lisp curse article identified. So we have to make collaboration easier than individualism (for people who care about thought and beauty). There is plenty of scope for social engineering here and it should be fully explored.
Some ideas (medium-long term):
* Tone down the individualism by tolerating other Lisp dialects than our own favorite. Remember that when all is said and done, it's still Lisp, and that means something.
* Find ways to bring the dialects and implementations closer together. Pidgin languages (subsets that translate easily to many dialects) would probably be useful.
* Make a fast, featureful Lisp implementation that supports all dialects natively (not just as macro layers) and lets them talk to each other.
* Centralize Lisp resources onto a few websites. Right now it's almost impossible to get an overview of Lisp, which misses an obvious opportunity to improve recruitment and morale. lisp.org should be a homepage covering all dialects.
* Find ways to inform Lisp programmers that others are working on similar things. Helps avoid "throw-away design". This could be something like aggregating all Lisp libraries onto one site, and having RSS feeds on topic keywords (e.g. sockets, graphics, machine learning).
* Make Lisp really portable and have excellent interfaces to OS APIs and other languages - it's an important safeguard for commercial projects that you have an escape hatch to more popular languages if you're on a deadline.
One thing that puzzles me is why inheritance in object-oriented programming became so popular even though it's a "big idea" that gives no obvious practical benefit (and causes lots of practical problems). Inheritance means inventing fake taxonomies in your head. Isn't that exactly the kind of thinking-for-no-obvious-reason that people hate about Lisp? Perhaps it's because you can opt out of it. You can't really opt out of Lisp's weirdness and use only the non-weird parts, since all parts are weird :)
even a contempt for thinking in new ways
How am I supposed to believe piggy-backing off another slimy phony-object-oriented-and-procedural language is "new"?
Then new languages came along that gave about 50-80% of Lisp's benefits in a familiar form that doesn't require thinking in new ways.
And thinking in new ways isn't a benefit at all? Wait...didn't you just say Lisp programmers had contempt for thinking in new ways? I suppose the first quote has "new ways" as in Lisp/Haskell/APL programmers abandoned bothering because they were inexpressive and inflexible, and the second has "new ways" in that someone can't comprehend that the main paradigm of a language could possibly be that sick phony-object-procedural mishap.
Then new languages came along that gave about 50-80% of Lisp's benefits in a familiar form that doesn't require thinking in new ways.
"If I catapult you over this wall, you'll get about 50-80% of the benefits of air travel." Or, at least, I think you assign the wrong weights to the wrong values.
Now Lisp is like philosophy: something obscure that people cannot figure out how to use for practical gains.
So what do philosophers do? You'd think the market would have killed them off if philosophy wasn't useful. One site states that philosophers are good at "general problem solving", communication, persuasion, understanding other disciplines and developing methods of research. Very handy things.
Lisp doesn't have any more 10x boosts up its sleeve.
Interactivity is at least another 20×, reflection is a prerequisite for interactivity, and S-expressions and macros make domain-specific languages trivial, which admittedly is an acquired taste; one which the Lisp "curse" author couldn't have acquired, as they barely tried to learn Lisp.
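To make the DSL point concrete, here is a minimal sketch in Common Lisp; the html macro and the mini-language it implements are invented for this example rather than taken from any library:

```lisp
;; A tiny, hypothetical HTML-generation DSL: (html (:p "hi")) evaluates to "<p>hi</p>".
;; EMIT walks the s-expression at macroexpansion time, so a form containing no
;; variables collapses to a constant string before the program even runs.
;; (In a compiled file, EMIT would need to be wrapped in EVAL-WHEN so the macro can call it.)
(defun emit (form)
  (if (and (consp form) (keywordp (first form)))
      (let ((tag (string-downcase (symbol-name (first form)))))
        (format nil "<~a>~{~a~}</~a>"
                tag (mapcar #'emit (rest form)) tag))
      (princ-to-string form)))

(defmacro html (form)
  (emit form))

;; (html (:ul (:li "one") (:li "two")))
;; => "<ul><li>one</li><li>two</li></ul>"
```

Whether conveniences like that add up to another 10x is, of course, exactly the acquired-taste disagreement being had here.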
In fact, all the numbers you just made up would fall under "acquired taste", and it may be that I get more value from those features than you do. Trying to assert those numbers puts people that think like me at a disadvantage. Lisp excels at not doing that, as it gives programmers the ability to pick their favourite parts without some BDFL telling them off for not following the Party line.
Big-idea languages seem to have intrinsic social problems because they attract people who are more interested in ideas than practical results.
And the feasibility of certain ideas isn't a result in itself?
Telling [...] an individualist to collaborate [...] is not simple - it's a lifelong struggle at best. Lisp's problem (the one we can solve without watering down its essence) is too much individualism, i.e. missing collaboration, as the Lisp curse article identified. So we have to make collaboration easier than individualism (for people who care about thought and beauty).
[Presented without further comment.]
Tone down the individualism by tolerating other Lisp dialects than our own favorite. Remember that when all is said and done, it's still Lisp, and that means something. Find ways to bring the dialects and implementations closer together. Pidgin languages (subsets that translate easily to many dialects) would probably be useful.
Can you form a meaningful subset of any two (or more) given Lisp "dialects"? Scheme has only hygienic macros and continuations, Common Lisp has unhygienic macros and no continuations but has restarts, and Clojure has neither and apparently has to be tied to one phony-object-procedural platform or another.
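For readers who haven't met them, restarts are the Common Lisp condition-system feature alluded to above, and they are one of the things a cross-dialect subset would have to drop or paper over. A minimal sketch, with made-up function names:

```lisp
;; Low-level code signals an error but also installs a named restart,
;; describing HOW it could recover without deciding WHETHER to.
(defun parse-age (string)
  (or (parse-integer string :junk-allowed t)
      (restart-case (error "~s is not a number" string)
        (use-value (v)
          :report "Supply a value to use instead."
          v))))

;; Higher-level code chooses the recovery policy. HANDLER-BIND runs the handler
;; before the stack unwinds, which is what separates restarts from plain catch/throw.
(defun parse-age-or-zero (string)
  (handler-bind ((error (lambda (condition)
                          (declare (ignore condition))
                          (invoke-restart 'use-value 0))))
    (parse-age string)))

;; (parse-age-or-zero "42")   => 42
;; (parse-age-or-zero "oops") => 0
```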
And, well, I have expectations: I want interactivity and all the other features that let me work at my pace and not my language's pace. If I'm being limited by the latter, what does that mean?
Make a fast, featureful Lisp implementation that supports all dialects natively (not just as macro layers) and lets them talk to each other.
Also not trivial; beyond the previous comment about "dialect" subsets, the representations and implementation strategies are also wildly different.
Find ways to inform Lisp programmers that others are working on similar things.
(ql:system-apropos "topic")
is quite good at searching for Common Lisp projects. IMO it should be social practice to use the wonderful thing some wizards call a "search engine"; it would probably be less spammy and would help more, since it answers the question exactly when a programmer wants to know.
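For context, the usual Quicklisp workflow looks roughly like this; the topic and library names below are only examples:

```lisp
(ql:system-apropos "json")         ; print systems whose name or description mentions "json"
(ql:quickload "yason")             ; fetch (if needed), compile and load one of the matches
(yason:parse "{\"answer\": 42}")   ; the library is now usable in the running image
```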
Make Lisp really portable and have excellent interfaces to OS APIs and other languages - it's an important safeguard for commercial projects that you have an escape hatch to more popular languages if you're on a deadline.
I can't argue against OS interfaces, but you have more things that could go wrong when mixing languages (and especially ones without formal standards!), and you definitely should not be working on commercial projects in languages you aren't comfortable working with.
I meant that Lisp, APL, Forth and Haskell programmers enjoy learning about new ways to think. Most other programmers don't. That's why most programmers will never like Lisp, APL, Forth or Haskell. But a Lisp programmer probably likes learning about APL, Forth, or Haskell, and vice versa, because these people like thinking in new ways.
Most programmers acquire one taste (for infix math in grade school, and then for infix Algol/C/Python style programming). It's true that all programming languages are an acquired taste, but most programmers only want to acquire one taste, and it has been decided for them in math class. People could also do math in prefix or postfix, but almost nobody does.
None of this has anything to do with what makes sense or not. Most people want to avoid work, especially want to avoid thinking, and most especially want to avoid thinking in unfamiliar ways. These psychological forces are some of the strongest ones we have.
Calculus and philosophy are similar - they consist entirely of thinking in new ways, so most people don't want to have any but the most superficial contact with them. Calculus is respected because people understand that it's irreplaceable. Many people suspect philosophy is replaceable, so it is not as widely respected. Lisp perhaps used to have a reputation like calculus; now it has a reputation more like philosophy (or Latin).
That's why most programmers will never like Lisp, APL, Forth or Haskell.
Their loss then, not mine. If you have to bend Lisp around to them to make them like it, then you lose the whole "big idea" part.
People could also do math in prefix or postfix, but almost nobody does.
And is that why there are people who post math problems on social media that deliberately confuse precedence, tripping up many of the people who respond with their answers? If infix is so convenient, how come it's easy to get someone to make such an elementary mistake?
most programmers only want to acquire one taste, and it has been decided for them in math class
Still their loss (and a failure of the education system) if they are unable to challenge what has been presented to them.
Most people want to avoid work, especially want to avoid thinking,
Well, what good are they at programming that doesn't involve being elaborate copy-paste machines?
Indeed, as they say, Lisp is already mainstream - it's called JavaScript and Python. They kept the big ideas hidden in the "engine room" (GC, dynamic typing, closures) but removed the ideas that people can see (macros, S-expressions).
I would probably prefer to be a copy-paste machine if we could copy-paste our way into something elegant. It's the end result that counts, not the method. A language that produces elegant code by copy-pasting instead of thinking could take over the world. But it would require thinking in new ways to invent that language :)
Indeed, as they say, Lisp is already mainstream - it's called JavaScript and Python.
Yet they still carry around the procedural object-centric junk of the olde languages. To rephrase an earlier joke, "air travel was popular in the middle ages as catapults had the big ideas".
I would probably prefer to be a copy-paste machine if we could copy-paste our way into something elegant.
And you don't ever think someone will have to think of something new again?
"If I catapult you over this wall, you'll get about 50-80% of the benefits of air travel."
In my experience, this is how most people and organizations solve most problems :) Not just programming but money, health, relationship and societal problems, etc.
So what do philosophers do? You'd think the market would have killed them off if philosophy wasn't useful.
The problem with things like philosophy, Lisp and classical music is that they produce results on a longer timescale (e.g. 10 years for a real return on investment, sometimes much longer). Very few people genuinely have that kind of time horizon. And when a lot of people each benefit a little, the total benefit is hard to see.
The market is happy to reap the rewards from long-term projects (e.g. GC from Lisp). But it's not happy to pay for their development, which takes a long time with uncertain payoff. A lot of business and personal life is still about survival with little time for hypotheticals like that.
And the feasibility of certain ideas isn't a result in itself?
It is, but on a longer timescale. Also, idea people are biased to think ideas are more useful than they actually are. We like ideas, so we feel good when new ideas win and bad when they turn out useless. Emotional attachment is hard to avoid.
I would probably prefer to be a copy-paste machine if we could copy-paste our way into something elegant.
And you don't ever think someone will have to think of something new again?
Ideally, thinking would only be required when solving new kinds of problems. Familiar kinds of problems would be solved almost automatically, perhaps by some advanced version of auto-complete AI that would suggest how other people solved similar problems. So you would write most of your code by selecting auto-complete options, and you would only think from scratch in special circumstances. Few programmers solve truly new problems, so intuitively, this auto-complete thing is how humanity should do most of its programming tasks.
I've been coding Clojure professionally for > 4 years, and am not bothered by these existential questions anymore. Be practical, get your hands on a popular dialect, and start making things. Feel free to make your own lisps, but don't expect them to become the Next Big Thing unless you are really lucky and heavily funded.
What an epic thread. Thank you for your thoughts.
kazkylheku suggested that the hardware requirements of a Lisp system were well above average for quite some time. I wasn't around in those times, and I don't really know how those machines did their thing, but now I would say that the "performance" of a Lisp system usually sits in a weird place between the "fast" languages like C and the "scripting" languages like Python. People also have baseless concerns about efficiency these days, and so do not like having resident compilers and debuggers. Of course, the time they spend suffering without those tools costs more than the CPU time saved by not having them around, but they aren't great at doing cost analysis.
It indeed seems Lisp got a bad rep in the early days because of performance and the AI winter. What is strange to me is that there was not a larger resurgence of Lisp in the 90s, during the transition to more powerful PCs, when other languages like Python, PHP, Ruby and JavaScript were created and thrived.
defunkydrummer believes that Lisp might be better suited for advanced programmers. Some features could be tricky to understand for a while, but in the general case I think Lisp is easier than average, as there's less syntax and a nice interactive environment to hack in. Still, the cool things like macros and interactive debugging take some getting used to.
Here I would distinguish between simple and easy. Lisps are simpler. But for the majority of people they are not easy, because they are unfamiliar. Maybe Lisp programmers cannot communicate the benefits clearly enough to motivate others to give it a try. But functional programming is getting more popular in web frontend programming, so maybe the time will come when more people realize the benefits and go all the way.
itmuckel states that you could substitute the names of some other languages and big projects written in those into the Lisp "curse" article and it would still check out, naming JavaScript as an example. (npm is still the poster-child of library management problems.) They also mention that Common Lisp is a (Common) unification of many incompatible Lisp systems, so collaborative work is indeed possible in Lisp, and also possible after divergence has already taken place.
I think this is a large misunderstanding of the point the article is trying to make. Actually, JavaScript is a great counter-example, in which the whole industry standardized on compatible implementations. It is quite easy to create a toy JS implementation, but only a few huge organizations with massive corporate backing can create and maintain production-grade implementations - Google, Apple, Mozilla and ~Microsoft~ (not even Microsoft was able to do it and threw in the towel). To add new language features people invented transpilers, which compile down to the base implementations. And eventually some ideas from those efforts are standardized and adopted by all relevant implementations.
I did come up with another few ideas though:
* It's taught poorly; my university tries to get that done in about two weeks in the second year of a bachelor course.
* It's still associated with symbolic AI, which really knocked it around in the AI winter. My university teaches Lisp as an "AI language" basically.
I completely agree here. This was unfortunately also my experience at university. Lisp got introduced as this weird thing that can be used to solve "AI" problems, but from the way it was presented one could conclude it was not useful for practical tasks.
* The semantics are also quite different to ALGOL-lookalikes; it almost feels like a functional language, but Common Lisp still has objects, classes, GOTO, plenty of imperative constructs, etc
More and more features from Lisp are being added to mainstream languages. Basically every language now has lambdas. Kotlin and Java are working on adding continuations. Elixir has macros. Of course none of those are as elegant as Lisp. So if mainstream languages are picking features from Lisp, there must be something else that prevents people from going to the original language instead of using derivatives.
To summarise, I'm not convinced I have to come up with my own alternate hypothesis to refute that article, but I have a few, these people on Reddit have a few, and the conclusion made in the introduction to the list doesn't make sense.
What caught my interest in the Lisp curse article is that it presented another plausible reason besides the obvious ones related to timing and unfamiliarity. That reason is not so obvious and is not easy to admit, even partially, especially from inside the community. It is a bit like fish that don't know they're in water. Although I think there is much truth to it, I will try to think about how to rephrase the introduction, so that people do not get the mistaken impression that the Lisp curse phenomenon is the one and only reason.
There is a great post which answers a similar question from the perspective of Haskell: "If Haskell is so great, why hasn't it taken over the world?"
The simplest explanation is probably that Haskell is not that much better than, say, Java, for many of the software systems people write today.
Go has found a niche as basically “a better C” or “a better Java” for writing high-performance servers that do lots of I/O. Unlike C or Java, it has a much more high-level I/O and concurrency story, but the language itself is otherwise very familiar to people with a background in these and other mainstream languages. Thus it serves a niche that wasn’t previously well-covered.
As soon as you need to be defining lots of complex or interesting computations, you start needing languages with good support for composability to manage that complexity. Here Go fails, for all the reasons that people have criticized it. But there’s still a good chunk of services where Go can do quite well!
Haskell programmers might object that, well, Haskell has its own very nice I/O and concurrency story, in many ways more sophisticated than Go (things like software-transactional memory, which make writing highly concurrent data structures and algorithms much simpler). But Haskell is “weird”. A C, Java, Python, or Ruby programmer can pick up Go easily. They can’t pick up Haskell so easily, as even in beginner Haskell, you are immediately confronted with lots of unfamiliar concepts. And since Haskell isn’t enough of a win for these “boring” services, Go can still make sense.
Some people have mentioned objections to the Lisp Curse article.
If you have a different explanation for why Lisp is not more popular, please leave a comment.