capr / blag

:suspect: Blog/wiki of a grumpy old programmer

John Hennessy's talk about the future of computing #15


capr commented 3 years ago

Today I saw John Hennessy's talk about the future of computing and I was thinking I should probably put RISC in the history of computers somewhere, but I dunno, John Hennessy seems kinda arrogant to me.

For example, he brags about how RISC runs on 95% of devices today, presumably because of their low power draw, but the reason phones run on ARM has nothing to do with that; I mean, they could've made smartphones that last 5 minutes and people would still buy them. After all, phones used to last a week before they became "smart"; now they barely last a day and nobody complains. People get used to anything as long as they get what they want. It takes 2 seconds to change the channel on a TV these days when it used to be instant, and people still don't complain (incidentally, TVs also run on ARM, and I'm sure it's not because of their watt rating). In any case, Atom's failure in the mobile market has little to do with the technical superiority of RISC and much more to do with IP & antitrust law (and with Intel's business/foundry model, etc.).

Another thing that irks me is that now that Moore's law is dead, he thinks hardware people should collaborate more with stupid software people to better understand their needs. Why only now? This lack of collaboration between hardware people and software people seems like a massive case of Conway's Law to me. If you look at how CPUs evolved over the years, there doesn't seem to be much collaboration between these two groups.

At first, everybody was riding on Moore's law: hardware people making smaller and faster gates, software people making PHP and Ruby to compensate for that. Everyone was happy. When that stopped working, they added more cores (silicon is cheap, so we'll just print more!), but that meant dumping multi-threading onto software people, along with an entire new class of nasty bugs. When they realized that didn't really work either, because hardly anyone uses multi-threading unless absolutely necessary (and at massively added cost, with usually disappointing benefits due to Amdahl's Law), they stepped on the gas of deep pipelining and speculative execution to the point where CPUs today are more like JIT compilers than CPUs. All done in a bubble, to run C, a language notoriously hard to parallelize and vectorize. A language not even fit for out-of-order execution (we have complicated aliasing rules and undefined overflow behavior precisely so that compilers can stuff the pipe as much as possible on these kinds of CPUs).

Now that even that doesn't work anymore, with DRAM lagging so far behind the CPU, their next move will probably be dumping the problem onto software people again, most likely by making the memory hierarchy itself programmable, without much thought about how programming itself must change to accommodate that (just like with threads, it will probably be just more C extensions piled on top).
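
To put a rough number on the Amdahl's Law point above (the 90% figure is just my illustration, not something from the talk): if a fraction p of a program's runtime can actually run in parallel on n cores, the best-case speedup is

```latex
S(n) = \frac{1}{(1 - p) + p/n}, \qquad \lim_{n \to \infty} S(n) = \frac{1}{1 - p}
```

so a program that is 90% parallel (p = 0.9) tops out at a 10x speedup no matter how many cores you throw at it, and the serial 10% dominates long before that.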

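And to make the aliasing/overflow remark concrete, here's a minimal C sketch (the functions and names are mine, purely for illustration, not anything from the talk): strict aliasing lets the compiler assume that stores through a `float *` can't modify an `int` object, and undefined signed overflow lets it assume a signed loop counter never wraps, both of which give it the freedom to hoist loads, compute trip counts, and vectorize, i.e. to keep that deep pipeline fed.

```c
/* Minimal sketch, names made up for illustration.
 *
 * Strict aliasing: 'out' is a float*, 'counts' is an int*, and the rules
 * say a store through a float* cannot modify an int object, so the
 * compiler may load counts[0] once, keep it in a register, and vectorize
 * the loop instead of reloading it on every iteration. */
void scale(float *out, const int *counts, int n)
{
    for (int i = 0; i < n; i++)
        out[i] = out[i] * (float)counts[0];
}

/* Undefined signed overflow: because 'i' is a signed int, the compiler may
 * assume 'i++' never wraps, so for n >= 0 this loop runs exactly n + 1
 * times, a closed-form trip count it can unroll or vectorize.  If the
 * counter were unsigned (where wraparound is defined) and the bound were
 * the maximum value, the loop could genuinely never terminate, so that
 * assumption would be off the table. */
long sum_prefix(const int *a, int n)
{
    long s = 0;
    for (int i = 0; i <= n; i++)
        s += a[i];
    return s;
}
```
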
Anyway, watch Bret Victor's talk on the future of programming; it's much more enlightening, especially the epilogue.

Software and hardware design is less different than software designers think, 
but more different than hardware designers think. 
    -- Fred Brooks, The Mythical Man-Month