I think fundamentally the problem is that the books do not list their dependencies. We should add dependencies to each book so we get a real tech tree, and fix the missing links ASAP.
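To make the proposal concrete, here is a minimal sketch of what a machine-readable tech tree could look like: record each book's prerequisites as a graph and let a topological sort produce a reading order. The book titles and edges below are made up for illustration.

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical dependency map: book -> set of books it depends on.
deps = {
    "Computer Architecture": {"Digital Logic"},
    "Compilers": {"Computer Architecture", "Formal Languages"},
    "Operating Systems": {"Computer Architecture"},
    "Digital Logic": set(),
    "Formal Languages": set(),
}

# A valid reading order is any topological order of the graph.
# A missing link shows up immediately as an unknown node or a cycle error.
print(list(TopologicalSorter(deps).static_order()))
```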
I think the focus has been primarily on the popular techniques currently in use. In my opinion, if there are any constraints imposed simply by the logistics of storing so much material, perhaps basic templates or modules could be prepared that guide how to do any of the above things. These are usually easier than building things from scratch or regaining this knowledge all over again, because they present the opportunity to build on past or pre-existing technology.
I'm curious to see what you guys come up with and how it turns out.
This is an interesting conversation but sadly one devoid of any concrete proposals or suggestions.
I don't think the current tech tree is nearly focused enough to archive "how computers work and the foundational technologies required to make and use computers".
Let's look at some examples.
All three of the compiler books cover building a compiler for a low-level imperative language (essentially C). There is no compiler book for a functional language (SML/Haskell), for a declarative language (Datalog), or for a high-level dynamically typed language (Smalltalk). Furthermore, there is no book that talks about SAT/SMT solving or garbage collection, which are foundational for compilers and programming languages (a C compiler uses graph coloring, an NP-complete problem, for register allocation).
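To show what a register-allocation chapter would actually teach, here is a minimal sketch of Chaitin-style graph coloring; the interference graph and register count are made up for illustration:

```python
# Minimal sketch of Chaitin-style graph-coloring register allocation.
# Nodes are virtual registers; an edge means the two values are live
# at the same time and cannot share a physical register.
def allocate(graph, k):
    graph = {v: set(ns) for v, ns in graph.items()}
    stack = []
    # Simplify: repeatedly remove a node with degree < k.
    while graph:
        for v in graph:
            if len(graph[v]) < k:
                break
        else:
            raise RuntimeError("no trivially colorable node: must spill")
        stack.append((v, graph.pop(v)))
        for ns in graph.values():
            ns.discard(v)
    # Select: re-add nodes in reverse, giving each the lowest free register.
    colors = {}
    for v, neighbors in reversed(stack):
        used = {colors[n] for n in neighbors if n in colors}
        colors[v] = min(c for c in range(k) if c not in used)
    return colors

# Made-up interference graph, 3 physical registers.
interference = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b"},
    "d": {"a"},
}
print(allocate(interference, k=3))  # e.g. {'d': 0, 'c': 0, 'a': 1, 'b': 2}
```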
All of the books only talk about contemporary programming languages, with zero books about how to design a language. How does one design a type system that ensures safety without sacrificing too much expressiveness, while still allowing efficient type checking/inference? How do you manage different kinds of effects (e.g. references, concurrency, probability, nondeterminism, exceptions)? How do you design a language so that it contains a few simple yet universal constructs, instead of accumulating lots of ad hoc constructs and quickly becoming an overly complex language (see GEDANKEN and Scheme)?
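As a taste of the missing material, here is a minimal sketch of the kind of exercise a language-design book would start from: a type checker for a tiny expression language. The language and its rules are invented for illustration (Python 3.10+ for `match`):

```python
# Minimal type checker for a tiny expression language: ints, bools,
# addition, and if-expressions. Invented for illustration only.
def typecheck(expr):
    match expr:
        case bool():          # must check bool before int: bool subclasses int
            return "bool"
        case int():
            return "int"
        case ("+", a, b):
            if typecheck(a) == typecheck(b) == "int":
                return "int"
            raise TypeError("+ expects two ints")
        case ("if", cond, then, els):
            if typecheck(cond) != "bool":
                raise TypeError("if-condition must be bool")
            t1, t2 = typecheck(then), typecheck(els)
            if t1 != t2:
                raise TypeError("both branches must have the same type")
            return t1
        case _:
            raise TypeError(f"unknown expression: {expr!r}")

print(typecheck(("if", True, ("+", 1, 2), 0)))   # int
# typecheck(("if", 1, 2, 3)) is rejected: the condition is not a bool.
```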
Scientific computing workloads often consist of 0: lots of domain knowledge, 1: numerical algorithms (e.g. finite element methods), 2: optimization for supercomputers.
I didn't see any of these covered.
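Point 1 alone is a book's worth of material. For illustration, here is a minimal sketch of a 1D finite element solve for -u'' = 1 with zero boundary conditions; the mesh size and problem are chosen purely for illustration:

```python
import numpy as np

# Minimal 1D finite element sketch: solve -u'' = 1 on [0, 1] with
# u(0) = u(1) = 0, using piecewise-linear "hat" elements on a uniform mesh.
n = 5                                    # number of interior nodes
h = 1.0 / (n + 1)                        # element size

# Assembled stiffness matrix and load vector for f = 1.
A = (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h
b = np.full(n, h)

u = np.linalg.solve(A, b)
x = np.linspace(h, 1 - h, n)
print(np.abs(u - x * (1 - x) / 2).max())  # error vs the exact solution x(1-x)/2
```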
There are five books on deep learning, and no book on Bayesian methods, probabilistic graphical models, symbolic methods, or classical machine learning.
Even if this is just for deep learning, it is still not enough: there is no mention of how to rebuild a deep learning framework.
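To give a sense of scope: the core of any deep learning framework is reverse-mode automatic differentiation. Here is a minimal scalar-only sketch (no tensors, kernels, or optimizers) of the record-then-backpropagate idea:

```python
# Minimal reverse-mode autodiff sketch: record each operation and its
# parents, then apply the chain rule backwards through the graph.
class Value:
    def __init__(self, data, parents=(), backward=lambda: None):
        self.data, self.grad = data, 0.0
        self._parents, self._backward = parents, backward

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = backward
        return out

    def backprop(self):
        # Topologically order the graph, then run backward passes in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x, y = Value(2.0), Value(3.0)
z = x * y + x          # z = xy + x, so dz/dx = y + 1, dz/dy = x
z.backprop()
print(x.grad, y.grad)  # 4.0 2.0
```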