There's a quote by Rich Hickey, the creator of the Clojure programming language, floating around the Internet that goes like this:
> [Simplicity is hard work](https://www.youtube.com/watch?v=34_L7t7fD_U). But there's a huge payoff. The person who has a genuinely simpler system - a system made out of genuinely simple parts - is going to be able to effect the greatest change with the least work. He's going to kick your ass. He's gonna spend more time simplifying things up front, and in the long haul he's gonna wipe the plate with you, because he'll have that ability to change things when you're struggling to push elephants around.
I've been thinking about this in the context of Elon Musk's ventures and adventures, especially the gushing coverage in recent books about Musk's dedication to "first principles": the idea, which so far seems to be paying off for him, that people do things not because they're correct, but because they're familiar.
SpaceX is the obvious example. Musk ran the numbers, realized that raw materials account for only about 4% of a launch vehicle's price, asked why launches are so damned expensive, and set out to prove that they shouldn't be. He's trying to do the same thing with tunneling machines, subways, electric cars, solar roofs, and batteries.
I'm a software developer who has long felt more kinship with computer science than with computer engineering. And the longer I work in this business, the more I feel there's a first-principles insight missing somewhere.
For example, in the early 2000s there was a lot of buzz about "object-oriented databases." They were all the rage until someone tried to implement them at scale, and the processing cost of updating every dependent object each time a change happened proved enormous; the relationships between objects in an OODB constituted an incredibly expensive forest of directed acyclic graphs to maintain. And yet, at the same time, everyone was working with spreadsheets, and underneath the grid, a spreadsheet is just a forest of directed acyclic graphs. The secret, it turned out, was to update only the parts you could see; calculated cell values that were out of sight and didn't affect the view didn't matter.
The technical term for computing only the output the user currently wants is laziness. OODBs work just fine so long as the results are lazy.
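That insight fits in a few dozen lines. Here's a toy, hypothetical lazy-cell model in Python (no real spreadsheet is implemented exactly this way): cells form a DAG, writing a cell cheaply marks everything downstream dirty, and a formula only runs when something actually demands its value.

```python
class Cell:
    """One spreadsheet cell: a constant or a formula over other cells."""

    def __init__(self, formula, deps=()):
        self.formula = formula        # callable taking the deps' values
        self.deps = list(deps)
        self.dependents = []          # reverse edges, used for invalidation
        for d in self.deps:
            d.dependents.append(self)
        self._cache = None
        self._dirty = True
        self.evals = 0                # how many times the formula actually ran

    def set(self, value):
        """Overwrite with a constant; just mark downstream cells dirty."""
        self.formula = lambda: value
        self.deps = []
        self._invalidate()

    def _invalidate(self):
        # Cheap graph walk: flag, don't recompute.
        if not self._dirty:
            self._dirty = True
            for d in self.dependents:
                d._invalidate()

    def value(self):
        """Evaluate on demand only -- this is the laziness."""
        if self._dirty:
            self._cache = self.formula(*(d.value() for d in self.deps))
            self._dirty = False
            self.evals += 1
        return self._cache


# A1 + A2 -> B1, B1 * 2 -> C1, plus an expensive off-screen cell D1.
a1 = Cell(lambda: 3)
a2 = Cell(lambda: 4)
b1 = Cell(lambda x, y: x + y, [a1, a2])
c1 = Cell(lambda x: x * 2, [b1])
d1 = Cell(lambda x: x ** 10, [b1])   # nobody is looking at this one

print(c1.value())   # 14 -- pulls values through a1, a2, b1
a1.set(10)          # invalidation is cheap; nothing recomputes yet
print(c1.value())   # 28
print(d1.evals)     # 0 -- the hidden cell never ran
```

The expensive `d1` never evaluates because no visible cell depends on it; all the real work is deferred until `value()` is asked for, which is exactly the "only update what you can see" trick.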
I have this nagging notion that at the user layer, almost everything is over-engineered to be easy rather than simple. That there's a missing idea. That much of the machinery inside our applications (the DAGs of spreadsheets and garbage collectors, the page catalogs of databases, the piece tables and gap buffers of word processors, and so forth) would be significantly easier to understand if all the weirdness of it, the humanness of it, were boiled down to a couple of declarative tables that explained to the machine what the human thought these terms meant.
Because underneath it all, every programming language in the world is syntactic sugar around memory allocation, assignment, loops, and conditionals. And if your language has first-class functions, tail calls, and pattern matching, you've replaced your loops and conditionals with something smarter.
And Clojure isn't it. Because Clojure isn't simple at the bottom. Clojure is Java at the bottom. The Lisp reader in Clojure is `LispReader.java`, and to me, that screams that there's more work to be done.