Five Neglected Computer Science Classics

Written by kwindla | Published 2015/07/30

Working programmers are craftspeople. We learn as we go, from code we’re exposed to, prose we read, and people we work with.

I have a list in my head of books and papers that played an outsize role in shaping my programming sensibilities. It’s fuzzy around the edges, because I’ve never written it down. But it includes some of the classics: SICP, Knuth, The Mythical Man Month, Programming Pearls, Introduction to Algorithms, The C Programming Language. And also some things I read at particular times in my life that sent me off in new directions: Danny Hillis’ thesis, Mitchel Resnick’s book that describes how to think about massive parallelism, Alan Kay on Smalltalk, some of the papers on Self from the early 90s, On Lisp and Lisp in Small Pieces, Compiling with Continuations.

Recently, a friend asked for brain-food suggestions, and the discussion that followed made me realize that, in addition to "obvious classics" like the ones listed above, I am a big fan of a handful of works that are a bit less well known, but no less worth spending time with.

The Implementation of Functional Programming Languages (1987)

The Implementation of Functional Programming Languages (1987) is a book-length tutorial on high-level functional languages, the lambda calculus, and graph reduction.

Implementing Lazy Functional Languages on Stock Hardware: The Spineless Tagless G-machine (1992) builds on the final section of the book, describing a compiler for functional languages that produces efficient code.

Both works are deeply technical, extremely well-written, and hugely clever, in the best sense of the word. They’re among the best examples I know of starting with theory and working step by step towards a real-world system.

The Spineless Tagless G-machine is integral to the Glasgow Haskell Compiler (and to much other Haskell research and implementation work). Together, these two works provide a pretty good sense of why Haskell is the way it is.
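
You don't need a whole compiler to get a feel for what graph reduction means. Here's a minimal sketch of the central idea in C++ (my own illustration, nothing from the book): an expression becomes a node that is evaluated at most once, and every reference to it shares the cached result.

```cpp
// A sketch (mine, not the book's) of the core idea behind graph
// reduction: an expression node is evaluated at most once, and
// every reference to it shares the cached result.
#include <functional>
#include <iostream>
#include <memory>
#include <optional>

template <typename T>
class Thunk {
public:
    explicit Thunk(std::function<T()> compute)
        : compute_(std::move(compute)) {}

    // Force the thunk: compute on first use, then reuse the cached value.
    const T& force() {
        if (!value_) {
            value_ = compute_();  // "update the graph node in place"
        }
        return *value_;
    }

private:
    std::function<T()> compute_;
    std::optional<T> value_;
};

int main() {
    // One heap-allocated graph node...
    auto node = std::make_shared<Thunk<int>>([] {
        std::cout << "reducing...\n";  // printed once, not twice
        return 6 * 7;
    });
    // ...shared by two references, as in a graph of expressions.
    auto left = node;
    auto right = node;
    std::cout << left->force() + right->force() << "\n";  // 84
}
```

In a lazy language the compiler arranges all of this invisibly; the STG paper is largely about making that arrangement fast on stock hardware.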

The Art of the Metaobject Protocol (1991)

The first time I read The Art of the Metaobject Protocol I found it interesting, but I didn’t really absorb it. A few years later, I came back to it and had a profound ah hah moment.

Again, this book is about using computer science theory to build tools for real-world programming. But, in this case, the authors start with a survey of object-oriented language features used by Common Lisp programmers, then apply theory from several branches of computer science to design a meta-system that abstracts common features and makes "language design" choices parametrizable.

The ah hah part, for me, was the lesson that some programming notations are much more effective than others for thinking about certain problems.

All the code in The Art of the Metaobject Protocol is Common Lisp. When I first read the book, I had only a passing familiarity with Lisp. By the time I came back to the book for a second reading, I'd written a fair amount of production Lisp, and the source code in the book was accessible to me in a way that it hadn't been before. I found that the ideas in the book were more accessible, too.

It often doesn’t much matter which programming languages we use to solve everyday problems. But it matters a lot which programming languages we actually think in.

Modern C++ Design (2001)

Modern C++ Design was so influential that nobody reads it any more.

Much of the functionality that Alexandrescu pioneered and popularized in this book made its way into the C++ standard. These days, there are many C++ references that cover compile-time template metaprogramming, template generics, and typelists.

Alexandrescu’s approach is, in some ways, the perfect complement to that of Peyton Jones. Where Peyton Jones starts with theory and constructs a language (Haskell), Alexandrescu starts with a language (C++) and backs his way into rigorous and useful application of theory.

Here’s the backstory: it turns out that the C++ template system is Turing-complete. However, that’s an accident. The language authors were just trying to make it a little bit less painful to use container data structures. (This anecdote tells you everything you need to know about C++.) A number of people realized that interesting (also, appalling) things could be done, given the existence of this quite-powerful (but also exceedingly odd and brittle) compile-time language.

Alexandrescu dissected the C++ standard, broke down the mechanisms of the templating system, and built those mechanisms back up into a virtuosic, mind-bending, practical library that makes C++ into a far more expressive language than it has any right to be. Modern C++ Design describes the building blocks, implementation, and use of this library.
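
The book builds much of its machinery out of typelists (lists whose elements are types), encoded as nested, cons-cell-style templates. With the variadic templates that C++ later added, the same idea can be written more directly. A sketch of mine, not the book's Loki code:

```cpp
// A typelist sketch (mine; the book's Loki library encodes these
// as nested two-element templates rather than variadic packs).
#include <type_traits>

// A typelist: a list whose elements are types, not values.
template <typename... Ts>
struct Typelist {};

// Compile-time search, written as a recursive metafunction in the
// cons style the book uses: peel off the head, recurse on the tail.
template <typename T, typename List>
struct Contains;

template <typename T>
struct Contains<T, Typelist<>> : std::false_type {};

template <typename T, typename Head, typename... Tail>
struct Contains<T, Typelist<Head, Tail...>>
    : std::conditional_t<std::is_same_v<T, Head>,
                         std::true_type,
                         Contains<T, Typelist<Tail...>>> {};

using Signed = Typelist<signed char, short, int, long, long long>;
static_assert(Contains<int, Signed>::value, "resolved at compile time");
static_assert(!Contains<unsigned, Signed>::value, "so is this");
```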

Purely Functional Data Structures (1999)

There’s a functional programming theme running through this list. (Even C++ templates are a purely functional environment. A deeply weird one, but if you want to do much template meta-programming it definitely helps to immerse yourself in functional programming.)

I don’t think it’s an accident that the books I remember as most mind-expanding are disproportionately about functional programming. Functional programming ideas are powerful, and fundamental to computer science.

It feels, these days, like we’re slowly but steadily incorporating elements of the functional programming toolkit into “mainstream” languages.

This kind of evolution has happened before. When I was in graduate school, a lot of my friends and I wrote Lisp and Smalltalk code in our academic lives, and C and C++ code when we moonlighted in "industry." It's fair to say that people outside academia were often pretty skeptical about languages with garbage collection and first-class functions. Fast forward to today, and a huge amount of production code (maybe most production code) is written in languages that have both features (JavaScript, Python, Ruby). Computers got faster, implementations improved, and a new generation of programmers grew up and made new decisions about tools and approaches.

JavaScript is a perfectly reasonable functionally oriented language, if you choose to use it that way. Ruby has #map, and #each, and #select. Swift has pattern matching. These features are the low-hanging fruit of functional programming. They are very useful, but they are also easy compromises that still fit into an imperative language design.
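
For what it's worth, that same low-hanging fruit has landed even in C++, as library algorithms plus lambdas. A sketch in C++11 style (my example, with one language standing in for the several named above):

```cpp
// My example, not from any of the books: "select" and "map" as
// library algorithms dropped into an otherwise imperative program.
#include <algorithm>
#include <iostream>
#include <iterator>
#include <vector>

int main() {
    std::vector<int> xs = {1, 2, 3, 4, 5, 6};

    std::vector<int> evens;  // "select" (filter)
    std::copy_if(xs.begin(), xs.end(), std::back_inserter(evens),
                 [](int x) { return x % 2 == 0; });

    std::vector<int> squares;  // "map"
    std::transform(evens.begin(), evens.end(), std::back_inserter(squares),
                   [](int x) { return x * x; });

    for (int s : squares) std::cout << s << " ";  // prints: 4 16 36
    std::cout << "\n";
}
```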

It’s going to be interesting to see whether additional aspects of functional programming start to see widespread use over the next few years. Purely functional data structures — sometimes called “immutable data structures” — are one candidate for mainstream adoption. If you’re coming from an imperative programming background, there’s a learning curve to using purely functional data structures. And performance will sometimes be an issue. But if you use only immutable data structures, some classes of bugs go away, particularly bugs relating to multi-threaded code. In our increasingly multi-core and distributed world, that’s a big deal.

Chris Okasaki’s Purely Functional Data Structures is the standard reference and tutorial on immutable data structures. It’s a joy to read. I agree with jao: the signal to noise ratio in this book is very, very high. This stackexchange thread on what’s new since the book was published is also worth a read.

Synthesis: An Efficient Implementation of Fundamental Operating System Services (1992)

And now for something completely different: Alexia Massalin’s 1992 PhD thesis describing the Synthesis Operating System.

Introspection is an uncertain undertaking, but I think this paper, more than anything else I’ve read, shaped my sense of the outer limits of the possible.

When hyperbole is insufficient to the task at hand, stating the facts is an honorable fallback position. I can’t do any better than Valerie Aurora’s description of Synthesis:

a completely lock-free operating system optimized using run-time code generation, written from scratch in assembly running on a homemade two-CPU SMP with a two-word compare-and-swap instruction — you know, nothing fancy.

Which (necessarily) undersells by a very large margin just how impressive, innovative, and interesting this thesis is. The above description implies a bunch of things that might not be immediately obvious. For instance, here’s a (snipped) bit of text from pages 3 and 4 of the thesis:

The first version of the software consisted of drivers for the machine’s serial ports and 8-bit analog I/O ports, a simple multi-tasker, and an unusual debug monitor that included a rudimentary C-language compiler/interpreter as its front end. It was quite small — everything fit into the machine’s 16 kilobyte ROM — and ran comfortably in its 16 kilobyte RAM. […] Even though it lacked many fundamental services, such as a filesystem, and could not be considered a “real” operating system in the sense of Unix, it was the precursor of the Synthesis kernel, though I did not know it at the time.

After entering the PhD program at Columbia in the fall of 1984, I continued to develop the system in my spare time, improving both the hardware and the software, and also experimenting with other interests — electronic music and signal processing. During this time, the CPU was upgraded several times as Motorola released new processors in the 68000 family. Currently, the Quamachine uses a 68030 processor rated for 33 MHz, but running at 50MHz, thanks to a homebrew clock circuit, special memory decoding tricks, a higher-than-spec operating voltage, and an ice-cube to cool the processor.

Did you catch that bit about a "rudimentary C-language compiler/interpreter as its front end"? When you have to write your own C implementation (and environment) as a precursor to solving some other problem — well, that's either the best or the worst possible approach to systems hacking.
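
If "completely lock-free" sounds abstract, the underlying pattern is small enough to show concretely. This is not code from the thesis (Synthesis is written in 68030 assembly, and adds run-time code generation on top); it's just the compare-and-swap retry loop that the whole design leans on, in modern C++:

```cpp
// Not code from the thesis; a minimal modern-C++ illustration of
// the compare-and-swap pattern Synthesis builds everything on.
#include <atomic>
#include <iostream>
#include <thread>
#include <vector>

std::atomic<long> counter{0};

void add(long delta) {
    long seen = counter.load();
    // Classic lock-free retry loop: if another thread changed the
    // value since we read it, compare_exchange fails, reloads `seen`
    // with the fresh value, and we try again. No locks, no blocking.
    while (!counter.compare_exchange_weak(seen, seen + delta)) {
        // `seen` now holds the freshly observed value; just retry.
    }
}

int main() {
    std::vector<std::thread> threads;
    for (int i = 0; i < 4; ++i)
        threads.emplace_back([] { for (int j = 0; j < 100000; ++j) add(1); });
    for (auto& t : threads) t.join();
    std::cout << counter << "\n";  // always 400000, with no mutex
}
```

Each thread optimistically computes a new value and retries if another thread got there first; Synthesis builds its kernel data structures out of this kind of non-blocking update.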

If you’re interested in operating systems, or compilers, or concurrency, or data structures, or real-time programming, or benchmarking, or optimization, you should read this thesis. Twenty-five years after it was published, it still provides a wealth of general inspiration and specific food for thought. It’s also clearly and elegantly written. And, as a final bonus, it’s a snapshot from an era in which Sony made workstations and shipped its own, proprietary, version of Unix. Good times.

