Sceptical technologist

A tale of two giants

The father of Unix and C should be missed more than the father of Apple, but won’t be.
3 January 2012
Paul Furber is a coder, gamer, writer and eclectic hobbyist. Comments welcome at

Two giants of the computing world died in October. I want to talk about the lesser-sung one, because he did far more for the computer and consumer electronics industries than Steve Jobs did.

Dennis Ritchie, who died aged 70, has his fingerprints on almost all the technology we use, for two main reasons: while working at Bell Labs, he created the C programming language and, alongside Ken Thompson, Doug McIlroy and Joe Ossanna, developed the Unix operating system. With Brian Kernighan he later co-wrote the definitive book on C.

Both creations are elegant and concise solutions to big problems. The C language was developed because hardware and software in the late 1960s were tightly wedded; to develop software for Company A’s minicomputer, you needed to learn Company A’s hardware instruction set. Company B’s computer had a different instruction set, different ways of specifying things, often even a different number of bits in a byte. Ritchie’s C language solved that problem. A program written in C could target any hardware for which there was a C compiler.

This is still true: C remains the most ported language in the world, running on everything from giant supercomputer clusters to embedded controllers and everything in between. Our telephone networks are written in C, as are nearly all operating system kernels. Languages derived from C include JavaScript, which runs on hundreds of millions of browsers; Java, which runs millions of business applications; and C++, the language of choice for modern games. Programmers still genuflect at the covers of the classic 1978 text by Kernighan and Ritchie describing the C language and how to program in it.

Unixifying force

Unix, Ritchie’s other great contribution to mankind, is a multitasking, multiuser operating system originally written to share expensive computing resources among many users, and it still runs most of the world. Rewritten in C in the early 1970s by Ritchie and Thompson, it strongly influenced the hardware market a decade later by decoupling hardware from software. Unix, written in C, would run on most hardware; applications written in C would run on anything that ran Unix. History shows a great explosion of productivity and software during this period, particularly in academia, thanks to that freedom. The internet, the Free Software movement, the World Wide Web, the telecoms industry, Apple and Linux owe their existence (and ongoing health) to Ritchie’s work on Unix and to the cheap hardware that sprang up from greater competition. Many of the things we take for granted in computing – the client-server model, configuration in plain text files, connecting small and flexible tools, and choosing simplicity of implementation over efficiency – were first shown to be practical on Unix.

Does the world at large care about Dennis Ritchie’s contribution? No, because technology at every level has become so exceedingly complex that no one person can understand even a fraction of it all, let alone the entire stack. I recently read a provocative essay explaining the almost infinite depth of complexity behind the scenes of a simple visit to a web page. It’s worse than that, actually: to really understand the entire process of what happens when just a single key on a keyboard is pressed would take several lifetimes of study of silicon, chemistry, software and electronics, assuming you were bright enough to understand any of it in the first place. But because it’s invisible, we don’t appreciate any of the pure magic of it. What Steve Jobs did was very visible, hence the public reaction to his death. But Dennis Ritchie’s contribution was far broader and far deeper. We’re much poorer without him.