As programmers, we interact with software tools daily. Most of them can barely get the job done. But occasionally, we discover a piece of software that transcends mere utility. These tools capture our imagination, open new possibilities, and affect how we design our own systems. I call such software enlightenmentware.

The most common source of enlightenment for programmers is the programming language they use at work or learn as a hobby. I experienced many jolts of enlightenment from fiddling with programming languages, from masm and C to Prolog and Idris. I won’t focus on languages, however, since the mind-expanding effects of language learning are old news (see, for example, Peter Norvig’s Teach Yourself Programming in Ten Years).

In this article, I praise the software that contributed the most to my enlightenment.


unix is user-friendly—it’s just choosy about who its friends are.

Anonymous, in The Art of unix Programming by Eric S. Raymond

I started looking for my first real programming job around 2008 while studying at a university in my hometown of Nizhny Novgorod. Almost all the open positions required knowledge of mysterious things called unix and sockets. My curriculum didn’t offer a course on unix or operating systems in general, so I decided to get a textbook and master the topic myself.

The unix Operating System by Andrey Robachevsky et al., also known as the turtle book in Russia because of its cover, introduced me to the magical world of unix-like operating systems. unix became something I could understand, explore, and programmatically interact with. All pieces of the puzzle—the filesystem interface, the process model with environments and permissions, forking, sockets, and signals—fell into place and revealed a coherent, beautiful picture.
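
The process model in particular clicked once I saw how little code it takes to exercise. Here is a minimal Python sketch of the fork-and-wait dance the turtle book describes (the function name is mine; it runs on any unix-like system):

```python
import os

def spawn_and_reap(code: int) -> int:
    """Fork a child that exits with `code`; return the status the parent reaps."""
    pid = os.fork()
    if pid == 0:
        # Child process: terminate immediately with the requested exit code.
        os._exit(code)
    # Parent process: block until the child exits, then decode its status.
    _, status = os.waitpid(pid, 0)
    return os.WEXITSTATUS(status)
```

The same handful of system calls—fork, exit, wait—underlies every shell and every server that unix runs.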

A search for a working unix installation led me to Mandriva Linux. It was like discovering a parallel universe where you don’t have to pirate software or spend forty minutes installing an ide to compile a C program. Here, people developed software for fun and shared it freely. I couldn’t fathom why anyone would use Windows. (I have become significantly more tolerant since my early university years. Windows, specifically the NT family, is a great operating system. I even have it installed on my gaming pc so that I can buy games I never play.)

From that moment on, unix followed me through all stages of my life: the toddler phase of keeping up with cutting-edge Ubuntu releases, the rebellious teens of compiling custom kernels for my Thinkpad T61p and emerging the @world on Gentoo, the maturity of returning to Ubuntu lts and delaying upgrades until the first dot-one release, and the overwhelmed-parent stage of becoming a happy macOS user.

unix also became an essential building block in my profession. Most of the software I wrote operates in a unix environment, and I still occasionally consult my copy of Advanced Programming in the unix Environment.


It is easy to shoot your foot off with git, but also easy to revert to a previous foot and merge it with your current leg.

Jack William Bell

I encountered version control systems in early 2009; the company I worked for used Rational ClearCase to manage their code. The system versioned each file separately and relied on large configuration files—config specs—to construct a consistent snapshot of the source tree. The tool was utterly confusing and intimidating, so I avoided dealing with it beyond the minimal requirements of my job.

About a year later, I joined a shop that used Subversion. This time, I invested in learning upfront and swallowed the entire Version Control with Subversion before making my first commit. Subversion was easy to understand and use; I couldn’t imagine how to improve on it. Still, I perceived it as a tool that you use at work. There was enough friction in setting up a repository to hinder its use for small personal projects. At the time, Google offered hosting on Google Code, but I didn’t feel comfortable sharing my experiments with the world back then.

And then I discovered Git.

Git was nothing like Subversion. It had a steep learning curve and confused everyone to no end. (Way before we all got used to ChatGPT, Kim Ødegaard created a service that generates random man pages mocking Git’s dense documentation style.) Still, the confusion was qualitatively different from what I experienced with ClearCase. ClearCase is confusing like a Russian novel: All the characters have strange names, the plot is complex, and it doesn’t end well. Git is confusing like math: It slowly melts your brain and molds it into a tesseract, giving access to higher dimensions.

Git removed the friction from using version control; there was no excuse not to version anything of value anymore. Merging branches with Git didn’t cause anxiety disorders. The staging area—confusingly named index—became essential to my workflows. But my favorite feature was the breathtaking beauty of Git’s design, the elegant mix of distributed systems, acyclic graphs, and content-addressed storage.
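
The content-addressed storage is easy to demystify: a blob’s object id is just the SHA-1 hash of a short header followed by the file’s bytes. A small Python sketch of what `git hash-object` computes:

```python
import hashlib

def git_blob_id(content: bytes) -> str:
    """Compute the object id Git assigns to a file's content.

    Git stores a blob under the SHA-1 of a "blob <size>\\0" header plus the
    bytes themselves, which is what makes the storage content-addressed:
    identical content always gets the same id, regardless of file name.
    """
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Identical content hashes to the same id; different content does not.
assert git_blob_id(b"hello\n") == git_blob_id(b"hello\n")
assert git_blob_id(b"hello\n") != git_blob_id(b"hello")
```

Running `git hash-object` on a file with the same bytes prints the same forty hex digits, so two repositories that never met agree on every object id.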

Learning about Git’s internals was so much fun that I became interested in the nuts and bolts of other version control systems. I travelled through time from Darcs to Mercurial, BitKeeper, and the ultimate origin, SCCS. I also built a toy one-file version control system while learning Rust.

Will Git ever be replaced with something better? Just as it was hard to imagine an improvement over Subversion before Git came along, it’s hard to imagine a significant improvement over Git now. For me, Git’s primary disadvantage is its snapshot-oriented approach, which makes merges hard to reason about. Git, Mercurial, and most other tools make it challenging to separate original code from the decisions that the person merging the files had to make. (Jane Street’s technical staff report similar concerns; see, for example, the Patch review vs. diff review, revisited blog article.) Systems based on patch theory, such as Pijul and Darcs, might address these issues.


While any text editor can save your files, only Emacs can save your soul.

Per Abrahamsen

I edited my first programs in the friendly blue window of Turbo Pascal 7.0. The environment had little friction: no project or build configuration, no noticeable build time; you type your code and run it. It was the perfect tool for learning.

My university used Pascal for introductory programming classes, so I also used Turbo Pascal for my assignments. Later courses introduced C++ and Java, for which we used Visual Studio 6.0 and JBuilder. Although we learned to invoke compilers from the command line, ides dominated my early code-editing experience.

At my first programming job, I worked on a remote Solaris workstation over a Citrix connection. Almost everyone in our group used NEdit to edit the code. One day, I noticed a person whose editor looked markedly different from everyone else’s; the background was dark, and the code glowed with bright colors. To me, that was a sign of their superior technical knowledge. I needed to learn how to tweak my editor.

The quest for customization led me to Vim (the workstation had Vim 6 installed out of the box). After all, if the goal is to stand out from the crowd, why stop at the color scheme? I went through the Vim tutorial, and it clicked with me immediately. It felt like playing a musical instrument: challenging but fun. It turned a mundane job of fixing bugs into an exercise in skill.

I don’t remember exactly when and why I got interested in Emacs. (Daily journaling is one of the things I wish I had started doing earlier. I’m nowhere near Stephen Wolfram’s level, but I enjoy going back and seeing what I was up to a few years ago.) Most likely, it was a result of my obsession with Lisp after reading Structure and Interpretation of Computer Programs and looking for a Lisp to interact with. I remember reading An Introduction to Programming in Emacs Lisp around 2010 and having a great time.

I became interested in editor internals. Using The Craft of Text Editing by Craig A. Finseth as a guide, I explored the source code of various editors to see how they worked: which data structures they used to represent text buffers, how they interacted with extensions, and how they implemented the undo mechanism. I found Vim’s source code somewhat messy, inconsistent, and hard to understand. Emacs’s source was spotless, well-organized, and well-documented.

A dive into Emacs internals also revealed the inherent beauty of its architecture. Emacs is a Lisp machine that provides text editing and window management capabilities. It is a powerful, convenient, and friendly development environment for building dynamic text-driven applications in Emacs Lisp.

Almost everything in Emacs is a Lisp object that you can inspect, interact with, and read the documentation for. That makes Emacs’ documentation system unparalleled once you master it. Emacs’ dynamism also makes extending it far easier than extending any other editor. The feedback loop is airtight: you can immediately try out your code in the context of the editor you use to write it.

Although I’m writing these words in Visual Studio Code, I always have my Emacs open. (I also have a tmux session with multiple nvim instances running; I use these when my pinky gets tired of holding down the Ctrl key.) Many things are easier in Emacs; I don’t think any software will ever completely replace it for me. It’s also my editor of choice if I need to implement an editor extension. Programming in Emacs Lisp is a joy, especially compared to writing Vimscript.


I also must confess to a strong bias against the fashion for reusable code. To me, re-editable code is much, much better than an untouchable black box or toolkit.

Donald Knuth

New Year’s Eve 2013 didn’t go as planned. I was on a cruise ship crossing the stormy Baltic Sea and couldn’t stay on my feet because of seasickness. Instead of enjoying tasty treats with the rest of the passengers, I was lying in bed reading the book I had taken for the trip: The Boost Graph Library by Jeremy G. Siek et al.

Most algorithm libraries require you to commit to a specific data representation, which makes integrating them into an existing project prohibitively expensive. That’s especially true for graph algorithms: The vertices and edges are usually implicitly defined and deeply embedded into other data structures, so it’s easier to re-implement the algorithm than to use a generic library.

The Boost.Graph library solves this problem elegantly using Alex Stepanov’s ideas on generic programming. It uses a bag of tricks (type traits, property maps, visitors) to implement graph algorithms that work with any graph representation you throw at them, provided that you supply an adapter telling the library how to view your data structures as a graph.
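
The adapter idea translates to any language. Here is a minimal Python sketch in the spirit of Boost.Graph: the traversal never sees a concrete graph type, only a `neighbors` function that views the caller’s data as a graph (the interface is my simplification, not the library’s API):

```python
from collections import deque

def bfs(start, neighbors):
    """Generic breadth-first search over an abstract graph.

    `neighbors` is the adapter: given a vertex, it yields adjacent
    vertices. The algorithm works with any representation the caller
    can wrap in such a function.
    """
    seen, order, queue = {start}, [], deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for n in neighbors(node):
            if n not in seen:
                seen.add(n)
                queue.append(n)
    return order

# The same algorithm traverses a dict-of-lists...
g = {"a": ["b", "c"], "b": ["d"], "c": [], "d": []}
assert bfs("a", g.__getitem__) == ["a", "b", "c", "d"]
# ...or an implicit graph defined by a function, with no conversion step.
assert bfs(0, lambda n: [n + 1] if n < 3 else []) == [0, 1, 2, 3]
```

Boost.Graph achieves the same decoupling at compile time with C++ templates, so the abstraction costs nothing at run time.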

Even though I never had a chance to use the library in practice (given that I share Donald Knuth’s attitude opening this section, I would probably not use it even if I had the chance: I’d rather write one page of interesting code traversing a graph than two pages of boring adapters required to invoke a generic algorithm), its design helped me deepen my understanding of stl design and of generic programming in general. It also helped me understand the motivation for advanced type-level programming features in other programming languages, such as type families in Haskell. Overall, Boost.Graph is one of the most enlightening pieces of software that I’ve never used.


If make doesn’t do what you expect it to, it’s a good chance the makefile is wrong.

I wrote my first Makefile around 2009 while working on a research project in computational mathematics for my degree. I already used make at work, but I didn’t need to understand how it worked. This time, I had to compile a fortran program mixing sources adhering to different language standards: from venerable fortran 77 to hip fortran 2003. To get a deeper understanding of the tool, I referred to Managing Projects with GNU Make by Robert Mecklenburg.
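
The model underneath make is tiny: a target gets rebuilt when it is missing or when any of its prerequisites is newer. A Python sketch of that decision rule (timestamps as plain numbers; the function name is mine, not make’s):

```python
def needs_rebuild(target_mtime, prereq_mtimes):
    """make's core decision: rebuild if the target is missing (None)
    or any prerequisite has a newer modification time."""
    if target_mtime is None:
        return True
    return any(m > target_mtime for m in prereq_mtimes)

# The object file does not exist yet: build it.
assert needs_rebuild(None, [100])
# A source changed after the last build: rebuild.
assert needs_rebuild(100, [150])
# Everything is older than the target: nothing to do.
assert not needs_rebuild(200, [100, 150])
```

Everything else in make—pattern rules, variables, recursive invocations—exists to populate the dependency graph this rule walks, and that is exactly where the complexity creeps in.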

Most books on technology excite me: I become enthusiastic about the subject and want to try it out in practice. The book on make had the opposite effect. The complexity required to make builds correct and ergonomic made me yearn for a better tool. (One book that made me feel the same way was Modern C++ Design by Andrei Alexandrescu: the book is deep and beautifully written, but the terrifyingly clever and ugly tricks in the second chapter made me question my choice of programming language. Another was Autotools by John Calcote.)

After my deep dive into make, I often fiddled with build systems at work: I introduced CMake to my first C++ project to replace complex and scarily incorrect Makefiles, and I replaced an inflexible Ant-based build system in a 500 kloc Java project with Gradle scripts that everyone on the team could contribute to. But all the tools I tried, including CMake, Ant, Maven, Gradle, SCons, and autotools, left me deeply unsatisfied. They were clunky, awkward, and hard to extend and compose.

In 2016, I joined Google in Zurich. I had heard about Google’s internal build tool, blaze, and couldn’t wait to get my hands on it. Surprisingly, I didn’t need to fiddle with blaze, nor did I have to understand how it worked. I could copy some build targets and edit the dependency list, and the build worked as expected. blaze made correct and fast builds not just easy, but boring in the good sense. Only a few years later, when I attempted to use Bazel—the open-source version of blaze—for a toy personal project, did I have to understand the underlying model.

Bazel was the final piece of the puzzle, together with Haskell’s typeclasses, the Flume pipeline interface, and the TensorFlow 1.0 execution model, that made me understand the ubiquitous plan-execute pattern. (The Build Systems à la Carte article by Andrey Mokhov, Neil Mitchell, and Simon Peyton Jones explains how various build system designs map to Haskell typeclasses; Thomas Leonard’s CI/CD pipelines: Monad, Arrow or Dart? blog post is also a great read on this topic.) A Bazel build file is a program that constructs a slice of the build artifact graph. Bazel rules don’t run build commands; they declare how to transform inputs into outputs, and the Bazel engine figures out the rest.
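
The plan-execute split fits in a few lines of Python. This sketch is my own toy model, not Bazel’s API: rules only declare how outputs derive from inputs, and a tiny engine derives the execution order from the resulting graph.

```python
import graphlib  # stdlib topological sorter, Python 3.9+

def build(rules, sources):
    """Plan-execute in miniature.

    rules:   output name -> (list of input names, action combining inputs)
    sources: input name -> value

    Planning: topologically order the artifact graph declared by `rules`.
    Execution: run each action exactly once, inputs before outputs.
    """
    deps = {out: ins for out, (ins, _) in rules.items()}
    values = dict(sources)
    for node in graphlib.TopologicalSorter(deps).static_order():
        if node in rules:  # source nodes carry values already
            ins, action = rules[node]
            values[node] = action(*(values[i] for i in ins))
    return values

# Declaring artifacts in any order works; the engine finds the order.
rules = {
    "app":   (["lib.o"], lambda o: o + " [linked]"),
    "lib.o": (["lib.c"], lambda c: c + " [compiled]"),
}
assert build(rules, {"lib.c": "code"})["app"] == "code [compiled] [linked]"
```

Because the plan is data, the engine is free to cache, parallelize, or distribute the actions—which is precisely what makes blaze fast and correct at Google scale.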

My relationship with the tool reached true intimacy when I helped transition dfinity’s build system to Bazel. Despite all the challenges I faced on the way, Bazel is still my favorite build system. It’s fast, correct, easy to use, and language-agnostic.

Paraphrasing Bjarne Stroustrup, I think a smaller, simpler, cleaner build system is struggling to get out from within Bazel. I hope this core will someday reveal itself to the world and become the standard tool for building all software.


After presenting my cases, I find it tempting to look for a common theme. What makes good enlightenmentware? For me, these are the key points:

What’s your enlightenmentware? Tell me on Hacker News or Reddit!
