Sunday, September 2, 2007

The Icarus Ultimatum

It is unlikely that every programmer is familiar with Icarus, but I bet that almost all programmers have something in common with him. Programmers are saturated with advice not to do things, similar to the advice Icarus' dad gave him about aviation. Don't use threads unless you really know what you're doing (and then don't use them anyway.) Don't use new language features (they're too dangerous.) Use the "right tool for the right job" (i.e., not the one you like.) Don't "prematurely" optimize (note: it's always premature.) Don't use macros; nobody will be able to read your code. Don't use multiple inheritance. You ain't gonna need it (you dork!) Don't, don't, don't; no, No, NO! For the rest of this essay I will refer to such advice as "Dead" advice, in honor of Icarus' father, Daedalus (and also to honor my own lack of spelling ability.)

Let's remember one thing about Icarus. Yes, he died a fiery, horrible death. But at the same time, he flew close to the freakin' Sun! How cool is that? Likewise, the most exciting developments in software have been crazy stupid. I remember when the notion that you could index and search the entire Internet (or even a significant part of it) seemed quixotic. Only by thinking big does anything cool get done; it doesn't get done by sitting around pondering all the things you shouldn't be doing.

Dead advice exists for many reasons, of course. Programming is so open-ended that it is easy to create unmaintainable, unreliable, slow software. Programs start out as a blank slate; one can do anything. Without discipline, anything turns into unmitigated crap. Thus experienced programmers observe mistakes (including their own) and, sometimes, craft memorable axioms as a result. Unfortunately, these well-intentioned rules-of-thumb often turn into blindly-followed commandments by the time they penetrate the consciousness of millions of programmers.

For example, the original advice about premature optimization has become twisted over the years, such that programmers are afraid ever to optimize lest their effort be labeled "premature". Similarly, Microsoft's infamous detour into Hungarian notation started with a misunderstood paper by future astronaut Charles Simonyi. But this sort of telephone game isn't the only source of Dead advice.

"It's not guns that kill people, it's these little hard things!"
--- the Flash from "The Trickster Returns"

Sometimes Dead advice emerges not from a misinterpretation of the original advice but from a failure to perceive a problem's underlying cause. The treatment of threads in computer science illustrates this phenomenon well. Typical computer science programs teach little, if anything, about threads; threads are an "implementation detail" with minor theoretical import (compared, say, to data structures, Big O, recursion, etc.). What one does learn about them, though, makes them sound cool. Heck, let's face it, they are pretty darn cool. They're the expression of multitasking, a piece of magic. Junior programmers are disappointed when, on their first job, their mentors seem ready to pour boiling sulphuric acid on their keyboard if they so much as mention the possibility of using a thread. "My advice is not to use threads unless you're extremely experienced, a thread wizard, and even then don't use them." That's what they'll hear.

I'm going to step out on a limb here, insert earplugs to drown out the chorus of "boos" that I can already hear even though nobody's read this post yet, and state: there's nothing wrong with using threads. This is a big reversal for me; I spread the "threads are evil" gospel for many years. Yet I have confidence in this assertion for a number of reasons, foremost among them being that the admonitions against threads haven't helped much. How many applications have you used where the user interface freezes up every time the application does something that takes longer than a second or two? How many times have you found Cancel buttons (you know, the ones you're trying to hit when you realize you're accidentally deleting all of your un-backed-up files) that take minutes to have any effect?
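To make the frozen-UI complaint concrete, here is a minimal Python sketch (the names and timings are mine, and a bare threading.Event stands in for a real Cancel button): the slow work runs on a worker thread, so the main thread stays free to repaint and to honor Cancel immediately.

    import threading
    import time

    cancel = threading.Event()

    def long_running_job():
        """Runs on a worker thread so the UI (main) thread never blocks."""
        for step in range(100):
            if cancel.is_set():          # the Cancel button merely sets this flag
                print("cancelled at step", step)
                return
            time.sleep(0.1)              # stand-in for real work
        print("finished")

    worker = threading.Thread(target=long_running_job, daemon=True)
    worker.start()

    # The main thread keeps running; here it "presses Cancel" after a moment.
    time.sleep(0.35)
    cancel.set()
    worker.join()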

If anti-thread prohibitions haven't worked, what would? I've been casually researching this issue for a long time. At first I decided, like many, that threads are indeed evil, and that event-driven programming was the only way to go (programmers love nothing more than a dichotomy.) When other programmers would point out to me that event-driven programming is, at times, a giant pain in the rear, I'd silently mark them as wimps. Finally, I read a series of papers, including this one, that convinced me there was, at least, more to the story. Even more recently, someone clued me in to the work of Joe Armstrong, one of the few people outside of academia to have stepped back from the problem of threads and tackled, instead, concurrency, which was the real issue all along. By far the coolest thing Joe did was realize that you can in fact fly close to the Sun and spawn as many darn "threads" as the problem you're solving truly warrants. You can use events too; it turns out there was never a dichotomy between events and threads either.
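For the flavor of that idea, here is a rough Python approximation of Erlang-style processes (Python and the spawn helper are my choices for illustration, not Armstrong's): each worker is a thread with its own mailbox, and workers interact only by putting messages in each other's mailboxes.

    import queue
    import threading
    import time

    def spawn(handler):
        """Start a worker with its own mailbox; it sees only messages, never shared state."""
        mailbox = queue.Queue()

        def loop():
            while True:
                handler(mailbox.get())   # block until a message arrives

        threading.Thread(target=loop, daemon=True).start()
        return mailbox                   # senders talk to the worker via mailbox.put()

    # Spawn as many workers as the problem warrants and wire them together.
    printer = spawn(lambda msg: print("got:", msg))
    doubler = spawn(lambda n: printer.put(n * 2))

    for n in range(5):
        doubler.put(n)

    time.sleep(0.5)   # crude: give the daemon workers a moment to drain their mailboxes

The plumbing is trivial; the point is that nothing here shares mutable state, so you can spawn as many of these as the problem calls for without growing a thicket of locks.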

I found this revelatory after having wasted a lot of time, over the years, writing things like HTTP parsers in frameworks that either force you to use only events, or that let you spawn threads easily (though not necessarily efficiently) but offer no standard event-passing mechanism. It's not just that I didn't think of it myself; it's that almost everyone I talked to about this sort of issue was either in the "threads are evil" camp or the "events are confusing" camp. I wasted a lot of time following bad advice.
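To show where that time went, here is a sketch (in Python, with invented names) of the two styles for something parser-shaped: the events-only version has to carry its state across callbacks, while the thread-per-connection version reads as plain straight-line code.

    class LineParser:
        """Events-only style: input arrives in arbitrary chunks via callbacks,
        so the parser has to carry its state between calls."""

        def __init__(self, on_line):
            self.buffer = b""
            self.on_line = on_line

        def feed(self, chunk):
            self.buffer += chunk
            while b"\n" in self.buffer:
                line, self.buffer = self.buffer.split(b"\n", 1)
                self.on_line(line)

    def read_lines_blocking(stream, on_line):
        """Thread-per-connection style: a blocking loop over the stream lets the
        same logic be written as straight-line code; it blocks only this one thread."""
        for line in stream:
            on_line(line.rstrip(b"\n"))

    # Event style in action: the caller pushes whatever chunks the network hands it.
    parser = LineParser(print)
    parser.feed(b"GET / HT")
    parser.feed(b"TP/1.1\nHost: example.com\n")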

Experiences with Dead advice such as "threads are evil" have led me to question other Dead advice, such as:
  • Premature optimization is the root of all evil.

  • Never use multiple inheritance.

  • YAGNI.

  • Never use macros.

Each of these has a kernel of truth to it, but each is also too easily misunderstood and has not had the overall effect on programming that was originally intended. If I have some time I'll try to give more examples of each of the above. In the meantime, I do have to mention one piece of (arguably) Dead advice that I haven't found any fault with yet:

2 comments:

denis bider said...

When people say "don't use threads", what they mean is, don't use locks and monitors to communicate between them. That is a recipe for loads of heisenbugs.

However, if you limit your locks and monitors to a small portion of your code which formalizes your message-passing semantics, then your exposure to heisenbugs deriving from your use of locks and monitors is limited to the message-passing implementation. It is therefore just this core that needs to be written by a zen master, and less experienced programmers can write the rest.

So, yeah, the "dead" advice is still correct, if properly interpreted. Apparently you just didn't understand what it meant. ;)
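For what it's worth, here is a minimal Python sketch of the shape denis bider describes (the Mailbox class and its send/receive API are illustrative, not anyone's actual code): all the locking lives in one small core, and the rest of the program only exchanges messages through it.

    import threading
    from collections import deque

    class Mailbox:
        """The only place in the program that touches a lock or a condition
        variable; everything else just calls send() and receive()."""

        def __init__(self):
            self._items = deque()
            self._lock = threading.Lock()
            self._not_empty = threading.Condition(self._lock)

        def send(self, message):
            with self._not_empty:            # acquires the underlying lock
                self._items.append(message)
                self._not_empty.notify()

        def receive(self):
            with self._not_empty:
                while not self._items:       # loop guards against spurious wakeups
                    self._not_empty.wait()
                return self._items.popleft()

In Python the standard library's queue.Queue already plays this role, but the shape is the same: the tricky synchronization is confined to one small, carefully reviewed core, and the rest of the program only sends and receives messages.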

Unknown said...

Echoing what denis bider said, it's not threads that are dangerous, it's the interfaces between them that can be, if they're too complex or large in scope. Simple queue-based message passing (e.g. Erlang) is hard to get wrong. Having a hundred shared interdependent objects is hard to get right.

It's possible to have a purely event-based system with one thread, using things like async IO, but library functions that block and don't have an async version always seem to gum up the works. For example, an app that freezes for no apparent reason on some deep RPC call.
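As an illustration of that last problem and its usual workaround (shown here with Python's asyncio purely as an example; the blocking_lookup helper is invented), a blocking library call can be pushed onto a worker thread so the event loop keeps servicing other callbacks.

    import asyncio
    import socket

    def blocking_lookup(host):
        """A stand-in for a library call that blocks and has no async variant."""
        return socket.gethostbyname(host)

    async def main():
        loop = asyncio.get_running_loop()
        # Run the blocking call on a thread-pool worker; the event loop stays
        # responsive instead of freezing on the deep, slow call.
        address = await loop.run_in_executor(None, blocking_lookup, "example.com")
        print(address)

    asyncio.run(main())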
