Let's remember one thing about Icarus. Yes, he died a fiery, horrible death. But at the same time, he flew close to the freakin' Sun! How cool is that? Likewise, the most exciting developments in software have been crazy stupid. I remember how quixotic the notion that you could index and search the entire Internet (or even a significant part of it) once seemed. Only by thinking big does anything cool get done; it doesn't get done by sitting around pondering all the things you shouldn't be doing.
Dead advice exists for many reasons, of course. Programming is so open-ended that it is easy to create unmaintainable, unreliable, slow software. Programs start out as blank slates; one can do anything. Without discipline, anything turns into unmitigated crap. Thus experienced programmers observe mistakes (including their own) and, sometimes, craft memorable axioms as a result. Unfortunately, these well-intentioned rules of thumb often turn into blindly followed commandments by the time they penetrate the consciousness of millions of programmers.
For example, the original advice about premature optimization has become twisted over the years, such that programmers are afraid ever to optimize lest their effort be labeled "premature". Similarly, Microsoft's infamous detour into Hungarian notation started with a misunderstood paper by future astronaut Charles Simonyi. But this sort of telephone game isn't the only source of Dead advice.
"It's not guns that kill people, It's these little hard things!"Sometimes Dead advice emerges not from a misinterpretation of the original advice but from a failure to perceive a problem's underlying cause. The treatment of threads in computer science illustrates this phenomenon well. Typical computer science programs teach little, if anything about threads; threads are an "implementation detail" with minor theoretical import (compared, say, to data structures, Big O, recursion, etc). What one does learn about them, though, makes them sound cool. Heck, let's face it, they are pretty darn cool. They're the expression of multitasking; a piece of magic. Junior programmers are disappointed when, on their first job, their mentors seem ready to pour boiling sulphuric acid on their keyboard if they so much as mention the possibility of using a thread. "My advice is not to use threads unless you're extremely experienced, a thread wizard, and even then don't use them." That's what they'll hear.
--- the Flash from "The Trickster Returns"
I'm going to step out on a limb here, insert earplugs to block out the chorus of "boos" that I can already hear even though nobody's read this post yet, and state: there's nothing wrong with using threads. This is a big reversal for me; I spread the "threads are evil" gospel for many years. Yet I have confidence in this assertion for a number of reasons, foremost among them being that the admonitions against threads haven't helped much. How many applications have you used where the user interface freezes up every time the application does something that takes longer than a second or two? How many times have you found Cancel buttons (you know, the ones you're trying to hit when you realize you're accidentally deleting all of your un-backed-up files) that take minutes to have any effect?
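To make the Cancel-button complaint concrete, here's a minimal sketch of the fix those frozen UIs need. It's in Go purely for illustration, and the `longTask` function and its chunking are my own invention, not anything from the applications above: do the slow work on its own thread of control and check for cancellation between chunks, so hitting Cancel takes effect in milliseconds rather than minutes.

```go
package main

import (
	"context"
	"fmt"
	"time"
)

// longTask is a stand-in for any slow operation (deleting files,
// talking to the network). It runs on its own goroutine and checks
// ctx between chunks, so a cancel takes effect within one chunk
// instead of minutes later.
func longTask(ctx context.Context, done chan<- string) {
	for i := 0; i < 100; i++ {
		select {
		case <-ctx.Done():
			done <- fmt.Sprintf("cancelled after %d of 100 chunks", i)
			return
		default:
			time.Sleep(50 * time.Millisecond) // one chunk of "real" work
		}
	}
	done <- "finished all 100 chunks"
}

func main() {
	ctx, cancel := context.WithCancel(context.Background())
	done := make(chan string, 1)

	go longTask(ctx, done) // the slow work happens off the "UI" thread

	// The main goroutine (our stand-in for the UI) stays free; here it
	// just simulates the user hitting Cancel a quarter-second in.
	time.Sleep(250 * time.Millisecond)
	cancel()

	fmt.Println(<-done)
}
```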
If anti-thread prohibitions haven't worked, what would? I've been casually researching this issue for a long time. At first I decided, like many, that threads are indeed evil, and that event-driven programming was the only way to go (programmers love nothing more than a dichotomy). When other programmers would point out to me that event-driven programming is, at times, a giant pain in the rear, I'd silently mark them as wimps. Finally, I read a series of papers, including this one, that convinced me there was, at least, more to the story. Even more recently, someone clued me in to the work of Joe Armstrong, one of the few people outside of academia to have stepped back from the problem of threads and tackled, instead, concurrency, which was the real issue all along. By far the coolest thing Joe did was realize that you can in fact fly close to the Sun and spawn as many darn "threads" as the problem you're solving truly warrants. You can use events too; it turns out there was never a dichotomy between events and threads either.
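Here's a rough sketch of that insight, translated into Go rather than Joe's Erlang (the `worker`/inbox structure and the square-the-numbers busywork are made up for illustration): spawn as many lightweight threads of control as the problem warrants, and drive each one with messages, which is to say, with events.

```go
package main

import (
	"fmt"
	"sync"
)

// worker is one lightweight "process" in the Erlang sense: it owns an
// inbox and reacts to each message (event) that arrives on it.
func worker(inbox <-chan int, results chan<- int, wg *sync.WaitGroup) {
	defer wg.Done()
	for msg := range inbox {
		results <- msg * msg // pretend this is real work
	}
}

func main() {
	const n = 10000 // spawn as many "threads" as the problem warrants

	var wg sync.WaitGroup
	results := make(chan int, n)
	inboxes := make([]chan int, n)

	for i := range inboxes {
		inboxes[i] = make(chan int, 1)
		wg.Add(1)
		go worker(inboxes[i], results, &wg)
	}

	// Drive the workers with events: one message each, then close.
	for i, inbox := range inboxes {
		inbox <- i
		close(inbox)
	}

	wg.Wait()
	close(results)

	sum := 0
	for r := range results {
		sum += r
	}
	fmt.Println("sum of squares from 10000 workers:", sum)
}
```

Ten thousand of these cost a few kilobytes of stack apiece, which is exactly why "spawn one per thing you care about" stops sounding crazy.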
I found this revelatory after having wasted a lot of time, over the years, writing things like HTTP parsers in frameworks that either force you to use only events, or frameworks that let you spawn threads easily (though not necessarily efficiently) but have no standard event-passing mechanism. It's not just that I didn't think of it myself; it's that almost everyone I talked to about this sort of issue was either in the "threads are evil" camp or the "events are confusing" camp. I wasted a lot of time following bad advice.
Experiences with Dead advice such as "threads are evil" have led me to question other bits of Dead advice, including:
- Premature optimization is the root of all evil.
- Never use multiple inheritance.
- YAGNI.
- Never use macros.