The Skeptical Developer

CREDIT: CrizzlDizzl via Pixabay (CC0)

In the autumn of 1854, cholera paralyzed the city of London. The outbreak’s mechanism was well understood: infection followed from breathing the foul air rising from the banks of the River Thames. Preserving one’s health therefore meant avoiding the river.

The only trouble? The theory was wrong. Even as the death toll rose uptown in Soho, no deaths were reported down at the docks. Only once blame shifted to the contaminated Broad Street pump, subsequently shut, did the outbreak finally begin to recede.

Superstition begins innocently enough. Someone sets out to explain an inexplicable phenomenon, and–lacking a better explanation–everyone else goes along. Word spreads, self-referencing credibility grows, and a flimsy claim is accepted as fact.

This may seem harmless enough–the garlic strewn about the house keeps vampires away, and all it really hurts is the smell–but simply by confusing causality, even innocuous superstitions obstruct investigation into the phenomenon lying beneath.

And belief is tenacious. While we now count the physician John Snow’s investigation into the Broad Street outbreak as a triumph of epidemiology, it was hardly clear-cut at the time. A friend later recounted that Snow:

…gave it as his opinion that the pump in Broad Street, and that pump alone, was the cause of all the pestilence. He was not believed–not a member of his own profession, not an individual in the parish believed that Snow was right. But the pump was closed nevertheless and the plague was stayed.

Germ theory flew in the face of the wisdom of the day. The smell of sewage pervading 19th-century London lent support to the belief in airborne transmission, and–despite robust evidence supporting Snow’s case–it would be another decade before the miasma theory’s staunchest supporters finally hauled down their flag.

If superstition isn’t apparent to its believers, we’re left with a tricky question: not knowing what we don’t know, how do we avoid becoming victims ourselves?

John Snow’s map of the Broad Street outbreak CREDIT: Public Domain

Shortcutting Science

Snow’s investigation into the Broad Street outbreak followed a very familiar pattern. Form a hypothesis. Gather data. Analyze the results. Science, we can do.

Though science is a powerful tool for extracting cause from effect, it’s rarely an easy path. In less-mysterious circumstances it’s faster to replace the rigor with an a priori explanation, and we often get it right. But mysteries by their nature often run deeper than they seem. Take the wrong cognitive shortcut, and we’re little better off than our garlic-tossing, river-fearing forebears.

“Not us!” you protest. “We’re logical, evidence-driven software developers.” And I’m sure you are. But I bet you can find superstitions around you.

I bet you don’t need to look very hard.

The Devil’s in the Debugging

Consider, for instance, defensive checks. There’s a belief widely held by JavaScript developers that default assignments sprinkled throughout a project will ward off evil (this particular evil revealing itself in the guise of a runtime exception). No desiccated cloves above the mantle–just a little pattern that comes out something like this:

function notify (users = []) {
  users.forEach(function (user) {
    // ...
  });
}
It’s an old story:

When evil appeared before us in the shape of a TypeError (undefined, it turns out, has no forEach method), we knew just what to do. “It’s easy,” we thought. “We’ll just make sure the argument always exists.”

Knowing as we do that defaults make the error vanish, we didn’t bother asking why the argument wasn’t set–we simply dabbed on garlic and moved along.

Unfortunately, insulating ourselves from a runtime exception has also obscured whatever upstream logic was trying to notify undefined users in the first place. Rather than pushing uncertainty back towards the system boundary, execution can now pass even deeper into the program before the lurking evil reveals itself. And the poor developer sent in to exorcise it must now understand (and excise) a superstitious “fix” blocking the way to the actual problem.
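A minimal counter-sketch (the validation style and error message here are illustrative, not a prescription): instead of defaulting the argument into silence, validate it at the boundary so the upstream bug surfaces where it starts.

```javascript
// Hypothetical alternative: fail fast instead of papering over the gap.
// A missing or malformed `users` now produces a loud, descriptive error
// at the boundary, pointing investigation back toward the caller.
function notify (users) {
  if (!Array.isArray(users)) {
    throw new TypeError('notify expected an array of users, got: ' + users);
  }
  users.forEach(function (user) {
    // ... deliver the notification
  });
}
```

Now calling notify() with no argument (or the wrong one) names the real problem immediately, rather than letting execution drift deeper before the evil shows itself.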

Defensive checks can help protect our code from runtime errors, but given their tendency to obscure real issues elsewhere, they often do more harm than good. And they’re hardly an isolated example of developer superstition. We believe that test coverage means things are working as intended, that today’s best practice discounts those that came before, that 0.1 + 0.2 = 0.3, and all sorts of things about time.
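That last one is cheap to test for yourself: IEEE 754 doubles cannot represent 0.1 or 0.2 exactly, so their sum misses 0.3 by a hair.

```javascript
// 0.1 and 0.2 have no exact binary representation, so the sum
// accumulates a tiny rounding error.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 + 0.2 === 0.3);  // false

// Comparing within a tolerance sidesteps the problem.
console.log(Math.abs((0.1 + 0.2) - 0.3) < Number.EPSILON);  // true
```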

Much as we may believe otherwise, the facts we can truly take for granted are few and far between.

Whether a superstition is protecting V. cholerae or just a digital gremlin, we can recognize when facts don’t add up. Erring on the side of skepticism helps us stay alert to evidence that supports (or disqualifies) dubious claims. Avoiding the miasma didn’t stop the outbreak? Maybe we need a different explanation.

We want to treat the root, not the symptoms, and the best way to get there is to assume the worst. Stare at the problem before you and ask why. Why this problem? This solution? If the answers smell, challenge them. Challenge them again, and keep on challenging until a satisfying answer is all that remains.

Which brings us to a final superstition: that fixing a symptom before its cause is known will invariably lead to harm. While we should question, dig, and root out voodoo fixes wherever we can, we will not always have the luxury of time. Faced with deadlines or a production fire, well, we may only have time to treat the symptoms.

So treat them. Patch now, document your assumptions, and promise to return to them later. Set a reminder, if that’s your thing, but do come back to them. Maybe time has proven you right. Maybe the facts have changed. But maybe it’s been superstition all along.

It’s not too late to make things right.