Playing diagnostician and not succeeding on the first try is a common occurrence. Diagnosis, in my opinion, is more art than science, and our misses come down to a few important factors. The first is that our brains love an obvious culprit.
When something breaks, it feels satisfying to pin the failure on a single source. New device installed? Must be the device. We’re wired to prefer clean narratives over messy systems, and it’s very hard to think thoroughly about systems because they’re inherently complicated: most real‑world problems arise from interactions between multiple elements.
To return to yesterday’s thermostat story, there are so many things involved aside from the thermostat itself: the furnace, the electronics, the wiring, the ducts, and the various sensors. Still, our intuition ignores these other components. It’s the same reason people misdiagnose car problems, software bugs, or even interpersonal conflicts.
In addition, the familiar quickly becomes invisible. We had that furnace running like clockwork for a dozen years, and like a loyal servant, it faded into the background.
The thermostat was the novelty, so it got the blame. This is the “assumed good” bias: we trust what we know. All this to say that my recent experience matches how professionals in engineering, medicine, and aviation describe diagnostic errors.
They warn against “anchoring” on one explanation too early, and against “confirmation bias,” noticing only the evidence that supports the initial assumption. The only way to break out of that loop is to step back and widen the frame. That’s the real skill: not just fixing one lone defective part, but recognizing a narrow way of thinking. If anything, my story is a perfect reminder that most problems aren’t isolated but relational.
The thermostat wasn’t misbehaving alone; it was dancing with a partner I forgot to watch!
