Bertrand Russell, the 20th-century British mathematician and philosopher, is credited with saying, “The whole problem with the world is that fools and fanatics are always so certain of themselves, and wiser people so full of doubts.” I’m sure all of us have known people who fit both of these descriptions, but I would argue that what the modern leader needs is a combination of the two. I want senior tech leaders to have strong opinions about people, process, and technology. If you’ve been around long enough, I’d like you to have an opinion on the best way to organize a team, share feedback, and design a systems architecture. But, because our world changes so rapidly, I also want you to be open to learning something new and changing your mind. Maybe you are a big fan of microservice architecture, but when you learn about the cost of operating it at scale, I’d like you to be open to new ideas.
A few years ago, I was introduced to the phrase “strong opinions, loosely held,” and I think it brilliantly expresses how a senior leader should approach situations. You should have a strong opinion based on your years of experience engineering or managing engineers. You should also be constantly on the lookout for information that would change your mind. This skill is super difficult because we all suffer from confirmation bias: we tend to seek out information that supports our preexisting beliefs and ignore information that contradicts them. Add to this the reinforcement that comes from filter bubbles in today’s algorithmically driven world, and even the most open-minded among us are likely to find this skill very difficult.
This puts us smack into the path of what is known as the Dunning–Kruger effect. This theory was introduced in the 1999 paper "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments" authored by social psychologists Justin Kruger and David Dunning in the Journal of Personality and Social Psychology. This is a cognitive bias whereby people with low expertise in a certain area of knowledge tend to overestimate their knowledge, while folks with more knowledge tend to underestimate their expertise. I suspect many parents have witnessed this firsthand as their toddlers are quick to explain that they 100% know how to do some task despite learning it two minutes ago.
This concept is applicable and important to us older folks as well, especially in situations where wrong decisions can have severe ramifications. None of us wants to step aboard an airplane and hear a pilot who doesn’t sound completely confident in their skills to get us safely to our destination. However, research has shown that pilots with lower skill or knowledge levels hold unrealistically positive images of their capabilities, while higher-scoring pilots underestimate their ability. The same is true for emergency room physicians.
While those of us in tech might not make decisions with such impactful short-term ramifications as those of a pilot or emergency room physician, many of us are making pronouncements, or at least influencing directions, for companies whose products touch hundreds, thousands, or even millions of people’s lives. Any time we begin thinking we absolutely know something, we should take a minute and make sure we’re not stumbling into being another example of the Dunning–Kruger effect or confirmation bias. I find that stating out loud, especially to new teams, that I have strong opinions, loosely held, helps remind me that I should be open to changing my mind about even the things that I “know for sure.”
It goes back way further than Dunning–Kruger. If you look past the religious overtones, that’s what Yeats’s “The Second Coming” is about: “The best lack all conviction, while the worst / Are full of passionate intensity.”