What just happened in our fictional example is that you fell prey to pluralistic ignorance. Pluralistic ignorance happens in situations where everyone privately rejects a norm but thinks that everyone else in the group supports it. Over time, pluralistic ignorance can lead to situations where a group follows rules that all of its members reject in private.
We fall into this social trap when we conclude that the behavior of our peers depends on beliefs different from our own, even when we behave identically ourselves. That’s what happened around Andersen’s naked emperor. Because everyone praised the emperor’s new clothes, each individual thought they had missed something obvious. So they chose to conform to the group’s behavior and play along with the praise of the wonderful clothes they couldn’t see.

Another common social bias is to mistake a familiar opinion for a widespread one. If we hear the same opinion repeatedly, we come to regard it as more prevalent than it really is. As if that weren’t bad enough, we fall for this bias even when it’s the same person who keeps expressing that opinion (source: Inferring the popularity of an opinion from its familiarity: A repetitive voice can sound like a chorus [WMGS07]).
This means a single individual, constantly expressing a strong opinion, is enough to bias your whole software development project. The opinion may concern technology choices, methodologies, or programming languages. Let’s see what you can do about it.
Most people don’t like to express deviating opinions, but there are exceptions. One exception is when our minority opinion aligns with the group ideal: it deviates from the group norm, but in a positive direction. The group has some quality it values, and we take a more extreme position and value it even more. In that setting, we’re more inclined to speak up, and we’ll feel good about it when we do.
Within our world of programming, such “good” minority opinions may concern desired attributes like automated tests and code quality. For example, if tests are good, then testing everything must be even better (even if it forces us to slice our designs into unfathomable pieces). And since code quality matters, we must write code of the highest possible quality all the time (even when prototyping throwaway code).
Given what we know about pluralistic ignorance and our tendency to mistake familiar opinions for common ones, it’s easy to see how these strong, deviating opinions may move a team in a more extreme direction.
Social biases are hard to avoid. When you suspect them in your team, try one of the following approaches:
Ask questions: By asking a question, you make others aware that the proposed views aren’t shared by everyone.
Talk to people: Decision biases like pluralistic ignorance often grow from our fears of rejection and criticism. So if you think a decision is wrong but everyone else seems fine with it, talk to your peers. Ask them what they like about the decision.
Support decisions with data: We cannot avoid social and cognitive biases. What we can do is check our assumptions against data that either supports or challenges the decision, as the sketch after this list illustrates. The rest of this book will arm you with several analyses for this purpose.
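To make the last point concrete, here’s a minimal, hypothetical sketch of one kind of data you could bring to such a discussion: how often each file in your codebase actually changes, mined from the version-control history. This isn’t the book’s own tooling, just an illustration; it assumes git is installed and that the script runs inside a repository.

    # Count how often each file has changed, using git history as data.
    # A minimal sketch; assumes the script runs inside a git repository.
    import subprocess
    from collections import Counter

    def change_frequencies():
        # List every file touched by every commit, one path per line.
        log = subprocess.run(
            ["git", "log", "--name-only", "--pretty=format:"],
            capture_output=True, text=True, check=True)
        files = [line for line in log.stdout.splitlines() if line.strip()]
        return Counter(files)

    if __name__ == "__main__":
        # Print the ten most frequently changed files.
        for path, changes in change_frequencies().most_common(10):
            print(f"{changes:5d}  {path}")

A report like this won’t settle an argument by itself, but it shifts the discussion from whose opinion sounds most familiar to what the code’s history actually shows.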
If you’re in a leadership position, you have additional possibilities to guide your group toward good decisions:
Use outside experts to review your decisions.
Let subgroups work independently on the same problem.
Avoid advocating a specific solution early in the discussions.
Discuss worst-case scenarios to make the group risk-aware.
Plan a second meeting upfront to reconsider the decisions of the first one.
These strategies help you avoid groupthink (source: Group Process, Group Decision, Group Action [BK03]). Groupthink is a disastrous consequence of social biases where the group ends up suppressing all forms of internal dissent. The result is group decisions that ignore alternatives and the risk of failure, and that give a false sense of consensus.
As you’ve seen, pluralistic ignorance often leads to groupthink. This seems to be what happened in the Thomas Quick case.