In November 2008, with the financial crisis in full swing, Queen Elizabeth attended a ceremony at the London School of Economics. Facing an audience of high-ranking academics, she posed a simple question: “Why did nobody notice it?”
How could it be that no one among the smartest economists, commentators, and policymakers in all her kingdom – and beyond – had been able to see the formation of a bubble of such dimensions?
And yet critical facts were readily available – facts that could have warned about the craziness of the housing market, on which an even bigger financial house of cards had been erected. A short trip to a “regular” American neighbourhood – like the one undertaken by Mark Baum in The Big Short – would have presented an endless list of properties under foreclosure, real estate agents openly bragging about the laxity of credit requirements, and exotic dancers with multiple mortgage-financed properties.
Such evidence would have been sufficient to convince most people of the existence of a bubble. However, in London, New York and the other financial centres of the world, an entire class of experts kept blatantly ignoring the facts, anecdotal evidence, and common sense that could have anticipated what was about to happen.
This is a high profile example of a more general situation in which a narrative establishes itself and resists being disproven, even when it is clearly contradicted by information right under our noses. Like the crowd in Hans Christian Andersen’s famous parable, we watch our sovereign parading naked in the street, but are unable to see through his invisible clothes. Until a young boy steps forward and with a little common sense lifts the veil on our “common talk”.
Falling for conformity
Common talk is the unreflective parroting of smart-sounding theories, stories, and arguments without applying any test, even the most basic one, to verify their validity. It’s not that common talk is necessarily false – some of what people repeat mindlessly happens to be true. It’s just that veracity is of secondary or tertiary importance. In this respect, it is a lot like Harry Frankfurt’s concept of bullshit: a lack of concern for the truth.
But why do we fall for and perpetuate common talk?
Sometimes it is a deliberate choice. This happens when we perpetuate narratives out of direct personal interests – like the bankers levered up in the real estate market – or in an effort to please those in a position of authority. More often, though, our motives are less conscious.
During the 1950s, Solomon Asch demonstrated in a series of “conformity experiments” how easily social pressure can cause people to second-guess their judgments, even on a question as basic as the length of a line. This stems in part from a fear of ridicule or humiliation. Faced with a dominant opinion, it is easy to doubt ourselves and question whether we are qualified to contradict so many other people – especially if they are acknowledged “experts” in complex domains like finance.
In other situations, what draws us towards common talk is the desire to maintain consensus. When an idea or theory starts spreading, there is strong inertia to stick with it. This is a tendency that runs deep in our genes. As Roy Baumeister has recently argued, humans have “an innate propensity to value consensus above accuracy.” Although groups have a strong incentive to seek accurate information about a given topic, Baumeister concludes that other criteria are more powerful:
…groups value consensus and shared reality, and so members are often reluctant to bring up information that goes against the emerging consensus. Although critique and argument would best serve the group’s epistemic goals, the goal of harmony tends to suppress those processes.
As social animals, we need a collective worldview within which to operate. Common talk is one of the main ways we construct that worldview.
Common talk is fragile
There is a tendency today to associate fake news and disinformation only with the uneducated, but this is a convenient blind spot. As the financial crisis of 2008 shows, people who count as “very smart” by any acknowledged external measure – from IQ to educational and professional achievements – are far from immune to common talk. Peter Thiel goes even further, arguing that “smart people” are more likely than average to pick up on trendy and fashionable thinking and get trapped by it.
A possible explanation lies in the content of common talk. People pick up on ideas they are, in some sense, already looking for. They are, in other words, receptive to a particular type of message. Common talk is particularly tempting for those who have an affinity for explanations, for overarching stories, for big-picture thinking. These seemingly coherent narratives have something in common: they all focus on the macro.
If macro thinking makes common talk attractive, it also makes it prone to end in big disappointments. With the barriers to bullshit lowered, common talk can drag us towards the false belief that we can rationalise the complexity of the world we live in, and inhibit our ability to erect defences against collective illusions. For these reasons, common talk, rationalisations, and narratives are failure prone. To borrow Nassim Taleb’s term, they are “fragile”.
Taleb also offers us a way out of this trap, encapsulated in a single quote: “it is easier to macrobullshit than to microbullshit”. If we want to stay away from the temptation of the macro, we need to turn our attention to the micro.
How to see through invisible clothes
Common sense sits at the opposite end from common talk along the macro/micro divide. Its focus is tangible and practical. Observations and experiences as opposed to rationalisations. Common sense is inherently micro.
In the context of the 2008 real estate bubble, common sense is the “layman’s” realisation that an increasingly large number of people cannot afford their mortgages, and the ensuing conclusion that they will default on their loans. Its value doesn’t lie in the ability to offer comprehensive explanations, but rather in its empirical validity. Traditional common-sense knowledge – simple heuristics, grandmotherly advice – is the ossified product of observation. These rules have endured through time not because they are attractive but because they work.
We can now consider the optimal approach to navigate situations where common talk appears in contradiction with common sense. Because of its empirical and practical nature, I argue that common sense should be considered “default-right” while common talk should be considered “default-wrong”. In other words, faced with a dilemma, the burden of proof lies with the statement that contradicts common sense.
Thinking in terms of burden of proof can help us come up with a set of simple rules of thumb to guide our day-to-day decision making and communication. I’ll suggest three here, but there are surely more, and I’d be happy to hear them:
When a proposition contradicts common sense, we should assume it to be wrong.
Jared Diamond made this point recently when naming “common sense” as his choice for a scientific concept that “ought to be more widely known.” As an example, he cites a recent debate among archaeologists regarding the discovery of pre-Clovis settlements in Southern U.S. states and Latin America. Common sense alone would be enough to dismiss such claims. There are hundreds of Clovis settlements south of the Canada/U.S. border, but no sign of human presence before that epoch has yet been discovered. If pre-Clovis populations had indeed passed through the continent, there would be plenty of evidence by now, and it seems implausible that they were “airlifted” directly to the south. Yet a large number of scholars overlook this basic logic, blinded by the desire to claim a new discovery pre-dating those of their colleagues. Unsurprisingly, these claims usually turn out to be the result of measurement or sampling errors in radiocarbon dating.
The point is not to preemptively disregard any theory that contradicts common sense. Rather, it is a warning to avoid getting bogged down in details before the primary contradictions of a new theory are resolved. It also forces us to come up with a plausible explanation for why such a theory is not wrong.
If we cannot strip a statement of its jargon and rephrase it in our own words, we are likely perpetuating common talk.
One of the best formulations of this point comes from Timothy Snyder’s On Tyranny. In lesson nine – “Be kind to our language” – he writes:
Avoid pronouncing the phrases everyone else does. Think up your own way of speaking, even if only to convey that thing you think everyone is saying.
If we cannot explain a concept in plain terms, there is a high likelihood that we are either falling prey to consensus thinking or that we are simply indifferent to the validity of our statements. Writing and teaching are two great ways to avoid this trap. More often than not, they lead to an accurate realisation of our true level of understanding.
Naturally, it is not realistic to expect we can reach a teaching-level of knowledge in all possible subjects. There will be situations where we are forced to work from cached thoughts – thoughts that we pick from our memory or environment without critical processing. In these situations, it is worth realising that the burden of proof is on us. Everything we say without fully understanding it, or without a strong trust in the source, should be taken with a grain of salt.
Always test macro thoughts against micro examples.
From a general point of view, this is simply one of the essential characteristics of science. In Feynman’s words: “It doesn’t make a difference how beautiful your guess (theory) is, it doesn’t make a difference how smart you are… if it disagrees with experiment, it’s wrong”.
As a piece of methodological advice, a great example comes from this interview with Scott Aaronson. His approach to avoiding the risk of falling for “elegant” but flawed theories is to avoid looking for general frameworks too early in his investigation. He starts instead by looking for “easy special cases and simple sanity checks”, or things he can try out “using high-school algebra or maybe a five-line computer program”. Not only does this micro focus prevent wild goose chases, but it also primes him for a better understanding at the macro level:
I find that, after you’ve felt out the full space of obstructions and counterexamples… finding the proof techniques by which to convince everyone else is often a more-or-less routine exercise.
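The “five-line computer program” idea can be made concrete with a classic case (my example, not Aaronson’s): Fermat conjectured that every number of the form 2^(2^n) + 1 is prime. The first five cases look convincing – an elegant macro claim – but a quick micro check finds the counterexample Euler spotted at n = 5:

```python
def is_prime(k):
    """Trial division: true if k has no divisor between 2 and sqrt(k)."""
    if k < 2:
        return False
    return all(k % d for d in range(2, int(k ** 0.5) + 1))

# Fermat numbers 2^(2^n) + 1 for n = 0..5
fermat = [2 ** (2 ** n) + 1 for n in range(6)]
print([(f, is_prime(f)) for f in fermat])
# 3, 5, 17, 257, 65537 are prime; 4294967297 = 641 * 6700417 is not
```

A few seconds of computation refutes a conjecture that looked beautiful from the macro view – exactly the kind of cheap sanity check that keeps a narrative honest.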
This approach is as valid for science as it is for running a business. Execution (the micro) without vision (the macro) can feel like mindless, unexciting work. But, as Edison put it, “vision without execution is hallucination”.
When common sense fails
Seen like this, trusting our common sense would seem like straightforward, perhaps even bulletproof advice to follow. But common sense does fail us on occasion – and when it does, it fails us big time. The tricky part is distinguishing the situations where common sense works from those where it can lead us astray.
First, there is a practical aspect. Even when common sense is right, it can still result in an economic loss if used as a guide to investment decisions. Right or wrong, a common belief can push prices up in a growing spiral, fuelled by the self-fulfilling effect of bull markets. As every investor knows, “the market can remain irrational longer than you can remain solvent”.
Substantial structural changes – in economics, politics, or technology, for example – can also undermine the validity of our common sense. These are situations where previously reliable mental models stop working and new ones – new common sense – need to be created. We are currently facing one of these structural rearrangements as we move from an economy based on scarcity (industrial) to an economy based on abundance (information). Sensible industrial-age values like thriftiness, planning, and risk minimisation are now losing relevance and are being replaced by “new” ones like experimentation, learn-fast-fail-fast, and optionality.
Another significant shortcoming of common sense is that it has limited explanatory value. Common sense can be a valuable compass to guide our behaviour, it can help us spot and debunk flawed theories or common talk, but it does little to explain new phenomena or prove the validity of new propositions.
In this way, common sense plays a role akin to observation or experimentation within the scientific process. As argued by Hume, observation alone cannot produce scientific knowledge, or at least it is not sufficient for that purpose. Observing a thousand white swans does not prove that all swans are white. But a single black swan does refute the proposition. In this way, common sense is most useful as a falsification mechanism.
At the same time, science has the authority to revise our “common sense” just as it has the power to explain things beyond what we can directly observe. It was once fairly obvious that the sun revolved around the earth – but a long, arduous campaign of reason and observation convinced us otherwise. In other words, when common sense and science collide, either a hypothesis is proven wrong, or common sense needs to be updated.
At this point, we are left to answer a critical question. How can we decide when to overrule our common sense? What should we do in the many, almost daily, situations where it’s impossible to verify the validity of a statement? When can we trust common talk?
In this post, I have focused on situations where common talk should not be trusted. Put another way, I have tried to advocate for the adoption of a precautionary principle. But society rarely moves at the speed of precaution. Significant changes are initiated by people who find confidence in unproven convictions, and carried forward by people who disregard rationality to follow them.
I suspect the answer cannot be found in a positive theory of certainty, but in the acceptance that, as humans, our destiny is to live, and act, in doubt.
This post appeared first on Ribbonfarm