How Confusion Spreads and Becomes Engrained
In this article I want to discuss the modes by which confusion spreads and manifests itself in today’s society. First, I want to define what I mean by confusion.
Confusion is when a logically wrong conclusion or viewpoint becomes accepted by the vast majority, including experts. This flawed conclusion then goes unnoticed for a long time and engrains itself into the social fabric.
If you wonder how often this happens in a societal context, I have to disappoint you: it happens frequently, and it often remains unnoticed for decades.
As an investor in public markets, I follow the markets daily and notice how frequently the vast majority of investors come to a wrong conclusion. But within an economic system, facts and financial results eventually clear this confusion in the market. Sometimes the confusion is missed for a while and takes longer to unearth, which leads to bubble-like or anti-bubble-like phenomena. But eventually these burst and everyone realizes the collective misunderstanding.
In physical systems confusion also arises frequently, but it is easier to spot early on. The flaws can be debunked with a controlled experiment that convinces the majority. Sometimes the confusion is missed early on, and the truth comes out through a physical accident or small disaster — for example, a rocket explosion or aircraft crash can unearth a confusion in a physical context.
Naturally, in a societal setting these confusions arise as frequently as in the other two settings, but they are usually much harder to unearth. This is primarily because there is no clear validation or feedback mechanism for societal confusions. As a result, these confusions can spread, manifest, and last for decades before they are ever noticed or challenged.
With that in mind, I want to discuss the most common modes that lead to these confusions in societal settings, and how to spot and avoid them.
1. Only accessing secondary sources of information
Most people rely only on secondary sources of information (media channels) and rarely access primary sources (listening to a live statement, reading the original statement, examining a research finding, running a personal experiment).
Secondary sources are really bad!
As an investor, I spend a lot of time listening to CEOs, politicians, investors, economists, and public intellectuals speak in live settings. What I usually notice is that by the time this information reaches a secondary source (any media channel), it is heavily distorted. The information is typically presented incompletely, out of context, simplified, generalized, and distorted in other ways. This behavior is rooted in the economic nature of mass media. I discuss this distorting dynamic in another article about mass media.
Without access to primary sources, the public cannot participate in the primary discussion, only in distorted and meaningless side discussions.
Mitigation: Try to access all information as close as possible to the primary source (where it was created). If you can, avoid secondary sources altogether.
2. Inability to differentiate facts from arbitrary content
Since most people don’t have deep expertise in the field under discussion, they lack good means for differentiating facts from arbitrary content and commentary. In addition, the concept of a fact has become vague in the information age. Distorted information has proliferated on the Internet, and anything is now presented as a fact.
On top of that, all of us are prone to confirmation bias. We easily accept information that confirms our views and rarely spend the time and rigor to validate the facts.
Mitigation: Use multiple, ideally opposing, sources to validate the facts presented. Also be aware of confirmation bias, and demand even more rigorous validation for facts that confirm your viewpoints.
3. Inability to catch logical inconsistency
Our educational institutions teach us a lot of material today, but they put little effort into teaching the skills for testing logical consistency. With the immense amount of information available on the Internet today, logical-consistency analysis is one of the most important skills we are neglecting to teach.
Because people never learn how to catch logical inconsistencies, they cannot tell whether a statement is logically consistent and makes sense.
Similarly, since logical-consistency skills can also help one identify information distortions and “false facts,” lacking these skills reinforces the two modes described above.
We often internalize and start spreading statements and arguments that are logically inconsistent. If one pays attention to any social media channel, this dynamic is the rule rather than the exception.
Mitigation: Make an effort to validate the logical consistency of any statement or conclusion. Learn about the techniques presenters use to hide logical inconsistency. Also be aware of confirmation bias here.
4. Generalization as knowledge
Most knowledge is created through generalization. We observe a limited number of instances of a phenomenon and capture patterns and connections. We then generalize these into knowledge about other, similar phenomena.
Often we overgeneralize and, after a while, forget the exceptions that do not conform to the rule. Sometimes the newly created knowledge becomes so popular that we even generalize it to instances that are not related to the initial phenomenon at all.
In knowledge creation we also often fail to account for the fact that the world is dynamic and patterns can change over time. The social world is especially dynamic, so very few patterns are permanent.
Many things that we consider knowledge, and that we use to guide our decisions, may not apply to a large number of instances.
Mitigation: To become immune to over-generalization, it helps to look at every phenomenon as a unique instance in its own right. In reality, every single occurrence in the world is unique; some are just connected by overarching similarities.
5. Echo Chamber effect
We are naturally wired to like people who agree with us and dislike people who disagree with us. Over time this leads to an echo chamber. We surround ourselves with like-minded people who validate our viewpoints, and consequently we don’t take the time to understand opposing views.
Nietzsche once said: “The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently.”
I constantly try to remind myself of this quote.
Mitigation: Surround yourself with people who will disagree with you and are not just yes-men. Spend time listening to people or sources that disagree with you. Try to imagine other perspectives.
6. Imitation as primary mode of opinion building
A fundamental aspect of human nature is that we learn by imitation. Our behavior is an imitation of the environment we were born into, and most of our opinions are imitations of the opinions we were exposed to from a young age. People rarely build their own opinions. Sometimes we think we are building our own opinion, when we are really just choosing between two options presented to us.
Oftentimes we need to take a position on issues where we lack expertise. We tend to gravitate toward the opinion of an authority, a role model, or the crowd. Avoid this bias. Don’t copy, think!
I like to say: “No opinion is better than an uninformed opinion”.
Mitigation: Become aware of how your opinions are formed and who is influencing them. Try to assess whether they are really independent. Avoid developing uninformed opinions.
7. Excessive brand reverence
Often a person who is part of an institution with a recognized brand can make a completely wrong statement. That person can even have a completely confused understanding of the topic being discussed. Yet because the person is hiding behind the brand of an institution, the statement will be readily accepted rather than rigorously analyzed.
We are wired to trust the intellectually authoritative brands that dominate our society. Institutional names like NASA, CIA, Harvard, Stanford, Goldman Sachs, McKinsey, and Google can cloud our judgment. When assessing information, we need to be aware of this bias. The source itself should not affect our judgment of the information’s validity.
I often notice Stanford professors, Harvard economists, and Goldman Sachs managers making completely wrong and meaningless statements on financial TV (in an economic system this confusion is easier to validate later). These people are not infallible. If anything, because of the echo chamber effect, these institutions can sometimes be even more fallible.
In the stock market, when an authority makes a wrong statement, many start following that viewpoint. It then spreads and creates the type of confusion I discussed in the opening paragraphs.
Our society has excessive brand reverence.
Our advertisements and sales pitches show this with statements like: “A mattress developed by former NASA scientists” or “a method developed by MIT engineers”.
In my professional career I have interacted with many NASA scientists, U.S. intelligence agents, Stanford professors, Goldman Sachs managers, Google engineers, and Caltech researchers. Among them there are many great minds, but also several mediocre minds who have found ways to hide behind these big brands.
Remember that our number one authority is our own mind in combination with correct reasoning and facts.
Mitigation: Try to develop immunity to institutional brand reverence. Don’t let these names cloud your judgment. Don’t fall for this bias.
8. Excessive trust in profound initial impression
This phenomenon is somewhat similar to authority reverence, but it is a slightly different dynamic. Therefore I like to consider it separately.
Excessive trust in a profound initial impression occurs frequently when the source is not a mainstream authority but a fringe one, without the same mainstream brand reverence.
Such a person starts by pointing out aspects that seem profound and true, yet are novel and rarely discussed in the mainstream. People start perceiving this person as different, as someone speaking the unpopular truth. After a while, the vast majority will not notice if that person starts making wrong or illogical statements. This phenomenon can be observed frequently, especially in sociological settings.
Mitigation: Keep in mind that just because a person makes several statements that are profoundly true and enlightening does not make him or her incapable of error. Once again, the validity of a statement should not be affected by its source.
9. Comfort Bias
Comfort bias describes the phenomenon whereby any information that makes humans feel better about themselves is more likely to be accepted and spread without rigorous analysis.
Many writers exploit comfort bias to sell books, including misinformation and inaccurate findings that make readers feel better about themselves. Over time this increases the amount of misinformation in circulation.
10. Crowd dynamics
Crowd behavior is a well-known phenomenon that many of us are susceptible to. It is much easier to go with the crowd than against the crowd. I notice this dynamic every day in the stock market. Going against the crowd requires courage and conviction. Going with the crowd does not require anything. When the size of the crowd backing a confused viewpoint becomes large enough, it is often very difficult to stop the phenomenon from growing.
Mitigation: Be aware that the crowd is frequently wrong. Observe the crowd dynamics before you build your opinion. Try to understand which feedback loop is creating that specific viewpoint in the crowd.
11. Static “truths”
These ten phenomena often interact to create societal confusion. Most likely there are many instances of confusion that none of us is aware of. History will be our judge and will hopefully uncover some of them.
When a wrong conclusion becomes engrained in a society, after a while it becomes a static truth. We start viewing it as absolute. Anyone who opposes a long and widely held position faces vehement resistance from the crowd and can sometimes be ostracized. This has been the case throughout the ages.
As Goethe once said:
“The few who knew what might be learned,
Foolish enough to put their whole heart on show,
And reveal their feelings to the crowd below,
Mankind has always crucified and burned.”
Be aware that many, many views we consider absolutely true today could end up being absolutely wrong!
Many of our systems are based on majority opinion and “economic-feedback conformity,” and both of these decision-making modes can often be wrong.
Mitigation: Next time someone opposes a widely held truth, remain open-minded until you have understood their reasoning. Try to remain neutral while validating or discounting the position with logic and facts.
It is difficult to gauge exactly how much societal confusion is out there, but my best guess is that there is a lot. My experience in the stock market shows how often and how easily the vast majority of market participants can become confused.
So if we do the simple math and consider that over the centuries we have created many “absolute truths” in societal contexts, and that societal systems have limited validation mechanisms, it is easy to imagine that the actual amount of societal confusion is fairly large.
On a positive note, society has lived with this condition for centuries and will carry on with it, so it’s not that bad. Just be aware that these confusions exist everywhere.
Since I am dyslexic, I am prone to spelling and grammar mistakes. Hopefully they do not distract from the substance of the article.
Thank you for reading this article.