On Information Presented by Authorities

It's hard to distinguish between facts and matters of opinion while growing up

I.

From Rob Bensinger:

There’s nothing intrinsically wrong with expressing your support for something you care about—like a group you identify with, or a spiritual experience you find exalting. When we conflate cheers with factual beliefs, however, those misunderstood cheers can help shield an entire ideology from contamination by the evidence.

At the beginning of our lives, when we are children, we develop the mentalese to make what is at first rudimentary sense of our surroundings, and those surroundings are our entire world.

Parents are the first authorities. We trust them automatically, and they begin by teaching us about simple objects, language, and the existence of concepts like politeness, fairness, and sharing. This sort of basic factual knowledge, instilled by adults, helps form the basis of social life.

Then we go to school: starting with primary school teachers, other authorities emerge to communicate information about the world. A certain dose of this is quite helpful, especially for accumulating common academic knowledge like algebra or the general structure of the government.

At some point we become aware of topics with no clear answers. Economics and politics are like this. But to a young mind, the lines between these contentious topics and common knowledge are blurry; it’s easy to mistake opinions for facts when the same authorities present both and survival has hitherto depended on taking such information as given. My parents are outspoken Democrats, for example, so until I was about fifteen I took it as given that the left was obviously right about everything and Republicans were silly for opposing them.[1] In grade nine, I had a teacher who presented liberal talking points as given in class, mainly related to Obama’s then-imminent healthcare reform bill. I was left thinking that the half of the class raised by Democrats agreed with every word and the Republican-raised half believed the opposite, all without knowing anything important about the topic.

Everyone in the class who had that experience was captured by ideology: they were convinced they knew the factual truth about what were actually contested claims or matters of opinion. This likely happened innocuously and imperceptibly[2] as a consequence of our early authorities’ desire to communicate increasingly detailed information about the world. To the young student, outside the very limited scope of direct experience, there is only one category: information presented by authorities.

The problem is that topics prone to ideology, like politics and economics, are so complicated that any individual view of them is likely to be biased by the observer’s (highly variable) direct experience, so it’s a mistake to apprehend them the same way as, for instance, a geometric proof. In my experience, teachers often used the term “critical thinking” to refer solely to this act of discriminating between matters of opinion and fact (an act that makes about as much sense as getting out of the car when almost all of your life has consisted of riding in it, receiving information presented by authorities and not questioning it too much). Inevitably, though, even the good critical thinker accidentally learns some opinions-as-facts.

My teachers mostly did a good job of avoiding ideology in class, with a few glaring exceptions. Perhaps the sharp contrast helped me see the distinction, but it seems that many people remain captured by ideology to varying degrees. This can arise from either too much or too little trust in our early authorities: too much, and you go down the path I describe above; too little, and you’ll overgeneralize from your own experience and fail to learn from others. Poignantly, early authorities could teach their kids heuristics based on proper reasoning or well-researched conclusions, present them in context, and a child might still parse them as ideology and be captured anyway. Some of the best teachers I had would preface anything contentious with “this is just my opinion, don’t take it too seriously; if you’re interested, you should read book X about this later.” But sometimes kids are impressionable enough that even that doesn’t work.

So, as a conscious being, you have direct experience of the world, and that experience is usually pretty informative about whatever parts of it you regularly inhabit. No talking head or writer online knows more about your particular neighborhood block than you do, nor about daily life in your school or workplace. But direct experience doesn’t necessarily translate into general principles, even among people who live or work in the same places, and to what extent you should trust your own judgment is a difficult question.

I believe the proper response to this is to be a rationalist: do the ongoing work of understanding your own cognitive biases and epistemological tendencies, and take care to present your opinions with the degree of confidence you actually feel. I see no better answer.

II.

To roughly rank these commonly encountered types of knowledge in descending order of certainty: first comes direct experience, then common academic knowledge, then perhaps knowledge gained from one’s particular areas of expertise. Next come heuristics, principles, and the like, either generalized from direct experience or credibly acquired (e.g., read in that one really important Wikipedia article). Finally, at the least certain end of the spectrum, we have beliefs absorbed from ambient sources or authorities, and hypotheses far from direct experience.[3]

Ideologies can attach to these weaker types of knowledge and convince a subject that they actually are (or should be) matters of common academic knowledge or similar. I’m reminded of people who are absolutely convinced beyond doubt that God exists and think I’m silly for not agreeing with them. This is a categorization problem: in my model, the believer-beyond-doubt has either filed their religious belief under “common academic knowledge” or mischaracterized their direct experience as definitively supporting their claims. The believer-beyond-doubt has shielded their belief from contamination by the evidence. To be faithful is fine; to be certain is not.

Most examples are much more nuanced, of course, and placing every idea in the right bucket is unrealistic. To apprehend an idea and categorize it properly is an ideal pursued by every rationalist.

Ideologies are bad because they obstruct a proper view of the facts of an issue. Suppose you are the victim of an Absolute Certainty that QAnon is true. In that case, not only is there no chance that you’ll understand US politics accurately, but there is a standing reason why you can’t in the future either, until you move on from the false premise of believing in QAnon. This is bad for the individual to the extent that they want or need to know anything about US politics, and bad for society to the extent that it causes inaccurate beliefs to propagate.

Again, QAnon is an extreme example because it’s obviously not right about anything; a peskier example would be something like “the Democratic Party is right about everything” or “expanding the social safety net is always a good idea.” Reasoning from these premises may often lead to agreeable conclusions, but not soundly: to reason this way is to begin with an excessively explanatory premise. A very powerful premise leads to a very powerful conclusion even if the argument itself is weak. (And you’d often beg the question or simply be wrong anyway, of course.)

Best is to begin exploring a new topic with humble uncertainty and to gradually apply Bayesian updating; if that eventually leads to “the Democratic Party is generally right,” so be it, but this way you can never arrive at “the Democratic Party is right about everything.”
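To make the mechanics concrete, here is a minimal sketch of Bayesian updating in Python. The scenario, the update function, and all the numbers are illustrative assumptions of mine, not something from the text above; the point is only that repeated updates on favorable evidence push a credence toward 1 without ever reaching it.

```python
# Minimal sketch of Bayesian updating (illustrative numbers only).
# Claim under consideration: "the Democratic Party is right on a given issue."

def update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Apply Bayes' rule to get P(claim | evidence)."""
    numerator = prior * p_evidence_if_true
    denominator = numerator + (1 - prior) * p_evidence_if_false
    return numerator / denominator

belief = 0.5  # humble uncertainty to start
for issue in range(10):
    # Suppose each round's evidence is twice as likely if the claim is true.
    belief = update(belief, p_evidence_if_true=0.8, p_evidence_if_false=0.4)
    print(f"after issue {issue + 1}: {belief:.4f}")

# The final belief is about 0.999: "generally right," held with high confidence,
# but no finite amount of such evidence ever yields probability 1.0.
```

In odds form, each such update merely doubles the odds in favor, so reaching certainty would require infinitely strong evidence; that is the formal reason “right about everything” is unreachable by honest updating.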

Beliefs are things in constant need of calibration.

This topic relates closely to whether and when it is justifiable to trust your own judgment rather than that of society. For further reading on this matter, consider Inadequate Equilibria.

[1] Notably, this was not something directly communicated by my parents at any point in time; instead, it was a conclusion I came to without particularly noticing.

[2] I, for one, didn’t reflect clearly on that experience in class until a year later.

[3] Here, common academic knowledge can certainly come from authorities, but it’s so universally agreed upon that I categorize it separately. I’m thinking of things like mathematics, probability, and statistics: deriving these from scratch is clearly unnecessary, but they ought not to be lumped in with other arguments from authority.