I am going to start with an unfashionable idea: there is such a thing as truth, and over time, people who are willing to put in the time, effort, and hard work can get closer to it. I don’t care if anyone agrees with me, because if I am right, then the practical consequences of seeking truth (or not) will happen sooner or later regardless. I’m much more interested in why this idea has become unfashionable.

I read a news source and share it. Independent fact-checkers report that the information is false. I get irritated: who are these “fact-checkers” anyway, and why do they think they know better than me? I hear scientists on TV telling me that things I’ve just known for years aren’t true, or changing their minds about something I thought was a sure thing. How do these scientists know better than me? They keep changing their minds; they don’t seem to know anything!

So, I go out looking for someone who tells me what I knew in my heart all along: those damned fact-checkers and scientists just don’t get it. I am reacting to authorities who tell me I am wrong by finding authorities who tell me I’m right, and in an Internet age, someone who tells you what you want to hear (perhaps while asking for your credit card number) is never hard to find. When reaction is rewarded and opinion held sacred, pursuit of truth easily becomes confused with obeying authority, and people become convinced they’re truth seekers when they’re really just doing what they’re told. If you ever get closer to truth by obeying authority, then you do so only by accident, because obeying authority and seeking truth are two completely different things.

Appealing to authority is a kind of special pleading: when “because I/they said so” is the answer, it is a failure of the pursuit of truth, intellectual laziness if not intellectual cowardice. Things aren’t true because an expert said them, “because they said so.” Things also aren’t false because an idiot believes them—“stand apart from the herd and don’t listen to them” is just as much of an appeal to authority as “because I/they said so.” There are no universal experts or universal idiots—we’re all knowledgeable in a few ways and ignorant in an infinite number of others.

Saying there are no experts is just as much an appeal to authority as putting blind faith in experts, because if “no one knows anything,” then you still need information to make decisions, that information still comes from somewhere, and you once again end up obeying authorities who tell you what you want to hear. If you want to get closer to the truth, then you must give more weight to what people who’ve done the work have to say. If I want to know about infectious disease, I don’t ask my auto mechanic, any more than I call the local university medical center to get my fuel injectors replaced.

This isn’t an appeal to blind faith in experts—that also takes us back to authority, not truth. An expert in a field is likely closer to the truth than someone who isn’t an expert in that field not “because they said so” but because they’ve done the work. But individual experts can also be swayed by their own blind spots, prejudices, pride, and other human failings. Focusing on authority sabotages the search for truth by making it about the people and not the process.

Pursuing truth is always a social process that is more than the sum of its parts. Even the solitary madman poring over books alone in a library is listening to collections of human voices, inviting them in, reflecting on them, taking them seriously. Individual experts know what they know because of years, and often decades, of experiment, of watching, listening, learning, and getting it wrong. To really get closer to the truth, they needed to know what all the people well-versed in a field said, not just one or a few, and why they said it. They’re not all going to agree, but there are going to be some general conclusions, those are going to be closer to the truth, and there will be new questions to answer. That’s what new generations of experts in a field do, and the process leads closer to truth in baby steps, becoming a little less wrong through a lot of hard work.

If that sounds hard and time-consuming, it is. That fact-check isn’t an authority figure shutting you down; it’s an opportunity to learn more. Put aside the pride, read what’s in the fact-check, click on the links, follow the links to new links, try to find your way toward many somethings written by many people who have done the work, put in the years or decades trying to make sense of it. Is there some broad or general conclusion that most of their work points towards? Same with that scientist on TV—if they make claims that sound weird, are those claims supported by other scientists’ work in the same field? “Scientist” doesn’t mean “knowledgeable about all things that get called science.” After a decade of diligent study, I can reasonably claim only a few areas of expertise out of the sixty-ish sub-fields in sociology. If I were to speak outside my specific expertise in a public forum, my colleagues would rightly suspect ulterior motives—money, pride, power-seeking, or defense of some pet ideology.

The question that matters right now is whether we’re going to put in the work to learn from one another and get closer to the truth, or fall back into that comfortable old habit of finding someone to tell us what to do. The problem, I fear, is that it’s becoming harder and harder for most of us to tell the difference.

Image Credit: Lemmings in Migration, Popular Science Monthly, Volume 11, 1877 {{PD-US}}

https://commons.wikimedia.org/wiki/File:PSM_V11_D400_Lemmings_in_migration.jpg
