I like to envision science inching its gossamer tentacles into every nook and cranny of human endeavor—from literature to mountain climbing to medicine—but that inclination could be my own confirmation bias coming to the fore. In fact, had I been asked to name the field furthest from the vast reach of science, I might well have said economics—might have, that is, before this week. A study published this month in the incongruous-sounding Journal of Economic Literature suggests that our judgments and perspectives are shaped not only by the information available to us and by confirmation bias, but also by a related phenomenon: “active information avoidance.”
Where confirmation bias describes how we preferentially pursue data aligned with our beliefs, interpret tangential data in ways that favor those beliefs, and recall more poorly whatever might counter them, information avoidance describes how we also dodge data we suspect would challenge our bias. As Julie Beck of The Atlantic writes, “you can construct a pillow fort of the information that’s comfortable.” And this practice, known as “motivated reasoning,” is common across continents and cultures—intrinsic, that is, to the human experience (or so I (choose to?) believe).
Others who believe in motivated reasoning have taken it further, racking their brains over what benefits might stem from a seeming immunity to uncomfortable or unfamiliar truths. “Any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational,” writes The New Yorker’s Elizabeth Kolbert. “Still, an essential puzzle remains: How did we come to be this way?”
The most tempting tactic for getting ahead is consistently putting oneself first—an approach arguably perfected by parasites, but one that crops up in species across the board. “For any individual, freeloading is always the best course of action,” Kolbert points out. Logic, she writes, arose of necessity to “resolve the problems posed by living in collaborative groups.” Communication, in such an environment, becomes a valuable commodity: those who have and provide information come out on top. There’s a catch, though: the accuracy of the information, according to Beck, is secondary to whether others believe it. Accordingly, if you can be a convincing liar, you gain the benefit of deference without the cost of fact-checking. In this envisioned social environment, researchers suggest, motivated reasoning served a protective function against guilelessly believing everyone with a knack for spinning persuasive tales. Consequently, they propose, we developed an internal, unconscious gauge for filtering out the tales that didn’t align with previous experience—or beliefs.

An inadvertent effect of motivated reasoning, then, is that the pillow fort each of us constructs is built not necessarily on truth but on the particular group identity we’ve rallied around. Inaccurate suppositions, according to Beck, are symptomatic of group identity. These run the gamut from “chemophobia” among liberal-minded folks (typified by fear of BPA plastic in household products despite studies vouching for its safety at current concentrations) to climate change denial among those of a more conservative bent.
This ancient unintended consequence is exacerbated by our present information-saturated situation, according to Beck. “Society has advanced to the point that believing what’s true often means accepting things you don’t have any firsthand experience of,” she writes. “In areas where you lack expertise, you have to rely on trust.” This brings us back around to active information avoidance.
“Echo chamber” is a persistent theme in many a journalism class. Be aware, professors caution, that when you write for x, y, or z publication, you’re preaching to the choir—the audience already believes you. How do you reach the people who don’t? This question, which everyone fervently hopes is rhetorical each time it’s raised, has never been paired with a satisfactory answer (at least in my experience). Or—and here the concept of active information avoidance has me second-guessing myself—perhaps it has been, and I chose to ignore the answer because it didn’t align with my preferences or beliefs at the time.
What’s a journalist to do? What, moreover, is a science journalist to do? As Kolbert writes, “providing people with accurate information doesn’t seem to help; they simply discount it. Appealing to their emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science.” While I wouldn’t go so far as to suggest emotion inevitably precludes sound science, its sordid manifestation as sensationalism seems to have marred the field’s credibility. Beck suggests that people are more receptive to new data when it’s framed as enjoying support from a diverse group—when something is presented to us as grey, rather than black-and-white or us-versus-them, our bias takes a back seat.
Of course, all this speculation puts us in a curious spot (assuming, that is, we’re inclined to accept the information avoidance premise): wondering what our relationship with the truth really is. This research left me seeing the very concept of objectivity as standing on the brink of extinction. But perhaps that’s my bias showing.