How the truth monopoly was broken up
We are suffering through a pandemic of lies — or so we hear from leading voices in media, politics, and academia. Our culture is infected by a disease that has many names: fake news, post-truth, misinformation, disinformation, mal-information, anti-science. The affliction, we are told, is a perversion of the proper role of knowledge in a healthy information society.
What is to be done? To restore truth, we need strategies to “get the facts straight.” For example, we need better “science communication,” “independent fact-checking,” and a relentless commitment to exposing and countering falsehoods. This is why the Washington Post fastidiously counted 30,573 “false or misleading claims” by President Trump during his four years in office. Facebook, meanwhile, partners with eighty organizations worldwide to help it flag falsehoods and inform users of the facts. And some disinformation experts recently suggested in the New York Times that the Biden administration should appoint a “reality czar,” a central authority tasked with countering conspiracy theories about Covid and election fraud, who “could become the tip of the spear for the federal government’s response to the reality crisis.”
Such efforts reflect the view that untruth is a plague on our information society, one that can and must be cured. If we pay enough responsible, objective attention to distinguishing what is true from what is not, and thus excise misinformation from the body politic, people can be kept safe from falsehood. Put another way, it is an implicitly Edenic belief in the original purity of the information society, a state we have lapsed from but can yet return to, by the grace of fact-checkers.
We beg to differ. Fake news is not a perversion of the information society but a logical outgrowth of it, a symptom of the decades-long devolution of the traditional authority for governing knowledge and communicating information. That authority has long been held by a small number of institutions. When that kind of monopoly is no longer possible, truth itself must become contested.
This is treacherous terrain. The urge to insist on the integrity of the old order is widespread: Truth is truth, lies are lies, and established authorities must see to it that nobody blurs the two. But we also know from history that what seemed to be stable regimes of truth may collapse, and be replaced. If that is what is happening now, then the challenge is to manage the transition, not to cling to the old order as it dissolves around us.
Truth, New and Improved
Widespread challenges to the control of information by mainstream social institutions developed in three phases.
First, new technologies of mass communication in the twentieth century — radio, television, and significant improvements in printing, further empowered by new social science methods — enabled the rise of mass-market advertising, which quickly became an essential tool for success in the marketplace. Philosophers like Max Horkheimer and Theodor Adorno were bewildered by a world where, thanks to these new forms of communication, unabashed lies in the interest of selling products could become not just an art but an industry.
The rise of mass marketing created the cultural substrate for the so-called post-truth world we live in now. It normalized the application of hyperbole, superlatives, and untestable claims of superiority to the rhetoric of everyday commerce. What started out as merely a way to sell new and improved soap powder and automobiles amounts today to a rhetorical infrastructure of hype that infects every corner of culture: the way people promote their careers, universities their reputations, governments their programs, and scientists the importance of their latest findings. Whether we’re listening to a food corporation claim that its oatmeal will keep your heart healthy or a university press office herald a new study that will upend everything we know, radical skepticism would seem to be the rational stance for information consumers.
In a second, partly overlapping phase in the twentieth century, science underwent a massive expansion of its role into the domain of public affairs, and thus into highly contestable subject matters. Spurred by a wealth of new instruments for measuring the world and techniques for analyzing the resulting data, policies on agriculture, health, education, poverty, national security, the environment, and much more became subject to new types of scientific investigation. As never before, science became part of the language of policymaking, and scientists became advocates for particular policies.
The dissolving boundary between science and politics was on full display by 1958, when the chemist Linus Pauling and physicist Edward Teller debated the risks of nuclear weapons testing on a U.S. television broadcast, a spectacle that mixed scientific claims about fallout risks with theories of international affairs and assertions of personal moral conviction. The debate presaged a radical transformation of science and its social role. Where science was once a rarefied, elite practice largely isolated from society, scientific experts were now mobilized in increasing numbers to form and inform politics and policymaking. Of course, society had long been shaped, sometimes profoundly, by scientific advances. But in the second half of the twentieth century, science programs started to take on a rapidly expanding portfolio of politically divisive issues: determining the cancer-causing potential of food additives, pesticides, and tobacco; devising strategies for the U.S. government in its nuclear arms race against the Soviet Union; informing guidelines for diet, nutrition, and education; predicting future energy supplies, food supplies, and population growth; designing urban renewal programs; choosing nuclear waste disposal sites; and on and on.
Philosopher-mathematicians Silvio Funtowicz and Jerome Ravetz recognized in 1993 that a new kind of science was emerging, which they termed “post-normal science.” This kind of science was inherently contestable, both because it dealt with the irreducible uncertainties of complex and messy problems at the intersection of nature and society, and because it was being used for making decisions that were themselves value-laden and contested. Questions that may sound straightforward, such as “Should women in their forties get regular mammograms?” or “Will genetically modified crops and livestock make food more affordable?” or “Do the benefits of decarbonizing our energy production outweigh the costs?” became the focus of intractable and never-ending scientific and political disputes.
This situation remained reasonably manageable through the 1990s, because science communication was still largely controlled by powerful institutions: governments, corporations, and universities. Even if these institutions were sometimes fiercely at odds, all had a shared interest in maintaining the idea of a unitary science that provided universal truths upon which rational action should be based. Debates between experts may have raged — often without end — but one could still defend the claim that the search for truth was a coherent activity carried out by special experts working in pertinent social institutions, and that the truths emerging from their work would be recognizable and agreed-upon when finally they were determined. Few questioned the fundamental notion that science was necessary and authoritative for determining good policy choices across a wide array of social concerns. The imperative remained to find facts that could inform action — a basic tenet of Enlightenment rationality.
The rise of the Internet and social media marks the third phase of the story, and it has now rendered thoroughly implausible any institutional monopoly on factual claims. As we are continuing to see with Covid, the public has instantly available to it a nearly inexhaustible supply of competing and contradictory claims, made by credentialed experts associated with august institutions, about everything from mask efficacy to appropriate social distancing and school closure policies. And many of the targeted consumers of these claims are already conditioned to be highly skeptical of the information they receive from mainstream media.
Today’s information environment certainly invites mischievous seeding of known lies into public discourse. But bad actors are not the most important part of the story. Institutions can no longer maintain their old stance of authoritative certainty about information — the stance they need to justify their actions, or to establish a convincing dividing line between true news and fake news. Claims of disinterest by experts acting on behalf of these institutions are no longer plausible. People are free to decide which information, and which experts, they want to believe. The Covid lab-leak hypothesis was dismissed as fake news until that dismissal itself came to look fake. Fact-checking organizations are themselves now subject to accusations of bias: Recently, Facebook flagged as “false” a story in the esteemed British Medical Journal about a shoddy Covid vaccine trial, and the editors of the journal in turn called Facebook’s fact-checking “inaccurate, incompetent and irresponsible.”
No political system exists without its share of lies, obfuscation, and fake news, as Plato and Machiavelli taught. Yet even those thinkers would be puzzled by the immense power of modern technologies to generate stories. Ideas have become a battlefield, and we are all getting lost in the fog of the truth wars. When anything can seem plausible to someone, the term “fake news” loses its meaning.
The celebrated expedient whereby an aristocracy claims the right, and the mission, to tell “noble lies” to the citizens for their own good thus looks increasingly impotent. In October 2020, U.S. National Institutes of Health director Francis Collins, a veritable aristocrat of the scientific establishment, sought to delegitimize the recently released Great Barrington Declaration. Crafted by a group he referred to as “fringe epidemiologists” (they were from Harvard, Stanford, and Oxford), the declaration questioned the mainstream lockdown approach to the pandemic, including school and business closures. “There needs to be a quick and devastating published take down,” Collins wrote in an email to fellow aristocrat Anthony Fauci.
But we now live in a moment where suppressing that kind of dissent has become impossible. By May 2021, that “fringe” became part of a new think tank, the Brownstone Institute, founded in reaction to what they describe as “the global crisis created by policy responses to the Covid-19 pandemic.” From this perspective, policies advanced by Collins and Fauci amounted to “a failed experiment in full social and economic control” reflecting “a willingness on the part of the public and officials to relinquish freedom and fundamental human rights in the name of managing a public health crisis.” The Brownstone Institute’s website is a veritable one-stop Internet shopping haven for anyone looking for well-credentialed expert opinions that counter more mainstream expert opinions on Covid.
Similarly, claims that the science around climate change is “settled,” and that therefore the world must collectively work to decarbonize the global energy system by 2050, have engendered a counter-industry of dissenting experts, organizations, and websites.
At this point, one might be forgiven for speculating that the public is being fed such a heavy diet of Covid and climate change precisely because these are problems that have been framed politically as amenable to a scientific treatment. But it seems that the more the authorities insist on the factiness of facts, the more suspect these become to larger and larger portions of the populace.
A Scientific Reformation
The introduction of the printing press in the mid-fifteenth century triggered a revolution in which the Church lost its monopoly on truth. Millions of books were printed in just a few decades after Gutenberg’s innovation. Some people held the printing press responsible for stoking collective economic manias and speculative bubbles. It allowed the widespread distribution of astrological almanacs in Europe, which fed popular hysteria around prophecies of impending doom. And it allowed dissemination of the Malleus Maleficarum, an influential treatise on demonology that contributed to rising persecution of witches.
Though the printing press allowed sanctioned ideas to spread like never before, it also allowed the spread of serious but hitherto suppressed ideas that threatened the legitimacy of the Church. A range of alternative philosophical, moral, and ideological perspectives on Christianity became newly accessible to ever-growing audiences. So did exposés of institutional corruption, such as the practice of indulgences — a market for buying one’s way out of purgatory that earned the Church vast amounts of money. Martin Luther, in particular, understood and exploited the power of the printing press in pursuing his attacks on the Church — one recent historical account, Andrew Pettegree’s book Brand Luther, portrays him as the first mass-market communicator.
To a religious observer living through the beginning of the Reformation, the proliferation of printed material must have appeared unsettling and dangerous: the end of an era, and the beginning of a threatening period of heterodoxy, heresies, and confusion. A person exposed to the rapid, unchecked dispersion of printed matter in the fifteenth century might have called many such publications fake news. Today many would say that it was the Reformation itself that did away with fake news, with the false orthodoxies of a corrupted Church, opening up a competition over ideas that became the foundation of the modern world. Whatever the case, this new world was neither neat nor peaceful, with the religious wars resulting from the Church’s loss of authority over truth continuing until the mid-seventeenth century.
Like the printing press in the fifteenth century, the Internet in the twenty-first has radically transformed and disrupted conventional modes of communication, destroyed the existing structure of authority over truth claims, and opened the door to a period of intense and tumultuous change.
Those who lament the death of truth should instead acknowledge the end of a monopoly system. Science was the pillar of modernity, the new privileged lens to interpret the real world and show a pathway to collective good. Science was not just an ideal but the basis for a regime, a monopoly system. Within this regime, truth was legitimized in particular private and public institutions, especially government agencies, universities, and corporations; it was interpreted and communicated by particular leaders of the scientific community, such as government science advisors, Nobel Prize winners, and the heads of learned societies; it was translated for and delivered to the laity in a wide variety of public and political contexts; it was presumed to point directly toward right action; and it was fetishized by a culture that saw it as single and unitary, something that was delivered by science and could be divorced from the contexts in which it emerged.
Such unitary truths included above all the insistence that the advance of science and technology would guarantee progress and prosperity for everyone — not unlike how the Church’s salvific authority could guarantee a negotiated process for reducing one’s punishment for sins. To achieve this modern paradise, certain subsidiary truths lent support. One, for example, held that economic rationality would illuminate the path to universal betterment, driven by the principle of comparative advantage and the harmony of globalized free markets. Another subsidiary truth expressed the social cost of carbon emissions with absolute precision to the dollar per ton, with the accompanying requirement that humans must control the global climate to the tenth of a degree Celsius. These ideas are self-evidently political, requiring monopolistic control of truth to implement their imputed agendas.
An easy prophecy here is that wars over scientific truth will intensify, as did wars over religious truth after the printing press. Those wars ended with the Peace of Westphalia in 1648, followed, eventually, by the creation of a radically new system of governance, the nation-state, and the collapse of the central authority of the Catholic Church. Will the loss of science’s monopoly over truth lead to political chaos and even bloodshed? The answer largely depends upon the resilience of democratic institutions, and their ability to resist the authoritarian drift that seems to be a consequence of crises such as Covid and climate change, to which simple solutions, and simple truths, do not pertain.
Both the Church and the Protestants enthusiastically adopted the printing press. The Church tried to control it through an index of forbidden books. Protestant print shops adopted a more liberal cultural orientation, one that allowed for competition among diverse ideas about how to express and pursue faith. Today we see a similar dynamic. Mainstream, elite science institutions use the Internet to try to preserve their monopoly over which truths get followed where, but the Internet’s bottom-up, distributed architecture appears to give a decisive advantage to dissenters and their diverse ideologies and perspectives.
Holding on to the idea that science always draws clear boundaries between the true and the false will continue to appeal strongly to many sincere and concerned people. But if, as in the fifteenth century, we are now indeed experiencing a tumultuous transition to a new world of communication, what we may need is a different cultural orientation toward science and technology. The character of this new orientation is only now beginning to emerge, but it will above all have to accommodate the over-abundance of competing truths in human affairs, and create new opportunities for people to forge collective meaning as they seek to manage the complex crises of our day.