CHAPTER 22

SCIENCE

If we were called upon to name the proudest accomplishments of our species, whether in an intergalactic bragging competition or in testimony before the Almighty, what would we say?

We could crow about historic triumphs in human rights, such as the abolition of slavery and the defeat of fascism. But however inspiring these victories are, they consist in the removal of obstacles we set in our own path. It would be like listing in the achievements section of a résumé that you overcame a heroin addiction.1

We would certainly include the masterworks of art, music, and literature. Yet would the works of Aeschylus or El Greco or Billie Holiday be appreciated by sentient agents with brains and experiences unimaginably different from ours? Perhaps there are universals of beauty and meaning that transcend cultures and would resonate with any intelligence—I like to think there are—but it is devilishly difficult to know.

Yet there is one realm of accomplishment of which we can unabashedly boast before any tribunal of minds, and that is science. It’s hard to imagine an intelligent agent that would be incurious about the world in which it exists, and in our species that curiosity has been exhilaratingly satisfied. We can explain much about the history of the universe, the forces that make it tick, the stuff we’re made of, the origin of living things, and the machinery of life, including our mental life.

Though our ignorance is vast (and always will be), our knowledge is astonishing, and growing daily. The physicist Sean Carroll argues in The Big Picture that the laws of physics underlying everyday life (that is, excluding extreme values of energy and gravitation like black holes, dark matter, and the Big Bang) are completely known. It’s hard to disagree that this is “one of the greatest triumphs of human intellectual history.”2 In the living world, more than a million and a half species have been scientifically described, and with a realistic surge of effort the remaining seven million could be named within this century.3 Our understanding of the world, moreover, consists not in mere listings of particles and forces and species but in deep, elegant principles, such as that gravity is the curvature of space-time, and that life depends on a molecule that carries information, directs metabolism, and replicates itself.

Scientific discoveries continue to astound, to delight, to answer the formerly unanswerable. When Watson and Crick discovered the structure of DNA, they could not have dreamed of a day when the genome of a 38,000-year-old Neanderthal fossil would be sequenced and found to contain a gene connected to speech and language, or when an analysis of Oprah Winfrey’s DNA would tell her she was descended from the Kpelle people of the Liberian rain forest.

Science is shedding new light on the human condition. The great thinkers of antiquity, the Age of Reason, and the Enlightenment were born too soon to enjoy ideas with deep implications for morality and meaning, including entropy, evolution, information, game theory, and artificial intelligence (though they often tinkered with precursors and approximations). The problems these thinkers introduced to us are today being enriched with these ideas, and are being probed with methods such as 3-D imaging of brain activity and the mining of big data to trace the propagation of ideas.

Science has also provided the world with images of sublime beauty: stroboscopically frozen motion, flamboyant fauna from tropical rain forests and deep-sea ocean vents, graceful spiral galaxies and diaphanous nebulae, fluorescing neural circuitry, and a luminous Planet Earth rising above the moon’s horizon into the blackness of space. Like great works of art, these are not just pretty pictures but prods to contemplation, which deepen our understanding of what it means to be human and of our place in nature.

And science, of course, has granted us the gifts of life, health, wealth, knowledge, and freedom documented in the chapters on progress. To take just one example from chapter 6, scientific knowledge eradicated smallpox, a painful and disfiguring disease which killed 300 million people in the 20th century alone. In case anyone has skimmed over this feat of moral greatness, let me say it again: scientific knowledge eradicated smallpox, a painful and disfiguring disease which killed 300 million people in the 20th century alone.

These awe-inspiring achievements give the lie to any moaning that we live in an age of decline, disenchantment, meaninglessness, shallowness, or the absurd. Yet today the beauty and power of science are not just unappreciated but bitterly resented. The disdain for science may be found in surprising quarters: not just among religious fundamentalists and know-nothing politicians, but among many of our most adored intellectuals and in our most august institutions of higher learning.


The disrespect for science among American right-wing politicians has been documented by the journalist Chris Mooney in The Republican War on Science and has led even stalwarts (such as Bobby Jindal, the former governor of Louisiana) to disparage their own party as “the party of stupid.”4 The reputation grew out of policies set in motion during George W. Bush’s administration, including his encouragement of the teaching of creationism (in the guise of “intelligent design”) and the shift from a longstanding practice of seeking advice from disinterested scientific panels to stacking the panels with congenial ideologues, many of whom promoted flaky ideas (such as that abortion causes breast cancer) while denying well-supported ones (such as that condoms prevent sexually transmitted diseases).5 Republican politicians have engaged in spectacles of inanity, such as when Senator James Inhofe of Oklahoma, chair of the Environment and Public Works Committee, brought a snowball onto the Senate floor in 2015 to dispute the fact of global warming.

The previous chapter warned us that the stupidification of science in political discourse mostly surrounds hot buttons like abortion, evolution, and climate change. But the scorn for scientific consensus has widened into a broadband know-nothingness. Representative Lamar Smith of Texas, chair of the House Committee on Science, Space, and Technology, has harassed the National Science Foundation not just for its research on climate science (which he thinks is a left-wing conspiracy) but for the research in its peer-reviewed grants, which he pulls out of context to mock (for example, “How does the federal government justify spending over $220,000 to study animal photos in National Geographic?”).6 He has tried to undermine federal support of basic research by proposing legislation that would require the NSF to fund only studies that promote “the national interest” such as defense and the economy.7 Science, of course, transcends national boundaries (as Chekhov noted, “There is no national science just as there is no national multiplication table”), and its ability to promote anyone’s interests comes from its foundational understanding of reality.8 The Global Positioning System, for example, uses the theory of relativity. Cancer therapies depend on the discovery of the double helix. Artificial intelligence adapts neural and semantic networks from the brain and cognitive sciences.

But chapter 21 prepared us for the fact that politicized repression of science comes from the left as well. It was the left that stoked panics about overpopulation, nuclear power, and genetically modified organisms. Research on intelligence, sexuality, violence, parenting, and prejudice has been distorted by tactics ranging from the choice of items in questionnaires to the intimidation of researchers who fail to ratify the politically correct orthodoxy.


My focus in the rest of this chapter is on a hostility to science that runs even deeper. Many intellectuals are enraged by the intrusion of science into the traditional territories of the humanities, such as politics, history, and the arts. Just as reviled is the application of scientific reasoning to the terrain formerly ruled by religion: many writers without a trace of a belief in God maintain that it is unseemly for science to weigh in on the biggest questions. In the major journals of opinion, scientific carpetbaggers are regularly accused of determinism, reductionism, essentialism, positivism, and, worst of all, a crime called scientism.

This resentment is bipartisan. The standard case for the prosecution by the left may be found in a 2011 review in The Nation by the historian Jackson Lears:

Positivism depends on the reductionist belief that the entire universe, including all human conduct, can be explained with reference to precisely measurable, deterministic physical processes. . . . Positivist assumptions provided the epistemological foundations for Social Darwinism and pop-evolutionary notions of progress, as well as for scientific racism and imperialism. These tendencies coalesced in eugenics, the doctrine that human well-being could be improved and eventually perfected through the selective breeding of the “fit” and the sterilization or elimination of the “unfit.” Every schoolkid knows about what happened next: the catastrophic twentieth century. Two world wars, the systematic slaughter of innocents on an unprecedented scale, the proliferation of unimaginably destructive weapons, brushfire wars on the periphery of empire—all these events involved, in various degrees, the application of scientific research to advanced technology.9

The case from the right is captured in this 2007 speech from Leon Kass, Bush’s bioethics advisor:

Scientific ideas and discoveries about living nature and man, perfectly welcome and harmless in themselves, are being enlisted to do battle against our traditional religious and moral teachings, and even our self-understanding as creatures with freedom and dignity. A quasi-religious faith has sprung up among us—let me call it “soul-less scientism”—which believes that our new biology, eliminating all mystery, can give a complete account of human life, giving purely scientific explanations of human thought, love, creativity, moral judgment, and even why we believe in God. The threat to our humanity today comes not from the transmigration of souls in the next life, but from the denial of soul in this one. . . .

Make no mistake. The stakes in this contest are high: at issue are the moral and spiritual health of our nation, the continued vitality of science, and our own self-understanding as human beings and as children of the West. . . . All friends of human freedom and dignity—including even the atheists among us—must understand that their own humanity is on the line.10

These are zealous prosecutors indeed. But as we shall see, their case is trumped up. Science cannot be blamed for genocide and war, and does not threaten the moral and spiritual health of our nation. On the contrary, science is indispensable in all areas of human concern, including politics, the arts, and the search for meaning, purpose, and morality.


The highbrow war on science is a flare-up of the controversy raised by C. P. Snow in 1959 when he deplored the disdain for science among British intellectuals in his lecture and book The Two Cultures. The term “cultures,” in the anthropologists’ sense, explains the puzzle of why science should draw flak not just from fossil-fuel-funded politicians but from some of the most erudite members of the clerisy.

During the 20th century, the landscape of human knowledge was carved into professionalized duchies, and the growth of science (particularly the sciences of human nature) is often seen as an encroachment on territories that had been staked and enclosed by the academic humanities. It’s not that practitioners of the humanities themselves have this zero-sum mindset. Most artists show no signs of it; the novelists, painters, filmmakers, and musicians I know are intensely curious about the light that science might shed on their media, just as they are open to any source of inspiration. Nor is the anxiety expressed by the scholars who delve into historical epochs, genres of art, systems of ideas, and other subject matter in the humanities, since a true scholar is receptive to ideas regardless of their origin. The defensive pugnacity belongs to a culture: Snow’s Second Culture of literary intellectuals, cultural critics, and erudite essayists.11 The writer Damon Linker (citing the sociologist Daniel Bell) characterizes them as “specialists in generalizations, . . . pronouncing on the world from out of their individual experiences, habits of reading and capacity for judgment. Subjectivity in all of its quirks and eccentricities is the coin of the realm in the Republic of Letters.”12 This modus operandi could not be more different from the way of science, and it’s the Second Culture intellectuals who most fear “scientism,” which they understand as the position that “science is all that matters” or that “scientists should be entrusted to solve all problems.”

Snow, of course, never held the lunatic position that power should be transferred to the culture of scientists. On the contrary, he called for a Third Culture, which would combine ideas from science, culture, and history and apply them to enhancing human welfare across the globe.13 The term was revived in 1991 by the author and literary agent John Brockman, and it is related to the biologist E. O. Wilson’s concept of consilience, the unity of knowledge, which Wilson in turn attributed to (who else?) the thinkers of the Enlightenment.14 The first step in understanding the promise of science in human affairs is to escape the bunker mentality of the Second Culture, captured, for example, in the tag line of a 2013 article by the literary lion Leon Wieseltier: “Now science wants to invade the liberal arts. Don’t let it happen.”15

An endorsement of scientific thinking must first of all be distinguished from any belief that members of the occupational guild called “science” are particularly wise or noble. The culture of science is based on the opposite belief. Its signature practices, including open debate, peer review, and double-blind methods, are designed to circumvent the sins to which scientists, being human, are vulnerable. As Richard Feynman put it, the first principle of science is “that you must not fool yourself—and you are the easiest person to fool.”

For the same reason, a call for everyone to think more scientifically must not be confused with a call to hand decision-making over to scientists. Many scientists are naïfs when it comes to policy and law, and cook up nonstarters like world government, mandatory licensing of parents, and escaping a befouled Earth by colonizing other planets. It doesn’t matter, because we’re not talking about which priesthood should be granted power; we’re talking about how collective decisions can be made more wisely.

A respect for scientific thinking is, adamantly, not the belief that all current scientific hypotheses are true. Most new ones are not. The lifeblood of science is the cycle of conjecture and refutation: proposing a hypothesis and then seeing whether it survives attempts to falsify it. This point escapes many critics of science, who point to some discredited hypothesis as proof that science cannot be trusted, like a rabbi from my childhood who rebutted the theory of evolution as follows: “Scientists think the world is four billion years old. They used to think the world was eight billion years old. If they can be off by four billion years once, they can be off by four billion years again.” The fallacy (putting aside the apocryphal history) is a failure to recognize that what science allows is an increasing confidence in a hypothesis as the evidence accumulates, not a claim to infallibility on the first try. Indeed, this kind of argument refutes itself, since the arguers must themselves appeal to the truth of current scientific claims to cast doubt on the earlier ones. The same is true of the common argument that the claims of science are untrustworthy because the scientists of some earlier period were motivated by the prejudices and chauvinisms of the day. When they were, they were doing bad science, and it’s only the better science of later periods that allows us, today, to identify their errors.

One attempt to build a wall around science and make science pay for it uses a different argument: that science deals only with facts about physical stuff, so scientists are committing a logical error when they say anything about values or society or culture. As Wieseltier puts it, “It is not for science to say whether science belongs in morality and politics and art. Those are philosophical matters, and science is not philosophy.” But it is this argument that commits a logical error, by confusing propositions with academic disciplines. It’s certainly true that an empirical proposition is not the same as a logical one, and both must be distinguished from normative or moral claims. But that does not mean that scientists are under a gag order forbidding them to discuss conceptual and moral issues, any more than philosophers must keep their mouths shut about the physical world.

Science is not a list of empirical facts. Scientists are immersed in the ethereal medium of information, including the truths of mathematics, the logic of their theories, and the values that guide their enterprise. Nor, for its part, has philosophy ever confined itself to a ghostly realm of pure ideas that float free of the physical universe. The Enlightenment philosophers in particular interwove their conceptual arguments with hypotheses about perception, cognition, emotion, and sociality. (Hume’s analysis of the nature of causality, to take just one example, took off from his insights about the psychology of causality, and Kant was, among other things, a prescient cognitive psychologist.)16 Today most philosophers (at least in the analytic or Anglo-American tradition) subscribe to naturalism, the position that “reality is exhausted by nature, containing nothing ‘supernatural,’ and that the scientific method should be used to investigate all areas of reality, including the ‘human spirit.’”17 Science, in the modern conception, is of a piece with philosophy and with reason itself.

What, then, distinguishes science from other exercises of reason? It certainly isn’t “the scientific method,” a term that is taught to schoolchildren but that never passes the lips of a scientist. Scientists use whichever methods help them understand the world: drudgelike tabulation of data, experimental derring-do, flights of theoretical fancy, elegant mathematical modeling, kludgy computer simulation, sweeping verbal narrative.18 All the methods are pressed into the service of two ideals, and it is these ideals that advocates of science want to export to the rest of intellectual life.

The first is that the world is intelligible. The phenomena we experience may be explained by principles that are deeper than the phenomena themselves. That’s why scientists laugh at the Theory of the Brontosaurus from the dinosaur expert on Monty Python’s Flying Circus: “All brontosauruses are thin at one end, much much thicker in the middle, and then thin again at the far end”—the “theory” is just a description of how things are, not an explanation of why they are the way they are. The principles making up an explanation may in turn be explained by still deeper principles, and so on. (As David Deutsch put it, “We are always at the beginning of infinity.”) In making sense of our world, there should be few occasions on which we are forced to concede, “It just is” or “It’s magic” or “Because I said so.” The commitment to intelligibility is not a matter of raw faith, but progressively validates itself as more of the world becomes explicable in scientific terms. The processes of life, for example, used to be attributed to a mysterious élan vital; now we know they are powered by chemical and physical reactions among complex molecules.

Demonizers of scientism often confuse intelligibility with a sin called reductionism, the analysis of a complex system into simpler elements, or, according to the accusation, nothing but simpler elements. In fact, to explain a complex happening in terms of deeper principles is not to discard its richness. Patterns emerge at one level of analysis that are not reducible to their components at a lower level. Though World War I consisted of matter in motion, no one would try to explain World War I in the language of physics, chemistry, and biology as opposed to the more perspicuous language of the perceptions and goals of leaders in 1914 Europe. At the same time, a curious person can legitimately ask why human minds are apt to have such perceptions and goals, including the tribalism, overconfidence, mutual fear, and culture of honor that fell into a deadly combination at that historical moment.

The second ideal is that we must allow the world to tell us whether our ideas about it are correct. The traditional causes of belief—faith, revelation, dogma, authority, charisma, conventional wisdom, hermeneutic parsing of texts, the glow of subjective certainty—are generators of error, and should be dismissed as sources of knowledge. Instead our beliefs about empirical propositions should be calibrated by their fit to the world. When scientists are pressed to explain how they do this, they usually reach for Karl Popper’s model of conjecture and refutation, in which a scientific theory may be falsified by empirical tests but is never confirmed. In reality, science doesn’t much look like skeet shooting, with a succession of hypotheses launched into the air like clay pigeons and shot to smithereens. It looks more like Bayesian reasoning (the logic used by the superforecasters we met in the preceding chapter). A theory is granted a prior degree of credence, based on its consistency with everything else we know. That level of credence is then incremented or decremented according to how likely an empirical observation would be if the theory is true, compared with how likely it would be if the theory is false.19 Regardless of whether Popper or Bayes has the better account, a scientist’s degree of belief in a theory depends on its consistency with empirical evidence. Any movement that calls itself “scientific” but fails to nurture opportunities for the testing of its own beliefs (most obviously when it murders or imprisons the people who disagree with it) is not a scientific movement.
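To make the arithmetic concrete, here is a minimal sketch of that Bayesian update in Python. The hypothesis, the prior, and the likelihoods are illustrative assumptions, not figures from the text; the point is only that credence rises or falls with how likely the observation would be if the theory were true compared with how likely it would be if the theory were false.

```python
# A toy Bayesian update (illustrative numbers only, not from the text).
# A hypothesis starts with a prior degree of credence; each observation
# shifts that credence according to how likely the observation would be
# if the hypothesis were true versus if it were false.

def update(prior, p_obs_if_true, p_obs_if_false):
    """Return the posterior credence in a hypothesis after one observation."""
    numerator = prior * p_obs_if_true
    return numerator / (numerator + (1 - prior) * p_obs_if_false)

credence = 0.2                      # modest prior credence before any data
for _ in range(3):                  # three observations favoring the hypothesis
    credence = update(credence, p_obs_if_true=0.9, p_obs_if_false=0.3)

print(round(credence, 3))           # about 0.871: credence grows as evidence accumulates
```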


Many people are willing to credit science with giving us handy drugs and gadgets and even with explaining how physical stuff works. But they draw the line at what truly matters to us as human beings: the deep questions about who we are, where we came from, and how we define the meaning and purpose of our lives. That is the traditional territory of religion, and its defenders tend to be the most excitable critics of scientism. They are apt to endorse the partition plan proposed by the paleontologist and science writer Stephen Jay Gould in his book Rocks of Ages, according to which the proper concerns of science and religion belong to “non-overlapping magisteria.” Science gets the empirical universe; religion gets the questions of morality, meaning, and value.

But this entente unravels as soon as you begin to examine it. The moral worldview of any scientifically literate person—one who is not blinkered by fundamentalism—requires a clean break from religious conceptions of meaning and value.

To begin with, the findings of science imply that the belief systems of all the world’s traditional religions and cultures—their theories of the genesis of the world, life, humans, and societies—are factually mistaken. We know, but our ancestors did not, that humans belong to a single species of African primate that developed agriculture, government, and writing late in its history. We know that our species is a tiny twig of a genealogical tree that embraces all living things and that emerged from prebiotic chemicals almost four billion years ago. We know that we live on a planet that revolves around one of a hundred billion stars in our galaxy, which is one of a hundred billion galaxies in a 13.8-billion-year-old universe, possibly one of a vast number of universes. We know that our intuitions about space, time, matter, and causation are incommensurable with the nature of reality on scales that are very large and very small. We know that the laws governing the physical world (including accidents, disease, and other misfortunes) have no goals that pertain to human well-being. There is no such thing as fate, providence, karma, spells, curses, augury, divine retribution, or answered prayers—though the discrepancy between the laws of probability and the workings of cognition may explain why people believe there are. And we know that we did not always know these things, that the beloved convictions of every time and culture may be decisively falsified, doubtless including many we hold today.

In other words, the worldview that guides the moral and spiritual values of a knowledgeable person today is the worldview given to us by science. Though the scientific facts do not by themselves dictate values, they certainly hem in the possibilities. By stripping ecclesiastical authority of its credibility on factual matters, they cast doubt on its claims to certitude in matters of morality. The scientific refutation of the theory of vengeful gods and occult forces undermines practices such as human sacrifice, witch hunts, faith healing, trial by ordeal, and the persecution of heretics. By exposing the absence of purpose in the laws governing the universe, science forces us to take responsibility for the welfare of ourselves, our species, and our planet. For the same reason, it undercuts any moral or political system based on mystical forces, quests, destinies, dialectics, struggles, or messianic ages. And in combination with a few unexceptionable convictions—that all of us value our own welfare, and that we are social beings who impinge on each other and can negotiate codes of conduct—the scientific facts militate toward a defensible morality, namely principles that maximize the flourishing of humans and other sentient beings. This humanism (chapter 23), which is inextricable from a scientific understanding of the world, is becoming the de facto morality of modern democracies, international organizations, and liberalizing religions, and its unfulfilled promises define the moral imperatives we face today.


Though science is increasingly and beneficially embedded in our material, moral, and intellectual lives, many of our cultural institutions cultivate a philistine indifference to science that shades into contempt. Intellectual magazines that are ostensibly dedicated to ideas confine themselves to politics and the arts, with scant attention to new ideas emerging from science, with the exception of politicized issues like climate change (and regular attacks on scientism).20 Still worse is the treatment of science in the liberal arts curricula of many universities. Students can graduate with a trifling exposure to science, and what they do learn is often designed to poison them against it.

The most commonly assigned book on science in modern universities (aside from a popular biology textbook) is Thomas Kuhn’s The Structure of Scientific Revolutions.21 That 1962 classic is commonly interpreted as showing that science does not converge on the truth but merely busies itself with solving puzzles before flipping to some new paradigm which renders its previous theories obsolete, indeed, unintelligible.22 Though Kuhn himself later disavowed this nihilist interpretation, it has become the conventional wisdom within the Second Culture. A critic from a major intellectual magazine once explained to me that the art world no longer considers whether works of art are “beautiful” for the same reason that scientists no longer consider whether theories are “true.” He seemed genuinely surprised when I corrected him.

The historian of science David Wootton has remarked on the mores of his own field: “In the years since Snow’s lecture the two-cultures problem has deepened; history of science, far from serving as a bridge between the arts and sciences, nowadays offers the scientists a picture of themselves that most of them cannot recognize.”23 That is because many historians of science consider it naïve to treat science as the pursuit of true explanations of the world. The result is like a report of a basketball game by a dance critic who is not allowed to say that the players are trying to throw the ball through the hoop. I once sat through a lecture on the semiotics of neuroimaging at which a historian of science deconstructed a series of dynamic 3-D multicolor images of the brain, volubly explaining how “that ostensibly neutral and naturalizing scientific gaze encourages particular kinds of selves who are then amenable to certain political agendas, shifting position from the neuro(psychological) object toward the external observatory position,” and so on—any explanation but the bloody obvious one, namely that the images make it easier to see what’s going on in the brain.24 Many scholars in “science studies” devote their careers to recondite analyses of how the whole institution is just a pretext for oppression. An example is this scholarly contribution to the world’s most pressing challenge:

Glaciers, Gender, and Science: A Feminist Glaciology Framework for Global Environmental Change Research

Glaciers are key icons of climate change and global environmental change. However, the relationships among gender, science, and glaciers—particularly related to epistemological questions about the production of glaciological knowledge—remain understudied. This paper thus proposes a feminist glaciology framework with four key components: (1) knowledge producers; (2) gendered science and knowledge; (3) systems of scientific domination; and (4) alternative representations of glaciers. Merging feminist postcolonial science studies and feminist political ecology, the feminist glaciology framework generates robust analysis of gender, power, and epistemologies in dynamic social-ecological systems, thereby leading to more just and equitable science and human-ice interactions.25

More insidious than the ferreting out of ever more cryptic forms of racism and sexism is a demonization campaign that impugns science (together with reason and other Enlightenment values) for crimes that are as old as civilization, including racism, slavery, conquest, and genocide. This was a major theme of the influential Critical Theory of the Frankfurt School, the quasi-Marxist movement originated by Theodor Adorno and Max Horkheimer, who proclaimed that “the fully enlightened earth radiates disaster triumphant.”26 It also figures in the works of postmodernist theorists such as Michel Foucault, who argued that the Holocaust was the inevitable culmination of a “bio-politics” that began with the Enlightenment, when science and rational governance exerted increasing power over people’s lives.27 In a similar vein, the sociologist Zygmunt Bauman blamed the Holocaust on the Enlightenment ideal to “remake the society, force it to conform to an overall, scientifically conceived plan.”28 In this twisted narrative, the Nazis themselves are let off the hook (“It’s modernity’s fault!”). So is the Nazis’ rabidly counter-Enlightenment ideology, which despised the degenerate liberal bourgeois worship of reason and progress and embraced an organic, pagan vitality which drove the struggle between races. Though Critical Theory and postmodernism avoid “scientistic” methods such as quantification and systematic chronology, the facts suggest they have the history backwards. Genocide and autocracy were ubiquitous in premodern times, and they decreased, not increased, as science and liberal Enlightenment values became increasingly influential after World War II.29

To be sure, science has often been pressed into the support of deplorable political movements. It is essential, of course, to understand this history, and legitimate to pass judgment on scientists for their roles in it, just like any historical figures. Yet the qualities that we prize in humanities scholars—context, nuance, historical depth—often leave them when the opportunity arises to prosecute a campaign against their academic rivals. Science is commonly blamed for intellectual movements that had a pseudoscientific patina, though the historical roots of those movements ran deep and wide.

“Scientific racism,” the theory that races fall into an evolutionary hierarchy of mental sophistication with Northern Europeans at the top, is a prime example. It was popular in the decades flanking the turn of the 20th century, apparently supported by craniometry and mental testing, before being discredited in the middle of the 20th century by better science and by the horrors of Nazism. Yet to pin ideological racism on science, in particular on the theory of evolution, is bad intellectual history. Racist beliefs have been omnipresent across history and regions of the world. Slavery has been practiced by every civilization, and was commonly rationalized by the belief that enslaved peoples were inherently suited to servitude, often by God’s design.30 Statements from ancient Greek and medieval Arab writers about the biological inferiority of Africans would curdle your blood, and Cicero’s opinion of Britons was not much more charitable.31

More to the point, the intellectualized racism that infected the West in the 19th century was the brainchild not of science but of the humanities: history, philology, classics, and mythology. In 1853 Arthur de Gobineau, a fiction writer and amateur historian, published his cockamamie theory that a race of virile white men, the Aryans, spilled out of an ancient homeland and spread a heroic warrior civilization across Eurasia, diverging into the Persians, Hittites, Homeric Greeks, and Vedic Hindus, and later into the Vikings, Goths, and other Germanic tribes. (The speck of reality in this story is that these tribes spoke languages that fell into a single family, Indo-European.) Everything went downhill when the Aryans interbred with inferior conquered peoples, diluting their greatness and causing them to degenerate into the effete, decadent, soulless, bourgeois, commercial cultures that the Romantics were always whinging about. It was a small step to fuse this fairy tale with German Romantic nationalism and anti-Semitism: the Teutonic Volk were the heirs of the Aryans, the Jews a mongrel race of Asiatics. Gobineau’s ideas were eaten up by Richard Wagner (whose operas were held to be re-creations of the original Aryan myths) and by Wagner’s son-in-law Houston Stewart Chamberlain (a philosopher who wrote that Jews polluted Teutonic civilization with capitalism, liberal humanism, and sterile science). From them the ideas reached Hitler, who called Chamberlain his “spiritual father.”32

Science played little role in this chain of influence. Pointedly, Gobineau, Chamberlain, and Hitler rejected Darwin’s theory of evolution, particularly the idea that all humans had gradually evolved from apes, which was incompatible with their Romantic theory of race and with the older folk and religious notions from which it emerged. According to these widespread beliefs, races were separate species; they were fitted to civilizations with different levels of sophistication; and they would degenerate if they mixed. Darwin argued that humans are closely related members of a single species with a common ancestry, that all peoples have “savage” origins, that the mental capacities of all races are virtually the same, and that the races blend into one another with no harm from interbreeding.33 The historian Robert Richards, who carefully traced Hitler’s influences, ended a chapter entitled “Was Hitler a Darwinian?” (a common claim among creationists) with “The only reasonable answer to the question . . . is a very loud and unequivocal No!”34

Like “scientific racism,” the movement called Social Darwinism is often tendentiously attributed to science. When the concept of evolution became famous in the late 19th and early 20th centuries, it turned into an inkblot test that a diverse assortment of political and intellectual movements saw as vindicating their agendas. Everyone wanted to believe that their vision of struggle, progress, and the good life was nature’s way.35 One of these movements was retroactively dubbed social Darwinism, though it was advocated not by Darwin but by Herbert Spencer, who laid it out in 1851, eight years before the publication of The Origin of Species. Spencer did not believe in random mutation and natural selection; he believed in a Lamarckian process in which the struggle for existence impelled organisms to strive toward feats of greater complexity and adaptation, which they passed on to later generations. Spencer thought that this progressive force was best left unimpeded, and so he argued against social welfare and government regulation that would only prolong the doomed lives of weaker individuals and groups. His political philosophy, an early form of libertarianism, was picked up by robber barons, advocates of laissez-faire economics, and opponents of social spending. Because those ideas had a right-wing flavor, left-wing writers misapplied the term social Darwinism to other ideas with a right-wing flavor, such as imperialism and eugenics, even though Spencer was dead-set against such government activism.36 More recently the term has been used as a weapon against any application of evolution to the understanding of human beings.37 So despite its etymology, the term has nothing to do with Darwin or evolutionary biology, and is now an almost meaningless term of abuse.

Eugenics is another movement that has been used as an ideological blunderbuss. Francis Galton, a Victorian polymath, first suggested that the genetic stock of humankind could be improved by offering incentives for talented people to marry each other and have more children (positive eugenics), though when the idea caught on it was extended to discouraging reproduction among the “unfit” (negative eugenics). Many countries forcibly sterilized delinquents, the mentally retarded, the mentally ill, and other people who fell into a wide net of ailments and stigmas. Nazi Germany modeled its forced sterilization laws after ones in Scandinavia and the United States, and its mass murder of Jews, Roma, and homosexuals is often considered a logical extension of negative eugenics. (In reality the Nazis invoked public health far more than genetics or evolution: Jews were likened to vermin, pathogens, tumors, gangrenous organs, and poisoned blood.)38

The eugenics movement was permanently discredited by its association with Nazism. But the term survived as a way to taint a number of scientific endeavors, such as applications of medical genetics that allow parents to bear children without fatal degenerative diseases, and the entire field of behavioral genetics, which analyzes the genetic and environmental causes of individual differences.39 And in defiance of the historical record, eugenics is often portrayed as a movement of right-wing scientists. In fact it was championed by progressives, liberals, and socialists, including Theodore Roosevelt, H. G. Wells, Emma Goldman, George Bernard Shaw, Harold Laski, John Maynard Keynes, Sidney and Beatrice Webb, Woodrow Wilson, and Margaret Sanger.40 Eugenics, after all, valorized reform over the status quo, social responsibility over selfishness, and central planning over laissez-faire. The most decisive repudiation of eugenics invokes classical liberal and libertarian principles: government is not an omnipotent ruler over human existence but an institution with circumscribed powers, and perfecting the genetic makeup of the species is not among them.

I’ve mentioned the limited role of science in these movements not to absolve the scientists (many of whom were indeed active or complicit) but because the movements deserve a deeper and more contextualized understanding than their current role as anti-science propaganda. Misunderstandings of Darwin gave these movements a boost, but they sprang from the religious, artistic, intellectual, and political beliefs of their eras: Romanticism, cultural pessimism, progress as dialectical struggle or mystical unfolding, and authoritarian high modernism. If we think these ideas are not just unfashionable but mistaken, it is because of the better historical and scientific understanding we enjoy today.


Recriminations over the nature of science are by no means a relic of the “science wars” of the 1980s and 1990s, but continue to shape the role of science in universities. When Harvard reformed its general education requirement in 2006–7, the preliminary task force report introduced the teaching of science without any mention of its place in human knowledge: “Science and technology directly affect our students in many ways, both positive and negative: they have led to life-saving medicines, the internet, more efficient energy storage, and digital entertainment; they also have shepherded nuclear weapons, biological warfare agents, electronic eavesdropping, and damage to the environment.” Well, yes, and I suppose one could say that architecture has produced both museums and gas chambers, that classical music both stimulates economic activity and inspired the Nazis, and so on. But this strange equivocation between the utilitarian and the nefarious was not applied to other disciplines, and the statement gave no indication that we might have good reasons to prefer understanding and know-how to ignorance and superstition.

At a recent conference, another colleague summed up what she thought was the mixed legacy of science: vaccines for smallpox on the one hand; the Tuskegee syphilis study on the other. In that affair, another bloody shirt in the standard narrative about the evils of science, public health researchers, beginning in 1932, tracked the progression of untreated latent syphilis in a sample of impoverished African Americans for four decades. The study was patently unethical by today’s standards, though it’s often misreported to pile up the indictment. The researchers, many of them African American or advocates of African American health and well-being, did not infect the participants, as many people believe (a misconception that has led to the widespread conspiracy theory that AIDS was invented in US government labs to control the black population). And when the study began, it may even have been defensible by the standards of the day: treatments for syphilis (mainly arsenic) were toxic and ineffective; when antibiotics became available later, their safety and efficacy in treating syphilis were unknown; and latent syphilis was known to often resolve itself without treatment.41 But the point is that the entire equation is morally obtuse, showing the power of Second Culture talking points to scramble a sense of proportionality. My colleague’s comparison assumed that the Tuskegee study was an unavoidable part of scientific practice as opposed to a universally deplored breach, and it equated a one-time failure to prevent harm to a few dozen people with the prevention of hundreds of millions of deaths per century in perpetuity.

Does the demonization of science in the liberal arts programs of higher education matter? It does, for a number of reasons. Though many talented students hurtle along pre-med or engineering tracks from the day they set foot on campus, many others are unsure of what they want to do with their lives and take their cues from their professors and advisors. What happens to those who are taught that science is just another narrative like religion and myth, that it lurches from revolution to revolution without making progress, and that it is a rationalization of racism, sexism, and genocide? I’ve seen the answer: some of them figure, “If that’s what science is, I might as well make money!” Four years later their brainpower is applied to thinking up algorithms that allow hedge funds to act on financial information a few milliseconds faster rather than to finding new treatments for Alzheimer’s disease or technologies for carbon capture and storage.

The stigmatization of science is also jeopardizing the progress of science itself. Today anyone who wants to do research on human beings, even an interview on political opinions or a questionnaire about irregular verbs, must prove to a committee that he or she is not Josef Mengele. Though research subjects obviously must be protected from exploitation and harm, the institutional review bureaucracy has swollen far beyond this mission. Critics have pointed out that it has become a menace to free speech, a weapon that fanatics can use to shut up people whose opinions they don’t like, and a red-tape dispenser which bogs down research while failing to protect, and sometimes harming, patients and research subjects.42 Jonathan Moss, a medical researcher who had developed a new class of drugs and was drafted into chairing the research review board at the University of Chicago, said in a convocation address, “I ask you to consider three medical miracles we take for granted: X-rays, cardiac catheterization, and general anesthesia. I contend all three would be stillborn if we tried to deliver them in 2005.”43 (The same observation has been made about insulin, burn treatments, and other lifesavers.) The social sciences face similar hurdles. Anyone who talks to a human being with the intent of gaining generalizable knowledge must obtain prior permission from these committees, almost certainly in violation of the First Amendment. Anthropologists are forbidden to speak with illiterate peasants who cannot sign a consent form, or interview would-be suicide bombers on the off chance that they might blurt out information that puts them in jeopardy.44

The hobbling of research is not just a symptom of bureaucratic mission creep. It is actually rationalized by many academics in a field called bioethics. These theoreticians think up reasons why informed and consenting adults should be forbidden to take part in treatments that help them and others while harming no one, using nebulous rubrics like “dignity,” “sacredness,” and “social justice.” They try to sow panic about advances in biomedical research using far-fetched analogies with nuclear weapons and Nazi atrocities, science-fiction dystopias like Brave New World and Gattaca, and freak-show scenarios like armies of cloned Hitlers, people selling their eyeballs on eBay, or warehouses of zombies to supply people with spare organs. The moral philosopher Julian Savulescu has exposed the low standards of reasoning behind these arguments and has pointed out why “bioethical” obstructionism can be unethical: “To delay by 1 year the development of a treatment that cures a lethal disease that kills 100,000 people per year is to be responsible for the deaths of those 100,000 people, even if you never see them.”45


Ultimately the greatest payoff of instilling an appreciation of science is for everyone to think more scientifically. We saw in the preceding chapter that humans are vulnerable to cognitive biases and fallacies. Though scientific literacy itself is not a cure for fallacious reasoning when it comes to politicized identity badges, most issues don’t start out that way, and everyone would be better off if they could think about them more scientifically. Movements that aim to spread scientific sophistication such as data journalism, Bayesian forecasting, evidence-based medicine and policy, real-time violence monitoring, and effective altruism have a vast potential to enhance human welfare. But an appreciation of their value has been slow to penetrate the culture.46

I asked my doctor whether the nutritional supplement he had recommended for my knee pain would really be effective. He replied, “Some of my patients say it works for them.” A business-school colleague shared this assessment of the corporate world: “I have observed many smart people who have little idea of how to logically think through a problem, who infer causation from a correlation, and who use anecdotes as evidence far beyond the predictability warranted.” Another colleague who quantifies war, peace, and human security describes the United Nations as an “evidence-free zone”:

The higher reaches of the UN are not unlike anti-science humanities programs. Most people at the top are lawyers and liberal arts graduates. The only parts of the Secretariat that have anything resembling a research culture have little prestige or influence. Few of the top officials in the UN understood qualifying statements as basic as “on average and other things being equal.” So if we were talking about risk probabilities for conflict onsets you could be sure that Sir Archibald Prendergast III or some other luminary would offer a dismissive, “It’s not like that in Burkina Faso, y’know.”

Resisters of scientific thinking often object that some things just can’t be quantified. Yet unless they are willing to speak only of issues that are black or white and to foreswear using the words more, less, better, and worse (and for that matter the suffix –er), they are making claims that are inherently quantitative. If they veto the possibility of putting numbers to them, they are saying, “Trust my intuition.” But if there’s one thing we know about cognition, it’s that people (including experts) are arrogantly overconfident about their intuition. In 1954 Paul Meehl stunned his fellow psychologists by showing that simple actuarial formulas outperform expert judgment in predicting psychiatric classifications, suicide attempts, school and job performance, lies, crime, medical diagnoses, and pretty much any other outcome in which accuracy can be judged at all. Meehl’s work inspired Tversky and Kahneman’s discoveries on cognitive biases and Tetlock’s forecasting tournaments, and his conclusion about the superiority of statistical to intuitive judgment is now recognized as one of the most robust findings in the history of psychology.47
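For readers curious what a “simple actuarial formula” looks like in practice, here is a minimal sketch in Python. The predictors, weights, and cutoff are hypothetical, chosen only to illustrate Meehl’s point that a fixed weighted sum of a few relevant variables, applied consistently, is the kind of rule that tends to beat unaided intuition.

```python
# A toy actuarial rule (hypothetical predictors, weights, and cutoff).
# Meehl's finding concerned rules of roughly this form: combine a few
# standardized predictors with fixed weights and compare to a threshold,
# rather than relying on case-by-case clinical intuition.

def actuarial_score(predictors, weights):
    """Weighted sum of standardized predictor values."""
    return sum(w * x for w, x in zip(weights, predictors))

# Hypothetical applicant: prior performance, test score, attendance (z-scores).
applicant = [0.8, 0.3, -0.2]
weights   = [0.5, 0.3,  0.2]

score = actuarial_score(applicant, weights)
print(round(score, 2), "predict success" if score > 0.25 else "predict failure")
# 0.45 predict success
```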

Like all good things, data are not a panacea, a silver bullet, a magic bullet, or a one-size-fits-all solution. All the money in the world could not pay for randomized controlled trials to settle every question that occurs to us. Human beings will always be in the loop to decide which data to gather and how to analyze and interpret them. The first attempts to quantify a concept are always crude, and even the best ones allow probabilistic rather than perfect understanding. Nonetheless, quantitative social scientists have laid out criteria for evaluating and improving measurements, and the critical comparison is not whether a measure is perfect but whether it is better than the judgment of an expert, critic, interviewer, clinician, judge, or maven. That turns out to be a low bar.

Because the cultures of politics and journalism are largely innocent of the scientific mindset, questions with massive consequences for life and death are answered by methods that we know lead to error, such as anecdotes, headlines, rhetoric, and what engineers call HiPPO (highest-paid person’s opinion). We have already seen some dangerous misconceptions that arise from this statistical obtuseness. People think that crime and war are spinning out of control, though homicides and battle deaths are going down, not up. They think that Islamist terrorism is a major risk to life and limb, whereas the danger is smaller than that from wasps and bees. They think that ISIS threatens the existence or survival of the United States, whereas terrorist movements rarely achieve any of their strategic aims.

The dataphobic mindset (“It’s not like that in Burkina Faso”) can lead to real tragedy. Many political commentators can recall a failure of peacekeeping forces (such as in Bosnia in 1995) and conclude that they are a waste of money and manpower. But when a peacekeeping force is successful, nothing photogenic happens, and it fails to make the news. In her book Does Peacekeeping Work? the political scientist Virginia Page Fortna addressed the question in her title with the methods of science rather than headlines, and, in defiance of Betteridge’s Law, found that the answer is “a clear and resounding yes.” Other studies have come to the same conclusion.48 Knowing the results of these analyses could make the difference between an international organization helping to bring peace to a country and letting it fester in civil war.

Do multiethnic regions harbor “ancient hatreds” that can only be tamed by partitioning them into ethnic enclaves and cleansing the minorities from each one? Whenever ethnic neighbors go for each other’s throats we read about it, but what about the neighborhoods that never make the news because they live in boring peace? What proportion of pairs of ethnic neighbors coexist without violence? The answer is, most of them: 95 percent of the neighbors in the former Soviet Union, 99 percent of those in Africa.49

Do campaigns of nonviolent resistance work? Many people believe that Gandhi and Martin Luther King just got lucky: their movements tugged at the heartstrings of enlightened democracies at opportune moments, but everywhere else, oppressed people need violence to get out from under a dictator’s boot. The political scientists Erica Chenoweth and Maria Stephan assembled a dataset of political resistance movements across the world between 1900 and 2006 and discovered that three-quarters of the nonviolent resistance movements succeeded, compared with only a third of the violent ones.50 Gandhi and King were right, but without data, you would never know it.

Though the urge to join a violent insurgent or terrorist group may owe more to male bonding than to just-war theory, most of the combatants probably believe that if they want to bring about a better world, they have no choice but to kill people. What would happen if everyone knew that violent strategies were not just immoral but ineffectual? It’s not that I think we should airdrop crates of Chenoweth and Stephan’s book into conflict zones. But leaders of radical groups are often highly educated (they distill their frenzy from academic scribblers of a few years back), and even the cannon fodder often attend some college and absorb the conventional wisdom about the need for revolutionary violence.51 What would happen over the long run if a standard college curriculum devoted less attention to the writings of Karl Marx and Frantz Fanon and more to quantitative analyses of political violence?


One of the greatest potential contributions of modern science may be a deeper integration with its academic partner, the humanities. By all accounts, the humanities are in trouble. University programs are downsizing; the next generation of scholars is un- or underemployed; morale is sinking; students are staying away in droves.52

No thinking person should be indifferent to our society’s disinvestment in the humanities.53 A society without historical scholarship is like a person without memory: deluded, confused, easily exploited. Philosophy grows out of the recognition that clarity and logic don’t come easily to us and that we’re better off when our thinking is refined and deepened. The arts are one of the things that make life worth living, enriching human experience with beauty and insight. Criticism is itself an art that multiplies the appreciation and enjoyment of great works. Knowledge in these domains is hard won, and needs constant enriching and updating as the times change.

Diagnoses of the malaise of the humanities rightly point to anti-intellectual trends in our culture and to the commercialization of universities. But an honest appraisal would have to acknowledge that some of the damage is self-inflicted. The humanities have yet to recover from the disaster of postmodernism, with its defiant obscurantism, self-refuting relativism, and suffocating political correctness. Many of its luminaries—Nietzsche, Heidegger, Foucault, Lacan, Derrida, the Critical Theorists—are morose cultural pessimists who declare that modernity is odious, all statements are paradoxical, works of art are tools of oppression, liberal democracy is the same as fascism, and Western civilization is circling the drain.54

With such a cheery view of the world, it’s not surprising that the humanities often have trouble defining a progressive agenda for their own enterprise. Several university presidents and provosts have lamented to me that when a scientist comes into their office, it’s to announce some exciting new research opportunity and demand the resources to pursue it. When a humanities scholar drops by, it’s to plead for respect for the way things have always been done. Those ways do deserve respect, and there can be no replacement for the close reading, thick description, and deep immersion that erudite scholars can apply to individual works. But must these be the only paths to understanding?

A consilience with science offers the humanities many possibilities for new insight. Art, culture, and society are products of human brains. They originate in our faculties of perception, thought, and emotion, and they cumulate and spread through the epidemiological dynamics by which one person affects others. Shouldn’t we be curious to understand these connections? Both sides would win. The humanities would enjoy more of the explanatory depth of the sciences, and a forward-looking agenda that could attract ambitious young talent (not to mention appealing to deans and donors). The sciences could challenge their theories with the natural experiments and ecologically valid phenomena that have been so richly characterized by humanities scholars.

In some fields, this consilience is a fait accompli. Archaeology has grown from a branch of art history to a high-tech science. The philosophy of mind shades into mathematical logic, computer science, cognitive science, and neuroscience. Linguistics combines philological scholarship on the history of words and grammatical constructions with laboratory studies of speech, mathematical models of grammar, and the computerized analysis of large corpora of writing and conversation.

Political theory, too, has a natural affinity with the sciences of mind. “What is government,” asked James Madison, “but the greatest of all reflections on human nature?” Social, political, and cognitive scientists are reexamining the connections between politics and human nature, which were avidly debated in Madison’s time but submerged during an interlude in which humans were treated as blank slates or rational actors. Humans, we now know, are moralistic actors: they are guided by intuitions about authority, tribe, and purity; are committed to sacred beliefs that express their identity; and are driven by conflicting inclinations toward revenge and reconciliation. We are starting to grasp why these impulses evolved, how they are implemented in the brain, how they differ among individuals, cultures, and subcultures, and which conditions turn them on and off.55

Comparable opportunities beckon in other areas of the humanities. The visual arts could avail themselves of the explosion of knowledge in vision science, including the perception of color, shape, texture, and lighting, and the evolutionary aesthetics of faces, landscapes, and geometric forms.56 Music scholars have much to discuss with the scientists who study the perception of speech, the structure of language, and the brain’s analysis of the auditory world.57

As for literary scholarship, where to begin?58 John Dryden wrote that a work of fiction is “a just and lively image of human nature, representing its passions and humours, and the changes of fortune to which it is subject, for the delight and instruction of mankind.” Cognitive psychology can shed light on how readers reconcile their own consciousness with those of the author and characters. Behavioral genetics can update folk theories of parental influence with discoveries about the effects of genes, peers, and chance, which have profound implications for the interpretation of biography and memoir—an endeavor that also has much to learn from the cognitive psychology of memory and the social psychology of self-presentation. Evolutionary psychologists can distinguish the obsessions that are universal from those that are exaggerated by a particular culture, and can lay out the inherent conflicts and confluences of interest within families, couples, friendships, and rivalries that are the drivers of plot. Together, these ideas can add new depth to Dryden’s observation about fiction and human nature.

Though many concerns in the humanities are best appreciated with traditional narrative criticism, some raise empirical questions that can be informed by data. The advent of data science applied to books, periodicals, correspondence, and musical scores has inaugurated an expansive new “digital humanities.”59 The possibilities for theory and discovery are limited only by the imagination, and include the origin and spread of ideas, networks of intellectual and artistic influence, the contours of historical memory, the waxing and waning of themes in literature, the universality or culture-specificity of archetypes and plots, and patterns of unofficial censorship and taboo.
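To make the flavor of such work concrete, here is a minimal sketch, in Python, of the simplest kind of digital-humanities measurement: tracking how often a theme’s vocabulary appears, per thousand words, in texts from different periods. Everything in it is invented for illustration—the three-decade “corpus,” the honour-related word list—and real studies draw on millions of digitized books and far more sophisticated statistical models, but the underlying logic of counting and normalizing is the same.

```python
# Illustrative sketch only: a toy version of tracking the waxing and
# waning of a theme across time. The "corpus" below is hypothetical.
from collections import Counter

corpus = {
    1810: ["the hero seeks honour and glory in battle",
           "honour demands a duel at dawn"],
    1900: ["the clerk frets over his reputation at the office",
           "a quiet life, free of scandal and duels"],
    1990: ["she worries about authenticity and identity",
           "finding one's true self in the city"],
}

def theme_rate(texts, theme_words, per=1000):
    """Occurrences of any theme word per `per` words of text."""
    words = [w.strip(".,;!?").lower() for t in texts for w in t.split()]
    hits = sum(Counter(words)[w] for w in theme_words)
    return per * hits / max(len(words), 1)

honour_theme = {"honour", "glory", "duel", "duels"}
for decade, texts in sorted(corpus.items()):
    print(decade, round(theme_rate(texts, honour_theme), 1))
```

Scaled up from this toy example to whole libraries, the same counting-and-normalizing logic is what lets researchers chart the spread of an idea, the afterlife of a reputation, or the quiet disappearance of a taboo subject.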

The promise of a unification of knowledge can be fulfilled only if knowledge flows in all directions. Some of the scholars who have recoiled from scientists’ forays into explaining art are correct that these explanations have been, by their standards, shallow and simplistic. All the more reason for them to reach out and combine their erudition about individual works and genres with scientific insight into human emotions and aesthetic responses. Better still, universities could train a new generation of scholars who are fluent in each of the two cultures.

Although humanities scholars themselves tend to be receptive to insights from science, many policemen of the Second Culture proclaim that those scholars may not indulge such curiosity. In a dismissive review in the New Yorker of a book by the literary scholar Jonathan Gottschall on the evolution of the narrative instinct, Adam Gopnik writes, “The interesting questions about stories . . . are not about what makes a taste for them ‘universal,’ but what makes the good ones so different from the dull ones. . . . This is a case, as with women’s fashion, where the subtle, ‘surface’ differences are actually the whole of the subject.”60 But in appreciating literature, must connoisseurship really be the whole of the subject? An inquisitive spirit might also be curious about the recurring ways in which minds separated by culture and era deal with the timeless conundrums of human existence.

Wieseltier, too, has issued crippling diktats on what scholarship in the humanities may not do, such as make progress. “The vexations of philosophy . . . are not retired,” he declared; “errors [are] not corrected and discarded.”61 In fact, most moral philosophers today would say that the old arguments defending slavery as a natural institution are errors which have been corrected and discarded. Epistemologists might add that their field has progressed from the days when Descartes could argue that human perception is veridical because God would not deceive us. Wieseltier further stipulates that there is a “momentous distinction between the study of the natural world and the study of the human world,” and that any move to “transgress the borders between realms” could only make the humanities the “handmaiden of the sciences,” because “a scientific explanation will expose the underlying sameness” and “absorb all the realms into a single realm, into their realm.” Where does this paranoia and territoriality lead? In a major essay in the New York Times Book Review, Wieseltier called for a worldview that is pre-Darwinian—“the irreducibility of the human difference to any aspect of our animality”—indeed, pre-Copernican—“the centrality of humankind to the universe.”62

Let’s hope that artists and scholars don’t follow their self-appointed defenders over this cliff. Our quest to come to terms with the human predicament need not be frozen in the last century or the century before, let alone the Middle Ages. Surely our theories of politics, culture, and morality have much to learn from our best understanding of the universe and our makeup as a species.

In 1778 Thomas Paine extolled the cosmopolitan virtues of science:

Science, the partisan of no country, but the beneficent patroness of all, has liberally opened a temple where all may meet. Her influence on the mind, like the sun on the chilled earth, has long been preparing it for higher cultivation and further improvement. The philosopher of one country sees not an enemy in the philosophy of another: he takes his seat in the temple of science, and asks not who sits beside him.63

What he wrote about the physical landscape applies as well to the landscape of knowledge. In this and other ways, the spirit of science is the spirit of the Enlightenment.