A case for optimism


Are you among those who doubt that climate change is caused by humans? You have company: Half of the American populace is on your side. For them, anthropogenic causes of global warming are an illusion or possibly a hoax. But that teeming horde doesn’t include many climate scientists. Only 13 percent of these experts dispute that climate change is largely wrought by man.

What about biological evolution, an idea that’s now 157 years old? Do you think your presence on this planet is the consequence of the adaptation and change of species with time? If not, there’s no need to feel marginalized by your skepticism. Two-fifths of your countrymen figure that Homo sapiens somehow arose in its present form only 10,000 years ago. They consider it laughable to suggest that an undirected process could have produced something as wonderful and complex as themselves. However, you won’t find many biology professors in that crowd.


Perhaps you suspect that vaccines cause autism? Or that GMOs are bad for your health? What about that clumsy government cover-up of an alien saucer crash near Roswell, N.M.? Large fractions of the public consider these ideas — which run contrary to mainstream science — at least plausible. 

So what’s going on here? What’s happened to the credibility of the white-lab-coated brainiacs who were once the final authority on how everything worked? Today, many in the public regard scientists as having motives that go beyond merely sussing out nature’s machinery. They are perceived as having an agenda that threatens lifestyles as often as it improves them.

Has science become unreliable, closed-minded, or possibly even malicious? Is the public wrong in occasionally regarding science with raised eyebrows, especially when it intrudes in the most personal of ways by admonishing people that major trouble is afoot if they don’t riddle their infants with a volley of vaccines or curtail their love for large cars?

Intrusions into daily life have set up science as the bad boy for those with a liking for old-fashioned agriculture, natural medicine, or bulldozing coal from the wilds of Wyoming. The result is a significant hostility to science or, if you’re partial to expansive phraseology, an “attack on science.” This attack is as unsurprising as belly fat. Science is in the business of explaining things, and as its range of explanation continually expands, so will the societal consequences.

This is a modern phenomenon, as our regard for science has shifted considerably in the past seven decades. After World War II, science mutated from an egghead enterprise to a major engine of society. Even apart from proving itself indispensable for vanquishing present and future enemies, research was seen as a relentless promoter of a better life. What followed was a decades-long honeymoon in which scientists looked beautiful from every angle. In the 1950s, nuclear power (“our friend, the atom”) promised to supply us with electricity at a price too cheap to meter. On TV, avuncular doctors, sheathed in de rigueur lab coats, confidently assured viewers that certain brands of cigarettes were actually good for them.

Today, the haloed scientists of the past have given way to less benign models. Scientists are no longer the ultimate authorities. A prime example of this can be found in the brouhaha over childhood vaccinations. Roughly one in 10 people suspect that these vaccines cause autism. This has motivated parents (often wealthy and well educated) to avoid inoculating their kids and has been one of the few science topics discussed, albeit inaccurately, by presidential candidates. It has become a major public issue, rather than a matter of personal principle, because vaccines — like self-driving cars — offer their greatest societal benefits only if everyone participates. 

There is overwhelming evidence that discredits any link between vaccines and autism. Nonetheless, large numbers of parents choose to rank their intuition (or the testimony of movie stars) above peer-reviewed research, irrespective of the direct and occasionally lethal consequences. They distrust the scientists, who in their eyes have somehow morphed from saints to devils.

How can one understand such a monumental decline in authority? One obvious explanation is to recognize that scientists — like everyone — are fallible. They make mistakes, and occasionally cheat by manipulating or fabricating data. When this happens, when pointy-headed professors turn out to be as reliable as Ford Pintos, their transgressions become a useful cudgel for those who think that scientists are goring their ox. 

But at the risk of sounding self-serving, science seldom stays wrong for long. Science autocorrects. Nothing pleases a researcher quite so much as demonstrating that a competitor has made an error, offering the delicious opportunity to set the record (and the textbooks) straight. If your conclusions are faulty, the first one to challenge you surely will be another scientist. 

Because of this self-correction, it’s a weak argument to suggest — as anti-vaxxers and climate-change deniers often do — that the science asserted by large numbers of researchers is mistaken. That’s a precarious position, and the odds against it being right are long. It’s one attack on science that has little chance — no more than slingshots against a castle wall. 

But here’s another, more subtle explanation for the dulling of science’s luster: a widespread unease about where it’s taking us. When the Renaissance was getting underway, no one could imagine the long-term changes that the newly invented discipline of science could foster. It sowed seeds that flowered in unexpected ways. 

Consider a modern example: A century ago, when physicists were developing quantum mechanics to describe the seemingly preposterous behavior of atoms, few outside academia had obvious reason to care. Indeed, even the scientists themselves were unsure that their work was any more consequential than doing the Sunday crossword. As recently as 1940, the British mathematician G.H. Hardy declaimed that relativity and quantum mechanics were “almost as useless as the theory of numbers.” And at the time, the last was quite useless.

But that’s changed. Anyone with a cellphone owns a device that would have been impossible to build without an understanding of the non-intuitive conduct of very small bits of matter. Quantum mechanics is everywhere.

The frequent delay between research and benefit is a strong argument against politicians who feel that research must always have an obvious practical goal. Sen. William Proxmire became famous (and eventually notorious) for his Golden Fleece Award, a finger-pointing exercise directed against federally funded science he considered frivolous. Quantum mechanics certainly would have qualified.

But delay or not, there’s no doubt that the public now recognizes that the future really is being fashioned in the lab, and that research into artificial intelligence or genetics may result in discomfiting scenarios. Are white-collar workers destined to lose their jobs to ever-smarter robots? Will their grandchildren inevitably begin life as designer babies? For some people, today’s scientists are busily clearing the path to tomorrow’s nightmare. 

Inevitably, as the scope of science has grown, it has shed the benign regard in which it was held. Modern physics was once far removed from the mind of the average person, and thoroughly innocuous — until it produced the atomic bomb. Today’s science touches subjects that are big in anyone’s budget: defense, health care, and the environment. 

Despite these understandable worries, I believe that much of the contemporary distrust of science is motivated not by its occasional inaccuracies or even its unpredictable and possibly sinister outcomes, but by a very human resistance to its practitioners. 

This isn’t because scientists wear black hats, but because they deal in dark arts. If you disagree with science or its findings, it’s a tough slog to take it on. After all, researchers are armored with intellect, status, tenure, and subject matter that’s about as comprehensible to the uninitiated as the Dead Sea Scrolls.

As middle-school kids love to lament, modern science is hard. In the 19th century, there were discoveries lying around like fallen fruit just waiting to be collected by the observant and thoughtful. You could become an expert in nearly any research area with little more than an above-average intellect and a week in a decent library. This was the era of gentlemen scientists with time on their hands — “natural philosophers” sporting tweedy jackets rather than sheepskins on the wall.

That era definitely is past tense. To prove the existence of the Higgs boson required a machine, the Large Hadron Collider, that took $9 billion and more than a decade to build. About 10,000 specialists were involved. No member of the landed gentry ever would have made that discovery.

Indeed, the author list on one of the seminal papers describing the uncovering of the Higgs had 5,154 names on it. That’s more text than many research papers of a century ago, and is a good indicator of the cumulative nature of science. Knowledge builds on itself. Newton could not have understood what the Higgs discoverers found, despite the fact that his brain was undoubtedly more supple than most of theirs. 

This is in dramatic contrast to other societal endeavors, such as the arts. Books, plays, and music still are largely the work of individuals, and these individuals need not stand on the shoulders of their predecessors for much more than inspiration. Would anyone say that a modern composer, say Elton John, has totally eclipsed Mozart thanks to two centuries of progress in music? Is Wolfie no longer worth a download? Even movie-making, which today employs teams as large as those doing particle physics, is — aside from its greater technical finesse — hardly changed from its past. Would you really argue that contemporary films are fundamentally more captivating than those of the ’30s and ’40s?

Science obviously is different. As the easy stuff is mastered, cutting-edge research leads to deeper complication. As a result, it becomes less easily grasped by non-experts. While even high school students of two generations ago could appreciate the concept of atoms and picture what “splitting the atom” might mean, how many among the citizenry of today command enough science to appreciate string theory, or what problem it’s trying to solve?

The result is that those whose lives are forcibly altered by science understandably can regard it as an enemy — and its practitioners as enemy troops. The research establishment is sometimes seen as a society of bullies, emboldened by fancy degrees.

Researchers themselves often are surprised by this tendency to, in their view, blame the innocent. Scientists argue that they are entirely agnostic when reporting on the safety of GMO foods or the effects of coal-fired power plants. If there’s a fight about these things, it doesn’t include any dog of theirs. The researchers are simply calculating the odds. They never promised that their efforts would be agreeable, entertaining, interesting, useful, or beautiful. The citizenry doesn’t need to like what science tells it. In this regard, it’s unlike nearly any other activity you can name. 

In addition, scientists generally are nonplussed by accusations of cover-up or hidden knowledge imposed by fearful governments. As anyone who has worked in research knows, science is very bad at keeping secrets.

So where does this battle lead? Personally, I think it’s destined to fade with time. Millennials surely have a better understanding than their predecessors of the truth that basic research is the midwife of future technology. And just about everyone is sympathetic to the promise of improved technology — be it in their cars, medicine, entertainment, or personal electronics. This spawns a soft undercurrent of support for science. We want the goodies, so we’ll ante up for the R&D.

True, this support could be likened to a religion: In our hunger for the technology, we take the science on faith. I suspect that rather few people find the existence of the Higgs boson interesting or comprehensible enough to discuss at cocktail parties. But they have little issue with the fact that billions in tax dollars (admittedly, mostly European tax dollars) were spent to track it down.

There seems to be a historical buy-in that, because we want the fruit, we’re willing to invest in the orchard — or at least in a small grove. The budget for the National Science Foundation is 0.2 percent of the federal budget. But that expenditure hasn’t caused the citizenry to reach for their lanterns and pitchforks (although it must be noted that the amount spent on non-defense research has stagnated for the past dozen years).

So is there really a good reason to think that the attack on science is damaging our research efforts and our future? In the short term, you could argue there is. The frustrating reluctance to confront the existential problem of climate change could come back to bite us in a big way. However, and as contrary as it might sound, the failure to vigorously address this issue might be cured by a worsening of the problem itself. As pundits enjoy noting, America generally is unenthusiastic about making hard choices on problems until they’re as obvious as vaudeville humor. With 16 of the 17 hottest years on record occurring in the scant time since the new millennium began, climate change is one problem that may become dramatically manifest very soon, provoking some serious action. 

But what about the long term? Has science had its heyday in America? A perennial lament is that the public has very little understanding of science — not just the facts, but also how it works and how it decides if something is likely to be true or not.

Judging from the phone calls and emails I receive every day, you might think this lament has legs. I’m astounded by how many people are willing to accept that any bright dot of light in the night sky is convincing proof that alien spacecraft are sailing overhead, or that the Egyptians used extraterrestrial consultants to build the pyramids.

Disconcerting indeed, but I suspect these experiences are largely a selection effect: I hear only from the people who choose to get in touch with me. And what’s different today is that they can. The internet allows everyone to engage with anyone. 

What I believe is more relevant than the funky phone calls is the fact that the fraction of college freshmen who intend to major in science or engineering is substantial. Indeed, it was about one-third in 1995, and since then has increased by about 10 percentage points. This group is also far more diverse with regard to sex and ethnicity than in generations past.

As important as these metrics are, I derive the greatest encouragement from the way science is seen by our culture. Being a nerd is now a compliment, and not — as it once was — a one-way ticket to social ostracism. STEM education is valued by parents and sought for their children. TV shows and movies — which once portrayed the scientifically adept with derision — now frequently make them the heroes.

The attack on science, insofar as such aggression is real, should be resisted. But when I look at the prestigious role models that scientists, despite their complicated jobs, have become, I figure that “the kids will be alright.” The offensive against science is one attack that can be repulsed. I’m counting on the youth.  

Seth Shostak ’65 is senior astronomer at the SETI Institute.