Why the Age of American Progress Ended


The Atlantic

Invention alone can’t change the world; what matters is what happens next.

The Scourge of All Humankind

If you were, for whatever macabre reason, seeking the most catastrophic moment in the history of humankind, you might well settle on this: About 10,000 years ago, as people first began to domesticate animals and farm the land in Mesopotamia, India, and northern Africa, a peculiar virus leaped across the species barrier. Little is known about its early years. But the virus spread and, sooner or later, became virulent. It ransacked internal organs before traveling through the blood to the skin, where it erupted in pus-filled lesions. Many of those who survived it were left marked, disfigured, even blind.

As civilizations bloomed across the planet, the virus stalked them like a curse. Some speculate that it swept through ancient Egypt, where its scars appear to mar the mummified body of Pharaoh Ramses V. By the fourth century A.D., it had gained a foothold in China. Christian soldiers spread it through Europe during the 11th- and 12th-century Crusades. In the early 1500s, Spanish and Portuguese conquistadors conveyed it west across the Atlantic, where it ravaged native communities and contributed to the downfall of the Aztec, Mayan, and Inca empires.

By the end of the 1500s, the disease caused by the virus had become one of the most feared in the world. About a third of those who contracted it were dead within weeks. The Chinese called it tianhua, or “heaven’s flowers.” Throughout Europe, it was known as variola, meaning “spotted.” In England, where doctors used the term pox to describe pestilent bumps on the skin, syphilis had already claimed the name “the great pox.” And so this disease took on a diminutive moniker that belied the scale of its wretchedness: smallpox.

Over time, different communities experimented with different cures. Many noticed that survivors earned lifetime immunity from the disease. This discovery was passed down through the generations in Africa and Asia, where local cultures developed a practice that became known as inoculation—from the Latin inoculare, meaning “to graft.” In most cases, people would stick a sharp instrument into a smallpox-infected pustule to collect just a little material from the disease. Then they would stick the same blade, wet with infection, into the skin of a healthy individual. Inoculation often worked—pustules would form at the injection site, and a low-grade version of the disease would typically follow. But the intervention was terribly flawed; it killed about one in every 50 patients.

Not until the early 1700s did a chance encounter in the Ottoman empire bring the process to Britain, and bend the axis of history. In 1717, Lady Mary Wortley Montagu, an English aristocrat living in Constantinople with her husband, a diplomat, heard about inoculation from her acquaintances in the Ottoman court. Circassian women, from the Caucasus Mountains and in great demand for the Turkish sultan’s harem, were inoculated as children in parts of their bodies where scars would not easily be seen. Lady Montagu asked the embassy surgeon to perform the procedure on her son—and upon her return to London a few years later, on her young daughter.

Word spread from court physicians to members of the College of Physicians to doctors across the continent. Within a few years, inoculation had become widespread in Europe. But many people still died of smallpox after being deliberately infected, and in some cases inoculation transmitted other diseases, like syphilis or tuberculosis.

One boy who went through the ordeal of inoculation was Edward Jenner, the son of a vicar in Gloucestershire, England. He trained as a physician in the late 1700s, and carried out these rough smallpox inoculations regularly. But Jenner also sought a better cure. He was taken by a theory that a disease among cows could provide cross-immunity to smallpox.

In the spring of 1796, Jenner was approached by a dairymaid, Sarah Nelmes, who complained of a rash on her hand. She told Jenner that one of her cows, named Blossom, had recently suffered from cowpox. Jenner suspected that her blister might give him the opportunity to test whether cowpox was humanity’s long-awaited cure. The test would require a human subject: a certain 8-year-old boy. Jenner drew a blade, slick with ooze from a cowpox blister, across the arm of James Phipps, the brave and healthy son of his gardener.

After a week, young James developed a headache, lost his appetite, and came down with chills. When the boy had recovered, Jenner returned with a new blade—this one coated with the microbial matter of the smallpox virus. He cut the boy with the infected lancet. Nothing happened. The boy had been immunized from smallpox without encountering the disease.

Jenner would go down in history as the person who invented and administered a medical cure for one of the deadliest viruses in world history. Then he invented something else: a new word, from the Latin for “cow,” that would be carried down through the centuries alongside his scientific breakthrough. He called his wondrous invention a vaccine.

The Eureka Myth

Let’s pause the story here. Jenner’s eureka moment is world-famous: cherished by scientists, rhapsodized by historians, and even captured in oil paintings that hang in European museums.

For many, progress is essentially a timeline of the breakthroughs made by extraordinary individuals like Jenner. Our mythology of science and technology treats the moment of discovery or invention as a sacred scene. In school, students memorize the dates of major inventions, along with the names of the people who made them—Edison, light bulb, 1879; Wright brothers, airplane, 1903. The great discoverers—Franklin, Bell, Curie, Tesla—get best-selling biographies, and millions of people know their names.

This is the eureka theory of history. And for years, it was the story I read and told. Inventors and their creations are the stars of my favorite books about scientific history, including The Discoverers, by Daniel Boorstin, and They Made America, by Harold Evans. I’ve written long features for this magazine holding up invention as the great lost art of American technology and the fulcrum of human progress.

But in the past few years, I’ve come to think that this approach to history is wrong. Inventions do matter greatly to progress, of course. But too often, when we isolate these famous eureka moments, we leave out the most important chapters of the story—the ones that follow the initial lightning bolt of discovery. Consider the actual scale of Edward Jenner’s accomplishment the day he pricked James Phipps in 1796. Exactly one person had been vaccinated in a world of roughly 1 billion people, leaving 99.9999999 percent of the human population unaffected. When a good idea is born, or when the first prototype of an invention is created, we should celebrate its potential to change the world. But progress is as much about implementation as it is about invention. The way individuals and institutions take an idea from one to 1 billion is the story of how the world really changes.
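For concreteness, here is a minimal arithmetic sketch of that figure, assuming only the round numbers quoted above (one vaccinated person, a world population of roughly 1 billion); it is an illustration, not a calculation from the original article.

```python
# Minimal check of the "99.9999999 percent unaffected" figure, using the
# round numbers from the passage above (illustrative only).
world_population = 1_000_000_000  # roughly 1 billion people in 1796
vaccinated = 1                    # James Phipps

unaffected_share = (world_population - vaccinated) / world_population
print(f"{unaffected_share:.7%}")  # -> 99.9999999%
```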

And it doesn’t always change, even after a truly brilliant discovery. The 10,000-year story of human civilization is mostly the story of things not getting better: diseases not being cured, freedoms not being extended, truths not being transmitted, technology not delivering on its promises. Progress is our escape from the status quo of suffering, our ejection seat from history—it is the less common story of how our inventions and institutions reduce disease, poverty, pain, and violence while expanding freedom, happiness, and empowerment.

It’s a story that has almost ground to a halt in the United States.

In theory, the values of progress form the core of American national identity. The American dream is meant to represent that exception to the rule of history: Here, we say, things really do get better. For much of the 19th and 20th centuries, they did. Almost every generation of Americans was more productive, wealthier, and longer-lived than the one before it. In the past few decades, however, progress has faltered—and faith in it has curdled. Technological progress has stagnated, especially in the nonvirtual world. So have real incomes. Life expectancy has been falling in recent years.


What went wrong? There are many answers, but one is that we have become too enthralled by the eureka myth and, more to the point, too inattentive to all the things that must follow a eureka moment. The U.S. has more Nobel Prizes for science than the U.K., Germany, France, Japan, Canada, and Austria combined. But if there were a Nobel Prize for the deployment and widespread adoption of technology—even technology that we invented, even technology that’s not so new anymore—our legacy wouldn’t be so sterling. Americans invented the first nuclear reactor, the solar cell, and the microchip, but today, we’re well behind a variety of European and Asian countries in deploying and improving these technologies. We were home to some of the world’s first subway systems, but our average cost per mile for tunnel projects today is the highest in the world. The U.S. did more than any other nation to advance the production of the mRNA vaccines against COVID-19, but also leads the developed world in vaccine refusal.

At its worst, the eureka theory distorts American views of how best to push society forward, and slows material advance in the process. To appreciate the deeper story of progress—and to see how it bears on America’s own problems in the 21st century—let’s return to 1796 and recall how history’s first vaccine went global.

One to 1 Billion

After Edward Jenner verified that James Phipps was indeed protected against smallpox, he wrote a brief paper to announce his discovery. The Royal Society of London refused to publish it. His own self-published booklet, An Inquiry Into the Causes and Effects of the Variolae Vaccinae, was initially ignored by the medical community. (Jenner was both a physician and a zoologist, and his studies of cuckoo-bird behavior may have stoked suspicions that he was at best a dilettante, and perhaps something of a cuckoo himself.)

Jenner needed surrogates in the English medical field to give his wild experiments gravitas. He found one such defender in Henry Cline, an open-minded London surgeon who acquired some inoculating substance from Jenner and began conducting trials to confirm Jenner’s findings, establishing the practice as safe and reliable. The vaccine was so immediately and obviously successful that it proved self-recommending. By 1800, vaccinations had spread rapidly through Europe, in large part because so many elites supported them. The kings of Denmark, Spain, and Prussia personally promoted the vaccine. The pope called it “a precious discovery” that ought to restore the public’s faith in God.

Still, doctors faced a prodigious challenge: how to deliver the stuff around the world in an era without cold storage, airplanes, or cars. They settled on distribution methods that were, by any reasonable estimation, extremely strange and a little ingenious. In the early 1800s, Spain recruited 22 orphaned boys to bring the vaccine to the Americas on their bodies. Two boys were vaccinated immediately before their ship’s departure. When pustules appeared on their arms, doctors scraped material from them to jab two more children on board. Doctors continued this daisy-chain routine until the ship reached modern-day Venezuela, where they began using the most recent pox eruption to vaccinate people in the Americas. Without any advanced storage technology, they had managed to transport history’s first vaccine more than 4,000 miles, in perfect condition. Arm-to-arm, the vaccine traveled to Mexico, Macau, and Manila. Within 10 years of Jenner’s paper, the vaccine had gone global.

The smallpox vaccine faced popular resistance wherever it went. (In Britain, one cartoonist depicted the vaccinated as sprouting miniature cows out of their bodies.) But America’s most powerful people, including priests and presidents, typically extolled the virtues of the vaccine, having personally witnessed its benefits, which helped overcome the anti-science skepticism. Gradually, the vaccine pushed smallpox out of Europe and the U.S.

Even so, in the 1950s—some 150 years after Jenner’s discovery—1.7 billion people, or roughly 60 percent of the world’s population, still lived in countries where the virus was endemic. The major powers would often talk about finishing the job of smallpox eradication, but major technical and organizational obstacles stood in the way. Vaccination efforts still lacked funding. Outbreaks were still too difficult to track.

Then along came several heroes who belong in the pantheon of science history alongside Edward Jenner. The first is D. A. Henderson, the director of the World Health Organization’s global vaccination effort. Henderson was just 38 years old when he arrived in Geneva to lead a program to vaccinate more than 1 billion people in 50 countries within 10 years. He was put in charge of a small staff and a modest budget within the labyrinth of a global bureaucracy.

Reaching 1 billion people with limited resources required a brilliant strategy for surveilling and containing the disease. Henderson’s team invented the technique of “ring vaccination.” Rather than inoculate every person in every country, his disease detectives would look for an outbreak and vaccinate all the contacts of the affected people and anyone else in the area. And so, each outbreak was encircled by people who were immune to the smallpox virus and wouldn’t let it pass through them.
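For readers who want the containment logic spelled out, here is a minimal sketch of the ring-vaccination idea, assuming a toy contact graph; the names, data, and two-hop ring are hypothetical illustrations, not anything the WHO actually ran.

```python
# Illustrative sketch of ring vaccination (hypothetical data, not a WHO algorithm).
# Idea: when a case is detected, vaccinate the ring of contacts (and, here,
# contacts of contacts) around it, so each outbreak is encircled by immune people.
from collections import deque

def ring_vaccinate(contacts, detected_cases, ring_depth=2):
    """Return the set of people to vaccinate around the detected cases.

    contacts: dict mapping each person to a list of their contacts
    detected_cases: people with confirmed infection
    ring_depth: how many hops outward from a case to vaccinate
    """
    to_vaccinate = set()
    seen = set(detected_cases)
    queue = deque((case, 0) for case in detected_cases)
    while queue:
        person, depth = queue.popleft()
        if depth >= ring_depth:
            continue
        for contact in contacts.get(person, ()):
            if contact not in seen:
                seen.add(contact)
                to_vaccinate.add(contact)
                queue.append((contact, depth + 1))
    return to_vaccinate

# Hypothetical village: Asha is a detected case.
contacts = {
    "Asha": ["Bram", "Chen"],
    "Bram": ["Asha", "Dita"],
    "Chen": ["Asha", "Eko"],
    "Dita": ["Bram"],
    "Eko": ["Chen", "Farid"],
    "Farid": ["Eko"],
}
print(ring_vaccinate(contacts, {"Asha"}))
# Vaccinates Bram, Chen, Dita, and Eko (the two-hop ring around Asha);
# Farid, three hops away, is left alone. (Set print order may vary.)
```

The design point is the one in the paragraph above: the eradicators did not need to reach everyone, only everyone the virus could plausibly reach next.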

Above all, Henderson needed an extraordinary supply of vaccine at a cheap price with a low-cost way to administer doses to people around the world. He benefited from a timely invention that proved essential to the story of smallpox eradication. In 1965, an American microbiologist named Benjamin Rubin created a bifurcated needle, which held a tiny droplet of vaccine between two prongs, like a miniature olive fork. It allowed 100 vaccinations from a single vial (four times the previous amount) and brought down the cost of vaccination to about 10 cents a patient.

Henderson and his small army of eradicators eventually squeezed smallpox out of Africa, South Asia, and Brazil. Since October 26, 1977, no naturally occurring smallpox cases have been recorded. In 1980, the WHO announced that smallpox, which had killed about 300 million people in the 20th century alone, had finally been eradicated.

Invention Without Implementation

The end of smallpox offers a usefully complete story, in which humanity triumphed unequivocally over a natural adversary. It’s a saga that holds lessons about progress—each of which pertains to America today.

The most fundamental is that implementation, not mere invention, determines the pace of progress—a lesson the U.S. has failed to heed for the past several generations. Edward Jenner’s original vaccine could not have gone far without major assistance from early evangelists, such as Henry Cline; distribution strategies to preserve the vaccine across the Atlantic; and a sustained push from global bureaucracies more than a century after Jenner’s death.

Almost every story of progress is at least a little like this, because even the most majestic breakthroughs are typically incomplete, expensive, and unreliable. “Most major inventions initially don’t work very well,” the economic historian Joel Mokyr told me. “They have to be tweaked, the way the steam engine was tinkered with by many engineers over decades. They have to be embodied by infrastructure, the way nuclear fission can’t produce much energy until it’s inside a nuclear reactor. And they have to be built at scale, to bring down the price and make a big difference to people.”

For many decades, the American government has focused overwhelmingly on discovery rather than deployment. After World War II, Vannevar Bush, the architect of our thrillingly successful wartime tech policy, published an influential report, “Science: The Endless Frontier,” in which he counseled the federal government to grow its investment in basic research. And it did. Since the middle of the 20th century, America’s inflation-adjusted spending on science and technology, through the National Institutes of Health and the National Science Foundation, has increased by a factor of 40.

But the government hasn’t matched that investment in the realm of implementation. This, too, was by design. Bush believed, with some reason, that politicians should not handpick nascent technologies to transform into new national industries. Better to advance the basic science and technology and let private companies—whose ears were closer to the ground—choose what to develop, and how.

You could say that we live in the world that Bush built. “The federal government, through NIH and NSF, pours billions into basic science and defense technology,” Daniel P. Gross, an economist at Duke University, told me. “But for civilian technology, there has been a view that Washington should fund the research and then get out of the way.”

As a result, many inventions languish in the so-called valley of death, where neither the government nor private ventures (risk-averse and possessed by relatively short time horizons) invest enough in the stages between discovery and commercialization. Take solar energy. In 1954, three American researchers at Bell Labs, the R&D wing of AT&T, built the first modern solar-cell prototype. By 1980, America was spending more on solar-energy research than any other country in the world. According to the Bush playbook, the U.S. was doing everything right. But we lost the technological edge on solar anyway, as Japan, Germany, and China used industrial policy to spur production—for example, by encouraging home builders to put solar panels on roofs. These tactics helped build the market and drove down the cost of solar power by several orders of magnitude—and by 90 percent in just the past 10 years.

The U.S. remains the world’s R&D factory, but when it comes to building, we are plainly going backward. We’ve lost out on industrial opportunities by running Bush’s playbook so strictly. But there are other problems, too. Since the early 2000s, the U.S. has closed more nuclear-power plants than we’ve opened. Our ability to decarbonize the grid is held back by environmental regulations that ironically constrict the construction of solar- and wind-energy farms. It’s been roughly 50 years since Asia and Europe built their first high-speed rail systems, but the U.S. is almost comically incapable of pulling train construction into the 21st century. (A 2008 plan to build a high-speed rail line in California has seen estimated costs more than triple and deployment delayed by a decade, and it remains uncertain whether it can be completed as planned.)

“New ideas are getting harder to use,” the futurist and economist Eli Dourado told me. If the U.S. wanted to unleash geothermal power, we could simplify geothermal permitting. If we wanted to build the next generation of advanced nuclear reactors, we could deregulate advanced nuclear reactors. These measures would not require inventing anything new. But they would stimulate progress by making it easier to bring our best ideas into the light.

The United States once believed in partnerships among the government, private industry, and the people to advance material progress. The Lincoln administration helped build the railroads. The New Deal helped electrify rural America. Dwight Eisenhower signed the Price-Anderson Act, which guaranteed government funds and limited liability for nuclear-energy firms in case of serious accidents, facilitating the construction of nuclear-power plants. John F. Kennedy’s space ambitions made NASA a major consumer of early microchips, which helped reduce their price by a factor of 30 in a matter of years, accelerating the software revolution.


“And then, around 1980, we basically stopped building,” Jesse Jenkins, who researches energy policy at Princeton, told me. In the past 40 years, he said, the U.S. has applied several different brakes to our capacity to build what’s already been invented. Under Ronald Reagan, the legacy of successful public-private partnerships was ignored in favor of the simplistic diagnosis that the government was to blame for every major problem. In the ’70s, liberals encouraged the government to pass new environmental regulations to halt pollution and prevent builders from running roughshod over low-income neighborhoods. And then middle-class Americans used these new rules to slow down the construction of new housing, clean-energy projects—just about everything. These reactions were partly understandable; for example, air and water pollution in the ’70s were deadly crises. But “when you combine these big shifts, you basically stop building anything,” Jenkins said.

To understand how we could do better, it’s useful to compare the story of the first global vaccine to the story of the latest one.

Warp Speed

In April 2020, as COVID was circumnavigating the globe and demolishing normalcy everywhere, The New York Times published an article titled “How Long Will a Vaccine Really Take?” Although Trump-administration officials aimed to unveil a COVID vaccine within 18 months—that is, by the fall of 2021—the journalist Stuart Thompson reminded readers that the shortest time in history for developing a new vaccine was four years. “The grim truth,” he wrote, “is that a vaccine probably won’t arrive any time soon.” But then it did. The first mRNA vaccines were administered before the end of 2020.

The COVID vaccines underline a second lesson from the smallpox story. Some technology myths make it seem like progress is exclusively the work of geniuses, untouched by the grubby hands of politicians and bureaucrats. But a rogue cadre of inventors didn’t eradicate smallpox. States did. Agencies did. Progress is often political, because the policy decisions of states and international organizations frequently build the bridges between discovery and deployment.

The story of the mRNA vaccines can be traced back to the ’90s, when the Hungarian-born scientist Katalin Karikó began her research on the pharmaceutical potential of mRNA, a small but mighty molecule that tells our cells what proteins to make. Her work, along with that of her fellow University of Pennsylvania researcher Drew Weissman, gradually raised our mastery of mRNA to the point where it could be deployed for a vaccine. In early 2020, within 48 hours of receiving the genetic sequence of the coronavirus, Moderna had prepared its COVID-vaccine recipe, and BioNTech, a German firm that later partnered with Pfizer, had designed its own vaccine candidate.

These technological breakthroughs, building on decades of basic research, were themselves miracles. But alone, they weren’t enough. The U.S. also needed a policy miracle—a feat of bureaucratic ingenuity that would make, distribute, and administer novel vaccines with record-breaking efficiency. We got just that with Operation Warp Speed, which belongs with the Apollo program and the Manhattan Project as one of the most important technology programs in the history of modern federal policy. It likely saved hundreds of thousands, if not millions, of lives.

From the beginning, Warp Speed’s job seemed nearly impossible. To create the fastest vaccine program ever, officials had to essentially map out the entire journey of a new therapy—from research and clinical trials to regulatory approval and distribution—and turn this obstacle course into something like a glide path. They invested in both traditional and mRNA vaccine approaches, paid up front for clinical trials, and placed billions of dollars in advance orders to urge pharmaceutical companies to move as fast as possible. When Moderna needed more manufacturing facilities, Warp Speed provided funding for additional factory space. When the government identified a shortage of the special material that mRNA vaccines require for ultracold transport, Warp Speed granted $347 million to SiO2 and Corning, two manufacturers of glass vials. And because standard vaccine approval from the FDA can take years, the program’s leaders allowed vaccine makers to proceed with emergency use authorizations to speed up the review process.

“The single most important thing that Operation Warp Speed did was to provide a whole-of-government urgency” to the goal of rapid deployment, Caleb Watney, a co-founder of the Institute for Progress, told me. “Getting everything right meant you needed to make a million correct decisions in the right order.” If the government had bet only on traditional vaccine technology, we would have had no mRNA therapies. If the government hadn’t done extensive supply-chain mapping in the summer of 2020, the initial vaccine rollout might have taken months rather than weeks. And if the government hadn’t bought out vaccines from the pharmaceutical companies, they wouldn’t have been free to consumers. But because Operation Warp Speed did all of this, the vaccines were expeditiously approved, manufactured, and distributed at no cost to the public.

Warp Speed was a special case, essentially a wartime policy applied to a health crisis. Few people would recommend such an aggressive approach for developing ordinary consumer technology. And the government is certainly capable of making bad choices as to exactly what technology to develop, and how. But while too much government action on this front can waste money, too little can waste time and even lives, stymieing possible breakthroughs. Warp Speed showed that smart government action can accelerate discovery and deployment. Just as significant, it showed that the kinds of bets the government can place, such as FDA reforms, don’t necessarily involve spending any money at all.

Here’s a thought experiment: Let’s imagine what an Operation Warp Speed for cancer prevention would look like. It might include not only a larger cancer-research budget, but also a search for regulatory bottlenecks whose elimination would speed up the approval of preventative drugs that have already been developed. According to Heidi Williams, the director of science policy at the Institute for Progress, from the time the War on Cancer was announced, in 1971, until 2015, only six drugs were approved to prevent any cancer. This reflects an enormous gap in clinical trials: From 1973 to 2011, nearly 30,000 trials were run for drugs that treated recurrent or metastatic cancer, compared with fewer than 600 for cancer prevention. How could this be?

You could start by blaming the U.S. system of patents and clinical trials, Williams told me. If a company discovers a drug that, when used by younger adults, prevents colon cancer in middle age, it could still take decades to gather long-term data from clinical trials. At that point, the patent on the original discovery might have expired. Reforming trials for preventative drugs and for early-stage disease therapies “might be the single highest-value thing we could do for biomedical research in the U.S.,” Williams said. The FDA already approves heart-disease treatments, such as beta-blockers, by looking at patients’ cholesterol levels rather than waiting for full mortality results. The agency could similarly establish short-term proxies for approving drugs that prevent cancers, Williams said. Or we could change the law so that the patent clock on cancer-prevention treatments didn’t start ticking until the pharmaceutical company first started selling the drug. As with Warp Speed, these policies could accelerate the development of lifesaving medication without spending a taxpayer dime on research. The key is adopting a more aggressive problem-solving approach, with the ends in mind.

One regrettable feature of history is that it sometimes takes a catastrophe to fast-forward progress. The U.S. directly advanced airplane technology during World War I; radar, penicillin manufacturing, and nuclear technology during World War II; the internet and GPS during the Cold War; and mRNA technology during the pandemic. A crisis is a focusing mechanism. But it is up to us to decide what counts as a crisis. The U.S. could announce a Warp Speed for heart disease tomorrow, on the theory that the leading cause of death in America is a national crisis. We could announce a full emergency review of federal and local permitting rules for clean-energy construction, with the rationale that climate change is a crisis. Just as it did in the ’60s with smallpox, the U.S. could decide that a major disease in developing countries, such as malaria, deserves a concerted global coalition. Even in times without world wars and pandemics, crises abound. Turning them into national priorities is, and has always been, a political determination.

A Question of Culture

Operation Warp Speed was ingenious, admirable, and wildly successful. But despite all that, it was not enough.

Having overcome the hurdles of scientific breakthrough, technological invention, and rapid distribution, the mRNA vaccines faced a final obstacle: cultural acceptance. And the skepticism of tens of millions of American adults proved too much for the vaccines to overcome. This is the third lesson of the smallpox story—culture is the true last-mile problem of progress. It doesn’t matter what you discover or invent if people are unwilling to accept it.


In 2021, the U.S. took an early global lead in vaccine distribution, thanks to the accelerated development of vaccines under President Donald Trump and their timely delivery under President Joe Biden. By April, we had distributed more shots per capita than almost any other country in the world. But by September, according to one estimate, the U.S. had fallen to 36th in national vaccination rates, behind Mongolia and Ecuador. The problem wasn’t supply, but demand. Tens of millions of American adults simply refused a free and effective vaccine in the middle of a pandemic.

Michael Bang Petersen, a Danish researcher who led a survey of attitudes in Western democracies about COVID-19, told me that America’s history of vaccine skepticism—and of conspiracy theories surrounding vaccines—of course predates the coronavirus pandemic. And although American vaccine resistance has several sources, including the cost of some vaccines and our legacy of medical racism, Petersen told me that one of the most important factors today is “the level of polarization between Democratic and Republican elites.” Vaccine rejection remains higher among Republican adults than any other measured demographic, including age, education level, gender, and ethnicity.

In the 19th century, state and church leaders across Europe and the Americas typically praised the smallpox vaccine in unison. But in the 21st century, a dwindling number of subjects enjoy such universal elite endorsement. Despite the historical assumption that moments of tragedy bring a country together, the pandemic efficiently sorted Americans into opposing camps—for and against lockdowns, for and against vaccines. Nearly 90 percent of Americans told the Pew Research Center that the pandemic has made the country more divided.

Americans are deeply polarized; that much is obvious. Less obvious, and more important for our purposes, is how polarization might complicate material progress today. One big problem the country faces is that as coastal, educated elites have come to largely identify as Democrats, Republicans have come to feel ignored or condescended to by the institutions populated by the former group. As if recoiling from the rise of a liberal scientific and managerial class, the GOP has become almost proudly anti-expertise, anti-science, and anti-establishment. Cranks and conspiracy theorists have gained prominence in the party. It is hard to imagine scientific institutions flourishing within right-wing governments averse to both science and institutions. But this is only part of the problem, culturally speaking.

The other part is that some Democrats—many of whom call themselves progressives—have in meaningful ways become anti-progress, at least where material improvement is concerned. Progress depends on a society’s ability to build what it knows. But very often, it’s progressives who stand against building what we’ve already invented, including relatively ancient technology like nuclear power or even apartment buildings. Cities and states run by Democrats have erected so many barriers to construction that blue metro areas are now where the housing crisis is worst. The five states with the highest rates of homelessness are New York, Hawaii, California, Oregon, and Washington; all are run by Democrats. Meanwhile, it is often left-leaning environmentalist groups that use onerous rules to delay the construction of wind and solar farms that would reduce our dependency on oil and gas. The left owns all the backpack pins denouncing the oil industry, but Texas produces more renewable energy than deep-blue California, and Oklahoma and Iowa produce more renewable energy than New York.

One possible explanation is that progressives have become too focused on what are essentially negative prescriptions for improving the world, including an emphasis on preservation and sacrifice (“reduce, reuse, recycle”) over growth (“build, build, build”). At the extreme, this ascetic style leads to calls for permanent declines in modern living standards, a philosophy known as “degrowtherism.” The aim is noble: to save our descendants from climate change by flying less, traveling less, buying less, and using less. But it is a profound departure from progressivism’s history, which is one of optimism about the ability of society to improve lives on a big scale through bold action. It’s self-defeating to tell voters: “My opponent wants to raise your living standards, but I promise I won’t let that happen.” It’s far better—and, arguably, more realistic—to tell voters that building more renewable power is a win-win that will make energy cheaper and more abundant.

When you add the anti-science bias of the Republican Party to the anti-build skepticism of liberal urbanites and the environmentalist left, the U.S. seems to have accidentally assembled a kind of bipartisan coalition against some of the most important drivers of human progress. To correct this, we need more than improvements in our laws and rules; we need a new culture of progress.

The Trust Gap

A famous theme in American history is adaptability, and justifiably so. When something isn’t working, we’ve typically been game to try something new. In the summer of 2022, Biden signed a series of laws, including the CHIPS and Science Act and the Inflation Reduction Act, that included hundreds of billions of dollars for building microchips, solar panels, electric cars, and infrastructure, green and otherwise. In an address touting this approach, Treasury Secretary Janet Yellen branded it “modern supply-side economics.” Contrasted with the Reagan-era phrase, which referred to cutting taxes to stimulate the economy, her speech focused more on direct investments in American manufacturing and improving America’s ability to build what it invents. In October, Brian Deese, a senior adviser to Biden, announced the administration’s plans to deliver a modern industrial strategy that would help “spur mature technologies to deploy more quickly [and] pull emerging innovations to market faster.”

No one can say for sure how well Biden’s specific plans will work—and a decade from now, critics will undoubtedly find particular initiatives that failed or wasted money. Still, we might be moving from the eureka theory of progress to an abundance theory of progress, which focuses on making our best ideas affordable and available to everyone. Overall, this new direction of federal policy seems promising.

Still, it doesn’t solve the problem of cultural unreadiness for progress, a problem that afflicts the left and right differently, but that ultimately comes down to trust. Every form of institutional trust is in free fall. Fewer than half of Republicans say they have faith in higher education, big businesses, tech firms, media, the entertainment industry, and unions. Among Democrats, too, confidence in government has declined. Why is social trust so important to progress? In a country where people don’t trust the government to be honest, or businesses to be ethical, or members of the opposite party to respect the rule of law, it is hard to build anything quickly and effectively—or, for that matter, anything that lasts.

One of the most important differences between invention and implementation is that the former typically takes place in private while the latter is necessarily public. The first practical silicon-solar-cell technology was developed in a corporate lab in New Jersey. Building a solar farm to generate electricity requires the sustained approval of officials and local residents—in other words, it requires people to genuinely believe that they will benefit, at least collectively, from changes to their lived environment.

I want to tell you that there is a simple agenda for restoring trust in America, but I don’t think I can do that. When discussing barriers to the construction of nuclear-power plants or the pace of drug development, one can play the part of a bottleneck detective—identifying obstacles to progress and working to overcome them through clever policy tweaks. But Americans’ growing mistrust of institutions and one another is rooted in the deepest hollows of society: in geographical sorting that physically separates liberals and conservatives; in our ability to find ideological “news” that flatters our sensibilities but inhibits compromise.

In 2022, the medical journal The Lancet published an analysis of which variables best predicted the rates of COVID infection across 177 countries. Outside wealth, one of the most powerful variables was trust in government among the public. “Trust is a shared resource that enables networks of people to do collectively what individual actors cannot,” the authors of the Lancet paper wrote. When I first read their definition, I stared at it for a while, feeling the shock of recognition. I thought of how much that could serve as a definition of progress as well: a network of people doing collectively what individual actors cannot. The stories of global progress tend to be the rare examples where science, technology, politics, and culture align. When we see the full ensemble drama of progress, we realize just how many different people, skills, and roles are necessary.

The last needle to be applied against smallpox, before its eradication almost half a century ago, carried a dose of vaccine smaller than a child’s pupil. Four hundred years fit inside that droplet. The devotion of D. A. Henderson’s disease-eradicating team was in it. So were the contributions of Benjamin Rubin and the Spanish boys, as well as the advocacy of Henry Cline and the discovery by Edward Jenner, and before him the evangelism of Lady Montagu, and the influence of Circassian traders from the Caucasus Mountains, who first brought the practice of inoculation to the Ottoman court. An assembly line of discovery, invention, deployment, and trust wound its way through centuries and landed at the tip of a needle. Perhaps there is our final lesson, the one most worth carrying forward. It takes one hero to make a great story, but progress is the story of us all.


This article appears in the January/February 2023 print edition with the headline “The Eureka Theory of History Is Wrong.”

Derek Thompson is a staff writer at The Atlantic and the author of the Work in Progress newsletter.

