The Half-Life of Truth: Why "Settled Science" Is a Myth
Executive Summary
We are often told to “trust the science,” as if science were a finished product rather than an ongoing process. History tells a different story. From lobotomies earning Nobel Prizes to plate tectonics being mocked for half a century, scientific consensus has repeatedly collapsed under new evidence. Science advances not because it is certain, but because it is willing to be wrong. This article explores the concept of the “half-life of facts” and why skepticism, not blind faith, is the engine of real scientific progress.
In 2026, “science” is frequently treated as an authority beyond question—a neutral arbiter of truth that delivers permanent answers. But open a medical textbook from 1950, a geology text from 1920, or an astronomy guide from 1900, and you will find confident explanations that modern science now considers dangerously wrong. The past is littered with discarded certainties, many of which were defended just as passionately as today’s “settled” conclusions.
The history of science is not a smooth upward climb. It is a series of disruptive revolutions in which yesterday’s facts become today’s cautionary tales. To truly trust science is not to freeze it in place, but to recognize that it is always provisional—always subject to revision when better evidence arrives.
The Half-Life of Facts
Scientist and mathematician Samuel Arbesman introduced the idea of the half-life of facts, arguing that knowledge decays over time much like radioactive material. In medicine, for example, it has been estimated that roughly half of what a physician learns during training will eventually be proven outdated or incorrect. The catch, of course, is that no one knows which half.
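The decay analogy can be made concrete with a little arithmetic. A minimal sketch, assuming knowledge decays exponentially the way radioactive material does; the 45-year half-life used here is a hypothetical figure for illustration, not a measured one:

```python
# Illustrative sketch of the "half-life of facts" idea: if a body of
# knowledge decays exponentially, the fraction still considered correct
# after t years is 0.5 ** (t / half_life).

def fraction_still_valid(years: float, half_life: float = 45.0) -> float:
    """Fraction of facts still regarded as correct after `years` years,
    given a (hypothetical) half-life in years."""
    return 0.5 ** (years / half_life)

for t in (0, 45, 90):
    print(f"After {t:3d} years: {fraction_still_valid(t):.2f} of facts remain")
```

Under these assumed numbers, half of what was taught is outdated after one half-life and three quarters after two. The model is crude on purpose: it says nothing about *which* facts decay, which is exactly Arbesman's point.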
This does not mean science is broken. It means it is alive. Knowledge evolves as tools improve, data accumulates, and assumptions are tested. The danger emerges when evolving understanding is mistaken for final truth. “Settled science” is not a scientific concept—it is a rhetorical one. If an idea is truly settled, it is no longer science; it is doctrine.
“The danger is not that science changes. The danger is pretending that it doesn’t.”
When the Earth Wasn’t Allowed to Move
Today, plate tectonics is foundational geology. But when Alfred Wegener proposed continental drift in 1912, the scientific establishment rejected him outright. His evidence—matching fossils across oceans, mirrored geological formations, and the puzzle-piece fit of continents—was compelling, yet dismissed because he could not explain how continents moved.
For nearly fifty years, the consensus clung to increasingly strained explanations involving land bridges and immobile crusts. Wegener died before the discovery of seafloor spreading and magnetic striping finally vindicated him in the 1960s. The evidence had been there all along; the intellectual framework to accept it had not. The failure was not a lack of data, but an unwillingness to abandon entrenched assumptions.
Medicine’s Darkest Blind Spots
If any field should inspire humility, it is medicine. In 1949, the Nobel Prize in Physiology or Medicine was awarded to Egas Moniz for developing the lobotomy, a procedure that involved severing neural connections to treat mental illness. At the time, it was hailed as a miracle cure. Thousands of patients were subjected to it with the full endorsement of scientific authority.
Today, the lobotomy is remembered as a tragedy—one among many. Thalidomide was prescribed to pregnant women before its catastrophic effects were understood. Doctors once appeared in cigarette advertisements, lending credibility to a product we now know causes cancer. For centuries, the miasma theory of disease delayed the acceptance of germ theory, costing countless lives. These errors were not committed by fools. They were made by experts operating within the limits of their knowledge, bolstered by institutional confidence and social reinforcement. The problem was not science itself, but the illusion that it had reached final conclusions.
The Universe That Refused to Behave
Even physics, often seen as the most rigorous of sciences, has repeatedly been humbled by reality. At the turn of the 20th century, many physicists believed their field was nearly complete. Newtonian mechanics seemed to explain everything. Then relativity shattered assumptions about space and time, and quantum mechanics undermined the very idea of determinism.
Einstein himself resisted the implications of his equations, introducing the cosmological constant to preserve the belief in a static universe. It took Edwin Hubble’s observations of galactic redshift to force acceptance of an expanding cosmos. Overnight, the universe gained a beginning, and an entire worldview collapsed. What had been “known” was replaced—not refined, but overturned.

The Cost of Ignoring Evidence
Few stories illustrate scientific resistance more tragically than that of Ignaz Semmelweis. In the 1840s, Semmelweis demonstrated that handwashing drastically reduced maternal mortality in childbirth. His data was clear. His conclusions were rejected. Doctors refused to accept that they themselves were spreading disease. Semmelweis was ridiculed, marginalized, and died before germ theory confirmed what he had already proven.
Today, this pattern is known as the Semmelweis Reflex: the instinctive rejection of evidence that contradicts established belief. It is not a flaw of science, but of human nature operating inside scientific institutions.
Nutrition and Manufactured Certainty
For decades, official dietary guidelines warned that fat was the enemy and carbohydrates were the foundation of health. The Food Pyramid became doctrine, influencing public policy and personal behavior worldwide. Only later did evidence emerge showing that sugar industry influence had shaped early nutritional science, redirecting blame away from carbohydrates and toward fats.
The resulting health consequences—rising obesity, diabetes, and metabolic disorders—were not the result of ignorance, but of misplaced certainty. Nutrition science continued to evolve, but public messaging lagged behind evidence, trapped by institutional inertia.
Consensus Is Not Truth
Scientific consensus is a social phenomenon. Science itself is a methodological one. The two often align, but history shows they are not the same. Philosopher Thomas Kuhn described scientific progress as a series of paradigm shifts, where dominant frameworks collapse under accumulating contradictions. These shifts are rarely graceful. They are resisted, delayed, and often only accepted after generational turnover.
As the saying often attributed to Max Planck goes, “science advances one funeral at a time.” Institutions resist change because careers, funding, and reputations are built on prevailing models. This does not invalidate science—it humanizes it.
What Then? Science Without Certainty
At What Then Studio, we defend the scientific method: observation, hypothesis, testing, and falsification. But we reject scientism—the belief that current consensus represents permanent truth. When someone declares “the science is settled,” they are not speaking scientifically; they are speaking dogmatically.
History shows that today’s heretics often become tomorrow’s authorities. Progress depends on skepticism, curiosity, and intellectual humility. Knowledge has a half-life, but humility must be permanent. The danger is not that science changes—it’s pretending that it doesn’t.
FAQ: Scientific Paradigm Shifts
Q: What is a paradigm shift?
A: A paradigm shift, a term coined by Thomas Kuhn, is a fundamental change in the basic concepts and experimental practices of a scientific discipline. Examples include the move from an Earth-centered to a Sun-centered solar system, or from a static Earth to plate tectonics.
Q: Did doctors really appear in cigarette advertisements?
A: Yes. In the mid-20th century, tobacco companies featured doctors in advertisements claiming certain cigarettes were "gentler" on the throat, leveraging the authority of medicine to normalize a carcinogenic product.
Q: Is it anti-science to question the scientific consensus?
A: No. Questioning the consensus is part of the scientific method. Healthy science encourages scrutiny, replication, and falsification, while still weighing evidence and expertise appropriately.