Robert F. Kennedy Jr. has done something rare in public life: he’s made the tangled relationship between science, skepticism, and tribalism impossible to ignore.
He leans hard into his role as a rebel truth-teller, raising alarms about the risks of food dyes, the dangers of water fluoridation, and, most divisively, vaccines. It's tempting to dismiss him outright. He gets much wrong. He fuels conspiracy thinking. His mistrust of institutions can be corrosive.
Yet by championing these controversial causes, he pushes a difficult question into the spotlight:
When should scientists, or those who trust science, speak up—and when must we re-examine what we think we know?
It’s a mistake to assume everything he questions must be wrong. His popularity points to a deeper failure in science communication—a failure to explain uncertainty, admit mistakes, and let evidence evolve without looking like we’re “flip-flopping.”
Should we defend what’s long been accepted as protective, even when new evidence calls for re-examination? Should we ignore findings just because they appear on the fringe—or because we fear being mistaken for anti-science?
Carl Sagan put it well:
“At the heart of science is an essential balance between two seemingly contradictory attitudes—an openness to new ideas, no matter how bizarre or counterintuitive, and the most ruthless skeptical scrutiny of all ideas, old and new.”
Openness and scrutiny. Both are necessary. Neither is easy.
The High Wire of Public Health
In today’s public debates—about vaccines, food dyes, fluoridation—we often abandon one for the other. Some cling to the gospel of absolute safety. Others embrace suspicion and forget what good science has built.
Neither approach is scientific.
The real work—the hard work—is balance. Accept the lifesaving power of vaccines, but stay honest about rare harms. Acknowledge that fluoridation may—or may not—protect teeth, but face the evidence it harms the developing brain. And stay open to new research on food dyes.
When an exposure is voluntary, the bar for evidence is lower. People can choose. But when a public health measure is mandated or an exposure universal, the burden of proof must be higher. Population-level exposures require clear evidence of net benefit and minimal harm—especially to the most vulnerable.
Geoffrey Rose, one of the great epidemiologists of the 20th century, said it best:
“People can buy toothpaste with or without added fluoride, but if fluoride is added to the drinking water, they can hardly avoid imbibing it. We should expect a higher level of scientific evidence and popular acceptability for measures such as water fluoridation which are imposed and not chosen by the recipients.”
Speaking From the Data
Unlike many who wade into these debates, I’ve studied fluoride neurotoxicity and how toxic chemicals increase the risk for autism. I’ve read the studies, the criticisms, and the defenses.
And frankly, I’m disturbed by how many scientists make sweeping claims of safety—without carefully reviewing the latest evidence, without acknowledging credible signals of harm, and without the humility that real science requires.
The same thing happened when I was studying low-level lead poisoning. Most public health officials and physicians ignored the growing evidence. Some dismissed it because it was inconvenient. Too many simply hadn't kept up with the research.
And the consequences were—and still are—devastating. Every year of delay in strengthening lead regulations has meant another wave of children harmed, another generation with reduced potential, and another missed opportunity to prevent premature births and heart attacks. The science was there. The tools were available. What was missing was the will. We didn’t lack knowledge—we lacked the courage to act on it.
Public health loses credibility when scientists become defenders of orthodoxy instead of seekers of truth.
Skating to Where the Science Is Going
People often say, “follow the science,” as if that means deferring to whatever a government agency or professional body currently recommends. But science doesn’t live in policy documents. It lives in evidence. And evidence evolves.
Real science isn’t about clinging to old guidelines. It’s about watching the data, spotting patterns, and being willing to change course. That’s especially true in public health, where effects are complex, delayed, and politically inconvenient.
Wayne Gretzky said his secret was simple: “I skate to where the puck is going, not where it’s been.” Good science works the same way. It looks ahead. It listens for early signals. It acts to prevent harm before the system catches up—before early warnings become policy. Good science is the source of regulation, not a servant to it.
Waiting for consensus can be a way of avoiding action. "First, do no harm" means paying attention early, even when the evidence is still emerging.
What Builds Public Trust?
I believe in evidence-based, population-wide measures. I have my loyalties. Vaccines have saved millions of lives. But so could other evidence-based strategies—like a living wage and universal medical care. Public health isn’t just about shots and screenings; it’s about security, stability, and trust.
After living in Canada for the past 16 years, I’ve seen that people are far more likely to trust government—and follow its advice—when they feel supported by it. Trust isn’t built through slogans. It’s built through care. Do I receive a fair wage? Does my government ensure access to food, housing, and medical care? Or is it more concerned with protecting the rights of corporations?
Doing No Harm Means Acting Early
I also know this: widespread exposure to lead, air pollution, plastics, pesticides, and other chemicals is causing real harm—often at levels regulators still call “safe.” Too often, public health agencies downplay these risks.
As a physician, my first principle is simple: do no harm.
Mandating a population-level intervention—or permitting a universal exposure—requires not only stronger evidence of benefit, but also honesty about risks, even rare ones. It demands humility in the face of uncertainty and the courage to change course when necessary.
Before You Speak
As scientists, we should resist the pressure to speak about issues we haven’t studied closely. The best way to serve the public isn’t to speak more often—but to speak more carefully. If you haven’t thoroughly reviewed the evidence, don’t weigh in. Scientific authority is not a license to speculate. It’s a responsibility to know what you’re talking about—and to say so when you don’t.
One of the best compliments I ever received was: “He knows his limits.” In fact, the more I’ve mastered a subject—like lead poisoning—the more often I’ve found myself referring people to other experts. Knowing when to defer is part of knowing your field.
For me, offering a public opinion requires more than a quick review of the literature or a policy statement. It means investing time—usually more than 100 hours over several years—conducting studies, reviewing the literature, reading critiques, and weighing the evidence from multiple angles. That’s what I consider a reasonable foundation for speaking with authority.
Before I write or speak publicly about fluoridation or autism—both emotionally charged topics—I ask myself: Am I sure enough? Have I earned the right to speak? Can I offer something useful without pretending to know more than I do?
If the answer is yes, I try to speak carefully, humbly, and with an invitation to think together—not divide (though, in honesty, my wife says I need to work on this).
Science or Dogma?
The task isn’t to defend yesterday’s truths. It’s to stay fiercely loyal to the process that reveals new ones.
First, do no harm.
Second, respect the boundaries of your expertise.
Third—and hardest of all—never confuse loyalty to science with loyalty to authority.
That’s the difference between science and dogma.
And that’s the difference between serving the public—and protecting your beliefs.
Thank you, Arthur. A friend recently asked if they should continue focusing on fluoride research and advocacy, now that the issue has become a political lightning rod thanks to RFK Jr. I told them: follow the science, not the noise. This post was partly my answer to that question.
What would be helpful is a concise list of what regulators have gotten wrong (or worse) over the last 75 years: what they said at the time, *how* they ended up being wrong, whether there was collusion with industry, whether they muzzled academics, whether they're updating their regulatory frameworks and methodologies to account for past mistakes, and so on.
Whether you should trust them in 2025 should be an empirical question first, not an ideological one. We should start with an honest accounting of their performance:
Phthalates, bisphenols, PFAS, flame retardants, microplastics, nearly all pesticides, lead, asbestos, cigarettes, fluoride, PCBs, and on and on.