How to cover environmental exposure studies
It’s a genre of medical study-based story all too familiar to health/science journalists: “a new study found X chemical in [common household product or food]” — the “scare” story that launches a thousand sensationalist headlines.
Sometimes it’s chemicals, and other times it’s pathogens, such as stories decrying how much bacteria is on the typical person’s cell phone (apparently ten times as much as on a toilet seat).
The result is the same: millions of people worrying about developing a condition or contracting an infectious disease due to exposure to something in their environment.
As familiar as these stories are, I find them to be among the hardest and most problematic to cover responsibly. And frankly, they never get easier, no matter how experienced a journalist you are.
A somewhat recent example was the study that identified several heavy metals in tampons. Some articles did a great job of putting the findings in perspective, while others simply prompted fear and confusion.
I wrote about what that study can teach us about informing audiences on the purpose of a study — in that case, to “spot check” tampons for the presence of chemicals to determine the need for a more rigorous exploration. This tip sheet extends more broadly to discuss best practices specifically for covering environmental exposure studies. There’s not necessarily a “right” way to cover these studies, but there are certainly pitfalls and ways to improve coverage.
Coverage of environmental exposures matters
First, it’s important to point out just how vital the role of good journalism is in reporting these stories and the challenge of finding a balance between the two extremes. On the one hand, there are overblown freak-outs, like formaldehyde in baby shampoo or glyphosate in any of countless products (the most recent was Cheerios and Quaker Oats). Some of the blame for these goes to editors, or at least headline writers, more than journalists, as retired science writer Jennifer Jackson Cox pointed out.
“I believe good journalists are trying their best to cover these important issues, but the publications encourage editors to write ridiculous headlines, including some that have very little in common with the actual article as written,” Cox said. “The competition for the attention of readers seems to have driven editors to turn responsible journalism into clickbait. Everything seems to require a ‘hair’s on fire’ quality that undermines solid information. More and more I’m finding that even the trusted media outlets are embracing this tactic.”
On the other hand, science journalist Jenny Morber pointed out that the fear “comes from a real concern that leaders and lawmakers and corporations are not trustworthy and don’t have the public’s best interests in mind.”
After all, there’s a long list of substances that were previously thought of as innocuous that we now know cause substantial health harms: BPA, PFAS, lead, microplastics, asbestos, ultra-processed foods, ethylene oxide… the list goes on. And there’s plenty of evidence that industries lobby to hide or minimize known health threats, as the tobacco, sugar and alcohol lobbies have.
For every “heavy metals in tampons” or “formaldehyde in baby shampoo” story, there’s a Flint, Mich., with lead in its water or a lead in turmeric story. And there are bigger picture stories looking at whole categories of problematic exposures, such as air pollution from climate change or ultra-processed foods.
Then there’s the great in-between, where many stories fall. When I first wrote about arsenic in wine in 2015, we didn’t know how much to worry. It’s since become clear that wine drinkers need not obsess over arsenic exposure (though there are plenty of other reasons to curb wine consumption). But there’s still the big picture to consider. “There is no good way for us right now to be able to calculate accumulated risk from all these things,” Morber pointed out.
Journalists play a big role in which risks get amplified versus downplayed, for better or worse. We are best poised to help the public determine what they should and shouldn’t freak out about. People can’t live their lives being anxious about everything they use in all parts of their lives, but it’s precisely because there are very real environmental risks that people need to take seriously that journalists need to ensure the public knows when not to panic.
Here are tips for doing exactly that.
Ignore the press release. Read the study. This tip hopefully goes without saying; former AHCJ president Ivan Oransky likes to say that relying on a press release to report findings is “journalistic malpractice.” But Lisa Gill, a health and medicine investigative reporter at Consumer Reports, puts it even more bluntly: “The culprit behind this concept that every single scientific or medical study should translate to consumer actionable advice is actually the fault of most press officers and media relations people hired to amplify the findings because those researchers are looking for bigger and more research dollars.”
But that doesn’t serve the public interest. As journalists, our job isn’t to help those folks — it’s to accurately convey what was really found and what it really means. Maybe we should even be calling those press people out, at times, when they’ve directly contributed to a panic or exaggerated story lines.
Describe the purpose of the study. I already covered this in detail here, but it’s worth reiterating that helping consumers understand a study’s purpose helps them understand the research better and improves overall scientific and media literacy.
Distinguish between public health and individual health. Again, I discussed this distinction in my prior post, but here’s the main takeaway: Some findings suggest the need for regulatory or industry-led change (public health). Others require individuals to change behavior or otherwise take action (individual health). Others involve a bit of both.
Many public health issues are “well beyond what an individual consumer has control over,” Northwell Health environmental scientist and occupational medicine physician Kenneth Spaeth told me. Clarifying where an exposure or issue falls along the public-individual health continuum can help consumers understand what actions, if any, they might want to take.
Pay attention to the strength of the study’s findings and the magnitude of the effect. Gill noted that “not every research study is powered to provide advice or to wag a finger or be a piece of advocacy.” If the study provides evidence of a dose response, what’s the effect size? (The tampon study reported no dose effect and wasn’t powered to do anything besides detect whether some tampons contained metals.) AHCJ’s Medical Studies topic area has a wealth of resources to help journalists become better at reading and interpreting studies.
Be clear about what’s known and what’s not — and what each means.
It takes a long time to gather enough research from enough different studies to understand most risks well enough to know what to do about them. Help your audience understand that and where researchers (and possibly policymakers and industry leaders) are in the process.
“Journalists need to connect these dots and also acknowledge in stories that there are still a lot of unknowns,” Jyoti Madhusoodanan, an independent journalist and AHCJ Civic Science Fellow, said. “This is new territory; there’s a lot of ‘look at this cool/terrifying thing!’ But what it means — for health risks, for whether the regulatory bodies we have are doing their job, for what change is needed — and who should do what about it, are key pieces of context.”
You don’t need to describe the entire scientific process, but you can reinforce what it involves — and where we are in it — by considering these questions:
1. Is there reliable, verifiable evidence of absorption, ingestion, inhalation or other entry into the human body? In the case of the tampon study, we don’t even know if people using tampons absorb any of the metals detected.
2. Is there reliable, verifiable evidence that presence of this substance in the human body causes harm in amounts that are being absorbed/ingested/inhaled and via that route of entry? Concerns about formaldehyde in baby shampoo often neglected to consider two points: first, that the amount in shampoo was too small to have any carcinogenic effect, even with multiple exposures, and second, that formaldehyde’s carcinogenic effects come from inhaling it, not skin contact. It can cause skin irritation in some people, which is important to mention, but that’s a far cry from cancer worries. But the risk related to route of exposure will vary by substance. While skin contact with formaldehyde in baby shampoo is not a risk factor for cancer, there is evidence showing increased risk of ovarian cancer from skin contact with talcum powder — including that used with sanitary napkins or tampons. With that substance, skin contact as a route of exposure does carry the potential for harm.
3. Can you quantify the level of harm or risk in ways that consumers can use to assess their own personal risks and benefits? How many thermal receipts would you have to handle to absorb enough to have potential adverse effects? At what percentage of a diet do ultra-processed foods have adverse health outcomes? How many years of inhaling fumes with formaldehyde are associated with how much increased risk of cancer?
4. Is there a regulatory agency or accountability system in place to address exposures or their effects? If so, is it being used or enforced? And/or, does it need to be expanded, updated, or otherwise modified, or is it good as is? If there’s not a system in place, should there be? If so, at what jurisdictions or levels should policy change occur, and is it being discussed?
5. If there is NOT (yet) verifiable evidence of absorption, ingestion, inhalation or other entry into the human body, is there a reasonable likelihood or biological plausibility for entry into the human body in amounts that could theoretically be concerning as a health risk? If so, explain in clear, plain terms what evidence does and does not exist for that possibility.
6. If not, what other theoretical risks or concerns might be worth mentioning, if any? Is this something that could harm the environment? Could it harm animals? Could it harm only certain vulnerable members of society in certain circumstances? Consider, for example, the risk to immune-compromised people of opportunistic infections from bacteria that’s harmless to most people. All that bacteria on our cellphones might be harmless to most of us, but perhaps not to those with weakened immune systems.
7. If there is reliable evidence of entry into the human body, but the risks/harms have not been established, be clear about what we do and don’t know. Is there animal research? Have the doses been adjusted to account for differences between the animal and humans? How substantial is the possible harm? How likely is the risk of that harm? (Make sure to explain the difference between the harm (hazard) and the risk — see this infographic and these great examples.)
8. If the exposure or the harm is outside the control of everyday individuals, consider including what they can control. For example, when I wrote about the link between microplastics and heart disease earlier this year, some sources suggested ways to reduce microplastic exposure, but another pointed out that diet, exercise, and other lifestyle behaviors likely have a far greater impact on heart disease risk and are within people’s control.
Place the risk in context.
Provide examples, ideally with units people can understand or using metaphors or analogies, to help quantify what the risk is and how people can estimate exposure to it. Sometimes the best units are everyday items, such as “bathtubs” or “fist-sized.” In OBGYN Jen Gunter’s excellent piece on the tampon study, she wrote “that a human could never use enough tampons in a day to get anywhere near the amount of arsenic or cadmium the EPA allows in a single bottle of water.”
Putting exposures in context also means considering cumulative exposure, both from duration of exposure and from multiple sources. People may be able to reduce some exposure even if they cannot reduce all of it. For example, heavy metals, such as lead, arsenic, mercury and cadmium, can be present in a wide range of foods, including rice and rice products, baby food, wine, seafood, spices, and chocolate. Check out how well Alice Callahan put heavy metals in chocolate into context here.
When reporting on research about one product, you should address the cumulative exposure from others and include expert quotes or cite recommendations from agencies or medical organizations that offer consumers advice in protecting themselves from that cumulative exposure.
Seek out a balanced and diverse slate of expert sources with genuine expertise.
I often tell non-journalists that I’m not an expert in topics I report on, but I am an expert in finding experts for those topics. It’s important to find people who truly have deep knowledge and/or experience in the specific exposures you’re writing about, ideally independent sources without conflicts of interest — both industry-related COIs and ideological or other non-financial COIs.
Also be aware that not all COIs are fully reported in studies, and most “expire” (or “sunset”) eventually, such that researchers aren’t expected or required to report conflicts that are more than, say, three or five years old. When I was researching independent sources for potential risks of artificial sweeteners — which was incredibly difficult — I searched PubMed for every paper each potential source had been involved in, going back decades. A remarkably high number of currently “independent” sources were involved in the studies that led to FDA approval of sweeteners over two decades ago. They may not be “conflicted” at the moment, but I wouldn’t consider them independent.
I’m also careful about experts who may not be linked to industry but who might have ideological biases that can inappropriately inflate or downplay risks. “You can find experts who will affirm whatever narrative you bring,” Spaeth said.
Take your time.
There’s admittedly a lot to consider in reporting on environmental exposure studies — which is why I consider them among the most difficult to cover. Doing so responsibly and accurately can be overwhelming and time-consuming in ways that rarely conform to deadlines. But it’s better to be right than to be first. Sometimes the best response to an editor’s impatience is to suggest a second-day story where you not only get the reporting right, but you can point out how others may have gotten it wrong.
Other resources
link