Risk Reporting 101

What journalists should know about hazards and exposure

During my years as a daily TV journalist in Boston, I covered a seemingly endless string of risks: from the run-of-the-mill threats like car crashes and plane crashes and fires and crime, to artificial sweeteners (yes, I’m that old) and air bags and silicone breast implants and the “new” epidemic of child abductions, to a depressingly rich litany of environmental risks. I tried to do it well, and won a bunch of awards. Then I left TV news and joined the Harvard School of Public Health, and discovered a lot of details about risk that would have made me a better reporter had I known about them back when I was reporting.

Risk is more than just a number…one in a million. It’s more than just saying some scary thing is out there…“Suspected Carcinogen in Coffee”. There are important aspects to risk that I never provided my viewers because I never knew what questions to ask. The people who depended on me for the information that would help them make healthy choices were disserved, and possibly even harmed, by my failure. I wasn’t alone, of course. Plenty of my colleagues in broadcast and print—the best of them—did the same thing. Our reporting was inherently deficient because we just didn’t know that there are important details without which a story about a risk is simply incomplete. This still happens all the time, even at the very best news organizations.

So in the hope of contributing to better journalism, here are some basic “Who, What, When, Where, Why, and How” questions to answer about risk that will help journalists cover these stories more thoroughly and give readers/viewers/listeners all the information they need to know just how risky something may actually be. (And, by the way, none of the details described below is complicated or hard to understand, and none takes more than a sentence or so to squeeze into any story.)

For something to actually be a risk, you need two elements: a hazard and exposure to the hazard. Either component creates the potential for risk and is therefore a story by itself, but an actual risk to people only exists if there is some hazard and exposure to it. A poisonous snake is hazardous, but not a risk if it’s in a cage and we’re not exposed. A snake on the loose to which we are exposed is not a risk unless it’s poisonous. That something is a hazard is a story, of course. That we are exposed to some worrisome thing is, too. But for there to be an actual risk, you have to have both. If you are reporting on just the hazard (“Substance X May Cause Cancer”) or just the exposure (“Trace Amounts of Human-made Chemicals in our Blood”), readers won’t know whether they are actually at risk.

Hazard and exposure both have critical details. Here are the biggies regarding hazard:

Hazard - How much? Dose matters, yet you’d be astonished how often that information never shows up in risk stories. Stories often say something like “Substance X causes Y,” but they fail to say how much of substance X it takes. Sometimes there is no bright line (a specific dose above which there is hazard and below which there isn’t), and the best science can come up with is frustratingly vague “guidelines.” Sometimes there’s a threshold dose below which the hazard isn’t hazardous, and in most cases the greater the dose above that threshold, the greater the risk. But not always. Sometimes small doses are believed to be the riskiest (endocrine disruptors), and with things that cause cancer, the standard scientific assumption is that any dose is potentially hazardous. Then there’s the newer toxicology that has found that some things that are hazardous at high doses may actually be good for you at low ones (even carcinogens!). In any event, a story without information about dose is missing a basic fact the reader needs.

Hazard - What does it do, and when? It’s amazing how many stories on risky things don’t include these basic facts. Is it fatal? How? If it’s not fatal, does it cause severe or minimal harm (food poisoning stories often fail to include this), treatable damage (exposure to TB) or untreatable damage (birth defects)? Even some carcinogens produce treatable harms (a frequent outcome of radiation exposure is thyroid cancer, which is highly treatable, an important fact that rarely showed up in stories about the radiation risk from Chernobyl). If the hazard is fatal, does it kill you right away or later? (Asbestos causes lung cancer or mesothelioma, but it’s usually decades before those diseases show up.) These are important details the reader needs in order to judge the risk.

Hazard - To whom? Stories about air pollution alerts usually describe the increased risk for people with compromised respiratory systems. That sort of subgroup distinction exists for most risks, but it is often ignored. There are almost always subgroups at greater risk than others, and there are often large groups for whom the hazard isn’t hazardous at all. (Mercury is regulated because studies have linked it to small cognitive impairments in the developing fetuses of pregnant women who eat lots of mercury-containing fish. It’s much more a risk to that group than to anyone else, yet it is stunning how often mercury stories fail to make that distinction.) Key subgroups to ask about include age (75 percent of all cancers occur in people fifty-five or older), health status, gender, and location.

Key details on exposure include:

Exposure - At what dose? (Again.) Since the dose determines how hazardous something is, you also have to tell the reader the dose to which they are actually being exposed. Is it enough of the bad thing to worry about? (Stories on radiation risk usually leave this out. Mercury stories, too.)

Exposure - By what route? Do you inhale it, ingest it, or get it on your skin? Plutonium particles can’t penetrate clothes or skin, but breathe them in and they’ll kill you. Radon is dangerous if you inhale it, but not if you swallow it dissolved in the water of an underground well drilled through uranium-rich bedrock. The risk depends in part on the route by which people are exposed.

Exposure - Over what period of time? Is a single exposure harmful (asbestos, contaminated food), or does it take chronic exposure (PCBs ingested in food, alcohol)? It’s easy, and vital, to make this detail clear.

Exposure - At what age? Fetal and infant exposure to lead and mercury (and lots of other things) is far more harmful than the same exposure in older kids and adults. Intense sunburn before age eighteen raises the risk of adult skin cancer. This is an important yet often missing detail.

The factors listed above pertain mostly to environmental risks stories. But during my few years as a director of risk communication at the Harvard Center for Risk Analysis, I would often get calls from journalists on a wide range of risks—environmental, medical, transportation, occupational. They always asked the same question: “What’s the risk of…?” And what they wanted was a number. “The risk is one in X.” Well, it turns out that with risk numbers, too, there are really important details you need to understand. (Don’t worry, no math required.)

The problem, basically, is that no one number tells the story. Let’s take the risk of flying in airplanes. (I wrote about this for a NOVA documentary on plane crashes.) The annual risk of being killed in a plane crash for the average American is about one in 11 million. That’s simply the number of people who die in plane crashes per year, divided into the total population. But how simplistic and meaningless that is! There is no average American. Some people fly more, some fly less, and many don’t fly at all. And there is no average year for flying deaths. Imagine if you used 2001 for your baseline. (Or even if you used the plane crash deaths for ten years, and one of them was 2001.)

Think about how this applies to other risks. Crime in a city is usually much greater in some areas than others, and at some times more than others, so a citywide average and a one-year average are meaningless. Child abduction is more often committed by relatives than by strangers, so dividing the total number of abductions (262,000) into the total number of kids gives an inaccurate picture of the kind of abduction that usually makes the news: abduction by strangers, which numbers only about 115.

The point is that you have to be more precise about the population you are using. (For math lovers, that’s the denominator. For the math-challenged like me, that’s the number at the bottom of a fraction.) If you want to say the risk is one in X, you have to think carefully about the X. Take flying, for example. You can divide the number of people who die into the total number of people, which gives you the risk for the average person. You can divide the number of victims into the total number of flights all passengers took, which gives you the risk per flight. Or you can divide the number of victims into the total number of miles all passengers flew, which gives you the risk per mile. It turns out that most planes crash on takeoff or landing, so someone who takes lots of short flights runs more risk than someone who covers the same miles in a few long ones; the per-flight number captures that, and the per-mile number hides it. That’s a critical distinction.
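To make that concrete, here is a minimal sketch in Python. Every figure in it is an invented placeholder rather than a real aviation statistic; the point is only that the same death toll, divided into three different denominators, produces three very different “one in X” numbers.

```python
# All figures below are hypothetical placeholders, not real aviation statistics.

deaths_per_year = 50                # hypothetical: passenger deaths in crashes in one year
population = 330_000_000            # hypothetical: total population (the "average person" baseline)
flights_per_year = 800_000_000      # hypothetical: total passenger flights taken in one year
miles_per_year = 750_000_000_000    # hypothetical: total passenger-miles flown in one year

def one_in_x(deaths, denominator):
    """Express a risk as 'one in X': divide the deaths into the chosen denominator."""
    return denominator / deaths

print(f"Risk for the 'average person': 1 in {one_in_x(deaths_per_year, population):,.0f}")
print(f"Risk per flight taken:         1 in {one_in_x(deaths_per_year, flights_per_year):,.0f}")
print(f"Risk per mile flown:           1 in {one_in_x(deaths_per_year, miles_per_year):,.0f}")
# Same number of deaths, three different denominators, three very different headline numbers.
```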

Some of the things to ask about when you are trying to come up with the most relevant risk number include: the risk for the general population (usually pretty meaningless) or for the most at-risk subgroup(s); the risk per year or over a lifetime (lifetime risk is the basis for most government regulations); the risk for a certain age group (a biggie for cancer, since three quarters of all cancers occur after age fifty-five); or the risk for a particular gender. Sometimes the right denominator isn’t people at all, but units of exposure: distance driven in local traffic or on the highway, hours of cell phone use, the type of medical radiation and how long it was applied.

One last basic point about risk numbers: To give readers what they need, journalists ought to include both the relative risk and the absolute risk, not just the one that makes the story sound more dramatic (which is usually the one you’ll lead with). Relative risk is the risk compared to something else: the risk is 20 percent greater than last year, or the risk is 50 percent higher for people who fly frequently, racking up takeoffs and landings, than for people who fly more miles but take longer flights. Relative risk involves a comparison, but it doesn’t tell you how many actual victims there are. That’s the absolute risk. Let’s say a risk increased 20 percent compared to the previous year. That could mean the risk of five people in a million dying has increased to six in a million. Twenty percent sounds bigger and makes the story more intriguing, but without the absolute number, the reader doesn’t know what they need to know to make an informed judgment.
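For anyone who wants to see the arithmetic, here is a minimal sketch using the hypothetical figures above (five deaths per million rising to six per million); the relative number and the absolute number describe exactly the same change.

```python
# Relative vs. absolute risk, using the hypothetical figures from the paragraph above:
# five deaths per million rising to six per million.

risk_last_year = 5 / 1_000_000    # absolute risk last year
risk_this_year = 6 / 1_000_000    # absolute risk this year

relative_increase = (risk_this_year - risk_last_year) / risk_last_year   # 0.20, i.e. 20 percent
absolute_increase = risk_this_year - risk_last_year                      # one extra death per million

print(f"Relative increase: {relative_increase:.0%}")   # the dramatic-sounding number
print(f"Absolute increase: {absolute_increase * 1_000_000:.0f} additional death per million people")
```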

But relative risk is important, too. Again, take flying. Even though the absolute risk of dying in a plane crash is really low, the fact that it is X percent higher, in relative terms, for frequent fliers also helps put the overall threat in perspective. Both relative risk and absolute risk offer important information. Stories that fail to include both are simply incomplete.

I shudder to think how often I made the mistakes listed above. Not because I didn’t care about getting things right. I did, and most journalists do. Not because I purposefully left things out to hype my stories so they’d run higher in the newscast (I did plenty of that, and most journalists do too). These basic facts were missing from my stories because I never knew to ask about them: obviously important elements of risk that nobody ever taught us to ask about in journalism school.

I just finished a series of visits to newsrooms around the country (paid for by a small grant from the Richard Lounsbery Foundation) to offer this “Intro to Risk” training. Most of the journalists I visited, at NPR, USA Today, the Los Angeles Times, and elsewhere, said they found the information really helpful. I hope to get another grant and continue that work. It is vitally important for public and environmental health.

David Ropeik is an instructor in the Harvard University Extension School's Environmental Management Program, author of How Risky Is It, Really? Why Our Fears Don't Always Match the Facts, creator of the in-house newsroom training program "Media Coverage of Risk," and a consultant in risk communication. He was an environment reporter in Boston for twenty-two years and a board member of the Society of Environmental Journalists for nine years.