Be wary of parachute journalism. And also parachute research.
A new study in the British Medical Journal offers a cautionary lesson for journalists who swoop in and out of academic papers without much care.
The paper, titled “Parachute Use to Prevent Death and Major Trauma When Jumping from Aircraft: Randomized Controlled Trial,” finds that the safety devices do not significantly reduce the likelihood of death or major injury for people jumping from an aircraft, compared with a control group equipped only with empty backpacks.
It has all the makings of a click-worthy headline: SKYDIVERS DON’T NEED PARACHUTES, SCIENTISTS FIND. Until you actually read the paper.
When you take the time to read beyond the abstract, you learn that all participants were jumping from a stationary, grounded airplane. The participants “could have been at lower risk of death or major trauma because they jumped from an average altitude of 0.6 m (standard deviation 0.1) on aircraft moving at an average of 0 km/h (standard deviation 0),” the authors write. “Clinicians will need to consider this information when extrapolating to their own settings of parachute use.”
Led by Harvard Medical School professor Robert Yeh, the study was published as part of the British Medical Journal’s annual Christmas edition, which highlights lighthearted and satirical research. The authors consider the study a tongue-in-cheek illustration of an important point about interpreting medical research. “The PARACHUTE trial satirically highlights some of the limitations of randomized controlled trials,” they write.
“Our intended audience was for clinicians, medical personnel, people who conduct research,” Yeh said in a phone call with Journalist’s Resource. He was surprised and gratified to find the research resonated with journalists, too.
“It’s a parable to talk about the dangers of potentially misinterpreting research findings,” Yeh said of the research. “I think it’s relevant to any consumers of research, particularly to journalists in a time crunch.”
Consider careless coverage of research a strain of ‘parachute journalism,’ the much-criticized practice in which a reporter drops into an unfamiliar place to cover a single story, without any prior knowledge of the area, and then leaves. This study shows why it’s important to avoid parachute journalism when reporting on research (including, but not limited to, work on the aerial devices themselves).
The authors advise thoroughness when evaluating scholarship. Read the study (the whole thing!) with an analytical eye: “Interpretation requires a complete and critical appraisal of the study,” they write.
Beyond the general critique of careless interpretations of seemingly sexy study findings, the authors make a subtler point about clinical trials. They screened 92 potential participants for the study, but only 23 were eligible and willing to participate. The authors suggest this resembles common practice in clinical trials, where only a small fraction of the patients screened are ultimately enrolled. They add that prior research shows participants who stand to gain the most from an experimental treatment are often less likely to be included in clinical trials.
To extend the analogy: if you’re testing the medical equivalent of a parachute on the patient equivalent of someone jumping two feet from a grounded airplane, you’re unlikely to learn whether the treatment actually works.
Despite these critiques, the authors maintain that randomized controlled trials “remain the gold standard for the evaluation of most new treatments.”
“Our message was to understand trials and research in context, in the clinical environment in which research is conducted,” Yeh said.
To this end, the authors suggest careful interpretation of published results and better efforts to include patients who most need the treatment in clinical trials.
In the interest of preventing misleading headlines and inaccurate news stories, Journalist’s Resource has several tip sheets to help reporters understand and interpret a study’s findings.
In “10 Things We Wish We’d Known Earlier About Research: Tips from Journalist’s Resource,” Denise-Marie Ordway cautions against focusing too much on a paper’s abstract.
“Many people think of the abstract as a summary of the most compelling findings. Oftentimes, this is not the case,” she writes. “The two best places to find information about key findings are 1) the ‘results’ section, which typically is located in the middle of a research article and is where authors explain what they have learned and provide their statistical analyses and 2) the ‘discussion’ or ‘conclusions’ section, which is usually located at the end of the paper and offers a summary of findings as well as a discussion of the real-world implications of the author’s work.”
Another tip from Ordway on distinguishing good research from bad: Ask yourself, “Can you follow the methodology?” If you can’t understand how a study was conducted, you can’t assess the quality of the work, and that opens the door to breathless coverage of shoddy research.
If you don’t understand a study’s methodology, ask an expert source for help. “The onus shouldn’t fall completely on journalists to do statistical analyses or say whether they’re correct,” said Christie Aschwanden, FiveThirtyEight’s lead writer for science, in an interview with Journalist’s Resource.
And don’t try jumping out of an airplane without a parachute, unless the plane is firmly on the ground.
This post was updated on December 18, 2018 to include comments from an interview with lead author Robert Yeh.