A Classic Case of Science “He Said”, “She Said”: How Psychologists Trying to Prevent PTSD Got Controversial

 

 

Cartoon of a victim of disaster calling to participate in a study of disaster response

 

Natural disasters have a lot in common with other major traumas, like life-threatening accidents and mass shootings – especially the emotional distress they leave in their aftermath. As predictable and common as that psychological distress is, though, what psychologists should or shouldn’t be doing about it is still controversial. It’s the center of a heated scientific debate that stewed and bubbled through the 1990s and then boiled over.

It began when a technique from the battlefield crossed over to civilian life. Soldiers traditionally debrief to share information and learn from missions and incidents. Psychological debriefing evolved along with military psychiatry: instead of only discussing what happened, groups discussed feelings and coping too.

Psychological debriefing spread far and wide. It spread to civilian first responders: like soldiers, they faced trauma in the line of normal duty. They needed to be prepared and to cope with the stress, and debriefing became part of that routine. It also emerged, in forms like critical incident stress debriefing, among health care professionals and others, including students.

Then psychological debriefing spread out to victims of trauma. And on to experiences like childbirth. And its form changed along the way.

Some people expected that professional care could prevent post-traumatic stress disorder (PTSD) and other psychological harm. In the search for an affordable and efficient intervention that could be offered to everyone, debriefing was often a single session – although generally in a context where additional help was available, too.

Somewhere along the way, the belief had also spread that it was always better to get your emotions out than to bottle things up. Debriefing fit right in with that belief. Offering professional care to people in crisis also meets the need people feel when they see heart-wrenching scenes: to do something immediately to help relieve intense distress.

By the 1990s, psychological first responders had become as much a part of immediate disaster response as medical care and blankets. A police psychologist was already on the scene within half an hour of the first shots at Columbine High School in 1999, for example.

People who got debriefing often said it had helped – and many of those who were debriefed were coping well or recovering from deep distress. But then, most people will cope and recover after trauma, even without particular help.

Just as we worry about saying the wrong thing and further distressing someone in crisis, professionals can make things worse for people too. And maybe not everyone benefits from dwelling on the trauma in the immediate aftermath of a crisis. Robust randomized controlled trials were needed to establish whether debriefing was genuinely helping.

A few trials of single session debriefing were done in the 1990s. The people weren’t traumatized in the line of duty, or from being caught in natural disasters or other mass trauma. They had suffered traumas like burns, road accidents or crimes. Or they had been debriefed around childbirth. And when discouraging results came in, controversy erupted.

What followed was a researcher version of “he said, she said.” That’s notoriously hard to sort out. In an ideal world, instead of just arguing about the merits or weaknesses of this or that study, all the important studies would be found and analyzed in a good systematic review. That would sort out the conflicting trials and everyone would be happy.

But in this instance the same “he said, she said” problem emerged with systematic reviews, too, after a Cochrane review in 2000 concluded single-session debriefing couldn’t prevent PTSD, but might actually cause it in some people.

 

Cartoon of dueling meta-analysts

 

Many stopped debriefing or recommended against it. The military in the UK quickly banned single-session debriefing, for example; the NHS in England was soon recommending against it, and later the WHO did too [PDF]. Some continued, either believing that the results did not apply to their system or simply not convinced by the review. Others modified their approach or developed other techniques, like psychological first aid and trauma-focused cognitive behavior therapy (CBT). (I’ve written about CBT in another context here.)

Overall, the Cochrane review seems to have had a major impact on practice and research on trying to prevent PTSD. It was followed in 2009 by another, nearly as critical, on multiple-session psychological treatments of any type. The authors – one of whom was also an author of the first Cochrane review – concluded:

…[N]o psychological intervention can be recommended for routine use following traumatic events and that multiple session interventions, like single session interventions, may have an adverse effect on some individuals. The clear practice implication of this is that, at present, multiple session interventions aimed at all individuals exposed to traumatic events should not be used.

So why did this research and the conclusion about possible harm become so controversial? Is it just because people shot the messenger when they didn’t like the message? That’s definitely what at least one of the first review’s authors wants us to believe [PDF]. I don’t think so. I think the Cochrane reviews are seriously problematic. They were at high risk of causing what I once called evidence-based mistakes – where we are led astray by what claims to be strong evidence, but isn’t. In this case, I think it also raises questions about author bias.

Even though systematic reviews are the best scientific option we’ve got, they still involve a lot of judgment calls. Researchers can quite legitimately frame the precise question differently, and that can lead to a different set of studies being analyzed. They can also weigh the quality and value of individual studies differently. It’s a bit like several teams playing football at once, but the players can be on more than one team and the teams are playing by different rules. It gets complicated.

In 2013, systematic reviewers from the US Agency for Healthcare Research and Quality (AHRQ) concluded that some form of early intervention by a trained professional may turn out to be important in the short term and prevent some serious mental health problems. Those reviewers excluded most of the trials of debriefing because they judged their quality to be too low. But basically, the quality of the evidence is just too low to be sure about much – which is a recipe for ongoing controversy. By 2013, an NHS evidence update was quite positive about multiple-session trauma-focused psychological intervention to prevent ongoing trauma symptoms.

You can see links to several systematic reviews I looked at in 2013 in my post at Statistically Funny. While several factors play a role in different review conclusions, the conclusion about harm in the first Cochrane review depended mostly on a single trial that had extensive weaknesses. That’s a risky thing to rely on.

That one trial got good quality ratings from the Cochrane reviewers. But the people making that decision included one of the trial’s authors, who was also an author of both Cochrane reviews concluding that psychological intervention can do harm. That’s a red flag, and I’ll write more about this issue in my next post.

It was far from being a reliable study, though. As I wrote here, it suffered from several serious biases. The group getting debriefed was much larger (64 versus 46 people) because of the way they randomized and because they stopped the trial early. (I’ve written more about the mess trials stopped early can leave behind here.)

The two comparison groups ended up different as a result – including one being at higher risk of PTSD than the other. And there was a high attrition rate (>22%): 7 people left the hospital before debriefing and 23 were lost to follow-up. That’s enough bias to materially tilt the results.

What’s more, the number of “events” – people who got PTSD – was low in both the trial’s groups. That’s what makes a study too small to reliably assess differences between groups.
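
To see why that matters, here is a minimal sketch (in Python, with made-up numbers rather than data from any of these trials) of how a handful of PTSD “events” per arm translates into a confidence interval too wide to tell harm from benefit:

```python
from math import exp, log, sqrt

def risk_ratio_ci(events_a, total_a, events_b, total_b, z=1.96):
    """Risk ratio with an approximate 95% CI (log-normal approximation)."""
    rr = (events_a / total_a) / (events_b / total_b)
    # Standard error of log(RR) from the usual 2x2 table formula
    se = sqrt(1 / events_a - 1 / total_a + 1 / events_b - 1 / total_b)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Hypothetical counts: a few "events" (people who got PTSD) in arms of 50-60 people
rr, low, high = risk_ratio_ci(events_a=9, total_a=60, events_b=4, total_b=50)
print(f"RR = {rr:.2f}, 95% CI {low:.2f} to {high:.2f}")
# Prints roughly: RR = 1.88, 95% CI 0.61 to 5.73 - compatible with anything from
# a sizeable protective effect to a large increase in risk.
```

With so few events, the data simply can’t distinguish harm from benefit, whichever way the point estimate happens to lean.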

What about the finding of harm in the second Cochrane review, on multiple sessions for prevention? That was even weaker: there was no statistically significant evidence of an increase in PTSD. There was just something the authors called a trend towards harm, in one measure only – self-reported symptoms, not clinical assessment of PTSD. And the two studies carrying the weight of this “trend” were fraught with problems. For example, both trials had recruitment problems so severe that the comparison groups were imbalanced – 82 versus 65 people in one, and 66 versus 83 in the other.

In the first of those, allocation was predictable – women who gave birth on set days of the month were the ones who got into the debriefing arm. That’s really prone to bias – I explain why it’s fraught with peril here. In the second, the pathway into the trial was convoluted, and apparently as a result, the group in the debriefing arm had higher levels of distress than the control group right from the start.

The evidence base in both of these Cochrane reviews is so weak that I think the strong recommendations were jumping the gun. The reviews are now years out of date, yet still influential, and that’s a problem, too. I did a quick and dirty search for trials of single-session debriefing-style interventions and found 5 subsequent ones in adults, plus 2 in children (linked below this post). None found harm, and some found benefit – but I don’t know what a thorough, objective systematic review would conclude now.

While counseling for everyone exposed to a trauma divides opinion in the disaster community, there’s more agreement that people with symptoms of PTSD could benefit from early therapy. And many others need support. But that leaves professionals to wrestle with the question of where exactly the line is between psychological support and counseling, and what form it should take. As far as I know, we still have no trials of how all this translates to disasters and other mass trauma situations, apart from some crimes.

Interviewed after the Aurora theater shooting in 2012, a counselor summed up their role this way: “Most people are resilient. Our job in disaster response is to help them find their resilience.” More robust research could help them do it.

~~~~

 

If you’re looking for information on support after trauma, psychological first aid is one of the newer techniques. There are psychological first aid resources from the VA National Center for PTSD here, including Handouts for Survivors [PDF] and advice for parents.

This is an extension and update of a post that originally appeared at Scientific American blogs on 21 May 2013.

I mention 5 trials in adults subsequent to the Cochrane reviews, and 2 in children. They are:

 

The cartoons are my own (CC BY-NC-ND license). (More cartoons at Statistically Funny and on Tumblr.)

 

Discussion
  1. Your blog post reminded me of something I came across recently: “Primum non nocere”, which translates to “First, to do no harm” if Wikipedia is correct: https://en.wikipedia.org/wiki/Primum_non_nocere

    I feel this principle is very important in science, but its possible importance is 1) not written and talked about enough, and 2) perhaps not even realized.

    I sometimes get the idea that (some) scientists view the world, and the people in it, as some sort of test subjects, or that the possible negative consequences of research, and of the practical implementation of (parts of) it, are not even thought about.

    I would be interested in reading papers about this should they exist, and/or I hope that someone will write one. I think it could be important and useful.
