Note: I love Mother Jones. I think it’s a great publication, I love their investigative journalism, and I think they’re an important force in independent media. That said, I’m not going to sit idly by while they propagate harmful and incorrect information.
A number of people have pointed me toward this piece about mental illness and rampage violence at Mother Jones by Mark Follman, usually with a note of triumph, suggesting that for all the information I provide about the false linkage between mental health conditions and violence, I’m wrong.
Actually, a close reading of this piece suggests that Mother Jones is wrong, and it highlights the need for critical thinking in evaluating any kind of investigative journalism (or scientific research, for that matter). When looking at any statistics or numbers being thrown around, you need to closely evaluate the source of that information, consider the methodology, and ask yourself how reliable that information is.
The alleged money shot in this piece is this line:
according to additional research we completed recently, at least 38 of [shooters in 61 cases] displayed signs of mental health problems prior to the killings.
So, almost 2/3 of perpetrators of rampage violence ‘displayed signs of mental health problems.’
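For the record, the arithmetic behind that “almost 2/3” checks out; a quick calculation using the two figures quoted above:

```python
# The figures quoted from the Mother Jones piece:
# 38 of 61 shooters flagged as having 'displayed signs of mental health problems'.
flagged = 38
cases = 61

proportion = flagged / cases
print(f"{flagged}/{cases} = {proportion:.1%}")  # 38/61 = 62.3%, just under two-thirds
```

So the fraction itself isn’t the problem; the problem is what counted as being “flagged” in the first place.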
Except, wait. Let’s back up a minute. What was the methodology used to identify ‘signs of mental health problems’? Well, we don’t know, because Follman didn’t provide it.
This raises three questions: (1) how were ‘signs of mental health problems’ defined, (2) who made that determination, and (3) what information was the determination based on?
(3) is the most important, because here is where the biggest problem with this ‘study’ lies. Did Mother Jones have access to comprehensive psychiatric records as well as evaluations conducted by independent psychiatrists? Did they use an experienced analyst familiar with psychiatry (and, given that most perpetrators of rampage violence die in the act, forensic psychiatry specifically) to go over this information?
Or did they rely on news reports about the perpetrator? Such reports rely heavily on ‘man on the street’ interviews and commentary elicited from people after the fact, when bias has already had a chance to settle in. Not only are most people around the perpetrator not mental health professionals, but they’re typically surrounded with narratives pressuring them to identify ‘signs of mental health problems’ even if none were actually present. There’s actually a term for this: hindsight bias.
If you establish a conclusion on the basis of faulty data, your conclusion will be faulty as well. I’d love to hear from Follman and Mother Jones about the methodology used, and I think the rest of the world should hear too—this information should be readily available as a supplement to the article.