Social Media...Again...Ugh...
And why the CDC seems intent on sacrificing its reputation on a moral panic
Ugh. I don’t want to make this Substack about social media effects all the time, particularly as that’s become an increasingly nasty and dumb debate. But the exaggeration of weak evidence in support of a moral panic just keeps coming.
This latest comes from the CDC, which released a really crude study correlating social media time with mental health in youth. I say crude because it’s the kind of simplistic study I wouldn’t let undergraduates do for a research project, and I have to think that with 9 coauthors (why?1), 5 of whom have Ph.D.s, they have to know better.
Basically, they used the Youth Risk Behavior Survey, which actually is a very good database, and simply correlated frequency of social media use with experiencing bullying as well as three mental health questions. They didn’t include potentially relevant theoretical controls, such as family abuse, even though those variables were available in the same dataset.
The correlations they found were very tiny, mostly small enough to have a high potential of being statistical noise, and well below what I’ve recommended as a threshold for clinical significance. At best, the results indicate tiny correlations between social media use and mental health, far smaller than would fit the kind of public apocalyptic narrative about the supposed dooms social media brings to kids.
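To make the omitted-control point concrete, here’s a toy simulation of my own (illustrative numbers, not anything from the CDC report or the YRBS): when a shared cause like family adversity drives both social media time and mental health symptoms, the two can show a raw correlation even with no direct link between them, and that correlation shrinks toward zero once you control for the confounder.

```python
# Toy simulation (illustrative effect sizes, not real data): an unmodeled
# confounder can manufacture a raw correlation between social media use
# and symptoms that largely disappears once it is controlled for.
import random

random.seed(0)
n = 10_000

# A shared cause (e.g., family adversity) drives BOTH variables;
# there is NO direct effect of social media on symptoms here.
confounder = [random.gauss(0, 1) for _ in range(n)]
social_media = [0.3 * c + random.gauss(0, 1) for c in confounder]
symptoms = [0.3 * c + random.gauss(0, 1) for c in confounder]

def pearson(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def residualize(y, x):
    # Remove the part of y predicted by x (simple OLS residuals),
    # so correlating residuals gives a partial correlation.
    mx, my = sum(x) / len(x), sum(y) / len(y)
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    return [b - (my + beta * (a - mx)) for a, b in zip(x, y)]

raw = pearson(social_media, symptoms)
partial = pearson(residualize(social_media, confounder),
                  residualize(symptoms, confounder))
print(f"raw r = {raw:.3f}, r after controlling for confounder = {partial:.3f}")
```

The raw correlation comes out small but "real"-looking, while the partial correlation hovers near zero, which is exactly why leaving theoretically relevant controls out of the model matters.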
There are other warning signs in the report. For instance, the authors claim early on that “Associations between frequent social media use and poor mental health outcomes among adolescents, including depression and suicide risk are being increasingly documented” and later claim “In alignment with existing research, findings in this report support associations between adolescent social media use and mental health…” But both these statements misrepresent the actual research literature, which has been inconsistent and, by and large, found only null to weak effects. No less a journal than The Lancet concluded this week that “However, attempts to conclusively link rising rates of mental illness or find any clear brain changes with the growing use of social and digital media during adolescence have proved difficult” and “…research on the effects of social media has so far produced mixed results.”
This behavior, misrepresenting prior research as more conclusive than it actually is, is called citation bias, and it often serves as a risk marker for researcher expectancy effects and false-positive results. I’d argue such behavior borders on the unethical (minor league, to be sure) and is certainly irresponsible for an organization such as the CDC.
So we have:
1) A report that uses statistics that fail to control for theoretically relevant variables which actually exist in the dataset.
2) An exaggeration of effects that appear to be tiny and quite possibly statistical noise, and
3) Researcher expectancy effects as indicated by failure to accurately portray prior research on social media and mental health.
This isn’t very impressive for the CDC.
On its face, the data here suggest that, at the very least, the assumed relationships between social media and behavioral outcomes are much weaker than most people imagine. With just a bit more skepticism, the results provide very little evidence of any relationship at all, as these outcomes could simply be noise.
Ultimately, my confidence in the CDC as a reliable reporter of information has dropped after seeing this report. Perhaps, as a government agency, they must sing a particular tune that has become politically convenient. Unfortunately, this kind of weak effort can only reduce public trust in an organization like the CDC on issues that really matter.
I say this because I struggle to see why it took 9 people to run such basic statistics and write such a brief and misleading report. Not one of these 9 people raised an objection?