Facebook—a company notorious for its bizarre defense of allowing rampant misinformation to spread like wildfire on its platform—has found itself under fire again after a mother whose 4-year-old son reportedly died of the flu this week turned to an anti-vaxxer group on the site for treatment recommendations for her children. Such groups are somehow still permitted on Facebook despite often spreading information that is dangerous and potentially deadly.
NBC News reported this week that the woman was among other Facebook users who have turned to the group Stop Mandatory Vaccination—one run by anti-vaxxer troll Larry Cook with some 178,000 members—for information related to the flu. According to screenshots of the since-deleted comments from the woman obtained by the Colorado Times Recorder, she told the group that two of her four children—neither of whom, she said, had received the flu shot—had been diagnosed with the flu and that she had been prescribed Tamiflu, a treatment for the virus. Tamiflu had also been prescribed for her and her other two children, she said, but she added that she “did not pick it up.”
According to the Facebook posts obtained by the Times Recorder, the woman said that two of her children were running high fevers of over 100 degrees. Seemingly ignoring the fact that the treatment the woman described had been prescribed by a doctor, commenters instead recommended “taking Vitamin D and C, Elderberry, Zinc, and eating lots of fruits and vegetables” as well as “boiling thyme on the stove” and breastfeeding. The same commenter who recommended “skin to skin” nursing as a treatment course suggested giving the woman’s 10-month-old, who had the flu, vitamin C “until diarrhea.”
On the comment thread involving the suggestion about elderberry and zinc, the mother responded, “ok perfect I’ll try that.” NBC News reported that none of the 45 comments on the mother’s Facebook post suggested she seek the help of a medical professional.
According to a GoFundMe page seemingly belonging to the parents of the 4-year-old child and set up for medical expenses, the 4-year-old collapsed and was later hospitalized. A post shared Thursday said that the boy had “been taken off of life support and has passed.” The Colorado Department of Public Health and Environment confirmed to local CBS-affiliated KKTV that “a preschool-aged child in southern Colorado has died of flu,” adding it was the second death in a child related to the flu this season.
Facebook did not respond to multiple requests for comment about the incident or about why it does not ban these dangerous communities, opting instead merely not to “promote” them and to plaster them with disclaimers. In a statement to NBC News, the company offered its “thoughts” to the family and added that it doesn’t “want vaccine misinformation on Facebook, which is why we’re working hard to reduce it everywhere on the platform, including in private groups.”
But demoting misinformation isn’t enough. The U.S. Centers for Disease Control and Prevention states that “the spread of myths and misinformation has put some communities at risk,” and the problem persists—in large part thanks to platforms like Facebook that refuse to do anything about the conspiracy theories and lies that fester on their sites and encourage dangerous behavior.
Facebook’s response to this incident is embarrassing. No amount of “thoughts” is going to bring back somebody’s kid.