Our #1 priority is protecting the well-being of children. Toward that goal, we have decided to use every lever available to us. One of those levers is the court system.
Parents Grieving Children Killed or Harmed by a Fatal Video Trend File Complaint to Hold Video Streaming Platforms Accountable
Parents of minor users impacted by dangerous, viral trends on video streaming apps come together to hold social media companies responsible for their failure to heed user reports about the proliferation of fatal trends on their platforms.
Video streaming platforms have exploded in popularity, and so too have viral trends that induce users, many of whom are minors, to participate in toxic and fatal behaviors. The Choking Challenge, which encourages users to hold their breath until they are rendered unconscious from lack of oxygen, is just one example of a deadly viral trend that has flourished on social media platforms. Parents Joann Bogard and Annie McGrath tragically lost their children to this challenge, which was popularized on TikTok and YouTube.
Of their own volition, the parents of the deceased minors contributed labor and resources to report the videos to the content moderation teams at the social media companies whose platforms hosted the content that took the lives of their children and many others. In return, they received either no response at all or canned replies stating that their reports of choking challenge videos, harassing content, and child pornography did not violate the platforms' community guidelines. The videos remained up while their view counts grew. The parents had no option to appeal these automated responses or to seek an explanation for them.
Now, these parents and the Becca Schmill Foundation seek to hold these video streaming platforms accountable for failing to respond to reports of harm and thereby misrepresenting the safety of their platforms. The parents of the deceased and harmed children, as well as the Becca Schmill Foundation, are represented by Juyoun Han and Eric Baum of Eisenberg & Baum LLP.
Joann Bogard expressed frustration after reporting videos depicting the deadly Blackout Challenge that took her son's life, stating, "These videos remain posted and viewable to many young users. Time and time again, they ignore my pleas to remove them." The inaction of social media platforms has left Bogard and other similarly situated parents feeling retraumatized, hoping these deadly videos do not reach and claim the lives of more children.
Annie McGrath laments that she and her family never had the chance to say goodbye to her son, who died after watching tutorial videos on the choking challenge. She also expressed how things might have been different for her son had social media companies kept their promises to address and take down harmful video content that violated their community guidelines. McGrath says that the continued failure of social media companies to appropriately address her reports of dangerous videos thriving on their platforms is a constant reminder and "re-traumatization of what we, forever-sad parents, have to endure each day."
Jane Doe, a mother who had to watch TikTok videos of her child being harassed and bullied, was terrified. When a friend reported the video to TikTok and asked that it be taken down, she was dismayed to receive a canned response stating that the video did not violate the platform's guidelines and would remain up. She was left feeling helpless.
The Becca Schmill Foundation commissioned the Alliance to Counter Crime Online (ACCO) to conduct a research study on social media platforms' responses to reports about drug sales and choking challenge videos. The ACCO researchers reported 20 videos depicting the choking challenge on each social media platform. TikTok found that only one of the 20 videos violated its policies, and YouTube removed none of the 20 videos the researchers reported. Deb Schmill, the President of the Becca Schmill Foundation, said that she and others "are hoping this lawsuit leads to significant changes in the way reports of illegal and harmful content are dealt with by the platforms."