One of the many contentious viral “challenges” that have swept social media is the “Blackout Challenge,” in which participants are urged to choke themselves until they pass out from lack of oxygen. Although the Challenge only became popular on TikTok in 2021, it appears to have been around far longer: as early as 2008, the US Centers for Disease Control and Prevention (CDC) issued a warning about people strangling themselves “in order to obtain a brief euphoric state or high,” linking the practice to more than 80 deaths.

A Bloomberg Businessweek report published in November attributed to the Challenge at least 15 deaths of children aged 12 and under, and five more of children aged 13 or 14.

As the toll of deaths and injuries has grown, numerous lawsuits have been filed against TikTok, the social media platform where the Challenge gained popularity.

Parents of children who died as a result of the “Blackout Challenge” have filed lawsuits alleging, among other things, that TikTok’s algorithm promotes harmful content, permits underage users, and fails to inform users or their legal representatives of the app’s addictive nature.

The Social Media Victims Law Center (SMVLC), which seeks to hold social media companies liable for the harm they cause to vulnerable users, filed suit on June 30 in a California court on behalf of the families of 8-year-old Lalani of Temple, Texas, and 9-year-old Arriani of Milwaukee, Wisconsin. Tawainna Anderson, the mother of Nylah, who died after attempting the Challenge, brought a lawsuit of her own in May. Her suit specifically contests TikTok’s “For You” page, which serves curated content based on a user’s interests and past behaviour.

According to the lawsuit, TikTok’s algorithm determined that the deadly Blackout Challenge was well tailored to, and likely to interest, 10-year-old Nylah Anderson, and as a result she died. However, an October ruling in the Anderson case has cast significant doubt on the viability of the other lawsuits, all of which make comparable allegations.

In an eight-page decision, the judge concluded: “The defendants did not originate the Challenge; rather, they made it easily accessible on their website. The defendant’s algorithm served as a means of alerting those most likely to be interested in the Challenge to the Challenge. Defendants published that work by doing this, which is exactly the activity that Section 230 shields from liability.”

The court ruled that for TikTok to be held accountable for the deaths linked to the “Blackout Challenge,” the US Congress would have to enact appropriate laws and regulations. The wisdom of granting such immunity, the judge stated, is a question for Congress, not the courts.
