Facebook, Instagram, and other social media sites may soon face lawsuits alleging that the way their algorithms work is a factor in causing mental illness in some users.
A Defective Product?
In the US, a consolidation of lawsuits across multiple districts, rumoured to be filed next month in the Northern District of California, will reportedly allege that these social media giants cause eating disorders, anxiety and depression in their users through algorithms that are “defective products”.
It is alleged that social media algorithms can encourage addictive behaviour by steering users towards certain posts, including content that could contribute to mental illness.
It’s been reported that some of the evidence will relate to comments made by former Facebook Product Manager turned whistleblower Frances Haugen. In 2021, for example, she alleged that Facebook (now Meta) knew that Instagram users were suffering ill health effects and that Facebook had been putting profit over safety. Her reported (unproven) allegations about Facebook at the time included:
– There were conflicts of interest between what was good for the public and what was good for Facebook.
– Facebook knew that Instagram was worsening body image issues among teenagers and had a two-tier justice system.
– Facebook uses engagement-based ranking algorithms (in Instagram) knowing that these algorithms can’t adequately identify dangerous content and may even amplify negative content and help to fuel violent rhetoric and ethnic violence.
– Facebook hid most of its own data and, when asked directly about how its platforms impact the health and safety of children, chose to mislead and misdirect.
– Facebook failed to act on internal research showing that Instagram had a negative impact on the mental health of teenage girls.
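To illustrate what “engagement-based ranking” means in practice, here is a minimal, purely hypothetical sketch in Python. The function names, weights and data are invented for illustration only and bear no relation to Meta’s actual systems; the point is simply that ranking by raw engagement pays no attention to what the content is about.

```python
# Hypothetical sketch of engagement-based ranking: posts attracting more
# reactions, comments and shares score higher and are shown first.
# Weights and field names are invented for illustration, not Meta's real system.

def engagement_score(post):
    # Weighted sum of interaction counts (weights are arbitrary here).
    return 1.0 * post["likes"] + 2.0 * post["comments"] + 3.0 * post["shares"]

def rank_feed(posts):
    # Highest-engagement posts come first, regardless of subject matter --
    # which is why critics argue such ranking can amplify negative content.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"id": "a", "likes": 10, "comments": 1, "shares": 0},
    {"id": "b", "likes": 2, "comments": 8, "shares": 4},
]
print([p["id"] for p in rank_feed(posts)])  # post "b" outranks "a"
```

In this toy example, post “b” ranks above post “a” purely because it drew more comments and shares, illustrating the critics’ point that engagement alone is a content-blind signal.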
Despite calls for regulation from some members of Congress and President Biden since Haugen’s initial allegations, the lawsuits may argue that nothing substantial has been done.
Known About For A Long Time
It has been reported (Politico) that Previn Warren, an attorney at Motley Rice (a leading firm involved in the case), has said that Frances Haugen’s allegations suggest Meta may have known for some time about the negative effects of Instagram on children, and that “It’s similar to what we saw in the 1990s, when whistleblowers leaked evidence that tobacco companies knew nicotine was addictive.”
Since the focus is likely to be on social media algorithms as possibly defective products, the case will turn on product liability law. Treating algorithms as products is a relatively new area, but an algorithm could be considered a product under U.S. product liability law. If, as the lawsuits may allege, an algorithm is a defective product, it may fall under “strict liability”, meaning the manufacturer can be held liable for damages caused by a defect regardless of fault. The defect must have existed at the time the product was sold or supplied to the user. Determining whether an algorithm is defective is likely to be a complex part of the legal argument and could depend on factors such as industry standards and the foreseeable uses of the algorithm.
Protected From Product Liability Claim?
It has, however, been noted by some tech commentators that Section 230 of the Communications Decency Act 1996 may currently protect social media companies by restricting lawsuits against them over content that users post on their sites. This could potentially shield Meta and Instagram from a product liability claim.
What Does This Mean For Your Business?
These lawsuits, if successful, could have a significant impact on both the social media companies and their users. For example, if a court found that the algorithms used by social media companies are defective products that harm users, those companies could face significant legal and financial consequences, including large damage awards to affected users, with knock-on implications for their business operations and reputation.
Regarding Section 230 of the Communications Decency Act 1996, if the court finds that the algorithms are defective products, this could prompt a re-evaluation of the protections Section 230 provides to social media companies. Section 230 grants immunity from liability for third-party content posted on their platforms, but if the algorithms themselves are deemed to be the cause of harm, that immunity may no longer apply. This could lead to increased regulation and oversight of the algorithms used by social media companies, and a potential shift in the balance of power between these companies and the users they serve.