Tech critics’ least favorite law is under fire again, this time with a focus on its recommendation algorithms.
On Wednesday, Senators John Curtis (R-UT) and Mark Kelly (D-AZ) introduced the Algorithm Accountability Act, which amends Section 230 of the Communications Decency Act to make platforms responsible for preventing their recommendation systems from causing certain predictable harms. Section 230 is the law that protects online platforms – including social media sites, digital forums, blogs with comment sections, and their users – from being held liable for other people’s unlawful posts, or for engaging in good-faith content moderation. But the Algorithm Accountability Act would require commercial social media platforms to “exercise reasonable care in the design, training, testing, deployment, operation, and maintenance of recommendation-based algorithms” to “prevent physical injury or death.” If a platform could reasonably anticipate that its content recommendations would result in physical harm, Section 230 would no longer shield it from liability for those recommendations.
This approach, known as duty of care, is similar to the Kids Online Safety Act (KOSA), an embattled bill with broad support in the Senate that has been stalled in the House amid tech lobbying and speech concerns. Under the Algorithm Accountability Act, victims who have suffered physical harm, or their representatives, would be able to sue tech platforms for damages if they believe the platforms breached their duty of care. But this only applies to a subset of web services: specifically, for-profit social media platforms with more than one million registered users.
Anticipating the routine criticism of Section 230 reforms, the bill’s sponsors insist that it would not violate First Amendment rights. Like KOSA, the new bill states that it will not prevent platforms from directly providing users with the information they seek. It also would not prohibit feeds served in chronological or reverse-chronological order, and it would bar enforcement of the law based on users’ viewpoints.
Curtis has blamed Section 230 for enabling a toxic social media environment that he believes contributed to the September killing of conservative activist Charlie Kirk by a gunman in his home state of Utah. In a recent Wall Street Journal op-ed, he said that “Online platforms likely played a major role in radicalizing Kirk’s alleged killer,” a phenomenon “driven not just by ideology but also by algorithms – code written to keep us engaged and angry.” At a CNN town hall at the university where Kirk was killed, Curtis and Kelly, whose wife Gabby Giffords survived an assassination attempt, previewed their new bill with a message calling for “de-escalation of political tensions on both sides of the aisle.”
Recommendation algorithms were a core issue in a major lawsuit against YouTube, Meta, and other platforms earlier this year, when a gun safety group alleged they bore responsibility for radicalizing a racist mass shooter by surfacing hate speech in their recommendations. Hate speech is legally protected, and a court dismissed the case, citing both Section 230 and First Amendment concerns. But the new bill could shift the balance of power in a whole range of lawsuits against tech companies, covering everything from drug use to suicide. Even in cases where the speech is ultimately found to be legal, losing Section 230 protection could leave platforms embroiled in lengthy legal proceedings over challenges to their hosting or moderation of user posts.
But groups that have opposed KOSA and prior efforts to reform Section 230, such as the Electronic Frontier Foundation (EFF), warn that even with such assurances, platforms will simply be incentivized to remove information or avoid surfacing it altogether, potentially including resources meant to prevent the very harms lawmakers seek to reduce.