Industry News


Bipartisan NUDGE Act Takes Aim at Social Media Algorithms

February 11, 2022

By Hillary K. Grigonis

© Primakov/Shutterstock

A new Senate bill, the Nudging Users to Drive Good Experiences on Social Media Act, or the NUDGE Act, could influence how the algorithms of major social media platforms such as Facebook and Instagram work. On Feb. 10, two senators introduced legislation that would fund research into harmful content on social media, then allow the Federal Trade Commission to use that research to create new rules for platforms to follow. The bill, introduced by Sens. Amy Klobuchar (D-MN) and Cynthia Lummis (R-WY), addresses a wide swath of concerns, including addiction to social media, the amplification of misinformation, and the promotion of posts to young users.

If passed, the bill would direct the National Science Foundation and the National Academies of Sciences, Engineering, and Medicine to study content-neutral interventions and how they could reduce the spread of harmful content as well as address addiction. A content-neutral intervention is one that is applied regardless of what a post is about. For example, Facebook now shows a pop-up warning if a user clicks the share button on an article without first opening the link to read it.

[Read: Photogs’ Lawsuit Hopes to Change Instagram’s Embed Tool Algorithms]

The content-neutral approach is key to the legislation because it doesn’t require social media platforms to determine which content is harmful or misleading. Social media regulation often faces opposition on free-speech grounds, and the bipartisan bill attempts to address those concerns by applying changes to all content equally. Lummis said the legislation could “build a healthier internet without the federal government dictating what people can and can’t say.”

“The NUDGE act has the potential to make online spaces meaningfully safer for all users by kickstarting research into content-agnostic interventions and incentivizing digital platforms to implement interventions that research demonstrates to be effective,” said Laura Edelson, lead researcher at NYU’s Cybersecurity for Democracy and a Ph.D. candidate in computer science. “This research-driven approach to improving online safety is the first step to reducing harm to vulnerable users such as kids and teens, while still putting the free-speech rights of users first.”

[Read: Instagram Finally Adds Option to Turn Off Embed Tool]

Following the initial research, the bill would allow the Federal Trade Commission to issue rules on how to apply the findings. Social media platforms would then need to implement those recommendations and share statistics on how effective the changes are. The bill does not change Section 230, the provision that shields platforms from liability even when their algorithms promote posts that violate civil rights. In previously proposed legislation, tying new rules to Section 230 raised concerns among technology and public interest groups.

The bill also addresses several social media issues highlighted by recent research, including addiction to technology, content viewed by teenagers and children, and the spread of misinformation. Exactly how the bill would change social media platforms isn’t yet clear, since the bill first requires research into the topic.

“For too long, tech companies have said ‘Trust us, we’ve got this,’” Klobuchar said in a statement. “But we know that social media platforms have repeatedly put profits over people, with algorithms pushing dangerous content that hooks users and spreads misinformation. The NUDGE Act will help address these practices, including by implementing changes that increase transparency and improve user experience. It’s past time to pass meaningful reforms that address social media’s harms to our communities head-on.”