Op-Ed: California’s fight for a safer internet isn’t over yet

This month, a bill regulating children’s social media services was defeated without explanation by California’s Senate Appropriations Committee. The proposed legislation, sponsored by Assemblymembers Jordan Cunningham (R-Paso Robles) and Buffy Wicks (D-Oakland) and dubbed the Social Media Platform Duty to Children Act, would have allowed the attorney general and local prosecutors to sue social media companies for knowingly incorporating features into their products that addict children. The powerful tech industry campaigned for months to thwart the bill.

But a companion piece of legislation spearheaded by the same lawmakers — the Age-Appropriate Design Code Act — survived the Appropriations Committee and could still become law.

Without the strict enforcement provisions of the Social Media Platform Duty to Children Act, some may despair that the legislative effort was a waste of time. But progress is incremental, and these legislative efforts have already achieved some success.

First, they raised the issue of corporate responsibility and reframed the public debate surrounding social media regulation.

Both bills identified social media addiction as a legitimate problem caused by businesses. Social media companies and their advocates have repeatedly denied responsibility for the addictive nature of their offerings. Facebook, for example, has disputed reports that Instagram use harmed teenage girls and has vigorously denied claims that its platform harms users.

This corporate gaslighting effectively blames children for being addicted to social media and conveniently ignores how companies have intentionally designed their products to have addictive properties, suggesting: My product isn’t the problem; it’s your child. The bipartisan legislative effort rejected this view, legitimizing parents’ concerns about social media addiction and reframing the issue as one of corporate responsibility and product liability rather than blaming the victim.

Second, these bills recognized that internet use involves competing values and priorities, and broke away from Big Tech’s exclusive framing of the debate.

Faced with the prospect of any online regulation, social media companies, old-school internet idealists and free-market zealots ring the same two alarm bells: Regulation will stifle free speech and hinder technological innovation. For the past few decades, these twin bogeymen have kept lawmakers from imposing regulations with real teeth. But these arguments have several flaws.

Social media companies are not absolute protectors of free speech; they already set limits on the speech they disseminate. Nor are they the only innovative companies subject to regulation. The biotech industry, for example, must comply with regulations that promote safety and efficacy.

An exclusive focus on so-called freedom of expression and innovation also fails to recognize that other important values, such as privacy, autonomy and security, can be threatened by an unregulated internet.

Until this month, it looked as if California might finally pass legislation that would hold the tech industry accountable for the harm caused by its products. But even though this enforcement legislation, which would have created a new avenue for lawsuits, has been killed, the fight is not over.

The design legislation still pending in the state Senate would require companies to implement sensible privacy protections such as strict default settings and clear, concise terms of use. Companies would also need to assess the impact of their products on children and would be barred from using features such as “dark patterns,” pop-ups and other product interface elements that encourage children to disclose personal information.

These rules aren’t just good for kids; they’re good for all users. Critics who claim the legislation would require age verification on all websites miss the point. Businesses would have to comply with these regulations regardless of the age of their users; age verification would be required only if they wish to engage in fraudulent and harmful online practices involving adult users.

The legislation provides for civil penalties of up to $2,500 or $7,500 per child, depending on whether the violation was negligent or intentional. While these penalties may be too small to financially harm companies like Facebook (or, technically, Meta), the bill provides that they will be used to offset the cost of regulation. Civil penalties also tend to have a public-shaming effect that can further wear away the Teflon coating that has long protected the tech industry from regulation.

Additionally, the death of the companion bill authorizing state lawsuits doesn’t rule out all civil lawsuits against social media companies. Although parents could not sue under the design law itself, the standards that evolve in response to it could help parents sue companies based on product liability, tort or contract claims. Tech can also face pressure from consumers: companies that don’t conform to established business norms can be seen as negligent or reckless.

It is now up to the state Senate to pass this common-sense design legislation, which would help protect California’s children from addictive and toxic online environments.

Nancy Kim is a law professor at the Chicago-Kent College of Law, Illinois Institute of Technology.
