The John Adams Courthouse in Boston. (Maria Pemberton / CommonWealth Beacon)

TWO YEARS AGO, Attorney General Andrea Campbell took Meta – Mark Zuckerberg’s monolith that owns Facebook and Instagram – to court over claims that its platform designs and features exploit children and keep them hooked on addictive content. On Friday, the Supreme Judicial Court will be the first state high court in the nation to consider whether those platform designs are shielded by a law protecting publishers from being sued over the content of their websites.

The case, scheduled for oral argument Friday morning, puts Massachusetts at the center of a debate over Section 230 of the Communications Decency Act. The court will consider whether the 1996 federal law that protects internet companies from lawsuits over user-generated content extends to claims about platform design.

Campbell filed the lawsuit in Suffolk Superior Court in October 2023, joining a bipartisan coalition of 42 attorneys general who sued Meta in an array of federal and state courts. The Massachusetts complaint alleges that Meta violated state consumer protection law and created a public nuisance by deliberately designing Instagram with features like infinite scroll, autoplay, push notifications, and “like” buttons to addict young users, then falsely represented the platform’s safety to the public. The company has also been reckless with age verification, the AG argues, and allowed children under 13 years old to access its content. 

Suffolk Superior Court Judge Peter Krupp denied Meta’s motion to dismiss the case in October 2024, writing that Meta’s statements about its safety “are belied by its internal data showing that Instagram addicts and harms children. Meta had repeatedly deprioritized youth well-being to increase revenue.” 

The case is before the high court because Meta wants to challenge Krupp’s ruling that it was not entitled to immunity – a procedural move that bumped the case up to the SJC even though the lower court has not decided its merits. 
 
The state’s complaint relies heavily on Meta’s own internal research, which allegedly showed the company understood Instagram’s features were harming teenagers but concealed this knowledge to maximize profits. According to Campbell’s lawsuit, Meta secretly utilizes design features that “deliberately exploit and capitalize off young users’ unique vulnerabilities” and overcome their ability to self-regulate time on the platform. 
 
The complaint alleges that young Massachusetts users are induced into using Instagram for multiple hours a day, often instead of homework or sleep, in an addictive manner they cannot self-regulate. 
 
Research cited in the lawsuit shows that for adolescents, mental health steeply declines after one hour of daily social media use. As hundreds of thousands of Massachusetts teenagers actively use Instagram, the lawsuit alleges the website’s practices have burdened Massachusetts school systems and added to health care expenditures.   

“This addiction, according to Meta’s own data, has caused widespread mental and physical harm to children,” the AG’s office wrote. “The Commonwealth has been left to grapple with the extent of that harm.” 
 
Meta argues that Section 230 bars the entire case. The law states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” Courts have broadly interpreted this to protect websites from lawsuits over content posted by users. 
 
In its filings, Meta argues that its products are fundamentally based on third-party content.  For that reason, the company says, its platform features and design cannot, on their own, be held responsible for any problematic behavior related to use of the platforms.   

Simply put, it wrote, “if there were no videos to play, there would be no ‘autoplay.’ Nor does the Complaint allege how users would be addicted to Instagram if there was no content on the service. Regardless of Meta’s publishing choices, if Instagram had no content, and instead displayed a blank screen upon loading, no one would use it, nor could possibly claim that they were addicted to the service.” 

Meta’s argument extends beyond user content. The company contends that Section 230 also protects its editorial choices about how to organize and present that content, including through design features like notifications, infinite scroll, and autoplay. Meta further argues the First Amendment bars claims targeting its content moderation policies and algorithms. 
 
Judge Krupp rejected these arguments. The issues at hand include false statements that Meta made about its platforms’ safety, he said, which are not the type of content protected by Section 230. The design and moderation choices are also fair game, Krupp said, because the state “is principally seeking to hold Meta liable for its own business conduct,” not content posted by third parties. 
 
“The Commonwealth alleges that Instagram’s features in and of themselves, regardless of their associated content, cause its young users to become addicted to the platform,” Krupp wrote. 
 
Meta appealed to the SJC, challenging the denial of its Section 230 defense.  

“The evidence will demonstrate our commitment to supporting young people,” a Meta spokesperson wrote after the ruling. The company recently announced new “Teen Accounts” on Instagram, a protected experience that automatically limits who can contact teens and the content they see. 
 
Justices on Friday will consider two questions, the first of which is whether the high court should be considering the appeal at all. A legal doctrine allows parties to immediately appeal when they are denied an immunity they believe the law entitles them to. So the SJC will consider, for the first time, whether Section 230 creates an immunity from suit at all, and therefore whether Meta can appeal the lower court’s denial before the case is decided on its merits. 

But the philosophical meat of the suit is whether Section 230 actually immunizes Meta against the consumer protection and public nuisance claims.  
 
The case has attracted significant attention from free speech advocates. The Foundation for Individual Rights and Expression (FIRE) and TechFreedom both filed amicus briefs urging the court to reverse Krupp’s ruling and affirm Meta’s immunity from lawsuits over the design of its platforms.  

Campbell’s argument is part of a long history of panic over unfettered speech, argued FIRE in its brief. People can make choices about how they want to consume their entertainment and information, it notes, and young people are no more at risk of being overwhelmed by the style of social media than literary figures of old were by books. 

“The Commonwealth is concerned that the speech is too powerful,” it wrote. “They think minors are like Don Quixote, transfixed by stories and ideas. This problem, however—if it’s a problem—is not for the Commonwealth to fix. Under the First Amendment, the strong effects of speech are an inherent part of speech—not a ground for regulation.”  

Law professors Jane Bambauer and Eugene Volokh argued in an amicus brief that social media platforms create expressive products and their features “stem from constitutionally protected decisions about where, when, and how speech is communicated.”

Campbell’s office argues that Meta’s attempt to distinguish between content moderation and design features creates a false distinction. The state contends that, just as Meta may curate posts through content moderation policies, it may approve of and encourage speech through design features that are separate from the third-party content itself. 

Campbell argues that “by designing and using addictive design features on Instagram to exploit children’s psychological vulnerabilities,” Meta harmed young users, all while falsely representing that its features were not addictive and that it prioritized youth health and safety. 
 
The Massachusetts case is one of several challenging Section 230’s reach.  

In October 2024, US District Judge Yvonne Gonzalez Rogers, overseeing multidistrict litigation in California involving more than 30 states, rejected Meta’s motion to dismiss similar claims. However, no state supreme court has yet ruled on whether Section 230 shields social media companies from lawsuits targeting platform design. 
 
Massachusetts is part of a bipartisan effort led by Colorado Attorney General Phil Weiser and Tennessee Attorney General Jonathan Skrmetti. Thirty-three states joined the federal lawsuit, while nine states, including Massachusetts, filed in their own state courts. 

In an amicus brief filed in the Massachusetts case, the coalition of attorneys general wrote that Meta’s attempts to ask higher courts to wade in on the immunity question “and expand the scope of Section 230 have the potential to sow confusion among the courts.” They noted that Meta has universally lost, in part or in full, in its motions to dismiss based on publisher immunity. 
 
As Campbell put it in announcing the lawsuit in October 2023, “based on its own internal research, Meta knew of the significant harm these practices caused to teenage users and chose to hide its knowledge and mislead the public to make a profit.” 
 
The SJC’s ruling could determine whether that alleged knowledge and those choices are subject to state consumer protection laws, or whether Section 230 places them beyond the reach of state enforcement. 

Jennifer Smith writes for CommonWealth Beacon and co-hosts its weekly podcast, The Codcast. Her areas of focus include housing, social issues, courts and the law, and politics and elections. A California...