Attorney General Andrea Campbell's lawsuit against Meta, which alleges Instagram has addictive features that exploit young users, can proceed following a Supreme Judicial Court ruling on Friday. Chris Lisinski/CommonWealth Beacon

THE COMPANY THAT OWNS Facebook, Instagram, and WhatsApp is not automatically shielded from litigation alleging that Instagram has addictive features that exploit children, the Massachusetts Supreme Judicial Court said in a first-of-its-kind ruling Friday, dealing another blow to the tech giant Meta amid a growing national movement to hold the social media platform accountable for its role in fueling the youth mental health crisis.

Justices determined that a federal telecommunications law known as Section 230, which generally prevents internet platforms from facing liability over content posted by users, does not bar claims that Meta’s endless scrolling, frequent notifications, and time-limited features violate state consumer protection laws.

It’s the first time a statewide high court anywhere in the country has weighed in on whether Section 230 makes social media platforms immune to legal action over their features, punching a hole in the legal defense the company has sought to deploy in a series of cases.

The SJC did not weigh in on the underlying arguments that Meta’s features run afoul of state law, but the decision clears the way for Attorney General Andrea Campbell’s lawsuit to proceed in Superior Court.

The ruling came on the heels of a pair of other landmark decisions. Two weeks ago, a New Mexico jury concluded that Meta misled users and that its lackluster safety measures contributed to the sexual exploitation of minors, ordering the company to pay $375 million in damages. A day later, a California jury found Meta and Google liable for depression and suicidal thoughts in a young woman who became hooked on Instagram and YouTube, which is owned by Google, and ordered the Silicon Valley titans to pay $6 million.

In both cases, the companies argued that Section 230 shielded them from liability, but judges allowed trials to proceed.

Friday’s SJC ruling is the first time a full panel of justices at the highest level in a state has agreed that the 1996 federal law does not shield against complaints about specific features baked into an app or platform.

Justice Dalila Argaez Wendlandt, who penned the opinion, noted that the Massachusetts lawsuit against the company “do[es] not seek to impose liability on Meta for information provided by third parties.”

“Instead, the claims allege harm stemming from Meta’s own conduct either by designing a social media platform that capitalizes on the developmental vulnerabilities of children or by affirmatively misleading consumers about the safety of the Instagram platform,” Wendlandt wrote. “Thus, at least at this preliminary stage of the litigation, Meta has not shown it is entitled to the protection provided by [Section 230].”

Campbell, a Democrat, sued Meta in 2023 as part of a bipartisan coalition of 42 attorneys general. She argued that the company violated consumer protection law and created a public nuisance by knowingly fueling addiction among young users with features such as infinite scroll, frequent push notifications, and autoplaying videos, then misrepresenting its safety to the public.

Instagram has more than 300,000 daily active users in Massachusetts between the ages of 13 and 17, state prosecutors wrote in their complaint. Citing Meta’s internal research, the AG’s team alleged that the company “designs and deploys features that it knows exploit vulnerabilities of the teenage brain, knows overrides teenagers’ ability to regulate their time on the platform, and knows causes addictive overuse that results in mental and physical harms.”

Meta argued that it was entitled to immunity under Section 230, but Suffolk Superior Court Judge Peter Krupp disagreed, bumping the case up to the SJC.

“Big tech companies like Meta have designed platforms that harm young people, all while downplaying the risks,” Campbell said in a statement Friday. “Today’s victory is a major step in holding these companies accountable for practices that have fueled the youth mental health crisis and put profits over kids.”

A Meta spokesperson said the company has made changes, such as the introduction of “Teen Accounts” and giving parents more tools, to better protect minors on its platforms.

“While we continue to disagree with the false distinction between content and platform design, this ruling is procedural and doesn’t address the merits of the case,” the spokesperson said about the SJC’s decision. “We are confident the evidence will show our longstanding commitment to supporting young people.”

As court cases unfold across the country, policymakers are also grappling with whether to regulate social media more forcefully to protect young users.

Just two days before the SJC’s ruling, the Massachusetts House approved legislation that would ban any Bay Stater 13 years old or younger from using social media. Companies would be required to obtain parental consent for any 14- and 15-year-old users. House Ways and Means Committee chair Aaron Michlewitz called it “an important step in helping protect the children of the Commonwealth from predatory social media platforms.”

“The science is clear that exposure to social media at a young age can have a harmful effect on a minor’s development,” Michlewitz said in a statement following the vote. “By banning it for those 13 and under and allowing for parental consent for those who are 14 or 15, we will ensure that children are protect[ed] while giving them the ability to express themselves online at a safe and appropriate age.”

Gov. Maura Healey has also signaled her intent to file legislation with additional protections for teenage social media users, though she postponed a planned rollout this week.

The legislative road might lead to the same judicial arena where debate is already unfolding about companies’ liability. Trade associations representing social media platforms have challenged other states’ restrictions on First Amendment grounds.

A federal judge blocked enforcement of a Florida law requiring parental consent for 14- and 15-year-olds to use social media — similar to the Massachusetts House proposal — and the case is headed to an appeals court in June, according to the Tallahassee Democrat.

This story has been updated to include a statement from Meta.

Chris Lisinski covers Beacon Hill, transportation and more for CommonWealth Beacon. After growing up in New York and then graduating from Boston University, Chris settled in Massachusetts and spent...