THIS MONTH, the Massachusetts House and Gov. Maura Healey rolled out proposals aimed at limiting the impact of social media on young people. They say social media websites and their algorithms are addictive and anxiety-inducing, arguing in favor of age restrictions or requiring parental consent to use sites like Instagram and TikTok.
Tech companies like Meta – the owner of Facebook, Instagram, and WhatsApp – and TikTok have raised concerns about the difficulty of implementing precise age restrictions and pointed to recent changes they’ve made to reduce the pull on teenage users, like screen-time limits and separate versions of their platforms for younger users. But they are under fire in legislatures and in courts across the country for how their platforms target young users and work to keep them scrolling.
Massachusetts may join a growing number of governments here and abroad looking to force stricter rules. Australia rolled out a restrictive policy that took effect at the end of 2025, barring users under 16 years old from having social media accounts. Currently, 17 states have some legal ban or restriction on children’s access to social media. Florida’s version is being challenged on First Amendment grounds.
Proponents say these bills are commonsense steps to reduce young people’s time on addictive sites. But opponents warn that, even if well-intentioned, the legislation would lead to more personal information from children and adults alike left in the hands of social media giants.
What has the Massachusetts House proposed and why?
Both the House and Senate have passed bills banning cellphone use by students during the school day, but the House version also includes a provision banning social media accounts for children 13 and younger, and requiring 14- and 15-year-olds to have a parent or guardian’s permission to sign up for an account.
Under the House bill, a parent or guardian could order the company to terminate the young user’s account. It would also prohibit the sites from sending notifications to 14- and 15-year-olds between midnight and 6 a.m.
Proponents of the bill, which passed in a 129-25 vote, took aim at the addictive features of the social media sites. “This is a matter of protecting our children with regard to public health. It’s a matter of standing up to Big Tech, just as we stood up to Big Tobacco in the past,” Education Committee chair Rep. Ken Gordon said in a speech on the House floor on April 8.
Do social media platforms currently allow children to have accounts?
Under the sites’ existing policies, a user has to be at least 13 years old to use social media, including Facebook, Instagram, Reddit, or TikTok. But this bill goes beyond just raising the age to 14 – it also requires verification of that age, and introduces a new system of parental permissions with data privacy implications of their own.
Some features of the social media sites are available without accounts, so children would theoretically still be able to view those offerings.
Who would be responsible for making sure that users are of age?
The social media companies themselves. The House bill says the companies “shall prohibit” these children from accessing their products. Under the proposed parental permission rules, they would also be responsible for verifying that someone was actually a parent or guardian of a 14- or 15-year-old creating an account.
How would they verify a user’s information?
That is not specifically laid out in the legislation, and the ways verification can intrude into privacy and speech rights of children and adults alike is at the heart of the controversy over this sort of bill. The House bill says, “A social media platform shall implement an age assurance or verification system to determine whether a current or prospective user on the social media platform meets the age requirement” and, vaguely, “to the extent practicable, the age assurance or verification system shall consist of the best technology available to reasonably and accurately identify a current or prospective user’s age.”
This could take several forms. In Australia, age assurance technology involves providing legal documentation or digital IDs or uploading a selfie for artificial intelligence to make an age assessment. Many young users have found ways around these requirements.
Similar methods are at play in states requiring age verification, though the details vary. New York, for instance, tasks its state attorney general with ensuring social media companies provide “age verification methods all New Yorkers can use” and do not “use age verification methods that rely solely on biometrics or require government identification that many New Yorkers do not possess.”
Attorney General Andrea Campbell’s office said Thursday that determining user age is a rapidly developing field that does not necessarily require the use of government ID or biometric data.
Social media companies are already using some of these methods to estimate their users’ ages – like contracting with a third-party service that collects identifying information to verify age and legal relationships without sharing those details with the social media site, or using artificial intelligence that analyzes user behavior. To the extent user data is collected, the AG’s office said, it supports strong privacy-protecting guardrails against misuse by the companies.
How would social media sites verify that a parent or guardian is giving permission for a 14- or 15-year-old to have an account?
This is a thorny question. The methods of verifying ages – like an uploaded selfie scanned by AI or even a driver’s license – do not prove that there is a legal guardianship relationship between an adult and a minor. The federal law aimed at protecting the online privacy of children under 13 lays out a few options for verification, like a consent form sent by fax, mail, or scanner; a credit card transaction; a phone or video call with trained personnel; or providing proof of government ID.
Critics of parental verification laws note that, by implication, there would need to be legal documentation somewhere in the process establishing both the child’s age and the parent or guardian relationship.
Would new age restrictions mean anyone signing up for a social media account has to verify their age?
Yes, this could mean that everyone who wants to have a social media account would have to provide proof of their age – by submitting biometric, algorithmic, or legal identification to the privately owned social media company.
Efforts to impose age restrictions on social media sites – while drawing praise from a number of state governments and the Federal Trade Commission – are facing pushback from free speech and privacy groups. They say there are many ways young people could get around the restrictions, and the verification rules open up a devil’s bargain for adult users who may want to limit how much of their data is in private hands.
“We already know that the online ecosystem is porous, insecure, and routinely subject to data breaches,” Aaron Mackey, deputy legal director of the nonprofit Electronic Frontier Foundation, told NBC News. “So why would we, then, in the name of protecting people, create a whole other legal mandate that requires the collection and storage of even more personally identifying information that would be subject to either data thieves or data breaches?”
Who would be in charge of figuring out the details of the verification system?
According to the House bill, the attorney general would be charged with creating regulations to implement the social media policy by September, and the young user ban and new rules for 14- and 15-year-olds would go into effect on October 1.
“We leave a lot of it up to the AG in terms of regulations on exactly how to actually institute that, because, frankly, this is a moving target related to the social media conversation,” said Rep. Aaron Michlewitz, a top deputy to House Speaker Ron Mariano. “We want to make sure that regulations, as opposed to putting it in statute, will allow that flexibility.”
Has the state Senate taken any steps?
The Senate has not approved any measures restricting youth social media use, nor has it held hearings on such a ban. It did pass a data privacy bill earlier this year, which includes provisions limiting the collection and sale of certain users’ data.
But because the House restrictions were folded into its version of the school cellphone ban, the issue will now be taken up by a conference committee of House and Senate members who will work to reconcile differences between the cellphone ban bills passed by each chamber. Each chamber will then take an up or down vote on the bill that emerges.
What is the governor’s position on this?
In her State of the Commonwealth speech in January, Gov. Healey promised to introduce her own proposal to restrict social media use by young people. “These platforms are built with addictive algorithms and they exploit insecurities, especially in our young people,” she said. “So I am proposing strict new requirements to protect kids and teens on social media. We will require parental consent and age verification on all of these platforms. We’re going to prevent social media companies from targeting kids for profit. Parents are trying to protect their kids, and we’re going to help them do it.”
In the months afterward, she went quiet on the matter until the House unveiled its proposal, after which she rolled out her own legislation.
What did she propose?
At a press conference last week announcing her new supplemental budget proposal, Healey laid out a proposal mostly focused on changing the default settings on social media sites for young users and requiring age verification. “This isn’t a ban, but it is deactivation,” she said.
Under her proposal, for all users under 18, social media companies would be required to automatically deactivate features like infinite scrolling, auto-playing videos, and “addictive algorithms that target young people based on what they privately viewed in the past.”
Healey’s bill also includes parental verification, in that a parent or legal guardian must consent to the modification of these default settings for users 15 or younger, while users 16 or older may alter their own accounts’ default settings.
One provision of the governor’s bill has raised additional alarms for privacy groups. While all age assurance data must be deleted soon after verification under both the governor’s and House’s bills, Healey would require social media companies to post anonymized data on the ages of minors using the platform. This is intended to help the state get a sense of how many young people are spending time on each site.
“The governor’s proposal would require online ID checks for every single person in Massachusetts, not just kids,” said Evan Greer, director of the advocacy group Fight for the Future, which opposes online ID checks, in a statement. “Because the bill requires social media companies to publish precise data on users of different ages, it would force companies to know the exact age of every user, making it necessary to do an ID check as opposed to using an age estimation or age assurance method.”
Are the courts a factor here?
In Massachusetts, neither proposal is being legally challenged, because they are not yet law. But in Florida, a challenge to a similar law is working its way through the courts on First Amendment grounds. Essentially, opponents argue, parents have substantial rights to weigh in on their children’s safety, but even minors have the right to decide what speech they want to share or hear.
Massachusetts’s highest court this month gave a green light to Attorney General Campbell to continue her suit against Meta for Instagram features like infinite scrolling, push notifications, and algorithmic recommendations that the AG claims are addictive and exploitative of young users. The algorithms and system design are active choices by Meta, the court ruled, and the company can’t simply get the suit tossed under Section 230, a federal law designed to shield online platforms from liability for third-party content.
As she continues her crusade against the tech companies’ treatment of young users, Campbell also signaled interest in the possible legislation that would task her office with figuring out how exactly to keep these sites out of more children’s reach.
“It is critical that we hold tech companies accountable for designing social media platforms that keep young people addicted and wreak havoc on their mental health,” Campbell said in a statement. “I am grateful that the Healey-Driscoll Administration and the Legislature are taking this issue seriously, and I look forward to working with all parties on next steps.”
To understand the bigger national and international context, listen to our episode of The Codcast featuring two academic experts unpacking the growing regulatory movement to limit social media use by minors.