After a nationwide tour engaging with parents, Meta brought its Screen Smart series to San Francisco last month, returning to its Bay Area home to unveil new features aimed at protecting teens from social media risks.
The event drew 58 parents, including 12 representatives from parent-teacher associations across the region.
“We bring parents together to talk about Meta-specific features that we’re developing to support parents and teens, so they have safe and age-appropriate experiences online,” said Nicole Lopez, director of youth safety policy for Meta. “What we keep hearing is the same thing: parents are concerned about the contact their teen might have online, the content their teen might see, and how much time their teen spends online.”
Meta recently launched accounts tailored for teenagers, a step aimed at enhancing safety for younger users amid growing concern about the mental health risks associated with social media. The move comes as tech companies face increasing scrutiny over their platforms’ impact on teens’ well-being.
Australia recently passed a law banning social media for users under 16, the Associated Press reported. The policy imposes fines of up to 50 million Australian dollars (about $33 million) on platforms such as Facebook, Instagram, TikTok, Snapchat, Reddit, and X that fail to block underage users.
In response to similar concerns, Meta has introduced new protective features for Instagram teen accounts. These include private-by-default profiles, restricted messaging, sensitive content controls, limits on unwanted interactions, and tools like sleep mode and daily time-limit reminders to promote healthier app use. Teens under 16 need parental permission to loosen these protections, while older teens have more flexibility unless their account is supervised.
Parents can also view who their teen has been messaging and the topics they’re exploring, fostering greater safety and transparency. These updates are being rolled out globally.
In San Mateo County, home to Meta’s headquarters, Supervisor David Canepa has joined a growing chorus, including attorneys general from 42 states such as California’s Rob Bonta, calling for warning labels on social media platforms.
Canepa called Meta’s new safety features on Instagram a positive step but insisted that more needs to be done.
“Since Congress has not mandated warning labels on social media apps, as urged by U.S. Surgeon General Vivek Murthy, it falls to voluntary efforts like the one Meta-owned Instagram is taking with these teen accounts to protect children from harm,” Canepa said. “However, only time and data will reveal whether these new restrictions can truly prevent online bullying and harm to children. In the meantime, I believe social media platforms should carry U.S. Surgeon General warnings, just as Big Tobacco does.”
While Lopez did not comment on whether warning labels are under consideration, she emphasized that Meta is actively listening to parents as the company addresses concerns about children’s use of social media.
“We visit each group, listen to their concerns, and consistently hear the same issues,” Lopez said. “That’s why we developed teen accounts to address those recurring concerns.”
For Gina Lee, a San Jose-based lifestyle content creator, the discussion felt timely.
Although her children are not yet old enough for social media, she wanted to prepare for conversations about the responsibilities and risks that come with it.
“It’s very important to have the education and information in this digital world we live in,” Lee said. “I want to build these conversations as my daughter grows because this is the world we live in.”
Lee appreciated Meta’s efforts to make monitoring easier for parents, especially when it comes to managing screen time.
“It’s easy for parents to see, so they’re not spending a lot of time doing it,” Lee said. “It’s nice that those kinds of settings are already automatically in place when your teen sets up an account.”