Snapchat is rolling out parental controls that let parents see their teen’s contacts and, without their child’s knowledge, report to the company any account that worries them.
The goal, according to company executives, is to let parents monitor their child’s connections without compromising teens’ autonomy. Named Family Center, the new suite of tools released Tuesday requires both the caregiver and the teen to opt in.
“It lets parents see who’s in their teen’s world,” said Nona Farahnik, director of platform policy for Snap, the company that makes Snapchat. “It gives parents the ability to ask who someone might be, how they might know a contact, which sparks these kinds of real-time conversations about who teens are talking to.”
Farahnik says Family Center is modeled after real-life parenting.
“If your teen is going to the mall, you might ask who they’re going with. ‘How do you know them? Do you play on a sports team together? Do you go to school together?'” said Farahnik. “But you wouldn’t be sitting there at the mall with them listening to their conversations.”
Likewise, parents cannot see the content their teen sends or receives on Snapchat. They can only see who their child has communicated with in the last seven days. Snapchat is popular with young people, in part because messages on the platform disappear within 24 hours.
The company says it consulted with security experts and academics and held focus groups with parents to develop Family Center and plans to roll out more features in the coming months. The tool is reserved for parents of children under the age of 18.
With Family Center, Snap follows other social media platforms, including Instagram, that have recently tightened parental controls. According to at least one survey, Snapchat is the second most popular social network among teens. The first, TikTok, offers “Family Pairing,” which gives parents a few ways to limit the videos shown to their children.
“I think these platforms want to show that they can take action to protect children, that they can self-regulate and that they are able to do it themselves without involving the government,” said Irene Ly, policy advisor for Common Sense Media, which reviews apps, games and media for families.
Bipartisan legislation in Congress would require more sweeping changes aimed at protecting children on social media, but lawmakers have yet to vote on the measures.
Advocate: Social media networks should be ‘safer by design’ for children
Parental controls can be helpful for some families, says Josh Golin, executive director of Fairplay, an advocacy group focused on improving online safety for children. But they require parents to have the time, energy and commitment to understand social media tools and use them regularly.
“Are you going to spend 20 minutes a day figuring out what’s going on on Snap and another 20 minutes on TikTok and another 20 on Instagram?” he said. “I don’t think parents particularly want to spend their time this way. What they would prefer to see is these platforms taking real steps to be safer by design.”
For example, Golin says, it should be easier for kids to put down their phones and take a break from social media.
“As a 12-year-old you might be like, ‘Oh my God, my life is going to be over if I don’t communicate with my friend today on Snapchat,'” Golin said. “I don’t think we should be giving kids rewards and badges and things for using online platforms more. It doesn’t encourage intentional and thoughtful use. I think it promotes compulsion and only benefits the company.”
Snap’s terms of service require users to declare that they are 13 or older before signing up for the service. Snap says it screens for underage users in compliance with the Children’s Online Privacy Protection Act.
“We already have millions of young people on Snap, including millions under the age of 13 who shouldn’t even be there in the first place,” Golin said.
He says companies could better verify the age of their users, rather than taking users at their word.
Ly of Common Sense says companies could also examine how their algorithms amplify content that could be harmful to children.
For example, Ly said, a child might interact with a post that encourages healthy eating as part of a fitness routine. But algorithms built to show users more of what they engage with could quickly lead that child down a rabbit hole of misinformation about eating disorders or other harmful eating habits.