Social media companies prohibit kids under 13 from signing up because of federal privacy law. But parents like Danielle Hawkins can tell you a different story.
“She got on Instagram and Snapchat without my approval when she was about 12,” Hawkins, a mom of four who lives near Detroit, said of her eldest daughter.
The tech companies are well aware of this problem. Facebook CEO Mark Zuckerberg told a congressional hearing in March that his company knows kids get around the age limits on apps like Instagram, the photo-sharing network Facebook owns.
“There [are] clearly a large number of people under the age of 13 who would want to use a service like Instagram,” he said.
Now, Facebook is working on a solution for underage kids: “We’re exploring having a service for Instagram that allows under 13s on, because we worry that kids may find ways to try to lie and evade some of our systems,” Zuckerberg told lawmakers. “But if we create a safe system that has appropriate parent controls, then we might be able to get people into using that instead.”
The project, which Facebook calls Instagram Youth, would likely give parents the ability to monitor and limit what their kids do on the app. Facebook hasn’t made public any concrete details or timeline, but that hasn’t eased the criticism.
Parents say struggles with social apps start at an early age
Parents say Zuckerberg is right: many kids are going on social media, despite the age-13 limit set by apps like Instagram, Snapchat and TikTok.
Charity White-Voth, a mom in San Diego, said the struggle began long before her daughter’s 13th birthday.
“I was the last holdout of her friends’ parents around Snapchat,” she said. Her daughter told her all her friends were using the app best known for disappearing messages.
“She was not joking,” White-Voth said. “They were on it and they were using it. And I was like, ‘I just don’t feel comfortable. I don’t think it’s the right thing to do.’”
She relented once her daughter turned 13, but she still worries that her daughter is too young to appreciate that what she posts online will be on the internet forever.
“I worry about her being 13, having poor impulse control, hormones are raging … just that inability to think long term,” White-Voth said. “I worry about sending something out that’s inappropriate, that somehow is going to get screenshot by somebody else.”
Another source of unease for many parents is the focus on likes, followers and selfies that is especially pronounced on visual platforms like Instagram, TikTok and Snapchat.
“Body image, who you are, how accepted you are, is a very big part of becoming a teenager,” said Hawkins, the Detroit area mom. “Being able to have people on another side of a screen … tell you who you are or how good you are? You really can’t comprehend what that actually does to the psyche.”
Her oldest daughter, who signed up for Instagram and Snapchat last year at age 12, is no longer allowed to use social media.
“We had to pull the reins on it. We just realized that it really wasn’t beneficial to her education, to her emotional state,” Hawkins said.
Growing concern social media use may be linked to mental health problems
These worries about the role that screen time in general, and social media in particular, play in kids’ wellbeing are well founded, said Blythe Winslow, co-founder of Everyschool.org, a nonprofit that advises schools on how to use technology.
“Kids have more anxiety and depression … Empathy is on the decline. Creativity is on the decline. Suicide rates in kids ages 10 to 14 have tripled” between 2007 and 2017, she said, referring to a 2019 report from the Centers for Disease Control and Prevention.
“Parents fear that social media might be linked to a lot of those problems,” she said.
As the mom of two tween girls, Winslow knows first-hand how hard these choices are for parents.
“My 11-year-old has been gunning for social media probably since she was eight or nine years old,” she said. “Most of her friends have TikTok, and they love TikTok.”
Researchers say the risks to kids from being on platforms where they can interact with adults are urgent. A recent report from the nonprofit Thorn, which builds technology to defend children from online sexual abuse, found more than a third of kids ages 9-12 said they had a “potentially harmful online experience” with someone they believed was 18 or older. Nineteen percent reported having an online sexual interaction with someone they believed to be an adult.
Many are skeptical a social media network just for kids would keep out adults with bad intentions.
“If you build a community for children, adults that really want to get into that will figure out how to get into it as well,” said Julie Cordua, Thorn’s CEO.
These fears are not just unsettling parents. They’re fueling a backlash to Instagram Youth from child safety advocates, members of Congress, and 44 attorneys general, who are urging Facebook to scrap the idea entirely.