Will Biden’s new online safety guidelines really help protect our kids? Parents and experts weigh in
The Kids Online Health and Safety Task Force just released parental and industry guidelines for online health and safety. Is it enough?
While 86% of parents have expressed concern over the online safety of their kids, according to a recent State of the Youth survey, many don’t know what to do about it—and about one in three wish they had more resources to help them figure it out.
This week, the White House heeded that call. Noting that approximately 95% of teenagers and 40% of children between ages 8 and 12 use some form of social media—which the U.S. surgeon general and others have linked to the ongoing youth mental-health crisis—the Kids Online Health and Safety Task Force released a massive report of best practices for both families and the industry. It called for more industry accountability and for Congress to pass the Kids Online Safety Act (KOSA), which would be the first law aimed at protecting children online since 1998—long before smartphones or TikTok even existed.
KOSA—which is endorsed by groups including the American Academy of Pediatrics, the American Federation of Teachers, and Common Sense Media (and opposed by Meta Platforms)—appears to have enough bipartisan support to pass, and cleared a major Senate hurdle on Thursday.
The bill, expected to come up for a vote on July 30, would set a “duty of care” standard for social media companies with users who are minors—something that opponents, including the American Civil Liberties Union, worry would censor important LGBTQ, reproductive, and sexual health information for youth. It would also require such platforms to offer minors options to protect their information and to disable addictive features by default.
In the meantime, the report’s best practices for parents—which include building a family media plan, discussing social media with their children, and being intentional about creating screen-free time—are available right now (pages 19–24) for families to lean on.
But do these best practices—and KOSA itself, for that matter—go far enough? Fortune asked industry experts and psychologists to weigh in.
Danny Weiss, Chief Advocacy Officer for Common Sense Media (part of the Kids Online Health and Safety Task Force)
“KOSA is a compromise, there’s no question about it. But it’s a needed bill, and it’s still a very good bill,” he says, adding that it will be helpful “once you work things out in the courts. Because obviously tech companies, every time a bill is passed, I assume they will sue in federal court. There’s sort of no way around that.”
That could tie it up for some time, he explains, “because the Supreme Court hasn’t ruled yet on whether or not moderating content is a First Amendment violation or if it’s an obligation of companies. Our view is it’s an obligation of companies to moderate content. It’s not a First Amendment right for them to do whatever they want with no controls.”
Regarding the parental guidance in the report, he says: “Let’s be candid: The average parent will not see this report. I do think they did an excellent job collecting very good recommendations … but it’s up to other organizations and individuals and pediatricians to talk about this. That’s a big thing that we do at Common Sense Media,” especially through its Digital Citizenship Curriculum, which is used by 1.3 million teachers in 87,000 U.S. schools and offers lesson plans to help kids build a healthier relationship with technology.
Parents who do look at the report might find its length and breadth overwhelming, which is why Weiss suggests picking a single recommendation and starting there: simply engaging with your kid about the content they’re consuming. “I would very casually just say, ‘Hey, what are you laughing at? What are you frowning at? Tell me about it.’ Create that opening so that your kid knows that you’re not judging them, but that you care about them.”
When the task force released its guidance earlier this week, Weiss attended the event, which featured a panel of three teens. “One of them just said, ‘Look, we’re not perfect. We know that we use our stuff too much. But we know that our brains are still developing, and we don’t have the wherewithal, frankly, to overcome the allure of technology,’” he says. “And so they want solutions.”
Hari Ravichandran, CEO of online safety app Aura and founder of Digital Parenthood
“I’m a dad. I have four kids. We’ve had personal struggles with one of our kids, kind of related to mental health issues. And as [our family has] sort of dug into it, we do see that there is some very clear correlation … between invested time on devices and using social media platforms, especially when she feels down.”
Regarding the bill, he says, “I think keeping Big Tech a bit more accountable is really good because there is a general conflict of interest between the economic incentives for large technology companies versus safety. Putting a bit more pressure on there, I think, is a very positive thing.”
One of the report’s suggested best practices is to initiate honest conversations about social media and its effects. Important as that may be, he says, it can prove difficult with teens. “This is literally a daily battle with my older kids. Initially, she would be very straightforward about it. Now, when she knows that there’s a consequence, the narrative kind of changes quite a bit. There’s a bit more hiding of stuff, so I think it’s a continuous cat and mouse game.”
The report lays out its concept of “5 Cs” for parental guidance—child (know your child’s temperament), content (monitor its quality), calm (direct your child away from technology for self-soothing), crowding out (encourage less time online and more outside), and communication (help build digital media literacy)—which “resonates really well for me,” Ravichandran says. “I think it’s a really nice framework … But it would be helpful to parents if we had some help on our side, which is policy, right?”
So between the report and the bill together, he says, “I think it’s a start.”
Dane Witbeck, Founder and CEO of Pinwheel kid-safe smartphones
“I’m really appreciative of the attention to the problem. I think the problem is really massive, and social media is certainly affecting our kids and youth negatively right now. Unfortunately, I don’t think that this bill has the right approach. I believe there are better ideas out there than this, and I think that this is quite vague in its prescriptions. And I think that we will get a lot of court battles over this for years to come.”
Corporate lawyers will find lots of ways to attack this bill, Witbeck adds, particularly around the idea of age verification being inherently required. “They’ve taken out the explicit requirement of age verification, but the implicit requirement is still there because the ‘duty of care’ that’s established by the bill essentially means that these platforms are responsible for what kids consume,” he says. “And the only way to determine if someone is a kid is really to get some sort of age verification completed.”
A better approach, Witbeck believes, would have been something Pinwheel has advocated for: “content nutritional labels.” Back in the early part of the 20th century, he explains, “there was a real problem with food quality, even safety, in this country. There were kids that died from drinking bad milk, and there was a huge campaign about the safety of our food. And so I think government-mandated nutritional labels need to be applied to apps, saying, ‘These are the types of things that are in this app, and these are the research-backed consequences.’”
Alfiee Breland-Noble, psychologist and founder of the AAKOMA Project
“I have long been an advocate for the safety and emotional well-being particularly of racially, culturally, and otherwise diverse youth. I served on Meta’s online youth safety advisory group … providing best practices and guidance regarding a highly unaddressed group, intersectional youth of color.”
What Breland-Noble likes best about the new guidelines, she says, “is that they explicitly name the harms faced by youth of color in the online space as a critical area of future research focus,” though she wishes they had paid more attention to the unique potential online harms for LGBTQ youth of color, as well as for those with disabilities. She points to the AAKOMA Project’s 2023 brief, “Representation Matters: Exploring Social Media Experiences and Mental Health of Youth and Young Adults of Color,” as a critical resource in that regard.
Jennifer Kelman, LCSW, Parenting & Mental Health Expert at JustAnswer
“One thing that I’m consistently amazed about is the fact that parents are reaching out, exasperated, without an idea of what to do, but also without the skill or the desire or the conviction to put restrictions in place—even while knowing that they’re seeing the harm of more anxiety, more depression, addiction to the device, addiction to social media apps. And yet they feel powerless or afraid to put some boundaries around it.”
She believes it started when our kids were young, “where the digital babysitter was so helpful to us,” and so now “we have that teenager who is brought up with it—not only to soothe us, but to soothe them. We cultivated this addiction, and now we’re in trouble. We’re behind the eight ball.”
The recommendations, as well as the law, Kelman says, “should have been in place as we started to grow with online access and social media … So of course, I’m in support of it, but at the same time, it shouldn’t be a Band-Aid in the sense that parents will just be like, ‘Oh, it’s okay, because now there are restrictions.’ Parents should still put guardrails in place … Social media companies are designed to hook our children. So if we can prevent that with this ‘duty of care’ stuff, let’s put it out there. But let’s not forget our job as parents to give our kids access in a way that is safe and in their best interest.”