- Meta, TikTok, X, Snap, and Discord CEOs were grilled by the Senate on child exploitation and harm.
- Some of the CEOs, like Mark Zuckerberg, have sat in that same seat several times — but little has changed.
- This time, there’s real legislation in the works, and bipartisan momentum to make actual changes.
Sen. Josh Hawley has taken some extreme stances, outside even mainstream conservative norms, and is a controversial figure in the Senate. But his consistent criticism of some of the real problems in Big Tech has sometimes reminded me of a classic Clickhole headline: “Heartbreaking: The Worst Person You Know Just Made a Great Point.”
On Wednesday, during a Senate hearing on child exploitation on social media, Hawley, a Republican from Missouri, pulled off one of those moments: He badgered Meta CEO Mark Zuckerberg into standing up, turning to face the room full of parents holding up photos of their teenagers who had died from suicide or were otherwise harmed after being exploited online — and forced him to apologize.
It was theatrics, to be sure, but it was genuinely a powerful moment.
Why this time could be different
I’ve watched a bunch of these kinds of congressional hearings, where tech CEOs swap the hoodie for a tie and sit before lawmakers who take an opportunity to make a big show of trying to flay the CEOs over whatever the hot scandal of the moment is: the Twitter Files, Russian election interference, alleged anti-conservative bias, etc.
Rarely does Congress land any real blows. Often, it’s a complete whiff, and lawmakers embarrass themselves with their lack of understanding of technology — like a congressman asking Sundar Pichai why an unflattering article about him appeared on his iPhone, to which the Google CEO replied, “Congressman, the iPhone is made by a different company.”
The best example came courtesy of Zuckerberg himself, who in 2018 was grilled over Facebook’s data-collection practices. Utah Sen. Orrin Hatch, then 84, asked how Facebook could be free to users. Zuckerberg blinked and said, “Senator, we run ads.”
This time, things were different.
Tension over child exploitation and harm to teenagers’ well-being has been building for a while, punctuated by The Wall Street Journal’s bombshell report that Meta had willfully ignored its own internal research on Instagram’s effects on teen girls.
In 2023, 33 states banded together to sue Meta over what they say are its damaging effects on teens. Some states, like Florida, are working on laws that would restrict teens’ access to social media apps.
Shocking and sad stories about sextortion — young people who are tricked into producing explicit images and then are extorted for money, often by foreign actors — have been in the news, and parents of child victims packed the audience of the hearing Wednesday.
The Kids Online Safety Act has bipartisan support
There is some real momentum to attempt to pass legislation on this issue, or to finally create a dedicated federal regulating body for social media. And unlike other hearings with tech CEOs that have devolved into things like arguing about political bias in content moderation, this is a largely bipartisan topic: Everyone wants to stop child exploitation.
The problem is that the legislation under discussion is far from perfect. The proposed Kids Online Safety Act, or KOSA, would give parents more control over their kids’ accounts and would require platforms to enforce stricter privacy settings for teens. That sounds great, but the bill would also require platforms to prevent teens from accessing content deemed harmful to them.
Organizations like the ACLU and the Electronic Frontier Foundation oppose KOSA because they say it could go too far. For example, it could be interpreted to require platforms to stop kids from seeing content about trans issues or even discussing racism. (The EFF points out that Sen. Marsha Blackburn, Republican of Tennessee, one of the bill’s co-sponsors, has previously said that critical race theory is harmful to kids.)
Wednesday’s hearing also featured a steady drumbeat of calls to dismantle or reform Section 230, the 1996 law that generally shields internet platforms from liability for the content their users post. Section 230 is controversial and has become a rallying point for some conservatives and others who think it lets Big Tech hide from criticism that it isn’t fair to all sides.
Unlike past attacks on the law, though, Wednesday’s calls to repeal it mostly appealed to the heartstrings: Getting rid of it could let grieving parents sue tech companies over the deaths or exploitation of their children.
Still, the safety of children online is a tough issue
Child exploitation and harm isn’t an easy problem to solve. Clearly, the big platforms haven’t succeeded, and their failures are sometimes shaped by internal factors, including profit.
But if there were some magic bullet to stop child exploitation, I truly believe the companies would’ve used it by now. Whether lawmakers’ plans to regulate and fix this will work is anyone’s guess.
But there’s now enough wind behind this issue to make something happen.