Will Lawmakers Really Act to Protect Children Online? Some Say Yes.
In the final minutes of a congressional hearing on Wednesday in which tech chief executives were berated for not protecting children online, Senator Richard J. Durbin, Democrat of Illinois, urged lawmakers to act to safeguard the internet’s youngest users.
“No excuses,” he said.
Lawmakers have long made similar statements about holding tech companies to account — and have little to show for it. Republicans and Democrats alike have at various points declared that it was time to regulate the tech giants over matters such as privacy and antitrust. Yet for years, that was where it ended: with no new federal regulations for the companies to follow.
The question is whether this time will be different. And there are already signs that online child safety may gain more legislative traction than past efforts to regulate the industry.
At least six legislative proposals waiting in the wings in Congress target the spread of child sexual abuse material online and would require platforms such as Instagram, Snapchat and TikTok to do more to protect minors. The efforts are backed by emotional accounts of children who were victimized online and died by suicide.
The only major federal internet law passed in recent years, FOSTA-SESTA (a package combining the Fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act), which made it easier for victims of sex trafficking to sue websites and online platforms, was approved in 2018, also after heart-wrenching testimony from a victim’s mother.
Child safety is a personally relatable and visceral topic that is an easier political sell than some other matters, online safety experts and lawmakers said. At Wednesday’s hearing, confronted with stories of children who had died after sexual exploitation, Mark Zuckerberg of Meta said he was sorry that families had suffered.
“Similar to the tobacco industry, it took a series of embarrassing hearings for tobacco — but finally Congress acted,” said Jim Steyer, president of Common Sense Media, a nonprofit child advocacy group. “The dam finally broke.”
Any legislative progress on online child safety would be a counterpoint to the stasis that has enveloped Congress in recent years on other tech issues. Time and again, proposals for rules to govern tech giants like Google and Meta have failed to become law.
In 2018, for instance, Congress grilled Mr. Zuckerberg about a leak of Facebook user data to Cambridge Analytica, a firm that built voter profiles. The outrage over the incident led to calls for Congress to pass new rules to protect people’s online privacy. But while California and other states eventually approved online privacy laws, Congress has not.
Lawmakers have also attacked a legal provision, Section 230 of the Communications Decency Act, which shields online platforms such as Instagram and TikTok from many lawsuits over content posted by their users. Congress has not substantively changed the provision, beyond making it harder for the platforms to employ the legal shield when they are accused of meaningfully aiding sex trafficking.
And after companies like Amazon and Apple were accused of being monopolies and abusing their power over smaller rivals, lawmakers proposed a bill to make some of their business practices illegal. An effort to get the legislation over the finish line failed in 2022.
Senators Amy Klobuchar, Democrat of Minnesota, and Josh Hawley, Republican of Missouri, as well as other lawmakers, have blamed the power of tech lobbyists for killing proposed rules. Others have said tech regulations haven’t been a priority for congressional leaders, who have focused on spending bills and measures meant to subsidize American companies that make crucial computer chips and harness renewable energy.
The Senate Judiciary Committee, which hosted Wednesday’s hearing, talked up five child safety bills directed at the tech platforms ahead of the hearing. The committee passed the bills last year; none have become law.
Among the proposals were the STOP CSAM Act (the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act), which would give victims new avenues to report child sexual abuse material to internet companies, and the REPORT Act (the Revising Existing Procedures On Reporting via Technology Act), which would expand the types of potential crimes online platforms are required to report to the National Center for Missing and Exploited Children.
Other proposals would make it a crime to distribute an intimate image of someone without that person’s consent and would push law enforcement to coordinate investigations into crimes against children.
A separate proposal passed last year by the Senate Commerce Committee, the Kids Online Safety Act, would create a legal duty for certain online platforms to protect children. Some of the legislative proposals have been criticized by digital rights groups like the Electronic Frontier Foundation, which say they could encourage the platforms to take down legitimate content while the companies attempt to comply with the laws.
Ms. Klobuchar, who questioned the tech executives at Wednesday’s hearing, said in an interview that the session “felt like a breakthrough.” She added, “As someone who has taken on these companies for years, it’s the first time I felt hope for movement.”
Others were skeptical. For any proposals to pass, they will need support from congressional leaders. Bills that were passed by committee last year will need to be reintroduced and go through that process again.
Hany Farid, a professor at the University of California, Berkeley, who helped create technology used by platforms to detect child sexual abuse material, said he had watched Congress hold hearing after hearing about protecting children online.
“This is one thing that we should be able to agree on: that we have a responsibility to protect kids,” he said. “If we can’t get this right, what hope do we have for anything else?”