A former Facebook engineer turned whistleblower testified Tuesday on Capitol Hill that he warned people at the highest ranks of the social media giant about the dangers its platforms can pose to adolescents but that his concerns were ignored.
What You Need To Know
- A former Facebook engineer turned whistleblower testified Tuesday on Capitol Hill that he warned people at the highest ranks of the social media giant about the dangers its platforms can pose to adolescents but that his concerns were ignored
- Arturo Béjar described to members of the Senate Judiciary Committee’s Privacy, Technology and the Law Subcommittee the seismic shift he witnessed in attitudes toward protecting young people on the platforms after he returned to the company following a four-year absence
- Meta insisted in a statement to Spectrum News that it takes online safety for young users seriously and does not prioritize profits over safety
- There is bipartisan support on the Judiciary Committee for improving child safety online
Arturo Béjar described to members of the Senate Judiciary Committee’s Privacy, Technology and the Law Subcommittee the seismic shift he witnessed in attitudes toward protecting young people on the platforms after he returned to the company following a four-year absence.
From 2009 to 2015, Béjar served as director of engineering on Facebook’s protect and care team, which was tasked with reducing online threats to both children and adults. He said during that stint, when he raised issues about youth safety, CEO Mark Zuckerberg and other executives were supportive and engaged “very practically.”
But Béjar said after he returned as a consultant in 2019, he found a company disinterested in tackling those issues. He said an email he sent to Zuckerberg and others on Oct. 5, 2021, presenting research he conducted detailing harm to adolescents received no reply or follow-up meeting.
Béjar said his findings turned out not to be a revelation within the company. Facebook, whose parent company changed its name to Meta in 2021, had already conducted internal research exposing those problems, he said, but the tech company, which also owns Instagram, addressed them only under a very narrow definition of “harm.”
According to Béjar, an internal survey in 2021 found that one in eight Instagram users ages 13 to 15 said they experienced unwanted sexual advances on the platform within the previous seven days.
Eleven percent of 13- to 15-year-olds reported being either threatened, insulted, disrespected or excluded on Instagram within the previous week. And one in five said they had seen a post in the week before that made them feel worse about themselves.
In between his stints with the company, Béjar’s daughter, then 14 years old, and her friends became victims of unwanted sexual advances and harassment on Instagram, he said. She reported the incidents to the company, but “it did nothing,” Béjar said.
Upon his return, Béjar discovered that most of the tools protecting kids that were adopted during his earlier tenure had been discarded.
“I observed new features being developed in response to public outcry, which were in reality kind of a placebo, a safety feature in name only to placate the press and regulators,” he said.
Béjar said Meta’s research into issues facing adolescents was not followed by adequate action, and he accused the company of presenting “profoundly misleading” data.
Béjar, who left Facebook for the second time in 2021, called social media “one of the most urgent threats to our children today.”
“Social media companies must be required to become more transparent so that parents and the public can hold them accountable,” he said.
“Many have come to accept the false proposition that sexualized content, unwanted advances, bullying, misogyny and other harms are unavoidable evil. This is just not true,” Béjar said. “We don’t tolerate unwanted sexual advances against children in any other public context, and they can similarly be prevented on Facebook, Instagram and other social media products.”
Meta insisted in a statement to Spectrum News that it takes protecting its young users seriously and does not prioritize profits over safety.
“Every day countless people inside and outside of Meta are working on how to help keep young people safe online,” a spokesperson said. “The issues raised here regarding user perception surveys highlight one part of this effort, and surveys like these have led us to create features like anonymous notifications of potentially hurtful content and comment warnings. Working with parents and experts, we have also introduced over 30 tools to support teens and their families in having safe, positive experiences online. All of this work continues.”
There is bipartisan support on the Judiciary Committee for improving child safety online. Several members of the subcommittee complained Tuesday that six related bills they passed have yet to receive floor votes.
Among the bills is the Kids Online Safety Act, which would establish a series of new requirements, including that social media platforms must provide settings that better protect children and enable the strongest settings by default. The legislation also would mandate independent audits to ensure companies are addressing risks to kids.
“We can no longer rely on social media’s mantra ‘trust us,’” said Sen. Richard Blumenthal, D-Conn., chairman of the subcommittee and co-author of the bill.
Some senators pointed the finger at the technology industry’s lobbyists for the inaction in Congress.
Sen. Lindsey Graham, R-S.C., said he plans to return campaign donations from social media companies and called on other lawmakers to do the same.
“Their leverage here is just power over the political system,” Graham said. “So I’m calling on every member of Congress today: Don’t take their money until they change.”
Graham also said he thinks it’s important to end Section 230, the provision of the 1996 Communications Decency Act that shields online platforms from legal liability for content posted by their users.
“Until you open up the courthouse, nothing’s going to change,” he said. “The day you do, you’ll be amazed how many good ideas they knew about they didn’t tell us.”
Béjar is not the first former Meta employee to emerge as a whistleblower. In 2021, Frances Haugen, who worked in the company’s civic integrity unit, leaked to the media internal research showing the harm its platforms can cause young users and later testified before a different Senate subcommittee. She claimed Facebook and Instagram “put their astronomical profits before people.”