
‘We cannot trust them with our kids’ - Meta slammed by whistleblower

Arturo Bejar, Meta’s former director of engineering for Protect and Care, said that dangerous and inappropriate content could be removed from Meta’s platforms in just six months.
Aoife Daly

14.24 21 Jan 2025




A former Meta employee has said that the company is wilfully failing to protect children from violent and harmful content.

Meta, founded by tech billionaire Mark Zuckerberg, owns and operates Facebook, Instagram, Threads and WhatsApp, among other products and services.

On Newstalk Breakfast, Arturo Bejar, Meta’s former director of engineering for Protect and Care, said that dangerous and inappropriate content could be removed from the company’s platforms in just six months.


“If Mark Zuckerberg woke up tomorrow and said, ‘I want to create the safest environment for teenagers that I can with my company,’ it would probably take around six months to create a feed that’s free of [things] like pornography, violence, self-harm content,” he said.

“[Also] to change messages for kids, [so if] they get unwanted advances, all they need to do is flag it - no reporting - just say, ‘Hey, this is not for me’ - and these changes wouldn’t really affect their bottom line.

“I think it’s a matter of values. They keep demonstrating that we cannot trust them with our kids.”

Meta sign in front of Facebook headquarters on 1 Hacker Way, 28-10-2021. Image: Michael Vi / Alamy Stock Photo

Mr Bejar said that he was directly involved in online moderation when he was with the company, but that many policies have since changed.

“For six years I was the person responsible for all of the different aspects of engineering and product and research to help people deal with harm online,” he said.

“So, there would be things like stopping spam, helping with bullying and harassment, helping people who might be thinking about committing suicide.

“I was a person responsible for the teams working on that, reporting directly to the CTO, and working frequently with Mark Zuckerberg and the executive team.

“I think that the policies that they changed, the reason that they were there in the first place was to protect vulnerable people and so they’re now walking away from all of that.”

Testimony before Congress

Mr Bejar has previously testified before the US Congress about this issue, but said that Meta has gone to ‘extraordinary’ lengths to prevent regulatory legislation from passing.

“In the US, there’s remarkable bi-partisan support for child safety legislation,” he said.

“It passed through the Senate with an overwhelming vote. But literally Meta offered to build a giant multi-billion data centre in the home state of the Speaker of the House in the United States and then the bill didn’t make it to the floor.

“Meta has become this company that is aware of the harm that they enable; that they could prevent, and not only that, but they go to extraordinary lengths to stop other people from making it better.”

According to Mr Bejar, parents should prioritise having open conversations with their children about the content they are viewing online to help them process any disturbing things they might see.

Meta response

A Meta spokesperson responded to Mr Bejar’s claims in a statement:

“There is no change to how we treat content that endangers children or content that encourages suicide, self-injury or eating disorders.

"Our commitment to providing teens with safe, age-appropriate experiences remains the same, and we recently announced that we’re reimagining the Instagram experience for tens of millions of teens with new Teen Accounts, a protected experience, guided by parents, that automatically limits who can contact teens and the content they see.”

Main image: A child looks at a phone. Image: Westend61 GmbH / Alamy Stock Photo
