Meta announced Thursday it is testing new features on Instagram and its other apps to protect teens from sexual image abuse and from being preyed upon by scammers and criminals.

What You Need To Know

  • Meta is testing a variety of features to protect teens from sexual image abuse on its apps

  • Instagram users under the age of 18 will be defaulted into a feature that automatically blurs images containing nudity in direct messages

  • Teens will also be defaulted into settings that won't allow them to be messaged by anyone they aren't already in contact with

  • Users will be able to unsend photos

A feature that automatically blurs images containing nudity in Instagram direct messages for people under the age of 18 is part of a new on-device nudity protection system. Teens are also defaulted into settings that won’t allow them to be messaged by anyone they aren’t already connected to, the company said in a statement posted on its website.

Teens who may have come in contact with a potential scam account will be shown safety notices and given information about how to report direct messages that threaten to share private images.

“While people overwhelmingly use DMs to share what they love with their friends, family or favorite creators, sextortion scammers may also use private messages to share or ask for intimate images,” the company said.

Sextortion is the practice of extorting money or sexual favors from someone by threatening to reveal evidence of their sexual activity, according to Oxford Languages. Between October 2021 and March 2023, at least 12,600 minors were victims of financial sextortion, according to the Federal Bureau of Investigation and Homeland Security Investigations. Most of them were boys.

Sextortion scammers usually target teenagers by pretending to be around the same age as the victim, according to the American Academy of Pediatrics. After indicating they are interested in a relationship, the scammer sends an explicit image and asks for one in return. If a victim sends the image, the scammer asks for even more explicit photos or money and may even hack their personal devices.

ParentsTogether, a nonprofit that advocates for kids’ internet safety, criticized Meta’s new tools.

“The FBI, international agencies and parents have been sounding the alarm on Meta as a cesspool of online child sexual exploitation for years, but the company has done little more than put PR Band-Aids on gaping wounds,” the group’s campaign director, Shelby Knox, said in a statement. “Meta has shown zero willingness to engage with parents whose kids have died due to sextortion on Meta platforms, so it’s hard not to see this announcement as another cynical attempt to do just enough to avoid growing calls for state and federal legislators to regulate Big Tech.”

The new nudity protection system Meta is testing urges Instagram users to use caution when sending nude pictures and “is designed to not only protect people from seeing unwanted nudity in their DMs but also to protect them from scammers who may send nude images to trick people into sending their own images in return.”

The new feature also allows users to unsend photos. People who have received a nude image and try to forward it will be shown a message encouraging them to be responsible and respectful.

“Sharing someone’s sensitive photos may go against our Community Guidelines or be illegal,” one such message reads. “It can be difficult to control what happens to a photo once you share it. People may use photos to harm the person in them.”

Underage individuals who receive a nude photo will see a blurred image and encounter a screen that reads, “Don’t feel pressured to respond. You can stop conversations that make you feel uncomfortable, even if you’ve chatted before.”

They are also given the option to view safety tips, including reminders that people can screenshot or forward images without the sender’s knowledge, that the recipient’s relationship to them may change in the future, and that they should review profiles carefully to verify people are who they say they are.

Meta said it is developing technology to identify accounts that may be engaging in sextortion scams, based on behavioral signals associated with such activity. Messages from accounts Meta flags as potential sextortion scammers will go directly to the recipient’s hidden requests folder, so the recipient will not be notified.

Meta noted that adults are already restricted from starting direct message chats with teens they are not connected to. The company is also beginning to test a system that hides teens from potential sextortion accounts and removes the “message” button from teens’ profiles when viewed by those accounts.

Teens who have engaged with sextortion scammers whose accounts have since been removed will receive pop-up messages directing them to resources from victim support groups, including Take It Down.

The new anti-sextortion features come almost three years after former Facebook employee Frances Haugen blew the whistle on the company, calling attention to Instagram’s impact on children.