Instagram Targets Teen Sextortion Scammers With Nude Image Filter For Direct Messages


Topline

Instagram and Facebook parent Meta on Thursday unveiled an array of new tools and safety features to protect teens from unwanted nude images and predators hoping to blackmail them with explicit images, as concerns grow about the welfare of younger users on social media amid police warnings over rising cases of “sextortion” online.

Key Facts

Meta said Instagram will soon begin testing features to protect users from financial sextortion, a term for a growing category of blackmail scams, often targeting children and teens, in which predators solicit sexual material and then extort the victim for money by threatening to post or share it online.

To address the issue, as well as the well-worn specter of receiving unsolicited nudes, Meta said it will soon roll out a nudity filter for direct messages, or DMs, on Instagram, a venue it acknowledged people sometimes use to share or ask for intimate images, scammers included.

The feature will automatically detect nude images being sent and blur them, encouraging “people to think twice before sending” them, telling recipients not to “feel pressured to respond” and reminding senders that the images can be unsent if they change their mind.

The filter, which users can choose to remove in order to view the image, will also point users toward resources and tips for sending sensitive images online, including the risk that others may share or screenshot them without consent and that the person on the other end may not be who they claim to be.

The feature will be “turned on by default” for teens under 18 globally, Meta said, and adults will get a notification “encouraging them to turn it on.”

In Meta’s announcement, Sameer Hinduja, co-director of the Cyberbullying Research Center and faculty associate at the Berkman Klein Center at Harvard University, said the feature approached the issue of nude content online in a “thoughtful, nuanced and appropriate way,” reducing exposure to “potentially traumatic images” and educating people about the potential downsides.

Does This Mean Instagram Can See My Nudes?

While image analysis like Meta’s nudity filter clearly requires access to the images, Meta said the Instagram filter will run using machine learning, a type of artificial intelligence, directly on users’ devices. As such, the company said it “won’t have access to these images” unless users choose to report them.
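Meta has not published implementation details, but the on-device flow it describes resembles the sketch below: a classifier runs locally on the phone, and only the blur-and-warn decision, never the image itself, results from the check. The model stub, threshold and function names here are hypothetical placeholders for illustration, not Meta’s actual code.

```python
from dataclasses import dataclass

# Hypothetical on-device pipeline: image bytes stay in local memory and no
# network call is made, matching Meta's claim that it "won't have access to
# these images" unless the user reports them.

NUDITY_THRESHOLD = 0.8  # assumed cutoff, not a published Meta value


@dataclass
class DMImage:
    pixels: bytes           # raw image data, kept on the device
    blurred: bool = False
    warning_shown: bool = False


def run_local_classifier(pixels: bytes) -> float:
    """Placeholder for an on-device ML model returning a nudity score in [0, 1]."""
    # A real implementation would run a compact, quantized vision model here.
    return 0.0  # stub value


def process_incoming_dm(image: DMImage) -> DMImage:
    """Blur the preview and show a warning locally when the classifier flags likely nudity."""
    score = run_local_classifier(image.pixels)
    if score >= NUDITY_THRESHOLD:
        image.blurred = True        # recipient sees a blurred preview they can choose to reveal
        image.warning_shown = True  # "you don't have to respond" style notice
    return image


if __name__ == "__main__":
    dm = process_incoming_dm(DMImage(pixels=b"\x00" * 16))
    print(dm.blurred, dm.warning_shown)
```

The key design point this sketch illustrates is that the decision is computed where the image already is, so no copy needs to reach Meta’s servers for the filter to work.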

Crucial Quote

“Companies have a responsibility to ensure the protection of minors who use their platforms,” said John Shehan, senior vice president at the National Center for Missing & Exploited Children, in the company’s press release. “We are hopeful these new measures will increase reporting by minors and curb the circulation of online child exploitation.”

Key Background

Police around the world have been warning of a rise in predatory behavior and sextortion targeting children, teens and young adults on social media. The consequences of such crimes can be devastating, even fatal, and a growing number of young people have lost their lives as a result of sextortion schemes. The FBI recently warned about an increase in financial sextortion by strangers online, typically targeting teen males ages 14 to 17. The agency said victims “often feel alone, embarrassed, and afraid to seek help,” adding that “there have been an alarming number of suicides identified in male victims of financially motivated sextortion schemes.” The FBI said it is important for victims to know they “are not alone” and to “ask a trusted adult for help” if targeted.

Further Reading

The Guardian: Two charged in Nigeria over alleged sextortion that led to Australian teenager's death