Instagram will roll out a new safety feature that blurs nudes in messages sent to minors in an attempt to mitigate abuse and sexually exploitative scams, says Meta


Meta, the parent company of Instagram, has announced that it is preparing to introduce a new safety feature on the platform.

This feature aims to protect minors from abuse and sexually exploitative scams by blurring nude images in direct messages.

The feature, announced on Thursday, will automatically blur images that are detected to contain nudity. It will also discourage users from sending such images.

For teenage Instagram users, the feature will be enabled by default based on the birthday information provided on their account.

Adult users will receive a notification encouraging them to activate the feature as well.

The move responds to longstanding criticism that Facebook and Instagram harm their youngest users, as reported by The Verge.

Such criticisms include concerns about the platforms' negative impact on children's mental health and body image, as well as accusations that they knowingly hosted abusive content and facilitated a "marketplace for predators in search of children."

By implementing this new safety feature, Meta aims to enhance the protection of minors on the Instagram platform and mitigate potential risks associated with the sharing of explicit content.

The new feature will be tested in the coming weeks, according to The Wall Street Journal, with a global rollout expected over the following months.

Meta says the feature uses on-device machine learning to analyze whether an image sent via Instagram’s direct messaging service contains nudity, and that the company won’t have access to these images unless they’ve been reported.

When the protection is enabled, Instagram users who receive nude photographs will be presented with a message telling them not to feel pressured to respond, alongside options to block and report the sender.

“This feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Meta said in its announcement.

In addition to blurring nude images in direct messages, Meta is implementing further measures to enhance safety on its platforms, particularly for children.

Users who attempt to send a nude image via direct message will receive a warning message about the potential risks associated with sharing sensitive photos.

Similarly, users who try to forward a nude image they have received will be discouraged by another warning message.

These efforts are part of Meta's ongoing initiatives to strengthen protections for children on its platforms.

In February 2023, Meta supported a tool designed to remove sexually explicit images of minors from its platforms.

Earlier this year, it also implemented restrictions limiting children's access to harmful topics such as suicide, self-harm, and eating disorders.

By incorporating these warnings and safety precautions, Meta aims to raise users' awareness of the potential consequences of sharing sensitive content and to create a safer environment, particularly for children, on its platforms.

* Stories are edited and translated by Info3 *