
Instagram to Blur Explicit Content for Underage Users


Clear Facts

  • Instagram, owned by Meta, is developing a new feature to blur explicit images sent to users who are under 18 years old.
  • The platform will prompt users to reconsider before sending explicit content, and adult users will be encouraged to turn the safety feature on.
  • This change comes amidst increased public demand for social media companies to improve protections for minors against online threats, including “sextortion”.

Instagram plans to roll out a feature that will automatically blur explicit images sent to underage users via private messaging, according to a recent announcement by parent company Meta.

In addition to blurring these images, Instagram will encourage users to think twice before sharing such content. An alert will also be shown to adult users, encouraging them to turn on this safety precaution.

The feature appears to be a response to growing public concern. Critics have been calling for social media companies to provide more robust protections for children online.

Meta also hopes this function will help protect young users from “sextortion”, a scheme in which criminals share or solicit explicit images and then use them to blackmail their victims.

“While people use DMs to share with their friends, family or favorite creators, sextortion scammers may use private messages to share or ask for explicit images,” according to a blog post by Meta.

With this new feature, any explicit imagery will automatically be blurred, and the user will be asked if they wish to view it.

“We’ll also display a message encouraging users not to feel pressured to respond, with an option to block the sender and report the conversation,” Meta mentioned.

Furthermore, Meta confirmed that it takes “severe action” against users found to be involved in sextortion. Offenders’ accounts are deleted, and steps are taken to prevent them from creating new ones. When warranted, cases are reported to the National Center for Missing and Exploited Children and to local law enforcement agencies.

In an ongoing effort to improve online child safety, Meta announced that it is sharing more data with Lantern, a digital child safety program.

John Shehan, SVP at the National Center for Missing and Exploited Children, said, “Meta’s proposed device-side safety measures within its encrypted environment is encouraging. We are hopeful these new measures will increase reporting by minors and curb the circulation of online child exploitation.”

This move comes after Senate hearings attended by Meta’s founder, Mark Zuckerberg, in January. The hearings included testimonies from families who had been victims of online exploitation.

“I’m sorry for everything you’ve all gone through. Nobody should have to endure what your families have suffered,” Zuckerberg stated. “This is why we have committed to working tirelessly to ensure that no one has to experience the types of things your families have suffered.”

Let us know what you think. Please share your thoughts in the comments below.



2 Comments

  1. OldMan

    April 14, 2024 at 11:13 am

    They got caught promoting pornography, and are updating policy to avoid laws being passed to prosecute them.

  2. M.B.

    April 14, 2024 at 6:40 pm

    I bet they don’t blur anything posted by the 2SLGBTQPIA+ community! 😒
