Instagram is testing new tools to combat sextortion.


Instagram is taking a stand against “sextortion” by introducing new tools to protect users, especially young people, from falling victim to this form of blackmail involving intimate pictures sent online. The platform will start testing these tools within weeks, including a “nudity protection” feature that blurs naked images in direct messages, which will be turned on by default for users under 18.

Governments worldwide have issued warnings about the increasing threat of sextortion, where victims are coerced into sending intimate photos under the threat of having them shared publicly. Recently, two Nigerian men pleaded guilty to sexually extorting teenage boys and young men in the US, leading to tragic consequences.

Instagram’s new nudity protection system, powered by artificial intelligence, automatically detects and blurs nude images in direct messages, letting recipients choose whether to view them. The platform emphasizes that the system does not automatically report nude images to the company; instead, it provides safety tips and reminders about the risks of sharing such content.

In addition to these measures, Instagram will implement features to identify accounts that show signs of sextortion behavior and make it harder for them to interact with other users. The platform says it is committed to combating this “horrific” crime and to supporting those who may have been affected.

This announcement coincides with WhatsApp, another Meta platform, lowering its minimum age in the UK and Europe from 16 to 13. While WhatsApp argues that users, including teenagers, have control over their interactions, critics such as the campaign group Smartphone Free Childhood have raised concerns about exposing children to harmful content.
