Instagram is working on a new safety feature that will protect its users from receiving sexually explicit photos in their inboxes. According to reports, the "Nudity Protection" feature is designed to curb a form of online harassment known as "cyberflashing," in which strangers, often women, are sent unsolicited sexual images.

App researcher Alessandro Paluzzi shared an early look at the feature on Twitter, posting a screenshot of it in a tweet and writing, "Instagram is working on nudity protection for chats. This technology covers photos that may contain nudity in chats. Instagram CAN'T access photos."

Cyberflashing incidents have reportedly risen sharply in recent days. The new feature will attempt to address this threat by empowering users to automatically filter direct message requests that contain offensive content. According to the report, Meta will use machine learning to help people protect themselves from nude photos and other unwanted messages, with detection running in a way that keeps the photos inaccessible to Instagram itself.
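The report does not describe the implementation, but a feature like this is typically built around an on-device classifier that scores each incoming image and hides flagged ones behind a warning, so the photo never has to leave the user's device. The sketch below illustrates that general pattern; the names (`MessageRequest`, `nudity_score`, the 0.8 threshold) are hypothetical, not Instagram's actual API.

```python
from dataclasses import dataclass

# Illustrative cutoff for hiding an image behind a warning; not a real Instagram value.
NUDITY_THRESHOLD = 0.8


@dataclass
class MessageRequest:
    """A simplified stand-in for an incoming DM request with an attached photo."""
    sender: str
    image_bytes: bytes


def nudity_score(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML classifier returning a score in [0, 1].

    A real implementation would run a trained image model locally; this toy
    heuristic exists only so the sketch is runnable.
    """
    return 0.9 if image_bytes.startswith(b"explicit") else 0.1


def filter_requests(requests):
    """Split requests into ones shown normally and ones hidden behind a cover.

    Scoring happens entirely on-device, so flagged photos can be covered
    before the recipient sees them and without uploading them for review.
    """
    visible, hidden = [], []
    for req in requests:
        if nudity_score(req.image_bytes) >= NUDITY_THRESHOLD:
            hidden.append(req)
        else:
            visible.append(req)
    return visible, hidden
```

In this design the threshold trades false positives against missed detections; a product team would tune it against a labeled dataset and likely let users opt out, as the reported feature is described as user-controlled.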