Technology
Danish Kapoor

Instagram rolls out new measures to protect young people from sexual blackmail

Instagram is introducing a series of new features to protect young people from sexual blackmail scams. In sextortion scams, criminals threaten to release a victim's private and sensitive images unless they are paid money or sent more photos. The new measures aim to make the platform a safer environment against malicious activity targeting young users.

One upcoming security measure will prevent screenshots and screen recordings of disappearing photos or videos sent in private messages. Even if the sender has allowed these images to be replayed, Instagram will block them from being opened in a web browser. However, this feature cannot fully prevent fraudsters from capturing the images or videos with another device.

Instagram will also use new methods to detect signs of fraud. For example, it will flag suspicious behavior based on indicators such as how new an account is. The platform will block such accounts from sending follow requests to young people, or will route those requests to the young user's spam folder.

Instagram is also testing a new notification system against false location claims, one of the tactics frequently used by sexual blackmail scammers. Instagram and Messenger will now show young users a warning that the person they are talking to is in a different country. This should help expose scammers who lie about their location sooner.

In addition, Instagram will restrict suspicious accounts from accessing victims' follower and following lists, which sexual blackmail scammers can use to learn about a victim's social circle. Instagram will also block suspicious accounts from viewing the posts a target has liked, the photos they are tagged in, and the other users tagged in those photos.

To protect young people from sexually explicit content, Instagram is launching a new feature that will automatically blur nude images. This filter, which started testing in April, will be enabled by default worldwide for users under the age of 18. This measure aims to minimize the possibility of young people encountering disturbing content.

The platform will also offer an option that lets users contact the Crisis Text Line in the US when sexual extortion or child safety issues are reported. Educational videos will also be shown to young people in the US, UK, Canada and Australia to raise awareness of sextortion scams. According to FBI data, sexual blackmail cases rose by at least 20 percent between October 2022 and March 2023.

To combat these scams, Meta has removed 800 Facebook groups and 820 accounts from its platforms in recent months. These accounts were involved in sexual extortion scams linked to the Yahoo Boys, a Nigerian cybercrime gang. In July, Meta shut down more than 63,000 Nigeria-based Instagram accounts.


Instagram is not the only platform taking precautions against sexual blackmail scams; others, such as Snapchat, have also developed systems that alert teens when they receive messages from accounts that other users have reported or blocked.

Instagram is taking other steps to help young users feel safe

These measures are part of Meta's broader effort to make its platforms safer for children. Last month, Instagram announced that all younger users would get more private account settings by default, along with safety features such as Sleep Mode, which silences notifications during nighttime hours.
