Instagram adds new safety features to combat sextortion scams targeting teens: All the details
Instagram has introduced a series of new safety features to protect users against sextortion scammers, who blackmail teens by threatening to share obscene images, videos, or conversations unless they receive a payment.
Instagram, the popular social media platform, will no longer allow users to screenshot or screen record disappearing images or videos sent in direct messages. Its parent company Meta has faced mounting questions over its handling of younger users’ privacy and safety in recent months, and the platform has now added a series of safety features aimed at protecting users from sextortion scammers.
Financial sextortion usually occurs when a victim is coerced into sharing sexually explicit photos or videos of themselves and then threatened with their public release unless the victim sends money. Instagram noted that these scammers often lie about where they live in order to build trust with teens.
This crime has become more common in the past few years. Meta has shared that it is testing a safety notice on Instagram and Messenger that will alert teens if the person they’re talking to is located in a different country, since these scammers frequently misrepresent their location. Moreover, if the platform flags a potential scammer who already follows a teen, it will prevent that account from viewing the teen’s follower list and the accounts the teen has shared content with or is tagged with.
The social media platform is also introducing a ‘nudity protection’ feature in direct messages, which blurs any images or videos that might contain nudity. Meta has confirmed that this feature will be turned on by default for users under 18. Further, Instagram won’t let users open ‘view once’ or ‘allow replay’ images or videos on the desktop or web interface, ensuring that scammers can’t sidestep these safety measures.
All of these features come as part of Meta’s broader efforts to make its social media platforms safer for kids.