    Meta launches new tools to safeguard young people from sextortion and online intimate image abuse

    Meta has launched new tools on Instagram to help prevent sextortion and online intimate image abuse, protect youth and teens, and make it more difficult for potential scammers and criminals to find and interact with them.

    The tech giant is testing new ways of helping people detect potential sextortion scams, encouraging them to report these scams and empowering them to say no to anything that makes them feel uncomfortable.

    Mia Garlick, Meta’s regional policy director, said the new measures will help protect young people from scammers.

    “We are focused on doing everything we can to stop these horrific scams. We will continue to invest in tools and partnerships to support young people to know they can say no to sharing anything that makes them uncomfortable and to provide resources should they find themselves in this situation.”

    She added that Meta remains committed to working with its broader community and local law enforcement, including the Australian Federal Police, the Office of the eSafety Commissioner, the Australian Centre to Counter Child Exploitation and local youth online safety partners, to remind young people of the dangers of sending sexual images on Meta’s apps and across the internet.

    Nudity Protection in DMs

    Among the new tools being tested is a nudity protection feature on Instagram, which will blur images detected as containing nudity in DMs and protect users from scammers who may send nude images to trick people into sending their own in return.

    Meta noted that when the feature is turned on, people sending images containing nudity will see a message reminding them to be cautious when sharing sensitive images, and that they can unsend these images if they change their mind.

    Meanwhile, anyone forwarding a nude image they have received will see a message encouraging them to reconsider. If an image containing nudity is received, it will automatically be blurred under a warning screen, so the recipient isn’t confronted with a nude image and can choose whether or not to view it.

    The social media platform will also begin testing a message encouraging people not to feel pressured to respond, with options to block the sender and report the chat. People will also be directed to safety tips, developed with guidance from experts, about the potential risks involved.

    Preventing Potential Scammers from Connecting with Teens

    Instagram is also developing technology to help identify accounts that may be engaging in sextortion scams, based on a range of signals that could indicate sextortion behaviour.

    The tech giant noted that while such signals are not necessarily evidence that an account has broken the platform’s rules, taking precautionary steps to prevent these accounts from finding and interacting with teen accounts is critical.

    Any message requests that potential sextortion accounts try to send will go straight to the recipient’s hidden requests folder, meaning they won’t be notified of the message and never have to see it.

    For those already chatting to potential scam or sextortion accounts, Safety Notices will be shown to encourage them to report any threats.

    Instagram is also testing not showing the “Message” option on a teen’s profile to potential sextortion accounts, even if they’re already connected. The platform will also start testing hiding teens from these accounts in people’s followers, following and like lists, and making it harder for potential sextortion accounts to find teen accounts in Search results.

    New Education Resources

    Meta will also test new pop-up messages for people who may have interacted with an account that has been removed for sextortion. The message will direct them to expert-backed resources, including Instagram’s Stop Sextortion Hub, support helplines, the option to reach out to a friend, StopNCII.org for those over 18, and Take It Down for those under 18.

    New Child Safety Helplines

    Instagram is also testing new child safety helplines in the in-app reporting flows, which will allow teens to report relevant issues, such as nudity, threats to share private images, or sexual exploitation or solicitation. They will be directed to local child safety helplines where available.

    Meta’s new tools on Instagram come off the back of its partnership last year with the Australian Federal Police-led Australian Centre to Counter Child Exploitation (ACCCE), Kids Helpline and US-based organisation NoFiltr.

    Together, they launched a community service announcement to inform young people about the dangers of online sextortion.

    Meta also launched Take It Down, a global platform that lets young people take back control of their intimate images and helps prevent them being shared online, taking power away from scammers.

