Meta is rolling out a suite of new updates to further protect teens from online harm and create safe, age-appropriate experiences on Facebook and Instagram, it has announced.
Effective immediately, everyone who is under the age of 16 - or 18 in certain countries - will have increased privacy settings automatically applied to their profiles when they join Facebook.
The technology giant is also developing new tools to stop the spread of self-generated intimate images online, Meta's Vice President and Global Head of Safety, Antigone Davis, said in a statement released on Tuesday morning (NZ time).
In addition to its existing measures, Meta is now testing ways to protect teens and young people from messaging suspicious adults they aren't already connected to, Davis said. These adults will also be blocked from appearing in teens' 'People You May Know' recommendations.
"A 'suspicious' account is one that belongs to an adult that may have recently been blocked or reported by a young person, for example. As an extra layer of protection, we're also testing removing the message button on teens' Instagram accounts when they're viewed by suspicious adults altogether," Davis said.
Last year, Meta shared some of the measures it takes to protect young people from interacting with potentially suspicious adults, including restricting adults from messaging teenagers they aren't connected to, or from seeing teens in their 'People You May Know' recommendations.
Additionally, Meta announced it has developed a raft of tools to make it easier for young people to report suspicious activity, or content that makes them feel uncomfortable, on apps such as Facebook and Instagram, and has introduced new notifications that encourage teens to use these safety features.
"For example, we're prompting teens to report accounts to us after they block someone, and sending them safety notices with information on how to navigate inappropriate messages from adults," Davis explained.
"In just one month in 2021, more than 100 million people saw safety notices on Messenger. We've also made it easier for people to find our reporting tools and, as a result, we saw more than a 70 percent increase in reports sent to us by minors in Q1 2022 versus the previous quarter on Messenger and Instagram DMs."
As of today, people under the age of 16 (or under 18 in certain countries) who already have accounts on Facebook will be encouraged to choose more private settings for:
- who can see their friends list
- who can see the people, pages and lists they follow
- who can see posts they're tagged in on their profile
- reviewing posts they're tagged in before they appear on their profile
- who is allowed to comment on their public posts
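Meta hasn't published these controls as a formal API; purely as an illustration, the more-private defaults listed above might be represented as configuration like the sketch below, where every setting name and value is an assumption:

```python
# Hypothetical sketch of the more-private defaults Meta describes for teen
# accounts. Keys and values are illustrative assumptions, not a real Facebook API.
TEEN_PRIVACY_DEFAULTS = {
    "who_can_see_friends_list": "friends",
    "who_can_see_follows": "friends",        # people, pages and lists they follow
    "who_can_see_tagged_posts": "friends",
    "review_tags_before_profile": True,      # review tagged posts before they appear
    "who_can_comment_on_public_posts": "friends",
}

def apply_teen_defaults(current_settings: dict) -> dict:
    """Overlay the stricter defaults onto an account's existing settings.

    New under-16 accounts get these automatically (per the article), while
    existing teen accounts are prompted to opt in.
    """
    return {**current_settings, **TEEN_PRIVACY_DEFAULTS}
```

Merging the defaults last means they override any looser existing value, mirroring the "automatically applied" behaviour described for new accounts.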
New tools to stop the spread of teens' intimate images
In the announcement, Davis also shared an update on the work Meta is doing to stop the spread of intimate images of teenagers online, particularly when the images are used for exploitation, blackmail and extortion - commonly known as 'sextortion'.
"The non-consensual sharing of intimate images can be extremely traumatic and we want to do all we can to discourage teens from sharing these images on our apps in the first place," Davis said.
"We're working with the National Center for Missing and Exploited Children (NCMEC) to build a global platform for teens who are worried intimate images they created might be shared on public online platforms without their consent. This platform will be similar to work we have done to prevent the non-consensual sharing of intimate images for adults.
"It will allow us to help prevent a teen's intimate images from being posted online and can be used by other companies across the tech industry. We've been working closely with NCMEC, experts, academics, parents and victim advocates globally to help develop the platform and ensure it responds to the needs of teens so they can regain control of their content in these horrific situations.
"We'll have more to share on this new resource in the coming weeks."
Meta is also working with Thorn, a nonprofit that builds technology to defend children from sexual abuse, and its NoFiltr brand to create educational tools that reduce the stigma surrounding intimate images. It's hoped the new tools will empower teenagers to seek help if they are experiencing sextortion, or to regain control if they've shared compromising material online.
Research found more than 75 percent of the people Meta reported to NCMEC for sharing exploitative content of children and young people had circulated the material out of outrage, poor humour or disgust, with no apparent intention of causing harm. Regardless of intent, however, sharing this content violates Meta's policies.
The company is now planning to launch a new PSA campaign encouraging people to stop and think before resharing this content online, and to report it to Meta instead.
Anyone seeking support and information related to sextortion can visit Meta's education and awareness resources, including the Stop Sextortion hub on the Facebook Safety Center. There is also a guide for parents on how to talk to their teens about intimate images on the Education hub.