Snap has announced a new package of safeguards to further protect 13-17 year old Snapchatters from potential online risks.
These safeguards, which will begin rolling out in the coming weeks, are designed to protect teens from being contacted by people they may not know in real life, provide a more age-appropriate viewing experience on the app's content platforms, and enable Snapchat to more effectively remove accounts that may be trying to market and promote age-inappropriate content, through a new strike system and new detection technologies.
In addition, Snap is releasing new resources for families, including an updated parents' guide at parents.snapchat.com that covers the platform's protections for teens and tools for parents, and a new YouTube explainer series.
Tony Keusgen, Managing Director, Australia and New Zealand, at Snap Inc., says that creating a safe and positive experience for Snapchatters has always been a top priority for the company.
"Snapchat is designed to be different to other apps and safety is at the heart of what we do. Our focus is on giving users a fun place to communicate with real life connections and that is why we have always had extra protections for teens," says Keusgen.
"But we know there is no finish line when it comes to keeping our community safe. As a messaging platform for real friends, our goal is to help Snapchatters communicate with people they actually know, and to ensure that the content they view on our app is informative, fun and age-appropriate."
Professor Amanda Third, Co-Director of the Young and Resilient Research Centre at Western Sydney University, says, "When talking about anything in the online space, safety is of huge importance.
"It's great to see Snaps ongoing commitment to supporting its community, particularly teens, through these new measures to offer enhanced protection for younger users," she says.
"Through my work, I know the importance of education, not only for young people themselves but also for parents and other caregivers. So I would encourage families to try to learn as much as they can about the apps and technologies that are important to their teens."
The new safeguards announced include:
Most teens use Snapchat to chat with their close friends using pictures and text, similar to how older generations use text messaging or phone calls. When a teen becomes friends with someone on Snapchat, the platform is focused on ensuring that it is someone they know in real life, such as a friend, family member or other trusted person. To help encourage this, Snapchat already requires teens to be existing Snapchat friends or phone book contacts with another user before they can begin communicating. Snapchat also makes it harder for a teen to show up as a suggested friend to another user outside their friend network.
Building on this, Snapchat will begin rolling out additional protections for 13-17 year olds, designed to further limit unwanted interactions or potentially risky contact, including:
In-App Warnings: Snapchat is launching a new feature that sends a pop-up warning to a teen if they are contacted by someone they don't share mutual friends with. This message will urge the teen to carefully consider if they want to be in contact with this user and not to connect with them if it isn't someone they trust.
Stronger Friending Protections: Snapchat already requires a 13-to-17-year-old to have several mutual friends in common with another user before they can show up in Search results or as a friend suggestion. The platform is raising the bar to require a greater number of friends in common, based on the number of friends a Snapchatter has, with the goal of further reducing the ability for teens and strangers to connect.
Across Snapchat, illegal and potentially harmful content is prohibited, including sexual exploitation, pornography, violence, self-harm, misinformation, and much more. To enforce Snapchat's policies and take quick action to help protect its community, the platform has long used a zero-tolerance approach for users who try to commit severe harms, such as threats to another user's physical or emotional wellbeing, sexual exploitation, and the sale of illicit drugs. If Snapchat finds accounts engaging in this activity, the account will be immediately banned, measures will be applied to keep the user from getting back on Snapchat, and the account may be reported to law enforcement.
New Strike System to Crack Down on Accounts Promoting Age-Inappropriate Content
While Snapchat is most commonly used for communicating with friends, the app offers two content platforms, Stories and Spotlight, where Snapchatters can find public Stories published by vetted media organisations, verified creators, and Snapchatters. On these public content platforms, additional content moderation is applied to prevent violating content from reaching a large audience.
To help remove accounts that market and promote age-inappropriate content, Snapchat recently launched a new Strike System. Under this system, inappropriate content that is reported or proactively detected is immediately removed. If the team at Snapchat sees that an account is repeatedly trying to circumvent its rules, the account will be banned.
Education About Common Online Risks
In recent years, and especially with online communication becoming more prevalent during the pandemic, young people have become more vulnerable to a range of sexual risks, such as catfishing, financial sextortion, and the taking and sharing of nude images.
As Snapchat continues to bolster its defences against these risks, the platform also aims to use its reach with young people to help them spot likely signs of this type of activity and know what to do if they encounter it. Today, Snapchat is also releasing new in-app content that helps explain these risks and shares important resources for Snapchatters, such as hotlines to contact for help. This content will be featured on the Stories platform and surfaced to Snapchatters through relevant Search terms or keywords.
Several of the new product safeguards were informed by feedback from the National Center on Sexual Exploitation (NCOSE). Snapchat's new in-app educational resources were developed with the National Center for Missing and Exploited Children (NCMEC).