Snap Inc. has unveiled a suite of new safeguards to further protect 13-to-17-year-old Snapchatters against potential online harms. These safeguards, which will begin to roll out in the coming weeks, are designed to protect youth from being contacted by people they may not know in real life, provide a more age-appropriate viewing experience on Snapchat's content platforms, and enable Snap to more effectively remove accounts that may be trying to market and promote age-inappropriate content, using a new strike system and new detection technologies. These measures introduce stronger enforcement to guide youth toward safer experiences while limiting unwanted contact from suspicious accounts.

Real relationships that are safer to navigate

Moving forward, before youth can begin communicating with another user on Snapchat, that user will need to be an existing Snapchat friend or phone book contact, such as a close friend, family member, or other trusted person. It is also now harder for a user outside a youth's friend network to show up as a suggested friend. As part of the rollout, Snapchatters can expect to see the following upgrades to their user experience:

* In-App Warnings: Youth will now receive a pop-up warning if they are contacted by someone with whom they share no mutual friends or who is not in their contacts. The warning urges them to carefully consider whether they want to be in contact with this user, and not to connect with anyone they don't trust.

* Stronger Friending Protections: Snapchat already requires a 13-to-17-year-old to have several mutual friends in common with another user before they can show up in Search results. Snap is now raising this bar, requiring a greater number of mutual friends based on the number of friends a Snapchatter has. The aim is to further reduce the ability of strangers and youth to connect (see the sketch after this list for how such a rule might work).
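To make the friending mechanics above concrete, here is a minimal sketch of how a scaling mutual-friends threshold and a contact warning could fit together. The function names and threshold values (`required_mutual_friends`, `should_warn_on_contact`, the cutoffs of 100 and 1,000 friends) are invented for illustration; Snap has not published its actual thresholds or logic.

```python
# Illustrative sketch only: hypothetical thresholds, not Snap's implementation.

def required_mutual_friends(friend_count: int) -> int:
    """Hypothetical rule: the more friends a teen has, the more mutual
    friends another user must share with them to appear in Search results."""
    if friend_count < 100:
        return 3
    if friend_count < 1000:
        return 10
    return 25

def should_warn_on_contact(shares_mutual_friends: bool, in_contacts: bool) -> bool:
    """Show a pop-up warning when the sender shares no mutual friends
    with the teen and is not in the teen's phone book contacts."""
    return not (shares_mutual_friends or in_contacts)

# Example: for a teen with 150 friends, a stranger would need 10 mutual
# friends under this hypothetical rule; a sender with no mutual friends
# and no contacts entry triggers the warning.
assert required_mutual_friends(150) == 10
assert should_warn_on_contact(shares_mutual_friends=False, in_contacts=False)
```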
New Strike System to Crack Down on Accounts Promoting Age-Inappropriate Content

Across Snapchat's two main content platforms, Stories and Spotlight, users can find public Stories published by vetted media organizations, verified creators, and Snapchatters. On these public content platforms, Snapchat applies additional content moderation to prevent violating content from reaching a large audience. To help remove accounts that market and promote age-inappropriate content, Snap recently launched a new Strike System. Under this system, any inappropriate content that is detected proactively or reported will be immediately removed, and an account that repeatedly tries to circumvent the rules will be banned. Learn more about the Strike System here.

In response to Snap's initiative, Mohammed Alrobayan, Secretary General of the Digital Content Council at Ignite, stated: "We welcome Snap's proactive efforts to enhance the safety of teen users on its platform. Online safety is a shared responsibility, and technology companies such as Snap are crucial in ensuring that young citizens can navigate the digital landscape safely. All stakeholders should work together to create a secure online environment for our youth."

According to Georg Wolfart, Head of Public Policy at Snap MENA: "At Snap, the safety and well-being of our users, especially teens, is our top priority. We believe that everyone should have the opportunity to express themselves in a safe and respectful environment. These new safeguards reaffirm our commitment to providing a secure platform for Saudi Arabian teens, empowering them to make informed choices and explore the digital world responsibly."

Beyond these developments, Snap has released new in-app educational content that will be featured on Snapchat's Stories platform and surfaced for Snapchatters who use relevant Search terms or keywords. Snap will also release an updated safety guide for parents at parents.snapchat.com and a new YouTube explainer series that walks through suggested protections for youth.