Meet Our Head of Global Platform Safety

We regularly remind our community to enable two-factor authentication and use strong passwords, two important safeguards against account breaches. Today we are launching new content on our Discover platform with tips on creating unique account credentials and setting up two-factor authentication.

We are also launching new privacy-focused creative tools, including our first-ever privacy-themed Bitmoji, stickers developed with the International Association of Privacy Professionals (IAPP), and a new Lens in partnership with the Future of Privacy Forum that shares helpful privacy tips.

In the coming months, we will continue to leverage our research findings to inform additional in-app privacy tools for our community.

Hello, Snapchat community! My name is Jacqueline Beauchere and I joined Snap last fall as the company’s first Global Head of Platform Safety.

My role focuses on enhancing Snap’s overall approach to safety, including creating new programs and initiatives to help raise awareness of online risks; advising on internal policies, product tools and features; and listening to and engaging with external audiences – all to help support the safety and digital well-being of the Snapchat community.

For example, by default, not just anyone can contact you on Snapchat; two people need to affirmatively accept each other as friends before they can begin communicating directly, similar to the way friends interact in real life.
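To make that default concrete, here is a minimal, hypothetical sketch (in Python, and not Snap’s actual implementation) of how a mutual-acceptance check might gate direct communication; the class and method names are illustrative assumptions.

```python
# Illustrative sketch only -- models the default described above: direct
# messaging is allowed only after both people have affirmatively accepted
# each other as friends (a mutual opt-in).

class FriendGraph:
    def __init__(self):
        # Maps each user to the set of users they have accepted as friends.
        self._accepted: dict[str, set[str]] = {}

    def accept(self, user: str, friend: str) -> None:
        """Record that `user` has accepted `friend` as a friend."""
        self._accepted.setdefault(user, set()).add(friend)

    def can_message(self, sender: str, recipient: str) -> bool:
        """Direct communication requires acceptance in both directions."""
        return (recipient in self._accepted.get(sender, set())
                and sender in self._accepted.get(recipient, set()))


graph = FriendGraph()
graph.accept("alex", "sam")               # Alex adds Sam
print(graph.can_message("alex", "sam"))   # False -- Sam hasn't accepted yet
graph.accept("sam", "alex")               # Sam accepts back
print(graph.can_message("alex", "sam"))   # True -- mutual acceptance
```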

Since my role involves helping safety advocates, parents, educators and other key stakeholders understand how Snapchat works, and soliciting their feedback, I thought it might be useful to share some of my initial learnings about the app, what surprised me, and some helpful tips, if you or someone close to you is an avid Snapchatter.

After more than 20 years working in online safety at Microsoft, I’ve seen significant change in the risk landscape. In the early 2000s, issues like spam and phishing highlighted the need for awareness-raising to help educate consumers and minimize socially engineered risks. The advent of social media platforms – and people’s ability to post publicly – increased the need for built-in safety features and content moderation to help minimize exposure to illegal and potentially more harmful content and activity.

Ten years ago, Snapchat came onto the scene. I knew the company and the app were “different,” but until I actually started working here, I didn’t realize just how different they are. From inception, Snapchat was designed to help people communicate with their real friends – meaning people they know “in real life” – rather than amassing large numbers of known (or unknown) followers. Snapchat is built around the camera. In fact, for non-first-generation Snapchatters (like me), the app’s very interface can be a bit mystifying because it opens directly to a camera and not a content feed like traditional social media platforms.

There’s far more that goes into Snapchat’s design than one might expect, and that considered approach stems from the tremendous value the company places on safety and privacy. Safety is part of the company’s DNA and is baked into its mission: to empower people to express themselves, live in the moment, learn about the world and have fun together. Unless people feel safe, they won’t be comfortable expressing themselves freely when connecting with friends.

The belief that technology should be built to reflect real-life human behaviors and dynamics is a driving force at Snap. It’s also vital from a safety perspective.

Snap applies privacy-by-design principles when developing new features and was one of the first platforms to endorse and embrace safety-by-design, meaning safety is considered in the design phase of our features – no retrofitting or bolting on safety machinery after the fact. How a product or feature might be misused or abused from a safety perspective is considered, appropriately so, at the earliest stages of development.