
Messenger Policy Workshop: Future of Private Messaging

By Gail Kent, Messenger Policy Director

Over the past year, the use of private messaging apps like Messenger has surged. The global pandemic made staying connected with loved ones through private messaging, calling and video chats even more important. But private messaging was growing well before the pandemic, and that growth has brought important public policy considerations to the forefront.

In March, Stan Chudnovsky, Head of Messenger, shared three trends to expect in private messaging for 2021. We also hosted a virtual workshop with experts in privacy, safety, human rights and consumer protection for a discussion on the future of messaging. This was part of a new series of ‘Data Dialogues’ intended to bring together experts and practitioners from consumer advocacy organizations, academia, businesses, civil society and other companies to discuss challenging questions around how the industry approaches and protects people’s data. Below are some of the ways we tackle these issues in relation to Messenger, as well as the outcomes from our discussions.

An Update on Our Approach to Private Messaging

Two years ago, Mark Zuckerberg announced a privacy-focused vision for social networking centered on messaging. That vision is built on the principles of private interactions, encryption, reducing permanence, safety, interoperability, and secure data storage. Since then, we’ve built a team that’s focused on delivering this vision on Messenger. Our priorities are driven by talking with people about how they want to use Messenger and incorporating that feedback into usable private messaging features. We believe that listening to people is core to delivering a trusted and shared environment.

People tell us that they want reliable messaging with easy-to-find controls. For example, they say they want to know that their messages, whether text or voice, will be delivered wherever they are, especially in locations where connectivity isn’t reliable. They also want their communications to be protected and confidential. We’ve also heard people ask for more control over things like who can contact them, options to unsend a message, and how long their messages stick around. In particular, we heard from the generation that’s grown up on the Internet that they’re cautious about their privacy and want ways to avoid others having a permanent record of their conversations. Based on this feedback, we introduced vanish mode, a feature you can enable in Messenger and Instagram Direct that lets you send messages that automatically disappear after both people have seen them.

People are also concerned about the security of their personal information online and the privacy of their messages. Seven out of 10 Americans said in 2019 that their personal information was less secure than five years earlier (source). And over the last four years, more consumers around the world have used messaging apps that offer more privacy features (source). Over the past year, we introduced a number of privacy and safety tools, including more privacy settings, an app lock, safer message requests, message forwarding limits and more. We’re also working hard to bring default end-to-end encryption to all of our messaging services. This will protect people’s private messages so that only the sender and recipient, and not even us, can access them. While we expect to make more progress on default end-to-end encryption for Messenger and Instagram Direct this year, it’s a long-term project and we won’t be fully end-to-end encrypted until sometime in 2022 at the earliest. In the meantime, the safety features we’ve already introduced are designed to work with end-to-end encryption, and we plan to continue building strong safety features into our services.
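For readers who want a concrete sense of what "only the sender and recipient can access their messages" means, the sketch below shows the general idea of public-key end-to-end encryption using the open-source PyNaCl library. It is only an illustration of the concept, not Messenger's actual protocol, and the names and message text in it are made up for the example.

    # Illustrative sketch only: generic public-key end-to-end encryption
    # with PyNaCl. This is NOT Messenger's protocol; it demonstrates that
    # a relaying service without the recipient's private key cannot read
    # the message.
    from nacl.public import PrivateKey, Box

    # Each person generates a key pair; only public keys are ever shared.
    sender_key = PrivateKey.generate()
    recipient_key = PrivateKey.generate()

    # The sender encrypts with their private key and the recipient's public key.
    sender_box = Box(sender_key, recipient_key.public_key)
    ciphertext = sender_box.encrypt(b"See you at 7?")

    # A server relaying `ciphertext` sees only unreadable bytes.
    # Only the recipient can decrypt it, using their private key and the
    # sender's public key.
    recipient_box = Box(recipient_key, sender_key.public_key)
    assert recipient_box.decrypt(ciphertext) == b"See you at 7?"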

Key Takeaways From Workshop Discussions

As we address these issues, we want to make sure our decisions are guided by input from outside experts. Here are some of the key takeaways and points of feedback from our workshop:

  • Privacy expectations are important. People want to know how their data is being used and what data is accessible by us or others when messaging. In addition, people may have different privacy expectations based on the size or nature of a group chat or audience. Ultimately, privacy is personal and comes with different expectations depending on each person’s situation, so transparency and controls are key.
  • Marginalized and vulnerable communities rely on private spaces as outlets for support and assistance. People need to trust that private spaces are secure from unwanted intrusion when they reach out and share sensitive, personal information. For many, a private space is a safe space, and that should be a critical component of our broader strategy.
  • Consumers want more privacy and safety controls. We should consider ways to give people more privacy settings and features within our messaging apps. This also requires thoughtful product design and user education to make the features easy to find and easy to use. For example, while we’ve introduced a number of tools in response to harassment and bullying, more can be done to surface those tools at the moments people need them most, such as when they are experiencing abuse.
  • People want messaging that’s free from unwanted intrusions. The ability to find and message other people on our platforms is an important part of helping people stay connected, but people also want controls to manage unwanted interactions. We already filter messages into request folders, and we take steps to restrict adult-to-minor messaging on Messenger and Instagram. Last year, we introduced safety features such as blocking images or links in message requests, along with new messaging settings. These give people more control over who can message them and who can’t. We continue to explore other ways to help protect people.
  • People want more protection from scams and hacking. While we’ve taken steps to address this ongoing challenge through new tools and user education on Facebook and Messenger, experts encouraged us to continue our efforts and do more to combat scams and protect people’s information.
  • Human rights concerns should be a priority. Human rights were a consistent theme across most of the issues we discussed in the workshop. In particular, stakeholders focused on the important role our products play in helping people to freely connect with others while also recognizing some of the tensions this creates with people’s right to privacy from unwanted intrusion. They urged us to continue considering the human rights impact of our products. 
  • We need to find a balance of safety, privacy, and security. There is a clear need to balance the privacy and security of people’s messages with maintaining a safe environment and providing data to law enforcement in response to potential real-world harms. We discussed tools that can protect people’s privacy while also preventing harm from happening in the first place, using behavioral signals, traffic data or user reports rather than access to the content of all messages. There was no consensus on a recommended approach, but experts encouraged more consultation to help strike an appropriate balance.

The motivation behind all of this work is to build a trusted and secure private messaging service. Our journey in messaging will evolve as new challenges emerge. In addition to receiving feedback from consumers, we’ll continue to engage with this group from our workshop and other experts on these issues as we work collectively to find solutions.


