Keeping young people safe online in the hyperconnected world of a pandemic: What can parents do?

DP – How important is the safety of young users on Facebook?

Amber Hawkes – Creating a safe environment where 3 billion people from all over the world can share and connect is critical and we take our responsibility seriously. 

In 2019 alone, we spent more on safety and security than our entire revenue at the time of our IPO – more than US$3.7 billion. Our content reviewers typically review more than 2 million pieces of content every day, and in the last few years we have tripled the number of people working on safety and security to over 35,000. Of these, about 15,000 are content reviewers. Of course, there will always be more to do – particularly when technology and online behaviours are constantly evolving. Our job is to continue our investment to stay ahead of this challenge.

DP – Can you please run us through Facebook’s Community Standards, especially regarding content regulation?

Amber Hawkes – Our Community Standards outline what is and is not allowed on Facebook. We aim to find the right balance between giving people a place to express themselves and promoting a safe environment. They cover a wide range of areas, and a number of them are particularly important when it comes to safety, including Violent and Criminal Behavior, Suicide and Self-Injury, Child Sexual Exploitation, Bullying and Harassment, Human Exploitation (including human trafficking), Privacy Violations, Hate Speech, Violent and Graphic Content, Cruel and Insensitive Content and Misrepresentation (using an inauthentic identity).

 We use a combination of artificial intelligence, reports from our Community and review by our teams to identify and review content against our Standards. Our team reviews content 24/7 in dozens of languages, and we have the ability to review in regional dialects in many countries. We have a network of local civil society partners around the world who can reach out to us via dedicated channels to alert us to emerging issues and provide essential context – including language support that we may not have.

DP – What are some key policies and tools Facebook has for young users? 

Amber Hawkes – We require everyone to be at least 13 years old before they can create an account on Facebook and in some jurisdictions, the age limit may be higher. We will delete the account of any person under the minimum age limit as soon as we become aware of it. 

 We’ve designed our platform to give people control over their own experiences including control over what they share, who they share it with, the content they see, and who can contact them. When it comes to teens, we’ve built in a number of additional protections to keep them safe. For example:

  • Advertising categories for teens are more limited than for people over 18 years of age
  • New accounts belonging to minors on Facebook are automatically defaulted to share with ‘friends’ only and their default audience options for posts do not include ‘public.’ If a minor wants to share publicly they must go to their settings to enable the option, and we remind them about the meaning of posting publicly
  • We keep face recognition off for minors 
  • We limit who can see or search specific information teens have shared, such as contact information, school, hometown or birthday
  • Messages sent to minors from adults who are not friends (or friends of the minor’s friends) are filtered out of the minor’s inbox 
  • We take steps to remind minors that they should only accept friend requests from people they know
  • Location-sharing is off by default for minors. When either an adult or minor turns on location sharing, we include a consistent indicator as a reminder that they’re sharing their location

DP – Due to the pandemic, we saw a rapid rise in the amount of time users, especially young people, have been spending on social media. How is Facebook ensuring young people’s safety?

Amber Hawkes – We offer really easy ways for people to report content or accounts that make them feel uncomfortable and we will take action as soon as we identify any content that violates our Community Standards. Beyond that, we’ve introduced a series of new tools and resources to help keep young people safe on our platform, including:

  • The safety of young people on Instagram in particular is our most important responsibility. We have built extensive controls and easy reporting, and we use the best available technology to keep people safe on Instagram.
  • Additionally, the Activity Dashboard shows people how much time they’ve spent on Instagram over the past day and week, as well as their average time on the app, and a daily reminder function lets them set a limit on the time they spend on Instagram.
  • Launching a Youth Portal, a central place for teens to get a better understanding of our products, hear from their peers, and get tips and advice on controlling their experience. This is part of our Safety Centre, a resource for topics like suicide prevention, social resolution and bullying prevention.
  • Using artificial intelligence to help identify when someone might be expressing thoughts of suicide, including on Facebook Live and Instagram Live.

DP – Why has Facebook partnered with Safer Internet Day?

Amber Hawkes – Every day, millions of people across the globe spend time on Facebook to connect with their family and friends. We recognize the important role we play in creating a safer online community and Safer Internet Day is a fantastic opportunity to remind our communities about the importance of safety. This is why we have created “top tips” for parents to help their children stay safe on Facebook, along with a family-friendly animation that parents and children can enjoy together. 

DP – Despite so many policies being in place, parents often struggle to understand social media at a grassroots level. How can Facebook help with that?

Amber Hawkes – We have specially designed resources, guides and programs with information on teen online safety, including our Facebook Safety Center and Bullying Prevention Hub, our Instagram Safety Centre, which includes dedicated sections on online bullying, and our Instagram Community Portal.

With more young people going online, engaging parents on the topic of online safety is more important than ever. We recently launched the Facebook Parents Portal as well as Instagram Tips for Parents, to help parents better understand our platforms. We provide tips on starting a conversation with children about online safety, as well as access to expert resources.

DP – Coming to WhatsApp, Facebook’s widely popular messaging app, can you please tell us a bit about how users can report suspicious messages?

Amber Hawkes – WhatsApp is a private messaging service, and it’s important to keep in mind the differences between private messaging and social media platforms. As a private messaging platform, WhatsApp does not have access to the private messages people share with their friends and family, nor do we provide a search function within WhatsApp to find groups or content. However, we use a combination of techniques to enforce our policies and prevent abuse, including:

  • Easy reporting: We make it easy for people to report problematic content to us (or to block other users). When you receive a message from an unknown number for the first time, you’ll have the option to report the number directly inside the chat. You can also report a contact or a group from their profile information at any time. Once reported, WhatsApp receives the most recent messages sent to you by the reported user or group, as well as information on your recent interactions with them.
  • Group Permissions: We help users decide who can add them to groups, giving everyone the ability to control which groups they are added to. This increases user privacy and prevents people from being added to unwanted groups.
  • Proactive detection: WhatsApp uses advanced machine learning technology that works around the clock to identify and ban accounts engaging in bulk or automated messaging so they cannot be used to spread misinformation. We proactively ban 2 million accounts per month and will continue to step up our efforts to prevent users from receiving unwanted mass messages.

DP – What about Instagram? How are you ensuring the platform remains safe for all users?

Amber Hawkes – People will only feel comfortable expressing themselves on Instagram if they feel safe and supported, and we are committed to building a platform free from bullying. We invest heavily in in-app tools to support our community:

  • Restrict mode: Once it is enabled, comments on your posts from a person you have restricted will only be visible to that person. They won’t be able to see when you’re online or when you’ve read their messages. They won’t be notified you’ve put them on Restrict. This tool was built specifically because of feedback from young people who said they wanted more control over what was happening when they were being bullied or harassed. 
  • Last year, we also rolled out the ability to delete comments in bulk and block or restrict multiple accounts. And we’ve made it easy to highlight positive interactions with Pinned Comments, as our research shows that elevating positive content is an effective way to set positive norms on your account.
