Facebook launches new artificial intelligence technology to detect revenge porn shared on Facebook and Instagram, along with an online resource hub for victims


By Antigone Davis, Global Head of Security

When intimate images of someone are shared without their permission, it can be devastating. To protect victims, our policy has long been to remove non-consensual intimate images (sometimes referred to as "revenge porn") when they are reported to us – and in recent years we have used photo-matching technology to keep them from being re-shared. To find this content faster and better support victims, we are announcing new detection technology and an online resource hub to help people respond when such abuse occurs.

Finding these images goes beyond nudity detection on our platforms. By using machine learning and artificial intelligence, we can now proactively detect near-nude images or videos that are shared without permission on Facebook and Instagram. This means we can find this content before anyone reports it, which is important for two reasons: often victims fear retaliation and are reluctant to report the content themselves, and in other cases they are unaware the content has been shared at all. A specially trained member of our Community Operations team reviews the content detected by our technology. If the image or video violates our Community Standards, we remove it and, in most cases, we also disable the account for sharing intimate content without permission. We offer an appeals process if someone believes we have made a mistake.
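The flow described above – a model flags likely violations, a trained human reviewer makes the final call, and confirmed violations lead to removal with an appeals path – can be sketched as follows. This is a simplified, hypothetical illustration; the function names, scores, and threshold are assumptions for the example, not Facebook's actual system.

```python
# Hypothetical sketch of a proactive-detection flow: a classifier flags
# likely violating uploads, a trained reviewer confirms or rejects the
# flag, and confirmed violations are removed with an appeal available.
from dataclasses import dataclass


@dataclass
class Upload:
    id: str
    score: float  # illustrative model confidence that the upload violates policy


def detect_candidates(uploads, threshold=0.9):
    """Proactively flag uploads for human review -- no user report needed."""
    return [u for u in uploads if u.score >= threshold]


def human_review(upload, reviewer_confirms):
    """A trained reviewer makes the final decision on a flagged upload."""
    if reviewer_confirms(upload):
        return {"id": upload.id, "action": "removed", "appeal_available": True}
    return {"id": upload.id, "action": "no_action", "appeal_available": False}


uploads = [Upload("a", 0.95), Upload("b", 0.20)]
flagged = detect_candidates(uploads)
decisions = [human_review(u, lambda u: True) for u in flagged]
```

The key point the sketch captures is that the model only surfaces candidates; removal happens after human review, and an appeal remains available.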

This new detection technology works alongside our pilot program run jointly with victim advocacy organizations. This program gives people an emergency option to proactively and securely submit a photo to Facebook. We then create a digital fingerprint of the image and prevent it from ever being shared on our platform. After receiving positive feedback from victims and support organizations, we will expand this pilot over the coming months so that more people can use this option in an emergency.
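The "digital fingerprint" idea can be illustrated with a minimal sketch. Note the hedge: production photo-matching systems use perceptual hashes that also catch resized or slightly altered copies; the SHA-256 hash below only demonstrates the block-on-match concept, not the actual technique, and all names here are illustrative.

```python
# Minimal sketch of fingerprint-based blocking, assuming a simple
# exact-hash scheme. Real photo-matching uses perceptual hashing so
# near-duplicates also match; SHA-256 here only shows the mechanism.
import hashlib

blocked_fingerprints = set()


def fingerprint(image_bytes: bytes) -> str:
    """Derive a fingerprint from image bytes (exact-match stand-in)."""
    return hashlib.sha256(image_bytes).hexdigest()


def register_blocked(image_bytes: bytes) -> None:
    """Store the fingerprint of a submitted photo so copies can be blocked."""
    blocked_fingerprints.add(fingerprint(image_bytes))


def is_blocked(upload_bytes: bytes) -> bool:
    """Check an upload against known fingerprints before it is shared."""
    return fingerprint(upload_bytes) in blocked_fingerprints


register_blocked(b"submitted photo bytes")
```

Only the fingerprint needs to be retained to block future copies, which is part of why this approach is used for sensitive material.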

"We are excited to see the pilot project expand to include more women's safety organizations around the world, as many requests come from victims outside the United States." – Holly Jacobs, Founder of the Cyber Civil Rights Initiative (CCRI)

We also want to do more to help people who have been targeted by this cruel and destructive abuse. To that end, we are launching "Not Without My Consent", a victim support hub in our Safety Center that we developed with experts. There, victims can find organizations and resources to help them, including steps to have content removed from our platform and kept from being re-shared – and access to our pilot program. We will also make it easier and more intuitive for victims to report when their intimate images have been shared on Facebook. And in the coming months, we will build a victim support toolkit to give people around the world locally and culturally relevant information and support. We are creating it in partnership with the Revenge Porn Helpline (United Kingdom), the Cyber Civil Rights Initiative (United States), the Digital Rights Foundation (Pakistan), SaferNet (Brazil) and Professor Lee Ji-yeon (South Korea).

Our work in combating this abuse and supporting victims would not be possible without the help of international experts. Today, on the sidelines of the 63rd session of the UN Commission on the Status of Women, we are hosting an event with Dubravka Šimonović – the United Nations Special Rapporteur on Violence Against Women – bringing together some of these experts, victims' advocates, industry representatives and non-profit organizations. We will discuss how this abuse manifests around the world; its causes and consequences; the next frontier of challenges; and deterrence strategies. We look forward to participating in this event and are grateful for these partnerships as we continue to work together on this important issue.

To learn more about the research behind today's announcement, click here.