"We must hold tech bosses accountable for child safety online"
Meta has a responsibility to protect the children who use its platforms, yet it is currently planning to introduce end-to-end encryption as standard across Facebook Messenger and Instagram Direct. For the past two years, as part of the #NoPlaceToHide campaign, I have been calling on Meta to rethink these plans, both as the Head of Advocacy for the Marie Collins Foundation and as somebody with lived experience of this abuse. This issue is incredibly close to my heart, and Meta’s plans will directly affect me, which is why I have chosen to speak out about it.
When I was 13 years old I was groomed online and sexually abused. A child sex offender manipulated me into sending him a topless photo of myself, which he immediately used to blackmail me for more explicit images and for my address. He came to my home the following morning and sexually abused me, taking yet more photos.
It all happened within 24 hours, and yet I am still dealing with the impact even now, 20 years later.
I have no idea what happened to those photos. I know the police found them on his computer and that they were used in the criminal justice process, but I don’t know who else has seen them or may see them in the future. I have no control over those images; I can’t ‘un-create’ them; I can’t get them back. They are out in the world for child sex abusers to view and share, and there’s nothing I can do about it.
I sometimes hear people talk about these photos as though they are “just images”. But I can’t tell you how big an impact this has had, and continues to have, on me. When I think of people seeing them, I feel like a victim all over again. The thought of abusers getting satisfaction from looking at them and sharing them makes me feel physically sick. When I walk into a room or down the street, I look at people’s faces and can’t help but think: “Have you seen images of me being sexually abused when I was 13? Do you recognise me?”
I remember the moment I learned that we have technological tools and people working very hard to detect and remove child sexual abuse material. I immediately felt a wave of relief knowing that even if my images were shared, they would be found and taken down quickly. Meta’s plan to introduce end-to-end encryption snatches that relief away from me again. It is utterly devastating.
The reports that Meta and other tech platforms send to UK law enforcement contribute to over 800 arrests of child sex abusers and the safeguarding of around 1,200 children every month, and Meta is the biggest contributor to those reports. In 2022 there were 31.8 million reports of child sexual abuse material made to the National Center for Missing & Exploited Children (NCMEC); of these, 21.1 million came from Facebook and 5 million from Instagram.[1]
If Meta continues with this plan, it will be unable to see child sexual abuse happening on its platforms and therefore unable to report it. That will allow abusers to groom and abuse children without detection, and it poses a catastrophic risk to children.
The debate around end-to-end encryption has long been framed as a binary choice we must make: privacy versus children’s safety. This isn’t true. And to be clear, I am all for strong privacy measures; after all, my privacy is infringed by the fact that the images of my abuse are out there and could be shared. I think this is a key point that has been missed because victims and survivors have not been included in the discussions: we deserve privacy too!
The fact is that it is technically feasible to detect child sexual abuse within an end-to-end encrypted environment whilst maintaining strong user privacy. Meta urgently needs to invest in these technologies and build a solution for its platforms, yet it is choosing not to. It is one of the biggest tech companies in the world, with truly significant resources at its fingertips. It has the power to make positive change in this area, and yet it is choosing to turn a blind eye to the protection of children. We must all tell Meta that this is not OK, that it must safeguard children and that it cannot give child sex abusers a place to hide.