No place to hide campaign
The Marie Collins Foundation is one of a number of prominent child safeguarding charities who are concerned about the implementation of End-to-End Encryption (E2EE) on social media platforms and the effect this will have on children’s safety.
There is a delicate balance to be struck between protecting users' privacy and protecting children from those who use technology to groom and sexually abuse children and share Child Sexual Abuse Material (CSAM) online. However, it is essential that this balance is found, because the introduction of E2EE without adequate safeguards will significantly impact our ability to protect children by giving child sex abusers the opportunity to conceal their activities and their abuse of children.
We are a member of the steering group for the No Place To Hide campaign, which aims to increase public awareness of this issue and to engage with tech companies to find a solution that keeps children safe online without compromising user privacy.
Why does this matter? Most social media companies currently monitor child sex abuse on their platforms and report it to the police. But E2EE scrambles the data, meaning platforms will no longer be able to detect this abuse. Facebook currently makes 94% of all reports of suspected child sex abuse online, so if it goes ahead with plans that will prevent it from doing this, the impact will be huge. If E2EE is rolled out without safeguards, it is estimated that over 14 million reports of suspected child sex abuse will go undetected.
How will this affect children? Law enforcement bodies around the world have been clear that E2EE makes it harder for them to stop child sexual abuse online. In the UK, 650 children are protected every month thanks to these reports. If E2EE is introduced as planned, it will result in fewer children being safeguarded, fewer child sex abusers being caught, and less CSAM (images and videos of child sexual abuse) being identified and removed from platforms, which causes ongoing harm and trauma for the victims.
Do some platforms already use E2EE? Some apps (for example, WhatsApp) already have E2EE, but on these you usually need to know someone and have their phone number to make contact with them. Platforms like Facebook and Instagram are different: abusers use them to browse for children they don't already know, before moving the conversation to a more private platform where they can continue to groom and abuse the children without fear of being caught.
Do you support privacy? We fully support strong user privacy and encryption. But we also support children's right to be protected from child sex abusers, and strongly believe that the children depicted in CSAM shared online also have a right to privacy – to have those images identified and removed from the internet. We want to work with social media companies to find a solution that keeps children safe online without compromising user privacy.
Is there a solution that protects privacy and children's safety? Yes. The UK Safety Tech Challenge Fund is funding projects to develop innovative solutions to keep children safe in end-to-end encrypted environments, whilst upholding user privacy. Tech experts like Professor Hany Farid are also clear that if the motivation is there, solutions will be found.
Watch our YouTube video here.