Newly proposed UK and EU laws would require social media platforms to read people's messages, and WhatsApp is refusing on the grounds that this would violate users' privacy.
UK and EU governments are in hot pursuit of child abusers and want to fish them out through social media platforms. Their eyes are set on WhatsApp because its parent company, Meta Platforms, has proven to be a popular channel for criminals distributing child sexual abuse material (CSAM).
Meta accounts for more than 90% of the child abuse reports made to the UK’s clearing house. Last year, it made nearly 27 million reports of suspected child abuse to a national clearing house.
Despite these numbers, Meta has been accused of underreporting the situation for fear of landing in legal trouble by wrongfully accusing an adult of child abuse. It is a genuinely grey area for everyone involved.
In a bid to get black-and-white evidence, the proposed laws demand that messages be read so that CSAM can be correctly identified. This forces tech companies to walk a thin line between privacy rights and their responsibility to clamp down on child sexual abuse.
WhatsApp, known for protecting message privacy with end-to-end encryption, has picked its side of that line, and it's with the majority of its users.
WhatsApp head Will Cathcart said he will not make WhatsApp "less desirable to 98% of our users because of the requirements from 2%."
Claiming that WhatsApp reports more abuse suspicions than any other company, Cathcart explained that it had already implemented effective measures to clamp down on CSAM that do not require sacrificing everyone's security.
In 2021, in an attempt to balance privacy rights with its duty to clamp down on CSAM, Apple announced a feature that would scan iCloud Photos for child sexual abuse content. The backlash was so heavy that Apple removed any mention of the feature from its website.