
EU Proposes New Law to Allow Bulk Scanning of Chat Messages


The European elections have ended, and the European football tournament is in full flow; why not allow bulk searches of people's private communications, including encrypted ones? Activists around Europe are outraged by the proposed European Union legislation. 

The vote by EU governments on Thursday, at a key Permanent Representatives Committee meeting, would not have been the final hurdle for the legislation, which aims to detect child sexual abuse material (CSAM). In any case, the contentious question was pulled from the agenda at the last minute.

However, experts believe that even if the EU Council approves the Chat Control regulation later rather than sooner, it could still be enacted at the end of a difficult political process. Activists have therefore urged Europeans to take action and keep up the pressure.

EU Council deaf to criticism

A regulation requiring chat services such as Facebook Messenger and WhatsApp to sift through users' private chats for grooming and CSAM was first put forward in 2022.

Needless to say, privacy experts denounced it, with cryptography professor Matthew Green stating that the document described "the most sophisticated mass surveillance machinery ever deployed outside of China and the USSR."

“Let me be clear what that means: to detect “grooming” is not simply searching for known CSAM. It isn’t using AI to detect new CSAM, which is also on the table. It’s running algorithms reading your actual text messages to figure out what you’re saying, at scale,” stated Green. 

However, the EU has not backed down, and the draft law is still working its way through the system. Specifically, the proposed law would establish an "upload moderation" system to analyse all digital messages, including shared images, videos, and links.

The document is rather wild. Consider end-to-end encryption: on the one hand, the proposed legislation states that it is vital; on the other, it warns that encrypted messaging platforms may "inadvertently become secure zones where child sexual abuse material can be shared or disseminated."

The method appears to involve apps such as WhatsApp, Messenger, or Signal scanning message content on the user's device before it is encrypted, an approach generally known as client-side scanning. As a way of preserving end-to-end encryption, that sounds unconvincing, and it most likely is.
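To make the idea concrete, here is a minimal sketch of where such a pre-encryption check would sit in a messaging client. The hash list, function names, and reporting hook are hypothetical assumptions for illustration only; they are not taken from the proposal or from any real messenger's code, and real proposals lean on perceptual hashing and AI classifiers rather than the plain cryptographic hash used here.

```python
import hashlib
from typing import Callable, Set

# Hypothetical database of digests of known prohibited images (illustrative only).
KNOWN_HASHES: Set[str] = {"placeholder-digest-1", "placeholder-digest-2"}


def matches_known_material(attachment: bytes) -> bool:
    """Check an attachment against the hash list.

    A plain SHA-256 lookup is used only to show where the check sits;
    real systems would rely on perceptual hashing or ML classifiers.
    """
    return hashlib.sha256(attachment).hexdigest() in KNOWN_HASHES


def report_to_authority(attachment: bytes) -> None:
    """Hypothetical reporting hook; a real client would forward a report."""
    print("attachment flagged for review")


def send_attachment(attachment: bytes,
                    encrypt: Callable[[bytes], bytes],
                    transmit: Callable[[bytes], None]) -> None:
    # The controversial step: content is inspected on the device
    # *before* end-to-end encryption is applied.
    if matches_known_material(attachment):
        report_to_authority(attachment)
        return
    transmit(encrypt(attachment))
```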

Even if the regulation is approved by EU countries, additional problems may arise once the general public becomes aware of what is at stake. According to a study conducted last year by the European Digital Rights group, 66% of young people in the EU oppose the idea of having their private messages scanned.

Grooming Cases Reach Unprecedented Heights Amidst Regulatory Delays

Campaigners are calling for no further delays to the Online Safety Bill being pushed through by the Government, after thousands of online grooming crimes were recorded while updated online safety laws were awaited.

The long-awaited bill has gone through repeated changes and delays on its way to becoming law, which it is now expected to do in the autumn. Ministers have also come under fire in recent days from tech companies over what they see as an attempt by the government to undermine encryption technology.

The NSPCC has called for support for the bill after announcing that UK police forces have recorded 34,000 online grooming offences against children and young people over the last six years. The charity began calling for more robust online safety regulation to protect users in 2017.

NSPCC statistics, based on data obtained from 42 UK police forces, show that 6,350 offences of sexual communication with a child were recorded last year, an increase of 82 per cent since the offence was introduced in 2017/18.

The charity also noted that, where the victim's gender could be determined, 83 per cent of social media grooming cases over the last six years involved girls. According to the police data, more than 150 apps, games and websites were used to target children. The NSPCC believes the Bill is indispensable if children are to be protected from abuse and neglect.

If the law passes, firms and big tech bosses will face stricter responsibilities for protecting young users. The NSPCC nevertheless wants assurances that new technologies, including artificial intelligence, will also be covered by the legislation.

An analysis of the data shows that 73% of the reported crimes involved Snapchat or an associated website, and that 5,500 of the incidents involved children between the ages of 5 and 12. With the summer recess ending in a few weeks, parliament will resume sittings to wrap up the debate on the bill, which is expected to pass soon after.

Meanwhile, a severe impasse in the UK is threatening the future of end-to-end encryption. A growing number of tech companies offer encrypted messaging services to satisfy users' demands for more privacy: with end-to-end encryption, a message can be read only by the sender and the recipient, and in most cases not even the tech companies themselves can access it.
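As a rough illustration of the property being described, the sketch below uses the PyNaCl library (an assumption made purely for illustration; messengers such as WhatsApp and Signal actually use the more elaborate Signal protocol). The point is that a relaying server only ever handles ciphertext it cannot read, and decryption requires the recipient's private key.

```python
from nacl.public import PrivateKey, Box

# Each party generates a key pair; only the public halves are ever shared.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at noon")

# A relaying server would only ever see `ciphertext`, which it cannot read.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at noon"
```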

Most of us would agree that privacy is something we cherish, but pursuing it carries an element of risk that cannot be ignored. On many platforms these privacy features are available to everyone, and the platforms argue that they offer extra protection for people such as victims of domestic abuse, journalists and political activists. They also argue that adding a backdoor to their services would undermine the security of their systems for everybody.

Although the tech industry and legislators agree that something needs to change, the trade-off between privacy and security has prevented any meaningful progress from being made.

The latest draft of the Online Safety Bill demands a backdoor through which the authorities would be able to access content on social media services for the purposes of lawful surveillance.

Nevertheless, tech companies are concerned that loosening protections against data scraping could give hackers and data thieves a window of opportunity to wreak havoc on our sensitive information by exploiting any loopholes. Social media platforms generally prefer to develop their own safety precautions as a proactive way of preventing the spread of child sexual abuse material (CSAM), and they use updates to tighten their grip on other forms of harmful and age-restricted content so that children do not encounter it.

Even with the efforts of each individual company, the statistics indicate that the epidemic of online child grooming continues to worsen – an epidemic exacerbated by social media's unintentional role as a smokescreen for online child maltreatment. 

NSPCC chief executive Sir Peter Wanless said the figures demonstrate the considerable amount of child abuse taking place on social media and the human cost of fundamentally unsafe products. With so many offences committed against children online, he said, it is vital to remember how important the Online Safety Bill is and why children need the ground-breaking protection it will provide.

With Silicon Valley giants and government regulators continuing to butt heads, there has been speculation that the UK's communications regulator, Ofcom, may step in and bring about changes that affect the entire industry. It will be interesting to see whether the right balance can be struck so that both sides benefit. What is clear, though, is that it will unequivocally take all hands on deck to get the job done.