A quick update - The European Commission and EU member states have been pondering, for years now, whether they should force WhatsApp/Apple/Signal/Telegram to scan all our private messages for suspected child sexual abuse material (CSAM). For various reasons it is a horrendous idea to break end-to-end encryption in this likely highly ineffective way. Variations of the proposal have also included a mandate to perform such scanning of images using AI, and even to read our text messages for signs of “grooming” of children.
Important excerpt:

“Introducing a scanning application on every mobile phone, with its associated infrastructure and management solutions, leads to an extensive and very complex system. Such a complex system grants access to a large number of mobile devices & the personal data thereon. The resulting situation is regarded by AIVD as too large a risk for our digital resilience. (…) Applying detection orders to providers of end-to-end encrypted communications entails too large a security risk for our digital resilience”.