Lokesh
@lokeshlucky061
🚨 Apple sued over abandoning CSAM detection for iCloud

Apple is facing a lawsuit for not implementing a system to scan iCloud photos for child sexual abuse material (CSAM). The lawsuit claims that Apple's failure to act forces victims to relive their trauma, despite the company previously announcing plans to strengthen child protection measures. In 2021, Apple proposed using digital signatures from the National Center for Missing and Exploited Children to detect known CSAM in iCloud photo libraries, but later abandoned the plan amid privacy concerns.