Apple Defends Plan to Scan Photos for Illegal Child Sex Abuse Material

Apple defended its new system that will scan iCloud for illegal child sexual abuse material, or CSAM, amid controversy over whether the system erodes user privacy and could be repurposed by governments to surveil citizens. Last week, Apple announced it had started testing a system that uses sophisticated cryptography to identify when users upload collections of known CSAM to its cloud storage service: images are fingerprinted on-device with Apple's NeuralHash perceptual hash and compared against a database of hashes of known abuse imagery, and an account is flagged only after a threshold number of matches.
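To make the threshold-matching idea concrete, here is a minimal sketch in Python. It is not Apple's implementation: the hash function (a plain SHA-256 over file bytes), the database contents, and the threshold value of 30 are all stand-ins, and Apple's real system uses its proprietary NeuralHash perceptual hash together with private set intersection and threshold secret sharing rather than the plaintext comparison shown here.

```python
import hashlib
from typing import Iterable, Set

# Hypothetical stand-in database of fingerprints of known abuse imagery,
# of the kind distributed by child-safety organizations. Left empty here.
KNOWN_HASHES: Set[str] = set()

# Stand-in threshold: an account is flagged only once enough uploads match,
# mirroring the collection-based design Apple described.
MATCH_THRESHOLD = 30


def fingerprint(image_bytes: bytes) -> str:
    """Return a fingerprint for an image.

    A real perceptual hash like NeuralHash survives resizing and
    re-encoding; an exact cryptographic hash, used here only for
    illustration, does not.
    """
    return hashlib.sha256(image_bytes).hexdigest()


def count_matches(uploads: Iterable[bytes]) -> int:
    """Count how many uploaded images match the known-hash database."""
    return sum(1 for img in uploads if fingerprint(img) in KNOWN_HASHES)


def should_flag_account(uploads: Iterable[bytes]) -> bool:
    """Flag an account only when matches reach the threshold."""
    return count_matches(uploads) >= MATCH_THRESHOLD
```

The threshold is the design point Apple emphasized in its defense: a single accidental or adversarial match reveals nothing, because the cryptography prevents Apple from decrypting any match information until the account crosses the threshold.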

  • Read the article: CNBC