Case Study
Apple Walks Back Anti-CSAM Feature Over Surveillance Concerns
Apple ultimately took the right step to safeguard user privacy, security, and free speech by scrapping a controversial plan to scan users’ iCloud accounts to flag content (such as photos sent over iMessage) that might be abusive or exploitative. The initial announcement had been met with widespread criticism, with over 90 civil rights and policy groups, as well as security experts, condemning the mechanism as a problematic surveillance technology that would undermine the privacy and security of Apple customers. After this extensive public backlash, Apple put its plan on hold and then announced in 2022 that it would not move forward. The company committed to finding other ways to help prevent child sexual abuse material while preserving its users’ rights to privacy and free speech.