Case Study
Apple Defies Government Demand for Backdoor to Massive Acclaim
Following the tragic San Bernardino terrorist shooting, the United States government...
Apple faced widespread public outcry after it announced that it would automatically scan images in users' iCloud accounts, iPhones, and iPads for Child Sexual Abuse Material (CSAM) without users' permission. Advocates and consumers pointed out that the proposal undermined end-to-end encryption, invaded the privacy of all users, and created a system governments could exploit to search people's personal information, impose censorship, and harm marginalized communities. Apple's proposed policy left customers confused and appeared to contradict its long-stated stance against government invasions of privacy. Following a massive backlash from the public and civil society, Apple announced it would postpone its rollout of the scanning system.