Case Study
Never Again, Everalbum
Photo app Everalbum used billions of private, user-uploaded images to train “Ever AI,” a facial recognition product that it marketed to private clients, law enforcement, and the military. The company did all this without obtaining consent from users, and without even telling its millions of users that their personal photos would be used to help build surveillance technology. In fact, Everalbum updated its privacy policy to reflect the facial recognition side of its business only after media inquiries. Then the FTC took notice, and the resulting settlement and consent order required Everalbum to obtain express consent from users before feeding their images into facial recognition algorithms, and to delete all the algorithms it had illegally built using people’s private photos. As Commissioner Rohit Chopra wrote, “The case of Everalbum is a troubling illustration of just some of the problems with facial recognition.” Following this investigation, a rebrand of its surveillance technology under the name “Paravision,” and months of backlash, Everalbum finally closed its consumer photo storage business in August 2020.