Case Study
SafeRent’s Algorithms Accused of Discrimination
SafeRent, a company that creates algorithms landlords can use to screen tenants, has been accused of discrimination.
Facebook was lambasted in the press for conducting an “emotional manipulation” study on its users without oversight or user permission. In the experiment, Facebook selectively showed users certain kinds of posts by their friends to determine whether the altered feed affected the users’ moods. When the study was revealed, both the press and the public railed against the company for treating users like guinea pigs without their informed consent. After weeks of bad press and scrutiny from scientific groups, Facebook publicly apologized and agreed to new procedures designed to protect users in future research.