RESPECT YOUR DATA

Limit and protect the data you collect and retain.
Protecting your users’ privacy requires you to be thoughtful about the data you collect and hold. By carefully considering the costs and benefits of collecting data and by properly safeguarding the information that you do collect, you may prevent privacy harms and increase consumer trust in your product. 
ENSURE YOU DO NOT USE DATA IN WAYS THAT HARM USERS.

If you aren’t careful, making decisions based on data can replicate the biases and discrimination that can exist in the real world, undermining user trust. This is especially true for decisions that impact employment, health, education, lending, or even policing and public safety for users, especially those in vulnerable communities. You can avoid this by carefully scrutinizing the ways in which your data-driven decisions affect your users and taking proactive steps to prevent or counteract these potential harms. Such steps protect the users you have and help attract new ones seeking a fair and equitable service.
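One proactive step is to audit a data-driven decision for uneven outcomes across user groups before it ships. The sketch below is a minimal, hypothetical illustration of such an audit: it computes per-group approval rates and the ratio between the lowest and highest rate (a figure often checked against the "four-fifths rule" of thumb). The group labels and decision records are invented for illustration.

```python
# Hypothetical audit sketch: check a yes/no decision for
# disparate impact across user groups.

def selection_rates(decisions):
    """Approval rate per group, from (group, approved) pairs."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        if ok:
            approved[group] = approved.get(group, 0) + 1
    return {g: approved.get(g, 0) / totals[g] for g in totals}

def disparate_impact_ratio(rates):
    """Lowest group rate divided by the highest.

    A common rule of thumb (the "four-fifths rule") flags ratios
    below 0.8 for closer human review.
    """
    return min(rates.values()) / max(rates.values())

# Invented example data: (group, decision) pairs.
records = [("A", True), ("A", True), ("A", False),
           ("B", True), ("B", False), ("B", False)]
rates = selection_rates(records)
print(rates)                          # per-group approval rates
print(disparate_impact_ratio(rates))  # well below 0.8 -> review
```

A low ratio does not prove discrimination, and a high one does not rule it out; the point is to surface skewed outcomes early enough that a human can investigate the cause.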

Case Study

Google Heavily Criticized for Racially-Biased Search Results

Google received a barrage of negative press after an academic study demonstrated that search results for traditionally-black names disproportionately included ads suggestive of a criminal record. The research showed that these names were 25 percent more likely to be served with an arrest-related ad even if the subjects had no criminal record, raising the risk that employers might wrongly assess innocent applicants. Even though both Google and the study itself suggested the results were the product of the company’s algorithm, critics attacked the company for its racially-biased results and called the study a “powerful wakeup call.”

MINIMIZE THE LINKS BETWEEN YOUR DATA AND INDIVIDUAL USERS.

Tying identifiable data, including IP addresses or account information, to other records can increase the risk of harm to your users if a breach occurs and, as a result, may make your company more vulnerable to expensive lawsuits and government fines. Explore approaches that effectively mask user identity while preserving the business value of collected information, and be particularly careful not to accidentally disclose identifiable data alongside other potentially sensitive records.
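One such masking approach is pseudonymization: replacing a direct identifier with an opaque token before the record is stored or shared. The sketch below is a hypothetical illustration using a keyed hash (HMAC); the key name and record fields are invented, and, as the Strava and Netflix cases show, pseudonyms can still be linkable, so this is a starting point rather than a complete anonymization scheme.

```python
# Hypothetical sketch: replace a raw identifier (here, an IP
# address) with a keyed pseudonym before storing the record.
import hashlib
import hmac
import secrets

PSEUDONYM_KEY = secrets.token_bytes(32)  # keep separate from the dataset

def pseudonymize(identifier: str) -> str:
    """Deterministically map an identifier to an opaque token."""
    digest = hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256)
    return digest.hexdigest()[:16]

record = {"ip": "203.0.113.7", "pages_viewed": 12}
safe_record = {"user_token": pseudonymize(record["ip"]),
               "pages_viewed": record["pages_viewed"]}
# The same input always maps to the same token, so aggregate
# analysis still works, but the stored record no longer carries
# the raw IP address.
```

Because the mapping depends on a secret key rather than a plain hash, an attacker who obtains the dataset cannot simply hash candidate IP addresses to re-identify users; protecting and rotating that key becomes its own obligation.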

Case Study

All Your Base Are Belong to Us: Strava's Heat Map Firestorm

The exercise-tracking app Strava faced a barrage of criticism after a researcher discovered that the company's Heat Map feature - which displayed aggregated exercise data on a map - revealed sensitive user activity, including the potential locations of secret U.S. military installations overseas, to the surprise of many users, including those who had adjusted separate privacy settings. Others pointed out that “anonymized” data incorporated into the Heat Map was, in fact, linkable to individual Strava users. Users seeking to opt out of the Heat Map found Strava’s privacy settings “difficult to find, understand, and use.” In a public statement, Strava bafflingly pointed users to these same controls, but not before prominent commentators called the mess-up a full-blown "data privacy debacle."

Case Study

Netflix Sued After Sending Not-So-Anonymous User Data to Researchers

In 2009, Netflix faced a flood of criticism, a class action lawsuit, and the loss of user confidence when it released a huge set of improperly anonymized data. While the company took some steps to remove personal identifiers, researchers were able to identify customers by comparing reviews with other online sources. In the aftermath, Netflix was hit with a class action suit filed on behalf of a Netflix customer who was concerned the data would reveal her sexual preference to her intolerant community. After waves of bad press, the case settled out of court for an undisclosed amount.
