PARTNER WITH YOUR USERS

Put users in control and stand up for their rights.

Even if you plan to offer your product “for free” and generate revenue from advertising or other means, it is in your best interest to treat your users as partners: recognizing and respecting their expectations, giving them the tools to make their own decisions about their personal information, and standing up for them when they are unable to defend themselves. By doing so, you may not only avoid the fallout when users are unpleasantly surprised by how their data are used, but also find that users who trust you are more willing to pay for or engage with your service.

IDENTIFY AND RESPECT EXISTING USER EXPECTATIONS.
Many privacy catastrophes occur because companies focus on the internal value of collecting or sharing data without adequately considering the wider effects on users or the general public. By examining your product from various points of view, for example by bringing in focus groups or outside advisors to evaluate the consequences of a new product or feature, you can better anticipate and design for users’ actual expectations.
Case Study

iRobot Makes a Privacy Mess With Plans to Sell Home Data

iRobot faced widespread media backlash and consumer outcry when it came to light that the company was considering selling maps of customers’ homes for additional revenue. Though iRobot’s privacy policy states that its Roomba robotic vacuum collects data as it cleans, consumers and privacy advocates were shocked to learn that iRobot might “reach a deal” to sell this data, including details about home interiors, to major tech companies involved in developing “smart home” technology. Amid the outcry, the company was forced to walk back its statements and clarify that it would not undertake any mapping or data extraction efforts without consumer opt-in.

Case Study

Facebook Criticized for Conducting Secret Experiments on Users

Facebook was lambasted in the press for conducting an “emotional manipulation” study on its users without oversight or user permission. Facebook’s experiment involved selectively showing users certain kinds of posts by their friends and measuring whether the altered feed affected the users’ moods. When the study was revealed, both the press and the public railed against the company for treating users like guinea pigs without their informed consent. After weeks of bad press and scrutiny from scientific groups, Facebook publicly apologized and agreed to new procedures designed to protect users in future research.

Case Study

Microsoft in Hot Water After Search of Hotmail Account

Microsoft was roundly criticized for broadly interpreting its own privacy policy to justify searching a Hotmail account for the company’s benefit. As part of a search for an insider who had leaked company information, Microsoft scoured the Hotmail inbox of the blogger publishing information on the leaks. The company initially defended its actions when the search became public, claiming its privacy policy authorized the search. After significant backlash from users and the press, Microsoft reversed its position, apologized for its actions, and changed its privacy policy to require the company to refer matters involving Microsoft property to law enforcement.

USE OPT-IN FOR ANY CHANGES THAT MIGHT CONFLICT WITH USER EXPECTATIONS.
Although it is important to notify users about any change that impacts their privacy, it is especially important to inform users and obtain their consent when you make a change that directly conflicts with their current expectations. Users who are not adequately informed and given an opportunity to opt in to a new feature may view the change as a betrayal of their trust.
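In practice, the opt-in principle reduces to a default of “no” that only an explicit user action can change. The following sketch is illustrative only and is not drawn from any product in these case studies; the names (ConsentRecord, has_opted_in, the “new_data_sharing” purpose) are assumptions chosen for this example.

from dataclasses import dataclass, field

@dataclass
class ConsentRecord:
    """Explicit opt-ins recorded per user; the absence of a record means 'no'."""
    user_id: str
    opted_in_purposes: set = field(default_factory=set)

def has_opted_in(record: ConsentRecord, purpose: str) -> bool:
    # The safe default: no recorded opt-in means no consent.
    return purpose in record.opted_in_purposes

def may_share_data(record: ConsentRecord) -> bool:
    # A new data use that conflicts with existing expectations stays off
    # until the user explicitly opts in after clear notice.
    return has_opted_in(record, "new_data_sharing")

# A user who never acted on the notice gets the privacy-preserving default.
user = ConsentRecord(user_id="example-user")
assert may_share_data(user) is False

# Consent flows only from an explicit user action, never from a policy update.
user.opted_in_purposes.add("new_data_sharing")
assert may_share_data(user) is True

The design choice worth noting is that updating your privacy policy or shipping the feature never flips the flag; only the user’s own recorded action does.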
Case Study

Google Buzz Stung for Exposing Private Contact Details

In early 2010, Google tried to jump on the social networking bandwagon by releasing its own service, Google Buzz. But the biggest buzz about the new service focused on privacy because Google pre-populated “following” lists with frequent chat and email contacts and made that information public by default. Media articles called Buzz a “privacy nightmare” and warned that Buzz “managed to completely overstep the bounds of personal privacy.” Within weeks of launch, Google Buzz became the subject of an FTC privacy complaint and a class action lawsuit that resulted in an $8.5 million settlement. Google ultimately axed the entire Buzz service.

87% of global consumers think there should be laws prohibiting companies from buying and selling data without opt-in consent. (2014)
