BE TRANSPARENT

Give users the ability to make informed choices.

The first step in establishing and maintaining a trust-based relationship with your users is giving them the information they need to make informed decisions. Doing so not only helps prevent surprises that can lead to backlash but can also build loyalty among your current users and help you recruit new ones.

CLEARLY EXPLAIN WHAT DATA YOU COLLECT AND HOW YOU USE IT.
Many privacy fiascos are triggered when users are unpleasantly surprised to learn how a service actually works and how their personal data has been or could be collected and used. You can help avoid surprises that lead to user backlash by making your privacy practices accessible and easy to understand. Short-form privacy policies for mobile, Frequently Asked Questions pages, and visual forms of communication such as videos and graphics can also help your users understand your privacy practices.
Case Study

Spotify’s Problem “Isn’t Privacy, It’s Terrible Communications”

Spotify was widely criticized for the difficult-to-decipher language in an update to its privacy policy released in mid-2015. Many users were confused by the policy and believed that Spotify wanted to track users “like a jealous ex.” Spotify’s CEO issued a public apology and rewrote the policy to be clearer, but the public relations damage from the “ridiculous” policy was done.

Case Study

DuckDuckGo Rewarded for Keeping Privacy Simple

Search engine DuckDuckGo reaped the benefits of having clear and privacy-friendly policies written in understandable English.

Its privacy policy starts with a clear statement that “DuckDuckGo does not collect or share personal information,” followed by an explanation about why users “should care.” This policy has been highlighted by the press, helping the company experience a 600 percent increase in traffic in the wake of the 2013 NSA revelations.

“Transparency is an essential element of trust, and consumers rank transparency as the most important thing organizations can do to build and grow trust when it comes to dealing with their personal data” (2022).


CLEARLY EXPLAIN HOW INFORMATION IS SHARED WITH OTHERS.
Because many users are particularly concerned about how and whether their data is shared with third parties, making sure that your users understand your data-sharing practices is essential to earning their trust and avoiding misunderstandings or backlash. Make it easy for users to understand who can view or access their information, how it can be used, and how your company ensures that it is not misused.
Case Study

Lenovo Shamed for PCs Secretly Preinstalled with “Nefarious” Adware

Lenovo was lambasted in the press after security researchers revealed that the PC maker was selling computers secretly preinstalled with “nefarious” adware that not only collected information about users’ online activity but also made encrypted web sessions vulnerable to attacks. The adware, from a company called Superfish, posed a sufficiently serious threat that the Department of Homeland Security warned Lenovo customers to remove it immediately. Lenovo’s actions not only damaged its reputation but also exposed it to a class action lawsuit for “compromising user security and privacy.”

Case Study

Premom Broke Privacy Promises in a Post-Roe World

The FTC sued Premom—an ovulation tracking app—for breaking “its promises and compromis[ing] consumer privacy” by deceptively sharing personal information and violating the FTC’s Health Breach Notification Rule by failing to notify users of these unauthorized disclosures. Contrary to Premom’s direct promise not to share personal information with third parties without user consent, the company integrated software development kits from third-party marketing firms, which shared information that could associate fertility or pregnancies with a specific individual. Premom also failed to properly encrypt the information it shared with third parties. Easy Healthcare—which owns Premom—agreed to pay $200,000 in fines and is banned from sharing personal information with third parties for advertising.


81% of Americans are concerned about how companies use information collected about them (2023).


The majority (79%) of consumers surveyed believe technology providers are not very clear about their privacy and security policies, and 79% also feel it’s not very easy to control the information their technology providers collect about them (2024).


FOLLOW YOUR PRIVACY POLICY.
Your privacy policy is a contract with your users. Failing to live up to your privacy promises may not only anger users but also result in fines and lawsuits. Make sure that your privacy policy is accurate and that everyone who has access to personal data understands and complies with it.
Case Study

Sell Data and Say You Didn’t: How Flo Health Got In Trouble with the FTC

The FTC sued Flo Health—a once-popular women’s health app used for tracking reproductive health information such as menstrual cycles and pregnancies—for lying to users about its privacy policies and claiming to keep health information private while actually selling the information to third parties. Flo Health misrepresented its compliance with international privacy laws that require notice, consent, and protection of personal information transferred to third parties, and it agreed to settle the FTC’s charges in 2021. Under the settlement, Flo was required to obtain consent before sharing a consumer’s health information with a third party, undergo a compliance review conducted by a third-party expert, notify users of the app that Flo had shared information about their periods and pregnancies with third parties in violation of its privacy policies, and post a similar notice on its website.

Case Study

RadioShack Hammered for Unauthorized Sale of Customer Data

RadioShack was widely condemned when it announced plans to sell tens of millions of customers’ data in bankruptcy proceedings even though it had promised in its privacy policy not to sell or share any of that information. The sale was put on hold after the Texas and Tennessee Attorneys General filed suit and the FTC requested that RadioShack restrict the use of any data sold given the “potential deceptive nature of the transfer.” The press chimed in, calling the company’s behavior “obnoxious.” The company was ultimately forced to destroy most of the data at issue and to require the purchaser to comply with RadioShack’s prior privacy promises.

Case Study

International Regulators Say Meta GenAI Generates Major Privacy Concerns

Meta ended up in hot water with international regulators, faced the threat of steep daily fines, and was forced to pause the release of its generative AI products in the EU and Brazil after data protection authorities raised serious privacy concerns about the company’s use of personal information for its AI service. The Irish Data Protection Commission requested the pause after “uncertainty over how Meta would train its new AI systems using Meta users’ personal data from photos, comments and other content on public posts.” The Brazilian National Data Protection Authority demanded that Meta suspend its practice of using personal information published on its platform to train AI, calling the practice an imminent risk of serious harm, and threatened to fine the company almost $9,000 per day if it failed to comply.


NOTIFY USERS ABOUT ANY CHANGES BEFORE THEY TAKE EFFECT.
Users are more likely to embrace new or improved functionality or changes to your privacy practices if they are not taken by surprise. Prominently disclosing meaningful changes in the way your product or service collects data, giving users the opportunity to provide input and express concerns, and obtaining opt-in consent can help your company avoid controversy.
Case Study

LinkedIn Caught Trying to Quietly Loop In User Information to Train AI

LinkedIn became noisy with angry posts when the company was not fully transparent with users and tried to silently opt people into use of their information for AI. LinkedIn automatically “opted accounts into training generative AI models” before updating its terms of service and privacy policies, forcing users to dig into their settings and find a toggle switch to turn it off; any information collected while people were automatically opted in may be impossible to claw back. User backlash was swift, with people writing publicly, “Thought you couldn't hate LinkedIn more? You were wrong” and “LinkedIn sucks… it’s about to suck even harder with the help of AI.” LinkedIn eventually updated its privacy policy and provided notice to users. But just as “what’s done is done” for the information already collected, it will be hard for the company to reverse the loss of user trust caused by opting people in rather than being transparent and giving its professional user base a real choice.


Case Study

Zoom Tries to Blur Its Own Background—Quietly Changes Terms of Service to Allow AI to Train on User Information

Zoom failed to give its users control over their personal information when it quietly updated its terms of service, requiring users to sacrifice their privacy in order to fuel Zoom’s artificial intelligence program. Specifically, the new terms of service required users to agree to “grant Zoom a perpetual, worldwide, non-exclusive, royalty-free, sublicensable, and transferable license” for various purposes, including “machine learning, artificial intelligence, training, testing, improvement of the Services, Software, or Zoom’s other products, services, and software, or any combination thereof.” Users raised alarms and called on privacy advocates and legal experts to consider whether Zoom’s policy was acceptable in terms of consent, data privacy, and individual rights, with one user saying: “I'm afraid unless you give users the option to opt out of having their data used in training AI, we're going to have to boycott your product. Unless you can verify that video content itself is not used in training. This is horrible optics.”

A recent study showed that it would take a full workweek (46.6 hours) to read the privacy policies of the 96 websites Americans typically visit monthly (2023).

