The 7 Things That Matter Most in Privacy
In late January we held a workshop that brought together some of the world's leading thinkers in online privacy, with everyone from the FTC to the EFF represented. We spent the day working to answer the question: What attributes of privacy policies and terms of service should people care about? If you are new to the project, please read the original blog post, as it will answer a number of the probable nagging questions (like how to make icons enforceable).
The “Should” is Key:
The “should” is critical here. Privacy policies are often complex documents that deal with subtle and expansive issues. A set of easily understood and universal icons cannot possibly encode everything. Instead, Privacy Icons should call out only the attributes that are not “business as usual”: the warning flags that your privacy and data are at risk.
Let’s take an example that came up at the workshop. Should we have an icon that lets you know that your data will be shared with 3rd parties? Isn’t 3rd party sharing intrinsically a bit suspect? The answer is a subtle no. Sharing with 3rd parties should raise a warning flag, but only if that sharing isn’t required. The classic example is buying a book on Amazon.com and getting it shipped to your home. Amazon needs to share your home address with UPS, and Privacy Icons shouldn’t penalize them for that necessary disclosure. In other words, Privacy Icons should only highlight 3rd party data sharing when you do not have a reasonable expectation that your data is being shared.
An example of the multi-state icons found on the cloth tags.
After synthesizing the input from the workshop as well as the numerous projects that have come before us, Lauren Gelman, Julie Martin, and I spearheaded the effort to boil down these “shoulds” into 7 attributes. The vision is that each attribute will correspond to an icon, and that each icon can have different states. A good example of a multi-state icon comes from the tag on your shirt that tells you how it should be cleaned.
Here is the proposal for the information architecture of the attributes for the Privacy Icon. To be clear, there are no physical icons yet. Once we have general consensus on the attributes, we’ll begin work on designing the graphics both directly and via a Design Challenge.
- Is your data used for secondary use? And is it shared with 3rd parties?
- Is your data bartered?
- Under what terms is your data shared with the government and with law enforcement?
- Does the company take reasonable measures to protect your data in all phases of collection and storage?
- Does the service give you control of your data?
- Does the service use your data to build and save a profile for non-primary use?
- Are ad networks being used and under what terms?
For companies that go above and beyond by retaining their data for only a minimum amount of time, with minimal exposure, etc., we can also provide a “best practices” icon.
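To make the proposed architecture concrete, here is a minimal sketch of how the attribute/icon/state model might be represented in code. Everything here is an illustrative assumption: there is no final spec yet, and the attribute names, state names, and ordering are hypothetical, chosen only to show the idea that each attribute maps to one icon with several states (like the multi-state laundry symbols on a clothing tag), and that only non-“business as usual” states raise a warning.

```python
# Hypothetical sketch of the proposed information architecture.
# Attribute names and states below are invented for illustration,
# not part of any actual Privacy Icons specification.

from dataclasses import dataclass


@dataclass(frozen=True)
class PrivacyIcon:
    attribute: str    # the question the icon answers
    states: tuple     # possible states, ordered from best to worst
    state: str        # the state a given site actually earns

    def is_warning(self) -> bool:
        # Only non-"business as usual" states should raise a flag;
        # we assume the first listed state is the unremarkable default.
        return self.state != self.states[0]


icons = [
    PrivacyIcon(
        attribute="secondary-use",
        states=("primary-use-only",
                "secondary-use-disclosed",
                "secondary-use-undisclosed"),
        state="primary-use-only",
    ),
    PrivacyIcon(
        attribute="data-bartered",
        states=("not-bartered",
                "bartered-disclosed",
                "bartered-undisclosed"),
        state="bartered-disclosed",
    ),
]

# A page would surface only the icons whose state warrants a warning.
warnings = [icon.attribute for icon in icons if icon.is_warning()]
print(warnings)  # ['data-bartered']
```

The point of the sketch is the two-level structure: a fixed vocabulary of attributes, each with an ordered set of states, so that “business as usual” can be silent and only the exceptions demand the user’s attention.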
Is your data used for secondary use? The European Union has spent time codifying and refining the idea of “secondary use”: the use of data for something other than the purpose for which the person it was collected from believes it was collected. Mint.com uses your login information to import your financial data from your banks — with your explicit permission. That’s primary use and shouldn’t be punished. The RealAge test poses as a cute questionnaire and then turns around and sells your data. That’s secondary use and is fishy. When you sign up to use a service, you should care whether your data will only be used for that service. If the service does use your data for secondary use, they should disclose those uses. If they share your data with 3rd parties, then they should disclose that list too.
Is your data bartered? You should know when someone is making a gain off your back. You should also know roughly how, and for what, your data is being bartered.
Under what terms is your data shared with the government and with law enforcement? Do they just hand it over without a warrant or a subpoena?
Does the company take reasonable measures to protect your data in all phases of collection and storage? There are numerous ways that your data can be protected: from using SSL during transmission, to encryption on the server, to deleting your data after it is no longer needed. Does the company protect your data during transmission, in storage, and from its own employees? This icon should tell you what the weak link is.
Does the service give you control of your data? Can you delete your data if you choose? Can you edit it? What level of control do you have over the data stored on their servers?
Does the service use your data to build and save a profile for non-primary use? This is a subtle one, as we want to include the concept of PII (personally identifiable information). What we are worried about are companies secretly building a dossier on you — say, by taking your email address and then buying more information about that email address from a 3rd party to get, say, your credit rating — and then using that profile for purposes to which you haven’t agreed.
Are ad networks being used, and under what terms? On the web most pages include ads of some form, and the prevalence of behavioral tracking is on the rise. Yahoo, for instance, can track you across 12% of the web (from personal correspondence). While letting users get a handle on ad networks is important, raising the alarm on every page would be counter-productive. We haven’t yet figured out how to handle ad networks and are looking for more thought here.
The next steps are to socialize this list of privacy attributes and, once we reach agreement, to begin the design process. Feedback is crucial at this juncture. Jump in.