Ten years ago, CyLab graduate student Aleecia McDonald walked into Lorrie Cranor’s office and asked, “What would happen if everybody read all of the privacy policies on all of the websites they visit?”
Cranor swiftly responded, “Don’t be ridiculous, that would never happen.” McDonald’s question, however, piqued Cranor’s interest: Why would that never happen?
“I study privacy policies, and I spend a lot of time reading them, and I do not spend 244 hours per year reading privacy policies,” Cranor told a crowd of hundreds at last month’s Enigma Conference in Oakland, California. “So a model that assumes consumers spend this much time is really unworkable.”
Cranor, a professor in Carnegie Mellon’s Institute for Software Research and the Department of Engineering and Public Policy, shared what she had learned over the years about privacy policies, and suggested a path forward toward improving their effectiveness.
“Things aren’t going to get any better in an Internet of Things world,” Cranor said. “We don’t have people going and seeking out privacy policies on websites, so we can hardly expect that people will walk around looking at all of their smart light bulbs, smart thermostats, and drones for privacy policies.”
In the recent past, Cranor’s group developed a “Privacy Nutrition Label,” which simplified the most important information in privacy policies down to a simple table of facts, similar to a nutrition label. Other groups have distilled privacy policies down to simple symbols or icons, and some have even turned privacy policies into a video game.
But do any of these techniques work? To answer that question, Cranor pointed to user studies conducted at CyLab. One such study involved a “Privacy Facts” notice for Android apps: the researchers found that users tended to choose apps that appeared more privacy-friendly, according to the “Privacy Facts” being displayed.
“If you’re involved in a project where you need to present users with a privacy notice or security warning, or any other type of disclosure, you need to insist that the disclosure gets tested with users,” Cranor said. “This way, you can ensure that users will actually notice the disclosure, they’ll understand the disclosure, and they’ll be able to use it to make decisions that may impact their behavior.”