December 31, 2003

Privacy and New Technology: System Openness, User Control, and Good Interface Help Make Users Feel More Comfortable, But So Would a Blanket Privacy Policy

Ross Mayfield has a really interesting discussion roundup on his site about users driving policy. As the discussions spread across various blogs and became more specific, much of the conversation centered on privacy and social norms, particularly mismatched expectations between users and a system's designers. Design choices at the development level are key to narrowing that gap: giving users control and notice, along with a good interface, lets them easily understand their options and make choices that suit their privacy needs and their expectations for their information. But I keep returning to the feeling that, regarding privacy, we really need a blanket privacy policy to make users feel comfortable as they interact in the digital world and on the internet. Better interfaces, user control, and system openness alone cannot resolve this, though they are key to making information technologies work well at the system level, producing more informed users and integrity in the relationships between systems, users, and their data.

Systems and companies may make some relatively small amount of money now by using information collected from and about users for purposes the users never intended, outside their relationships with those specific companies. But instances like those discussed below leave users worried and sometimes outright scared, so they refuse to participate in a system or with a company at all, or find themselves shocked after the fact by the results of their interactions. Unless people feel comfortable and protected, the profits from selling or manipulating user data in unintended ways will remain small compared to the tremendous amount of money to be made in web services, social networks, and all sorts of other information technologies if most users participated because they felt safe.

Most users will not now participate in information technology systems that require a lot of personal data unless they get something in return, and even then, the participants are a subset of the total internet population. Suppose users really trusted that they were in control of their own data, knew when it traveled beyond a specific company's systems and relationships, and could decide when and where to participate, instead of operating in a state of uninformed fear as companies offer little or no privacy policy and there is little in the way of overall government protection. Then the companies (and many new ones built on new technologies) using exactly this kind of personal data could make many times what they do now. It is short-term greed that keeps companies operating as they do, which keeps users from participating, which leaves few participants out of everyone using the internet. Policies also vary confusingly and unreliably from one company to the next, and are not something people can or want to keep track of; that confusion further reduces participation. I believe the only route to real development of information technology built on personal data, and the profits that would follow, is a blanket policy that every company must follow, assuring customers of the privacy of their own data. Users would feel secure, many many more would participate, and those companies would make far more than they have under the current (non)privacy regime.

The discussion Ross catalogued partly centered on this: Danah Boyd responded to Wendy Seltzer, who was responding to Cory Doctorow's claim that the last twenty years have been about technology and the next twenty will be about policy. Wendy said she originally thought the privacy tensions brought about by new technologies might ease as people became more sophisticated users, but instead she saw a persistent gap: a critical mass of users will always lag behind new developments while they learn an information technology well enough to overcome, accept, steer away from, or rearrange its privacy breaches, so the social norms those technologies provoke lag as well. Danah replied that social norms aren't falling behind; they are heading in one direction while technologies are developed in another, and the divergence baffles the norms trying to cope.

I think in a way they are both right; both scenarios can exist with the same technology, depending on use and result. It's not only lagging user competency, with users eventually adjusting and making some mental calculation about how much privacy or control a new technology costs them. And it's not just diverging social norms. Choices on the design and development end, like notice, good interface, and user control, can also counter these problems by letting users see privacy issues immediately and deal with them as they use a new technology, instead of discovering the loss of privacy when it's too late. Technologists can do much better with design, corporate privacy policies could be much better, and users could do better at learning new technologies and protecting their own privacy as needed. But for most people and companies, the benefits will come when users know they are protected and understand a basic structure of privacy that holds across companies and websites, which all interested parties can rely on. That will lead users to release information, and interesting uses of people's data will follow while still maintaining privacy and user control.

And yet technology development sometimes moves in exactly the opposite direction, scaring users and reducing participation in systems that might benefit us all if they were well designed and widely used, with privacy built into the architecture and privacy treated as a given right between users and the entities with whom they deal.

John Battelle points to a particularly disconcerting social and privacy issue raised by a new web service, Cardbrowser. Apparently they have 17,000 (and counting) business cards collected from some major conferences, with no privacy policy posted. There is little information about whether the people who handed over those cards, presumably to make a person-to-person contact, not to be entered into a database searchable by the whole internet (though this is unknown because Cardbrowser publishes nothing about its data or privacy policies), knew the cards would end up there, or whether they have any control over their own information, or for that matter whether the companies on those cards know. And consider that, without your approval, Cardbrowser is linking and distributing your name, title, company name, phone numbers, location, attendance record, and dates. Combined with other personal information in publicly available databases, that could enable even greater matching and sifting of personal digital identities, exposing things people don't want just anyone to see without some reason, a warrant, or some kind of permission and reciprocity (as our current analog social norms often dictate).

Similar issues exist with your cell phone keeping tabs on you. There's good and bad in systems like that: some users want to keep track of their kids, which may not be objectionable, but others, including companies that buy phones for their employees, may track for reasons that are totally unacceptable. These kinds of information technologies allow uses that previously didn't exist, so there is a lag before a critical mass of users understands what is happening and does something about it, or at least has notice that the shift has occurred and can choose when to allow it, or self-censor.

In the case of the tracking phones, it becomes a matter of each user knowing when the tracking is turned on, and having control over that tracking. It's a matter of notice, and a matter of interface. A good interface, on any system that tracks your movements and your private, semi-private, semi-public, and public behavior, would show the tracking and offer control choices at the time of use. But well-designed systems are rare today, and it's the invisible nature of the tracking, and our distance from the data it produces, that causes consternation and upset. A blanket privacy policy would alleviate many fears, open up many new possibilities for information technology development, and bring companies many new customers with whom to develop relationships.

Posted by Mary Hodder at December 31, 2003 04:27 PM | TrackBack