I was speaking at the recent Google Zeitgeist conference in London. On one panel, a privacy advocate argued that she was against transparency, and that all this talk about openness was frightening. She argued that anyone who favors privacy should oppose transparency.
I, for one, am both a transparency advocate and a privacy advocate. Transparency is an opportunity, and even an obligation, for corporations and other institutions. But it is neither an opportunity nor an obligation for individuals. Individuals have an obligation to withhold and protect their personal information. Let me explain.
In MacroWikinomics (September 28, 2010), Anthony Williams and I look at how the Net is finally becoming the basis for commerce, work, entertainment, healthcare, learning and much human discourse, and how we are the better for it. But one consequence of these digital interactions is the spinoff of a staggering and ever-increasing volume of data. At Zeitgeist, Google CEO Eric Schmidt noted that between the dawn of civilization and 2003, humanity collected 5 exabytes of data (an exabyte equals 1 quintillion bytes). Today, 5 exabytes of data are collected every two days.
This has big implications for companies. People and institutions interacting with firms have unprecedented access to information about corporate behavior, operations, and performance. Armed with new tools to find information, a variety of stakeholders now scrutinize the firm like never before, informing others and organizing collective responses.
Customers can confidently evaluate the true worth of products and services. Employees share formerly secret information about corporate strategy, management and challenges. To collaborate effectively, companies and their business partners have no choice but to share intimate knowledge. Powerful institutional investors today own or manage most wealth, and they are developing x-ray vision. Finally, in a world of instant communications, whistleblowers, inquisitive media, and Googling, citizens and communities routinely put firms under the microscope.
A welcome upshot of increased scrutiny is that business integrity is on the rise. Companies need to do good – act with integrity – not just to secure a healthy business environment, but for their own sustainability and competitive advantage. Firms that exhibit ethical values and transparency have discovered that they can be more competitive and more profitable. Transparency is no longer simply an obligation to report information to an external party like a regulator or an institutional investor; it’s a new competitive force and an essential precondition for building productive relationships with stakeholders.
So far, so good. But the growing tsunami of data generated daily by digital interactions isn’t restricted to corporations. A lot of this data pertains to individuals, and much of it is controlled by third parties. Practical obscurity – the basis for privacy norms throughout history – is fast disappearing. More and more aspects of our lives are becoming observable, linkable and identifiable by others. Thanks to networked computing technologies, this personal data is archived online and will be forever searchable.
But this availability of personal information isn't just something being done to the public; it is being done by the public. Many of us are willing accomplices in dissolving our own privacy rights in exchange for new services, conveniences, and efficiencies. In 2005, before Facebook's rise, who would have predicted that hundreds of millions of people would voluntarily give up detailed data about themselves, their activities, and their likes and dislikes online every day? It's pretty clear that many of us give away too much of our personal information on Facebook, Twitter and other social networks. There are probably thousands of new graduates this year who won't get that dream job because an employer did a “reference check” online and found them doing something inappropriate.
This development is turning traditional privacy laws and regulations upside down. Privacy and data protection laws emphasize the responsibility of organizations to collect, use, retain and disclose (“manage”) personal information in a confidential manner. In contrast, collaborative networks encourage individuals to directly and voluntarily publish granular data about themselves, such as tagged photos, preferences/settings/likes, friends’ lists, and groups joined. The integration of personal profiles on networks such as Facebook with other online sites, communities and applications increases the damage.
Our digital footprints and shadows are being gathered, bit by bit, into a hundred thousand simultaneous locations. Toss in the emerging “augmented reality” tools where you point your mobile device at the street and it gives you real-time information about the world around you–everything from recognizing the faces of people nearby to letting you know about all the people on Twitter in your vicinity–and we can be sure that a ton of personal information about most of us is deeply and irrevocably embedded into the fabric of the Internet and available to the world.
To my astonishment, I run into people who argue this is a good thing, championing the notion of a new era of personal transparency. Perhaps this is what was confusing the Zeitgeist privacy advocate. For example, in the recently released book The Facebook Effect, author David Kirkpatrick reveals that some of the social network's management think that transparency is not just an opportunity for companies and other institutions to generate trust and be more effective; they think it's an opportunity for individuals to do the same. The more transparent we are, the more moral our behavior will be. I've often wondered why Facebook has been plagued with so many privacy controversies in its short history. Now I know why. The social media giant thinks that “more visibility makes us better people. Some claim, for example, that because of Facebook, young people today have a harder time cheating on their boyfriends or girlfriends. They also say that more transparency should make for a more tolerant society in which people eventually accept that everybody sometimes does bad or embarrassing things.”
Some at Facebook refer to this as Radical Transparency – a term initially applied to institutions, and now being adapted to individuals. “Our mission since day one has been to make society more open,” says one senior Facebook executive. And in an interview with Kirkpatrick, Facebook CEO Mark Zuckerberg said that “The days of you having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly,” and that “Having two identities for yourself is an example of a lack of integrity.”
It boggles the mind that someone as thoughtful as Mr. Zuckerberg would argue this. Of course we should have more than one online identity, just as we sensibly have multiple offline identities. My family knows one version of me, and even then I share information with my wife that I don’t share with our children. Friends know more about me than my business acquaintances. Readers of my books and articles have another impression. And on and on. All of this is appropriate.
Transparency is the opportunity and even the obligation of institutions to communicate pertinent information to their stakeholders. Individuals have no such obligation. Personal information, be it biographical, biological, genealogical, historical, transactional, locational, relational, computational, vocational or reputational, is the stuff that makes up our modern identity. It must be managed responsibly. In fact, to have a secure life and self-determination, individuals have an obligation to themselves to protect their personal information. And institutions should be transparent about what they do with our personal information. Transparency and privacy go hand in hand. Advocating individual privacy and institutional transparency simultaneously is not illogical; it is common sense.
Information privacy is the foundation of a free society, and not just because of the harm that can come from blackmail, identity fraud, impersonation, cyber-stalkers, and nosy employers. Data can be assembled into profiles, matched with other information, and used to make automated judgments about and decisions affecting individuals: whether to hire them, whether to admit them entry, how to calculate their benefits or the terms of an offer, whether to corroborate a claim, or how to discriminate against and manipulate them. We should shudder to think what it would be like to live in a world where all is known and nothing is forgotten. Senator Joe McCarthy, or for that matter Hitler's SS, would have been licking their lips at such a prospect. And in this volatile world, can we assume that governments and people with power will always be benevolent?
Ultimately, in order to protect privacy, all of us will need to change our own online behavior. Impossible assignment, you say? Once again look to young people of the Net Generation to show the way. Recent research suggests that youth are already more diligent than older adults in protecting themselves. In a 2010 study, the Pew Internet Project found that people in their 20s exert more control over their digital reputations than older adults, more vigorously deleting unwanted posts and limiting information about themselves. This supports our findings and those of others who have argued that young people who have grown up digital are confronted with the privacy issue at an earlier age and come to grips with it earlier.
Whoa… Brilliant essay! I agree, and believe that it all comes down to who people really are! The whole transparency thing has made lying much, much harder.
Excellent work, a very balanced article.
I enjoyed this essay and feel that it captures a lot of my thoughts around being transparent while maintaining privacy. I completely agree that organizations should be transparent, and individuals should take steps to keep their personal information secure. The question around what and when to share isn't getting easier.
“Armed with new tools to find information, a variety of stakeholders now scrutinize the firm like never before, informing others and organizing collective responses”. – This type of collective response to an organization is, in my view, a great thing. Organizations are held accountable for their actions, and just like crowd-sourced content on the web – the good always bubbles to the top. I had a session facilitator once say to our class “in the end, I make all work related decisions based on my ability to sleep at night – I will not be dishonest”. Individuals within an organization can and should be transparent for the overall health of the org, and to ensure that good work is recognized (and adopted) by others.
A really useful article. Please don't delete it as I shall probably link to it several times in the next few months as universities worldwide start new sessions. I'm not sure I share your generalised optimism about 20 year-olds being more cautious – some are really good at managing their online profiles but many are not, and that is just like the rest of us. Reminders needed at regular intervals for all of us.
When social media sites encourage individuals to share their locations and cellular providers offer AR services, your identity and privacy become more at risk. Companies need to do a better job of educating customers about how much information and privacy they put at risk in utilizing these services.
Understanding your social graph and who owns the information is key. You need a private Facebook account, a public LinkedIn account and a non-location-enabled Twitter account. We all have different online personas and need to keep personal and business separate. Transparency for corporations is important, but privacy for the individual is critical.
I completely agree with many of your points of view, Mr. Tapscott. Especially the fact that having many concurrent identities does not imply an integrity issue on my part. We often make judgments about which information is appropriate for which circles within our set of relationships.
I also agree that most information today is not actually private, but is “practically obscure”, as you say, and further that the amount of obscurity is decreasing rapidly. I don't think that there are any good examples of systems deployed at large scale (society-wide) that do a good job of promoting our ability to deny or allow access to information about ourselves. This is chiefly because the amount of personal information is vast, information about ourselves has never been so searchable and available, and we don't yet fully understand the implications of releasing it on such a large scale – and thus largely we forge forward, compromising our own privacy, intending to find out what to release and what not to release through experimentation (ouch).
I don't believe that models that let you configure once who has what kind of access to your personal information (and then forevermore forget about the permissions you've just granted) will prove to be very effective in protecting people. Though I hate to admit impediments to personal freedom, one need only look to the mortgage crisis to understand that sometimes people need to be protected from themselves, and that yes, sometimes too much margin on too much debt should not be allowed, even if you and the bank both agree that it's ok. Government regulation (of lending rules in the banking industry, for example) is necessary.
Note also that corporations are at liberty to change their usage policies regarding personal information according to their own whims, à la Facebook. Even if the resulting backlash eventually causes the corporation to relent and comply with the public's wishes, it will be little consolation for the compromised parties.
While I agree with the clear statement that “institutions should be transparent about what they do with our personal information”, this information may be published deep within long, boring, legal documents or agreements that we agree to but effectively never read. And I don't think it's reasonable to ask people to read and understand these documents, any more than it is reasonable to expect them to know the ins and outs of an emergency landing in case of flight trouble. We're responsible for the oxygen mask and the seat belt – the pilot is responsible for everything else. It would be ludicrous to print the pilot's manual on the back of my boarding pass or ticket, in an attempt to absolve the pilot of her responsibilities for learning the proper emergency procedures. And thus, pilots are licensed and insured.
To choose another benign example, how many PC-owning downloaders of the Safari browser realize that Apple's EULA “allows you to install and use one copy of the Apple Software on a single *Apple-labeled* computer at a time” ? Not many, I'd guess. Imagine your credit card usage agreement, which they update every few months or so – do we read that? Now imagine a EULA for your personal health information. Do you think you could even interpret one?
There exists a reasonable expectation out there that someone much more studied in the implications of releasing sensitive private information will arm us with the proper tools to allow us to make the right decisions, and to prevent abuse by those that don't have our best interests at heart. Don't get me wrong – I believe in a free society in which we naturally manage (and are expected to manage) many bits of personal information, but we can't be expected to personally manage access to *all* of it – especially the very sensitive information. Without really knowing the details, we expect that the government in Canada will not release our personal health information to private insurance companies. We don't know this because we've all read the government's usage policy on our personal information – we simply expect that this information will be protected because the implications of releasing it inappropriately are potentially disastrous.
This is to say nothing of the fact that a mental model of all of the “accessibility” information we are supposed to manage in this brave new world would probably not fit in our brains. That's not how we manage our most private information now, nor how we did long before it was infinitely searchable. In the past, someone might ask us for a fine-grained piece of personal information in a real-time fashion (“Can I ask you a personal question?” or “Can I have your social insurance number?”), and we'd use everything at our disposal – including what smarter people than us have indicated about releasing such information, as well as our up-to-the-minute judgment of how trustworthy the requester is – to decide whether or not to give it to them.
I'm a foursquare user – my phone often prompts me to ask if it can use my location when it thinks it needs it. I almost always answer “no”, reasoning to myself that I much prefer to wander under the safe cover of anonymity. However, when I eventually get hungry from walking around and start to look for a good place to eat, I'm more likely to answer “yes”. I choose to relinquish my privacy *in that moment only* in favour of the convenience that this action affords me. I believe this is the right model for access to personal information, but we also need an expert to help us categorize the risks associated with relinquishing other information so that we can make informed decisions. We also need an (open source) tool that allows us to manage our information completely independently of how it will be used, and a gatekeeper tool that notifies us in some way whenever this information is being requested. Finally, we need trust authorities and key chains to validate the identities of the requesters. Only then will we be properly armed to make measured decisions about access to very sensitive information.
Jeremy Chan – @summersoulTO
I really think it boils down to: do people really need to know the details? In the case of a large corporation, transparency is important because the corporation's actions affect a vast number of people — stockholders, customers, etc. Information about how these corporations work, how they spend investors' money, and so on needs to be freely available to anyone who wants to see it, because the corporation affects many, many people. Private individuals, on the other hand? Not so much. What is the value in complete strangers knowing every intimate detail about me when it has no effect on them? Privacy for individuals needs to be more closely protected, IMO.