"It'll Go on Your Permanent Record"
Law Professor Jeff Rosen on the Privacy Risks of Social Networking
"Permanent record" was just a concept back then, in our school days. But today, in the era of social media such as Facebook, LinkedIn and Twitter, it is a reality. Today's seemingly innocent status update, tweet or photograph could be the item that costs someone a job opportunity or destroys a reputation tomorrow.
Jeff Rosen, professor of law at George Washington University, is concerned about privacy - or the lack thereof - in cyberspace, and he has distinct ideas about how individuals and organizations can better protect their privacy.
In an interview, Rosen discusses:
- What individuals and organizations should - and should not - be doing online today;
- The biggest threats to privacy;
- How to monitor one's online reputation - and avoid a negative notation in the "permanent record."
Rosen is a professor of law at George Washington University, the legal affairs editor of The New Republic and a nonresident senior fellow at the Brookings Institution. His most recent book is The Supreme Court: The Personalities and Rivalries that Defined America, the bestselling companion book to the PBS series on the Supreme Court. He is also the author of The Most Democratic Branch, The Naked Crowd and The Unwanted Gaze, which The New York Times called "the definitive text on privacy perils in the digital age."
Rosen is a graduate of Harvard College, summa cum laude; Oxford University, where he was a Marshall Scholar; and Yale Law School. His essays and commentaries have appeared in the New York Times Magazine, The Atlantic Monthly, on National Public Radio, and in The New Yorker, where he has been a staff writer. The Chicago Tribune named him one of the 10 best magazine journalists in America and the Los Angeles Times called him "the nation's most widely read and influential legal commentator." He lives in Washington with his wife Christine Rosen and two sons.
TOM FIELD: To get us started, Jeff, why don't you tell us a little bit about yourself and your work please.
JEFF ROSEN: I'm a law professor who has been writing about privacy for more than a decade. Right now my interest is in the way the web never forgets - the fact that everything about us, every tweet or blog or post, lasts forever in the digital cloud.
But I started writing about privacy more than a decade ago, right before the internet exploded. It was in the middle of the Bill Clinton/Monica Lewinsky impeachment scandal, and I became interested in why it was possible to recover so much information about Monica Lewinsky - her emails, her address, her private diaries - and how the law had changed in a way that made that possible.
That was around the time that the internet was taking off, and companies were beginning to be concerned about the way people might be tracked and judged out of context - how a single snippet of information about them, their book preferences or the last thing they had shopped for or the cookies they had received online, might come to define them and embarrass them.
What is so striking is how a problem that seemed nascent a decade ago is now affecting millions and millions of people. Back then, at the dawn of the new millennium, it was just a theoretical possibility that your shopping preferences or emails might be exposed to the world. But as the web has expanded, that has become a reality, and in the age of Facebook and social networking, millions of citizens are living with the tangible consequences - losing jobs or promotions because of things they posted about themselves, embarrassing Facebook pictures, or things that other people have posted about them.
So for the first time in a decade my sense is that this is a problem that individual citizens are experiencing and understanding at a very tangible and widespread level, and that is why I am so interested to be writing about privacy right now.
FIELD: Well, Jeff, you make a compelling case here; social media has ratcheted up everything. So in terms of how you look at privacy in cyberspace now, in the era of Facebook and Twitter and LinkedIn, what do you find to be your biggest concerns? And I mean in terms of consumers as well as business organizations and even government agencies.
ROSEN: I think in terms of consumers and citizens, people's biggest concern is the ability to control their reputations, to be able to control what is known about them. Privacy as a form of control has always been at the top of people's lists. How much should we be able to disclose? How can we opt in and opt out? Can we find ways of developing rules of consent? But the truth is, in the age of social networking technologies, it is impossible to control everything that is known about us.
We just can't stop people from gossiping about us online and saying all sorts of things, and the fact that it is impossible to get posts and photos back once they are out there is a reality. And as new technologies, including facial recognition, make it possible to snap a picture of a stranger on the street, plug it into a database and identify that person by name by matching it against other photos, even our expectations of anonymity in public are being transformed. So I think this urge to control your reputation is a powerful concern that is at the top of people's lists when they think about privacy, but it is an extremely difficult form of control to guarantee through law or technology.
In terms of companies and government, companies have to be concerned about the dangers of upsetting consumers when they use information in ways that people don't expect. Week after week we see some example of a trusted company either accidentally sharing data that they shouldn't or saying something impolitic about how privacy is over and people shouldn't expect it anymore and provoking some kind of backlash that harms their brand and reduces their trust. So I think as companies realize how much consumers actually care about control of their reputation, they are being more careful about the misuse of data.
And government, of course, is hungry for data for security purposes. Data mining programs designed to predict terrorism and other bad acts are widespread, and the increasing use of security technologies at airports, such as the backscatter scanners now being implemented, and the controversies they have produced are striking. This need to balance privacy and security - to engage in thoughtful, predictive data mining while not identifying innocent people by name - is a great challenge that government is grappling with right now.
FIELD: Jeff, a two-part question for you, and the first part is: What do you find that individuals and organizations are not doing today to ensure privacy?
ROSEN: Well, it is certainly the case that individuals make bad choices at times. We disclose things that we shouldn't and then we come to regret it.
I like to give the example of Stacy Snyder, the 25-year-old teacher in training who, four years ago, was hoping to earn a teaching degree and was fired and not allowed to graduate from her teachers' college because she had posted a picture on her MySpace page of herself wearing a pirate hat and drinking from a plastic cup, with the caption "drunken pirate." Her supervisors accused her of promoting underage drinking, and as a result she was dismissed and her teaching career was derailed. She sued, and a federal judge rejected her claim, saying that because she was a public employee and her speech didn't relate to matters of public concern, it wasn't protected by the First Amendment.
So that is a troubling example of how a relatively small error - posting something you think will be seen only by friends but that ends up being seen by employers - may have tangible consequences for a career.
There was a recent Microsoft study that suggested that 75 percent of employers have looked at social media technologies when evaluating job applicants, and 70 percent of them have rejected applicants or failed to promote them because of stuff they found online.
So individuals definitely have to be concerned about the consequences of disclosing the wrong thing in the wrong context, and they also need to think about ways that laws, technologies and changes in social norms can help them minimize the effects of their mistakes. It doesn't seem fair that an entire career should be derailed based on one indiscreet photo.
As for companies, they are also experiencing on a daily basis the dangers of sharing information in the wrong context in ways that come to embarrass people. Think of the example from just a few years ago of the AOL searches that were supposedly anonymized and were released to researchers; people were then able to identify the searchers by name, and those searchers, understandably, were extremely distressed to find their musings about embarrassing diseases and books exposed to the world.
So there is that simple danger of just making bad decisions about disclosing information, and more concretely there is the fact that businesses are more frequently being asked by consumers to create technologies of control. There was a great uproar over Facebook's changes to its privacy policies, suggesting that some social networking CEOs were wrong when they thought that Facebook users didn't care at all about privacy.
The fact that people were upset when they thought their data was being stored forever without their consent, and that they felt powerless to control their privacy settings in a more granular way, indicates that businesses really have to worry about responding to this growing demand for consumer control over online reputations.
FIELD: Jeff, I have got to follow up on one of the points you made. You talked about business organizations that are now using social networking activity as a consideration when they evaluate either external or internal applicants. Certainly I have heard about this from HR executives. How do you feel about this practice of going online to look at a person's activities, whether through LinkedIn or Twitter or Facebook, and making it a significant criterion?
ROSEN: Well, it is definitely widespread and in addition to those Microsoft numbers about the U.S., it is even widespread in Europe. Microsoft found that France and Germany are doing it as well; European HR managers are also looking online. How do I feel about it? As someone who is concerned about privacy as a citizen, I guess I can see arguments on both sides - I'm conflicted.
On the one hand, you understand that an HR person would feel compelled to find out as much information about people as possible, and why should they be denied access to public information that others outside the organization have access to as a matter of course?
On the other hand, it seems that there is a great danger of judging people out of context when you force applicants to open their Facebook pages during job interviews, which according to ReputationDefender (a new firm set up to deal with this problem) is an increasingly common practice. In job interviews, people ordinarily don't ask certain questions about an applicant's private life, so it seems like a breach of boundaries to look at a Facebook page in real time.
Now I gather that some employment lawyers are trying to negotiate this complicated challenge, and that's why they are recommending to HR managers that they not do Facebook and social networking searches before they decide to give someone a job; after having made the initial decision, they suggest, it is okay to go online to confirm that there are no big skeletons in the closet. That way they can't be accused of having used a search to make the initial job decision itself. That is one solution to the problem.
Another, more dramatic one, proposed by people like Paul Ohm, a law professor at the University of Colorado, is literally to forbid employers from engaging in social networking searches when making job decisions, at least in certain kinds of professions. There might be exceptions for especially sensitive jobs, but generally you could forbid these kinds of searches, and indeed the German Privacy Commissioner has just proposed a similar prohibition in Germany, where there would be dramatic restrictions on the kinds of searches that HR departments could do.
This might face an uphill battle in America just because we tend to be suspicious of restricting the private sector's ability to have access to public information. I am not sure that the laws will actually pass here as opposed to in Germany, but it is interesting that they are on the table and suggest how widespread the challenge of this new technology really is.
FIELD: Now the second part of the question we were just discussing: We talked about what individuals and organizations are not doing to ensure privacy; the flipside is, what are they doing, or what should they be doing, to ensure their privacy?
ROSEN: The most effective technological solution to the problem we are discussing - this inability to escape your past and have a new beginning - has to do with expiration dates for data. We need a way of deleting all of this information that is out there; it shouldn't have to linger forever.
So, for example, as Viktor Mayer-Schönberger argues in his fascinating new book Delete, it is possible, technologically, for Facebook or for any social networking site to create an expiration date for data - essentially, when you store a photo or a chat or a blog post, to ask: Do you want this to last forever, or would you like it to disappear after three days or three months?
Indeed, there are small-scale apps that are attempting to do this. A very interesting new application called TigerText has been developed, which creates disappearing text messages; you can specify that you want a text to disappear after a set period of time.
This is an effective and needed solution to the problem: certain kinds of conversation - what Paul Ohm calls "water cooler chat" - used to be oral and would disappear as soon as people forgot about it, but are now being written down and stored for a long period of time, and expiration dates would help resurrect that older world.
So I think expiration dates for data are the best thing that companies could do, but how to implement this is tricky. Facebook is understandably reluctant to make it part of the platform, but you could certainly create apps that would allow individual users who want their texts to disappear to do so on a more widespread basis, and I hope that as this problem becomes more obvious to people, possibilities for selective deletion will become more and more widespread.
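To make the expiration-date idea concrete, here is a minimal sketch, in Python, of one way a storage layer could attach a time-to-live to each item and purge anything past its date. The names and structure (ExpiringPost, PostStore) are hypothetical illustrations, not the API of Facebook, TigerText or any real platform.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import Optional

# Toy example of "expiration dates for data": every stored item may carry
# a time-to-live chosen at save time; anything past its date gets purged.
# All names here are hypothetical and for illustration only.

@dataclass
class ExpiringPost:
    content: str
    created_at: datetime = field(default_factory=datetime.utcnow)
    ttl: Optional[timedelta] = None  # None means "keep forever"

    def is_expired(self, now: datetime) -> bool:
        return self.ttl is not None and now >= self.created_at + self.ttl


class PostStore:
    def __init__(self) -> None:
        self._posts: list[ExpiringPost] = []

    def add(self, content: str, ttl: Optional[timedelta] = None) -> None:
        # The "ask at save time" step: forever (None), three days, three months...
        self._posts.append(ExpiringPost(content, ttl=ttl))

    def purge_expired(self, now: Optional[datetime] = None) -> None:
        now = now or datetime.utcnow()
        self._posts = [p for p in self._posts if not p.is_expired(now)]

    def visible_posts(self, now: Optional[datetime] = None) -> list[str]:
        now = now or datetime.utcnow()
        return [p.content for p in self._posts if not p.is_expired(now)]


if __name__ == "__main__":
    store = PostStore()
    store.add("Permanent musings", ttl=None)                  # lasts forever
    store.add("Drunken-pirate photo", ttl=timedelta(days=3))  # gone in 3 days
    store.purge_expired()
    print(store.visible_posts())
```

A real deployment would, of course, be more involved - deleted data also lives on in backups, caches and search indexes - which is part of why implementation is tricky.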
FIELD: It strikes me, Jeff, as you described this that we have invented the permanent record that we all feared when we were younger and in school.
ROSEN: It's really true. We were told as kids, "You'd better behave, or that is going to go into your permanent file," and now it turns out we really do have permanent files from which we can't escape. It is interesting, too, that this new world of the internet, which is sometimes misleadingly called a global community, is actually less forgiving than real communities were long ago.
Many religious traditions have a sense that things go into your permanent file, God can read our thoughts, but if you say you are sorry, if you atone, if you apologize, then your file is cleared and you get a second chance. That ability to have a digital second chance is increasingly elusive today and we need to think about new ways of recreating it.
FIELD: Well, you have talked about this to some extent, but just to sort of articulate it, what do you see as sort of the biggest threats to privacy today?
ROSEN: I think the biggest threat to privacy has to do with the fact that the internet never forgets - the fact that everything we do and say, and everything that is done and said concerning us, is written down and goes into a more or less permanent digital file.
That technological reality seems like an even greater problem than breaches of consent rules or the voluntary and involuntary sharing of data; it transcends the urge for opt-in and opt-out rules. Because, really, the truth is that what people want is not just the power to consent or not consent to a particular sharing of data. What people want is the ability to control their entire reputation, which is in the end an unrealistic hope, but an understandable one.
So I think the permanence of the internet is the biggest threat to privacy today, and that is what we have to focus on when we think about the best solutions.
FIELD: Well, it's a message that we certainly have delivered to individuals and organizations, and yet we see it every day: people being tripped up by things they've put online or that others have put online about them. What advice would you offer to individuals or organizations to improve their efforts to ensure privacy? Where should they start?
ROSEN: Well, of course, the best advice that any individual or organization could get is behave well; act as if everything you do will be written down and may go into a permanent digital file. But although that is excellent advice, it is not sufficient.
All of us, no matter how hard we try to behave well are bound to get tripped up; that's part of being human, and inevitably we are going to make mistakes and say things we shouldn't and reveal things we shouldn't. The question is, how do we escape from these errors?
Now, it is helpful for individuals and organizations to familiarize themselves with technologies that can help with this problem. ReputationDefender, which I mentioned, offers an interesting and valuable service, which is to help people improve their Google profiles by essentially putting out a lot of good or neutral information about them in a way that crowds out the bad information and pushes it to the back of a Google search.
That is a very good and effective solution, but the problem is that it can be expensive and it is not within the means of all citizens, so it is not something that everyone can avail themselves of, but those kinds of fixes can be useful.
ReputationDefender also makes the excellent point, and delivers the good advice, that people should be aware of what is out there about them by monitoring searches on their own names and paying attention to what is said about them. They can ask for inaccurate information to be taken down, and in extreme cases Google and other search engines actually will take down clearly defamatory or inaccurate information.
But beyond this good advice - behave well, be aware of what is out there about you, be vigilant about removing false information - I think individuals just have to learn to live in a world in which we are never going to be able to exert perfect control over our reputations. In that sense, it is important not to judge ourselves out of context, not to allow a single error or an embarrassing series of posts to define us.
It is important to recognize that norms are changing. The kind of behavior that might have been disqualifying for a job 15 years ago, people are now learning to forgive as more and more individuals engage in this. Embarrassing pictures, drunken pictures from college, which caused people to lose jobs five years ago, may not be as disqualifying in the future.
But there will always be something. It is too optimistic to suggest that as society becomes more forgiving, we will just learn to forgive each other for all of our embarrassments. There is always going to be someone who has transgressed; individuals are going to be tripped up by this. There will be individual victims like Stacy Snyder, who run into trouble with norms they didn't expect, and that is why it is important to acknowledge the difficulty of the problem and to recognize that the solutions are legal, technological and a matter of social norms.
And then HR departments should be as humane as they can be. I certainly sympathize with the difficult situation that employers are in. They have to worry about not running afoul of discrimination laws, and they have to engage in due diligence to find out everything about applicants that they can. But not being draconian, not holding people accountable for more than they should be, and recognizing that we all make errors - that could help. In that sense, one thing HR departments can do is give individuals a chance to respond to negative information about them.
One scholar, Jonathan Zittrain at Harvard, has proposed what he calls reputation bankruptcy. He says that just as in the credit context, where every 10 years or so you can declare bankruptcy and wipe the slate clean, you should be able to wipe clean the negative information about you on reputation-ranking sites and on Google.
And even if you don't go that far and think people should be able to have a completely clean slate, if employers are concerned about negative information and are thinking about not hiring someone, it would be a good thing, and probably a fair thing, to give them a chance to respond - to tell individuals what is out there about them, let them give their side of the story and put the information in context. By listening carefully to the other side of the story, employers may be able to respond to this sort of information in a fair and humane way.
FIELD: Well, it's interesting that you cite Jonathan Zittrain in that context, because I am sitting here thinking that just as people should be encouraged to do a credit check on themselves once a year, they should do a reputation check as well.
ROSEN: They certainly should. It's a very smart thing to do, and more than once a year, because you know how fast this stuff can spiral out of control. Recognize that there may be multifaceted approaches to the problem. You can try to give your side of the story; Google has proposed a kind of "write a reply" for certain forms of postings, so that if you think you have been misrepresented you can give your account of what really happened. Recognize that law may be appropriate for really false and inflammatory information. But simply being constantly aware of the need to monitor what is said about us, and what we say about other people, is a wise thing to do.