This is not a fantasy: corporations will use social data analysis like that outlined in the scenario below to make decisions on hires. They will infer who has migraines, drinks too much, screws around, or is a closet socialist.
Facebook’s Generation Y nightmare - Frédéric Filloux
"Tina Porter, 26. She’s what you need for the transpacific trade issues you just mentioned, Alan. Her dissertation speaks for itself, she even learned Korean…"
"But? …" asks the HR guy.
"She’s afflicted with acute migraine. It occurs at least a couple of times a month. She’s good at concealing it, but our data shows it could be a problem," Chen says.
"How the hell do you know that?"
"Well, she falls into this particular Health Cluster. In her Facebook babbling, she sometimes refers to a spike in her olfactory sensitivity – a known precursor to a migraine crisis. In addition, each time, for a period of several days, we see a slight drop in the number of words she uses in her posts, her vocabulary shrinks a bit, and her tweets, usually sharp, become less frequent and more nebulous. That’s an obvious pattern for people suffering from serious migraine. In addition, the Zeo Sleeping Manager website and the stress management site HeartMath – both now connected with Facebook – suggest she suffers from insomnia. In other words, Alan, we think you can’t take Ms Porter in the firm. Our Predictive Workforce Expenditure Model shows that she will cost you at least 15% more in lost productivity. Not to mention the patterns in her Facebook entries suggesting a 75% chance for her to become pregnant in the next 18 months, again according to our models."
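The signals Chen describes — a drop in word count and a shrinking vocabulary across someone's posts — are trivially computable from a post archive. A minimal sketch of that kind of analysis (the 25% threshold and the running-average "baseline" are my illustrative assumptions, not anything from a published model):

```python
def text_metrics(posts):
    """Total word count and distinct-vocabulary size for a batch of posts."""
    words = [w.lower() for p in posts for w in p.split()]
    return len(words), len(set(words))

def flag_anomalies(weekly_posts, drop=0.25):
    """Flag weeks whose word count AND vocabulary both fall more than
    `drop` below the running average of earlier weeks.
    Threshold is illustrative only."""
    flags, history = [], []
    for week, posts in enumerate(weekly_posts):
        wc, vocab = text_metrics(posts)
        if history:
            avg_wc = sum(h[0] for h in history) / len(history)
            avg_vocab = sum(h[1] for h in history) / len(history)
            if wc < (1 - drop) * avg_wc and vocab < (1 - drop) * avg_vocab:
                flags.append(week)
        history.append((wc, vocab))
    return flags
```

The point is not that this toy is accurate; it is that the raw material — public text, timestamped — makes this class of inference cheap to attempt at scale.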
"Not exactly a disease, from what I know. But OK, let’s move on."
Corporations will conceal their decision-making processes when they collide with privacy regulations and government restrictions on discrimination, most likely by outsourcing head-hunting to outside companies, who will outsource finding the best candidates to other outside companies, who will use programs rented from still more outside companies. They are doing this already: this is mainstream.
Is Instagram 3.0’s new maps feature a privacy wake-up call?
Yes, time to wake up.
The Privacy/Publicy dilemma is just that: a dilemma. There is no solution, per se.
If you want to live out loud, sharing photos of your comings and goings with anything other than a hand-picked coterie of friends — managed in some way so that they cannot play them forward to others — then you have to accept the possibility that someone might use that to stalk you.
This parallels living in the real world, by the way. When you go out on the town there is nothing to stop someone from following you around, noting where you go, what you drink, who you talk to, and taking pictures the whole time. That’s how private eyes make a living.
And that’s what this new release shows: Instagram is embracing the Privacy/Publicy dilemma, not avoiding it.
The Trust Paradox: Assurance structures designed to make interpersonal trust possible in uncertain environments undermine the need for trust in the first place.
Coye Cheshire from Online Trust, Trustworthiness, or Assurance? http://bit.ly/Pu3SAg
Megan Garber looks at some new research on privacy considerations in Facebook photo tagging by João Paulo Pesce and others, and boils it down for us:
On Facebook, Your Privacy Is Your Friends’ Privacy - Megan Garber via The Atlantic
The upshot? “Photo-tags can threaten privacy burdens in an indirect way,” the authors note, “by pinpointing the nodes in the social graphs on which privacy-attacking algorithms may extract information, thus enhancing their accuracy.” The social networks themselves, the researchers suggest, could work to solve that problem — by, say, creating a “hiding” feature that would allow users to disguise tags and prevent their unauthorized use without fully deleting them. Which would definitely be a nice thing to have. But the real solution, it seems, will be a social one, fit for the age of the social network. And it will start with users re-conceiving of themselves not simply as users sharing their own information, but as actors and influencers who are responsible for the network at large.
To turn this around, away from the conventional conservation-of-privacy ideal, we can say that publicy is an outcome of the social actions of social network participants, an emergent property. As individuals add social metadata incrementally, others — or algorithms — could explore that metadata and make potentially revealing inferences: who was with whom at a bar, which Facebook friends are actually close, and which connections are romantically involved.
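Inferences like "which friends are actually close" need nothing more exotic than counting co-occurrences in photo-tag metadata. A minimal sketch, with a toy data shape (place, list of tagged names) standing in for real tag metadata, and a made-up threshold for "close":

```python
from collections import Counter
from itertools import combinations

def co_presence(photos):
    """Count how often each pair of people is tagged in the same photo.
    `photos` is a list of (place, [tagged_names]) tuples — a toy
    stand-in for real photo-tag metadata."""
    pairs = Counter()
    for place, tagged in photos:
        for a, b in combinations(sorted(set(tagged)), 2):
            pairs[(a, b)] += 1
    return pairs

def likely_close(photos, min_count=2):
    """Pairs tagged together at least `min_count` times — a crude proxy
    for 'actually close' ties among nominal friends. Threshold is
    illustrative only."""
    return {pair for pair, n in co_presence(photos).items() if n >= min_count}
```

Each tag is volunteered by someone else; the pattern it contributes to is about you. That is the emergent property in miniature.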
The lack of moral outrage around the recent Supreme Court case — finding that anyone charged with a crime can be strip-searched, even when there is no evidence of contraband or concealed weapons — may be the result of the relaxation of our sense of privacy, in general.
Strip-Search Case Reflects Death of American Privacy - Noah Feldman via Bloomberg
There are two main drivers pushing privacy into the dustbin of history, and both are related to technology. One is the increasing effectiveness of government surveillance. Cameras follow you in most public places in London today, and New York is catching up. Diffusion scanners at the airport already show you essentially naked. The coalition Conservative-Liberal Democratic government in the U.K. is preparing to allow the state to collect, without a warrant or even suspicion, all information on calls or texts except the content. The government’s ability to do all of these things causes many of us to think, irrationally, that it is reasonable for it to do so.
The other driving force is our increasing willingness to sacrifice privacy for practical advantage. When you sign up for a free Gmail account, you agree to allow a computer program to read all your e-mails. This is hardly a secret: The ads that pop up on your browser often relate to the text of the e-mail you have sent or received. Google Inc. gambled that people would rationalize the loss of privacy by saying that no human was reading the text. Google was right. The list goes on: Global-positioning-system technology on your mobile phone helps you find out where you are — and enables anyone with access to your provider to do the same.
We all know that our sense of privacy has been changing. It seems that every time you ride the bus you hear one-half of the most intimate conversations imaginable — emanating from a total stranger with a phone to his ear. The justices cannot help but be affected by these trends. Privacy is defined constitutionally by “reasonable expectation” of what should be private. This may sound circular, but it is in fact inevitable. The concept of privacy is inherently flexible, and the less we value it, the less our judicial institutions will protect it for us.
And if we drop our ‘reasonable expectations’ then we may be less surprised when people have to drop their pants, as well.
We are moving to a coercively public society, where publicy is the norm, and privacy — or the demand for it — will be cast as the intimation of illegal, immoral, or unreasonable behavior. This is why prospective employers believe they are justified in asking candidates for their Facebook passwords, despite the illegality of ‘shoercion’ of this sort.
And the publicy bias is going to grow.
I’m doing a presentation next week in San Francisco, Data Is The New Oil: The Journey From Privacy To Publicy. I will be sharing the podium with Gerd Leonhard, Andreas Weigend, and Jamais Cascio.
I am likely to use some of the slides in the deck above, Big And Small Data.
I’ve heard we are going to have a packed house, so if you want to attend you should sign up right away.
Facebook May Take Legal Action Over Employer Password Requests - Matt Brian via thenextweb
So, employers or colleges that are demanding access to private information on Facebook (or other web sites) are entering a legal minefield, and we will have to wait for court cases to see how that shakes out. Morally, however, it is unambiguous shoercion: coercing individuals to show private information.
I will be speaking with Gerd Leonhard, Andreas Weigend, and (hopefully) Jamais Cascio at an event in San Francisco, 10 April 2012, sponsored by swissnex.
The theme is Data is the New Oil: The Journey from Privacy to Publicy. As every web page we visit is logged, and every comment and tweet analyzed for sentiment and intention, more data is being logged weekly than existed on earth a few years ago, prior to the rise of the social web. We will explore the connections between our connected world and the complexities and challenges of a data economy.
If you are interested in attending, please register quickly, since there are only 150 or so seats.
Imagine brands sampling skin samples from products in supermarkets to derive a DNA profile of likely customers #dnasxsw— Stowe Boyd (@stoweboyd) March 10, 2012
Scott Fahrenkrug proposes that we create a global co-op to pool our DNA, and to keep it out of the hands of corporations that seek to use it without our involvement and without any recompense. Several court cases have established that the DNA in our cells is not owned by those who produced it.
Privacy Management On Social Media Sites by Mary Madden via Pew
Social network users are becoming more active in pruning and managing their accounts. Women and younger users tend to unfriend more than others.
About two-thirds of internet users use social networking sites (SNS), and all the major metrics for profile management are up compared to 2009: 63% of them have deleted people from their “friends” lists, up from 56% in 2009; 44% have deleted comments made by others on their profile; and 37% have removed their names from photos that were tagged to identify them.
How to read the ‘unfriending’ trend?
One option: This rise in unfriending might not be about friendship, per se. People might be just throttling back the torrent of information that they are receiving in their social streams: stream overload.
But deleting comments and removing name tags from photos would represent very different, and possibly more privacy-oriented, motivations. However, if I delete a comment because someone writes something offensive, is that a privacy issue? Or is it more a matter of a cultivated image being publicly displayed? That would make it a publicy issue.
I think we will have to get a lot more fine-grained in determining causality in these cases, and more attuned to the publicy/Goffman angle: the presentation of self in everyday online life.