The uncanny valley of robotics is grounded in the social cues of the visual. We are repulsed by the plastic skin, by the stilted movements, by the soulless eyes of our robotic counterparts. In contrast, personally targeted digital experiences present a likeness of our needs and wants, but the contours of our data are obscured by a black box of algorithms. Based on an unknown set of prior behaviors, these systems anticipate intentions we might not even know we have. Our data may not be animate or embodied like a robot, but it does act with agency. Data likeness can’t be seen or touched, but neither can our sense of ourselves. This makes the uncanny even more unnerving.
Uncanny personalization occurs when the data is both too close and not quite close enough to what we know about ourselves. This is rooted in Sigmund Freud’s famous treatment of the uncanny, which he traced to the feelings associated with encountering something strangely familiar. In Freud’s original writing, the uncanny is the unheimlich—literally translated as “unhomely,” and the opposite of heimlich, which is the familiar, comfortable feeling of being at home.
Technologies that are simultaneously familiar and alien evoke a sense of dread. In the same way, when our data doesn’t match our understanding of ourselves, the uncanny emerges. Freud explains that heimlich also means that which is kept hidden, as in the private sense of the home. So when something becomes unheimlich, what should be hidden is exposed. We might think of our browsing history this way. With digital traces assembled by personalization engines, our most intimate behaviors are uncovered and reflected back to us. We don’t think an ad is relevant to us, but it repulses us because we are worried that it could be.
A friend’s Facebook status update captures this idea well: “I am never quite sure if Facebook’s advertising algorithms know nothing about me, or more than I can admit to myself.”
Sara M. Watson, Data Doppelgängers and the Uncanny Valley of Personalization
Watson crosswires two ideas – roboticist Masahiro Mori's 'uncanny valley', coined to describe the creepiness of robots that come too close (but still not close enough) to mimicking human beings, and the shortfall of targeted ads based on skewed personalization – and comes up with uncanny personalization. So now, when you are bombarded with personalized ads for liver spot cream, cowboy boots, or fair trade chocolate – things you believe you have no interest in – you'll know what to call that sense of unease.
As Watson points out, this is related to what Kate Crawford calls ‘surveillant anxiety’:
The fear that all the data we are shedding every day is too revealing of our intimate selves but may also misrepresent us. Like a fluorescent light in a dark corridor, it can both show too much and not enough.
In an era trending toward publicy – when we are necessarily aware that our actions, words, and deeds are being traced online and in both public and private space by commercial and political organizations – the misperception of our wants and drives revealed by uncanny personalization is vertiginous, like the feeling of trying to step onto a stair that is slightly too low. We expect all stairs to be of a regular height, and one stair of odd dimensions makes us unsteady and untrusting of the intentions of the architect or the building's owners.
Expect more of these experiences as the new dogma of big data pervades all commercial activities, and new techniques of surveillance create an expanding tail of social exhaust to mine.