Stowe Boyd, Our Time Is Not Our Own: Time Is The New Space (2011)
Cosmic time is the same for everyone, but human time differs with each person. Time flows in the same way for all human beings; every human being flows through time in a different way.
Joyce Carol Oates, Marya
Sean Carroll, Ten Things Everyone Should Know About Time
“Time” is the most used noun in the English language, yet it remains a mystery. We’ve just completed an amazingly intense and rewarding multidisciplinary conference on the nature of time, and my brain is swimming with ideas and new questions. Rather than trying a summary (the talks will be online soon), here’s my stab at a top ten list partly inspired by our discussions: the things everyone should know about time.
Anthony Giddens (via gravity7)
The brain is a remarkably capable chronometer for most purposes. It can track seconds, minutes, days, and weeks, set off alarms in the morning, at bedtime, on birthdays and anniversaries. Timing is so essential to our survival that it may be the most finely tuned of our senses. In lab tests, people can distinguish between sounds as little as five milliseconds apart, and our involuntary timing is even quicker. If you’re hiking through a jungle and a tiger growls in the underbrush, your brain will instantly home in on the sound by comparing when it reached each of your ears, and triangulating between the three points. The difference can be as little as nine-millionths of a second.
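To get a feel for the resolution described here, the standard plane-wave model of interaural time difference can be sketched in a few lines. This is my own illustration, not from the essay; the speed of sound (~343 m/s) and ear separation (~0.20 m) are assumed round figures:

```python
import math

# Assumed figures (not from the article): speed of sound and ear separation.
SPEED_OF_SOUND = 343.0   # m/s
EAR_SEPARATION = 0.20    # m

def azimuth_from_itd(itd_seconds):
    """Angle of a sound source from straight ahead, in degrees, using the
    simple plane-wave model sin(theta) = c * dt / d."""
    ratio = SPEED_OF_SOUND * itd_seconds / EAR_SEPARATION
    ratio = max(-1.0, min(1.0, ratio))  # clamp against rounding overflow
    return math.degrees(math.asin(ratio))

# A nine-microsecond arrival difference corresponds to a shift of
# less than one degree from straight ahead:
print(round(azimuth_from_itd(9e-6), 2))  # → 0.88
```

Under these toy assumptions, the nine-microsecond figure amounts to resolving a source direction to within about a degree.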
Yet “brain time,” as Eagleman calls it, is intrinsically subjective. “Try this exercise,” he suggests in a recent essay. “Put this book down and go look in a mirror. Now move your eyes back and forth, so that you’re looking at your left eye, then at your right eye, then at your left eye again. When your eyes shift from one position to the other, they take time to move and land on the other location. But here’s the kicker: you never see your eyes move.” There’s no evidence of any gaps in your perception—no darkened stretches like bits of blank film—yet much of what you see has been edited out. Your brain has taken a complicated scene of eyes darting back and forth and recut it as a simple one: your eyes stare straight ahead. Where did the missing moments go?
The question raises a fundamental issue of consciousness: how much of what we perceive exists outside of us and how much is a product of our minds? Time is a dimension like any other, fixed and defined down to its tiniest increments: millennia to microseconds, aeons to quartz oscillations. Yet the data rarely matches our reality. The rapid eye movements in the mirror, known as saccades, aren’t the only things that get edited out. The jittery camera shake of everyday vision is similarly smoothed over, and our memories are often radically revised. What else are we missing? When Eagleman was a boy, his favorite joke had a turtle walking into a sheriff’s office. “I’ve just been attacked by three snails!” he shouts. “Tell me what happened,” the sheriff replies. The turtle shakes his head: “I don’t know, it all happened so fast.”
Just how many clocks we contain still isn’t clear. The most recent neuroscience papers make the brain sound like a Victorian attic, full of odd, vaguely labelled objects ticking away in every corner. The circadian clock, which tracks the cycle of day and night, lurks in the suprachiasmatic nucleus, in the hypothalamus. The cerebellum, which governs muscle movements, may control timing on the order of a few seconds or minutes. The basal ganglia and various parts of the cortex have all been nominated as timekeepers, though there’s some disagreement on the details. The standard model, proposed by the late Columbia psychologist John Gibbon in the nineteen-seventies, holds that the brain has “pacemaker” neurons that release steady pulses of neurotransmitters. More recently, at Duke, the neuroscientist Warren Meck has suggested that timing is governed by groups of neurons that oscillate at different frequencies. At U.C.L.A., Dean Buonomano believes that areas throughout the brain function as clocks, their tissue ticking with neural networks that change in predictable patterns. “Imagine a skyscraper at night,” he told me. “Some people on the top floor work till midnight, while some on the lower floors may go to bed early. If you studied the patterns long enough, you could tell the time just by looking at which lights are on.”
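The oscillator idea has a neat mathematical core: no single oscillator tells the time, but the joint pattern of several running at different (coprime) rates does, exactly as in Buonomano's skyscraper. Here is a minimal sketch under hypothetical toy periods; it is an illustration of the principle, not either researcher's published model:

```python
from math import lcm

# Hypothetical toy periods, pairwise coprime, in arbitrary ticks.
PERIODS = [2, 3, 5, 7]

CYCLE = lcm(*PERIODS)  # 210 ticks before the joint pattern repeats

def phases(t):
    """The phase of each oscillator at tick t -- which 'lights' are where."""
    return tuple(t % p for p in PERIODS)

def decode(pattern):
    """Recover elapsed time from the joint phase pattern."""
    return next(t for t in range(CYCLE) if phases(t) == pattern)

# By the Chinese remainder theorem, the joint pattern is unique within one
# full cycle -- the ensemble works as a clock even though no member does:
assert len({phases(t) for t in range(CYCLE)}) == CYCLE
print(decode(phases(123)))  # → 123
```

The design point is that timekeeping here is a distributed property of the population, which is exactly the contrast the article draws with a single dedicated "pacemaker."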
Time isn’t like the other senses, Eagleman says. Sight, smell, touch, taste, and hearing are relatively easy to isolate in the brain. They have discrete functions that rarely overlap: it’s hard to describe the taste of a sound, the color of a smell, or the scent of a feeling. (Unless, of course, you have synesthesia—another of Eagleman’s obsessions.) But a sense of time is threaded through everything we perceive. It’s there in the length of a song, the persistence of a scent, the flash of a light bulb. “There’s always an impulse toward phrenology in neuroscience—toward saying, ‘Here is the spot where it’s happening,’ ” Eagleman told me. “But the interesting thing about time is that there is no spot. It’s a distributed property. It’s metasensory; it rides on top of all the others.”[…]
“Time is this rubbery thing,” Eagleman said. “It stretches out when you really turn your brain resources on, and when you say, ‘Oh, I got this, everything is as expected,’ it shrinks up.” The best example of this is the so-called oddball effect—an optical illusion that Eagleman had shown me in his lab. It consisted of a series of simple images flashing on a computer screen. Most of the time, the same picture was repeated again and again: a plain brown shoe. But every so often a flower would appear instead. To my mind, the change was a matter of timing as well as of content: the flower would stay onscreen much longer than the shoe. But Eagleman insisted that all the pictures appeared for the same length of time. The only difference was the degree of attention that I paid to them. The shoe, by its third or fourth appearance, barely made an impression. The flower, more rare, lingered and blossomed, like those childhood summers.[…]
"We’re stuck in time like fish in water,” Eagleman said, oblivious of its currents until a bubble floats by. It’s usually best that way. He had spent the past ten years peering at the world through such gaps in our perception, he said. “But sometimes you get so far down deep into reality that you want to pull back. Sometimes, in a great while, I’ll think, What if I find out that this is all an illusion?” He felt this most keenly with his schizophrenic subjects, who tended to do poorly on timing tests. The voices in their heads, he suspected, were no different from anyone else’s internal monologues; their brains just processed them a little out of sequence, so that the thoughts seemed to belong to someone else. “All it takes is this tiny tweak in the brain, this tiny change in perception,” he said, “and what you see as real isn’t real to anyone else.”
I am looking forward to reading David Eagleman’s Brain Time, which is available online at Edge.
Anil Dash does a great job of framing the transience of Twitter, characterizing it as a ‘lossy’ system, where we don’t necessarily see every item and finding old tweets can be difficult, if not impossible:
Anil Dash, If You Didn’t Blog It, It Didn’t Happen
THE PERILS OF A LOW-STRESS ENVIRONMENT
Now, Twitter and other stream-based flows of information provide an important role in the ecosystem. Perhaps the most important psychological innovation of Twitter is that it assumes you won’t see every message that comes along. There’s no count of unread items, and very little social cost to telling a friend that you missed their tweet. That convenience and social accommodation is incredibly valuable and an important contribution to the web.
However, by creating a lossy environment where individual tweets are disposable, there’s also an environment where few will build the infrastructure to support broader, more meaningful conversations that could be catalyzed by a tweet. In many ways, this means the best tweets for advancing an idea are those that contain links to more permanent media.
So, if most tweets are too ephemeral to reach their full potential as ideas, what do we do about it? Well, obviously, one big step would be to simply make sure to blog any idea that’s worth preserving. It’s perfectly fine to tweet about trivialities — I do it all the time! But if you’re tweeting about your work, your passion, or something meaningful to you, you owe it to your ideas to actually preserve them somewhere more persistent.
And, of course, I should make a pitch that this is part of the reason I am so enamored of the work the ThinkUp community is doing. A free, thriving, powerful, relatively accessible app that archives Twitter and Facebook updates with a mind towards incorporating them into more persistent and meaningful media is an essential part of the ecosystem. This is especially true as political, social and artistic leaders start to rely on these ephemeral media, without realizing the cultural costs to those choices.
Given enough time, and without substantial changes to the way the big social networks work, if you didn’t blog it, it didn’t happen. In fact, I first wrote about this idea a bit on Twitter a few years ago. See if you can find it.
I agree with Anil: anyone who wants to hold onto an idea, and build on it, should put it in a blog post. Sure, tweet out a link to the post and get it into the stream, but anchor it to something fixed, accessible, and easily addressable.
The utility of streaming media — like Twitter — isn’t necessarily pegged to the lossiness of the system, though. That’s just an artifact of the technology being used, like pixelation on low-res displays, or the fact that new paper money can give you a paper cut: it’s not a function of the meaning of money or computers.
Twitter doesn’t have to be a black hole for ideas. Better search tools or better clients could hold onto tweets we read, retweeted, liked, shared, or tagged. It’s the tools that are limited, not the stream medium.
And having better tools wouldn’t necessarily mean that Twitter would lose its streaming character. One of the pivotal characteristics of the streaming medium is not being an inbox: tweets fall off the end on their own, without me having to file them or delete them. But that doesn’t mean they fall into nothingness.
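The distinction between a stream and an inbox can be made concrete. Here is a toy sketch of the kind of client imagined above — tweets fall off the end on their own, but anything you like, retweet, or tag is copied into a persistent archive first. All names are hypothetical; this is not any real Twitter client's API:

```python
from collections import deque

class StreamWithArchive:
    """Toy model: a fixed-length stream plus a persistent archive.
    Old tweets drop off automatically; marked ones are kept."""
    def __init__(self, capacity=5):
        self.stream = deque(maxlen=capacity)  # items fall off the end on their own
        self.archive = []                     # the fixed, addressable record

    def post(self, tweet):
        self.stream.append(tweet)

    def mark(self, tweet):
        """Like / retweet / tag: copy into the archive before it scrolls away."""
        if tweet in self.stream and tweet not in self.archive:
            self.archive.append(tweet)

client = StreamWithArchive(capacity=5)
for i in range(10):
    client.post(f"tweet {i}")
    if i == 6:
        client.mark("tweet 6")

print(list(client.stream))  # only the five most recent remain
print(client.archive)       # but the marked tweet persists
```

The stream keeps its streaming character — nothing needs to be filed or deleted — yet the things we cared about don’t fall into nothingness.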
Streams could be made richer, and I would like to imagine such advances arriving in the near term.
Lurking behind Anil’s practicality are the more philosophical issues of time and transience. Yes, we don’t need to retain every tweet ever read or written. We can accept the fast and furious impermanence of most tweets, and the up-tempo pace of the Twitter bloodstream. But we also want to operate at a slower pace, dealing with deeper and abiding interests, ideas, and connections. We need to be able to shift tempo without missing a beat.
In an amazing inversion of the logic of ‘hot news’ — where the major news outlets want to be able to claim a blackout period during which no one else can report on news they have uncovered — the major news outlets stole the Rolling Stone article about McChrystal and published the PDF in its entirety, breaking copyright and ‘hot news’ principles. The perps included Time and Politico.
David Carr undoes them:
Media organizations can file all the briefs they want about protecting their work product from free-riders and insurgent hordes of digital pilot fish, but once they break their own rules and start feeding on one another, the game is sort of over.
Yes, the game is sort of over.
The Google/Twitter brief in the Theflyonthewall case, where the ‘hot news’ concept is being fought, includes this observation:
The modern ubiquity of multiple news platforms renders ‘hot news’ misappropriation an anachronism, aimed at muzzling all but the most powerful media companies. In a world of citizen journalists and commentators, online news organizations, and broadcasters who compete 24 hours a day, news can no longer be contained for any meaningful amount of time.
It seems that the actions of Time and Politico lend support to this, although they went way too far by posting the article in its entirety.