
Posts tagged with ‘attention’

Your Brain on Computers - Studying the Brain Off the Grid, Professors Find Clarity -- Matt Richtel →

A group of brain scientists takes a rafting trip in Utah, out of reach of cell phones, rambling about attention, multitasking, and the impacts of disconnecting on cognitive behavior. The piece is interspersed with snippets of commentary about research on attention, but no real case is put together, and the author, Matt Richtel, has cherry-picked the research, never mentioning work that counters the subtext. Richtel has written other pieces advancing the same case, skipping over contradictory evidence, as I reported in Do ‘Supertaskers’ Mean We Are Adapting To A Multiphrenic World?

Richtel seems to be the NY Times’ leading critic of being wired and the evils of a multiphrenic world: he is fighting the war on flow.

Demonizing Twitter: Fear Of The Future

Recently, David Carr wrote a piece called Why Twitter Will Endure, in which he expressed some surprise at his own conversion to Twitter advocate:

On Twitter, anyone may follow anyone, but there is very little expectation of reciprocity. By carefully curating the people you follow, Twitter becomes an always-on data stream from really bright people in their respective fields, whose tweets are often full of links to incredibly vital, timely information.

The most frequent objection to Twitter is a predictable one: “I don’t need to know someone is eating a donut right now.” But if that someone is a serious user of Twitter, she or he might actually be eating the curmudgeon’s lunch, racing ahead with a clear, up-to-the-second picture of an increasingly connected, busy world. The service has obvious utility for a journalist, but no matter what business you are in, imagine knowing what the thought leaders in your industry were reading and considering. And beyond following specific individuals, Twitter hash tags allow you to go deep into interests and obsessions: #rollerderby, #physics, #puppets and #Avatar, to name just a few of many thousands.

The act of publishing on Twitter is so friction-free — a few keystrokes and hit send — that you can forget that others are out there listening. I was on a Virgin America cross-country flight, and used its wireless connection to tweet about the fact that the guy next to me seemed to be the leader of a cult involving Axe body spray. A half-hour later, a steward approached me and said he wondered if I would be more comfortable with a seat in the bulkhead. (He turned out to be a great guy, but I was doing a story involving another part of the company, so I had to decline the offer. @VirginAmerica, its corporate Twitter account, sent me a message afterward saying perhaps it should develop a screening process for Axe. It was creepy and comforting all at once.)

Like many newbies on Twitter, I vastly overestimated the importance of broadcasting on Twitter and after a while, I realized that I was not Moses and neither Twitter nor its users were wondering what I thought. Nearly a year in, I’ve come to understand that the real value of the service is listening to a wired collective voice.

Not that long ago, I was at a conference at Yale and looked at the sea of open laptops in the seats in front of me. So why wasn’t my laptop open? Because I follow people on Twitter who serve as my Web-crawling proxies, each of them tweeting links that I could examine and read on a Blackberry. Regardless of where I am, I surf far less than I used to.

At first, Twitter can be overwhelming, but think of it as a river of data rushing past that I dip a cup into every once in a while. Much of what I need to know is in that cup: if it looks like Apple is going to demo its new tablet, or Amazon sold more Kindles than actual books at Christmas, or the final vote in the Senate gets locked in on health care, I almost always learn about it first on Twitter.

Carr’s piece stirred some yowling in the commentariat, in particular a post from George Packer that casts Twitter as the worst part of a world moving too fast:

- George Packer, Stop The World

The truth is, I feel like yelling Stop quite a bit these days. Every time I hear about Twitter I want to yell Stop. The notion of sending and getting brief updates to and from dozens or thousands of people every few minutes is an image from information hell. I’m told that Twitter is a river into which I can dip my cup whenever I want. But that supposes we’re all kneeling on the banks. In fact, if you’re at all like me, you’re trying to keep your footing out in midstream, with the water level always dangerously close to your nostrils. Twitter sounds less like sipping than drowning.

The most frightening picture of the future that I’ve read thus far in the new decade has nothing to do with terrorism or banking or the world’s water reserves—it’s an article by David Carr, the Times’s media critic, published on the decade’s first day, called “Why Twitter Will Endure.” “I’m in narrative on more things in a given moment than I ever thought possible,” Carr wrote. And: “Twitter becomes an always-on data stream from really bright people.” And: “The real value of the service is listening to a wired collective voice … the throbbing networked intelligence.” And: “On Twitter, you are your avatar and your avatar is you.” And finally: “There is always something more interesting on Twitter than whatever you happen to be working on.”

Nick Bilton responded to this post, trying to counter Packer’s points one by one, in The Twitter Train Has Left The Station:

[…] Mr. Packer’s misgivings seem to be based entirely on what he has heard about the service — he’s so afraid of it that he won’t even try it. (I wonder how Mr. Packer would feel if, say, a restaurant critic panned a restaurant based solely on hearsay about the establishment.)

“Twitter is crack for media addicts,” he writes. “It scares me, not because I’m morally superior to it, but because I don’t think I could handle it.”

Call me a digital crack dealer, but here’s why Twitter is a vital part of the information economy — and why Mr. Packer and other doubters ought to at least give it a Tweet.

Hundreds of thousands of people now rely on Twitter every day for their business. Food trucks and restaurants around the world tell patrons about daily food specials. Corporations use the service to handle customer service issues. Starbucks, Dell, Ford, JetBlue and many more companies use Twitter to offer discounts and coupons to their customers. Public relations firms, ad agencies, schools, the State Department — even President Obama — now use Twitter and other social networks to share information.

There are communication and scholarly uses. Right now, an astronaut, floating 250 miles above the Earth, is using Twitter and conversing with people all over the globe, answering both mundane and scientific questions about living on a space station.

Most importantly, Twitter is transforming the nature of news, the industry from which Mr. Packer reaps his paycheck. The news media are going through their most robust transformation since the dawn of the printing press, in large part due to the Internet and services like Twitter. After this metamorphosis takes place, everyone will benefit from the information moving swiftly around the globe.

You can see that change beginning to take place. During the protests in Iran last year, ordinary Iranians shared information through Twitter about the government atrocities taking place. That supplemented the reporting by professional journalists, who faced restrictions on their movements and coverage. More recently, after the earthquake in Haiti, Twitter helped spread information about donation efforts, connected people to their loved ones, and of course, spread news from inside the country — news that was reprinted in this publication.

Bilton’s reasonableness completely misses the point, because Packer isn’t really concerned with Twitter’s relative merits, or even its potential utility to him as a journalist: he is lamenting the decline of a passing intellectual world in which criticism and long-form writing were the zenith, a pinnacle to which he aspired and which he attained. In his rebuttal to Bilton’s piece, Packer makes this clear.

- George Packer, Neither Luddite Nor Biltonite

It’s true that I hadn’t used Twitter (not consciously, anyway—my editors inform me that this blog has for some time had an automated Twitter feed). I haven’t used crack, either, but—as a Bilton reader pointed out—you don’t need to do the drug to understand the effects. One is the sight of adults walking into traffic with their eyes glued to their iPhones, or dividing their attention about evenly between their lunch partner and their BlackBerry. Here’s another: Marc Ambinder, The Atlantic’s very good politics blogger, was asked by Michael Kinsley to describe his typical day of information consumption, otherwise known as reading. Ambinder’s day begins and ends with Twitter, and there’s plenty of Twitter in between. No mention of books, except as vacation material via the Kindle. I’m sure Ambinder still reads books when he’s not on vacation, but it didn’t occur to him to include them in his account, and I’d guess that this is because they’re not a central part of his reading life.

And he’s not alone. Just about everyone I know complains about the same thing when they’re being honest—including, maybe especially, people whose business is reading and writing. They mourn the loss of books and the loss of time for books. It’s no less true of me, which is why I’m trying to place a few limits on the flood of information that I allow into my head. The other day I had to reshelve two dozen books that my son had wantonly pulled down, most of them volumes from college days. I thumbed idly through a few urgently underlined pages of Kierkegaard’s “Concluding Unscientific Postscript,” a book that electrified me during my junior year, and began to experience something like the sensation middle-aged men have at the start of softball season, when they try sprinting to first base after a winter off. What a ridiculous effort it took! There’s no way for readers to be online, surfing, e-mailing, posting, tweeting, reading tweets, and soon enough doing the thing that will come after Twitter, without paying a high price in available time, attention span, reading comprehension, and experience of the immediately surrounding world. The Internet and the devices it’s spawned are systematically changing our intellectual activities with breathtaking speed, and more profoundly than over the past seven centuries combined. It shouldn’t be an act of heresy to ask about the trade-offs that come with this revolution. In fact, I’d think asking such questions would be an important part of the job of a media critic, or a lead Bits blogger.

Instead, the response to my post tells me that techno-worship is a triumphalist and intolerant cult that doesn’t like to be asked questions. If a Luddite is someone who fears and hates all technological change, a Biltonite is someone who celebrates all technological change: because we can, we must. I’d like to think that in 1860 I would have been an early train passenger, but I’d also like to think that in 1960 I’d have urged my wife to go off Thalidomide.

Bilton’s arguments on behalf of Twitter are that it’s useful for marketing and “information-sharing,” and that I, as a journalist, ought to understand the value as well as anyone: “Twitter is transforming the nature of news, the industry from which Mr. Packer reaps his paycheck. The news media are going through their most robust transformation since the dawn of the printing press, in large part due to the Internet and services like Twitter. After this metamorphosis takes place, everyone will benefit from the information moving swiftly around the globe.”

If there are any journalists left by then. Until that promised future, American newspapers and magazines will continue to die by the dozen, and Bilton’s Times will continue to cut costs by asking reporters and editors to take buy-outs, and the economic basis for reporting (as opposed to information-sharing, posting, and Tweeting) will continue to erode. You have to be a truly hard-core techno-worshipper to call this robust. A friend at the Times recently said he doubts that in five years there will be a print edition of the paper, except maybe on Sundays. Once the print New York Times is extinct, it’s not at all clear how the paper will pay for its primary job, which is reporting. Any journalist who cheerleads uncritically for Twitter is essentially asking for his own destruction.

Bilton’s post did prompt me to seek out a Tweeter, which provided half an hour of enlightenment, diversion, and early-onset boredom, at the end of which I couldn’t bring myself to rue all the Twitter links and restaurant specials and coupon offers I’ll continue to miss. It’s true that Bilton will have news updates within seconds that reach me after minutes or hours or even days. It’s a trade-off I can live with. As Garry Trudeau (who is not on Twitter) has his Washington “journotwit” Roland Hedley tweet at the end of “My Shorts R Bunching. Thoughts?,” “The time you spend reading this tweet is gone, lost forever, carrying you closer to death. Am trying not to abuse the privilege.”

[all emphasis mine.]

Here, Packer drops the curmudgeonly pretense of the first piece, and starts chewing the furniture. He makes clear that he believes Twitter is the archangel of a dark future, an appliance that will make us stupid. Twitter and other web tools take us away from grown-up activities like reading and walking slowly through museums. These are dangerous toys, he says, that could blow off your prefrontal cortex if you aren’t careful. And he’s an attention economist, saying we don’t have time to mess with this junk when there is so much to do! You can almost hear him yelling, “Get back to work, slackers!”

Then he attacks all those who smirked and called him a Luddite, calling us cultists and intolerant. (Well, I admit I am intolerant of people who call me an intolerant cultist.)

Packer also suggests that Bilton is a traitor to his calling, supporting the use of technologies that are directly leading to the erosion of old media. He has a point, since time that people spend using tools like Twitter does cut into traditional media, like TV, radio, and newspapers. But the media folks should take the rap for that, since they are losing us exactly because they failed to provide open social discourse. We moved onto the web to have what they failed to produce, and we are doing it ourselves; the more old media figure that out, the more they will change.

But beneath all this is fear: fear of the future, fear of change, and fear of the new.

Packer senses a world he loves slipping away. A world in which rereading Kierkegaard is seen as a noble end, and not just escapism or a mere hobby.

He makes Twitter a demon, and calls us cultists, worshipping technology. This is the war on flow, yet again. Packer and his ilk will say what we are doing is illegitimate, immoral, immature. Any slight merits these tools may have are overbalanced by the harm they do. While they may give their users pleasure, those pleasures are like drugs, gossip, or masturbation. We should put these dangerous mind-altering toys aside, and invest ourselves in grown-up activities, like quality face time with a small circle of ‘real’ friends, or reading.

Critics like Packer always miss the social dimension of these tools. They focus on their informational use, or talk about them as if they were communication devices like phones. Or compare them to drugs. But they are much more than that. Those of us online are deriving community and involvement from participating in these social settings, a sense of being connected that may have been missing in many people’s lives before the web.

Sociality on the web is subversive, and it does alter the established role of media, which directly threatens Packer and other journalists. But mostly I think Packer is suffering a sort of future shock, a fear of the future, and the loss of a precious past, a time in which he knew what was right and wrong, what to do and say, and which way was up.

I might feel the same way if I thought the web was going to come to an end, and all that I have come to rely on — friendship, connection, and membership in a community of other minds — were to go away. But I think the web is here to stay, and if something new comes along, I would probably jump on that, anyway.


The False Question Of Attention Economics

A few posts have emerged recently that recapitulate the well-worn arguments of attention scarcity and information overload in the real-time social web. So, here at the start of 2010, a new decade, I will try to write a short and sweet counterargument from a cognitive science/anthropology angle.

But first let me recount the two pieces that prodded me to this.

David Armano wrote a piece called The Human Feed: How People Filter Signal From Noise:

In the earlier days of the internet, the Web became a place quickly saturated with information and we needed something to beat the information into submission. Search engines were born—and as a result the internet became more productive.  Today, the internet is still about information—but it’s also about attention. There is a surplus of information, and a meta surplus of marketing in every form. For individuals, we are experiencing the opposite. We have a deficit in attention.

We’ve long exceeded the capacity of information that we can absorb and retain. We all suffer from technology-induced attention deficit disorder, bright-and-shiny-object syndrome and short-term memory loss.

Bookmarks don’t help—now we need tools like del.icio.us. And of course we need Google more than ever. And there’s one more thing we need. We need each other to make sense of it all. We need a Web with a human touch to help guide us through the fragmented landscape of the internet. And that’s where the human feed comes in. If you sign up to a service like Twitter, Friendfeed, or even subscribe to the del.icio.us links of real live people who you trust and look to for insights, you’ll find that a wealth of information will be brought right to you vs. you having to go out and hunt for it.

[Image: David Armano]
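To make Armano’s ‘human feed’ mechanism concrete, here is a minimal sketch in Python — assuming the third-party feedparser library, with hypothetical curator feed URLs — of letting a few trusted people do the web-crawling for you:

    # A minimal sketch of the 'human feed' idea: instead of crawling the web
    # yourself, merge the link streams of a handful of trusted curators.
    # Assumes the third-party feedparser library; feed URLs are hypothetical.
    import feedparser

    CURATOR_FEEDS = [
        "https://example.com/curator-one/links.rss",  # hypothetical trusted source
        "https://example.com/curator-two/links.rss",  # hypothetical trusted source
    ]

    def human_feed(feed_urls):
        """Collect (title, link) pairs from each curator's feed, newest first."""
        items = []
        for url in feed_urls:
            parsed = feedparser.parse(url)
            for entry in parsed.entries:
                published = entry.get("published_parsed")  # struct_time or None
                items.append((published, entry.get("title", ""), entry.get("link", "")))
        # Sort by publication time where available, newest first; undated items sink.
        items.sort(key=lambda item: item[0] or (0,), reverse=True)
        return [(title, link) for _, title, link in items]

    if __name__ == "__main__":
        for title, link in human_feed(CURATOR_FEEDS)[:20]:
            print(title, "->", link)

The point is not the code but the inversion it illustrates: the filtering is done by people you chose to trust, and the aggregation is trivial once you have chosen them.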

Just for history’s sake, the origin of the ‘poverty of attention’ meme was Herbert Simon, way back in 1971:

…in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.

[Image: Herbert Simon]

Likewise, at around the same time as Simon’s comment, Alvin Toffler wrote Future Shock, in which he suggested that the speeding up of modern society was basically driving us crazy, like soldiers suffering from shell shock, now called combat stress reaction. In other words, “too much change in too short a period of time.” One major component of future shock — to which he ascribes most of the major problems of our day — is information overload: too much information to make sense of, in the context of a sped-up, future-shocked world.


And, going back to the oldest philosophical natterings on the subject, Diderot wrote in 1755:

As long as the centuries continue to unfold, the number of books will grow continually, and one can predict that a time will come when it will be almost as difficult to learn anything from books as from the direct study of the whole universe. It will be almost as convenient to search for some bit of truth concealed in nature as it will be to find it hidden away in an immense multitude of bound volumes.

[Image: Denis Diderot by Louis-Michel van Loo]

So, Armano has joined that long list of philosophers, suggesting that there is too much to grasp, things are going too fast, and the techniques that we have used in the past are failing.

Another post, this one by Thomas Peterson, makes more or less the same case, although employing the stark metaphor Slaves of the Feed to suggest we have become entrapped by our tools:

Let’s start with what most people can probably agree on.
Information is accumulating online. The amount of available information is increasing at an exponential rate; some say it doubles every second year. This means that any illusion of being able to stay up to date with everything that is going on is utopian, and probably has been since Gutenberg invented the press.

Most people know this, yet that is exactly what we all seem to be doing.

There is no shortage of content aggregators and aggregators of aggregators, developed daily to give us a better overview of all the sources of information we have subscribed to and now find ourselves depending on.

This has resulted in an endless stream of articles, news, pictures, websites, products, updates, comments on updates and comments on those comments, being delivered to us second by second, that each of us has to deal with.

Constantly checking our feeds for new information, we seem to be hoping to discover something of interest, something that we can share with our networks, something that we can use, something that we can talk about, something that we can act on, something we didn’t know we didn’t know.

It almost seems like an obsession and many critics of digital technology would argue that by consuming information this way we are running the danger of destroying social interaction between humans. One might even say that we have become slaves of the feed.

 


This thread of Western philosophical discourse — attention scarcity, future shock, information overload — has become the conventional wisdom. It seems to be based on unassailable and unshakable logic. But what is that logic?

The framing of the argument includes the unspoken premise that once upon a time in some hypothetical past attention wasn’t scarce, we didn’t suffer from too much information, and we had all the time in the world to reason about the world, our place in it, and therefore to make wise and grounded decisions.

But my reading of human history suggests the opposite. In the pre-industrial world, business people and governments still suffered from incomplete information, and the pace of life always seemed faster than what had gone on in earlier times. At every point in human history there have been philosophers claiming that the current civilization has fallen from an earlier halcyon state, that the ways of the ancients had been lost, and modern innovations and practices threatened to destroy all that was good in society and culture. 

So, this is merely the most recent spin on an ancient theme, as the Diderot quote indicates.

Imagine for a moment that it is true — that there was an idyllic time back in the Garden of Eden, when we knew all that was necessary to know, and we had all the time in the world to make decisions. Maybe. But I am betting it is a shadow of our psychology, the same sort of magical thinking that believes in guardian angels and reincarnation. Just a slightly more intellectual superstition.

Another thread of this argument is that human beings don’t have the capacity to winnow out the information we need given the torrent of information streaming past, which is in a sense Diderot’s conjecture. But we really don’t know what we are capable of, honestly.

The human mind is exceptionally plastic, especially when young people are exposed to media and symbolic information systems at an early age. This is why those who take up the study of music, or programming, or karate at a young age and practice for 10,000 hours gain mastery of those skills, often before reaching 20 years of age. And even older people can make significant improvements in cognitive skills — like juggling or flight-simulation games — with relatively small exposure.

I suggest we just haven’t experimented enough with ways to render information in more usable forms, and once we start to do so, it will likely take 10 years (the 10,000-hour rule again) before anyone demonstrates real mastery of the techniques involved.

These are generational time scales, people. And note: the only ones who will benefit in the next ten years will be those who expend the time needed to stretch the cognition we have, now, into the configuration needed to extract more from the increasingly real-time web.

The most difficult argument to make is the following:

  • We have always been confronted with a world — both natural and human-made — that offers an infinite amount of information.
  • We have devised cultural tools — like written language, mathematics, and the scientific method — to help understand the world in richer ways, over and above our emotional and inbuilt cognitive capabilities.
  • We are heading into a post-industrial world where information systems and the social matrix of the web have become the most important human artifact, one that is repurposing everything that has come before.
  • We will need to construct new and more complex cultural tools — things like augmented reality, massively parallel social tools, and ubiquitous mobile connected devices — and new societal norms and structures to assist us in using them effectively.
  • Many commentators — including Armano and Peterson — allude to the now generally accepted notion that we will have to leverage social systems (relying on social tools) to accomplish some part of the heavy lifting in whatever new schemes we develop for understanding this new world. But it has only been 10 years that we have been talking about social tools, and less than five since we’ve had anything like real-time streaming applications or tools involving millions of users. It’s early days.


In the final analysis, I am saying there is no ‘answer’ for those who say we are overloaded, that we are being driven mad by or enslaved to the tools we are experimenting with, or that there is some attention calculus that trumps all other value systems.


Instead, I suggest we continue experimenting, cooking up new ways to represent and experience the flow of information, our friends’ thoughts, recommendations, and whims, and the mess that is boiling in the huge cauldron we call the web.

There is no “answer” since they are asking a false question, one that hides preconceived premises and biases. Starting out with the assumption that we have moved past our abilities to cope with the stream of information, and therefore something has to give, is a bias.

In part, this arises from the desire of economists like Simon to find what is scarce, and ascribe a value to it. Or to media and PR types, who want to control discourse, and fill it with their ‘messages’ and influence social opinion or buying behavior.

But from a cognitive and anthropological viewpoint, these concerns are something like Socrates’ argument that learning to read and write would debase the cognition of those who became literate. In his era the ability to remember thousands of verses of poetry was the baseline for being enculturated, and he believed that something fundamental would be lost if we were to rely on books instead of our memories. He believed that writing was a fall from a better time, a lesser way to think and understand the world.

I think that the rise of the social web, just like writing, the printing press, and the invention of money, is not really about the end of what came before, but instead is the starting point for what comes next: richer and more complex societies. These technologies are a bridge we use to cross over into something new, not a wrecking ball tearing down the old.

There is no golden past that we have fallen from, and it is unlikely that we are going to hit finite human limits that will stop us from a larger and deeper understanding of the world in the decades ahead, because we are constantly extending culture to help reformulate how we perceive the world and our place in it.

Anne Truitt Zelenka v Peter Drucker

Anne has a solid post that recasts something Peter Drucker once wrote:

[from Knowledge Economy (Drucker) vs. Web Economy (Zelenka)]

Web workers do not start with their tasks or with their time [Drucker stated that executives start by examining where their time ‘goes’]. They start with their attention. And they do not start out with planning or by finding out where their time actually goes. They start by finding where their attention wanders, and what gives them energy and increased attention. Then they attempt to let their attention flow freely and to cut back on redundant or tired information sources that demand their attention without providing new ideas or insight. Finally they combine what they have found into something new (software, web design, industry analysis, etc.) and make it available on the web where it can earn attention itself and lead to an ongoing multiplication of attention.

Steve Rubel Becomes Another Attention Economist

Steve Rubel is following the lead of many others into Toffler’s “information overload is driving us crazy” tarpit. He’s in good company, joined by Herbert Simon, Tom Davenport, and Linda Stone: the Attention Economists.

[from Micro Persuasion: The Attention Crash]

We are reaching a point where the number of inputs we have as individuals is beginning to exceed what we are capable as humans of managing. The demands for our attention are becoming so great, and the problem so widespread, that it will cause people to crash and curtail these drains. Human attention does not obey Moore’s Law.

[…]

With this philosophy in mind [Tim Ferriss’s 4-Hour Workweek], I have trimmed projects, RSS feeds and emails to hone in on the 20 percent that’s most important. It’s also why I am not trying every new site that floats in my inbox and deleting pitches that are clearly off topic w/o even reading them.

My attention has reached a limit so I have re-calibrated it to make it more effective. I think this issue is an epidemic.

No, I think we need to develop new behaviors and new ethics to operate in the new context.

Most people operate on the assumption that the response to increased flow is to intensify what was working formerly: read more email, read more blogs, write more IMs, and so on. And at the same time motor on with the established notions of what a job is, how to accomplish work and meet deadlines, and so on.

In a time of increased flow, yes, if you want to hold everything else as is — your definition of success, of social relationships, of what it means to be polite or rude — Steve is right: you will have to cut back.

Alternatively, we can start to shift everything, let go of a lot of the old ways, and operate on a new, pre-industrial, pre-agricultural footing.

  1. It’s OK not to respond to emails, vmails, or IMs. There is no possible way that you can live a public life, open to the world, and respond to every request that comes along. The same holds even if it is a friend, or colleague. People have to pick and choose: it’s a big world.
  2. It’s sensible to have a nomadic reading style: if something is important it will show up in a variety of places. Don’t be a slave to RSS readers: throw them away. (I have always hated RSS readers that emulate the email inbox, for exactly this reason: they make everything seem equally important… or equally unimportant.)
  3. Unlike Steve (or Tim Ferriss), I don’t know exactly how to trim out the 80% of everything that is junk, as Ferriss suggests. I do fire clients that make things difficult, unpleasant, or unrewarding, but it’s not statistical. I constantly gravitate to projects and people that I think offer the greatest opportunities for growth, which means constantly leaving other things behind. But this is just another kind of flow, not a one-time triage: it is a constant attrition and acquisition.

Instead of The 4-Hour Workweek, though, I suggest that people read the Tao Te Ching:

9

Fill your bowl to the brim
and it will spill.
Keep sharpening your knife
and it will blunt.
Chase after money and security
and your heart will never unclench.
Care about people’s approval
and you will be their prisoner.

Do your work, then step back.
The only path to serenity.

The answer is not becoming obsessed with attention as a limited resource to be husbanded, or thinking of our cognition as a laser beam to be pointed only at what is important.

We need to unfocus, to rely more on the network or tribe to surface things of importance, and remain open to new opportunities: these are potentially more important than the work on the desk. Don’t sharpen the knife too much.

xposted.com - The Intention Economy Incarnate

Doc Searls, inspired (or goaded?) by the Attention Economy meme of the Etech conference, has offered up a completely different, but alliterative term for the world we are now entering: The Intention Economy.

In the hallway yesterday I was talking with r0ml Lefkowitz, who now works with Seth Goldstein at Root.net. r0ml was talking about how his brother, not a techie, didn’t understand what r0ml meant by working with “attention”. After r0ml explained, his brother said, “Oh, isn’t that what they used to call ‘eyeballs’?”

Bull’s Eye.

Now, I’m sure eyeballs aren’t what Steve Gillmor means by Attention. Or what Seth and r0ml mean, either. In fact, r0ml explained to me that Root.net is actually concerned with something much simpler and less creepy than eyeballs; namely, leads. In other words, people who are ready to buy.

Though I’m not much more comfortable being a “lead” than being an “eyeball”, at least “lead” regards me as a potential buyer, rather than as yet another “consumer” who might become a buyer if I find a “message” persuasive. The chance of that happening in any individual case is so close to zero that advertising only yields useful numbers in the calculus of mass marketing. Which, even in 2006, at eTech, we still use.

So I’m thinking, Can’t we get past that now? Please?

Hence my idea: The Intention Economy.

The Intention Economy grows around buyers, not sellers. It leverages the simple fact that buyers are the first source of money, and that they come ready-made. You don’t need advertising to make them.

The Intention Economy is about markets, not marketing. You don’t need marketing to make Intention Markets.

The Intention Economy is built around truly open markets, not a collection of silos. In The Intention Economy, customers don’t have to fly from silo to silo, like bees from flower to flower, collecting deal info (and unavoidable hype) like so much pollen. In The Intention Economy, the buyer notifies the market of the intent to buy, and sellers compete for the buyer’s purchase. Simple as that.

The Intention Economy is built around more than transactions. Conversations matter. So do relationships. So do reputation, authority and respect. Those virtues, however, are earned by sellers (as well as buyers) and not just “branded” by sellers on the minds of buyers like the symbols of ranchers burned on the hides of cattle.

The Intention Economy is about buyers finding sellers, not sellers finding (or “capturing”) buyers.

Doc’s model is smart: individuals define or describe what it is they intend to buy, or what they are intent upon, and this information — made available perhaps even in an anonymized fashion — allows sellers to connect with the appropriate buyers or users. It’s not where we have been spending our attention; it’s not even buried in our clickstreams; it’s what we have in our hearts and minds that matters. Perhaps part of that can be discerned from the conversations we are having online, in blogs and chat, but maybe not. Perhaps just a simple statement of intent might go a long way.
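As a thought experiment only — Searls describes a model, not an implementation, so every name and field below is hypothetical — the kernel of an intention market might be nothing more than buyers publishing structured intent records that sellers bid against:

    # A hypothetical sketch of an 'intention market': buyers publish what they
    # intend to buy, and sellers compete for the purchase. All names and fields
    # are illustrative; Searls describes the model, not an implementation.
    from dataclasses import dataclass, field

    @dataclass
    class Intent:
        buyer_id: str      # could be anonymized, as suggested above
        category: str      # what the buyer intends to buy
        max_price: float   # the buyer's stated ceiling
        details: dict = field(default_factory=dict)

    @dataclass
    class Offer:
        seller_id: str
        category: str
        price: float

    def match(intent, offers):
        """Sellers compete: return qualifying offers, cheapest first."""
        candidates = [o for o in offers
                      if o.category == intent.category and o.price <= intent.max_price]
        return sorted(candidates, key=lambda o: o.price)

    # A buyer announces intent; sellers respond with competing offers.
    intent = Intent(buyer_id="anon-42", category="laptop", max_price=900.0)
    offers = [Offer("shop-a", "laptop", 850.0),
              Offer("shop-b", "laptop", 950.0),   # over the buyer's ceiling
              Offer("shop-c", "camera", 400.0)]   # wrong category
    for offer in match(intent, offers):
        print(offer.seller_id, offer.price)       # -> shop-a 850.0

Note how the flow runs in the direction Doc describes: the buyer speaks first, and the sellers do the finding.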

[Update: Greg Narain points out that the x:posted service is a perfect example of the dynamics that Doc is talking about. I am using this as the link to set up my own x:posted account, so I can begin to sell my writing, as described here.]