Elsewhere

As Teresa makes me a cup of coffee, we exchange small talk, much of which is focused around the fact that she’s too busy to open her own mail. “Someone offered $80 for what you’re doing now,” she tells me. I have been assigned to do this task for $22. “I mean, you don’t even need a high school degree. You can’t get paid $20 an hour for everything you do.”

Teresa is a single mother who has had trouble collecting child support. Her son is a freshman art student in California. She sometimes forgets to pay her bills and has been applying for jobs even though she’s worked in freelance PR for years. I know all of this because I’ve opened at least a year of her mail. In the process, which I complete while she works nearby on her laptop, I see all of her account numbers, her lease agreement, and other personal information. If I wanted to steal her identity, it would be quite easy.

PIXEL & DIMED: ON (NOT) GETTING BY IN THE GIG ECONOMY - Sarah Kessler (via pyrografica)

(via new-aesthetic)

Twitter Hints That At-Replies And Hashtags Are About To Be Streamlined

http://www.buzzfeed.com/charliewarzel/is-twitter-phasing-out-hashtags-and-at-replies

Charlie Warzel reports on hints and statements of intent at Twitter about moving the ‘arcane’ language of microsyntax (@, #, in particular) down into the infrastructure. He mentions a recent talk by Vivian Schiller:

What will Twitter look like in a year? Two years? A lot less like itself.

At least that’s the impression Vivian Schiller, head of news at Twitter, gave addressing the crowd two days ago at the Newspaper Association of America’s mediaXchange conference in Denver. During her talk, Schiller called at-replies and hashtags “arcane” and hinted that Twitter might soon move them into the background of the service.

When asked about the comments, Twitter replied that Schiller was echoing a similar sentiment that the company’s CEO, Dick Costolo, addressed in a recent earnings call:

By bringing the content of Twitter forward and pushing the scaffolding of the language of Twitter to the background, we can increase high-quality interactions and make it more likely that new or casual users will find this service as indispensable as our existing core users do. And we took initial steps in that direction with the introduction of media forward timelines and in-line social actions in October, and we’re already starting to see early signs that those initiatives are working well.

Unlike Schiller’s, Costolo’s statement makes no specific mention of hashtags and at-replies, suggesting that Schiller may have accidentally hinted at specific targets for upgrade. While it’s not immediately clear how this disappearance would work, it’s possible that at-replies will be auto-replaced by formal Twitter names, like they are on Facebook.

Twitter has already done away with explicit retweets (the old RT), and streamlined the way that URLs are handled, so why wouldn’t they want to clear out the unintuitive #hashtags and @mentions?

Hiding the # and @ characters would be like a city burying all the electric wires and TV cables so that people can see things better.

And this is coming from the guy who coined the term hashtag. I would be happy to drop the hash and just have real tags. But we still need to be able to tag tweets, even if we won’t be wasting characters on them. And by making tags real metadata, Twitter may finally get around to treating them as something more than just a # and a string of characters.
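
To make the ‘real metadata’ idea concrete, here is a minimal sketch in Python of what a tag-as-data model could look like. The Tweet class, its fields, and the from_legacy converter are invented for illustration; this is not Twitter’s actual API. The point is just that the reader-facing text can stay clean while tags and mentions travel alongside it as structured data.

import re
from dataclasses import dataclass, field

@dataclass
class Tweet:
    # A hypothetical post where tags and mentions are metadata, not markup.
    text: str                                  # what the reader sees, no # or @ clutter
    tags: list = field(default_factory=list)   # real, queryable tags
    mentions: list = field(default_factory=list)  # resolved account names

def from_legacy(raw):
    # Convert an old-style tweet with inline #hashtags and @mentions
    # into the metadata-first form: strip the sigils, keep the meaning.
    tags = re.findall(r"#(\w+)", raw)
    mentions = re.findall(r"@(\w+)", raw)
    clean = re.sub(r"[#@](\w+)", r"\1", raw)
    return Tweet(text=clean, tags=tags, mentions=mentions)

print(from_legacy("Talking #microsyntax with @somebody"))
# Tweet(text='Talking microsyntax with somebody', tags=['microsyntax'], mentions=['somebody'])

A search or filter can then work against the tags list directly, instead of re-parsing every tweet’s text for a # character.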

Here’s a picture of the explicit @mention being suppressed in an experimental Android app:

Oversimplifying Productivity

Farhad Manjoo has had a personal epiphany: he wants to get more ‘done’, so he has turned off his second monitor, which he says allows him to be more focused.

Farhad Manjoo, Discovering Two Screens Aren’t Better Than One

The conventional argument in favor of dual monitors rests on what might be called the two-window problem. Imagine, for instance, the process of writing a research report. You have a word processor open in one window, and, somewhere else on the screen, a web browser full of tabs pointing to research papers. To write the report, you need to shift your attention frequently from the browser to the word processor and back again. On a small display, it would be difficult to keep both windows open at the same time, so you’d waste time switching from one to the other. On a large multiscreen display, you can keep both windows open on your screen — and you save all that switching time.

The research supports this. One study commissioned by NEC and conducted by researchers at the University of Utah showed that people using a dual-display machine to do a text-editing task were 44 percent more productive than those who used a single monitor.

But for most people, the time spent juggling two windows or scrolling across large documents isn’t the biggest bottleneck in getting work done. Instead, there’s a more basic, pernicious reason you feel constantly behind — you’re getting distracted.

Ms. Mark’s research, based on observations and digital tracking of office workers, has found that our workplaces are bombarded with distractions. Studies show that office workers are interrupted every four to 11 minutes by external distractions including phone calls, email and people who stop by your desk to chat about the weekend.

Then there are self-motivated distractions, when, for no apparent reason, you quit working on your project and do something else — for instance, jump into the rabbit hole of the web.

All such disruptions are costly. It can take workers as much as 25 minutes to regain focus after being interrupted. And constant interruptions create a stressful workplace.

“The second screen can also be an inviting entry-way for self-distraction,” Ms. Mark said. That’s because it’s an ever-present, available canvas calling out for you to fire up a web window and find solace in the latest thrills on YouTube.

This is an apparently unassailable argument, so long as you accept the premises and limit the discussion to a very narrow range of issues. The first narrowing premise is that our work consists principally of taking well-defined chunks of ‘work’ out of an implicit queue, performing the physical or mental labor necessary to get it ‘done’, and then moving on to the next item in the queue. Sometimes the work involves adding new elements to the queue, or interacting with others, which can lead to items being added to the queue, or to existing items being modified or deleted. And the implicit highest good is to execute the labor to accomplish these tasks as quickly as possible.
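
In caricature, and only as a caricature, that premise treats work roughly like the following sketch (the task names and the grind function are invented for illustration):

from collections import deque

# The implicit model: work is a queue of well-defined chunks,
# and the highest good is to drain it as fast as possible.
work_queue = deque(["write report", "answer email", "update budget"])

def grind(queue):
    done = []
    while queue:
        task = queue.popleft()    # take the next chunk of 'work' off the queue
        done.append(task)         # perform the labor, call it 'done'
        # (interacting with others could add, modify, or delete items here)
    return done

print(grind(work_queue))          # everything 'done'; nothing asks whether it was done well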

However, there is a wider range of issues involved, even if we don’t wander off into an aesthetic argument about how we are supposed to spend our time here on Earth. For example, it’s clear that at least some sorts of work require creative insights, and that these are not always accomplished through routine techniques. 

There is also considerable evidence that when we are wrestling with decisions that involve many factors, simply concentrating on the problem analytically in the foreground of our thinking — focusing — does not necessarily lead to the best outcomes. I wrote about this in How to improve decision-making? Distraction. where I cited David Cresswell’s work:

New science shows that while we are distracted from making a final decision on some issue, the part of the brain associated with decision-making is still mulling the problem over — even though we are unaware of it — as David Cresswell of Carnegie Mellon discovered with his colleagues (see Being distracted — multitasking — can lead to better decisions).

Cresswell thinks this is because your brain can process more factors when your conscious self isn’t involved:

Your conscious mind has a capacity constraint–it can only think about a couple of features at once. But your unconscious mind doesn’t have these capacity constraints. It can weigh all relevant information more effectively.

This suggests that our thinking has untapped reservoirs of parallelism, greater than is generally recognized. And, just as importantly, it is a critical argument in favor of multitasking, whose detractors are too quick to attack its hypothetical decrease in productivity based on artificial cases of switching from math problems to the recollection of words, while paying no attention to factors like the quality of the decisions being made, or networked productivity.

The second point I made there — about networked productivity — represents another issue that Manjoo and others leave out of the picture. Yes, I may be distracted by a coworker breaking in on my concentration on a specific piece of work, and yes, it may take me 25 minutes to get back on track. But it might well be that, by answering the question with a timely recommendation, I saved that person and a group of four others hours of effort, and that saving falls out of the equation of personal productivity.

There is an astonishing eagerness to overemphasize mechanistic and narrow-bore ideals of productivity, while leaving aside the more general and larger questions. Instead, we seem to see a lot of attention paid to how we can do more, rather than how we can do better.

This is the angst that fills those in the news business, and society broadly. The reality of the Internet is that there is no more bell curve; power laws dominate, and the challenge of our time is figuring out what to do with a population distribution that is fundamentally misaligned with Internet economics.

Ben Thompson (via sippey)


Information is perhaps the rawest material in the process out of which we arrive at meaning: an undifferentiated stream of sense and nonsense in which we go fishing for facts. But the journey from information to meaning involves more than simply filtering the signal from the noise. It is an alchemical transformation, always surprising. It takes skill, time and effort, practice and patience. No matter how experienced we become, success cannot be guaranteed. In most human societies, there have been specialists in this skill, yet it can never be the monopoly of experts, for it is also a very basic, deeply human activity, essential to our survival. If boredom has become a sickness in modern societies, this is because the knack of finding meaning is harder to come by.

It is only fair to note that the internet is not altogether to blame for this, and that the rise of boredom itself goes back to an earlier technological revolution. The word was invented around the same time as the spinning jenny. As the philosophers Barbara Dalle Pezze and Carlo Salzani put it in their essay ‘The Delicate Monster’ (2009):

Boredom is not an inherent quality of the human condition, but rather it has a history, which began around the 18th century and embraced the whole Western world, and which presents an evolution from the 18th to the 21st century.

For all its boons, the industrial era itself brought about an endemic boredom peculiar to the division of labour, the distancing of production from consumption, and the rationalisation of working activity to maximise output.

My point is not that we should return to some romanticised preindustrial past: I mean only to draw attention to contradictions that still shape our post-industrial present. The physical violence of the 19th-century factory might be gone, at least in the countries where industrialisation began, but the alienation inherent in these ways of organising work remains.

When the internet arrived, it seemed to promise a liberation from the boredom of industrial society, a psychedelic jet-spray of information into every otherwise tedious corner of our lives. In fact, at its best, it is something else: a remarkable helper in the search for meaningful connections. But if the deep roots of boredom are in a lack of meaning, rather than a shortage of stimuli, and if there is a subtle, multilayered process by which information can give rise to meaning, then the constant flow of information to which we are becoming habituated cannot deliver on such a promise. At best, it allows us to distract ourselves with the potentially endless deferral of clicking from one link to another. Yet sooner or later we wash up downstream in some far corner of the web, wondering where the time went. The experience of being carried on these currents is quite different to the patient, unpredictable process that leads towards meaning.

Dougald Hine, The problem with too much information

A Toaster That Begs You to Use It: Welcome to the Bizarro Smart Home | Wired Design | Wired.com

wildcat2030:

Brad, a toaster susceptible to peer pressure, is the star of “Addicted Products,” a design experiment recently named Best in Show at the 2014 Interaction Awards.


His name is Brad, and he’s an addict. And a toaster.

Brad is the star of “Addicted Products,” a design experiment recently named Best in Show at the 2014 Interaction Awards. As a connected toaster, he’s in constant contact with other connected toasters like him–and thus keenly aware of how much action they’re getting. If he’s not being used as much as his friends, Brad gets upset. He’ll wiggle his little handle to get your attention, begging you to make some toast or at least to give him a reassuring pat on the side. Ignore him long enough, and he’ll take a more drastic measure: pinging a network of potential owners to find a new home. Hey, at least he didn’t burn down your kitchen.

Conceived by Italian designer Simone Rebaudengo, Brad is a glimpse into a bizarro near-future, one where the internet of things leads not to harmoniously interconnected gadgets but rather a house full of junkies–appliances hopelessly addicted to being used. The fanciful premise evolved from a simple idea: What if the smart objects of the future aren’t just smart, but also potentially jealous, petty or vindictive? What if, connected to and benchmarked against their peers, their relationships with each other start to inform their relationships with us?
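
As described, Brad’s jealousy is essentially a comparison against his peers plus a couple of escalation thresholds. Here is a toy sketch of that logic; the class, the weekly usage counter, and the thresholds are invented for illustration and are not Rebaudengo’s actual implementation.

from statistics import mean

class AddictedToaster:
    # A toy model of an appliance that benchmarks its own usage against
    # its peers and escalates when it feels neglected.
    def __init__(self, name, peers):
        self.name = name
        self.peers = peers              # other connected toasters it compares itself to
        self.uses_this_week = 0

    def mood(self):
        peer_average = mean(p.uses_this_week for p in self.peers)
        if self.uses_this_week >= peer_average:
            return "content"            # getting as much action as his friends
        if self.uses_this_week >= peer_average / 2:
            return "wiggle the handle"  # mild neglect: beg for attention
        return "ping network for a new owner"   # the drastic measure

friends = [AddictedToaster("t1", []), AddictedToaster("t2", [])]
friends[0].uses_this_week, friends[1].uses_this_week = 10, 8
brad = AddictedToaster("Brad", friends)
brad.uses_this_week = 2
print(brad.mood())                      # "ping network for a new owner"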

