Kevin Kelly is another deep thinker, along with Krugman, countering economist Robert Gordon’s claim that information technology has had only a small impact on productivity. His argument is that a/ we’ve only been at this 20 years, since the emergence of the web (he discounts Gordon’s dating of the information-technology era back to the ’60s), and b/ Gordon’s obsession with productivity is misguided: we should instead be measuring the degree to which this wave of technology frees us to waste time. Kelly calls this the post-productive economy, a techno-utopian view with some sensible foundations.
Kevin Kelly, The Post-Productive Economy
Long-term growth of the type that Robert Gordon studies is really weird if you think about it. As he notes, there wasn’t much of it in the world before 1750, before technological progress. Now, several centuries later, we have a thousand times as much wealth as before. Where does this extra good stuff come from? It is not moved from somewhere else, or borrowed. It is self-created. There’s a system which manufactures this wealth “out of nothing,” much like life itself. There are certainly necessary conditions and ingredients, but it seems that once you have those in place, the economy (the system) will self-generate this wealth.
A number of economists have wrestled with the origins of this self-generating wealth. Paul Romer and Brian Arthur both separately point to the recombining and re-mixing of existing ideas as the way economic growth occurs. This view focuses on knowledge as the prime motor in a self-renewing circle of increasing returns. Unlike, say, energy or matter, the more knowledge you spend, the more knowledge you earn, and more breeds more in a never-ending virtuous spiral.
What is important is that this self-increasing cycle makes things that are new. New goods, new services, new dreams, new ambitions, even new needs. When things are new they are often not easy to measure, not easy to detect, nor easy to optimize. The 1st Industrial Revolution that introduced steam and railways also introduced new ideas about ownership, identity, privacy, and literacy. These ideas were not “productive” at first, but over time, as they seeped into law and culture and became embedded into other existing technologies, they helped work become more productive. For example, ideas of ownership and capital became refined and unleashed new arrangements for funding large-scale projects in more efficient ways. In some cases these indirect ideas may have more long-term effect on growth than the immediate inventions of the time.
Likewise the grand shift our society is undergoing now, moving to a highly networked world in the third phase of industrialization, is producing many innovations that 1) are hard to perceive, 2) are not really about optimizing labor, and 3) are therefore hard to quantify in terms of productivity.
One has the sense that if we wait a while, the new things will trickle down and find places in the machinery of commerce where they can eventually boost the efficiency of work.
But it seems to me that there is a second-order tilt in this shift to a networked world that says the real wealth in the long term — or perhaps that should be the new wealth — will not be found merely in greater productivity, but in greater degrees of playing, creating, and exploring. We don’t have good metrics for new possibilities, for things that have never been seen before, because by definition their boundaries, distinctions, and units are unknown. How does one measure “authenticity” or “hyperreality” or “stickiness”?
Productivity is the main accomplishment, and metric, of the two previous Industrial Revolutions. Productivity won’t go away; over the long term it will take fewer hours of human work to produce more of the goods and services those economies produce. Our system will do this primarily because most of this work will be done by bots.
The main accomplishment of this 3rd Industrialization, the networking of our brains with other brains and other things, is to add something onto the substrate of productivity. Call it consumptity, or generativity. By whatever name we settle on, this frontier expands the creative aspect of the whole system, increasing innovations, expanding possibilities, encouraging the inefficiencies of experiment and exploring, absorbing more of the qualities of play. We don’t have good measurements of these yet. Cynics will regard this as new age naiveté, or unadorned utopianism, or a blindness to the “realities” of real life of greedy corporations, or bad bosses, or the inevitable suffering of real work. It’s not.
There are two senses of growth: scale, that is, more, bigger, faster; and evolution. The linear progression of steam power, railways, electrification, and now computers and the internet is a type of the former: just more of the same, only better. On that view the productivity growth curve should continue upward in a continuous linear fashion.
I suggest the growth of this 3rd regime is more like evolutionary growth, rather than developmental growth. The apparent stagnation we see in productivity, in real wages, in debt relief, is because we don’t reckon, and don’t perceive, the new directions of growth. It is not more of the same, but different.
This all reminds me of McLuhan’s claim —
Today in the electric age we feel as free to invent nonlineal logics as we do to make non-Euclidean geometries. Even the assembly line, as the method of analytic sequence for mechanizing every kind of making and production, is nowadays yielding to new forms.
— which I interpret to mean that as information technology advances over the next 100 years, it will push people increasingly into the role of artists, and out of the factories. An evolution of society, not just a speeding up.
Of course, the trick isn’t just convincing everyone that idleness should still come with a paycheck. The big hitch is managing to survive all the messes we’ve created in the name of global productivity and growth at all costs. There might be a techno-utopia in the out years, but in the meantime we have to learn to weather the postnormal, first.
One of Kelly’s paragraphs jumps out as perhaps the most challenging for those with the deepest identification with modern business ideology:
Civilization is not just about saving labor but also about “wasting” labor to make art, to make beautiful things, to “waste” time playing, like sports. Nobody ever suggested that Picasso should spend fewer hours painting per picture in order to boost his wealth or improve the economy. The value he added to the economy could not be optimized for productivity. It’s hard to shoehorn some of the most important things we do in life into the category of “being productive.” Generally any task that can be measured by the metrics of productivity — output per hour — is a task we want automation to do. In short, productivity is for robots. Humans excel at wasting time, experimenting, playing, creating, and exploring. None of these fare well under the scrutiny of productivity. That is why science and art are so hard to fund. But they are also the foundation of long-term growth. Yet our notions of jobs, of work, of the economy don’t include a lot of space for wasting time, experimenting, playing, creating, and exploring.
Paul Krugman begins to discuss our new economy — the Postnormal — but he doesn’t call it that, at least not yet. Mostly it seems like he is responding to an economic hand grenade thrown by Robert Gordon of Northwestern University, who suggests that the last era of industrialism — electrification — ran from 1890 to the 1960s, and that since then we have been floundering, because the information technology revolution hasn’t had as big an impact:
Is Growth Over? - Paul Krugman via NYTimes.com
Recently, Robert Gordon of Northwestern University created a stir by arguing that economic growth is likely to slow sharply — indeed, that the age of growth that began in the 18th century may well be drawing to an end.
Mr. Gordon points out that long-term economic growth hasn’t been a steady process; it has been driven by several discrete “industrial revolutions,” each based on a particular set of technologies. The first industrial revolution, based largely on the steam engine, drove growth in the late-18th and early-19th centuries. The second, made possible, in large part, by the application of science to technologies such as electrification, internal combustion and chemical engineering, began circa 1870 and drove growth into the 1960s. The third, centered around information technology, defines our current era.
And, as Mr. Gordon correctly notes, the payoffs so far to the third industrial revolution, while real, have been far smaller than those to the second. Electrification, for example, was a much bigger deal than the Internet.
It’s an interesting thesis, and a useful counterweight to all the gee-whiz glorification of the latest tech. And while I don’t think he’s right, the way in which he’s probably wrong has implications equally destructive of conventional wisdom. For the case against Mr. Gordon’s techno-pessimism rests largely on the assertion that the big payoff to information technology, which is just getting started, will come from the rise of smart machines.
[… a discussion of the state of AI… smart machines…]
So machines may soon be ready to perform many tasks that currently require large amounts of human labor. This will mean rapid productivity growth and, therefore, high overall economic growth.
But — and this is the crucial question — who will benefit from that growth? Unfortunately, it’s all too easy to make the case that most Americans will be left behind, because smart machines will end up devaluing the contribution of workers, including highly skilled workers whose skills suddenly become redundant. The point is that there’s good reason to believe that the conventional wisdom embodied in long-run budget projections — projections that shape almost every aspect of current policy discussion — is all wrong.
What Krugman doesn’t say — and perhaps doesn’t see — is that the ephemeralization of labor due to information technology has been going on for decades, and that may be its biggest impact: ending jobs. Remember all those jobs where people typed in data from insurance, medical, and government forms sent through the mail? All those secretaries who used to manage people’s travel, appointments, and business meetings? Telephone operators? The expediters who would figure out the best route for a fleet of UPS trucks to make their deliveries? That’s all gone now, slurped up by information technology.
Consider only a few breakthroughs likely to have an impact in the near future. Autonomous vehicles are on the immediate horizon, and their biggest impact won’t be on hipsters in urban centers: it will be on truck transport. In 2010, there were 1,604,800 truck drivers in the US, making an average of $18.15/hour (Bureau of Labor Statistics). The number of drivers is growing, and there is an unmet demand because the work is so awful: long hours, mediocre pay, and dangerous conditions.
So, imagine autonomous trucking systems taking over freight, at least long haul freight: 33% of those jobs. Bang. Gone.
239,900 taxi drivers and chauffeurs. Bang. Gone.
67,100 train engineers and operators. Bang. Gone.
100,000 airplane pilots (another industry with a looming talent cliff, because more than half are over 50 and must retire at 65). Bang. Gone.
A couple of million semi-skilled workers on the street, with nowhere to work.
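A quick tally of the figures above bears this out, with one caveat: reaching “a couple of million” evidently assumes automation eventually takes all truck driving, not just the long-haul third. A minimal sketch using the BLS counts quoted in this post:

```python
# Rough tally of the job figures quoted above (BLS counts as cited in the text).
truck_drivers = 1_604_800   # all US truck drivers, 2010
taxis_chauffeurs = 239_900  # taxi drivers and chauffeurs
train_crews = 67_100        # train engineers and operators
pilots = 100_000            # airplane pilots

# Near-term scenario: only long-haul freight, roughly 33% of trucking, is automated.
long_haul = truck_drivers * 33 // 100
near_term = long_haul + taxis_chauffeurs + train_crews + pilots

# Longer-term scenario: every driving and piloting job above is automated.
long_term = truck_drivers + taxis_chauffeurs + train_crews + pilots

print(f"near term: {near_term:,}")  # near term: 936,584
print(f"long term: {long_term:,}")  # long term: 2,011,800
```

Long-haul alone puts just under a million people out of work; counting every driving and piloting job pushes the total past two million.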
And that’s only one breakthrough. Imagine if there were 10 others, each of which erased 10 million jobs.
How many Blockbuster jobs were lost when Netflix and Redbox came along? About 60,000, and the other chains and mom-and-pop video stores that closed likely tripled that.
What about a Redbox in a Subway? What if you walked in, typed what you wanted into a touchscreen, swiped your credit card, and left? A store could be managed with one person per shift — to deal with the machinery — and Subway would save a fortune, and maybe drop the price a bit. And you’d still get a fresh, made-to-order sandwich. As of 2011, there were 4 million people working in the fast-food industry.
I project that we will see over 25 million more people out of work because of these advances. Permanently.
It’s the Postnormal, and — among other things — US and state policies should be directed toward a shorter work week for the same pay, which would share the bounties of ephemeralization with working people and not just with the folks who own the machinery. More important: if we share the work that is left, so that people work, say, 25 hours a week, what are they supposed to do with the rest of their time?
Of course, if things just keep on as they are at present, these people will be shit-canned, joining a permanent underclass who cannot find work in a world where there isn’t enough to go around.
I will dig up more on Robert Gordon’s thesis, and await Krugman’s next piece with interest.