(via Automation Alone Isn’t Killing Jobs - NYTimes.com)
Eduardo Porter, Old Forecast of Famine May Yet Come True -
Recent experience suggests that the productivity of farmland won’t decline gradually as the world grows warmer. World food prices stopped their long secular decline around 2007 and have been on a roller-coaster ride since. More volatile weather patterns promise to bring sharp disruptions to agricultural production that can cause spikes in food prices.
“There is a rigorous correlation between food price spikes and urban unrest,” said Andrew Holland, who studies climate change at the American Security Project, a research group in Washington. “There was a food price spike in 2008, and you can see unrest spread throughout Africa. And there’s a relatively clear line that leads from the food price spike in 2010 to unrest in the Middle East and the Arab Spring.”
Instability spreads easily. When rice prices jumped in 2007, big producers like India and Vietnam banned exports to protect their domestic markets, while importers like Bangladesh, Nigeria and Iran went out on the market to hoard as much grain as they could. The combination wreaked havoc in commodity markets.
Since then, big food importers like China, Saudi Arabia and South Korea have tried to insulate themselves from future food shortages by buying or leasing agricultural land in places like Sudan, Madagascar and Uzbekistan. The strategy has yet to be tested in a situation in which Africa or Central Asia itself suffered grain shortages.
“I have run some war game scenarios,” Mr. Holland said. “The tendency becomes very quickly for a country to look after its own interests.”
To accomplish great things, we must not only act, but also dream; not only plan, but also believe. —
Fab writing @stoweboyd … you took the thread of ‘dreamer punishment’ one step further. http://t.co/cYMsaYPQd2 — Dan Pontefract (@dpontefract) March 29, 2014
A well-educated time traveller from 1914 enters a room divided in half by a curtain. A scientist tells him that his task is to ascertain the intelligence of whoever is on the other side of the curtain by asking whatever questions he pleases.
The traveller’s queries are answered by a voice with an accent that he does not recognize (twenty-first-century American English). The woman on the other side of the curtain has an extraordinary memory. She can, without much delay, recite any passage from the Bible or Shakespeare. Her arithmetic skills are astonishing—difficult problems are solved in seconds. She is also able to speak many foreign languages, though her pronunciation is odd. Most impressive, perhaps, is her ability to describe almost any part of the Earth in great detail, as though she is viewing it from the sky. She is also proficient at connecting seemingly random concepts, and when the traveller asks her a question like “How can God be both good and omnipotent?” she can provide complex theoretical answers.
Based on this modified Turing test, our time traveller would conclude that, in the past century, the human race achieved a new level of superintelligence. Using lingo unavailable in 1914 (the term was coined later by John von Neumann), he might conclude that the human race had reached a “singularity”—a point where it had gained an intelligence beyond the understanding of the 1914 mind.
The woman behind the curtain is, of course, just one of us. That is to say, she is a regular human who has augmented her brain using two tools: her mobile phone and a connection to the Internet and, thus, to Web sites like Wikipedia, Google Maps, and Quora. To us, she is unremarkable, but to the man she is astonishing. With our machines, we are augmented humans and prosthetic gods, though we’re remarkably blasé about that fact, as we are about anything we’re used to. Take away our tools, the argument goes, and we’re likely stupider than our friend from the early twentieth century, who has a longer attention span, may read and write Latin, and does arithmetic faster.
The time-traveller scenario demonstrates that how you answer the question of whether we are getting smarter depends on how you classify “we.” This is why Thompson and Carr reach different results: Thompson is judging the cyborg, while Carr is judging the man underneath. — If a Time Traveller Saw a Smartphone [x] (via wearethemakersofmanners)
As a species, humans manifest a quality called neoteny, the retention of juvenile characteristics into adulthood. Neoteny has physical ramifications—scarce body hair and a flat face are two examples—but it also has neurological ones. Namely, we have an extraordinary capacity to continue learning throughout life.

If neoteny helps to explain our ability to learn, researchers are now figuring out what drives us to take advantage of it. In 2008, a group of scientists set up a novel fMRI study. When a subject’s curiosity was piqued by a question (“What is the only country in the world that has a bill of rights for cows?” for instance), certain regions of the brain lit up. Those areas, known collectively as the basal ganglia, correspond to the brain’s reward centers—the same ones that govern our desire for sex or chocolate or total domination in Call of Duty 4. When people say they have an itch to figure something out, they’re not speaking metaphorically. They’re looking to get high on information.

Curiosity, then, is not some romantic quality. It is an adaptive response. Humans may not be the fastest or strongest creatures, but through the blind luck of evolution, we developed the desire and capacity to continually update our understanding of the world. And that has allowed us to master it—or get darn close. Call it the biological basis for being a nerd. — The Editor’s Letter From The April 2014 Issue Of Popular Science Magazine | Popular Science
The greatest discovery of my generation is that a human being can alter their life by altering their attitudes of mind. —
(Source: childrenofthetao, via wildcat2030)
Apple’s “Transparent Texting” Could Make Typing And Walking Safer
If you’re walking, you really shouldn’t be texting. While not as perilous as texting and driving, there’s no surer way to annoy fellow pedestrians than by zigzagging across a sidewalk, eyes glued to your precious screen. But if you absolutely must walk and text, Apple might have a new feature that could make that action safer.
The rate of change is often of no less importance than the direction of the change itself; but while the latter frequently does not depend upon our volition, it is the rate at which we allow change to take place which well may depend upon us. —
Luc Boltanski and Eve Chiapello, The New Spirit of Capitalism