Battelle has an interesting theory about Apple’s purchase of Topsy. It’s not, he says, about Twitter at all: it’s about making search the central metaphor of a new iOS UX.
John Battelle, APPLE+TOPSY: IT’S NOT ABOUT TWITTER (AND TWITTER IS PROBABLY COOL WITH THAT)
In Apple Won’t Build a (Web) Search Engine and Of Course Apple Is Going to Do Search, I argued that Apple must get into the “app search” game. Just as web search became the coin of the web realm, app search will be next. It won’t look like web search, I argued, but at its core, it’s quite similar.
That was three years ago, right after Apple bought Siri, launched iAds, and was relentlessly touting the growth of its app ecosystem. I was certain Apple was going to figure out a way to create value above the level of a particular app, using all that tasty data it had within its restrictive walled garden to build the next generation iOS interface.
But so far, Apple has failed to innovate inside its own ecosystem (unless you count minimalist icons and bright base colors as innovation). Three years later, we’re still stuck in a user interface of app-filled screens, most of which we never use, each disconnected from the other save for the fact they happen to reside on your phone, possibly right next to each other, but otherwise unaware of the value they might reap should they magically start sharing links and data with each other. (You know, the way the web works.)
This has to change.
Google knows it, which is why I find Google Now so fascinating. Apple knows it too – the days of home screens littered with app icons are numbered. What will replace them?
My guess is some kind of intelligent, search-driven interface that “understands” you, based on the intent you signal through your use of all kinds of apps – including browser apps, of course, as well as true search apps like Siri (or Google Now). This new kind of interface responds to your voice as well as your location, your history, and anything else you might willingly (or unwittingly) feed it. It will strive to always put the very thing you need at your fingertips – something that simply isn’t possible without understanding your interactions as the equivalent of… well, a personal interest graph.
And to do that, Apple needs a powerful engine, the kind of engine that, say, has been hard at work understanding a massive corpus of interest data for, say, six or so years. Something like Topsy.
I go along with some of John’s complaints. As I have added a few dozen apps that I use regularly to my iPhone, the UX has become an impediment. Yes, iOS 7 is an improvement, but it’s the same old restaurant with better tablecloths.
I’d like to see a better UX, and search offers some great angles. And it’s a good idea to buy new talent, considering how bad Spotlight was for so long (and it’s still not wonderful). In particular, I’d like to see elements in the OS where apps could register and allow a search to index their data stores, so that I could search against the data and not just the apps. Including, of course, Twitter data, and all sorts of other things that could be relevant.
So, maybe we should call this app-centric search, not app search.
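The registry idea above can be made concrete with a toy sketch: apps opt in by handing the OS a set of indexable records, and one query runs across every registered store. The class and method names here (`AppSearchRegistry`, `register`, `search`) are invented for illustration; nothing like this API shipped in iOS at the time.

```python
# Toy sketch of "app-centric search": apps register their data stores
# with the OS, and a single query searches across all of them at once.
# All names here are hypothetical, not a real iOS API.

class AppSearchRegistry:
    def __init__(self):
        self._stores = {}  # app name -> list of indexable text records

    def register(self, app, records):
        """An app opts in by handing the OS a list of text records."""
        self._stores[app] = list(records)

    def search(self, query):
        """Return (app, record) hits across every registered store."""
        q = query.lower()
        return [(app, rec)
                for app, records in self._stores.items()
                for rec in records
                if q in rec.lower()]

registry = AppSearchRegistry()
registry.register("Twitter", ["@stoweboyd: the UX has become an impediment"])
registry.register("Mail", ["Re: drone delivery economics"])
hits = registry.search("drone")
```

The point of the sketch is the inversion: the search layer, not any single app, owns the user's attention, which is exactly the interface shift Battelle is predicting.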
SkyJack - Hacking drones by Samy Kamkar
Nice one. The aerial robo-war above our heads has just begun. Coming soon: free Amazon packages and drone-delivered [pizza, beer, kebab, burrito…] for the urban hacker youth in your neighborhood.
SkyJack is a drone engineered to autonomously seek out, hack, and wirelessly take over other drones within wifi distance, creating an army of zombie drones under your control.
Today Amazon announced they’re planning to use unmanned drones to deliver some packages to customers within five years. Cool! How fun would it be to take over drones carrying Amazon packages… or take over any other drones, and make them my little zombie drones. Awesome.
Using a Parrot AR.Drone 2, a Raspberry Pi, a USB battery, an Alfa AWUS036H wireless transmitter, aircrack-ng, node-ar-drone, node.js, and my SkyJack software, I developed a drone that flies around, seeks the wireless signal of any other drone in the area, forcefully disconnects the wireless connection of the true owner of the target drone, then authenticates with the target drone pretending to be its owner, then feeds commands to it and all other possessed zombie drones at my will.
SkyJack also works when grounded; no drone is necessary on your end for it to work. You can simply run it from your own Linux machine/Raspberry Pi/laptop/etc. and jack drones straight out of the sky.
// yeah, i know. this will not work on amazon’s drones. they will likely be autonomous using GPS. but hey, nice project.
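The first step Kamkar describes — seeking out other drones nearby — works by checking scanned wifi MAC addresses against the vendor prefixes (OUIs, the first three octets) registered to Parrot. A minimal sketch of that detection step, with an illustrative OUI list (the real tool ships a fuller set of Parrot blocks):

```python
# Sketch of SkyJack's target-detection step: decide whether a scanned
# wifi MAC address belongs to a Parrot drone by checking its OUI, the
# vendor-assigned first three octets. OUI list is illustrative only.

PARROT_OUIS = {"90:03:b7", "a0:14:3d", "00:12:1c", "00:26:7e"}

def is_parrot_drone(mac: str) -> bool:
    """Return True if the MAC's vendor prefix matches a Parrot OUI."""
    oui = mac.lower().replace("-", ":")[:8]  # normalize, keep first 3 octets
    return oui in PARROT_OUIS

def find_targets(scanned_macs):
    """Filter a wifi scan down to likely drone targets."""
    return [mac for mac in scanned_macs if is_parrot_drone(mac)]
```

Once a target is found, the actual tool uses aircrack-ng to deauthenticate the real owner and node-ar-drone to issue commands as the new "owner" — steps that need radio hardware and so aren't sketched here.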
[via René from Nerdcore] [SkyJack] [SkyJack on GitHub]
As Jamais Cascio once said, when imagining new technologies, start with how they will be used for crime.
A new era of production has begun. Its principles of organization are as different from those of the industrial era as those of the industrial era were different from the agricultural. The cybernation revolution has been brought about by the combination of the computer and the automated self-regulating machine. This results in a system of almost unlimited productive capacity which requires progressively less human labor. Cybernation is already reorganizing the economic and social system to meet its own needs.
The Ad Hoc Committee on The Triple Revolution, The Triple Revolution, April 6, 1964.
This committee included Linus Pauling (Nobel Prize in Chemistry), Gunnar Myrdal (Nobel Prize in Economics), and a long list of other notables, and the report was written for Lyndon Johnson.
They were right; they just got the timing wrong. It wasn’t the first wave of computers — laid down on top of existing business processes — that is ephemeralizing most human labor. It’s today’s wave, the new explosion of computing scale — in our hands (foreground) and in the cloud (background) — into which we are moving everything, and through which we can monitor and communicate with everything that’s connected: people, sensors, devices, intelligent and dumb apps, and autonomous gear.
Autonomous gear — like Bezos’ Amazon Prime Air demo (see What does the Amazon Prime Air experiment say about the future of work?) — is going to be a major force in sidelining people from things that are ‘work’ today, and will soon be viewed as something more like electricity streaming through the walls of our buildings: a resource like water, or cable.
You wouldn’t imagine having to pay someone to fill the tank of your toilet, would you? Soon that’s how we will think of pizza delivery or taking a cab to the movies.
Any prediction about what is in fact to come, when cast as fiction, runs the risk not just of being wrong but of being not about the future at all.
I scratched my head a little after watching the Amazon Prime Air video (see here), and considered the impact of delivery drones and autonomous vehicles on the future of work:
[…] the big question about drones and autonomous vehicles in general is about the impact on work. Right off the bat, the several million people (mostly men) employed as truck and delivery drivers will be out of a job. Yes, some of them might get work in the Amazon warehouses, but as soon as AI and robots are up to it, those jobs will be gone too.
This won’t be limited to megacorporations like Amazon, although Amazon might be planning to leverage this as an additional industry disruptor, like they’ve done with Amazon’s elastic computing technologies. Imagine a local florist, Bette, in downtown Beacon NY (my home) wanting to make a delivery to a local customer’s home. No longer reliant on Ralph, her former part-time driver, she simply logs into Amazon Prime Air, types in some details, and twenty minutes later a drone touches down in the loading zone outside her store, picks up the flowers for Mrs Johnson, and takes off for North Brett Street.
Of course, her flowers arrive by an autonomous truck three times weekly, and her Samsung Smart Pallet communicates with the truck, gathers her flowers, and brings them to her cold room, without the services of Sheila, her former part-time assistant.
But Ralph and Sheila are off starting microbusinesses, where autonomous vehicles make the economics work.
Read the rest.
The Future of the Book
As someone who made the leap from print to electronic publishing over thirty years ago, people often ask me to expound on the “future of the book.” Frankly, I can’t stand the question, especially when asked simplistically. For starters it needs more specificity. Are we talking 2 years, 10 years or 100 years? And what does the questioner mean by “book” anyway? Are they asking about the evolution of the physical object or its role in the social fabric?
It’s a long story, but over the past thirty years my definition of “book” has undergone a major shift. At the beginning I simply defined a book in terms of its physical nature — paper pages infused with ink, bound into what we know as the codex. But then, in the late 1970s, with the advent of new media technologies, we began to see the possibility of extending the notion of the page to include audio and video, imagining books with audio and video components. To make this work conceptually, we started defining books not in terms of their physical components but how they are used. From this perspective a book isn’t ink on bound paper, but rather “a user-driven medium” where the reader is in complete control of how they access the contents. With laser videodiscs and then cd-roms, users/readers started “reading” motion pictures, transforming the traditionally producer-driven experience — where the user simply sat in a chair with no control of pace or sequence — into a fully user-driven medium.
This definition worked up through the era of the laser videodisc and the cd-rom, but completely fell apart with the rise of the internet. Without an “object” to tie it to, I started to talk about a book as the vehicle humans use to move ideas around time and space.
People often expressed opposition to my freewheeling license with definitions, but I learned to push back, explaining that it may take decades, maybe even a century, for stable new modes of expression and the words to describe them to emerge. For now, I argued, it’s better to continuously redefine “book” until something else clearly takes its place.
Bob Stein, The Future of the Book is the Future of Society
Now that we can easily create, manage, and share chunks of ‘content’ — writing, images, video, commentary, and metadata — digital books and other digital containers for information start to smell a lot alike.
Perhaps a better question is ‘what is the future of…’ for the constituent activities, like the future of poetry, or fiction, or erotic photography. The containers will increasingly be soft copy.
In an era when the coffee table has a touch screen, coffee table ‘books’ will be just renderings of these works on the screen, which can be opened and read on the table, or on a visitor’s personal tablet.
Stein goes on in this essay to discuss this new sort of book as a social object, where readers can participate in public or private communities commenting on a work, such as in classrooms, the workplace, or on the open web.
This is where books blend and intermix with other web-based and digital forms of information, and where our intentions in using these objects become the only consideration that matters, not the historical meaning of a term like ‘book’.