We join spokes together in a wheel, but it is the emptiness of the center hole that makes the wagon move.
We shape clay into a pot, but it is the emptiness inside that holds whatever we want.
Google is taking steps to become the major URL shortener by launching a dedicated website for its service, goo.gl.
This is an obvious attempt to take the high ground in the curious stratum of URL shorteners: a layer of the web made necessary by the rise of microstreaming applications like Twitter, where space limitations put a premium on brevity.
[Disclosure: I am a consultant to Bit.ly, and have a financial interest in the company.]
From Google's announcement:

There are many shorteners out there with great features, so some people may wonder whether the world really needs yet another. As we said late last year, we built goo.gl with a focus on quality. With goo.gl, every time you shorten a URL, you know it will work, it will work fast, and it will keep working. You also know that when you click a goo.gl shortened URL, you’re protected against malware, phishing and spam using the same industry-leading technology we use in search and other products. Since our initial release, we’ve continued to invest in the core quality of the service:
- Stability: We’ve had near 100% uptime since our initial launch, and we’ve worked behind the scenes to make goo.gl even stabler and more robust.
- Security: We’ve added automatic spam detection based on the same type of filtering technology we use in Gmail.
- Speed: We’ve more than doubled our speed in just over nine months.
Bit.ly has been the market leader in this space for some time. Early contenders like cli.gs and tr.im have fallen by the wayside as it became clear that the costs of running a URL-redirection service could be high, and the returns low (or non-existent) without the capacity for large scale and some way to make money off the information lurking in the clickstream. Note that Google’s announcement implicitly plays on the fear that some tiny start-up might not be around to serve the click weeks or months later; Google certainly will.
I was involved in the formation of 301works.org, now a project of the Internet Archive, which is intended to backstop URL shorteners so that if a shortener goes out of business, its clicks can still be served. But even as the director of that effort, I realized it was a stopgap, an interim solution awaiting market maturation. After all, once billions of short URLs are being clicked every day, there is so much value in learning what is being clicked in the aggregate that end users will be presented with a choice of large, stable URL shorteners — like Google, Twitter, and Bit.ly — none of which is likely to go out of business.
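The mechanics behind 301works.org's name are simple: a shortener is, at bottom, a lookup table plus an HTTP 301 (Moved Permanently) redirect, so an escrowed copy of the table is enough to keep a dead shortener's links alive. Here is a minimal sketch using only the Python standard library; the `ARCHIVED_LINKS` mapping and its entries are invented for illustration, not an actual 301works archive:

```python
# Sketch of an archival redirect service in the spirit of 301works.org:
# if a shortener goes out of business, an escrowed mapping of its short
# codes can keep serving its links. Entries are invented examples.
from http.server import BaseHTTPRequestHandler, HTTPServer

ARCHIVED_LINKS = {
    "/8762ji2": "https://example.com/some/very/long/article",
}

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        target = ARCHIVED_LINKS.get(self.path)
        if target:
            # 301 Moved Permanently: the status code 301works is named after.
            self.send_response(301)
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep the sketch quiet

# To serve: HTTPServer(("localhost", 8080), RedirectHandler).serve_forever()
```

A browser hitting an archived short code gets the 301 and follows the `Location` header to the original long URL, exactly as it would have from the original shortener.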
And the clickstream is where the realtime analytics are lurking: who is clicking what, right now. This is what Bit.ly, and now Google, have been aggregating, and it will inform real-time trend analysis in the future.
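As a sketch of what that aggregation involves, here is a minimal sliding-window click counter that answers "what is being clicked right now." The class, the window length, and the event format are my own assumptions for illustration, not Bit.ly's or Google's actual pipeline:

```python
# Minimal sliding-window aggregation over a clickstream: count clicks per
# short URL within the last `window_s` seconds. A hypothetical sketch, not
# any shortener's real analytics backend.
import time
from collections import Counter, deque

class RealtimeClickCounter:
    """Tracks which short URLs are being clicked most, right now."""

    def __init__(self, window_s=60):
        self.window_s = window_s
        self.events = deque()   # (timestamp, short_url) pairs, oldest first
        self.counts = Counter()

    def record(self, short_url, ts=None):
        """Record one click; timestamps may be supplied for testing."""
        ts = time.time() if ts is None else ts
        self.events.append((ts, short_url))
        self.counts[short_url] += 1
        self._expire(ts)

    def _expire(self, now):
        # Drop events that have aged out of the window.
        while self.events and self.events[0][0] < now - self.window_s:
            _, url = self.events.popleft()
            self.counts[url] -= 1
            if self.counts[url] == 0:
                del self.counts[url]

    def trending(self, n=3):
        """The n most-clicked URLs in the current window."""
        return self.counts.most_common(n)
```

A production system would shard this across machines and persist the stream, but the shape of the answer — a ranked list of what the network is clicking at this moment — is the same.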
Many, like Joshua Schachter, have argued that URL shorteners are evil: they introduce overhead into every click, and they increase the likelihood that clicks will fail, since the service that resolves your www.cli.gs/8762ji2 link may no longer be around. 301works.org, goo.gl, and Bit.ly address some of the issues raised in this argument, but not all of them.
But the battle will be fought over speed and analytics. No one, least of all a large publisher or online retailer, wants a huge slowdown on every click. However, conventional analytics backends are designed for very slow aggregation of click data: my Google Analytics account shows me yesterday’s data today, for example, which is completely unusable for publishers or for just-in-time inventory planning.
And of course, Google wants the very best, up-to-the-millisecond information to inform its algorithmic decisions about which ads to place on the page it is sending you to. Not just based on the old sort of clickstream — the series of clicks that leads an individual to some page on the web — but the new sort of clickstream: all the clicks being made by the world, or perhaps better, by the corner of the world that comprises my network. That would inform the best ads, the best search hits, the best user experience for each individual.
Google has added some neat extras:
This last feature is something that goo.gl does not provide today, but I expect it will. In fact, BackType would be a perfect acquisition for Google in its war for the clickstream.