Elsewhere

BackType Acquired By Twitter: More Premium Services?

Sponsored tweets are a marginal business model, and Twitter can’t support its growth just by selling stock and licensing the firehose (especially when companies like Google decide to no longer pay for that license).

Media companies are obsessed with Twitter, since everything important appears there first, and with skillful use, publishers can dramatically increase the click rates on their URLs, micronized for the Twitter stream.

Twitter could build a huge business around developing the right set of tools for media companies (as well as any companies interested in monitoring brands). And it appears that they are moving in that direction, as indicated by the recent purchase of BackType:

BackType blog

We’re thrilled to announce that BackType has been acquired by Twitter! We’ll be bringing our team and technology to Twitter’s platform team, where our focus will be developing tools for Twitter’s publisher partners.

Our vision at BackType has always been to help our customers understand the value of engagement on Twitter and other social platforms. We also created BackTweets to help publishers understand the reach of their tweets and content, who they are reaching, and how Tweets convert to web traffic, sales and other KPIs.

Joining Twitter gives us the opportunity to bring insight to tens of millions of publishers around the world that are using Twitter to communicate and connect with their audience. We’re also excited to bring our technology (especially Storm) to Twitter where it can have a big impact across the company.

What happens to BackType?

The BackTweets product will now be offered to current users for free. However, as we begin to focus on our work at Twitter, we will not be accepting any new registrations for BackTweets, and we will discontinue the BackType product and API services.

I enjoyed BackType in the early days, but couldn’t afford the expense when they closed their free accounts. Now, those who paid for the premium accounts still have access to the service, but that seems to be going away, and the BackType team is likely to be developing new Twitter products for publishers.

They also mention Storm, a real-time processing technology they developed, and how it might be used across the board at Twitter. Their description of Storm is both a bit grandiose and fairly cryptic, but in summary, Storm is a distributed, reliable, and fault-tolerant stream processing system, capable of scaling across parallel hardware without reprogramming.

Considering that Twitter is generating billions of tweets each month, doing any serious analysis of the Twitter stream is fairly compute-intensive. Here’s one use case, based on Storm’s distributed remote procedure call (RPC) system (my clarifications of the technospeak are in brackets):

via BackType

An example of a query that is only possible with distributed RPC is “reach”: computing the number of unique people exposed to a URL on Twitter. To compute reach, you need to get all the people who tweeted the URL, get all the followers of all those people, unique [de-duplicate] that set of followers, and then count the number of uniques [distinct followers]. It’s an intense computation that potentially involves thousands of database calls and tens of millions of follower records. It can take minutes or worse to compute on a single machine. With Storm, you can do every step of the reach computation in parallel and compute reach for any URL in seconds (and less than a second for most URLs).
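To make those steps concrete, here is a minimal single-machine sketch in Python. The helpers get_tweeters(url) and get_followers(user) are hypothetical stand-ins for the database calls the quote mentions; Storm’s contribution is spreading exactly this work (the lookups, the merge, the de-duplication) across a cluster, which the second variant only hints at with a local thread pool.

    # Naive single-machine "reach" computation, plus a locally parallelized
    # variant. get_tweeters(url) and get_followers(user) are hypothetical
    # stand-ins for the database calls described in the quote above.
    from concurrent.futures import ThreadPoolExecutor

    def reach(url, get_tweeters, get_followers):
        exposed = set()                          # de-duplicated set of followers
        for user in get_tweeters(url):           # everyone who tweeted the URL
            exposed.update(get_followers(user))  # gather each tweeter's followers
        return len(exposed)                      # count of unique people exposed

    def reach_parallel(url, get_tweeters, get_followers, workers=8):
        tweeters = get_tweeters(url)
        exposed = set()
        # Fan the follower lookups out to a pool, roughly the way Storm fans
        # them out to many machines, then merge and de-duplicate the results.
        with ThreadPoolExecutor(max_workers=workers) as pool:
            for followers in pool.map(get_followers, tweeters):
                exposed.update(followers)
        return len(exposed)

    if __name__ == "__main__":
        tweets = {"http://example.com/story": ["alice", "bob"]}
        graph = {"alice": ["carol", "dan"], "bob": ["dan", "erin"]}
        tweeters = lambda u: tweets.get(u, [])
        followers = lambda u: graph.get(u, [])
        print(reach("http://example.com/story", tweeters, followers))           # 3
        print(reach_parallel("http://example.com/story", tweeters, followers))  # 3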

This sort of processing is exactly what media companies and brand managers want. The editor of the NY Times would like to know everyone who saw any URL referencing a NY Times story, and what action, if any, they took once they saw it. The cascade of those actions, and the reactions of other, downstream users, are of immense value.

And it may be that Twitter sees BackType’s technology as a way to exploit that value, and the team as a way to scale up to that sort of computational challenge.

goo.gl And The War For The Clickstream

Google is taking steps to become the major URL shortener, by adding a website for their solution, goo.gl.

This is an obvious attempt to take the high ground in the curious stratum of URL shorteners: a layer of the web made necessary by the rise of microstreaming applications like Twitter, where space limitations put a premium on brevity.

[Disclosure: I am a consultant to Bit.ly, and have a financial interest in the company.]

Google URL Shortener Gets a Website

There are many shorteners out there with great features, so some people may wonder whether the world really needs yet another. As we said late last year, we built goo.gl with a focus on quality. With goo.gl, every time you shorten a URL, you know it will work, it will work fast, and it will keep working. You also know that when you click a goo.gl shortened URL, you’re protected against malware, phishing and spam using the same industry-leading technology we use in search and other products. Since our initial release, we’ve continued to invest in the core quality of the service:

  • Stability: We’ve had near 100% uptime since our initial launch, and we’ve worked behind the scenes to make goo.gl even stabler and more robust.
  • Security: We’ve added automatic spam detection based on the same type of filtering technology we use in Gmail.
  • Speed: We’ve more than doubled our speed in just over nine months.

The market leader in the space has been Bit.ly for some time. Early leading options, like cli.gs and tr.im, have fallen by the wayside as it became clear that the costs of managing a URL redirection capability could be high and the returns low (or non-existent) without a capacity for large scale and some way to make money off the information lurking in the clickstream. Note that Google’s announcement implicitly plays to the fear that some tiny start-up might not be around to serve the click weeks or months later; Google certainly will.

I was involved in the formation of 301works.org, now a project of the Internet Archive, intended to backstop URL shorteners so that if a shortener goes out of business its links can still be served up. But even as the director of that effort, I realized it was a stopgap, an interim solution, awaiting market maturation. After all, once billions of short URLs are being clicked each day, there is so much value in learning what is being clicked in the aggregate that end users will be presented with a choice of large, stable URL shorteners, like Google, Twitter, and Bit.ly, none of which are likely to go out of business.

And the clickstream is where the real-time analytics are lurking: who is clicking what right now. This is what Bit.ly, and now Google, have been aggregating, and it will inform real-time trend analysis in the future.

Many, like Joshua Schachter, have said that URL shorteners are evil: they introduce overhead into every click, and they increase the likelihood that a click might fail, because the service resolving your www.cli.gs/8762ji2 link may no longer be around. 301works.org, goo.gl, and Bit.ly solve some of the issues in this argument, but not all of them.
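For a sense of what “serving the click” actually involves, here is a minimal sketch using Python’s standard library: look up the short code, record the click, and answer with a 301 redirect. The in-memory LINKS table and CLICKS list are hypothetical stand-ins for the datastore and clickstream pipeline a real shortener would need.

    # Minimal sketch of a URL shortener's redirect path: look up the short
    # code, log the click, and answer with an HTTP 301 permanent redirect.
    # LINKS and CLICKS are hypothetical in-memory stand-ins for a real
    # datastore and clickstream pipeline.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    import time

    LINKS = {"/8762ji2": "http://www.example.com/a/very/long/article-url"}
    CLICKS = []  # each entry: (timestamp, short path) -- the clickstream

    class Redirector(BaseHTTPRequestHandler):
        def do_GET(self):
            target = LINKS.get(self.path)
            if target is None:
                self.send_error(404, "Unknown short URL")
                return
            CLICKS.append((time.time(), self.path))  # record who clicked what, right now
            self.send_response(301)                  # permanent redirect
            self.send_header("Location", target)
            self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8080), Redirector).serve_forever()

The overhead the critics object to is that extra lookup and round trip on every click; the value Bit.ly and Google are after sits in the click log.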

But the battle will be fought over speed and analytics. No one, especially a large publisher or online retailer, wants a huge slowdown on every click. And most analytics back ends are designed to do very slow aggregation of click data: my Google Analytics account shows me yesterday’s data today, for example, which is completely unusable for publishers or for just-in-time inventory planning.

And of course, Google wants the very best, up-to-the-millisecond information to inform its algorithmic decisions about what ads to place on the page it is sending you to. Not just based on the old sort of clickstream (the series of clicks made by an individual that leads her to some page on the web) but on the new sort of clickstream: all the clicks being made by the world, or, better, by the corner of the world that comprises my network. That would inform the best ads, the best search hits, the best user experience for each individual.

Google has added some neat extras:

via www.twitter.com/mattcutts

Secret goo.gl easter egg: take a link like http://goo.gl/LFwS and add “.qr” to get a QR code! See http://goo.gl/LFwS.qr

There is also the question of the UX in general, and I see that goo.gl has adopted one of the best features from Droplr, a tool I use every day: when I create a goo.gl URL, the application places the URL in my Mac clipboard, so I can paste it wherever I’d like. Bit.ly supports edited posting to Twitter, but I find myself working around that a lot of the time nowadays. Here on stoweboyd.com and on my Underpaid Genius blog, for example, I have integrated BackType javascript so users can see the number of references to each post, and clicking on those figures opens the corresponding page on BackType, showing the Twitter, FriendFeed, Facebook, etc., references.
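The clipboard trick mentioned above is easy to reproduce; a small sketch, assuming macOS’s pbcopy utility and a hypothetical shorten() call in place of a real shortener API:

    # Sketch of the Droplr/goo.gl behavior described above: shorten a URL,
    # then place the result on the Mac clipboard so it can be pasted anywhere.
    # shorten() is a hypothetical placeholder, not a real shortener API.
    import subprocess

    def copy_to_clipboard(text):
        # pbcopy reads stdin and puts it on the macOS clipboard
        subprocess.run(["pbcopy"], input=text.encode("utf-8"), check=True)

    def shorten(long_url):
        return "http://goo.gl/placeholder"  # stand-in for a real API call

    if __name__ == "__main__":
        short = shorten("http://stoweboyd.com/some-long-post-url")
        copy_to_clipboard(short)
        print("Copied to clipboard:", short)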

Those reference counts are something that goo.gl does not provide today, but I expect it will. In fact, BackType would be a perfect acquisition for Google in its war for the clickstream.
