TV networks’ dominance of the delivery of TV content is rapidly collapsing, as alternatives expand and people build up their libraries:
Primetime Mystery: Where Did All the TV Viewers Go? - Derek Thompson via The Atlantic
The networks’ share of primetime TV audience (dark blue in the graph below [above in this post]) has declined from 45% in 1985 to 25% in 2009. Basic cable ate the networks’ lunch, and now it’s technology’s turn to gobble up what’s left.
Even with this long trend line (and despite the fact that viewers often unplug in the spring), there is a sense that we’ve reached a tipping point thanks to what Gaspin calls “built-up libraries.” There is more good stuff to watch not-on-live-TV than on live-TV, and even the head of entertainment at NBC knows it. Television technologies are dragging us away from live television, to a world of smaller screens, shifting “windows,” and no more ads.

In 2000, a company called Netflix was experimenting with movie rentals. Now they have more than 20 million streaming customers. In 2005, about 1% of households owned DVRs. Today, it’s more than 40%. In 2006, Hulu didn’t exist. Today it has just under 30 million monthly uniques, with more than 1 million paying subscribers. In 2009, there were no iPads. Today, there are 60 million, and most of them are in the United States. That’s a Cambrian explosion of options for “watching TV” without literally watching an actual TV.
So people are ‘watching TV’ but not watching network programming in real time: they have defected from the ‘appointment TV’ model, or from broadcast and networks as the delivery mechanism for TV media.
PS The DVR is a strange intermediary technology, one that foreshadowed keeping your TV shows in the cloud. (Apple’s iTunes in the cloud is poised to destroy the market for DVR devices.)
(h/t emergent futures)
We are living in a time-shifted world — at least on TV — and networks now count DVR data when calculating the most popular shows. But advertisers are still not willing to pay for anything after the first three days:
DVRs and Streaming Prompt a Shift in the Top-Rated TV Shows - Bill Carter and Brian Stelter via NYTimes.com
Total popularity does not perfectly correlate with profitability, however, since the networks all agree to sell ad time based on a metric called “C3.” It measures the average viewing of the commercials within a show within three days of the first broadcast, so it excludes people who wait to watch Wednesday’s “Modern Family” until Sunday or Monday.
Still, advertisers are paying, happily so, for the three days that are counted.
“We do like viewing in the playback mode,” said Tim Spengler, the global chief executive of the media-buying firm Magna Global. “We’re finding that the viewers are more attentive. They are less distracted. They have picked a time when they have the opportunity for more engagement than they would have if their kids were bugging them or they had three things to do at once.”
Mr. Spengler said many advertisers, like fast food restaurants, movie companies and some retailers, do not want to pay for ads beyond three days because what they have offered might be out of date. But, he said, other advertisers recognize there is “some value” to the four additional days of viewing that are not counted by C3 — even among fast-forwarders, because they do see glimpses of messages here and there.
The networks would eventually like to sell ad time based on seven days of viewership, but most viewership happens in the three-day window; Paul Lee, the president of ABC Entertainment, said ABC is able to “capture about 93 percent” of the value of the “Modern Family” audience with the C3 ratings.
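The C3-versus-seven-day comparison above is just window arithmetic: what fraction of a show’s total viewing falls within three days of first broadcast? A minimal sketch, using entirely invented day-by-day audience numbers (the real Nielsen C3 metric averages commercial-minute viewing, which is more involved than this):

```python
def share_within(daily_views, days_after):
    """Share of total viewing that occurs by `days_after` days
    past first broadcast (day 0 = live airing)."""
    return sum(daily_views[:days_after + 1]) / sum(daily_views)

# Hypothetical audience by day, in millions; day 0 is the live broadcast.
daily_views = [10.0, 1.5, 0.8, 0.4, 0.2, 0.1, 0.05, 0.03]

c3_share = share_within(daily_views, 3)  # live + 3 days, the window ads are sold on
c7_share = share_within(daily_views, 7)  # the 7-day window networks would prefer
print(f"C3 captures {c3_share:.0%} of viewing; C7 captures {c7_share:.0%}")
```

With a steeply front-loaded curve like this one, the three-day window already captures the vast majority of viewing, which is the shape of the argument ABC’s Paul Lee is making about “Modern Family.”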
Ultimately, the long tail for shows — and ads — will stretch out past three days, and advertisers will pay for what they actually get. But DVR data might prove less important than second screen data: an ad going by on the TV has no impact on someone who is busy typing in a chat about the game he is watching.