It has been an important tenet of my thinking about digital change in the book business to understand that books are different from other media — music, TV, movies, newspapers, magazines — as we try to anticipate the future.
I’ve long recognized two big structural distinctions: the “unit of appreciation, unit of sale” paradigm and the dependence of some of the other media on advertising. (Movies are the least different from books in these ways. The biggest difference between books and movies is the much larger number of people and dollars necessary to deliver most movies than to deliver most books.)
I have just seen a 3-part series (number one; number two; and number three) by a digital thinker named Steve Gray — the Director of Strategy and Innovation for Morris Communications Company — that, although it is newspaper-centric (as Gray’s background and employer are), contains insight that is useful across media, including books. But the more I think about the very smart things in Gray’s posts, the more it reinforces that the lessons of digital change are not necessarily the same for general trade books as they are for other media.
The history that Gray reviews is familiar to most of us. But while book publishing people tend to focus on the changes enabled by Gutenberg, Gray’s newspaper-centric view makes the high-speed rotary press, which enabled publications cheap enough to be daily purchases by masses of people, the seminal moment.
High-speed presses made all print cheap for the incremental copy. In the case of radio and television, of course, the incremental copy is free. So all these media, as well as movies, which used scale in a slightly different way, were about amortizing the costs of content creation across “mass market” consumption.
If Karl Marx had been writing a bit later than he did, he might have seen that controlling the “means of distribution” had become as important as he saw controlling the “means of production” to be.
For newspapers, magazines, TV, and radio, the ability to deliver expensive-to-produce content to mass audiences that really craved it created huge advertising revenues. As Gray documents, this led to the aggregating, the bundling, the combining of material into packages that were efficient for the advertising medium to distribute. This was the most efficient “unit of sale” that could be delivered for a period of about 150 years, from the 1850s until just a few years ago.
And that’s what the Internet has blown up. Because now the distribution mechanism for expensive-to-create content is precisely the same as the distribution mechanism for any content. In the book business, we’ve been tracking that as “purchased in stores” (which is, in itself, expensive and pretty much restricted to expensive-to-create content) as opposed to “purchased online” (which is a channel open to all of us).
Gray calls this a change from the “mass media era” to the “infinite media era”.
But as Gray continues his analysis, he presents what looks to me at first glance like a contradiction of the proposition that audiences are splintering, visible in the charts of where web traffic goes. In fact, at the domain level, the tendency toward concentration shows no signs of abating. In a pie chart from one of the markets in which his company has a newspaper, Gray shows how visits divide among the top 70% of the traffic, before you hit the “long tail”. Well more than half the site visits in that top 70% go to three domains: Facebook, Google, and YouTube. Add in Yahoo, Yahoo Search, and Bing and you’ve covered over 75% of that traffic.
Obviously, the local newspaper’s share is tiny.
Within the aggregated traffic of the big domains, of course, the apparent anomaly gives way and interests splinter (and, in fact, a newspaper might have some of the traffic that is counted as “Facebook”, although it wouldn’t have much power to monetize it). Many of the, let’s say, ten thousand people on the NY Times web site will view the same content. Ten thousand people on Facebook might not overlap at all; ten thousand people searching Google or YouTube might not contain repeats either. These sites have figured out how to aggregate and display a vast amount of (user- or algorithm-generated) content. Curated aggregators like newspapers or radio stations simply can’t compete.
As Gray points out, the tendency of the curated aggregation sites is to compare themselves to each other. If the newspaper in a town is generating more traffic than the biggest radio station, they might declare victory. And if that really defined their competitors for audience or advertising dollars, that comparison would be sufficient and valid.
But Gray also makes it clear that the advertisers the newspaper or radio station might pursue are going to increasingly find locally-effective alternatives from the global domains. And the great hope that local news can be the killer content that keeps people loyal to their legacy providers doesn’t get much support from Gray. What he sees in the stats is that people find the “news” of their social circle, which is what they get from Facebook, far more compelling than the “news” of their local area, which is what they get from their local paper. And the former can lead to the latter but rarely vice-versa.
It is interesting, though, that Gray’s punch line, which (if true) is a knockout blow to newspapers, actually contains some rays of hope for big book publishers that can operate at scale.
He sees five key points to consider:
1. The mass media’s digital advertising must compete with vast inventories of low-priced space on millions of websites.
2. Mass media content is now just a drop in an infinite ocean.
3. Digital audiences for local mass media websites are dwarfed by those of national digital players that meet more individualized needs and interests.
4. Social media are unlocking an incredibly vast desire and capacity among humans to get and give personally relevant information.
5. Digital targeting is providing the tools to reach people across thousands of websites and billions of small networks.
Gray warns radio and TV broadcast media not to be smug about what has happened to newspapers, because the digital tools keep getting better and they’ll be disrupted too.
(Personally, I just ordered my first “Internet TV”, which will put YouTube or Netflix on 52 diagonal inches of real estate just as easily as CBS or MTV. I can’t believe that will increase my time spent with broadcast or cable media. And it is just as obvious that TV that can get lots of programming from a Wi-Fi connection is going to be attractive to consumers and threatening to cable TV economics. This will become standard.)
But that micro-targeting might affect newspapers and magazines and radio and TV stations far differently than it affects book publishers. And that’s because, when it comes to advertising, book publishers are, in a way, on the opposite side of the fence from these other media.
Those media don’t build an audience uniquely for every issue the way book publishers do for every new book (and that’s somewhat true even for vertical publishers). They’re trying to sell captive audiences; we in book publishing are trying to corral disparate audiences. That makes us more like the newspapers’ advertisers than like the newspapers themselves.
When I had the chance to bring Obama’s digital director, Teddy Goff, to the stage at Digital Book World, I did it because I thought the micro-targeting techniques they practiced during the presidential campaign had something to teach us. He made a strong impression on me, and probably on many in the DBW audience, when he spelled out that the Obama team figured out very early that they could reach every voter they needed to target in America through the people on Facebook who were already in their camp.
Not only did the friends of their supporters include all the targets; many of those targets couldn’t be reached effectively through any other means.
As the digital revolution proceeds, we each build out our social graphs. We show up on different sites making our interests public. Whether we sign up for alerts from a publisher or not, aggregating data from Facebook and way beyond (certainly including GoodReads, which for many publishers might be as rich a lode of targeting information as the much bigger Facebook or LinkedIn are) will build databases of cataloged consumers (that’s us) that Steve Gray sees advertisers using as a much cheaper substitute for paying for real estate on a newspaper’s web site.
That’s an existential threat to ad-supported media of any kind.
But it might be the salvation of general trade publishing, if one or more of the players can master the skills and build the information repository and tools fast enough.
But there’s still that very pesky “infinite” competition from smaller players — authors and publishers — that will peel off some book readers whether they have effective techniques to build large audiences or not. Of course, that hazard could also become opportunity. If one or two (I doubt five or six) big publishers develop these scale capabilities, they might have a compelling case to make to the owners of the smaller- or self-published titles even when the current compelling case — we can put you into bookstores — loses its appeal.
Where general trade publishing will be in another five or ten years is anything but clear.