New Models

Conferences are thermometers recording the level of fear about publishing changes


In the latest sign that the need for information about digital change in publishing has undergone a sea change in the past few years, it was announced today that Nielsen will not stage an independent conference in London this April, but will instead join forces with the London Book Fair to do an event there in March.

This reminds me that the best salesperson I ever worked with had a mantra 40 years ago that has proven true over and over again. “I never sell with logic,” he said, “unless I find no way to sell with fear.” Nothing demonstrates that more clearly than the rapid ups, and now the apparent downs, of the digital change conference business in our industry.

We had done conferences about digital change that were intended to be profitable in the early 1990s, in conjunction with other consultants and sponsored by industry publications, first Publishers Weekly and then Knowledge Industry Publications. (KIPI published the newsletter BP Reports, back when you could make money selling printed newsletters!) Then we worked on a series of conferences sponsored by VISTA Computer Services, now part of Ingenta. Those were free events which VISTA staged to promote their services. Before the enterprise giant SAP entered the publishing space in the late 1990s, VISTA was the dominant provider of enterprise software to book publishers in the US and UK. They decided they needed to learn about digital change, driven by fear.

As their then-chairman Denis Bennett said at the time, “we sell software to help publishers keep track of books in warehouses. What if there are no books? What if there are no warehouses?” He decided his customers needed to explore the same questions, so he funded a team led by Mark Bide of the UK and me to do research on digital change. First the findings benefited VISTA’s strategic planning and then they were turned into conference presentations to help publishers.

Meanwhile, Amazon grew, Barnes & Noble — first with Bertelsmann and then on their own — competed for online sales, and ebooks reared their heads through initiatives by Sony, Palm, and Microsoft. It became evident to many people that the industry might change a lot. And the era of digital conferences throughout the publishing calendar began. We did a conference at Frankfurt in conjunction with the Frankfurt Book Fair in 2001. Soon every industry gathering had to have some sort of digital show. Tim O’Reilly, a publisher of computer books, launched the Tools of Change conference in 2007. It was geeky, but it opened the door to discussing business change, not just tech change.

Then in 2009, David Nussbaum and Sara Domville of F+W Media conceived Digital Book World and recruited me, and then Michael Cader of Publishers Lunch, to program and market it. That began a run of seven years for us that traced a bit of a bell curve: the first few years we were up, and in the last few it became increasingly difficult to maintain the level of success we’d reached.

And that was because publishers lost the fear. This was for a variety of reasons. One is all to their credit: they hired in people who knew digital even if they didn’t (yet) know publishing. But it was also that circumstances changed. The surge in ebook sales taking share from print slowed down, then apparently stopped. New marketing procedures, still driven by major accounts but also now using new tools like NetGalley and ever-improving techniques and software assistance to find the right keywords for discovery, were developed to address the new marketplace.

What had been a disruptive and frightening pace of change became a much slower boil. As the metaphorical frog in slowly heating water demonstrates, not feeling a change doesn’t mean one isn’t happening. But feeling the change drove the fear, and fear drove the need for education and validation.

Now the challenges are more subtle. Amazon is past 50 percent of sales for many publishers. That comprises the lion’s share of online print sales and almost as much of the ebook sales. Not only does Amazon have a multiple of the biggest share of the book business any prior account ever achieved, they aren’t shy about using their clout to claw back margin.

(Old joke of mine from a few years ago, but still true. “Amazon is every publisher’s most profitable account. That was never their intention and they’ll be inclined to change that fact as fast as they can.”)

If Amazon has consolidated the path to reaching half the US domestic market, Ingram has done very much the same thing for the global market. But while Amazon’s build-out has been largely at the expense of an ecosystem US publishers already reached (the most notable fatality being the Borders chain, which expired in 2011), Ingram has created a market expansion for many publishers by providing ready access to offshore sales opportunities that were previously very hard to access.

Global marketing channels — which is to say any use of the Internet — and the ubiquitous use of English mean that the potential customer base for English-language books extends far beyond what US (or UK or any other English-language) publishers consider their home territory. Ingram has long been a supplier to bookstores and libraries all over the world. They distribute ebooks globally. They’ve complemented their capabilities with a growing print-on-demand network, making it even more efficient for them to put the books of their vending publishers anywhere there’s demand.

So, in 2016 publishers can literally reach most of the customers in the world through two intermediaries, Amazon and Ingram. Obviously, a publisher who calls on stores locally and around the world will stimulate sales that the best relationship in the world with Ingram can’t deliver entirely on its own. It still definitely “pays” for a publisher to push to get books in stores in the US and around the world on their own. And it is likely that books on display and selling in brick-and-mortar stores in the US and elsewhere actually stimulate sales at Amazon as well. But a publisher with no more organization than relationships with Amazon, Ingram, and a talented digital marketing team can publish successfully in today’s world.

One example is Diversion Publishing Corp., created by Scott Waxman and composed of Diversion Books, EverAfter Romance, and Radius Book Group, which has developed a real business and marketplace presence working closely with Ingram’s organization for its brick-and-mortar distribution reach and Amazon for online sales. And it is worth noting that O’Reilly Media is among the companies that have for years reduced their fixed overhead by publishing their books leaning on Ingram.

The fear that is left in the marketplace can’t really be addressed by a conference. That would be the fear many publishers have about what Amazon will do to claw back margin in the future. Probably the greatest comfort publishers have is that Amazon will be “even-handed” about it, looking for pretty much the same concessions from everybody. At the same time, there is fear around the other biggest domestic account for most publishers, Barnes & Noble, which is experimenting with ways to turn around its declining business but also has its hand out looking for more support. New independent stores continue to open, but others also close and, anyway, all of the indies apparently amount to about 8 percent of the total business, according to anecdata provided to us by a number of publishers.

So Amazon and Ingram are, for different reasons, the accounts in a publisher’s account base that can most reliably be counted on to grow. Increasing publishers’ understanding of how to make the most of the opportunities those two accounts present is the most important commercial task for the foreseeable future. That should be driven both by logic and by fear.

Maximizing Your Potential With Amazon and Ingram. Now, THAT is a good topic for a conference!


Newspaper publishers face very different and much more immediate threats than book publishers


The business news has been very painful for newspapers lately. A piece we saw a couple of days ago says both the New York Times and the Wall Street Journal are going to cut back sharply on their arts coverage. The advertising simply isn’t there to support it.

And shortly before that, we read a piece suggesting that perhaps newspapers should have just ignored the whole digital thing (a frighteningly obtuse suggestion) and then, right afterwards, a Times story documenting the collapse of advertising dollars available for print, which pretty much obviates the “just skip digital” idea. (One wonders if the people advocating that solution are unaware that advertisers regularly review their overall ad budgets and routinely reallocate them to put more into digital and less into print! This is not a “secret” trend.)

I have been sharpening my understanding of book publishing economics with real experience for 50 years. My view of periodical media is purely as a consumer and fan, but a very longtime consumer and fan. I got the first magazine subscription of my own 60 years ago when my parents gave me my wish of a Sports Illustrated sub for my 9th birthday. Shortly thereafter, I talked them into getting The New York Times by home delivery and that started my daily habit of looking at the front page and the box scores before breakfast. When I went to UCLA, the first thing I did was get home delivery of the Los Angeles Times.

At one point or another, I got subscriptions of my own to Newsweek, Time, and US News & World Report. And Business Week. And Sport. And The Sporting News. And the Nation, the New Republic, and National Review. And New York and the New Yorker and, for a while, New West! When there was a sports daily called The National, I bought it at my newsstand along with the News and Newsday.

But that’s all done now. I have two print subscriptions left: The New York Times and The New Yorker. I have recently found that their online prompts through emails and digests have led me to read most of what the print edition offers on my phone before the print edition arrives! (Still, I have no plans to cancel either because digital-only isn’t that much cheaper and I still get a bit of value out of the print.)

While I think the book business still has years of viability in front of it, I can’t see a way to sustain the periodicals. It isn’t just about consumption in print versus consumption in digital. There are two massive differences between the businesses.

1. Newspapers (and magazines) depend on advertising in their business model; book publishers don’t.

2. Newspapers (and magazines) are aggregates of content, while many books are themselves a single unit of content. You can get the box scores or the weather or the national news headlines from a variety of places, no matter how unique or distinctive the other parts of the newspaper you buy may be. You wouldn’t find an acceptable substitute for the sixth chapter of a novel you’re reading.

The first point was one the periodical publishers could figure they had covered, because their online offering would pick up digital advertising for support as print declined. They could see that digital would reach a much bigger audience than print ever could. So there was room for optimism that moving the whole kit-and-caboodle to the web could work synergistically with the print edition. Of course, we’ve learned since that each recorded eyeball earns less online than each theoretical eyeball did when it was delivered by the print edition. And then there’s the way Google and Facebook have swept up all the online ad dollars.

But that wasn’t the bigger problem. The other one was.

The aggregation created by each newspaper was intended to compete as an aggregation. In the 20th century, a consumer didn’t have the choice of reading the Times’s op-ed page and the News’s coverage of the Yankees unless s/he bought both newspapers. But, even with the paywalls that are up today (and weren’t up at the beginning), there is a lot of competition for almost every single piece of content in every newspaper. And it has also become just about impossible for a printed newspaper to deliver you any “surprises”: any news that is important to you personally that you won’t have seen first in an email or an online aggregator (including that newspaper’s own web site). Any “scoop” will be “reported” by competitors, and the information itself will be in the public domain very quickly after it is released.

So, the fundamental distinction between the businesses is that publishers often sell an indivisible unit and newspapers (and magazines) sell aggregates of content nuggets, each of which is valued differently by different readers of the paper and each of which has its own array of competitors.

It occurred to me more than six years ago that there would be trouble when the consumer’s “unit of appreciation” did not equal the publisher’s “unit of sale”. The most dramatic example was offered by the recorded music business which sold albums (unit of sale) to satisfy its market’s appreciation for songs (unit of appreciation).

Both the “whole” newspaper and the record album made sense in a physical world. It would simply not be practical for the newspaper to deliver recipes and box scores on your lawn and national news and TV listings on mine. Record companies “stamped” records and CDs, and it cost them approximately the same whether the disc they gave you carried two songs or twelve. Both business models were built on aggregations at a time when physical requirements made the aggregations sensible and the consumer readily went along with them.

Book publishers certainly have serious challenges in front of them. In the short run, they are learning that novels work better as both print and digital products than cookbooks (where the unit of individual content appreciated is the recipe, although for the printed version there are rewards in the entire presentation). They are dealing with consolidation on the distribution side which threatens their margins at the same time that increased competition from indies forces down retail prices. There is reason to believe that long-form reading itself may diminish as our attention spans are increasingly shaped by mobile consumption with many built-in distractions. The commercial book business is already shrinking and it will continue to do so. But the core business model by which publishers acquire units of content, develop and refine them, and then market and distribute them, is currently only eroding. The advertising-based model for printed newspapers and magazines appears to be collapsing.

Newspapers and magazines have already diminished sharply and now seem to be doing so at an accelerating rate. I don’t see any way to save the status quo ante. Newspapers and magazines aren’t “coming back”. But I do have a few thoughts that might offer the current owners of, and workers in, an anachronistic paradigm some insight into how to focus most effectively on the challenges they face.

1. Stop thinking about the overall business — The New York Times or Business Week or Sports Illustrated — and start thinking about the individual pieces of content created on a regular basis and the competition for each piece.

2. That leads pretty quickly to recognizing what’s “unique”. In the case of the Times, that’s likely to be the in-depth reporting by very solid intellects. I’d be figuring out how to broaden and deepen that and sell it in segments. There should be a global audience for Times reporting on Washington, on New York City, and on finance, for example. These should be built out as subscription products. It is even possible that foundation or grant money would support them as well as subscribers.

3. Most local newspapers’ biggest unique asset is their local reporting. It is also what is most in society’s interest to maintain. But they really add very little value with a lot of their content, much of which comes from wire services and syndicators. Getting local support for local reporting will be harder in some places, easier in others. But building a local subscription base for local news coverage has shown signs of being a sustainable model; continuing to sell a locally-branded and -curated aggregation of information ubiquitously available elsewhere clearly is not.

This last point makes it clear that I’ve elided one key reason for the shrinkage of the newspaper business. It no longer requires a local production-and-delivery system to hand you most of what that local production-and-delivery system sold — usually with just a smattering of local customization — for the past 100 or 150 years.

My friend Michael Cairns saw the local angle years ago, and well before that he figured that The New York Times would be selling tech to other newspapers to enable better local reporting. Too bad that didn’t happen; it could have been (and perhaps still could be) a great revenue stream for the Times.

There is developing public concern for the loss of journalism through the loss of newspapers. Contributions support the efforts of ProPublica, for example, although their stories may come to light through conventional vehicles like the New York Daily News. It has been suggested in serious circles that government should support independent journalism.

Each large and (historically) successful newspaper is a large business on a one-way path to oblivion. Within each of them, though, are a number of seeds for smaller businesses that might survive and thrive in an environment where they weren’t shackled to large overheads performing redundant services.

So while newspapers and magazines should continue to pursue events and any ecommerce opportunities they see, they should also recognize that they are riding on a seriously dated business model. If there’s still cash to extract from it, that’s fine. But it is like a mine that has been worked for years or a machine designed for years of use that has now performed for decades. Efficiency will continue to decline and eventually it won’t be commercially productive at all. Lots of perfectly competent and capable blacksmiths couldn’t adjust to a world that needed fewer of them and more auto mechanics.

The obvious exceptions to both my experience and my generalizations are the most niche-y specialist magazines, although the bigger they are, the more they are subject to the problems I’m describing. But the more focused their audience’s interests, the more likely their advertisers are to be loyal and the more that other things like events might add measurable revenue. The critical mass requirements for print remain a challenge, but the chances of converting to a workable web-only proposition might be better if there’s a strong brand.

And intensely local works just like niche-y. It is ironic that many local newspapers have cut back on local coverage, which is precisely what can work for them commercially, because the most local advertisers (restaurants, health clubs, local election campaigns) don’t have a lot of other choices. And local coverage can attract other support.


The latest marketplace data would seem to say publishers are as strong as ever


I began writing this post a couple of weeks ago, when I recalled some specific misplaced expectations I had for the self-publishing revolution and started to ponder why things happened the way they did in recent years. It turns out a big part of the answer I was looking for provides clarity that extends far beyond my original question.

For a period of a few years that probably ended two or three years ago, we saw individual authors regularly crashing bestseller lists with self-published works. Some, like Amanda Hocking, parlayed their bootstrap efforts into significant publishing contracts. Others, like Hugh Howey, focused on building their own little enterprise and tried to use the publishing establishment for what it could do that a self-publisher couldn’t. (In what was certainly a very rare arrangement of this kind with a major indie author, Howey made a print-only deal for his bestseller, “Wool”, with Simon & Schuster. And he made foreign territory and language deals and Hollywood deals as well.) And we know that there were, and are, a slew of indie authors who self-publish through Amazon and don’t even bother to buy ISBNs to get universal distribution under a single title identifier, effectively keeping their books out of bookstores.

All of this was enabled by three big changes to the historical book publishing and distribution ecosystem. One was the rise of ebooks, which simplified the challenge of putting book content into distributable form and getting it into the hands of consumers. The second was the near-perfection of print on demand technology, which enabled even print books to be offered with neither a significant investment in inventory nor the need for a warehouse to store it. And the third was the increased concentration of sales at a single retailer, Amazon. Between print and digital editions, Amazon sells half or more of the units on many titles and, indeed, may be approaching half the retail sales overall for the US industry.

(This is very hard to measure or even to get reliable anecdata for. Amazon sells globally. Indeed, one of its great contributions to publishers is pretty seamlessly enabling them to reach export markets through a domestic supplier. But it also means that publishers can tend to see all Amazon sales as “domestic”, even when they’re not. US publishers often tell us that half their sales come from Amazon, but how much of that volume goes to offshore accounts is not consciously backed out of the numbers.)

What the rush of indie bestsellers told us a few years ago was that things had changed to the point that a single person with a computer could achieve sales numbers that would please a big corporation going after sales with the tools provided by tons of overhead: careful curation and development, sophisticated production capabilities, teams of marketers and publicists, legions of sales people, and acres of warehouse space. This had not been possible before ebooks. And the market reach of the amateur publisher was extended even further as Amazon’s share of print sales surged as a direct result of retail shelf space declining with Borders’s passing and Barnes & Noble’s shrinkage.

For a period of time that was relatively brief and which now has passed, agents and publishers worried that self-publishing could be appealing to authors they’d want in their ecosystem. The author’s share of the consumer dollar is much higher through self-publishing. And the idea of “control” is very appealing, even if the responsibility that goes with it is real and sometimes onerous.

So, I warned with what felt like prescience, entity self-publishing might present an even greater threat to publishers than independent authors would.

I was thinking about the scale value that publishers brought to producing revenue for books. Historically, that had been about capabilities that only a book publisher would have at its disposal, the tools we referred to earlier. With Ingram then adding a turnkey service called “Spark”, which reaches the half of the US market not delivered by Amazon, provides access to other ebook retailers wherever they are, and enables print sales around the world, a publisher could “rent” all the infrastructure it would need to reach all the audience there is with two stops: Amazon and Ingram.

The entities that I had my eye on from the book publishers’ perspective were those already in the print content business: newspapers and magazines. They all start out with assets that would seem to lend themselves to creating and promoting books. They have access to a vast number of writers, on staff and through work-for-hire arrangements. They have editors on staff as well as the knowledge of how to find and hire more for projects. They have direct online access to a large number of consumers, including the opportunity to know their interests in a very granular way. They have advertisers who could be useful for promoting books or even buying them in bulk.

But despite the fact that there was, indeed, a slew of activity 2-to-4 years ago from a variety of non-book publishing content entities to get into ebooks, there have been no apparent breakthroughs. Nobody has cracked the code. Nobody who is not a book publisher has used the rent-a-scale capabilities to build a sustained book business.

It is not that many haven’t tried, or are still trying. Among those who have been or are still in the game are The New York Times, The Washington Post, The Guardian, The Atlantic, The Huffington Post, NBC, the Minneapolis Star-Tribune, and The Boston Globe. They have sometimes worked in conjunction with digital start-ups. For example, the New York Times worked with Vook (now called Pronoun and acquired earlier this year by Macmillan) and Byliner, whose original proposition was “short ebooks”.

There have been a variety of approaches to create the content. Sometimes these publications and websites have recycled their own material or used internal resources. The Boston Globe did an insta-book on Whitey Bulger and some on Boston sports teams, as well as creating a book of photos of Boston that had already run in the paper. (The Boston Bruins’ Stanley Cup championship was commemorated in a book delivered both in print and digital days after they won.) The Star-Tribune used internal staff to execute the mechanics of delivering ebooks. The Boston Globe’s Bulger book, published by Norton in print, showed them that they could do the ebook work themselves.

Obviously, the idea of book programs using magazine brands is not new with the digital age. Decades ago, Hearst, Rodale, and Meredith were all big magazine companies committed to real book programs, which is what it took either to support the infrastructure themselves or to form a close relationship with a publisher who could provide it. Hearst has had a robust book program for a long time because they once owned the book publishers Morrow and Avon. When those houses were sold in the mid-1990s, management saw virtue in maintaining the book program, so they teamed up with Sterling Publishing for everything from help creating the content to all the scaled book publishing functions. The relationship continues to this day, although Hearst also licenses other projects to other publishers. Rodale remains active in both books and magazines, with their own organization doing the books. And Meredith temporarily moved its book program from “independent” to the publisher John Wiley. It is now a shadow of its former self.

Even in the simplified age we’re in now, leaning on a publisher with all the pieces in place can be a way to tackle the challenge of having an adequate infrastructure for books. I am currently reading a “Washington Post” ebook on climate change that was published in conjunction with Diversion Books, a digital-first publisher created by literary agent Scott Waxman during the height of the indie publishing ebook fever.

But searching for a surge in this kind of activity generated by the digital revolution consistently takes us back two to four years. In 2012, Random House partnered with the website Politico to deliver four ebooks on the 2012 presidential race. We’re not aware of anything similar taking place this year. The Minneapolis Star-Tribune was pushing their ebook initiative in 2013. The Boston Globe got into the game in 2011. The Times did a story in 2011 about the phenomenon, which covered a Vanity Fair ebook of collected articles about Rupert Murdoch and News Corporation when they were caught up in a scandal. Graydon Carter, the editor of Vanity Fair, loved the whole idea, particularly the idea of publishing articles which had already been fact-checked and copy-edited. “It’s like having a loose-leaf binder and shoving new pages into it.”

The Byliner collaboration with the New York Times was first reported in 2012, and the Times started their initiative with Vook almost simultaneously. At the same time, programs were being announced in the UK by the Guardian and the Financial Times.

All of that inspired the pundit in me to say “watch out”. But there’s been a lot less activity since. It’s worth asking why.

Of course, there are logistical and organizational challenges to just bolting a book publishing program onto an existing content-creating entity. The writers and editors at newspapers and magazines are already fully employed; they’re not looking for additional things to do. And the job specs and incentive arrangements are all about the principal activity. The marketing mechanisms at a periodical publisher are, likewise, fully engaged. So the newspaper or magazine might have more powerful tools for some marketing purposes than a book publisher does, but no book operation inside one of them could get those tools dedicated to selling books on anything but the most sporadic and opportunistic basis.

In addition to the fact that the sailors all have existing assignments, a book publishing initiative would also lack a captain. We observed a couple of years ago that one of the great indie publishing successes, a cookbook called “Modernist Cuisine”, carrying a price tag of $625 and published by former Microsoft CTO Nathan Myhrvold, was largely made possible by the leadership of a veteran publisher, Bruce Harris. Yes, Ingram did the “scale” work: printing, warehousing, selling, distributing. And it wouldn’t have been possible without them. But Harris worked out the commercial equations (what should the retail price be, for example) and the marketing campaign that carried it to its success.

There are other veteran publishers like Harris available to be engaged as consultants, but it is also much easier for a single entrepreneur like Myhrvold to make use of one than it would be to have them integrate with an existing organization formed for another purpose.

I asked indie-publishing experts Jane Friedman and Porter Anderson (their weekly “Hot Sheet” newsletter for independent authors is a great resource) for their take on the question I was posing: what happened to all those newspapers and magazine initiatives? Why did it seem that none of them achieved the success I was expecting?

Friedman drew on her experience at Virginia Quarterly Review (VQR), which had publishing ambitions based on ebook economics but ultimately abandoned them. She saw the “complications” falling into three buckets.

1. Clearing rights for projects with multiple authors, which VQR would frequently have been called upon to do, was challenging, time-consuming, and frustrating.

2. The organizational structure and staffing were far from optimal for a book publishing operation.

3. The profit potential was too small to make it worth the effort of overcoming the other two problems.

But, even accepting all of that, I’d suggest that the biggest reason this activity was so feverish 2-to-4 years ago and isn’t so much now was revealed first in a vitally important post by hybrid author and helper-of-indies Bob Mayer and then reiterated by the latest report from the Author Earnings website.

Mayer built an impressive business for himself by reissuing titles of his that had previously been successfully published and gone out of print. He spells out clearly what has changed since the days of big indie success and the plethora of entity-based publishing initiatives.

The marketplace has been flooded. An industry that used to produce one or two hundred thousand titles a year now produces over a million. Nothing ages out of availability anymore. Even without POD keeping books in print, ebooks and used books make sure that almost nothing ever disappears completely. And Mayer’s sales across a wide range of titles — his and other authors whom he has helped — reflect the mushrooming competition. They’re down sharply, as are the sales of just about everybody he knows.

What Mayer wrote tended to confirm that the breakthrough indie authors happened far more frequently before the market was flooded. Authors who struck it rich in 2010 and 2011 (like Hugh Howey) were lucky to get in before the glut. Recommending that somebody try to do the same thing in 2013 or 2014 was telling them to swim in a pool with water of a completely different temperature.

On the heels of Mayer’s piece, Author Earnings made discoveries that seemed to startle even them. For those who don’t know, AE is a data collection and analysis operation put together by indie author Hugh Howey teamed with the anonymous analyst “Data Guy”. The AE emphasis is on what the author gets (“a site for authors by authors” is what they call themselves), with less interest in what publishers want to know: how topline ebook revenues are shifting.

According to the industry’s best analyst, Michael Cader, the most recent AE report shows, for the first time since they’ve been tracking it, a reduction in earnings for indie authors and an increase for published authors. (Cader may have a paywall; here’s another report from Publishing Perspectives.) But even more startling is the shift in revenue share. Publishers have booked 65 percent of Kindle revenues and Amazon Publishing has 10 percent. They put self-published authors at 20 percent, down from 25 percent previously.

It is not a big surprise that Amazon Publishing is able to grow its own share of Kindle revenue. But the fact that publishers are holding their own, in the aggregate, while indie authors are not, underscores the challenge that non-publisher books are facing. The title output of publishers has remained relatively flat. The title output of indies has surged. So the per title sales of indie books must be collapsing relative to the publishers’ output.
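To make that per-title arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The revenue shares are the Author Earnings figures cited above; the title counts are hypothetical placeholders invented purely for illustration, since the report’s actual counts aren’t quoted here.

```python
# Per-title revenue = (segment's share of total Kindle revenue) / (titles in segment).
# Shares come from the AE report cited above; title counts are made-up placeholders.

def per_title_revenue(total_revenue, segment_share, segment_titles):
    """Average revenue earned per title within a market segment."""
    return total_revenue * segment_share / segment_titles

TOTAL = 1.0  # normalize total Kindle revenue to 1.0; only the ratios matter

pub_then   = per_title_revenue(TOTAL, 0.65, 100_000)  # publishers: flat share...
pub_now    = per_title_revenue(TOTAL, 0.65, 100_000)  # ...and flat title output
indie_then = per_title_revenue(TOTAL, 0.25, 300_000)  # indies: 25% share then
indie_now  = per_title_revenue(TOTAL, 0.20, 600_000)  # 20% share, twice the titles

print(f"Publisher per-title revenue: {pub_now / pub_then:.2f}x of what it was")
print(f"Indie per-title revenue: {indie_now / indie_then:.2f}x of what it was")
# With these placeholder counts, indie per-title revenue falls to 0.40x while the
# publishers' figure holds at 1.00x -- the "collapsing relative to publishers" point.
```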

What this is telling us is that, whatever deficiencies there are in the way publishers are organized for publishing today, they clearly are able to marshal their resources more effectively for book after book than indies can. So, not only does the “entity publisher” have the challenge of refocusing an organization designed for something else to sell books, they’re fighting a tidal wave of competition that enters the market because of the low barriers to entry. In fact, if you were at a newspaper or magazine today and thinking about putting your company into the book business, there would be powerful arguments to follow the Hearst formulation of creating a home inside an established book publisher rather than building a low-overhead operation for yourself. But that option has always been available; it didn’t require a digital revolution to deliver it.

A lot has been made of the fact that big publishers are seeing topline revenue erosion across print and digital. But the ability for readers to consume books has, at best, remained flat (there are so many more distractions immediately available these days) and the number to choose from has exploded. That means the per-title sales are plummeting. Per-title sales are what tell us whether publishers or independent authors can make any money. And the math is clear: it is getting harder and harder to do so, but it seems to be getting harder faster for the indies than it is for the established publishers.


The reality of publishing economics has changed for the big players


A veteran agent who was formerly a publisher confirmed a point for me about how trade publishing has changed over the past two decades, particularly for the big houses. This challenges a fundamental tenet of my father’s understanding of the business. (And that’s still the source of most of mine.) I had long suspected this gap had opened up between “then” and “now”; it was really great to have it confirmed by a smart and experienced industry player.

One of the things that I took from my father’s experience — he was active in publishing starting in the late 1940s — was that just about every book issued by a major publisher recovered its direct costs and contributed some margin. There were really only two ways a book could fail to recover its costs:

1. if the advance paid to the author was excessive, or

2. if the quantity of the first printing far exceeded the advance copy laydown.

In other words, books near the bottom of the list didn’t actually “lose” money; they just didn’t make much as long as the publisher avoided being too generous with the advance or overly optimistic about what they printed. (Actually, overprinting was and is not as often driven by optimism as by trying to achieve a unit cost that looks acceptable, which is a different standard fallacy of publishing thinking.)

The insight that just about every book contributed to overhead and profit was obscured by the common practice of doing “title P&Ls” that assigned each book a share of company overheads. Whatever that number was, when it was calculated into the mix it reduced the contribution of each sale and showed many books to be “unprofitable”. That led publishers to a misunderstanding: perhaps they could make more money doing fewer books, if only they could pick them a little bit better. Trying to do that, of course, raised the overhead, which was neither the objective nor any help in making money.

(Raised the overhead? I can hear some people asking… Yes, in two ways. One is that publishing fewer books would mean that each one now had to cover a larger piece of the overhead. The other is that being “more careful” about acquisition implies more time and effort for each book that ends up on the list, and that costs overhead dollars too.)
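To illustrate that title-P&L arithmetic, here is a toy sketch with invented numbers (the post doesn’t supply real ones); the structure of the calculation is the point, not the values.

```python
# A 'title P&L' spreads fixed company overhead evenly across the list, so the
# same book looks better or worse depending only on how many titles share the load.

FIXED_OVERHEAD = 10_000_000  # hypothetical company overhead, unchanged by list size
CONTRIBUTION = 40_000        # hypothetical margin a modest title earns over direct costs

def allocated_overhead(title_count):
    """Overhead charge assigned to each book when spread evenly across the list."""
    return FIXED_OVERHEAD / title_count

for titles in (400, 200):
    charge = allocated_overhead(titles)
    result = CONTRIBUTION - charge
    print(f"{titles} titles: charge ${charge:,.0f} per book, "
          f"'title P&L' shows ${result:,.0f}")

# With 400 titles the book is charged $25,000 and looks 'profitable' ($15,000);
# cut the list to 200 and the same book is charged $50,000 and looks like a
# $10,000 loser -- even though its actual cash contribution never changed.
```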

For years, this “reduce the list and focus more” strategy was seen by my father, and those who learned from him, as a bad idea.

One of the young publishers my father mentored was Tom McCormack, who — a decade after Len worked with him — became the CEO of St. Martin’s Press. There, McCormack applied Len’s insight with a vengeance, increasing St. Martin’s title output steadily over time. And, just as Len would have expected they would, St. Martin’s profits grew as well.

All of this was taking place in a book retailing world that was still dominated by stores making stocking decisions independently from most other stores. In the 1970s, the two big chains (Walden and B. Dalton) accounted for about 20 percent of the book trade. The other 80 percent was comprised of nearly as many decision-makers as there were outlets. So while it took a really concerted effort (or a very high-profile book or author) to get a title in every possible store location, just about every book went into quite a few. With five thousand individuals making the decision about which books to take, even a small minority of the buyers could put a book into 500 or 1000 stores.

But two big things have conspired to change that reality. The larger one is the consolidation of the retail trade. Now there are substantially fewer than 1000 decision-makers that matter. Amazon is half the sales. Barnes & Noble is probably in the teens. Publishers tell us that there are about 500 independent stores that are significant and that all the indies combined add up to 6 to 8 percent of the retail potential. The balance of the trade — about 25 percent — is the wholesalers, libraries, and specialty accounts. The wholesalers are feeding the entire ecosystem, but the libraries and specialty accounts are both very much biased as to the books they take and very unevenly covered by the publishers. In any case, ten percent of the indie bookstores today gets you 50 on-sale points, not 500. That’s a big difference.

The other thing that has happened is that the houses are much better organized about which books they are “getting behind”. This has the beneficial effect of making sure the books seen to have the biggest potential get full distribution. But it also has the impact of reducing the chances that the “other” books will get full attention from Barnes & Noble (able to deliver more outlets with a single buyer than one would customarily get from the entire indie store network). And, without that, it takes a lot of luck or online discovery to rescue a book from oblivion.

The agent who was confirming my sense of these things agreed that the big houses used to be able to count on a sale of 1500 or 2000 copies for just about any title they published. Now it is not uncommon for books to sell in the very low triple digits, even on a big publisher’s list.

Even before any overhead charge and with a paltry advance, that isn’t going to cover a house’s cost of publication. So there definitely are books today — lots of books — coming from major houses that are not recovering even their direct costs.

This is a fundamental change in big publisher economics from what it was two decades ago. While the potential wins have become exponentially bigger than they were in bygone days, the losses have become increasingly common. And while it is still an open question how well anybody can predict sales for a book that isn’t even written yet (which is the case for most books publishers acquire), there is a real cost to getting it wrong, even when the advance being paid is minimal.

So it is no longer irrational to cut the list and focus. Obviously, every book published is a lottery ticket for a big win, and the odds in a lottery are never good. But the world most general trade publishers have long believed in, where the big hits pay for the rest of the books, is really now the one they inhabit.

I am proud to be part of the organizing committee for Publishing People for Hillary. We’re staging a fundraiser for her in midtown Manhattan on Friday, September 30, at which Senator Cory Booker and Senator Amy Klobuchar will be the featured speakers. You can sign up to join us here. Contribution levels for the event range from $250 to $2500, with a special opportunity to meet the Senators at the higher levels.

And, having NOTHING to do with publishing, but for all baseball fans in the crowd, please check out this story about Yogi’s mitt and Campy’s mitt that you will not have seen anywhere else.


eBook pricing resembles three-dimensional chess


The current round of reporting from major publishers contains some danger signs. Their ebook sales are declining (in dollars and even more dramatically in units) in an ebook market that is probably not declining. The “good” news for the publishers is that print sales are pretty much holding their own, or even growing. And profits are being maintained, which is probably the most important metric in their board rooms. But the bad news is that total revenues are down. And print sales have been buoyed by consumer excitement for adult coloring books (now spreading to adult “activity” books), so the combined results for many author-driven titles don’t necessarily reflect growth, and total unit sales of print plus digital for many titles are almost certainly falling behind expectations.

In a complicated marketplace with large unknowns around indie authors and indie books, particularly those that are Amazon-only, it is hard to be definitive about what the cause of this is. (Author Earnings does yeoman work trying to put the two overlapping markets in context.) Certainly, barriers to entry have come down and there are many more books in the marketplace competing for readers that don’t come from the companies the publishers think they’re competing against. But the publishers’ “success” in establishing agency pricing — where the price they set is the price the consumer pays — combined with Amazon’s decision to “respect” agency (at first with no choice but subsequently, after contracts were renegotiated, with apparent enthusiasm) and offer no pricing relief from their share of the book’s sales revenue is almost certainly a major component of the emerging problem.

Amazon doesn’t need big publisher books to offer lots of pricing bargains to their Kindle shoppers; they have tens of thousands of indie-published books (many of which are exclusive to them) and a growing number of Amazon-published books, all offered at prices far below where the big houses price their offerings. That probably explains why Amazon sees its Kindle sales rising while publishers are universally reporting that their sales of digital texts, including Kindle, are falling. (Digital audio sales are rising for just about everybody, but that is not an analogous market.)

This is putting agency publishers in a very uncomfortable place. It has been an article of faith for the past few years that there is revenue to unlock from ebook sales if only the pricing could be better understood. Just a bit more revenue per unit times all those ebook sales units is a very enticing prospect for publishers. After the agency settlements liberated publishers from the price limitations Apple had originally insisted on, the immediate tendency was for publishers to push ebook prices even higher.

And since ebooks are sold in a less price-competitive market than we had before agency, Amazon can devote its marketing dollars to cutting prices on the print editions. This undercuts the publishers’ intention to support a diverse (and store-based) retail network and, at the same time, often embarrasses them by making the print book price (set by Amazon) lower than the ebook price (which Amazon makes very clear was set by the publisher).

The fact that this is reducing publisher revenue and each title’s unit sales is concerning. But it is also making it much more difficult to establish new authors, because lots of competing indies are still being launched at low price points that encourage readers to sample them.

Many people maintain that this pricing has also contributed to a reduction in the rate of surprise breakout books over the past few years. That perception would be explained by the fact that price attracts readers to try new authors, and so the new rising talent would more frequently come from the lower-priced indies. Higher ebook prices reduce the speed with which a book can catch on in the marketplace. It feels like there is a consensus in the big houses now that it is harder to create the “surprise” breakouts. (This is a very difficult thing to actually measure.) The “Girl on the Train” phenomenon is always unpredictable, but big publishers could once count on it coming along often enough to keep the sales revenue trend line rising. That doesn’t seem to be the case anymore.

High ebook prices — and high means “high relative to lots of other ebooks available in the market” — will only work with the consumer when the book is “highly branded”, meaning already a bestseller or by an author that is well-known. And word-of-mouth, the mysterious phenomenon that every publisher counts on to make books big, is lubricated by low prices and seriously handicapped by high prices. If a friend says “read this” and the price is low, it can be an automatic purchase. Not so much if the price makes you stop and think.

This puts publishers in a very painful box. When they cut their ebook prices, they not only reduce sales revenue for each ebook they sell; they also hobble print sales. (Although if they cut prices as a promotion, and they market the promotion, apparently the higher-priced print will also benefit from the promotion and see a resulting sales lift.) And singling out some of their ebooks for a price-reduction strategy could also raise a red flag with an agent. It is easy to understand a temporary price reduction that is promoted; as an overall pricing strategy it could be seen as a bite out of the author’s ebook earnings at the same time that their print sales are threatened by the low-priced ebook competition. And while an ebook price-reduction strategy would probably make at least Amazon and Apple, very important trading partners, quite happy, it risks angering others, perhaps including Barnes & Noble but certainly including all the indie bookstores.

On the other hand, the current “strategy” has plenty of risk.

An unpleasant underlying reality seems inescapable: revenues for publishers and authors will be going down on a per-unit basis. This can most simply be attributed to the oldest law there is: the law of supply and demand. Digital change means a lot more book titles are available to any consumer to choose from at any time. Demand can’t possibly rise as fast and, in fact, based on competition from other media through devices people carry with them every day, might even fall (if it hasn’t already). So publishers are facing one set of challenges with their high ebook prices; they’ll create another set if they lower them.

But, unfortunately, lower them they almost certainly must. With more data, we may learn that developing new authors absolutely requires it, particularly in fiction.

Here’s a suggestion for a new pricing routine that might be worth trying in the near term; it recalls a practice from quite a while ago.

There was a period earlier in my career, probably ending in the 1980s, when publishers priced new hardcovers like this: $22.95 until October 1, $24.95 thereafter. The books had the promotional price on a corner of the jacket that could be snipped off diagonally on October 1, so that only the $24.95 price would show.

Frankly, in this case the pricing device was not primarily intended to entice the consumer to buy the book before the up-pricing deadline. It was really designed to get the store to place a bigger advance order, for which the applicable discount would be based on the promotional price.

Now big advance orders are not nearly as important as they used to be, nor nearly as common. But there is still a huge dependence on consumers taking a risk on an author, particularly in the first moments after a book comes out. Two or three decades ago, low pricing was the “secret” behind publishers moving an author from being a star of “mass-market originals” (low prices) to being a hardcover bestselling author.

So what might be worth a try from the big publishers now would be “promotional ebook pricing” on launch. Make the ebook $3.99 until date X, and then raise it to the “normal” level (which for major publishers, when the hardcover is in the marketplace, would be $12.99 and up). This is a very painful experiment to try because it will compete against the hardcover at launch, when the publisher is trying to pile up sales to make the bestseller list. It will annoy print booksellers as well.

But publishers have to find a way to put new authors into the market without a millstone of pricing that requires a significant commitment by the reader before they know the author.

Of course, that strategy suggests an even more disruptive reality about ebook pricing: it doesn’t have to remain “set” the way print book pricing does. Because of our convention of printing the publisher’s suggested retail price right on the book’s jacket or paperback cover, it is not really practical to change a book’s price except, occasionally (and less often in these low-inflation times) when a book is reprinted. (In higher-inflation times, we did sometimes employ the practice of “stickering” to increase price, but that was clumsy and impossible to conceal.) But with ebooks, prices can change pretty much as often as you like: up, down, and up again.

In fact, that already happens with promotional pricing such as has been pioneered by the email service BookBub. The BookBub idea — emailing a subscriber list with notice of price promotions on ebooks — has been copied highly successfully by HarperCollins with their proprietary version, BookPerk, and to a lesser extent by other publishers as well. It is becoming established practice to temporarily lower the price of a title to get it ranked higher and then to raise the price and try to capture higher-revenue sales with the hyped “branding” the promotion created. So far, this is done with a clear game plan, such as discounting the first book in a series, or the most recent book in a series when a new title is about to come out.

But uncoupling the ebook pricing completely from print pricing, which seems to be where we will inevitably go, may also mean — it certainly can mean — all ebook pricing becomes dynamic. All of this definitely raises the bar for publisher knowledge of how consumers react to prices in different situations. It has been a widespread article of faith that retailers “understand” this behavior and publishers don’t. To the extent that retailers do understand it, they see it through a different lens; they almost never care about the impact of price changes on the overall sales curve for a single title. Titles are interchangeable for retailers and not for publishers. So while it is true that publishers have a lot to learn, it is probably not true that retailers already know it.

The points I wanted to make in this post were that publishers should contemplate uncoupling ebook pricing from print pricing, learn more about consumer behavior around pricing, and master the skill of managing (strategically and operationally) LOTS of ebook price changes all the time. There is another point herein, made in passing, that is worth deeper consideration on another day. Big publishers are seeing their revenue decline but their profits rise. Does that point to a strategy? For how long can publishers cut costs faster than revenues, particularly per-unit revenues, decline? Maybe for quite a while…


Barnes & Noble faces a challenge that has not been clearly spelled out


The sudden dismissal of Ron Boire, the CEO of Barnes & Noble, follows the latest financial reporting from Barnes & Noble and has inspired yet another round of analysis about their future. When the financial results were released last month, there was a certain amount of celebrating over the fact that store closings are down compared to prior years. But Publishers Lunch makes clear that store closings are primarily a function of lease cycles, not overall economics, and we have no guarantees that they won’t rise again this year and in the years to follow when a greater number of current leases expire.

With B&N now the only single large source of orders that puts most published titles into retail locations, publishers see an increasing tilt toward their biggest and most vexing (but also, still, their most profitable) trading partner, Amazon.

Although PW reported immediate dismay from publishers over Boire’s departure, there has been plenty of second-guessing and grumbling in the trade about B&N’s strategy and execution. Indeed, getting their dot com operation to work properly is a sine qua non that they haven’t gotten right in two decades of trying. But one thing Boire did was to bring in a seasoned digital executive to address the problem. This is presumably not rocket science — it isn’t even particularly new tech — so perhaps they will soon have their online offering firing on all cylinders.

The big new strategy they revealed, one they’re going to try in four locations this year, is what they call “concept stores” that include restaurants. And, although it was a bit unclear from their last call whether the store-size reduction they’re planning extends to these restaurant-including stores, they have said that the overall store footprint they’re planning will be 20-25 percent smaller than their current standard. Both facts make the point that B&N is facing a reality which has become evident over the last decade, and which calls into question a strategy and organizational outlook formulated in another time. If this new challenge is properly understood, and I haven’t seen it clearly articulated anywhere, it makes the restaurant play more comprehensible. (Note: I have to admit that my own recent post, where I traced the history of bookstores in the US since World War II, failed, along with everybody else’s analyses, to pinpoint the sea change that makes B&N’s historical perspective its enemy as it tries to survive today.)

Here’s the change-that-matters in a nutshell. A “bookstore” doesn’t have the power it did 25 years ago to make customers visit a retail location. Selection, which means a vast number of titles, doesn’t in and of itself pull traffic sufficient to support a vast number of large locations anymore. This changes the core assumption on which the B&N big store buildout since the late 1980s was based.

This has been true before. One hundred years ago, the solution to the problem was the department store book department. Post-war prosperity grew shelf space for books, but the department stores remained the mainstays of book retail. The first big expansion of bookstores started in the 1960s when the malls were built out, which put Waldens and Daltons in every city and suburb in America. The mall substituted for the department store; it delivered the traffic. In fact, department stores “anchored” all the malls to be sure they’d get that traffic!

(Here are a couple of additional factoids to illustrate the importance of the department store channel in the mid-20th century. When Publishers Weekly did an article about the Doubleday Merchandising Plan in 1957, the stores they used as examples were the book departments of Wanamaker’s and Gimbels! When I came into the business full-time in the 1970s, there were two significant “chain” accounts in Chicago: the bookstore chain Kroch’s & Brentano’s and the Marshall Field department stores.)

Bookstore customers came in many flavors, but they all benefited from a store with greater selection. My father, Leonard Shatzkin, first noticed that selection was a powerful magnet when he was overseeing the Brentano’s chain (no relation to K&B in Chicago) in the 1960s. Their Short Hills, New Jersey store was an underperformer. They doubled the number of titles in it and it became their best performer. Whether the bookstore customer knew what they wanted or just wanted to shop, the store with more titles gave them a better chance of a satisfying result.

Over time, that understanding was followed to a logical conclusion.

By the late 1980s, it appeared that standalone bookstores outside of malls could become “destinations” if their selections were large enough, and that created the superstore expansion: B&Ns and Borders. But only a few years later, when Amazon opened in 1995, its universal selection mooted the value of the big-selection store, especially for customers who knew what book they wanted before they shopped. Selection as a traffic magnet stopped working pretty quickly after that, although it was not immediately obvious to anybody.

I had some experience with B&N data that demonstrated pretty emphatically by 2002 that the action on slow-selling university press titles had shifted overwhelmingly to Amazon. (At that time, the late Steve Clark, the rep for Cambridge University Press, told me that Amazon was a bigger account for CUP than all other US retail combined.) It took the further hit of expanded Internet shopping at the consumer level, which grew with increased connectivity even before ebooks, to make what had been a great business obviously difficult. Then, as if to emphasize the point, we lost Borders…

What just doesn’t make it anymore, at least not nearly as frequently, is the “big bookstore”. Although there is no scientific way to prove this, most observers I’ve asked agree that the new indie stores popping up over the past few years tend to be smaller than the Borders and older indie stores they are replacing. We are seeing book retailing become a mix of pretty small book-and-literary-centric stores and an add-on in many places: museums, gift shops, toy stores. These add-ons have always existed but they will grow. And true “bookstore” shelf space will shrink, as has space for “general” books in mass merchants. The indie bookstore share will definitely continue to grow, but whether that growth will replace what is lost at B&N and the mass merchant chains is doubtful. Every publisher I’ve asked acknowledges significant indie store growth in the past couple of years, but they are also unanimous in saying the growth has not replaced the sales and shelf space lost when Borders closed.

Barnes & Noble is clearly rethinking its strategies, but this is one component that I have never seen clearly articulated. Back when I had my “aha!” moment about what was happening with the university press books, I suggested to one B&N executive that they had to figure out how to make the 25,000-title store work.

He said, “that’s not where we are. We’re thinking about the million-title store!” In other words, “we want to manage big retail locations”. This is thinking shaped by what we can now see is an outdated understanding of what the value of a big store is. So now they’re trying to sustain slightly-smaller big locations with things other than books. (Whether they plan to go as low as 25,000 titles in stores that used to stock four or five times that many is not clear. But they did say in their recent earnings call that the new concept stores would get 60 percent of their revenues from books, rather than the 67 percent they get now.) They have added non-book merchandise; now they’re thinking about restaurants. All of that is to increase traffic and to increase sales from the traffic they already get.

But there is another way to attack the challenge that “books alone” doesn’t work the way it used to. Barnes & Noble’s core competency is book supply to retail locations anywhere in the United States. Nobody, except Ingram, does this as well. (Although Amazon clearly is now planning to give it a try.)

Other retailers are suffering the same Internet sales erosion as booksellers, and a properly-curated selection of books can work for just about any store’s customer profile. Might Barnes & Noble complement its own stores by offering branded B&N Book Departments to other retailers? Let them bring in the traffic (although the books will undoubtedly bring in some more) and then B&N could manage those departments. (This is a variation of a tactic I suggested for Penguin Random House some years ago.) Let other retailers play the role the department stores and then the malls played for books in the past 100 years. Let’s not require the retail customer to come to a location strictly to shop for books.

The “trick” would be for B&N merchandisers to adjust their book selection to suit the specific customer base each store attracts. But is that a harder challenge than going into the restaurant business? And isn’t extending the B&N brand for books a more sensible tactic than trying to extend it to food? Or to create a new brand for food? And wouldn’t it be a good idea to get started on this tactic to expand book retail shelf space before Amazon, which keeps showing signs of wanting a retail presence, does?

This is not an easy market to just walk in and take over. There are already wholesalers providing books to retailers who don’t support a full-fledged buying effort for them. Those wholesalers are often getting more margin from the publishers than B&N is now, but that’s actually more of an opportunity than an obstacle. Presumably, a B&N-branded book section is worth something. (If it isn’t, that’s another problem.) Presumably, B&N has buying expertise and domain knowledge that would enable them to fine-tune a selection of books for each outlet’s customer base. And, presumably, B&N’s supply chain efficiency would be superior to anybody else’s in the industry, except Amazon’s and perhaps Ingram’s.

The big bookstore model is an anachronism. Just making it big doesn’t pull in the customers anymore. So a new strategy is definitely called for. B&N is going part of the way to one by recognizing that they need to do more to bring in customers and, at the same time, they can’t profitably shelve 100,000 titles across hundreds of stores. Taking their capabilities to where the customers already are would seem like an idea worth exploring.

It should be noted that the Indigo chain in Canada, under the leadership of owner Heather Reisman, has apparently successfully transitioned to a “culture” store where books are the key component of the offering. She has apparently found a product mix, or an approach to creating one, that is working for Indigo. Every large book retailer in the world is going to school on what Indigo has done. Because Amazon and online purchasing in general have not taken hold in Canada the way they have in the United States, we can’t jump to the conclusion that the Indigo formula could be successfully applied here. But it sure wouldn’t be a crazy idea for B&N to buy Indigo to gain the benefit of Reisman’s insights and expertise, assuming that a) Canadian law would permit U.S. ownership of such an important cultural asset and b) Reisman herself would sell and then work for somebody else. Two very big assumptions.

It is also worth noting that the Pocket Shop chain, the small-bookstore concept chain that we’ve written about previously, is going to start opening stores in the UK.

Book publishers do not do SEO like the big guys do although they could


Partner Pete McCarthy pointed me to an article a couple of weeks ago that also introduced me to a website called Viperchill and its gifted, self-promoting SEO/Marketing creator, Glen Allsopp. The linked post, which I strongly urge you to read, enumerates quite painstakingly the techniques used by 16 online media companies with a large portfolio of brands that enable them to dominate specific search results in Google across a very wide range of topics and categories.

The example ViperChill explained in detail was how Hearst created a lot of traffic very quickly to a new site and business it had created called BestProducts.com. Judicious placement of content and links to BestProducts from the very big brands that Hearst controls (Cosmopolitan, Woman’s Day, Marie Claire, Esquire, Elle) resulted in Google placing BestProducts startlingly high in search results.

This is a result of three elements Google values a great deal: “domain authority” and “inbound links”, nested in “content” that seems “natural”. “Natural” suggests that Google believes the content is genuine information, not a ruse to point to an otherwise irrelevant link.

This is tricky and problematic stuff for Google, as the story makes clear. Google’s objective is to deliver the most relevant search results for a user. While Woman’s Day’s editorial opinion about the best nail polish would seem worthy of high “authority” (which ultimately translates into an elevated position in the search results), Google does not intend to confer that authority on a nail polish suggestion that is motivated by BestProducts’s commercial interests. How can Google tell what motivates the placement of content and a link on Woman’s Day’s website? They may still be figuring that out.

In other words, what is working so effectively for these brands, enabling them to use the collective authority of many powerful domains to drive traffic to something new and different, may not work forever without some serious adjustments. But it sure is working now!

This information wouldn’t be appearing on this blog if it didn’t have application to book publishers. It demonstrates a very large opportunity for many of them. The precise size of the opportunity depends entirely on the number of individual web domains a publisher controls or influences, the authority of each of those domains according to Google, and the judicious placement of content and links among those sites to push the desired result for a specific search term.

These powerful multi-brand content organizations have such massive traffic and authority that they can influence Google search for the most searched terms on the Internet. No book publisher would have comparable capability. But for terms that are more publishing-specific — those that reference books or reading groups or book genres or authors — the larger book publishing organizations have the ability to influence search results exactly the way these big outfits do.

And so would some smaller publishers, particularly vertical/niche publishers with any meaningful consumer-facing brands.

Probably the first big insight that created the success of Google was the recognition that links to content or a website told you something valuable about the worth of that content or website. So from the very beginning of SEO two decades ago, domain owners have understood that getting links is a way to improve their rank in search and increase their discoverability. What is documented in this article is that when one entity controls a large number of authoritative domains, they can constitute an ad hoc “network” that gets them the power of inbound links without having to persuade somebody outside their family of their worth. That’s particularly important when you’re trying to launch something new, as Hearst was with BestProducts.

And which publishers do every day with new books and debut authors.
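To make that mechanism concrete, here is a toy computation (in Python, with made-up domain names) of a simplified PageRank over a small link graph in which several established, publisher-controlled sites all point at one brand-new site. PageRank is only one ingredient of what Google actually does, and this sketch ignores content quality, “naturalness” and everything else discussed above; it is just a minimal illustration of why a family of owned domains linking to a new launch gives it inbound-link power it could not earn on its own.

    # Simplified PageRank over a toy link graph (hypothetical domains).
    # Real search ranking uses many more signals; this only illustrates
    # why inbound links from established sites lift a brand-new one.

    def pagerank(links, damping=0.85, iterations=50):
        """links maps each page to the list of pages it links out to."""
        pages = set(links) | {p for targets in links.values() for p in targets}
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1 - damping) / len(pages) for p in pages}
            for page, targets in links.items():
                for target in targets:
                    new_rank[target] += damping * rank[page] / len(targets)
            rank = new_rank
        return rank

    # Five established publisher-controlled sites all link to the new title site.
    established = ["imprint%d.example.com" % i for i in range(1, 6)]
    links = {site: ["new-title.example.com"] for site in established}
    links["new-title.example.com"] = []    # the new site has no outbound links yet
    links["standalone.example.com"] = []   # a new site nobody links to, for comparison

    for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
        print("%-28s %.3f" % (page, score))

In the output, the new title site ends up with a score several times higher than the comparison site that nobody links to, which is, in miniature, the effect the ViperChill piece documents.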

There are two big steps publishers need to take in order to put themselves in position to execute this strategy effectively. The first is that they have to enumerate and understand all the web presences they own and control. Obviously, that includes the main domain for the publisher. But it also includes individual book sites, author sites, series sites, topical sites, or any other sites that have been created and which are regularly used and posted to.

In fact, any site that has meaningful domain authority can be helpful. We’ve worked with sites that have long been defunct but that still carry “weight” in the Google-verse. Those can be revived and used to impact SEO for current projects.
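As a sketch of what that first step might look like in practice, here is a minimal, hypothetical roster of owned web properties (the domains and categories are invented for illustration). The real version could just as easily be a spreadsheet; the point is that the inventory lives in one place and flags dormant sites that may be worth reviving.

    # A minimal sketch of a central roster of owned web properties.
    # Domain names and categories are hypothetical illustrations.

    from dataclasses import dataclass

    @dataclass
    class WebProperty:
        domain: str
        kind: str     # e.g. "publisher", "imprint", "author", "book", "series", "topical"
        active: bool  # still being posted to?
        notes: str = ""

    roster = [
        WebProperty("publisher.example.com", "publisher", True),
        WebProperty("imprint-crime.example.com", "imprint", True),
        WebProperty("author-jane-doe.example.com", "author", True),
        WebProperty("old-series-site.example.com", "series", False,
                    "defunct but still indexed; candidate for revival"),
    ]

    def candidate_link_sources(roster):
        """List every owned property as a place to put natural content and a link
        for a new title, flagging dormant sites for possible revival."""
        return [(p.domain, p.kind, "use now" if p.active else "consider reviving")
                for p in roster]

    for domain, kind, status in candidate_link_sources(roster):
        print("%-32s %-10s %s" % (domain, kind, status))

Grouping the properties this way makes it straightforward to plan which owned sites get natural content and a link when each new title launches.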

The second is to enumerate and understand all the related sites, owned or controlled by others, where there is a mutual interest in some property between the publisher and the website owner. These will largely be sites for titles or authors, but might also include corporate sites for some authors and movie sites for others. If a movie is made of a publisher’s book and there is no link from the movie site to the publisher’s information on the book, nobody benefits. The publisher misses out on referrals that could lead to sales as well as additional discovery “juice”. Fans of the movie would want to know it came from a book; that information belongs on the movie website, and its absence is an unfortunate omission. And if the publisher’s sites can also influence the sites promoting the UK or other international editions of the same title, those links help too.

Once the roster of usable sites is complete, the next step is to make sure there are relatively easy ways to add “natural” content to tie to links. Authors can really help each other here. They each have “authority” and they can, in combination, add a lot of power to the site for a new author or a new book. The more an author participates in useful reciprocal linking, the more that author helps their own cause and adds search power to the other authors in the publisher’s stable.

And the more the publisher can orchestrate these links, from their own sites and by tethering their authors to each other on the web, the more the publisher adds otherwise unobtainable value for the author, at a cost of nothing but a little administrative effort.

Indeed, the value added for authors, which would be tangible and visible, is one of the most important strategic reasons why publishers should heed the advice in this post.

The complete roster of useful websites, which is being added to all the time as new titles and authors are added to the publisher’s lists, should be centrally managed for maximum impact on the new titles publishers will launch. An SEO plan for every new book or author should be created from this roster of sites, and each new site will itself, fairly predictably, soon become another useful authoritative domain that can be put behind future titles.

An understanding of this opportunity also makes clear why authors having their own websites with their own domains is an important marketing component. Each site needs to be competently rendered and, of course, it should be linked to from the publisher’s own site. (And it should link back as well.) But assuring that those things get done should also be part of the standard book publisher playbook for maximum discovery. In the overwhelming majority of cases, it doesn’t appear to be now.
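Checking that those links actually exist in both directions is easy to automate. The sketch below (Python, using the requests library, with hypothetical URLs standing in for real publisher and author pages) fetches a publisher’s author page and the author’s own site and reports whether each one links to the other.

    # A minimal reciprocal-link audit. URLs are hypothetical placeholders.

    import re
    import requests

    def links_to(page_url, target_domain):
        """Fetch page_url and report whether any href on it points at target_domain."""
        try:
            html = requests.get(page_url, timeout=10).text
        except requests.RequestException:
            return False
        hrefs = re.findall(r'href=["\']([^"\']+)', html, flags=re.IGNORECASE)
        return any(target_domain in href for href in hrefs)

    pairs = [
        # (publisher's author page, author's own domain) -- hypothetical examples
        ("https://publisher.example.com/authors/jane-doe", "author-jane-doe.example.com"),
    ]

    for publisher_page, author_domain in pairs:
        outbound = links_to(publisher_page, author_domain)
        inbound = links_to("https://" + author_domain + "/", "publisher.example.com")
        print("%s -- publisher links out: %s, author links back: %s"
              % (author_domain, outbound, inbound))

Run across a full author list, a report like this would at least make the gaps visible.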

So, why is there that hole, universally (as far as we can see) across the industry? We’ve asked ourselves that question. The likely answer is that there is no one person or already-organized group of people within any house who can both do meaningful analysis and deliver the necessary execution with expertise, and who has the credibility and authority with all the stakeholders to make it happen. What’s essential is somebody who can corral parent companies, imprints, technology groups, authors, and agents and 1) get them to understand the value we’re pointing to here; 2) persuade them to participate; and 3) provide a convincing roadmap on how it will work.

Perhaps it is not surprising that we think it will take a powerful outside consulting team to make that happen, at least in the first place to do it. (After the value is more obvious, and as quickly evident to authors as it will be, others will figure out how to follow.) That’s the kind of thing that publishers would typically go to an outfit like Boston Consulting Group (there are others, like McKinsey or PwC) to get done. Of course, a smaller firm more focused on publishing might well do the job better, faster, and cheaper.

Big data matters but textual analysis really does not


I was honored today with a lengthy response to a recent Shatzkin Files post on the Digital Book World blog from Neil Balthasar, who apparently uses techniques similar to those in a forthcoming book “The Bestseller Code: Anatomy of a Blockbuster Novel”. My post had been a response to a PW article announcing the upcoming publication of that book. I reacted strongly to this sentence near the top of the story:

“In the forthcoming The Bestseller Code: Anatomy of The Blockbuster Novel (St. Martin’s Press, Sept. 20), authors Jodie Archer and Matthew L. Jockers claim they created an algorithm that identifies the literary elements that guarantee a book a spot on the bestseller lists.”

It was the “guarantee a book a spot on the bestseller lists” that got my juices flowing.

In his response to me, Balthasar moves the goalposts completely. To him, textual analysis is just one of a number of inputs he uses in his company (called Intellogo, an entity with which I am not familiar at all). In fact, he says “The Bestseller Code” does not claim what the sentence I quoted above clearly does claim. And then he goes on to suggest that my post betrays a lack of appreciation for “machine learning” and “big data” and succumbs meekly to the “romance of publishing”.

It seems pretty clear that he doesn’t know much about Pete McCarthy or me, nor is he much aware that we have spent our careers arguing to the romantics in publishing that they need to be more data-centric.

Balthasar claims an overlap in our viewpoints but creates a total straw horse by saying “…I agree in theory with Shatzkin that an algorithm alone cannot predict whether a book will be a bestseller or not, that isn’t precisely what The Bestseller Code claims, nor what our experience working with machine learning at Intellogo defines”. (Of course, it is precisely what the sentence quoted from the PW story does claim for the book!)

While we apparently agree that big data is an essential analytical tool for publishers marketing books today, where we emphatically part company is on the relative importance of textual analysis. Compared to research into the audience, segmenting it, understanding its search behaviors and social activities, and understanding the competitive environment for a book at the moment that it is published, the analysis of the book’s content adds very little, even when it is deeply analyzed and bounced against other sources.

Or, let’s put it this way. We do lots of projects designing digital strategies for books without performing textual analysis. Maybe some of those plans would be improved if we also used a book’s text as seed data for portions of our analysis. But there’s no way we’d try doing any meaningful marketing planning without the other things we do, no matter how rigorous or skilled a textual analysis was.

I’m glad a fan of “The Bestseller Code” is moved to put the textual analysis in the category of “among the things we do” rather than “we can predict a bestseller from the text”. But that wasn’t the proposition I was reacting to when I wrote the post that provoked his response.

When Balthasar says (as he does), “Imagine a day when we take all our data about what people are reading and provide publishers (and authors) ideas of what people want to read, where to find those audiences, and better ways to reach them”, he is pretty much stating the nature of our work at Logical Marketing. We do precisely what he’s suggesting today for a wide variety of clients. Textual analysis has almost nothing to do with it.

The “Big Change” era in trade book publishing ended about four years ago


Book publishing is still very much in a time of changing conditions and circumstances. There are a host of unknowables about the next several years that affect the shape of the industry and the strategies of all the players in it. But as publishers, retailers, libraries, and their ecosystem partners prepare for whatever is next, it becomes increasingly evident that — from the perspective of trade publishing at least — we have already lived through the biggest period of transition. It took place from sometime in 2007 through 2012.

At the beginning of 2007, there was no Kindle. By the end of 2011, there was no Borders. And by the end of 2012, five of America’s biggest publishers were defending themselves from the US Department of Justice. The arrival of Kindle and the exit of Borders are the two most earthshaking events in the recent history of book publishing and its ecosystem. The Justice Department suit first distracted and then effectively strait-jacketed the big publishers, making it difficult first to focus on and then to react to further marketplace changes.

Paying close attention to what we then called “electronic publishing” started for me in the early 1990s, with a conference other consulting colleagues and I organized for Publishers Weekly which we called “Electronic Publishing and Rights”. This was before Amazon existed. The big transition taking place then was from diskettes to CD-ROMs as the means of storage. And it was even before Windows, so the only device on which you could view on a screen anything that looked at all like a book was a Macintosh computer, which had literally a sliver of the market. The most interesting ebook predecessor was the Voyager Expanded Book, and it could only be used on a Mac.

In this speech I gave in 1995, I put my finger on the fact that online would change all this and that publishers shouldn’t spend too much energy on CD-ROMs.

The period from then until when it was clear Kindle was establishing itself — the awareness that it was for real slowly dawned on people throughout the year 2008 — was one where the inevitability of some big digital change was generally acknowledged. But dealing with it was the province of specialists operating alongside the “real business” and largely performing experiments, or getting ready for the day when it might matter. There was a slow (and inexorable) shift from store-purchasing to online purchasing. And the online purchasing almost all went to Amazon. But even that wasn’t seen as particularly disruptive. Neither ebooks nor online purchasing called for drastic changes in the way publishers saw their business or deployed their resources.

The first important new device for books in 2007 didn’t start out as one at all. It was the iPhone, first released in June of that year. Although Palm Pilots were the ebook reader of choice for a big chunk of the then-tiny ebook community, they lacked connectivity. The iPhone was not seen as an ereader when it came out — indeed, Apple head Steve Jobs still believed at that point that ebooks were not a market worth pursuing — but it could, and did, rapidly become one when it was demonstrated that there was a market. And it vastly expanded the universe of people routinely paying for downloaded content, in this case music from the iTunes store.

Then Kindle launched in November of 2007. A still unannounced number of Kindles sold out in a few hours and Amazon remained out of stock of them for several months! Because the original Kindle was $399, it was only a “good deal” for the consumer who read many books on which they could save money by buying electronic. What this meant was that Kindle owners bought ebooks in numbers much greater than the relatively small number of devices placed would have suggested. Throughout 2008, the awareness dawned on the industry that ebooks were going to be a significant business.

And that awareness rapidly shook loose a raft of competition. Barnes & Noble saw that they had to compete in this arena and started a crash program to deliver the Nook, which first appeared almost precisely two years after the first Kindle, in November 2009. Months earlier, Amazon had released the app that put Kindle on the iPhone. Meanwhile, Jobs had become persuaded to take ebooks seriously, and, anyway, he had a store selling content downloads to devices like crazy. Now, about to launch his new tablet format, the iPad, he had what looked like the perfect vehicle with which to launch ebooks. The iPad and the iBookstore debuted in April 2010. A month later, Kobo entered the market as a low-priced alternative with their first device. And by the end of the year, Google reorganized and rebranded what had been Google Editions into Google eBooks. The original concept was that they would populate the readers that were using epub, which meant Nook and Kobo at that time.

All of this change within three calendar years — 2008 through 2010 — created a blizzard of strategic decisions for the publishers. Remember, before all this, ebooks were an afterthought. Amazon had applied pressure to get publishers into the Kindle launch in 2007. Before that, no publisher that I can recall made any effort to have ebooks available at the time a book was initially launched. There were workflow and production changes (XML FIRST!) being contemplated that would make doing both print and digital editions a less onerous task, but they were seldom fast-tracked and doing ebooks meant taking on and managing a book-by-book conversion project.

During the period when Amazon was pretty much alone in the game (the pre-Amazon market leaders, Sony and Palm, faded very quickly), they started pricing Kindle titles aggressively, willing even to take losses on each sale to promote device sales and the ecosystem. This alarmed publishers, who were seeing small Kindle sales grow at frightening rates, raising the spectre of undermining their hardcovers. It didn’t hurt that the retailers with whom they (still, then, though not now) did most of their business were also alarmed. Nook arrived, and Barnes & Noble would never have been as comfortable as Amazon with selling these new products at a loss. But B&N also worried about the impact that cheap ebooks might have on more expensive print book sales. Amazon didn’t.

So when Apple proposed in late 2009 and early 2010 that there could be a new way to sell called “agency” which would put retail pricing power for ebooks into the publishers’ hands, it met a very receptive audience of publishers.

And that, in turn, led to the Department of Justice’s lawsuit against the big publishers which was instituted in April of 2012.

Coinciding with and enabled by all of this was the huge growth in author-initiated publishing. Amazon had bought CreateSpace, which gave them the ability to offer print-on-demand as well as Kindle ebooks. The combination meant that a huge audience could be reached through them without any help from anybody else. When agency happened (2010), they started to offer indie authors what amounted to agency terms: 70 percent of the selling price for ebooks. This was a multiple of the percentage an author would get through a publisher.

Agency pricing fell right into the hands of Amazon and the self-publishers. Getting 70 percent of the selling price, an indie author kept roughly $2.10 per copy on a $2.99 ebook and $2.80 on a $3.99 ebook, per-copy income comparable to the royalty on a full-priced print book. Many bestselling indie ebooks were priced at $0.99. The very cheap ebooks indie authors offered, juxtaposed against publishers’ agency-priced (many at $14.99) and undiscounted branded books, created a market opening that allowed the Kindle audience to sample cheap-ebook authors for peanuts (beyond the free chapter that is standard in ebooks). Suddenly, names nobody had heard before were on the map, selling millions of ebooks and taking mindshare away from the industry’s output. And it also handed the publishers’ authors an alternative path to market, which could only have the effect of improving their negotiating position with the publishers.
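To spell out that arithmetic: the 70 percent rate and the $2.99 and $3.99 price points are from the history above; the $25.00 hardcover at a 10 percent list royalty in the comparison line is an assumed, representative traditional print deal, not a figure from this post.

    # Worked version of the royalty arithmetic. The 70 percent rate and the
    # ebook price points come from the post; the hardcover figures are an
    # assumed, representative example added only for comparison.

    agency_rate = 0.70
    for price in (2.99, 3.99):
        print("Indie ebook at $%.2f: author keeps about $%.2f per copy"
              % (price, agency_rate * price))

    hardcover_list, list_royalty = 25.00, 0.10   # assumption, not from the post
    print("Representative hardcover royalty: about $%.2f per copy"
          % (hardcover_list * list_royalty))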

Meanwhile, Borders sent the most persuasive possible signal that the shift in sales from stores to online, accelerated by the ebook phenomenon, was really damaging. They went out of business in 2011. That took the account that sold upwards of 10 percent of most publishers’ books, and a far greater percentage of the bookstore shelf space for backlist, off the board. Or, viewed another way, publishers went from two national retailers who could place a big order and put books in front of the core book-buying audience to one.

So the authors’ negotiating position was stronger and so was Barnes & Noble’s.

And all of those events — the devices, the ebook surge, the introduction of the agency business model, the Department of Justice suing most of the big publishers, a very noticeable rise in successful independent publishing, and the increased leverage of the trading partners with whom publishers negotiate their revenues and their costs — were head and body blows to the titans of the industry. Every one of them threatened the legacy practices and challenged the legacy organizations and resource allocations.

During this period, Random House (the number one publisher) merged with Penguin (the number two publisher) and created a super-publisher that is not far from being as big as the four remaining members of what were called “The Big Six” in 2007. If you are viewing the world from the perspective of HarperCollins, Simon & Schuster, Hachette, or Macmillan, that might have been the biggest development of all.

Compared to the sweeping changes of that era, what has happened since and what is likely to happen in the next couple of years is small beer. There are certainly clear trends that will change things markedly over time.

Amazon continues to grow its share, and they are around 50 percent of the business or more for many publishers these days.

Barnes & Noble is troubled but in no immediate jeopardy and is still, by far, the number one brick-and-mortar account for publishers. But the optimistic view is that their book sales will remain flat in the near future.

Independent bookselling continues to grow, but even with the gains since Borders went down, indie stores account for less than 10 percent of the sales for most publishers.

It is true that ebook sales for publishers have flattened (we don’t know the overall trend for sure because we don’t really know the indie sales at Amazon, and they’re substantial) and don’t seem likely to grow their share against print anytime soon.

These things seem likely to be as true two years from now as they are now. Nothing felt that way from 2008 to 2012.

Digital marketing, including social network presence, is an important frontier. The industry has a successful digital catalog, called Edelweiss, which has obviated the need for printed catalogs, a cost saving many publishers have captured. And another start-up, NetGalley (owned by Firebrand), has organized the reviewer segment of the industry so that publishers can get them digital advance copies of books, which is cheaper and much more efficient for everybody.

Owning and mining email lists is a new skill set that can pay off more each year. Pricing in digital seems to offer great opportunity for improved revenue, if its effects can be better understood. International sales of American-originated books are more accessible than they’ve ever been as the global network created by Ingram creates sales growth opportunities for just about every publisher. That should continue and requires new thinking and processes. Special, or non-traditional, markets increase in importance, abetted by digital marketing. That will continue as well.

Audio, which has been one of the big beneficiaries of digital downloading, will continue to grow too. The problem from the publishers’ perspective is that Audible, owned by Amazon, owns most of that market. So they have a sophisticated and unsentimental trading partner with a lot of leverage controlling a market segment that is probably taking share from print and ebooks.

And with all of this, what will also continue to grow is relentless margin pressure from the publishers’ two biggest accounts: Amazon and Barnes & Noble.

But the challenges of today aren’t about change of the magnitude that was being coped with in the period that ended five years ago. They’re more about improving workflows and processes, learning to use new tools, and integrating new people with new skill sets into the publishing business. And there are a lot of new people with relevant skills up and down the trade publishing organizations now. That wasn’t so much the case when things were changing the fastest, 2007-2012.

It isn’t that there aren’t still many new things to work on, new opportunities to explore, or long-term decisions to make. But the editor today can sign a book and expect a publishing environment when it comes out in a year or two roughly like the one we have today. The editor in 2010 couldn’t feel that confidence. The marketer can plan something when the book first comes up for consideration and find the plan will still make sense six months later. And while things are still very much in flux in sales, a blow comparable to the loss of Borders isn’t on the horizon.

Of course, there could always be a black swan about to announce itself.

This post explains why, among other reasons, I will no longer be programming the Digital Book World Conference, as I did for seven years starting with its debut in 2010. At its best, DBW anticipated the changes that were coming in the industry and gave its attendees practical ways to think about and cope with them. Future vision was a key perspective to programming although we always strived to give the audience things they could “take back to the office and use”.

It has been harder and harder over the past couple of years to find the big strategic questions the industry needed answers to. The writing was on the wall last year when most of the publishers I talked to felt confident they understood where books were going; they wanted to hear from other segments of the digital world. That was a sign to me that the educational mission I had in mind for DBW since I started it was no longer in demand.

To their credit, the DBW management, as I understand it, is trying a new vision for the show, more focused on the immediately practical and the hands-on challenges of today. I wish them the best of luck with it.

The sea change that comes with the latest iteration of the book ecosystem


In the past 10 years (since the mid-2000s), the ebook has arrived and the amount of shelf space for books in physical retail has declined, as book purchasing has continued to move to the Internet. This has put pressure on publishers’ distribution costs, as we discussed in a prior post.

In the 10 years before that (mid-1990s to mid-2000s), online bookselling began at what was, we now know, the very peak of book retailing, when the superstore chains B&N and Borders had built out hundreds of 100,000+-title stores and still owned mall chains Dalton and Walden that had many hundreds of smaller stores. And that was on top of the largest-ever network — many thousands — of independent bookstores, many of which were themselves superstores.

In the 10 years before that (mid-1980s to mid-1990s), Wall Street cash enabled the two big bookstore chains to build out their superstore networks, stocking publishers’ backlists deeply. With so many enormous stores opening, publishers received a bonanza of store-opening orders that went deep into their lists and were relatively lightly returned (until the store-opening process reversed itself 20 years later).

In the 10 years before that (mid-1970s to mid-1980s), the two mall chains (Dalton and Walden) rode the growth of shopping centers to a position of great importance in selling books to the public. They became the drivers of the bestseller lists. In the same decade, Ingram and Baker & Taylor built reliable national wholesaling networks, enabling the chains and a growing number of independents to replenish stock of sold books quickly, increasing stock turn and profitability for booksellers (and lowering returns, to everybody’s benefit).

In the 10 years before that (mid-1960s to mid-1970s), the department stores started to yield their strong position in book sales, victims both of their own structured discipline (open-to-buy rules) about inventory control that reduced their title selection and of the growth of the malls. The malls inadvertently doomed the department store concept (even though department stores were the “anchors” that made malls possible) by enabling specialty retailers of all kinds, including bookstores, to provide a better shopping experience than the department that sold those goods in the department store.

In the 10 years before that (mid-1950s to mid-1960s), an increasingly affluent society saw an ever-expanding number of bookstores while, in that era, mass-market paperbacks became ubiquitous in drug stores and newsstands, vastly increasing the number of places where Americans could find and buy books.

And the 10 years before that, which takes us back to the end of World War II, saw the birth of mass-market paperbacks and the development of modern publishing sales forces in trade houses. This was, in retrospect, the beginning of a half-century of uninterrupted growth for American book publishing. It has not necessarily now come to an end, but the growth of the segment controlled by big publishers may have ended.

What happens now? The online book market is likely more than half of the total book market. That is, books purchased online — print and digital — exceed the number sold in retail stores (obviously all print). Amazon is the single most powerful retailer, and they have also made themselves the first stop for any self-publishing or small-publishing entity that wants to reach readers. Ingram and the bigger publishers offer full-line distribution services to the most ambitious of those and to everybody else who wants to reach the whole book market.

Until the last 10 years, all the developments that affected book publishing tended to grow the availability of books relative to the availability of other media. When mall stores or superstores grew, there was no associated lift for television shows or movies (although recorded music also benefited from the malls). That is no longer the case. For the first time, really, books are competing with everything else you might read or watch or listen to in a way they never did before. Online doesn’t care what is in the file it displays. This is a qualitative difference in the nature of book availability growth compared to everything else that has happened in the lifetimes of anybody in the business.

The fact is that books used to live in a moated ecosystem, independent of what was going on in other media and book readers’ communication streams. Since ubiquitous broadband, that is no longer true. This presents publishers with two challenges they never faced before.

One is to take advantage of the opportunity to promote books to readers by tying them in to other media and events in ways that were never before possible. This is digital marketing promoting discovery. In fact, a greater percentage of the potential audience for any book should know about it within a few months of its publication than ever before, if publishers do their jobs right.

But the other is that publishers need to be alert to changes in book reading habits that are bound to occur because of integrated media. Yes, the publisher can promote the book to somebody watching a related video or reading an email on a related subject. But it is also true that promotion for movies and emails from friends can interrupt a reader in the middle of a chapter if they’re reading online. This is probably changing the way people read books and might even change how they want their books edited and shaped. Publishers who pay attention will see those changes as they occur.

We can interrupt people doing something else now to tell them about a book. But they now, in turn, can easily be interrupted while they’re reading the book. Digital change and media integration cut both ways. It is very early days for this reality. It only really occurs because of the combination of broadband and reading on an Internet-enabled device. That’s a relatively recent and still-growing circumstance. We don’t know where it will lead.
