
Marketing the author properly is a challenge for the book publishing business


A few years ago, trying to explain how books had weathered digital change differently from other media, I formulated the paradigm of the “unit of appreciation” and the “unit of sale”. The music business was roiled when the unit of appreciation (the song) became available unbundled from the prevailing unit of sale (the album). Newspapers and magazines presented individual articles that were appreciated within a total aggregated package that was the unit of sale. The ability of consumers to purchase only what they most appreciated shattered the business models built on bundling things together.

The bundling was acceptable to consumers when it was a requirement for delivery (I can’t just drop the baseball scores on your lawn; I need to deliver a whole newspaper) but often rejected when the individual content components were available on their own. (And, of course, it was even more damaging to the established media when units of appreciation like box scores became free!)

This played out in a more complicated way in the book business. For novels and narrative non-fiction, where the unit of sale equaled the unit of appreciation, simple ebooks have worked. That’s been great for publishers, since the ebooks — even at lower retail prices — deliver them margins comparable to, or even better than, what they got from print books.

But there is a big challenge related to this paradigm that the industry hasn’t really tackled yet. The “unit of appreciation” for many books is the author. And the “unit of appreciation” is also the “unit of marketing”, and therein lies the problem: the industry hasn’t figured out how to bring publishers and authors together around maximizing the value of the author brand.

Marketing requires investment. For an author, that means a web site that delivers a checklist of core functionality and appropriate social media presences, on top of what any competent publisher would do to make the individual book titles discoverable.

But authors inherently do not want publishers to “control” their personal brand, particularly when so many of them have more than one publisher or self-published material in addition to what they’ve sold rights to. And publishers don’t want to invest in marketing that sells books they don’t get revenue from or to build up an author name that could be in some other house’s catalog a year or two from now.

The net result is an industry hodge-podge. Many authors have fragmented web presences, with pages on publisher sites, sites of their own, and Google Plus and Amazon author pages that are imperfectly managed (or not filled in at all), even though those presences are actually critically important to the success of a book.

This is a problem that has no single or simple answer.

Where the solution must start is with authors (which also means agents, and indeed all writers with by-lines, whether they’re now writing books or not) recognizing that the author brand is a proprietary asset that, if properly nurtured, can grow in value over time. The value is reflected in email subscribers (to newsletters or notifications or whatever an author cares to offer that fans will sign up for), social media followings, and web site traffic. When it becomes large enough, the following becomes monetizable.

In our Logical Marketing work, we have encountered one literary agent who was focused on this. “I’m not concerned with title metadata,” s/he said. “That’s the publisher’s job. I want my authors to become list-gathering machines.” So we looked at three of the agency’s authors’ websites and made recommendations specifically addressing how to gather names. The agent is in a position to urge the authors to take the right follow-up actions.

But we’ve also found flaws in the web presences of authors that publishers asked us to evaluate. When that happens, we — actually they — often hit a brick wall. The marketing people don’t have access to the authors; those are relationships handled by the editors, often through agents. Editors don’t have the same understanding of web site flaws that marketers do, even after we explain them, and the agent-author relationships have other elements that matter more to the editor. It is difficult for a publisher, which the author signed with precisely so it would market the book, to spell out a list of tasks the author should do to market their books (or themselves). It opens what can be a difficult conversation about who should do what and who should pay for what.

In another case, we worked with a publisher that has a celebrity author (in a how-to field) who has split his publishing between our niche-publisher client and a Big Five house. The author’s own web site is a critical part of the marketing mix and it promotes the books from both publishers. When we evaluated the author’s web presence, we recommended a range of improvements that made clear a rebuilt site was required. When the small publisher and author went looking for a developer, they were hit with an estimate of $60,000 to build what they wanted. In the meantime, we have found the resources necessary to do the site for a fraction of that cost, but it still isn’t free. Who should pay for it? That remains a question.

As it happens, the author rebuilt the site for something more than we’d have charged but less than the extortionate $60,000 price. It looks fine. But it is an SEO disaster. He isn’t ranking for the most fundamental search terms relating to his books and expertise. The optimization is SO bad that his link traffic is exceeding his search traffic. So he’s got something that looks good to him but isn’t adding commercial value.

In fact, we have often seen stunningly bad author websites in our reviews, even for very high-profile and successful authors who have spent real money building their sites. Lots of video and Flash may make for a site an author finds eye-catching, but it doesn’t help them get discovered or engage their fans.

Perhaps there will never be an “industry answer” to maximizing the marketing clout of our core “unit of appreciation”: the author. But we know that every author who has more than one published piece (book or article) on the Web under their name and who has the intention of publishing more should have the following built into a web presence they control and manage:

* a list of all their books making clear the chronological order of publication (organized by series, if applicable)
* a landing page for each book with cover, description, publisher information (including a link to the publisher’s book page), reviews, excerpts, and easy-to-find retail links for different formats, channels, and territories
* a clear and easy way for readers and fans to send an email and get a response
* a clear and easy way for readers and fans to sign up for email notifications
* a clear and easy way for readers and fans to connect and share via social media
* a calendar that shows any public appearances
* links to articles about or references to the author

They must have an active and up-to-date Amazon author page and Google Plus page; that’s critical for SEO. Twitter and Facebook promotional activity might be optional; none of the rest of this is if an author is serious about pursuing a commercially successful career.

And every publisher and agent should be urging authors to see these minimum requirements as absolutely necessary, offering advice, help, and financial support whenever possible. Authors should be wary of publishers who want to “own” the author’s web presence but they should expect publishers to be wary of any author who doesn’t nurture their own.

My marketing whiz partner Pete McCarthy’s recommendation is that the authors own their websites but that the publisher run a parent Google Analytics account across author sites. That would enable them to monitor across authors, use tools like Moz to improve search (tools that would be beyond most authors’ ability to manage and understand), and provide real support to authors optimizing their own web presence. This kind of collaboration is particularly appealing because it is reversible; the author can at any point install their own Google Analytics and remove the site from the publisher’s visibility. What this takes is for a publisher to set up the “parent” Google Analytics account and make a clear offer to authors of the support it can provide. As far as we know, only Penguin Random House offers this capability, using Omniture, an analytics tool subsequently acquired by Adobe; Pete set it up a few years ago when he was there. We aren’t aware of anybody else who has done it.
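(A minimal sketch, to make the mechanics concrete: the snippet below shows one way an author page could report to two Google Analytics properties at once, the author’s own and a publisher “parent”. The measurement IDs and the dual-tagging approach are illustrative assumptions of mine, not anything Pete or Penguin Random House has specified.)

```typescript
// Hypothetical dual-tagged author page: one gtag.js stub reporting to two
// Google Analytics properties. "G-AUTHOR0001" and "G-PUBROLLUP01" are
// placeholder measurement IDs; assumes the async gtag.js loader <script>
// tag is already on the page.
declare global {
  interface Window { dataLayer: IArguments[]; }
}

window.dataLayer = window.dataLayer || [];

function gtag(..._args: unknown[]): void {
  // gtag.js consumes the raw arguments object pushed onto the dataLayer
  window.dataLayer.push(arguments);
}

gtag('js', new Date());
gtag('config', 'G-AUTHOR0001');   // the author's own property: the author keeps this data
gtag('config', 'G-PUBROLLUP01');  // the publisher's "parent" property: delete this line to revoke visibility

export {};
```

The design point is how little lock-in there is: the author’s own data never leaves their control, and cutting off the publisher’s visibility is a one-line change.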

This solution allows authors to own their own sites and email lists — ownership of email lists is a massively underdiscussed point between authors and publishers — but for publishers to have a sense of what’s going on. That means they can make recommendations about marketing, employing what is usually (and should just about always be) their superior marketing knowledge on behalf of the shared objective of selling more books.

We still haven’t made the switchover from Feedburner, our frustrating email non-delivery service. If you didn’t see the post before last about how a Google-Ingram combination could create a meaningful challenger to Amazon (and I think that’s the only way one can happen — or at least I haven’t thought of another), you should take a look.


Three words of wisdom: standards, rights, & data


The Book Industry Study Group’s annual membership meeting on Friday concluded with a panel discussion among four industry executives who have leadership roles in the group. They are also four of the sharpest minds in publishing and they all had provocative things to say. Recollection of detail is not my strongest suit and I didn’t take any notes, but all of them said things that stuck with me and that deserve more attention than they get.

Dominique Raccah, the founder and CEO of Sourcebooks, made the now-obvious (but new to me that morning) point that we are going to have to streamline generating metadata in multiple languages to take advantage of emerging global markets.

Maureen McMahon, the CEO of Kaplan, which serves a very targeted audience, recalled that five years ago she was able to track her very discrete list of competitors and closely calculate her market share. But as an information-provider, she now finds competitors can pop up from anywhere.

Ken Michaels, just appointed President of Hachette Book Group USA, reminded us that 70% of the sales are still print. He said that we need to stop talking about digital as if digital is all there is; that just as media and consumer habits are converging so must the approach publishers take to running their business. He stressed building workflows around content, not product, so you can curate and compose once for all formats, and incorporating digital as a way of life, even in publicity and marketing, rather than having any stand-alone digital workflows. In other words, it is time to integrate digital, not treat it as a thing apart.

All great insights, but what I really took to heart was some simple wisdom from Tom Turvey of Google. Turvey is spending a lot of time outside the US these days, as Google Play opens in markets across the globe. He reminds us that we are way ahead of everybody else in digital change. That means that potential markets abroad are only in their earliest stages of development. He sees that the publishers in those markets — and we as well — need to concentrate on three things: standards, rights, and data.

Standards, rights, and data. These are the three elements which can restrain digital growth, or propel it. They’d also serve as a good short summary of BISG’s agenda. Turvey took the opportunity to say that every country needs a BISG, but not every country has one.

Standards, of course, are a community endeavor. It is not for any one publishing player to create standards on their own for everybody else. If you’re powerful enough, like Amazon, it might be in your best interest not to throw yourself wholeheartedly into participation in standards that make it easier for others to compete with you. But, as publishers well know, insufficient standards can cost a lot of money when content must be rendered for different screens or for subtly different implementations of epub or Adobe formats.

The challenges with rights are, first, having them, and second, making sure a file’s metadata spells them out clearly. One of the first rules I learned when I came into publishing decades ago was “acquire rights broadly, license rights narrowly”. That was unambiguously the wisest commercial course until our current and developing age of digital delivery. Now agents (or publishers) having licensed rights “narrowly” can cause books not to be available to customers who would be happy to buy them and easily could.

Data is a combination of an industry problem and an individual publisher challenge. The digital age is presenting us all with new metrics if we can gather and use them: from websites and Twitter and Facebook, as well as from publishers’ sales. We are beginning to learn what marketing and social activities move the sales needle and we’re finding it isn’t necessarily the same for different kinds of books. BISG and AAP have joined forces to deliver BookStats, the most rational and accurate book industry sales data we’ve ever had in the US and perhaps the most accurate industry data in the world. Tara Catogge of Readerlink Distribution Services gave an eye-opening presentation earlier in the meeting of what that database can do, but we’re still at the earliest stages of learning how best to use it and we’re as blind as we’ve ever been everywhere else.

Standards, rights, and data. Publishers could benefit by reviewing their practices and progress in all three areas at a senior level on a regular basis. My hunch is that some, including the ones who joined Turvey on that stage, already do.

Two of those BISG panelists, Raccah and Michaels, are among the “innovators” presenting at our Publishers Launch Conference next Monday, 10:30-6:30, at the Frankfurt Book Fair. Dominique will be talking about two new initiatives from Sourcebooks and Ken will be explaining the value of SaaS — software as a service — to modern publishing IT departments, including some tools his team at Hachette has developed and is making available to the industry. Pub Launch Frankfurt will also feature a presentation from Noah Genner, who runs BookNet Canada — their version of BISG — about a survey of Canadian book consumers they’ve just done: more about data.


Tech companies need to look like they understand publishing, which they don’t always do


I showed up Tuesday morning at the gorgeous Cipriani restaurant and ballroom on 42nd Street for The Future of Publishing Summit, not knowing what to expect. I had been invited to attend by an email last month which promised an interesting program (lots of big tech companies plus a book publishing “track” led by the always-interesting Carolyn Pittis of HarperCollins) at an all-day conference. I was invited because of my status as a “thought leader”; an all-day event like this with no fee is not unheard of, but it also isn’t common. I accepted.

Then when I heard from my friend Evan Schnittman of OUP over the weekend that he’d be going, I decided I should look at “what is this” more carefully. So I went to the web site for it and I found it almost impossible to figure out who was staging this thing and what they hoped to get out of it. My prior experience with free events — many I helped organize that were run by VISTA Computer Services (now renamed Publishing Technology) in the 1990s and several since hosted by MarkLogic — tended to have the organizer highly branded and visible. This one was opaque. “About us” on The Future of Publishing web site described the conference, the agenda, and the goal of “setting the agenda for publishing’s new business model amid digital disruption”, and it led to a link listing the sponsoring companies. But nowhere did it say, “I’m the organizer of this event and this is why I want you there.”

When I got to Cipriani in the morning, I started to see some people I knew: Evan, David Young and Maja Thomas from Hachette, Peter Balis from Wiley, Dominique Raccah from Sourcebooks. “What is this about?”, I asked them. “Who is behind this?” Nobody really seemed to know.

As the day developed, it seemed that the two parties in charge were Tim Bajarin, President of Creative Strategies, and Colin Crawford, former EVP Digital at IDG Communications, Inc. Bajarin kicked off the session recalling a critical meeting at UCLA in 1990 that really charted the course for CD-ROM development.

Uh oh, I thought. I wonder if these guys know what “CD-ROM” calls up in the mind of anybody in the room who was in trade publishing in the 1990s.

What I had walked into took me back to the early 1990s when I went to a conference very openly sponsored by Microsoft for book publishers. The message then was, “here are the amazing things we are going to be able to do with CD-ROMs in the very near future. To realize the true value of this technology, we need content. We’re not sure exactly how you make money from the content, but, hey, guys, get creative.” And, in fact, that was the message that the five key sponsors of this Summit — Sony, Adobe, Marvell, Qualcomm, and HP — had for their publishing audience.

This was the takeaway. Consumers are going to be navigating their content on faster, smarter, lighter, and cheaper devices that will open up more flexible and robust content delivery and consumption models. Publishers should take advantage of this! But “taking advantage” in this case often meant “more sound, more pictures, more video”. And that recalls the veritable disaster of CD-ROM development for book publishers: largely uncontrolled spending in development of new kinds of products, ostensibly but loosely rooted in books, that had no established market and never found one. The iPad had already unleashed several sparks of enthusiasm for enhanced ebooks; this conference wanted to pour fuel on those sparks and start a real fire burning.

The format of the day was that each of the primary sponsors got a half-hour to present their technology, following 30 minutes from Tom Turvey of Google on the forthcoming Google Editions. (Turvey joked about the fact that he had given the presentation to just about everybody in the room before in their office or his.) I’d say that most of the 30-minute presentations packed at least 5 minutes of useful information into them. There were definitely people buzzing about the fact that Adobe has a workaround to enable Flash-like content on the iPhone, which doesn’t support Flash. We all got the message that connectivity will be more robust and more routine; that both LCD color and e-ink (and before long, color e-ink) will be available in a staggering number of devices (or “form factors”).

With all that capability in your hand, you can pull up just about any content you want. “Why would you read a plain old book?” was certainly part of the message.

Then after a really terrific lunch, about half to two-thirds of the audience (I’d reckon; I couldn’t really see because we were broken into three groups in different rooms for books, magazines, and newspapers, and no more than a fourth of the audience was there for the final part of the program after the breakouts) remained to hear the content-based presentations. The intention here was “the tech guys will explain what’s coming in the morning; the publishing guys will explain where they are in the early afternoon; and then our experts will ‘pull it all together’ at the end of the day, allowing us to leave with a new plan for publishing.” The “experts” were additional sponsors, of course, and creators of tools or platforms for products or presentation: Zinio, Notion Ink, ScrollMotion, Vook, and Skiff. These are all very worthy companies with substantial propositions that have made real inroads working with established media.

But are they qualified to chart a commercial course forward for complex publishing enterprises? Frankly, I don’t think so.

Michael Cader said privately on Monday that he had joined Conferences Anonymous. He wasn’t going. Admittedly, these guys had a rough row to hoe trying to tell people something new following on the heels of Digital Book World in January, Tools of Change in February, Pub Business Conference and Expo earlier in March, and an ABA meeting on digital change in between. People who are really junkies for this stuff were out at SXSW, which apparently also didn’t seem as revelatory to some savvy book practitioners as it did last year (or so said my buddy from the Microsoft conference two decades ago, Lorraine Shanley).

My sense of this one was “nice try”, but it didn’t work. The superficial logic of putting the tech and publishing people together, laying out the picture from each side and then coming up with “answers” within a single stimulating day is appealing, but it is ultimately impractical. Book publishers (and, I suspect, other publishers as well) aren’t going to do much today based on what they see tech might deliver two or four years from now. And book publishing isn’t one business anyhow. As Turvey of Google, who understands the publishing business better than any other tech company representative I know and, frankly, better than most publishers, spelled out in the beginning: “book publishing is about five different businesses that don’t have much to do with each other.” We in publishing know that very well. Tech companies that want to get our attention need to make clear that they know that too.


The ebook TTS argument goes on


Random House came in for some ridicule last week because they have apparently disabled TTS on ebooks they are giving away for free. I see this piece as nothing more than a cheap shot. Random House responded to the Authors Guild position opposing TTS by attempting to disable it for the Kindle 2, as, we believe, other publishers will if it can actually be done. If they are concerned about the authors’ wrath when the capability is on ebooks that were sold and on which the authors earned royalties, of course they’ll disable it on the ones they give away too. What confirms this piece as a cheap shot is that there is no evidence presented that any other publisher takes a different position. Why single out Random House?

The author of another piece on the same subject is very gentle about the efforts “on behalf of authors” to block text-to-speech technology for ebooks, and on the Kindle 2 in particular. The authors’ position (to the extent that the Authors Guild and those literary agents who are opposing TTS actually represent the authors’ position) is just wrong. There is no evidence that any significant number of consumers buy books in multiple forms (the three main choices being printed, e-text, and audio). Even people who do both read and listen don’t tend to buy the book in two forms to enable that; they read some books and listen to others. Similarly, people who read both print and digital don’t try to do both with the same book. (What’s my evidence? Observation. But nobody has offered the least bit of evidence to the contrary and I haven’t met anybody yet who says “you aren’t talking about me.”)

So, in fact, enabling a digital file to serve two purposes would only increase sales by offering extra value. If that’s right (and it has at least as much chance of being right as the notion that there is cannibalization), blocking TTS is costing publishers sales and costing authors royalties.

I made the argument when this first came around three months ago that TTS capability will be ubiquitously available, so people will be able to apply it to any text they have. All Kindle 2 does is make it a bit more convenient. So this position is a fail on several counts. The fact that it is handicapping the handicapped is contemptible. The fact that it is denying authors and publishers revenue when it is supposed to be protecting them is just dumb. And standing in the way of applying developing technology to the benefit of all writers and readers can’t possibly be a sustainable position.

We did a quick check in this office for TTS apps. I think the Authors Guild and the agents should check these out.



Are they planning to sue the consumers who acquire and use these apps? Are they really going to add to the burden of ebook publishing the need to find ways to lock up the text against all these technologies?

Thanks to all of you who viewed the Shift speech over the past weekend. It is disappearing from our site but is replaced by a link to a new annotation platform from our client SharedBook. If you have thoughts on the speech, that’s the place to express them. There are browser limitations to that platform which are posted with the link.


A few thoughts, some near heretical, about DRM


I got a call today from Laura Sydell of NPR in San Francisco to have a conversation about DRM. I found myself telling the story this way.

From the beginning, there were multiple ebook formats, the leading ones for a time being Adobe, Palm, and Microsoft’s .lit, with Mobi originally intended to be the format that bridged the gap between devices. Then Amazon smartly took Mobi out of play, blocking anybody else from peddling a device-agnostic solution. And now we have e-readers…

From the beginning, there has been a reluctance of people to read BOOKS (goodness knows they read many other things) on screens, or at least on the screens that were presented to them for the purpose. This distinctly separates the book business from the music business, which I know I wrote about last week, but which also applies here. Your ears don’t care whether the speakers or headphones got the sound from a download or a record. It all works the same to you. But, as we all know, reading on a screen for most people is a sufficiently different experience from reading on paper that they’re likely to have an opinion about it (often whether they’ve actually tried it or not).

From the beginning, some people in the book business (mostly, I suspect, agents for very big authors and their publishers, who have the most at stake) have been concerned that there would be a spread of unauthorized digital copies if they didn’t “protect” them. They were apparently learning a lesson from the music business. But the music business was “stuck.” The format they sold music in, the CD, was already a “gold master.” They were distributing perfect digital copies with every sale.

From the beginning, there has been a romantic notion called “interoperability”, which says it is a wonderful thing if the same file can work on lots of different devices. So you should be able to read the book on your PC, or on your Sony- or Kindle-like device, and on your iPhone and/or Blackberry and your Sony PlayStation, for that matter. Believe it or not, there are not only quite a few of the publishing digerati who think this is very important, there are many who actually blame the slow growth of the ebook market on the fact that the industry hasn’t managed to deliver it. (Seems preposterous to me.)

The multitude of formats presented costs and hassles to the publishers. They had to do more work to put each book in shape for each format, and they had to do pretty meticulous quality control because a lot could go wrong. With ebooks not selling much at all, the difference between spending, say, $250 to convert one book to one format (starting with a PDF print file) and adding $50 or $100 more for each additional format created a whole decision-making cascade. This all choked off books from the ebook stream, in one format or another or at all, as publishers needed to “decide” to publish each book in one or more formats.

The multiple systems also prevented interoperability and restrained piracy. The DRM was actually a bit of window dressing; even unprotected files wouldn’t have traveled very far.

But then the industry, through the IDPF (International Digital Publishing Forum), developed the epub standard, a format that could be read by many different systems and/or converted inexpensively to other systems. So the publishers could provide just one file, the epub file, and the distribution channels could do the conversion to different formats. A giant step toward interoperability (and efficiency).

So now DRM is the one remaining barrier to interoperability, and the drumbeat to get rid of it gets louder and louder.

Also from the beginning, people have noticed that, in most cases, the more of a book you give away digitally, the more you sell. This would almost certainly not be the right strategy with high-value scientific reference, or a directory, but it is the experience of many people over a long period of time. Tim O’Reilly has famously pointed out that obscurity is a much more prevalent problem for books and authors than theft through piracy. Cory Doctorow is certainly the most vociferous and among the most eloquent in expressing contempt for the whole idea of DRM, the insult it constitutes to the audience of book readers, and its self-defeating nature. He has given away huge amounts of digital content and he credits doing so with growing his sales as a novelist.

My officemate and colleague Brian O’Leary of Magellan Media has been doing an ongoing study of the effects of free distribution with O’Reilly Media and Random House. They are documenting both that there is no significant piracy of ebooks and that free distribution, even the limited piracy, seems to have a stimulative effect on sales.

We are at a moment where publishers are noticing this and taking it on board. O’Reilly and Thomas Nelson are the first I’ve noticed to start offering ebooks in multiple formats, with Nelson doing so for any buyer of a print book who registers on their site for it. (A nice way to capture names, too.) Others, notably Hachette’s unit Orbit, and Random House, have started giving away ebooks (for free or, in Orbit’s case, a buck or near-free) to promote books and authors. The ROI on these is close to infinity if it sells one more book!

I hope that this is an accurate summary of events so far, except that I left out the Kindle (on purpose). Now I’d like to offer some forward-thinking and observe an enormous irony.

1. Forward-thinking. This notion of giving away ebooks has a tragedy of the commons built into it. It’s free and it works. So everybody’s going to do it. The choice of ebooks you can legitimately download for free or under a buck will grow by leaps and bounds (it already has). At just the moment that the ebook market is growing, and lots of new people are coming into it, many people will be able to form the habit of choosing from what is free or near-free. Ultimately, this will have two negative effects. One is that it will depress the pricing across all titles. And the other is that the giveaways will lose their stimulative effect.

I would not suggest that anybody voluntarily try to save the commons. It would not be in their own best interests to do that and they would not succeed. 

2. Because there is going to be a culture of free or almost-free, piracy might well become an issue for the most popular ebooks as takeup of ebooks grows. It clearly has never been a problem, but that doesn’t mean it never will be. Things change. (See number 1.)

3. The Kindle. Amazon not only steered clear of the epub collaboration, they are aggressively blocking people from selling content that would be compatible with the Kindle. Everything about what they do is closed. The problem is that they’re defying history so far: growing faster with a closed system than all their competitors for ebook eyeballs combined.

That’s ironic.

But it’s not what’s most ironic.

I personally never got the thing about interoperability until now, when I am reading the great new biography of Abraham Lincoln by Ronald White on both my Kindle and my iPhone. Whenever I switch over from one to the other, it knows my place and asks me if I want to advance to it. This is great! I love interoperability. I have no use for it between any other two devices, but between my Kindle and my iPhone? Terrific!

Of course, Amazon is probably able to deliver this functionality so seamlessly partially thanks to the fact that they have a closed system and more control.

That’s really ironic.
