With the exception of a handful of companies worldwide that chase big books with big advances, most of the money publishers spend is spent on printing. Even though many publishers write off overenthusiastic author advances, most of the money publishers lose is lost printing books that were never needed. And another chunk of it is lost financing inventory that is printed well before it is needed. With the pressure on publishers to be more efficient growing daily, getting the printing decisions right, or at least getting them better, should be a priority for every publisher.
Fortunately, two of the biggest barriers publishers face to making more profitable printing decisions are of their own making. I say “fortunately” because barriers of one’s own making can readily be removed by those who see the light and choose to remove them. The third biggest barrier can now be lowered considerably, if not removed, by the use of data readily available to anybody who will systematically capture and organize it.
The three barriers are:
1. Unit cost accounting, which makes publishers misread the financial impact of their own decisions.
2. Printing too many copies in the hope of avoiding a “too-small” reprint, which is at least partly due to an underappreciation of what short-run and print-on-demand technologies can do to recover sales that publishers fear will be lost if their first printings are too small.
3. Inadequate information on which to base a sales forecast, which might even better be called an “inventory needs forecast”, and without such a forecast, no method of determining a printing quantity can make any sense.
Let’s take them one at a time.
“Unit cost accounting”, where a publishing enterprise’s finances are managed at the granular level of the individual copy, calculating a cost basis for each one, has been a standard of our industry for a long time. Its origins are reasonable. Accountants require a unit cost for each book as it is sold to keep proper track of a company’s financials to pay taxes and inform shareholders. There’s no problem with that. The problem comes when management starts to use that unit cost number to manage the business, particularly to determine the printing quantity.
The inherently misleading aspect of the unit cost is often compounded by publishers adding in some unit assessment for the book’s plant cost and another one for overhead. This elevates a small logical error to the level of an absurdity. There are a number of reasons why:
- It tempts publishers to think that the cost of printing is incurred when the book is sold. That isn’t true; it is incurred when the book is printed. And this is true from two critical standpoints: first, the timing of the expenditure and second, the irrevocability of the expenditure.
- It takes plant costs and operating overheads that will be recovered over an unknown number of books sold and assigns them somewhat arbitrarily to the books in the printing being contemplated.
- It ultimately assumes that the cost of books printed equals the cost of books sold, which is only true if all the books which are printed are sold. If that happened every time, we would have fewer concerns about the technique. But it doesn’t.
- It completely ignores the use and value of cash, which is not only the most limited resource for most publishers, but also the one resource which, when depleted, threatens imminent doom.
- It suggests that costs and risks go down with a larger printing when, of course, they actually go up.
Unit cost analysis is usually rolled into another fallacy, the “title P&L.” The “title P&L”, unlike the fictional “unit cost”, is not driven by the accountant’s need to figure profits or taxes. Rather, it is used as part of the publisher’s attempt to assess the worthiness of a project under consideration for purchase. In fact, it is almost never used to assess a project after it is published. Publishing companies as a whole make and lose money and need to be seen that way, but it fosters misunderstanding to see individual projects as having a profit or loss. The world makes more sense, and you are being more precise, if you see them as either contributing to overhead and profit or failing to.
The alternative to “unit cost” analysis and the “title P&L” is what we would call, with apologies to most of the world’s currencies, “whole dollar accounting”, with a focus on each title’s contribution toward covering overhead and producing profit.
The technique is pretty simple. You figure how many you’re sure you can sell and how much revenue those sales would produce. You then figure how many you’re going to print and subtract that entire cost, plus the other expenditures directly attributable to the book — advance, additional royalty if applicable, plant costs, and, perhaps, title-specific editorial development and marketing expenditures — and see where that simple subtraction leaves you. If the result is negative, the book is probably a bad risk. If the number is positive, the project is contributing to overhead and profit.
And, of course, if you decide to increase the number of books you print without increasing the number of books you budget to sell, your subtraction will leave you with less, which is an accurate reflection of what would happen to your company’s profits under the revised scenario.
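The whole-dollar subtraction described above can be sketched in a few lines of code. All of the figures below are hypothetical illustrations, not data from any publisher, and the function is my own construction:

```python
def title_contribution(expected_sales, net_revenue_per_copy,
                       print_quantity, unit_print_cost,
                       advance, plant_costs, other_direct=0.0):
    """Whole-dollar contribution toward overhead and profit."""
    revenue = expected_sales * net_revenue_per_copy
    # The FULL cost of every copy printed is subtracted now, because
    # the money is spent now, whether or not the copies ever sell.
    printing = print_quantity * unit_print_cost
    return revenue - printing - advance - plant_costs - other_direct

# Print roughly what we budget to sell (illustrative numbers):
base = title_contribution(8_500, 9.00, 8_500, 1.80, 10_000, 5_000)

# Print 11,500 at a better unit cost, still budgeting 8,500 sales:
overprint = title_contribution(8_500, 9.00, 11_500, 1.60, 10_000, 5_000)

print(round(base), round(overprint))
```

Run both ways with the same sales budget, the larger printing produces a smaller contribution despite its better unit cost, which is exactly the point of the paragraph above.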
Unit cost is not the only consideration that drives overprinting. So let’s look at the second barrier to good printing decisions: the fear of facing a too-short, too-uneconomic next printing. Since we have been doing more and more supply chain work, we have been stunned by the number of publishers who do printings for 2- or even 3-year sales expectations. We even have found one publisher that prints for seven years!
How long should a publisher print for? There are two answers.
For those books which are highly predictable and in no danger of sudden demise — the most stable, long-term backlist — it is best to do an EOQ, “economic order quantity” calculation. This calculates the balance between the unit-cost savings associated with larger quantities and the interest-related (and, if necessary, space-related) cost of holding inventory for a longer period of time. When there is no danger that the books being printed will become “obsolescent” inventory — books not needed at all because of a drop in demand for the title — the EOQ is the path to the most profitable printing decision.
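The classic EOQ formula balances exactly those two costs. Here is a minimal sketch, with all of the inputs (annual demand, setup cost per printing, holding cost per copy per year) as hypothetical figures:

```python
import math

def eoq(annual_demand, setup_cost_per_printing, holding_cost_per_copy_year):
    """Economic order quantity: the printing size that balances the
    per-printing setup cost against the cost of carrying inventory."""
    return math.sqrt(2 * annual_demand * setup_cost_per_printing
                     / holding_cost_per_copy_year)

# e.g. a steady backlist title selling 6,000 copies a year, with
# $1,200 of setup cost per printing and $0.40 per year to finance
# and warehouse each copy held:
quantity = eoq(6_000, 1_200, 0.40)  # about a year's supply
```

Note that the formula only makes sense when, as the paragraph above says, demand is stable and there is no real risk of the title dying on the shelf.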
But most books are not so reliably forecastable. The biggest risk with most books — demonstrated on every remainder table — is that the copies we are rushing to print today will never be needed at all.
That fact argues for a strategy of printing only as many as the publisher is quite certain will be needed to fill orders. For a first printing, that would have to be enough to cover the initial distribution, plus an overage to allow time to watch sellthrough and fill reorders before it becomes necessary to print again. Knowing what that overage needs to be is certainly the tricky part and, in fact, it is always a bit of a guess. The guess needs to account for how much time is required from ordering a reprint to receiving one as well as how volatile a publisher thinks, or hopes, the demand for the title can be.
Let’s explore why a “needs-based” rather than “cost-based” approach to the question makes the most sense. And we’ll demonstrate why overprinting today to avoid possibly not needing enough for a reprint tomorrow is a bad tactic.
Consider the publisher who can see a need for 6,000 copies to cover an advance and figures that the first 1,500 copies of additional demand will allow her to gauge whether the book will “work” or not. She wants another 1,000 copies to cover the time it will take from ordering a reprint to getting it delivered. So, by our advice, the correct printing would be 8,500 copies.
But her production manager says that printings of fewer than 3,000 copies will fail to “make margin” according to company rules. What if the evidence seen in the first 1,500 copies of additional demand suggests that the aggregate total need will be 10,000? So, the argument goes, wouldn’t it make more sense to print 10,000 in the first printing, get the benefit of a better unit cost on a single printing, and not risk needing an uneconomic printing and going out of stock while waiting for the reprint?
Of course, all of these decisions are based on scenarios constructed by guesswork. So let’s consider two more, which are just as likely. What if the publisher does print the 10,000, but the total aggregate demand turns out to be 11,500, or 12,000? Now, because she printed the initial 10,000, she doesn’t have demand of 3,000 for the second printing, which she would have had if she’d printed 8,500. And, on top of that, she took a greater-than-necessary risk on the first printing!
Or another alternative: what if aggregate demand tops out at the 6,000 that was sold to customers as advance orders? If that happens, printing 10,000 increases your obsolescence from 2,500 copies to 4,000, which is a 60% increase in waste!
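The arithmetic of these scenarios is simple enough to check directly. This sketch just replays the numbers from the example above:

```python
def reprint_need(aggregate_demand, first_printing):
    """Copies still needed after the first printing sells through."""
    return max(0, aggregate_demand - first_printing)

def obsolete_copies(aggregate_demand, first_printing):
    """Copies printed that were never needed."""
    return max(0, first_printing - aggregate_demand)

# If total demand reaches 11,500: the needs-based 8,500 printing
# leaves an economic 3,000-copy reprint, while the 10,000 printing
# leaves an "uneconomic" 1,500.
assert reprint_need(11_500, 8_500) == 3_000
assert reprint_need(11_500, 10_000) == 1_500

# If demand tops out at the 6,000 advance sale: overprinting raises
# obsolescence from 2,500 copies to 4,000, a 60% increase in waste.
waste_needs_based = obsolete_copies(6_000, 8_500)   # 2,500
waste_overprint = obsolete_copies(6_000, 10_000)    # 4,000
```

Either way the demand breaks, the larger first printing either destroys the economics of the reprint or inflates the pile on the remainder table.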
Of course, both the danger of a temporary out-of-stock and the problem of needing additional copies that would be uneconomic to print can be solved for one-color books, particularly those with high markups like professional books, by using digital technology. In the earlier example we discussed, a publisher might just accept lower margins, printing 1,500 instead of 3,000. But there is a point below which that wouldn’t work. If an additional 200 copies were needed, or 100, or 50, then no offset printing strategy could work except consistently overprinting the first time, which, as we have demonstrated, only moves the point of peril; it doesn’t eliminate the possibility of needing an additional quantity that would be seen as uneconomic to print.
Digital printing for short runs, even print-on-demand runs of one copy, is a compelling tool, probably a mandatory component of the title lifecycle for all one-color professional books. Those are the books where sales lost to out-of-stocks are most costly. For four-color books, there are questions of quality control that make using POD less automatic. And operating on trade book margins usually would require a different royalty structure than for a normal print run to make the proposition work economically and, even then, a price increase might also be required.
Because digital printing for POD or short run works from the same PDF files as the offset printers do, “setting up” books for POD should already be routine for one-color professional books.
Whether you follow our advice, which is to print only what you are confident you will need, subject to the limitations of an EOQ decision, or whether you follow some other rule like printing for a year, or two years, or even seven, you still need to forecast needs to know what to print. So now let’s turn to the data that supports the guesswork that underlies every printing decision made by any rational technique: the inventory-needs forecast.
For first printings, the key is to get the orders in! As the account base in most markets has consolidated, much of the advance sale on many books is coming from a smaller and smaller group of accounts. Those accounts tend to be very diligent at getting orders in on big books from big publishers, where promotional space is allocated and there is a clear perception that sales can be lost if books aren’t in place from Day One. In fact, a big danger for publishers here is that being late with new books can result in advance orders being cancelled and, on the smaller books, they might not be reinstated. This can be a cause of inventory obsolescence as well.
For smaller publishers and smaller books, though, time pressure on the buyers may result in the order being delayed, sometimes past first shipment date and often past the date when the printing needs to be ordered. I am a bit surprised that no publisher has yet offered an incentive to its customers for timely delivery of those initial orders, although the scheme would have to somehow exclude the big books that the big accounts order on time anyway. No doubt, the account needs adequate time to make its decision, so the publisher making that offer would also have to provide information in time for the account to be able to take advantage of it. But a deadline for getting the order in that had some extra discount or a payment extension attached to it would seem to be a sensible and mutually margin-enhancing proposition.
For subsequent printings, the key is to know inventory status in the supply chain! At most publishers, this is done by doing a lot of individual title research when a reprint is needed. The technique employed by most publishers to do it is so labor-intensive and time-consuming that it is only done for the biggest books. Building what we call a Supply Chain Tracker takes some time and some effort if it is done in-house, but the payoff in reprints postponed, reduced, or avoided can be substantial.
What a publisher needs to build a Supply Chain Tracker is regular, preferably weekly, data feeds from the accounts holding the most inventory. The key data points to capture for each ISBN are the current quantity on-hand, the current quantity on-order, and sales or demand for the past 52 weeks and for the past 4 weeks.
With the demand data, a rep who knows the book and the account should be able to do a pretty reliable estimate of the account’s need for the next 52 weeks. With the inventory data, management can see quickly whether more copies will be needed soon, or ever, and whether there is excess stock that might be called back. Depending on the publisher, between 5 and 10 feeds should provide the needed information for more than 80% of the inventory on more than 80% of the titles.
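As a sketch of what one tracker record might store and compute: the field names and the weeks-of-supply heuristic below are my own assumptions for illustration, not a standard feed format:

```python
from dataclasses import dataclass

@dataclass
class AccountFeed:
    """One account's weekly data for one ISBN."""
    isbn: str
    on_hand: int      # copies the account currently holds
    on_order: int     # copies the account has on order
    sales_52wk: int   # demand over the past 52 weeks
    sales_4wk: int    # demand over the past 4 weeks

def weeks_of_supply(feed):
    """Rough weeks of cover, preferring recent (4-week) sales velocity."""
    weekly = feed.sales_4wk / 4 if feed.sales_4wk else feed.sales_52wk / 52
    if weekly == 0:
        # No demand at all: stock may be a candidate to call back.
        return float("inf")
    return (feed.on_hand + feed.on_order) / weekly

feed = AccountFeed("9780000000000", on_hand=120, on_order=40,
                   sales_52wk=520, sales_4wk=32)
cover = weeks_of_supply(feed)  # 160 copies at 8 per week = 20 weeks
```

A weekly archive of records like these, across the five to ten largest accounts, is what lets management see reprint needs coming rather than researching them title by title.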
Most (but not all) of the biggest US trade publishers have created “Supply Chain Departments” which are gathering data across the supply chain as we suggest here. Taking these feeds and archiving them creates a tool that enables any publisher to better control printings and reduce the manufacturing of books that will never be needed. Doing that creates profit from money that was going down the drain. No publisher today can afford not to.