Someday My Printz Will Come

How Many Stars Does it Take to Catch a Printz?

A guest post by Elizabeth Fama (YA author) and John Cochrane (Professor of Economics), with heroic data collection by Jen Baker (Librarian).

[Note from Karyn: Usually when someone is kind enough to write a guest post, I labor over a worthy introduction. But true to her detail-oriented self—see post, below—Elizabeth wrote her own introduction. So I’ll just say that we apologize for the delay between posts, but there was this thing known as ALA. We’ll be back on track in the upcoming weeks with a writeup of the RealCommittee celebrations from Sophie, and then more on that pesky series issue. For now, please enjoy the amazing statistical guest post below. I love empirical data!]

Back in her April “Reading, Reading, Reading!” post, Karyn said, “Remember that any book with three or more stars [from the six major review journals] is automatically a contenda,” leading me to ask in the comments, “Is there an empirical rationale for considering 3-star books auto-contenders? Has the Printz (including honor books) statistically gone to books with multiple stars, or is this just a handy way of forming our reading list?” Anecdotally it didn’t seem true. Last year, for example, Chime by Franny Billingsley earned six stars but no major awards, and Where Things Come Back by John Corey Whaley earned only one star, but took home both the Printz and the Morris.

In the comment section of that post and others, we all offered our views of why stars and Printz awards might not match up, but I wanted to see exactly how much they didn’t match up. And so the lovely Jen Baker, who is equally fascinated by quantifying children’s literature, compiled a spreadsheet with the starred reviews that all thirteen years of Printz Winners and Honors earned (or didn’t) from the six journals. I enlisted the help of my economist husband to crunch the numbers and create the charts.

If you’re a numbers nerd, stick around for some fun statistics. If you’re not, skip straight to the conclusion at the bottom.

The Data: How many starred reviews each Printz Winner and Honor received from 2000 to 2012 from each of the major review journals: Booklist, The Bulletin of the Center for Children’s Books (Bulletin), Horn Book (Horn), Kirkus, Publishers Weekly (PW), and School Library Journal (SLJ).

How Many Stars Do Printz Books Typically Win?

Our first chart shows the percentage of Printz Winners and Honors that received one, two, three, four, five, or six stars. Clearly, a contenda doesn’t have to have three stars. Yes, six Winners did get three stars, and one (Postcards from No Man’s Land) got five. But six Winners also got only one or two stars.

Similarly, the Honors books are about as likely to get more than three stars as they are to get fewer than three stars. Two Honors have been won with zero stars (The Earth, My Butt, and Other Big Round Things and Repossessed), but so far no book has won the gold without garnering at least one star.

In nine out of thirteen years, at least one of the Honor books has received more stars than the Winner. The extreme case is perhaps the six stars of Why We Broke Up, which received an honor, while Where Things Come Back won with one star. (See also Octavian Nothing’s six stars compared with Jellicoe Road’s two.)

Are Some Journals’ Stars More Informative Than Others?

First we want to clarify that this question only refers to predicting the Printz awards, not to the inherent value or “correctness” of each journal’s star choices. We assume that the journals have their own audiences, and different objectives in making their recommendations, and that their stars are probably not there to predict Printz awards.

That said, how well can we use the journals to forecast Printzes? The next plot shows which journals gave stars to the Winners and Honors:

For example, looking at the leftmost bars, you can see that Booklist gave a star to 53% of the Printz Winners and 60% of the Printz Honors.

The graph suggests that stars from Booklist signal a future Printz award the most, with Bulletin and Horn the least, and Kirkus, PW, and SLJ in between. Kirkus stars are poorly correlated with Winners, but do fairly well with Honors. SLJ does as well as Booklist in predicting Honors.

We say “suggest” because there are some caveats here. The first is: beware of averages of small numbers. The little blue brackets next to each bar are the standard errors. They’re a good guess of the band of uncertainty in these numbers. The Winner statistics are a lot less certain than the Honor statistics, because there are so few Winners.
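As a back-of-envelope sketch of that uncertainty band, the standard error of a proportion shrinks as the sample grows. The 53% and 60% Booklist figures are from the chart above; the count of roughly 50 Honor books over thirteen years is an assumption for illustration:

```python
import math

def prop_se(p, n):
    """Standard error of a sample proportion: sqrt(p * (1 - p) / n)."""
    return math.sqrt(p * (1 - p) / n)

# 13 Winners (2000-2012) vs. roughly 50 Honor books (assumed count)
se_winners = prop_se(0.53, 13)
se_honors = prop_se(0.60, 50)

print(round(se_winners, 3))  # about 0.138, a +/-14-point band
print(round(se_honors, 3))   # about 0.069, roughly half as wide
```

With only thirteen Winners, the band around any Winner percentage is wide enough that most of the journal-to-journal differences could be noise.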

The second caveat is that we only know the number of Winners and Honors that got a starred review from, say, Booklist. We don’t know the converse: the percentage of total Booklist starred reviews that won a Printz. (Ideally, we’d like to have data on how many starred reviews each journal gave each year, including books that didn’t win awards.) As an extreme example, if Booklist gives every book a star, then all winners would have a star but the star would not be a useful forecast of winners.

To infer from our graph that a Booklist star is a better predictor of a Printz than a Bulletin star, you have to assume that Booklist and Bulletin review about the same number of books each year, and each journal gives out roughly the same number of stars. Now, that may even be approximately true, but we don’t have the data, and it’s an important qualification.
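A toy calculation makes the missing-denominator point concrete. This is Bayes’ rule, and every star total below is invented for illustration, not data from our spreadsheet:

```python
# P(Printz | star) = P(star | Printz) * n_printz / n_stars  (Bayes' rule)
# All star totals here are invented for illustration.
n_printz = 13 + 50  # winners plus an assumed honor count over 13 years

def printz_given_star(p_star_given_printz, stars_per_year, years=13):
    """Chance that a given starred book goes on to a Printz nod."""
    total_stars = stars_per_year * years
    return p_star_given_printz * n_printz / total_stars

generous = printz_given_star(0.55, 300)  # journal that stars many books
choosy = printz_given_star(0.55, 80)     # same hit rate, far fewer stars

print(choosy > generous)  # the choosy journal's star is more informative
```

Two journals can star Printz books at exactly the same rate, yet a star from the stingier journal carries far more predictive weight.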

How Much Agreement or Disagreement is there Between the Journals?

The top left plot answers the question, “Suppose Booklist gives a starred review to a Printz-award book. How often do the other journals also star that book?” The answer is remarkably low. Bulletin and Horn star such a book only about 15% and 25% of the time, respectively. Kirkus, PW, and SLJ are more likely to agree, but still only about half the time.

The reverse question gives a different answer (top right). If Bulletin gives a starred review, Booklist agrees about 50% of the time, as does everybody else. Similarly, a Horn Book star (middle left) means a 60% chance of a Booklist star, and about even odds for everyone else.

SLJ is another interesting case (bottom right). Booklist starred the same Printz titles that SLJ starred 60% of the time, but the others (and especially Bulletin and Horn) did not. Yet SLJ agreed with Bulletin’s and Horn’s picks.

The simple interpretation is that Booklist and SLJ are less choosy. However, that’s too superficial. They may be starring books for specific audiences that Bulletin and Horn do not share.
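The asymmetry in these plots is just conditional probability: P(Booklist star | Bulletin star) and P(Bulletin star | Booklist star) share a numerator (the books both journals starred) but divide by different denominators. A minimal sketch, using invented star sets:

```python
# Hypothetical sets of Printz titles each journal starred (illustrative only)
booklist = {"A", "B", "C", "D", "E", "F"}
bulletin = {"A", "B"}

both = booklist & bulletin  # titles starred by both journals

p_booklist_given_bulletin = len(both) / len(bulletin)  # 1.0
p_bulletin_given_booklist = len(both) / len(booklist)  # about 0.33

print(p_booklist_given_bulletin, round(p_bulletin_given_booklist, 2))
```

If Booklist simply stars more of the Printz titles than Bulletin does, Booklist will look agreeable from Bulletin’s side and Bulletin will look contrarian from Booklist’s, even with identical taste.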

What Genres Win the Printz?

The next chart is unrelated to stars, but breaks down the Printz awards by genre. The categories, shortened a bit in the graphs, are Realistic or Contemporary Fiction (9.5 winners, 25 honors), Fantasy (2, 11), Memoir or Biography (1, 5), Historical Fiction (0.5, 7), and Poetry/Non-Fiction (0, 1). Note that classifying books is a slippery business; for instance, sometimes realistic fiction has a hint of fantasy (Kit’s Wilderness), or a fantasy is set in a historical period (Monstrumologist), or realistic fiction is told through poems (Keesha’s House), etc. If you’d like to see what genre we assigned to each book, that data is posted on Elizabeth’s blog.

Clearly, the vast majority of Printz awards go to realistic or contemporary fiction, with the others picking up the pieces in order. In general, categories other than realistic or contemporary fiction have a better chance of winning an honor than the gold.

Another caveat, before you give up your Printz hopes because you write poetry: we don’t have the denominators of these fractions. We don’t know how many overall books are written in each category. For example, if there are 3,000 realistic fiction books per year, and only a handful of poetry, then the poetry actually has a better chance of winning a Printz.

We can, however, tell that categories other than realistic or contemporary fiction have a better chance of winning an honor than the gold: the blue bar is higher than the maroon for realistic/contemporary fiction, but the opposite pattern holds for the other categories. This comparison is valid even though we don’t see all books published. Thus, it seems the Printz committees honor these genres, but more often save the “win” for realistic or contemporary fiction.
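The denominator point above can be made concrete with a toy example; the award counts are from our chart, but the publication counts are invented for illustration:

```python
# Assumed annual output (not real data): many realistic novels, little poetry
published = {"realistic": 3000, "poetry": 10}
# Winners + honors over 13 years, from the genre chart above
awards = {"realistic": 9.5 + 25, "poetry": 0 + 1}

for genre in published:
    per_book = awards[genre] / (published[genre] * 13)
    print(genre, per_book)
# Under these made-up counts, a poetry title's per-book odds
# actually beat realistic fiction's, despite far fewer awards
```

Raw award counts tell us what the committee picks; per-book odds would tell us what the committee favors relative to what publishers produce, and we only have the first.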


The too-long-didn’t-read (or I-hate-math) bottom line:

1. Printz Winners and Honors are just as likely to have fewer than three stars as to have more than three stars from the six major review journals.

2. Booklist’s stars seem more likely to predict a Printz Winner, and Booklist and School Library Journal are the most likely to predict an Honor book. The main warning about this conclusion is that Booklist and SLJ may give more stars overall.

3. Looking only at Printz Winners and Honors, the journals agree with each other’s stars about 50% of the time. Booklist and SLJ give out more stars to books their colleagues overlook.

4. The category of realistic or contemporary fiction brings home the most awards, with fantasy a distant second. Books in genres other than realistic or contemporary fiction are more likely to receive an honor than the gold.

About Karyn Silverman

Karyn Silverman is the High School Librarian and Educational Technology Department Chair at LREI, Little Red School House & Elisabeth Irwin High School (say that ten times fast!). Karyn has served on YALSA’s Quick Picks and Best Books committees and was a member of the 2009 Printz committee. She has reviewed for Kirkus and School Library Journal. She has a lot of opinions about almost everything, as long as all the things are books. Said opinions do not reflect the attitudes or opinions of SLJ, LREI, YALSA or any other institutions with which she is affiliated. Find her on Twitter @InfoWitch or e-mail her at karynsilverman at gmail dot com.


  1. A little more data, since Elizabeth brought up the point that the number of stars each journal gives could influence things a little. Below are the number of starred reviews for books that each of the journals gave in 2011 (assuming my data is good, of course).

    Booklist: 202 stars; Bulletin: 82 stars; Horn Book: 64 stars; Kirkus: 340 (!) stars; PW: 267 stars; SLJ: 275 stars

    The number of titles reviewed by each journal is trickier; I don’t track that, so I don’t have data ready at hand, and I don’t currently have time to go back through the 2011 journals and count. So let’s estimate! I counted the number of reviews in a single issue (except for PW, where I counted two). For SLJ and Booklist in particular, if they reviewed a non-fiction series together I counted that as one review, so remember these are gross estimates:

    Booklist (3/1/12): 97 reviews x 20 issues a year = 1,940 reviews a year – percentage starred: 10%
    Bulletin (April 2012): 88 reviews x 11 issues a year = 968 reviews a year – percentage starred: 8%
    Horn Book (May/June 2012): 87 reviews x 6 issues a year = 522 reviews a year – percentage starred: 12%
    Kirkus (6/1/12): 77 reviews x 24 issues a year = 1,848 reviews a year – percentage starred: 18%
    PW (5/21/12 and 6/11/12): 30 and 35 reviews, averaging 32.5 reviews x 51 issues a year = 1,657.5 reviews a year – percentage starred: 16%
    SLJ (February 2012): 273 reviews x 12 issues a year = 3,276 reviews a year – percentage starred: 8%

    Again, a reminder that this sample size is much too small to actually be statistically meaningful, but for quick and dirty estimating I figure it works. What’s interesting about this to me is that Kirkus and PW give a much higher percentage of their reviews stars than anyone else. You could probably compare this to Elizabeth’s graphs regarding how well each journal’s stars “predict” Printz honoring and come up with some interesting hypotheses or correlations as well, but I’m out of time for today. I hope to be back by the end of the week with updates on starred titles through June, but we’ll see how it goes.

  2. Jen B: Elizabeth Bluemle has just posted an up-to-date list of “Stars So Far” here.

  3. Karyn Silverman says

    So much for the vaunted Kirkus rep as the “hard” journal! Although that number (for total reviews) might be a bit low; for Kirkus, at least, the number tends to fluctuate with publishing list sizes, so the late summer/early fall issues are probably a bit thicker, as there are so many pubs in the fall season, and summer is generally light. I know Kirkus best, but if any of the others firmly review the month before pub, they would also likely have higher numbers. I’ll see if I can track real numbers down from my contacts.

  4. As a matter of fact, Jen B. has sent us all the stars given by the 6 journals (not just the ones bestowed on award winners) for 2011 and part of 2012. That’s actually enough data for us to say a lot of things about how liberal each journal is with stars, and more about how their stars ultimately relate to awards. My husband is uncontrollably creating graphs, as we speak.

  5. Statistically-minded historians of the field might be interested to note that Printz Honor book REPOSSESSED received a starred review from the late KLIATT.

