Amazon vs. expert reviews of a book

…experts and consumers agreed in aggregate about the quality of a book.

Amazon reviewers were more likely to give a favourable review to a debut author, which the Harvard academics said suggested that “one drawback of expert reviews is that they may be slower to learn about new and unknown books”.

Professional critics were more positive about prizewinning authors, and “more favourable to authors who have garnered other attention in the press (as measured by number of media mentions outside of the review)”.

Discovering that an author’s connection to a media outlet increased their chances of being reviewed by roughly 25%, and that the resulting review was 5% more favourable on average, the academics then investigated whether this was down to collusion.

They concluded that the bias was down to the media outlets aiming their reviews at their audience, “who have a preference for books written by their own journalists”, rather than collusion.

Here is more, written up by The Guardian.  The research paper, by Michael Luca, is here.  It is not his first published paper, and he teaches at Harvard Business School.


I've noticed that IMDB lists a regular score for each movie (a composite of votes by individual users) alongside a metascore (which aggregates a wide array of reviewers' opinions). This is probably because they want to compete with Rotten Tomatoes' more "professional" advice. It's interesting to have a study like this for books, and I'd like to see whether the pattern holds in the movie industry.

The paper has two more co-authors in addition to Luca -- Loretti Dobrescu and Alberto Motta. Also, the disparity between "expert" reviews and ordinary consumer reviews raises a fascinating problem: the "wisdom of crowds" versus the considered judgment of a thoughtful scholar. (Of course, most so-called "experts" we see on the op-ed pages or on TV, especially political commentators, are really just partisan or ideological hacks with axes to grind.) Going back to the disparity between experts (real or pretend) and ordinary consumers, I see the same split in the movie app Flixster, which includes separate listings for consumer reviews and for film-critic reviews.

Are crowd-sourced ratings self-reinforcing and initial-condition dependent? That is, is the final rating strongly predicted by the mean rating of a critical mass of early raters?
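One way to make the question concrete is a toy herding simulation. Everything below is an assumption for illustration -- the 1-to-10 scale, the noise level, and the `herding` weight that blends each rater's private signal with the running mean of earlier ratings:

```python
import random

def simulate_ratings(true_quality, n_raters, herding=0.5, seed=None):
    """Simulate sequential 1-10 ratings. Each rater blends a noisy
    private signal of true quality with the mean of prior ratings."""
    rng = random.Random(seed)
    ratings = []
    for _ in range(n_raters):
        # Noisy private opinion, clipped to the 1-10 scale.
        private = min(10.0, max(1.0, rng.gauss(true_quality, 2.0)))
        if ratings:
            mean_so_far = sum(ratings) / len(ratings)
            rating = herding * mean_so_far + (1 - herding) * private
        else:
            rating = private  # the first rater has no crowd to follow
        ratings.append(rating)
    return sum(ratings) / len(ratings)

# Same underlying quality, different early raters (different seeds):
finals = [simulate_ratings(6.0, 500, herding=0.8, seed=s) for s in range(5)]
print(finals)  # with strong herding, final means drift with the early draws
```

With `herding=0.0` the final mean converges to the true quality; as the weight rises, the first few raters' draws pull the long-run average around, which is the initial-condition dependence the question asks about.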

Hi Tyler,

I love your blog and really wanted to purchase a copy of your latest book to read on my iPad.

Unfortunately, your publisher does not allow me to purchase a digital copy of your book from Amazon because I am a New Zealander.

I do not believe you would intentionally exclude New Zealanders from the joys of reading; you must have mistaken us for Australians, who are "cocky a-holes descended from criminals and retarded monkeys".

I have included a link to a New Zealand documentary on racism to help explain our situation.

Kindest regards,


In other words, the finding is an approximation of what everyone always knew: professional critics are corporate whores.

Not at all. They're lemmings who would praise anything by certain authors (DeLillo, Irving, McCarthy).

I've read a zillion Amazon reviews and about the same number of professional reviews. Here are the patterns I've noticed:

Amazon reviews of mass market fiction are largely driven by whether the reader would like to have the main character in the novel as a friend.

Amazon reviews of very high end intellectual books are relatively few in number, but often quite good.

Professional reviews of intellectual books tend to be fairly inane, especially if the book doesn't fall on one side or another of pre-existing battle lines. For example, Steven Levitt's Freakonomics was lavishly reviewed, especially for its abortion-cut-crime theory, even though 10 minutes of Googling would have shown reviewers that the plausibility of the theory had been under growing attack for six years (an attack vindicated six months later, when Christopher Foote and Christopher Goetz of the Boston Fed showed that Levitt's result rested on his own programming errors).

Outright demolitions, like Steven Pinker's whomping of Malcolm Gladwell a few years ago, do appear in the prestige press.

As for IMDB movie ratings (on a 1 to 10 scale) by registered readers, they are pretty reliable: an 8.5 is better than a 7.0 which is better than a 5.5. I would probably agree with the rank ordering of two movies 90+% of the time when the IMDB rating is at least 1.5 points different. That said, this applies mostly to movies that appeal to IMDB's base of youngish male fanboys. IMDB's volunteer raters aren't very good at, say, distinguishing an above average romantic comedy from a below average one.

As for professional movie reviewers, their aggregated opinions tend to be pretty reliable, although they routinely overrate good but not great documentaries, foreign films, and indie films. Professional reviewers frequently completely miss the point of movies they like, such as "Borat" and "District 9," but that's a different problem.

Interesting comments above.

One of my hobbies is running a video games-centric website, and I've observed a few parallels between professional game reviews and sell-side equity research. Just as sell-side analysts are famous for being too bullish, video game review scores are famous for only using the "7 to 10" range (i.e. a game the reviewer, in the text, describes as weak will still score 7/10 -- you have to release something truly awful to score below that). Like the book reviewers described in the paper, game reviewers also tend to be too forgiving towards heavily-hyped "blockbuster" releases.
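The score-compression point can be sketched with a few lines of code: converting raw scores to percentile ranks within a sample (the scores below are invented for illustration) makes plain that a "7" on a de facto 7-to-10 scale is a bottom-tier grade, not an above-average one.

```python
from bisect import bisect_left

def percentile_rank(score, all_scores):
    """Fraction of scores in the sample strictly below `score`."""
    ordered = sorted(all_scores)
    return bisect_left(ordered, score) / len(ordered)

# Hypothetical game-review scores clustered in the 7-to-10 band:
scores = [7.0, 7.5, 8.0, 8.0, 8.5, 9.0, 9.0, 9.5, 10.0, 7.0]

print(percentile_rank(7.0, scores))  # 0.0 -- a "7" is the floor of this sample
print(percentile_rank(9.0, scores))  # 0.6 -- a "9" is merely above the median
```

The same transformation is how one would compare a compressed review scale against sell-side analyst ratings, where "hold" famously means "sell": relative position within the distribution carries the information, not the nominal score.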

Nowadays I rely on informed fellow consumers -- especially friends with similar tastes to mine -- for recommendations. I read professional reviews when I want to "drill down" in detail on titles that have already caught my eye.

Check out the reviews for "Structure and Interpretation of Computer Programs": there are the not-so-smart people who hated it, and the smart people who loved it and reread it. Biggest split I've seen.
