I Want a Real Rotten Tomatoes for Books

When Literary Hub’s Book Marks first appeared, it was touted as Rotten Tomatoes for books. That hasn’t worked out. Metacritic doesn’t cover books. Why isn’t there a Rotten Tomatoes for books? Bookworms need a site that charts the progress of every book published. Tragically, books are published faster than they can find readers. We need better tools for tracking the success of what we should be reading. Masterpieces go unnoticed and brilliant scholarship is ignored, all because of how inefficiently readers discover books.

How would you design the perfect meta-data site for books? Here’s how I imagine it. I go to MetaBookReviews (I made up this title). It offers a control panel of options for searching for books. The site allows users to search for a single title or generate lists of books with conditional filters. I picture the home page as a data control center with radio buttons and selector boxes.

One filter is the time selector. This week, this month, this year, last year, this decade, 19th century, etc. I pick last year. It asks me to choose between fiction and nonfiction. I pick fiction. It offers me a list of genres. I pick science fiction. It asks for type, and I select novels. It asks me for a theme, and I pick space opera. I then press Go. I get a list of space opera novels published in 2015. The site should collect data on any book with an ISBN.
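Under the hood, that sequence of selections is just a stack of conditional filters applied to a catalog. Here's a minimal sketch in Python of how such filtering might work; the field names and sample records are invented for illustration, not taken from any real site.

```python
# Hypothetical sketch: conditional filters over a book catalog.
# Field names ("year", "genre", "theme", etc.) and the sample
# records are invented for illustration.

def filter_books(catalog, **criteria):
    """Return the books matching every given field exactly."""
    return [book for book in catalog
            if all(book.get(field) == value
                   for field, value in criteria.items())]

catalog = [
    {"title": "Book A", "year": 2015, "category": "fiction",
     "genre": "science fiction", "type": "novel", "theme": "space opera"},
    {"title": "Book B", "year": 2015, "category": "fiction",
     "genre": "science fiction", "type": "novel", "theme": "cyberpunk"},
    {"title": "Book C", "year": 2014, "category": "nonfiction",
     "genre": "history", "type": "monograph", "theme": "maritime"},
]

hits = filter_books(catalog, year=2015, category="fiction",
                    genre="science fiction", type="novel",
                    theme="space opera")
print([b["title"] for b in hits])  # → ['Book A']
```

Each radio button or selector box on the imagined home page would simply add one more key/value pair to the query.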

Next, I’m offered a selector to choose the way I want my list ordered. I could choose cover view, and see an array of covers. People really do judge books by their covers. I could list by rating, like we see at Rotten Tomatoes. Or I could list by title, author, date published, etc.

What I want is the list ordered by total number of book reviews. This is what Book Marks could have done, but didn’t. They limit themselves to reviews from major sites, and collect too few reviews. I’d want reviews from any online publication that published a significant review. I mentioned this idea to my friend Mike, and he spent weeks using Python and the Natural Language Toolkit (NLTK) to see if he could post-process Google search results and identify significant book reviews. It’s hard. People write about books in dozens of ways, and that throws off the program. The best accuracy Mike could get at identifying solid book reviews was around 80%. That could be improved, and with human curation, a database linking book titles to reviews could be built, providing the counts.
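To give a feel for the problem Mike was wrestling with, here's a deliberately crude stand-in: a keyword heuristic rather than a trained NLTK classifier. The cue words and the score threshold are invented for illustration; a real system would learn its features from pages hand-labeled as reviews or non-reviews, and even then, as Mike found, accuracy tops out well short of perfect.

```python
# Toy stand-in for review detection (NOT Mike's actual NLTK code).
# The cue words and threshold below are invented for illustration;
# a real classifier would be trained on labeled example pages.

REVIEW_CUES = {"review", "reviewed", "plot", "characters", "prose",
               "recommend", "stars", "verdict", "spoilers"}

def looks_like_review(text, threshold=3):
    """Guess whether a page of text is a book review by counting cue words."""
    words = text.lower().split()
    score = sum(1 for w in words if w.strip(".,:;!?") in REVIEW_CUES)
    return score >= threshold

page = ("I reviewed this novel last week: the plot drags in the middle, "
        "but the characters shine and the prose sings. I recommend it.")
print(looks_like_review(page))                      # → True
print(looks_like_review("Buy this book on sale!"))  # → False
```

A heuristic like this is exactly the sort of thing that gets fooled by the dozens of ways people write about books, which is why human curation would still matter.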

If I’m presented with a list of forty 2015 space opera books, and a few at the top have a dozen or more reviews, and the bottom half have no reviews, which titles do you think I’ll click on? What if I wanted to review unknown books?

Picture the returned list as a table, where one column is the number of reviews, but other columns include Amazon customer ratings (average stars/number of reviews), Goodreads ratings (average stars, number of ratings, number of reviews), Worlds Without End (number of lists the book is on, number of awards and nominations), or maybe the Internet Science Fiction Database (number of editions). These sites are specific to science fiction; other genres would draw on their own data sources. If I click on a title, I should see all the meta-data for that book, including a list of links to all its book reviews. See sample below.
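As a sketch, the per-title record behind such a table might look like the following. All field names are hypothetical, and the numbers are placeholders, not real data for any book.

```python
# Hypothetical per-title record for MetaBookReviews, merging counts
# from several sources. Field names are invented; the numbers are
# placeholders, not real data for any book.

from dataclasses import dataclass, field

@dataclass
class BookRecord:
    title: str
    author: str
    isbn: str
    review_links: list = field(default_factory=list)  # links to full reviews
    amazon: dict = field(default_factory=dict)        # avg stars, rating count
    goodreads: dict = field(default_factory=dict)     # stars, ratings, reviews
    awards_and_nominations: int = 0                   # e.g. from Worlds Without End

    @property
    def review_count(self):
        """The column the list would be sorted on."""
        return len(self.review_links)

record = BookRecord(
    title="Example Title", author="Example Author", isbn="000-0000000000",
    review_links=["https://example.com/review-1",
                  "https://example.com/review-2"],
    amazon={"avg_stars": 4.5, "ratings": 120},
    goodreads={"avg_stars": 4.2, "ratings": 900, "reviews": 150},
)
print(record.review_count)  # → 2
```

Sorting a list of such records by `review_count` would give exactly the Rotten Tomatoes-style ranking described above.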

Lists are another way people are persuaded to buy books, like those here at Book Riot. When looking at title entries at MetaBookReviews, I should see a list of all the lists the book has appeared on. See sample below.

All these sources of meta-data go into providing a metric for measuring the attention a book is getting. That’s usually the best indicator, but not the only one. I asked for space opera books. Just seeing a list of all space opera novels for 2015 might inspire me to try books that aren’t getting attention. I’d also love to be able to filter those books by their sub-themes. Often I’m in the mood to read a book that covers a particular subject. That could sell more books.

To illustrate my point, I’ve picked The Long Way to a Small, Angry Planet by Becky Chambers.

The Long Way to a Small, Angry Planet by Becky Chambers

Quantitative Data

Book Reviews

Searching on Google [“The Long Way to a Small, Angry Planet” Chambers Review] gets about 16,000 results. Obviously, there haven’t been 16,000 book reviews written for The Long Way to a Small, Angry Planet. But how many genuine reviews were written? That’s what my friend Mike was trying to find out with his programming project. If there were a MetaBookReviews to collect them, either via AI or human curation, wouldn’t that be a valuable tool? Rotten Tomatoes often finds well over a hundred reviews for a movie; why don’t books get as many reviews?

If the book was nonfiction, it could also curate lists of scholarly articles about the book.


Of course this is just a small sample, but a quick glance reveals a lot.

Can you imagine having a site like MetaBookReviews? Think about all the ways you could mine information. Or just think how convenient it would be to look up a book title and have all that information in one place. When I want to know about a movie I go directly to Rotten Tomatoes or IMDb, and bypass Google. That’s what makes an app successful: being the single best source of information.

Many existing websites could offer these features. For science fiction, Worlds Without End or ISFDB could add them. Another logical place would be Wikipedia. I assume Book Marks hoped to be this site, but they don’t collect enough data yet. They could expand into this system too. And if you think about it, Amazon could do this as well, either at their main site or at Goodreads. Sooner or later, someone will do this. It’s the obvious next step in the evolution of big data.