
SFFMeta: The MetaCritic for SF/F/H Book Fans

You’ve all heard of MetaCritic, right? It’s a website that aggregates movie, DVD, music, and game reviews from all over the web and presents a single rating to help consumers decide whether something is worth their time.

The only thing missing at MetaCritic is book coverage. Genre fans, at least, need wait no longer: a new website called SFFMeta aggregates reviews of science fiction, fantasy, and horror books from all over the web. SF Signal is listed as one of the source websites.

The neat thing about this site is how it lets you crunch the data. Wanna see the top rated books in their system? No problem. Wanna see how a specific author rates? Or a specific reviewer? All do-able.

SFFMeta is a helpful website, to be sure, but I can’t help offering up some suggestions:

  • Although the search feature allows users to search for illustrators, there is no way to see a complete list of illustrators like there is for authors.
  • Seeing a list of top-rated books is cool. You know what else is? Seeing the lowest-rated books, or at least being able to reverse-sort a list. This is one of my favorite features of MetaCritic.
  • (This one is admittedly a hard thing to measure, but for the sake of clarification, I note it here.) The system used for calculating the aggregate rating should be posted. Is it a straight average? While I am smitten by SF Signal’s inclusion in the list of reviewers, SFFMeta should know that we have multiple reviewers, each of whom may have a different rating system. The star system I use, for example, is not linear. Based on the idea that there are more ways for something to be good (good, very good, outstanding) than bad, anything 3 stars or above is good. But the linearized 60% rating makes it sound worse than it is.
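To make the linearization gripe concrete, here is a minimal Python sketch (this is not SFFMeta’s actual code, and the non-linear numbers are made up for illustration) contrasting a straight stars-to-percent conversion with a mapping where 3 stars already lands in “good” territory:

```python
# A straight conversion treats the star scale as linear:
# 3 of 5 stars becomes 60 out of 100.
def linear_score(stars, max_stars=5):
    return stars / max_stars * 100

# A non-linear mapping (illustrative numbers only) reflects a scale
# where anything at 3 stars or above already counts as "good".
NONLINEAR = {1: 20, 2: 40, 3: 70, 4: 85, 5: 100}

for stars in range(1, 6):
    print(f"{stars} stars -> linear {linear_score(stars):.0f}, non-linear {NONLINEAR[stars]}")
```

Under a straight linear reading, a reviewer’s perfectly respectable 3-star “good” becomes a 60 that many readers will parse as a near-failing grade.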
About John DeNardo (13012 Articles)
John DeNardo is the Managing Editor at SF Signal and a columnist at Kirkus Reviews. He also likes bagels. So there.

15 Comments on SFFMeta: The MetaCritic for SF/F/H Book Fans

  1. Hello,

    Thanks for your post. Happy that you find the web site useful!

    The illustrator (and editor) lists are indeed needed and missing; I will try to add them shortly. I will also add lowest-score lists, which will be interesting to see 🙂

    For the scores, I put an explanation in the FAQ (http://www.sffmeta.com/faq#q4). It is basically a straight average. My goal is for the scoring system to represent 3 stars with a score of 60, so 60 means a good, definitely readable, but not outstanding book; 80 an excellent one; and 100 a classic or masterpiece, of course. I will update the color coding to make 60 “green” so it is more obvious that it is an OK score.

  2. Any plans to filter out titles with only one or two reviews from the lists? Currently the lists are dominated by them, possibly because some reviews give out exaggerated scores.

  3. Well, I had lists for new books with at least 3 reviews at first (while testing before the launch), to try to avoid this problem. But it takes many months for a book to get at least 3 reviews unless it is by a MAJOR author, and even then it can take some time. So the current (meaning last 3 months) lists were very short and would rarely be updated.

    So I decided to include every book; at least the new reviews will pop up and the lists will fill out with time. I might add a “Year-to-date” high-score list on the left bar that includes only books with 3 reviews; that could help.

  4. It’s an interesting idea, but to really make it work I think there will have to be quite a few more sources. There’s plenty of material out there, but the focus on ratings excludes the sources that don’t use any form of rating.

    Also, aren’t broken links going to be a problem? All the bookspotcentral links, for instance, only work because they are automatically redirected to bscreview.com, a site that has decided not to use ratings any more (but still hosts lots of good reviews).

  5. I do use sources that do not have any rating. When no rating is included, our website assigns a score from 1 to 5 stars (20 to 100), determined from the feel of the review. I am open to adding more sources in the future and to suggestions for source websites.

    I just tried a few bookspotcentral links and they worked; I got to the actual reviews. Can you tell me which links were not working?

  6. Have a look at the URL. The links work just fine; they are just redirected to the bscreview.com domain. I haven’t found any that are actually broken, but the old name bookspotcentral got me thinking that as the site gets older this must become a problem.

    There’s a blog at http://fantasybookreviewer.blogspot.com/ that has attempted to create a review index of epic fantasy; most of its links still work, but there are a number of broken ones there.

    There was a speculative fiction reviewer meme going around a while ago which resulted in a long list of review blogs (http://www.graspingforthewind.com/2009/08/21/sffh-reviewer-linkup-mem/) that might be worth checking out. Most of these only review a book or two a week; personally, I found it very hard to even keep that up for a longer period of time. But over time, with this many blogs, it still adds up.


  7. You might consider changing the author section so it alphabetizes by last name instead of first name. First name is . . . unconventional.

  8. @Eric: I suspect it will be hard to normalize a system for all review outlets as each reviewer will have their own criteria.

    RE: Number of reviews per title:
    I think MetaCritic suffers from the possibility of too few reviews to lend any statistical weight to the average — it’s just that theirs is a much smaller window in which it occurs, since films and TV garner so much more attention than books.

    RE: Reviews without ratings:
    Again, MetaCritic takes the same approach: they estimate a reviewer’s rating. Not all reviews include ratings, so you’re dealing with someone’s best guess based on comments, tone, and how they interpret what the reviewer writes.

  9. I must admit, I’m not impressed.

    It’s one thing for Rotten Toms to reduce complex opinions to broadly positive or broadly negative opinions but it’s quite another for SFFMeta to reduce similarly complex opinions into scores out of 5.  That requires a good deal more interpretation and, going purely by my reviews, I can’t say that I trust SFFMeta’s interpretative skills.  And if I doubt the capacity for the site to turn my opinions into marks out of 5, I see no reason why I should trust their capacity to interpret any other review published on a website that does not automatically give a quantitative score.

    I’m not even moaning about the complete stupidity of reducing complex opinions to “60 flaming broadswords out of 100”, I’m saying that the human element in interpreting score-less reviews as numerical scores renders the entire enterprise highly questionable.

  10. @Jonathan: I saw your Twitter comment: “sffmeta..com have taken my nuanced reviews and reduced them to percentage scores AND done so badly.” Agreed about the interpretation…but is it really any different than with MetaCritic?  I don’t think so.  The major difference I see is that MetaCritic has the fortune to have many more review outlets available as sources, thus providing a better statistical average.  BUT, they are still based on human interpretations.

    The message for an aggregate site is clearly the same for any individual review source: caveat emptor.  Accept reported ratings at your own risk.  It really comes down to whether your tastes match the tastes of the individual reviewers.

    I look at SFFMeta (like I do MetaCritic, which I am trying real hard not to type as “MeatCritic”, which sounds like a whole different site entirely) as a fun-to-check site for a general consensus with the caveat that quality is (as always) subjective.

  11. I like the idea, but I think it’s flawed when it comes to fantasy titles.

    Most of the review sources are heavily sf-favoring sites. While I would like to think reviewers on these sites are sff neutral, unless they are reviewing elsewhere, it’s almost inherent that fantasy reviews are not going to be judged evenly. By that, I don’t mean poorly, necessarily, but more aggressively. Where a mediocre sf book might get a C, a mediocre fantasy might get a C- or even a D. That’s just human nature.

    Case in point: I’ve been well-reviewed on the web with several books, but only one review for each book, from one site, is showing up, with a 60 rating. To check whether I was just being sensitive, I looked at several other authors in fantasy and noticed the same trend—one NYT bestseller whose books have gotten almost uniformly rave reviews in several places is showing up with 60s on two books. So I wouldn’t recommend the site to fantasy readers as a valuable source (sf readers who might like fantasy, maybe).

    The major problem is the chosen rating system. In the US, we are inherently trained to think that on a 100-point scale a 60 is failing. Even a 70 isn’t impressive. That’s behind part of Jonathan’s point above. The site can argue all it wants that that isn’t what it means, but it’s fighting something like 50-plus years of indoctrination by the school grading system. I first came to it through Metafilter, looked at it three or four times before coming here, and had no idea the rating system was defined other than by its numbers. I doubt most readers will say “Oh, a 60, but what does that mean? I must click thru to see if that’s good or bad, because I think a 60 is bad.” They’ll just think it’s bad and move on.

  12. I find RT useful as a review collection site.  I know which film critics I broadly agree with and I use RT as a hub for accessing their opinions.  I sometimes look at metacritic for video games but I never get any use out of it because a) I’m not that familiar with the world of video game reviewing and b) the people I am familiar with and trust tend to be bloggers and therefore don’t get included in the rundown.  So I’m not that familiar with metacritic and my complaints can probably be ignored by people who ARE comfortable with the metacritic formula.  I’m probably behind the times in this respect 🙂


    Hmmm — My problem wasn’t so much a reflection of the grading system (here in the UK, I remember, a score of 50% could get you an A at A-level maths) as the interpretation of reviews.  My reviews of Flood and Genesis were broadly positive while acknowledging flaws in both books.  These nuanced positions got turned into scores of 40% for both books. In both reviews I expressed concerns over the plotting (particularly WRT Genesis), but I praised the writing and the thematic content quite heavily.  So either the person doing the interpreting can’t cope with a nuanced opinion or there’s something weird going on whereby plot carries more weight than anything else.  Either way, I don’t trust the site, because if they misread me (and the NYT, by your reckoning) then they’re misreading other people too.


  13. Gimlet (formerly Hmmm) // December 21, 2009 at 2:10 pm //

    (I changed my screen name because I confused myself when I read Jonathan M’s post!)

    @Jonathan–yes. I think we are agreeing on the same thing from two different angles. Your nuanced reviews do get misread, and then the error is compounded by being funneled into an easily misinterpreted point system. I wasn’t intending to put words in your mouth, so sorry if it came out wrong.

    To clarify, btw: by bringing up the NYT, I’m not implying that it inherently means quality work. I raised it not for the NYTBR (although that is an oddly missing reviewer) but for the best-seller list. Again, not a criterion for quality work. BUT it seems to me that if a genre work is hitting the NYT bestseller list, it is likely to receive more than one review among genre sites because of the high-profile author/book. If SFFMeta’s review sources are producing only one review for a high-profile author/book, then I question its criteria for review-source inclusion too.

    I’m not the biggest sf reader these days, but I do read sf sites to keep tabs on what’s going on. SFFMeta includes review sources (“quality sites,” to paraphrase them) that I have never heard of. It makes me wonder whether my point about sf sites judging fantasy books more stringently also applies to SFFMeta’s judgement of what counts as a “quality” fantasy review site. Is it really possible that Kelly Link, Melissa Marr, Kim Harrison, Richelle Mead—all popular and bestselling authors doing solid work in urban fantasy and paranormal romance—have books that are generating only one review to base a rating on? Even Jim Butcher and Laurell K. Hamilton, the heaviest hitters in urban fantasy, are popping up with books that have only one review source.

    Now, to be fair, plenty of the sf titles are popping up the same way. But the above folks have all written several books in the past few years, yet they’re not all listed. Lois McMaster Bujold, however, is showing 18-year-old reviews! She’s good, of course, but the author is not the point I’m making.

    Again, I think the site is a great idea–but it’s not a go-to source yet with the current bugs.


  14. Yes, unfortunately the interpretation of my reviews leaves something to be desired as well, Jonathan. I think the problem of converting detailed book reviews into numerical scores was one of the reasons that MetaCritic dropped books, wasn’t it?

  15. Jonathan,

    Only your review of Flood got turned into a 40; Genesis got a 60, which means a good but not great book, because of the mixed feelings. After checking your review, I agree that it was misread, and I have changed the score to 80.

    Your conclusion for the Flood review is: “an oddly unsatisfying read that squanders its attention on an endless series of checkpoints (there goes Europe, there goes Asia, there goes Africa) and poorly exploited character narratives while the real meat of the piece; the set pieces and the book’s wider themes are either glossed over or rationed out.”

    Honestly, I do not see many positive points…

    I agree that our rating system is not flawless, but it is an honest attempt to give a broad assessment of a book’s reviews.

    If you (or other reviewers) do not agree with a score, please contact us and we can reconsider our score.


    As for the number of reviews: books published before 2007 will have mostly spotty representation and might not have many reviews; we mostly include reviews from 2007 and up.

Comments are closed.
