Rotten Tomatoes scores are no way to judge a movie


Look, let's acknowledge right up front that, yes, here at Cult Spark we run a star rating with our movie reviews (basically for fun and easy referencing). But we've always considered the text of those reviews the important part and recognize that the mere concept of taking a piece of art, gauging its creative strengths and weaknesses, and somehow converting that to a hard number that definitively assesses its value is ludicrous. I know that. You know that. Let's just accept it as a given.

But, even beyond that, I posit that if you're looking for a quick numerical value that roughly estimates a film's worth, the specific system used by Rotten Tomatoes is an especially terrible place to turn your attention. I've been thinking about this following the recent outcry from the hardcore DC fan community after Suicide Squad settled into its painful-looking 26 percent Rotten Tomatoes score. Their argument goes that critics are too hard on Warner Bros.' current run of shared-universe DC movies, that a 26 percent (out of a theoretical 100) for Suicide Squad, following a 27 percent for Batman v Superman: Dawn of Justice, just isn't fair, especially when stacked next to some of the lesser-liked Marvel films, such as Iron Man 2 (72 percent on the Tomatometer) and Avengers: Age of Ultron (75 percent). And, hey, I get it. A quick glance at those numbers could lead one to believe that critics thought Age of Ultron — with its bullshit Thor-in-the-magic-lake scene and interminable climax — was somehow nearly three times better a film than Suicide Squad. That doesn't seem quite right, even if I do think Age of Ultron is the better movie.

So here's the problem. That's not how Rotten Tomatoes works. A 26 percent rating does not mean that critics registered at the site gave the movie an average rating of 2.6 out of 10. What it means is that only 26 percent of critics elected to give it a "fresh" rating, a completely binary decision that critics must make when they submit their reviews to Rotten Tomatoes. That choice is entirely up to the critic, but let's say, in general, if a film writer considers a movie a 6 or better (on a 10-point review scale), they mark it down as fresh. (This seems like the perfect place to set the line, since Rotten Tomatoes considers a movie to be fresh overall if it clears that 60 percent barrier.)

In Suicide Squad's case, that indicates that only 26 percent of critics thought the film deserved at least three stars out of five. The other 74 percent considered it not up to that level. That doesn't mean the latter group hated it. That doesn't mean they thought it deserved a 1 out of 10, or a 2 out of 10. It just means they didn't think it deserved a 6. For all we know, a majority of Rotten Tomatoes critics may have considered Suicide Squad to be in the 4-5 range, nudging up close to average, but decided to mark it as "rotten" because it fell just short in their eyes. Likewise, many of the critics who decided to label Age of Ultron as "fresh" may have rated it in the 6-7 range. (For the record, this is exactly where I fell with both of these movies, giving Suicide Squad two stars out of five — or equal to a 4 out of 10 — and Age of Ultron three-and-a-half stars — or a 7 out of 10.) This line of thinking suggests that, really, there may not be many folks out there who think Age of Ultron is three times the movie Suicide Squad is. A good number of critics probably only consider it a tick or two up the rating scale. But because of the awkward binary nature of the Rotten Tomatoes system, the Tomatometer score is unable to properly convey that nuance.
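If you want to see the arithmetic laid bare, here's a minimal sketch in Python using made-up ratings (none of these numbers come from actual critics, and the 6-out-of-10 cutoff is just the assumption described above) showing how a binary fresh/rotten tally can bury a movie most reviewers found merely mediocre, while a plain average of the same ratings tells a much less dramatic story.

```python
# Hypothetical illustration only: the ratings below are invented, not real critic data.

def tomatometer(ratings, fresh_cutoff=6):
    """Percent of reviews at or above the 'fresh' cutoff (a binary tally, like the Tomatometer)."""
    fresh = sum(1 for r in ratings if r >= fresh_cutoff)
    return round(100 * fresh / len(ratings))

def average_score(ratings):
    """Plain average of the 10-point ratings, scaled to 100 (closer to an averaging approach)."""
    return round(10 * sum(ratings) / len(ratings))

# A movie most critics find "almost okay": lots of 4s and 5s, a couple of 6s.
almost_okay = [4, 5, 5, 4, 5, 6, 4, 5, 6, 5]

# A movie most critics find "decent but unspectacular": mostly 6s and 7s.
decent = [6, 7, 6, 7, 6, 5, 7, 6, 6, 7]

print(tomatometer(almost_okay), average_score(almost_okay))  # -> 20 49
print(tomatometer(decent), average_score(decent))            # -> 90 63
```

In that toy example, two sets of reviews separated by less than a point and a half on a 10-point scale land 70 points apart on the binary tally, which is basically the Suicide Squad/Age of Ultron gap in miniature.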

A better system might be to allow critics to assign a film a definite rating on a 10-point scale. Instead of just "rotten" or "fresh," the writer could give it a 4 or a 7 or whatever they think it deserves. In fact, there is a review aggregator out there that does run its numbers using this method. Metacritic attaches a number between 1 and 100 to each of the reviews it logs. If the critic gives a value along with the review, they use that. If not, a Metacritic staffer reads the review and awards it a number they think best matches what the critic wrote. It's not a perfect system. I'm a little uncomfortable with someone else trying to match a critic's intent with their own numerical value, but at least it results in a score that is a little less likely to be misunderstood. At Metacritic, Suicide Squad earned a 40, while Iron Man 2 clocked in at a 57 and Age of Ultron a 66. That seems to be a fairer numerical assessment of those three films. (And if you still think Suicide Squad is getting the shaft, I'd advise you to watch this and … I don't know … improve your taste in movies, I guess.)

Another downside to Metacritic is that fewer film critics use it. At this point, the site has only aggregated 53 reviews toward its Suicide Squad score. Compare that to the 284 reviews Rotten Tomatoes has logged. The more reviews you are able to include, the more you can build a case for a critical consensus. Metacritic, however, seems to be more of a thing in the videogame industry, where most reviews come with a hard score and a big title like Uncharted 4 can be the subject of over 100 reviews logged at the site.

One side note: Both Rotten Tomatoes and Metacritic also feature a "user score" or "audience score," which aggregates reader reviews. Never trust those! Not because readers' opinions are somehow less valid than critics', but because rabid fanboys of all factions like to swarm aggregate sites, logging extremely high user scores for a hotly anticipated geek property before it's even released. (And this behavior can go the other way too. Go read about what the grown man-babies who are scared of girls did to the new Ghostbusters before it opened in theaters.)

Anyway, regardless of whether Metacritic has the better system, Rotten Tomatoes has the better brand. Studios releasing a movie with a high "fresh" number will trumpet those digits in commercials and on the poster. Sites like Fandango display the Rotten Tomatoes scores right there on their app where moviegoers decide what to see and order their tickets. It's big business, with the Tomatometer rating carrying a power similar to the one Siskel and Ebert wielded with their thumbs in the 1980s and 90s. That's a shame. And not only because movies are too complex a form of art to be judged solely by a ridiculous two-digit number, but also because the math behind that number is simplified to the point of being deceptive. Not to mention seriously misunderstood.

Author: Robert Brian Taylor

Robert Brian Taylor is a writer and journalist living in Pittsburgh, PA. Throughout his career, his work has appeared in an eclectic combination of newspapers, magazines, books and websites. He wrote the short film "Uninvited Guests," which screened at the Oaks Theater as part of the 2019 Pittsburgh 48 Hour Film Project. His fiction has been featured at Shotgun Honey, and his short-film script "Dig" was named an official selection of the 2017 Carnegie Screenwriters Script and Screen Festival. He is an editor and writer for Collider and contributes regularly to Mt. Lebanon Magazine. Taylor also often writes and podcasts about film and TV at his own site, Cult Spark. You can find him online at rbtwrites.com and on Twitter @robertbtaylor.