“Don’t pay any attention to the critics – don’t even ignore them.” – Samuel Goldwyn
I love going to the movies, but each time a new movie comes out, I am left having to decide if the film is worth my time and money. When deciding I have a few choices: I can watch the preview, but that is usually not a very accurate way of telling whether the movie will be good. I can wait until one of my friends recommends it, but then I have to wait for them to see the movie first, and I am not sure they would go to see all the movies that might have interested me. The other option, which I most often choose, is to go online and see what the film critics think. Usually I go to Rotten Tomatoes, which compiles reviews from dozens and sometimes hundreds of movie critics across the country and gives an overall rating for each film. I try to see most movies that have a score of 90 percent or higher, which means that 9 out of 10 critics think it is a worthwhile movie to see.
After following that method for a few years, I started to wonder whether film critics are the best source for predicting if a movie is worth the effort, or whether audience opinions are a more accurate indicator. In other words, are the opinions of the general public better than the opinions of professional film critics at predicting if a film is good or not? So I decided to dig deeper and find out.
Over the last few years, each time I saw a movie I would write down the name and give it a grade for how good I thought it was. I have always enjoyed documenting my life and making notes of things I found interesting. I decided to compile a list of around 150 movies I had seen over the last few years, convert my ratings to a 0 to 100 scale (100 being the best), and compare them against film critics' ratings on the same scale.
I compared my movie ratings with the following sources:
Rotten Tomatoes Rating – A compilation of dozens of movie reviewers from across the country, in newspapers and online. The score is simply the percentage of critics who think the movie is worth seeing versus those who think it is worth skipping.
Rotten Tomatoes All Critics Average Rating – The overall rating that film critics have given a movie. If a critic rates movies on a 1 to 5 scale or with 0 to 4 stars, for example, Rotten Tomatoes converts the rating to a uniform scale. I converted their ratings to a 0 to 100 scale to match mine.
Rotten Tomatoes Audience Ratings – This is a rating of what percentage of the general Rotten Tomatoes audience recommends a movie or not.
Rotten Tomatoes Audience Average Rating – This is an overall rating that the Rotten Tomatoes audience gave a movie.
Rotten Tomatoes Top Critic Rating – Top critics are a group of film critics whom Rotten Tomatoes believes are the most influential; these would be the most widely read reviewers in the country. The rating is once again the percentage of critics who would recommend seeing the movie versus the percentage who think it is best skipped.
Rotten Tomatoes Top Critic Average Rating – Overall Rating given by Top Critics on Rotten Tomatoes converted to a uniform scale.
Metacritic MetaScore – Metacritic is very similar to Rotten Tomatoes: it uses a compilation of ratings from numerous film reviewers across the country to determine an overall number grade for each movie.
Metacritic Audience Rating – This is the overall grade the general Metacritic audience gives a movie.
IMDB – Another rating, judged by the general moviegoing public, which I converted to a 0 to 100 scale.
If we take all my ratings and all the other ratings and compare them using Pearson's correlation, we can determine which ratings most closely match mine. In other words, which opinions best predict whether I will like a movie or not. Pearson correlation measures how closely two sets of numbers track each other on a scale of -1 to 1: a value of 1 means perfect positive correlation, 0 means no correlation, and -1 means perfect negative correlation.
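Pearson's correlation is straightforward to compute by hand. Here is a minimal sketch in Python of the kind of comparison I did, assuming my ratings and one source's ratings are stored as parallel lists on the 0 to 100 scale (the numbers below are made up for illustration; my actual data set had around 150 movies):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance of the two lists (not normalized by n, which cancels out below)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    # Spread of each list around its mean
    spread_x = sqrt(sum((x - mean_x) ** 2 for x in xs))
    spread_y = sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (spread_x * spread_y)

# Hypothetical example: five movies, rated by me and by the audience
my_ratings       = [85, 60, 95, 40, 70]
audience_ratings = [80, 65, 90, 55, 60]
print(round(pearson(my_ratings, audience_ratings), 2))
```

Running this for each rating source against my own list, and sorting the resulting coefficients, produces a ranking like the one below.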
So which was most accurate?
Here is how they panned out from the best predictor to the worst:
- Rotten Tomatoes Audience Rating = .41
- Rotten Tomatoes Average Audience Rating = .40
- Rotten Tomatoes All Critics Average Rating = .38
- Metacritic User Score = .36
- Rotten Tomatoes Rating = .35
- IMDB = .34
- Metacritic Meta Score = .33
- Rotten Tomatoes Top Critics Average Rating = .31
- Rotten Tomatoes Top Critics Rating = .30
To my surprise, the general audience is overall a lot more accurate at predicting whether I would like a movie. The general Rotten Tomatoes audience had a .41 correlation with my ratings. The least accurate predictor was the Rotten Tomatoes Top Critics rating, at a .30 correlation.
What is most interesting is that whether you look at Rotten Tomatoes or Metacritic, on each website the audience was a lot more accurate at predicting how much I would like a movie than the critics were. And the least accurate predictors were the "Top Critics," the critics who are considered the most influential and the most important in the film industry. Another surprise was that for mystery and suspense movies, the Rotten Tomatoes and Top Critic ratings had a negative correlation with mine, meaning that if I go see a mystery or suspense film, there is a good chance I will like a movie the critics were lukewarm about.
I also found that I have a lot in common with the general public when it comes to movies the audience says to go see but does not particularly like (a large gap between the audience recommendation percentage and the audience average rating). Take ZombieLand, which the audience recommended seeing but did not rate especially highly. Another way of putting it: "The movie is okay, but you should still go see it." For those movies, my ratings had almost a .81 correlation with the audience's.
But what happens if the audience and the film critics can't agree? In the situations where the critics love a movie that the audience does not, the audience is usually right (.67 vs. .31). And in situations where the audience loves a movie a lot more than the critics do, the audience is again usually right (.56 vs. .46).
The other interesting discovery is that overall, the average person's movie tastes mirror those of the broad pool of movie critics more closely than those of the top critics (.64 vs. .57). In other words, an average moviegoer should listen less to the film critics who are considered most influential and best known, and instead listen to the general consensus of numerous film critics. (Which mathematically makes sense, since a large sample of numerous critics should correlate better with the general consensus of the public than a small sample of top critics.)
I always considered myself a bit of a movie snob with eclectic movie tastes, but from analyzing the numbers it looks like my taste is more similar to that of the general public than it is to the elite movie critics and professional movie experts. Moving forward, crowd-sourcing movie reviews from the general public might be a more accurate way of predicting if I will like a movie or not.
This small exercise really made me reexamine how much weight in my life I place on the opinions of so-called "experts." Maybe the opinions of the most influential people are a lot less important than we think.
How much weight in your life are you putting on the opinions of critics and experts?