Measuring the opinions of people who see a film is far from an exact science. Even sites like Rotten Tomatoes — which aggregate the scores of critics, who often give films an actual grade — are flawed in a number of ways.
But figuring out what ordinary audience members think about a film, and measuring their reactions, is even murkier. Still, there are a couple of methods, beloved by film marketers who want to convince potential ticket-buyers to spend their time and money seeing their film.
Two of the most commonly cited measures of audience opinion are CinemaScore, which is a formal poll of people who saw a film, and audience score aggregators like Rotten Tomatoes, which collect scores from anyone with internet access and the desire to plug an opinion into the site.
But neither of these is as helpful as it might seem for determining a film’s quality, or its likeliness to delight any given audience member. Looking at how each is collected helps reveal why.
What is CinemaScore, and how does it work?
The people who advertise movies love to tout a film’s impressive CinemaScore after opening weekend. The CinemaScore rating is easy to understand as a barometer of quality, grading films using a very familiar scale of “A+” to “F.” But what do those grades actually measure?
CinemaScore grades are compiled by teams of representatives in 25 cities across North America. On the movie’s opening night, ticket-buying attendees in five or six of those cities (chosen at random) are polled on a simple set of questions using the CinemaScore ballot.
The information that CinemaScore collects is fairly granular, including not just a rating of the movie but also the gender and age range of the audience member, their reasons for attending the movie, and their likelihood to buy or rent the movie in the future. Ultimately, CinemaScore typically collects about 400 to 600 ballots for a film.
Studios and distributors can pay CinemaScore to receive that more granular data, which can help them tailor marketing in the short and longer term. But one piece of data is more readily available: the letter grade, which CinemaScore tabulates and averages from the grades opening-weekend audience respondents give the film.
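CinemaScore doesn’t publish exactly how it converts ballots into a single letter grade, but the basic arithmetic can be sketched. Everything below is an assumption for illustration, not CinemaScore’s actual method: the 4.33-point scale and the rounding to the nearest grade are the standard GPA convention, not anything the company has confirmed.

```python
# Hypothetical sketch of averaging letter-grade ballots into one grade.
# The GPA-style point values (A+ = 4.33 ... F = 0.0) are an assumption;
# CinemaScore's real weighting is proprietary.

GRADE_POINTS = {
    "A+": 4.33, "A": 4.0, "A-": 3.67,
    "B+": 3.33, "B": 3.0, "B-": 2.67,
    "C+": 2.33, "C": 2.0, "C-": 1.67,
    "D+": 1.33, "D": 1.0, "D-": 0.67,
    "F": 0.0,
}

def average_grade(ballots):
    """Average a list of letter grades, then round back to the
    letter grade whose point value is closest to the mean."""
    mean = sum(GRADE_POINTS[b] for b in ballots) / len(ballots)
    return min(GRADE_POINTS, key=lambda g: abs(GRADE_POINTS[g] - mean))

# A made-up opening-night sample of 500 ballots:
ballots = ["A"] * 300 + ["B+"] * 150 + ["C"] * 50
print(average_grade(ballots))  # → A-
```

Note how heavily the top of the scale dominates: even with 50 “C” ballots in the mix, the averaged grade lands at A-, which is one reason anything below a B on CinemaScore reads as a warning sign.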
The CinemaScore letter grade is a useful piece of information, but it’s important to know its limitations. Keep in mind that the respondents are self-selecting, meaning that everyone who fills out a CinemaScore ballot has already chosen to spend the time and money on attending a film in the theater on its opening night, and to stay for the length of the film’s runtime.
Such an audience is likely to be biased toward the film before they show up in the theater — unlike, for instance, critics, who see a wide variety of films whether or not they would choose to see them in their free time.
What that means for a movie’s CinemaScore grade is that the expectations these viewers bring into the theater likely exert some effect on the grade: The audience member will be very happy if the film meets their expectations, or, if it doesn’t, potentially very disappointed.
This self-selecting aspect of CinemaScore grades means they’re not necessarily the best measure of a film’s success as a work of art. A better way to think about a CinemaScore grade is that it measures how well the film’s advertising sets expectations for the audience that is attracted to that advertising.
Films that receive high CinemaScore grades have accurately portrayed to audiences what they’ll see in the theater. Films with low CinemaScores typically have disappointed the audience’s expectations in some way — which can mean that films that play with genre conventions often attract lower grades.
For example, horror films that don’t follow rigid horror conventions, such as last year’s Mother! and this year’s Hereditary, seem to disproportionately earn low grades.
Furthermore, a high CinemaScore grade tends to correlate with a long run in theaters, which makes a lot of sense: If the opening weekend audience likes a movie, then its “buzz” will be good, and word of mouth will encourage more hesitant audiences to go see the film. The score can also help distributors decide whether, and how rapidly, the film should expand to more theaters, and the information CinemaScore gathers helps distributors tailor their marketing in the weeks following release.
Whether the CinemaScore letter grade is publicly available depends on how widely the film is opening. Ed Mintz, who founded CinemaScore in 1979, told me by email that CinemaScore is available for every film that opens on more than 1,500 screens, which is considered a “wide” opening and would include most major Hollywood releases.
“If the release is 1,499 or less, then the distributor for the film (if interested) can choose to do a proprietary project,” Mintz said, meaning that distributors can pay CinemaScore to collect audience responses to the film and share them privately with the distributor. So it’s not that limited-release films don’t receive CinemaScores — it just doesn’t happen automatically, and those films that do open on fewer than 1,500 screens aren’t listed publicly on CinemaScore’s website.
This happens more than you might think. For instance, the 2010 film The King’s Speech is often cited as having earned the rare A+ CinemaScore — but if you search for the film on the company’s website, the score isn’t available; the film first opened in four theaters, and expanded to only 700 for its first weekend in wide release about a month later.
More recently, the Twitter account for Dinesh D’Souza’s political film Death of a Nation touted an A CinemaScore, and when this was called into question by some who noticed the movie’s absence on CinemaScore’s website, the site’s official account tweeted that the movie was “privately” polled. The film opened on 1,005 screens, which means the movie didn’t meet the requirements to have its score automatically released.
But Aaron Brubaker, one of Death of a Nation’s producers, confirmed to me by phone on the Monday following the film’s release that the distributors had enlisted CinemaScore to poll opening weekend audiences, which resulted in the A score.
So CinemaScore is a good measure of one kind of audience reaction: the audience most primed to like the film in the first place. It’s not, however, a measure of the film’s overall quality, or its ability to persuade those who are skeptical about the movie to love it.
This is especially important when evaluating low scores for artistically challenging films. And by the same token, a high CinemaScore grade should be taken with a healthy grain of salt for highly partisan films such as D’Souza’s (or, conversely, from a liberal documentarian such as Michael Moore), or for faith-based films, for which the audience (especially on opening weekend) is likely to be almost entirely people who already agree with the film’s ideological perspective. It doesn’t measure the accuracy or persuasive power of the film’s argument.
Yet because CinemaScore has a real methodology behind the information it gathers, it is at least a fairly reliable source for what it actually measures. That’s more than can be said for the other common measure of audience opinion: the audience scores collected by sites like Rotten Tomatoes, Metacritic, and IMDB.
What about audience scores on sites like Rotten Tomatoes?
Another favorite measure of audience opinion comes from websites that collect and tabulate the opinions of self-selecting volunteers. Rotten Tomatoes and Metacritic are primarily interested in tabulating critics’ opinions, but they also collect grades from audiences and display them on the site. And IMDB doesn’t collect critics’ grades at all — just audience scores. (In some iterations of IMDB, you may also see the Metacritic score, which IMDB draws from Metacritic’s data set of aggregate critics’ opinions.)
Often, critics and audiences roughly track with one another. But sometimes critics’ scores and audience scores diverge greatly, a fact that people with critically derided films sometimes proclaim as if it says something positive about their film:
“This 0% "critics" rating comes from 11 people—the 90% approval is from over 3,000 audience reviews! Get tickets now and see why people love #DeathofaNation!” — Death of a Nation (@doanfilm) August 8, 2018
“Audiences loved Gotti but critics don’t want you to see it… The question is why??? Trust the people and see it for yourself! pic.twitter.com/K6a9jAO4UH” — Gotti Film (@Gotti_Film) June 19, 2018
“Oh boy, critics had their venom & knives ready. Fans LOVE the movie. Huge positive scores. Big disconnect w/ critics & people. #Baywatch https://t.co/K0AQPf6F0S” — Dwayne Johnson (@TheRock) May 26, 2017
This argument rests on a familiar narrative: critics, with their highfalutin ways and their snobby tendencies, are disconnected from the real, authentic folk, and therefore shouldn’t be trusted. Critically panned movies like Death of a Nation, Gotti, and Baywatch, the argument goes, are for the fans, not the critics.
But not so fast. Even accounting for a CinemaScore grade’s limited utility, at least there’s some methodology behind it: It polls people who did, at minimum, actually watch the movie. But with Rotten Tomatoes, Metacritic, and IMDB scores, that’s not necessarily the case. None of these sites require users to prove that they’ve seen the film. All a person has to do is register for an account on the site.
The fact that a person needn’t see the film to vote on it has been exploited, at times, by activist groups seeking to drive down the audience score on these sites before the film is even released (as happened with the Ghostbusters reboot and Black Panther), or to artificially inflate it. The potential for “falsification” of the score is high.
That means these sorts of audience scores are already suspect. But even absent foul play, the sampling is far less rigorous than CinemaScore’s, which surveys everyone in a given theater. It seems reasonable to assume that the people motivated to spend time entering an audience score on a website feel very strongly about the film, in either a positive or negative direction.
Additionally, the data likely skews toward the opinions of the people who use those sites — which may, for instance, favor those with more leisure time, more access to the internet, and more technological savvy than others.
Then compare that subset of people to critics: a group that skews male and white, yes, but also one that doesn’t choose to review a film based on how strongly they feel about it, because reviewing films is their job. That can lead to more shoulder-shrugging reviews (the 3/5-star score), but it also makes for a more moderate aggregate score.
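The self-selection problem can be made concrete with a toy simulation. All the numbers here are invented; the point is only that when strong feelings drive participation, the resulting average drifts away from the population’s true average.

```python
import random

random.seed(0)

# Toy model of self-selection bias in online audience scores.
# Assumption: everyone holds an opinion on a 1-10 scale, but only
# people with strong feelings (far from a neutral 5.5) bother to vote.

population = [random.gauss(6.5, 1.8) for _ in range(100_000)]
population = [min(10, max(1, x)) for x in population]  # clamp to 1-10

def mean(xs):
    return sum(xs) / len(xs)

# Probability of voting grows with distance from neutral:
# someone at 10 or 1 always votes, someone at 5.5 never does.
voters = [x for x in population if random.random() < abs(x - 5.5) / 4.5]

print(f"true average opinion:  {mean(population):.2f}")
print(f"self-selected average: {mean(voters):.2f}")
```

Because the simulated population leans positive, the loudest voices here inflate the score; a population leaning negative would see the opposite. Either way, the self-selected average is not the population’s.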
What’s the best way to tell whether viewers liked a film?
So is there a way to definitively tell if audiences loved or hated a film? Not really. Audiences, like critics, are people, and they have different opinions.
But if you’re trying to determine whether an audience liked a film, there are some ways to get a general sense of the reactions:
- CinemaScore is probably still a useful, if limited, measure — but only for films that open on more than 1,500 screens, and select films that pay to be polled and choose to announce the CinemaScore grade. A CinemaScore grade is most useful for suggesting whether or not a movie met the audience’s expectations. That’s not the same as quality, but it can be useful nonetheless.
- Check out the box office statistics for a film (the most reliable site for this information is BoxOfficeMojo). If a film had a high per-screen average on its opening weekend — meaning a lot of people went to see it — that means that audiences were enthusiastic about the film going in. If the film makes a lot of money in its second weekend, that’s a good indicator that the word-of-mouth for the film was good, which in turn means the opening-weekend audiences were enthusiastic about it. If the film expands onto more screens in its first few weeks in theaters, that indicates that the distributors believe people want to see the film. And if a film makes an enormous amount of money, then it’s a great indicator that audience opinion was strong.
- Search for the film on social media, including searching for the film’s hashtag (#BlackPanther) on Twitter. You can get a limited idea of the film’s popularity by seeing what people are saying.
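The box office arithmetic in the list above is simple enough to sketch. The grosses and screen counts below are invented for illustration; real figures come from sites like BoxOfficeMojo.

```python
# Two back-of-envelope box office indicators described above.
# All figures are hypothetical, not real grosses.

def per_screen_average(weekend_gross, screen_count):
    """Opening-weekend gross divided by number of screens:
    a rough gauge of how full the theaters were."""
    return weekend_gross / screen_count

def weekend_drop(first_weekend, second_weekend):
    """Percentage drop from opening weekend to the second weekend.
    A small drop suggests good word of mouth."""
    return (first_weekend - second_weekend) / first_weekend * 100

opening = 40_000_000  # hypothetical opening-weekend gross ($)
second = 24_000_000   # hypothetical second-weekend gross ($)
screens = 3_200

print(f"per-screen average:  ${per_screen_average(opening, screens):,.0f}")
print(f"second-weekend drop: {weekend_drop(opening, second):.0f}%")
```

In this made-up example, a $12,500 per-screen average and a 40 percent second-weekend drop would both read as signs of an enthusiastic audience; drops of 60 percent or more usually suggest the opposite.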
But the best measure of audience opinion — in a limited, but useful way — is to talk to your friends. Ask them what they thought of a film. Find some people, both critics and laypeople, whose opinions you find interesting or trust. The beauty of art is that everyone has different opinions about it — and flattening that into an aggregate number may not be the most helpful measure anyhow.