Stuff George Writes

In which a parent pretends he has time to write

IMDB Score Bias: It's Temporal

May 1, 2018 George Saines
Photo by Studio Tempura.

This post was originally published on 2/7/2011.

I'm a nerd, an economist, and a movie snob. Sometimes that makes me hard to deal with.

My brother has a natural fear of picking movies with me. During our holiday visits home we invariably try to watch a movie, and it ends with eyes being rolled in my direction. My family has taken to calling any movie I pick a "depressing indie drama."

I don't think of this as being difficult; I think of it as getting the most out of my time. Having seen thousands of great movies, I have trouble committing two hours to a movie of dubious quality. In an effort to avoid wasting time on bad and mediocre flicks, I am on a quest to better predict how much I will enjoy a given movie. I've rated more than 700 movies on Netflix, I visit IMDB about 25 times a week, and I've tried Flixster, Rotten Tomatoes, and the blogs of well-known critics. The goal is to accurately correlate my movie-watching happiness with the ratings provided by these sources. So far the results are disappointing. No single source accurately predicts my preferences, and even comparing sources and creating composite indexes frequently leads to contradictory predictions. To date, the best predictor I've found is a film's IMDB rating, but this number is far from perfect.

IMDB ratings are worst when movies are newly released. For a film like Citizen Kane, the IMDB score is accurate, and no wonder: enough people have seen it to decide how good it is. In fact, Orson Welles' masterpiece has 145,319 ratings on IMDB, a score of 8.6/10, and is listed by the American Film Institute as the best movie ever made. [1] Citizen Kane is pretty similar to other critically acclaimed films on IMDB. Among the top ten films, the average number of IMDB votes is 152,073 and the median score is 8.45. With so many ratings, my guess is that these movies are more accurately rated than a movie with 1% as many reviews that was released last week.

Take Inception, for example. When it was released, it had a rating of 9.3 on IMDB and thousands of reviews. But how could this be? Was Inception actually a better movie than Citizen Kane, Casablanca, The Godfather, Gone with the Wind, Lawrence of Arabia, The Wizard of Oz, The Graduate, On the Waterfront, Schindler's List, and Singin' in the Rain? Having seen all of these films, I had a hard time believing it.

So I hypothesized that IMDB ratings are biased upwards for young movies. When new movies come out, the first people to see them are early adopters and critics. Someone uninterested in a new film may see it eventually [2], but they are unlikely to see it the first day it arrives at their local theater. My contention was that this self-selection among early viewers, combined with marketing and release hype, would produce and reinforce overly positive ratings.

To test this theory, I spent three months sampling a randomly selected group of 21 new releases. I started sampling on November 9th by finding IMDB's list of upcoming movies and recording the first data point for all of them [3]. I then checked the ratings once weekly to see if my prediction about prerelease hype held up to a little empirical rigor. My sample was surprisingly diverse: among the movies I sampled there were big-budget Hollywood films like Tron: Legacy as well as indie films like Rare Exports [4]. Because some of the films were slated to release later in December and some had pre-screeners who rated the movies before general release, I didn't have an equal number of data points for each film. Almost every film did reach score equilibrium, meaning its score remained stable for at least three sampling periods (three weeks). Here is a time series for the films. I've omitted the titles since they would clutter the graph too much:
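For anyone curious how such a weekly log might be kept, here is a minimal sketch in Python; the file name and column names are my own assumptions, not details from the original tracking:

```python
import pandas as pd

# Hypothetical weekly log: one row per film per sampling date.
# File name and column names are illustrative assumptions.
ratings = pd.read_csv("imdb_weekly_ratings.csv", parse_dates=["sampled_on"])
# Expected columns: title, sampled_on, rating
# e.g. "Tron: Legacy", 2010-11-09, 8.0

# Pivot into a time series: one row per sampling week, one column per film.
series = ratings.pivot(index="sampled_on", columns="title", values="rating")
print(series)
```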

Looking at the graph is a bit confusing, and there isn't a clear trend, so I turned to the numbers. With a little statistical crunching, I found that the average movement in rating was -0.2125, significant at 95% confidence. In other words, new movies do have inflated IMDB ratings: on average those ratings are 0.2 points above where they will eventually settle.
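One way to do that crunching is a one-sample t-test on the per-film rating changes. A minimal sketch, assuming the change from first sample to settled score has already been computed for each film (the numbers below are placeholders, not the original data):

```python
from scipy import stats

# Change from first sampled rating to settled rating, one value per film.
# Placeholder values for illustration; the original per-film data isn't published here.
deltas = [-0.3, 0.0, -1.6, -0.1, 0.0, -0.4, -0.2, 0.1, -1.6, -0.2]

mean_change = sum(deltas) / len(deltas)

# One-sample t-test: is the mean change significantly different from zero?
t_stat, p_value = stats.ttest_1samp(deltas, popmean=0.0)
print(f"mean change: {mean_change:.4f}, p = {p_value:.3f}")
# p < 0.05 corresponds to significance at 95% confidence.
```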

The greatest volatility in rating came in the first two sample periods, which is to be expected. The Tempest and Casino Jack were the biggest losers, shedding 1.6 points over the three-month period. Several films appear to have been correctly assessed from the get-go and had no rating change after 12 weeks: I Love You Phillip Morris, The Tourist, The Fighter, Little Fockers, and a French film by the name of The Illusionist. The rest suffered small declines in score, consistent with my theory.
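As an aside, the "score equilibrium" rule mentioned above can be made precise. Here's one way to express it, with the stability tolerance as my own assumed parameter:

```python
def at_equilibrium(scores, periods=3, tolerance=0.05):
    """True if the last `periods` weekly scores are within `tolerance` of each other."""
    if len(scores) < periods:
        return False
    tail = scores[-periods:]
    return max(tail) - min(tail) <= tolerance

# A film whose rating held at 7.1 for its last three weekly samples:
print(at_equilibrium([8.3, 7.6, 7.2, 7.1, 7.1, 7.1]))  # True
```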

The takeaway here is that if you are asked to watch a new release, assume that the IMDB rating is overly-optimistic by about a fifth of a point, then go anyway and have a good time.

[1] Even a film snob like me must admit that it is ridiculous to make such a claim, but it sure sounds definitive.

[2] I suspect the biggest reason that uninterested people see films is social pressure.

[3] The equivalent page for this week would be here.

[4] I tracked all of the following films: Black Swan, I Love You Phillip Morris, Rare Exports, The Warrior's Way, The Tourist, The Tempest, The Chronicles of Narnia: The Voyage of the Dawn Treader, The Company Men, The Fighter, Tron: Legacy, Yogi Bear, How Do You Know, All Good Things, Rabbit Hole, Casino Jack, Little Fockers, True Grit, Somewhere, The Illusionist, Gulliver's Travels, and Country Strong.

In Movies, Research, Anecdotes

How Netflix Can Ruin Your App's Value Proposition

December 6, 2016 George Saines
Photo by Adrian Black.

When you are running a subscription-based web app, cancellations are a part of everyday life. At Skritter, we realized early on that knowing why people cancelled would be critically important. We ask everyone who cancels to let us know why, and a surprising 50% of people actually do. We get these missives via email, and I track them to keep a pulse on what we need to improve. And so it was on a fairly routine morning a few weeks back, while reading one such cancellation email, that I was struck by what the user had written:

"I have no money these days. Great program, but $10 is a bit too expensive. Would sign up and keep it for 1/2 the price. Netflix costs less than you guys."

Comparing prices between products is nothing new, of course, and the big players in any space will always influence prices for the little guys, but this was personal for two reasons. First, I use Netflix all the time and personally love the company and their product. Second, I also happen to run a small startup which charges more per month than the big red giant I so adore. This user's comment was damning because, while Skritter helps people a lot, it's hard to argue that it's more fun or a better value than renting 20,000 streaming movies.

This got me thinking about web app subscriptions as a whole. Choosing a price for your product is difficult, and it's one of the most important decisions you will make as a business owner. So it's no surprise, faced with such an important decision, that many founders--ourselves included--look to the market for pricing clues.

Pricing for subscription services is homogenizing and falling. Back when monthly subscriptions were relatively unusual, there was less consensus regarding what an online web application should cost per month. Was it $50? Or was it $2? There was, and is, no one right answer for all teams. Unlike the market two years ago, however, customers today seem more certain about how much web apps should cost. This certainty is being driven by relative giants, the Netflixes and Xbox Lives of the world, who seem to be settling on a base price between $5 and $10/mo.

Just as people browsing the App Store expect paid apps to be a few dollars [1], so too do customers now expect to pay a certain amount for consumer web applications.

Big companies are providing customers with price anchors that will increasingly shape perceptions of value. While individual web apps will always have unique aspects that enable variable pricing, I don't think the day is far off when a young entrepreneur starting a B2C web app will unthinkingly set his price at $8.95/mo because that's what everyone expects.

The trend, however, is worse still. Because large consumer internet brands--like the industrial widget manufacturers of old--can leverage mass-production economics, the perceived "correct" cost of a web app is likely to decline over time. Netflix already benefits from a virtuous cycle: lower prices attract more users, and more users let it profitably lower prices further. There is some basement price below which Netflix will not want to go, but given that it used to cost $14.99/mo to get worse service than I currently get for $7.95/mo, I suspect that price is lower still. [2]

For those of us in the remote niches of the internet, I believe that companies like Netflix will increasingly ruin the perceived value of B2C products by driving prices ever lower. And here I thought the selection was the worst part of Netflix streaming.

 

[1] $4.03 for iPhones, and $4.37 for iPads, according to marketing firm Distimo.

[2] And perhaps more importantly, the predictability of revenue streams.

In Startups, Movies, Marketing
