Even if you limit the field to movies that have seen a theatrical release, the same principle applies. Some movies are shockingly bad, but are made with little effort by people who know they are making a bad movie. These seem less interesting, less sad, and less funny than very bad movies made by people trying very hard and spending lots of money.
Personally, when rating the badness of work, I judge it against my own assessment of the potential of the work. The potential isn't directly measurable or quantifiable, but it consists of things like the budget, the vision of the creator, the skill and effort of the artists involved in the creation, etc. etc.
So, for example, I think of Star Wars: Episode 1 as being a worse film than a lot of 50s low-budget sci-fi B-movies, even though I might choose to watch it more readily. The B-movies were pumped out without much effort. They're boring and poorly made on a technical level, but they never really had a chance to be any good. Episode 1 was made on a vast budget, drew on extremely rich source material, and was the culmination of unimaginable amounts of time and effort by a large number of hard-working people. For me, that makes the failure of Episode 1 more profound and more egregious.
What about a blue screen?
But no, a game company is the worst because a bunch of fanboys rallied to say so.
It's really quite fascinating to me how a movie like that, so incredibly terrible with seemingly no irony in its production, has somehow become a fantastic work of art. It's like that piece in a modern art museum that you think looks absolutely terrible, but something about it appeals to you.
I've probably seen The Room a half dozen times in various settings (movie theater showing it in jest to a huge audience, group of friends over a few beers, etc), and it's always overwhelmingly entertaining because it's just THAT bad. It almost impresses me more that they pulled this off (unintentionally, of course) when compared to some of my actual favorite films.
It's fascinating because of some similarities to the truly worst movie in history, The Tango Lesson[0]: each movie has one person as writer, director, and star, and (apparently) focuses self-indulgently on the abortive romantic entanglements of the author, who is utterly unaware that nobody else could possibly care.
The Tango Lesson falls through the bottom of the badness scale into a realm where it can't even be enjoyed for its unredeemable awfulness. Everyone I was with thought it was dreadful, and what was left of the audience shuffled out shaking their heads.
The Nostalgia Critic reviews The Room: http://thatguywiththeglasses.com/videolinks/thatguywiththegl...
The 'best' bad movies are hard to quantify. I certainly can't think of any 'good' bad movies that planned or aimed to be regarded as bad. As another commenter said, you need the creators to believe that they were making something worthwhile...
I would be interested to see how its rating has changed over time.
There are certainly films that are "so bad they're good", but then there are just bad films.
Unexpectedly, however, "Plan 9 from Outer Space" is rated much more evenly for a film that's often been considered the worst of all time.
I'm a die-hard MST3K fan and I've only seen Manos two or three times.
Even so, I Heart Huckabees tops my list of movies that I just can't watch. That one's actively grating rather than just incompetent.
R.O.T.O.R. - http://www.imdb.com/title/tt0098156/ratings?ref_=tt_ov_rt
My personal favorite bad movie has always been "Cool As Ice" [1], but even that seems to be rated higher than R.O.T.O.R.…
[1] http://www.imdb.com/title/tt0101615/ratings?ref_=tt_ov_rt
FiveThirtyEight has had a rough start so far, so I may be more inclined to seek more nuance than I normally would; their pieces to date have tended to be extraordinarily simplistic (Silver himself notwithstanding).
I think the author calls the data erroneous because the rating does not represent the actual quality of the movie, which is what IMDB aims to capture. He doesn't call it erroneous because of actual errors in the process of collecting ratings.
I think, as you said, you are just inclined to seek more nuance.
In the (paraphrased) words of Andrew Gelman:
"I can't knock this guy for slamming my book without reading it. For example, I have never read the autobiography of Uri Geller, but I'm confident it's full of crap."
If you don't think Gonojagoron Moncho is lying to these people about the point they're taking offense over, how is it relevant that they haven't seen the movie?
I'm sure it's possible, but no rating represents the "actual quality of the movie." A rating is a subjective amalgam that encompasses myriad biases, even when the film is watched with total scrutiny.
Far too often you find a movie (usually a somewhat recent release) that's just flat-out terrible despite having a > 7 score on IMDB (and usually a Metacritic score in the 30s or 40s).
Example: "Spy Kids" got a 93% fresh rating from critics, but 45% from audience ratings.
Code attached should you want to investigate for yourself.
If I had unlimited time and money I would create a new movie review site that fights corruption. I would have a phone app for doing the reviews and it would require taking a photograph of your ticket stub at the theater and uploading it for proof (w GPS tag). It is far too easy and anonymous to participate in these online movie ratings sites that have a large impact on people's decision making. I have more ideas for the review site if anyone wanted to discuss it further.
If you're interested, there's a really interesting documentary about Troll 2 called Best Worst Movie, which covers the film's production and cult following. After watching the documentary, Troll 2 becomes an enjoyable movie because you know all the crazy backstories to the actors and scenes.
It's interesting that so many people gathered solely to express "we want this guy dead, not imprisoned." I wonder if there's more context?
Given the dynamics of Bangladeshi politics, a life sentence from the current government is actually a jailing until the opposition comes to power and pardons everyone convicted by its predecessor.
I'm against capital punishment, but I understand where the protesters are coming from: they want real justice for the atrocities in 1971, and see the death penalty as the only way to obtain lasting closure.
Politics in a democracy of 180MM people is really, really messy.
That, uh, seems like plenty of context to me? Throughout history, seriously bad dude plus nationalistic fervor equals lynch mob pretty darn often. The seriously bad dude part is usually even optional!
The most obvious effect is movies that drop in the rankings 30 days after release; people who just saw a movie give it 10/10 and then stop participating, so 30 days later they are no longer considered 'active' and their vote stops contributing to the overall ranking.
If someone only gives 10s or 1s to everything, their vote of 10 should probably have less weight than that of someone who distributes their votes more evenly.
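A down-weighting scheme along those lines could be sketched roughly as follows. IMDb's actual weighting formula is not public, so the particular weights and thresholds here are purely illustrative assumptions, not its real method:

```python
# Hypothetical down-weighting of low-variance voters. IMDb's real
# formula is secret; this only illustrates the idea of trusting a
# voter less when their rating history has no spread.
from statistics import pstdev

def vote_weight(history):
    """Weight a voter by the spread of their past ratings (1-10 scale).

    A voter who gives only 10s (or only 1s) has zero spread and gets
    the minimum weight; a voter who uses the whole scale gets ~1.0.
    """
    if len(history) < 2:
        return 0.5  # not enough history to judge either way
    spread = pstdev(history)              # 0.0 .. 4.5 on a 1-10 scale
    return 0.1 + 0.9 * min(spread / 3.0, 1.0)

def weighted_rating(votes):
    """votes: list of (score, voter_history) pairs -> weighted mean."""
    total = sum(vote_weight(h) * s for s, h in votes)
    weight = sum(vote_weight(h) for _, h in votes)
    return total / weight if weight else 0.0
```

With this sketch, a ballot-stuffer who has only ever given 10s contributes at a tenth of the weight of a voter with a varied history, which drags an astroturfed average back toward the honest votes.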
Did they start hanging out with the bad kids, take up cigarettes, drinking, gambling only to progress to crack and burglaries one of which ended with our Data shooting a home owner who returned unexpectedly?
I guess I don't understand what data is. I always thought it was a set of values. And I always thought that the problem when using data was in the interpretation, and that a prudent consumer of data would always be careful to distinguish between a random sample and self-selecting sample when drawing conclusions, and then would only state conclusions couched in the language of statistical inference.
Leaving aside the question of why I should give a fuck about this supposed outrage, why does the author expect there to be a strong correlation between movie quality and the ratings on a website devoted to providing entertainment by having users rate movies?
When The Matrix is purported to be a better movie than Lawrence of Arabia, the problems of interpretation are systemic.
I thought it was indicative of a larger trend where crowdsourced data are used to illustrate a point. Like the Google flu trends articles, which have gone around HN at least twice, once when they were successful (https://news.ycombinator.com/item?id=5040204) and once when they were critiqued (e.g., https://news.ycombinator.com/item?id=7455307).
I work a lot with sampled data, and I have found that sampling issues can be some of the most difficult to appreciate and to quantify -- even for experts.
I guess it comes down to sampling from one distribution, P(x), when the situation you really care about samples according to a different distribution P'(x). If P is far from P', your conclusions from P can be arbitrarily bad. If you have an adversary moving P around deliberately, as here, it's even worse.
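The P vs. P' point can be made concrete with a toy simulation. The numbers below (a uniform "true" distribution of opinions, participation probability proportional to enthusiasm) are assumptions chosen only to show the mechanism of self-selection bias:

```python
# Toy illustration of sampling from P (self-selected voters) when you
# care about P' (everyone who saw the movie). Fans vote more often,
# so the observed mean overshoots the population mean.
import random

random.seed(0)

# "True" opinions: every viewer holds a rating 1..10, uniformly.
population = [random.randint(1, 10) for _ in range(100_000)]
true_mean = sum(population) / len(population)          # ~5.5

# Self-selection: a viewer with rating r votes with probability r/10.
sample = [r for r in population if random.random() < r / 10]
biased_mean = sum(sample) / len(sample)                # ~7.0

# The gap between the two means is pure selection bias; no individual
# voter lied, yet the aggregate misrepresents the population.
```

Analytically the biased mean converges to E[r²]/E[r] = 38.5/5.5 = 7.0 here, a full 1.5 points above the truth, and an adversary shifting the participation probabilities can move it much further.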
If there is an interesting statistical result it's that the movie's rating is entirely consistent with crowd sourced predictions. The theory is that 'wisdom of crowds' results directly from diversity among those making predictions.[1] In the case of the lowest rated movie, those making predictions were unusually homogeneous, and therefore an inaccurate prediction as to the quality is unsurprising.
Again, it's all in the interpretation; e.g., there's statistical evidence that a lot of morons rated The Matrix.
[1] Diversity Prediction Theorem: http://vserver1.cscs.lsa.umich.edu/~spage/ONLINECOURSE/predi...
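The Diversity Prediction Theorem cited above is an exact identity, so it can be checked numerically: the crowd's squared error equals the average individual squared error minus the diversity of the predictions. The sample numbers below are made up solely to illustrate a homogeneous crowd:

```python
# Check Page's Diversity Prediction Theorem:
#   (crowd error) = (avg individual error) - (prediction diversity)
# where the crowd prediction is the mean and errors are squared.
def diversity_decomposition(predictions, truth):
    n = len(predictions)
    crowd = sum(predictions) / n
    crowd_error = (crowd - truth) ** 2
    avg_error = sum((p - truth) ** 2 for p in predictions) / n
    diversity = sum((p - crowd) ** 2 for p in predictions) / n
    return crowd_error, avg_error, diversity

# A homogeneous crowd (near-zero diversity) that is wrong is wrong
# together: crowd error barely improves on the average individual.
ce, ae, div = diversity_decomposition([9, 9, 10, 9], truth=3)
```

The identity holds for any inputs, which is exactly why a crowd of like-minded raters (as with the lowest-rated movie here) gives you almost no error-cancellation benefit over a single rater.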
That tipped the romset into the spotlight - there were some leaderboards for highest recent activity and so on. Other people started downloading the romset and voting on it.
Suddenly this obscure romset was catapulted into most of the lists for "most active" and "best" etc.
I had rated the game honestly. I had fond memories playing the cab for a week on holiday in my youth. But I was surprised that so many other people felt the same, especially on the Mame platform where the game's controls made it tricky.
All my deliberate attempts at voting shenanigans failed miserably. (Although I haven't investigated MTurk or similar yet.)
I wish there was a site like Meatball wiki where people could share their vote-weighting methods.
Obscure? :-(
This was my favorite arcade game as a kid. I don't think it was very obscure. Though I have an actual Tron arcade machine not 10 feet from me, so perhaps I'm the wrong person to judge that…
This one has caused a stir on IMDB for reasons of moral integrity, or bending the truth. Rather odd, since this has been going on in films and similar media for a very long time.
"International Gorillay", a film from Pakistan depicting Salman Rushdie [1]. Coincidentally it was released a year or so after Rushdie's "The Satanic Verses" [2]. Ayatollah Ruhollah Khomeini [3] wanted Rushdie dead because of this novel. BTW Iranian films can be very good, like "Where is the Friend's Home?" [3].
[0] http://www.imdb.com/title/tt0052077/ [1] http://www.imdb.com/title/tt0251144/ [2] http://en.wikipedia.org/wiki/The_Satanic_Verses [3] http://en.wikipedia.org/wiki/Ruhollah_Khomeini [4] http://www.imdb.com/title/tt0093342/
The only movies that escape from the IMDB average are a) decent movies that are loved by the masses, b) great movies the masses don't watch (being in black and white or not in English alone pretty much guarantees 2 bonus points), and c) movies everyone agrees are total crap.
What you may want is a personalized rating based on people with the same tastes as you.
I haven't used it in a while, but I used to find MovieLens, from the University of Minnesota, useful for that: http://movielens.umn.edu
It even has a feature where you can get joint personalized recommendations for you and a friend (assuming the friend also has a MovieLens account, of course), which is useful for brainstorming movies to rent with someone that might be mutually enjoyed.
Data analysis by Eugene Bialczak. Also, a disclaimer: the author wrote much of the IMDb Trivia App.
Besides, for us in small countries, we would probably have almost no ratings for any film besides blockbusters.
It's similar to the problem of an author's "best" or most notable works getting lower crowd-sourced reviews than their average work. What happens there is that generally people only bother to read, and subsequently review, books that they think there's a good chance they'll like. If you're a mediocre author putting out filler in your genre, your 6th, average, book will get pretty good reviews because everyone who bought it knew what they were getting. If your 7th book happens to be excellent, hype will induce a lot of people who don't like your genre to buy it and see what everyone's talking about -- and they'll take it out on you in their reviews afterwards.
It's difficult to mock properly when all you've given is a question mark. :)
"The next lowest-rated movie on IMDb — 1.8 stars overall ..."
I am not sure what the writer means by a "qualified" movie, but this one does rate less than 1.8: http://www.imdb.com/title/tt2094870/
It has votes from only 195 users as of this writing, though.
I'm thinking this movie otherwise gets forgotten in the trash bin of bad movies and the data would never tell you anything because it wouldn't exist.
To me that's just stating the obvious. Of course if there is such a thing as a worst movie then it will have a higher percentage of 1 star votes than other movies. So I don't know how that's evidence for anything except that the movie seems to be bad.
I thought the majority of IMDB data was not downloadable?
Also you can buy a license for access to more complete datasets.
Seriously, I do take IMDB ratings into account, but I consider them unreliable at best. Inception, when it came out, was the best movie of all time for a while, according to IMDB users. Enough said.
For me the best places are meta sites that aggregate from many related sites.
At Rotten Tomatoes, the Seth Rogen vehicle "Neighbors" was rated 100%, last I checked. Meaning only: no reviewer at that point had said it was awful. And yet, I saw it and it was awful.
At IMDb, I can see that it rates as 7.6/10.0, and also that teenage voters loved it (9.0/10.0), while older viewers hated it (5.0/10.0). Far more useful information.
Gunday (the film of the parent article) gets fairer treatment from IMDb's top 1000 users, who rate it 4.9/10.0.
Each site has its own user base, and those users have their own biases. A review from Rotten Tomatoes will vary greatly from those of sites that include only noteworthy critics or only crowdsourced opinion without the "community" aspect. Some communities will be more critical, while some will be less. Like most online reviews, the criteria for rating is completely subjective; one user gives it a 10 because it had their favorite actress in it, while another user gives it a 5 because there was a scene they didn't like. What's awful to you may not be awful to me, and crowd-sourced data or aggregate generalized polling isn't a great way to distinguish that.
If you look below that, you can see the actual "Average Score", which in the case of "Neighbors" is actually lower than IMDb's, at 7.1/10.
The release for that movie is the 9th, so it's not out yet outside of limited distribution. Wait until all the reviews come in this weekend for a more complete score.