Lists by Tony1337
97-99 = 9+ = A+ = masterpiece
93-96 = 9 = A = standing ovation
90-92 = 9- = A = great acclaim
87-89 = 8+ = A- = acclaim
83-86 = 8 = A- = great praise
80-82 = 8- = B+ = praise
77-79 = 7+ = B+ = excellent
73-76 = 7 = B = very good
70-72 = 7- = B = good
67-69 = 6+ = B- = above average
63-66 = 6 = C+ = average
60-62 = 6- = C = below average
55-59 = 5 = C- = fair
50-54 = 5 = D+ = fair
45-49 = 4 = D = poor
40-44 = 4 = D- = poor
30-39 = 3 = F = bad
20-29 = 2 = F = awful
0-19 = 1 = F = awful
The first column is the rating you see below. The second column is the rating that I actually gave it on IMDb. It is also what I would bring up on a message board. (For example, instead of saying I rate The Godfather 99/100, I say I rate it 9+/10.) The third column is an equivalent letter grade. The fourth column is a description.
Occasionally, for films scoring in the 90s, I will add a + to signify that I am especially fond of the film. I usually do not give 10/10s, but those films are the exceptions.
87 = movies that really made me laugh, cry, think, dream, etc.
77 = movies I strongly recommend
63 = movies I enjoyed
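For the curious, the conversion table above can be sketched as a simple threshold lookup. The function name and layout are my own illustration, nothing official:

```python
# The rating-scale table above as a threshold lookup.
# Rows are (minimum score, 10-point rating, letter grade, description).
SCALE = [
    (97, "9+", "A+", "masterpiece"),
    (93, "9",  "A",  "standing ovation"),
    (90, "9-", "A",  "great acclaim"),
    (87, "8+", "A-", "acclaim"),
    (83, "8",  "A-", "great praise"),
    (80, "8-", "B+", "praise"),
    (77, "7+", "B+", "excellent"),
    (73, "7",  "B",  "very good"),
    (70, "7-", "B",  "good"),
    (67, "6+", "B-", "above average"),
    (63, "6",  "C+", "average"),
    (60, "6-", "C",  "below average"),
    (55, "5",  "C-", "fair"),
    (50, "5",  "D+", "fair"),
    (45, "4",  "D",  "poor"),
    (40, "4",  "D-", "poor"),
    (30, "3",  "F",  "bad"),
    (20, "2",  "F",  "awful"),
    (0,  "1",  "F",  "awful"),
]

def grade(score):
    """Map a 0-100 score to (10-point rating, letter grade, description)."""
    for floor, ten_point, letter, description in SCALE:
        if score >= floor:
            return ten_point, letter, description
```

For example, grade(99) returns ("9+", "A+", "masterpiece").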
Note: Since IMDb only has movies, TV shows, and video games, for other media I will choose something I believe best represents the work. Please ignore the bold titles and instead look at the description below.
Key for side markings:
+/-: + means I rated it 9+ or 10 out of 10. (blank) means I rated it 7+ through 9 out of 10. - means I rated it 7 or below.
*/^: * means that the work is especially significant, artistically, culturally, or historically. (blank) means of comparable significance to most others on this list. ^ means relatively less importance.
Countries represented: (may not be updated)
United States (58)
United Kingdom (19)
Genres represented: (may not be updated)
Music Recording (14)
Music Composition (11)
Video Games (3)
I am ranking them on overall film quality, rather than how funny they are. If you want a list of the funniest films in my opinion, please see http://www.imdb.com/list/DQ-yTLtiNpQ.
I am ranking them solely on comedic content, so this ranking is not how they would appear in my list of all films. If all you wanted was a list of my favorite films that happen to be comedies, please see http://www.imdb.com/list/PPpJZ-mOJhk.
Overall film quality will be considered, but more important are the amount, quality, depth, and accuracy of the mathematical content.
There is a slight bias towards films in which romance plays a greater role, so this ranking is not necessarily how they would appear in my list of all films. If all you wanted was a list of my favorite films that happen to be romances, just look through my ratings and pick out the ones you consider romance.
I've given up on trying to tell sci-fi and fantasy apart. (Why is Inception sci-fi and Pleasantville fantasy? Both use a small amount of futuristic technology, and both rely on the imagination.) So here's a combined list of both genres.
It will be a personal ordering of all action/thriller/crime films I rated 7-/10 or higher.
Notably, the RT top critics % was removed for being too variable (due to the small number of top critics on RT), and a new source, MRQE, was introduced. Since an average rating tends to be a better measure of quality than mere percent approval, the weight of the RT critic rating has been increased at the expense of the RT critics % (Tomatometer). However, percentages are still useful for several reasons:
1) For users, this helps prevent fanboying: instead of giving a movie a 10 when they should have given it an 8, all a user can do is cast a "fresh" vote.
2) For critics, reviewers sometimes do not give a rating, but it is clear from the review whether they consider the film "fresh" or "rotten." Moreover, the same 2.5/4 rating from two critics might mean different things; one might be mostly positive with a few major gripes, while another might be mostly negative while readily acknowledging the film's strong points.
So they are here to stay, just weighted less.
Now the weightings are:
IMDb: 30% (3x)
- RT Critics % aka Tomatometer: 10% (0.1x)
- RT Audience %: 10% (0.1x)
- RT Critics Rating: 15% (1.5x)
- RT Users Rating: 5% (1x)
- MC Critics aka Metascore: 15% (0.15x)
- MC Users: 5% (0.5x)
MRQE: 10% (0.1x)
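As a sketch, the revised weighting can be computed like so. The ratings below are hypothetical, just to show the arithmetic; note the MC Users multiplier is written as 0.5x, the value needed for a user score out of 10 to fill its full 5% share:

```python
# A sketch of the revised weighting above. Each entry is
# (multiplier, rating); the multiplier encodes both the weight and
# the scale (out of 5, 10, or 100) of each source.
def weighted_score(ratings):
    """ratings: list of (multiplier, raw rating) pairs."""
    return sum(mult * rating for mult, rating in ratings)

# Hypothetical ratings for illustration (not taken from any real film):
example = [
    (3,    8.5),   # IMDb, out of 10, 30%
    (0.1,  90),    # RT Critics % (Tomatometer), 10%
    (0.1,  85),    # RT Audience %, 10%
    (1.5,  7.8),   # RT Critics Rating, out of 10, 15%
    (1,    4.0),   # RT Users Rating, out of 5, 5%
    (0.15, 80),    # MC Critics (Metascore), 15%
    (0.5,  7.9),   # MC Users, out of 10, 5%
    (0.1,  88),    # MRQE, 10%
]
print(round(weighted_score(example), 2))  # → 83.45
```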
- Besides the eight sources on Version A, I used essentially all the lists I could find in the first 30 pages of Google.
- To allow as many sources as possible, certain restrictions were tolerated, in particular restrictions of time: even a 2003 list (namely OCFS) was permitted. Exclusions based on nationality or studio were still prohibited, with the sole exception of AFI, due to its importance.
- Because their reliability varies greatly, I gave more weight to more reliable sources, like official content on movie websites, than to things like blogs. OCFS and AFI were also downweighted despite their great reputations, because they exclude films for recentness or foreignness, respectively.
- Because there were more sources used, I was able to expand the list to 75 films.
I searched on Google for "best animated films" (without quotes). I looked at all the lists I could find in the first 30 pages of results and selected the ones that were
(i) reliable, i.e. from a well-known entertainment website, film critic society, etc.
(ii) inclusive, i.e. not restricted to just American, Disney, etc. films
(iii) recent, i.e. from 2008 or later (this goes along with being inclusive)
This ensures that films from every country and every year up to WALL-E are represented fairly in this compilation. Films from after 2008 are also included, but they may deserve a higher ranking than shown.
The actual sources used were:
Worlds Best Films
I then assigned a score to each rank. For example, in a list of 50 films, 1st place might get 120 points, 2nd place = 118, ..., 50th place = 22, and everything not listed = 0. In general, the more films that were on a list, the more it was weighted, in order to maintain a constant step of 2 points between places. Then I added everything up.
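The rank-to-points scheme above can be sketched as follows. In a list of n films, first place earns 2n + 20 points (so 120 for a 50-film list) and each subsequent place earns 2 fewer:

```python
# A sketch of the rank-to-points scheme described above: longer lists
# start from a higher first-place score, so they carry more weight,
# while the 2-point step between places stays constant. Films absent
# from a list score 0.
def points(rank, n):
    """Points for finishing at `rank` (1-based) on a list of `n` films."""
    if rank is None:        # film not on this list
        return 0
    return 2 * n + 20 - 2 * (rank - 1)

# For a 50-film list: 1st place = 120, 2nd = 118, ..., 50th = 22.
total = points(1, 50) + points(None, 50) + points(22, 30)  # sum across lists
```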
See http://www.imdb.com/list/jGULmr_PHPM (Version B) for a list with more sources and more films.
Also, I am ranking them with an emphasis on the epic part, so this ranking is not necessarily how they would appear in my list of all films. How "great" or "epic" I consider the film will be as important as how much I liked it. If all you wanted was a list of my favorite films that happen to be epics, just look through my ratings and pick out the ones you consider epics.
Methodology in layman's terms:
I took an average of ratings from IMDb, Rotten Tomatoes, and Metacritic. I have designed it so that critics' and users' voices are heard equally. Everything after this point is slightly technical.
Methodology in more detail:
For selecting movies whose scores are to be computed, I have used the IMDb top 250, the top 100 out of the Empire 500, the AFI 100 (10th anniversary edition), and the Empire 100 of world cinema. Moreover, a film must be classified as a Feature Film to qualify. (I would say Documentary as well, but there happen to be none in the top 100, so this point is moot.) In particular, The Decalogue (with a score of 94.71) and Heimat (with a score of 90.73) have been excluded even though they are on the Empire world cinema list. They would have been #2 and #18, respectively.
For computing scores, data from IMDb, Rotten Tomatoes, and Metacritic have been used. The weights are selected to reflect a fair balance between critics' and users' ratings, as well as across websites. Another goal is to give more weight to more reliable sources of information. The weightings (and multipliers, since the raw score may be out of 5, 10, or 100) are as follows:
IMDb: 40% (4x)
- RT Critics % aka Tomatometer: 20% (0.2x)
- RT Top Critics %: 5% (0.05x)
- RT Audience %: 10% (0.1x)
- RT Critics Rating: 5% (0.5x)
- RT Audience Rating: 5% (1x)
- MC Critics aka Metascore: 10% (0.1x)
- MC User Score: 5% (0.5x)
To use The Godfather as an example, we have 4(9.2) + 0.2(100) + 0.05(100) + 0.1(97) + 0.5(9.1) + 1(4.4) + 0.1(100) + 0.5(8.6) = 94.75.
However, certain scores have a larger relative standard deviation than others, so effectively the weights are 25%, 20%, 10%, 10%, 5%, 5%, 20%, 5%, respectively. (This is a rough estimate based on raw standard deviation times weight, then adding all of those up and computing percentages. It is not especially accurate because of its failure to account for correlation, but I just wanted to make it simple.) This gives 25% to IMDb and MC apiece and 50% to RT. The reason RT gets more is because it has 5 different ratings that each measure different aspects. Also, this weighting means that critics get 55% of the voting power while users get 45%, which is approximately 50-50.
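One way to read that rough estimate in code (the spread values below are hypothetical placeholders I made up to illustrate the normalization step, not measured figures):

```python
# Rough effective-weight estimate described above: each nominal weight
# is scaled by the raw standard deviation of its source's scores, and
# the results are renormalized to sum to 100%.
nominal = [40, 20, 5, 10, 5, 5, 10, 5]              # weights from the table above (%)
spread  = [0.6, 1.0, 2.0, 1.0, 1.0, 1.0, 2.0, 1.0]  # hypothetical relative spread per source

raw = [w * s for w, s in zip(nominal, spread)]
effective = [100 * r / sum(raw) for r in raw]        # effective weights (%)
```

Sources with a wider spread of scores end up with a larger effective weight, which is the point of the adjustment.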
Occasionally for old or obscure films, some ratings (especially Metacritic) tend to be unavailable. In that case, those components are simply removed from the average and everything else is weighted up accordingly. For example, for Citizen Kane, we have (4(8.6) + 0.2(100) + 0.05(100) + 0.1(91) + 0.5(9.4) + 1(4.1)) / 0.85 = 90.94, where the 0.85 comes from the fact that 15% of the points (i.e. Metacritic) are missing. Now, different ratings have different distributions, so this isn't entirely correct, but I'm too lazy to care, and besides, it doesn't seem to mess up my list too much.
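A minimal sketch of that computation, reproducing the Citizen Kane figures above. (The MC Users multiplier is written as 0.5x, the value needed for a 10-point user score to fill its 5% share; it doesn't matter here since that rating is missing anyway.)

```python
# A sketch of the missing-component handling described above. Each
# source is (multiplier, weight share, rating or None); components
# with no rating are dropped, and the rest are scaled up by dividing
# by the total weight share actually present.
def score(components):
    total = sum(mult * r for mult, share, r in components if r is not None)
    present = sum(share for mult, share, r in components if r is not None)
    return total / present

# Citizen Kane, using the figures quoted above (Metacritic missing):
kane = [
    (4,    0.40, 8.6),   # IMDb
    (0.2,  0.20, 100),   # RT Critics %
    (0.05, 0.05, 100),   # RT Top Critics %
    (0.1,  0.10, 91),    # RT Audience %
    (0.5,  0.05, 9.4),   # RT Critics Rating
    (1,    0.05, 4.1),   # RT Audience Rating
    (0.1,  0.10, None),  # MC Critics (missing)
    (0.5,  0.05, None),  # MC Users (missing)
]
print(round(score(kane), 2))  # → 90.94
```

When nothing is missing, the weight shares sum to 1 and the division changes nothing, so the same function covers the ordinary case too.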
An interesting thing to note: when comparing two ratings in the same category (critics or users), you get about 60% positive correlation; when comparing across categories, you get only about 10%. Of course, this is very rough and not statistically valid, because I only sampled lists of the top films of all time, so clumping at the top is expected. The true correlation across all films ever made should be much, much higher.
Also, using the IMDb top 250 is bound to bias the correlation figure, since the IMDb rating is being used both to select candidates and to evaluate them, but it is the only systematic way to bring in new films like Toy Story 3 or A Separation. Note, however, that it does not bias the final scores; it only affects which movies are selected for my data set. (It is my hope that I computed scores for all movies that should be in the top 100.) Fortunately, "junk" (at least according to everyone but IMDb) like Shutter Island gets weeded out by poor RT/MC scores. This goes to show the importance of including both critics' and users' ratings: my aim in this list is to have films that are highly acclaimed by critics and audiences alike.