Lists by Tony1337

a list of 114 titles
Here's my scale:

97-99 = 9+ = A+ = masterpiece
93-96 = 9 = A = standing ovation
90-92 = 9- = A = great acclaim
87-89 = 8+ = A- = acclaim
83-86 = 8 = A- = great praise
80-82 = 8- = B+ = praise
77-79 = 7+ = B+ = excellent
73-76 = 7 = B = very good
70-72 = 7- = B = good
67-69 = 6+ = B- = above average
63-66 = 6 = C+ = average
60-62 = 6- = C = below average
55-59 = 5 = C- = fair
50-54 = 5 = D+ = fair
45-49 = 4 = D = poor
40-44 = 4 = D- = poor
30-39 = 3 = F = bad
20-29 = 2 = F = awful
0-19 = 1 = F = awful

The first column is the rating you see below. The second column is the rating that I actually gave it on IMDb. It is also what I would bring up on a message board. (For example, instead of saying I rate The Godfather 99/100, I say I rate it 9+/10.) The third column is an equivalent letter grade. The fourth column is a description.

Occasionally, for films rated in the 90s, I will add a + to signify that I am especially fond of the film. I usually do not give 10/10s, but those films are the exceptions.

Significant cutoffs:
87 = movies that really made me laugh, cry, think, dream, etc.
77 = movies I strongly recommend
63 = movies I enjoyed
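For convenience, the scale above can be written as a lookup table. A minimal Python sketch (the function and table names are my own, chosen for illustration):

```python
# Each row: (lower cutoff, rating out of 10, letter grade, description),
# taken directly from the scale above, highest band first.
SCALE = [
    (97, "9+", "A+", "masterpiece"),
    (93, "9",  "A",  "standing ovation"),
    (90, "9-", "A",  "great acclaim"),
    (87, "8+", "A-", "acclaim"),
    (83, "8",  "A-", "great praise"),
    (80, "8-", "B+", "praise"),
    (77, "7+", "B+", "excellent"),
    (73, "7",  "B",  "very good"),
    (70, "7-", "B",  "good"),
    (67, "6+", "B-", "above average"),
    (63, "6",  "C+", "average"),
    (60, "6-", "C",  "below average"),
    (55, "5",  "C-", "fair"),
    (50, "5",  "D+", "fair"),
    (45, "4",  "D",  "poor"),
    (40, "4",  "D-", "poor"),
    (30, "3",  "F",  "bad"),
    (20, "2",  "F",  "awful"),
    (0,  "1",  "F",  "awful"),
]

def grade(score):
    """Map a 0-100 rating to its (x/10, letter, description) per the scale."""
    for cutoff, out_of_10, letter, desc in SCALE:
        if score >= cutoff:
            return out_of_10, letter, desc
    raise ValueError("score must be between 0 and 100")
```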
 
a list of 112 titles
These are my top picks for works in any art medium for every year since 1900. This includes film, literature, music, visual art, architecture, and more. My "top pick" is based both on my personal opinion and on how artistically or culturally significant the work seems to be.

Note: Since IMDb only has movies, TV shows, and video games, for other media I will choose something I believe best represents the work. Please ignore the bold titles and instead look at the description below.

Key for side markings:
+/-: + means I rated it 9+ or 10 out of 10. (blank) means I rated it 7+ through 9 out of 10. - means I rated it 7 or below.
*/^: * means the work is especially significant artistically, culturally, or historically. (blank) means it is of comparable significance to most others on this list. ^ means it is relatively less significant.

Countries represented: (may not be updated)
United States (58)
United Kingdom (19)
France (9)
Russia/USSR (9)
Japan (4)
Colombia (2)
Romania (2)
Spain (2)
Australia (1)
China (1)
Germany (1)
Mexico (1)
Nigeria (1)

Genres represented: (may not be updated)
Literature/Novels/Stories (26)
Film (25)
Music Recording (14)
Music Composition (11)
Art/Sculpture (8)
Nonfiction (7)
Architecture (6)
Opera/Musical/Ballet (6)
Photography (4)
Video Games (3)
Drama/Poetry (2)
 
a list of 12 titles
This is a list of my favorite films that I consider comedy.

I am ranking them on overall film quality, rather than how funny they are. If you want a list of the funniest films in my opinion, please see http://www.imdb.com/list/DQ-yTLtiNpQ.
 
a list of 16 titles
This is a list of films that made me laugh the most, for one reason or another.

I am ranking them solely on comedic content, so this ranking is not how they would appear in my list of all films. If all you wanted was a list of my favorite films that happen to be comedies, please see http://www.imdb.com/list/PPpJZ-mOJhk.
 
a list of 15 titles
This list contains a personal ordering of all animated films I rated 7-/10 or higher.
 
a list of 9 titles
This is a list of my favorite films that involve mathematics or mathematicians.

Overall film quality will be considered but more important is the amount, quality, depth, and accuracy of the mathematical content.
 
a list of 11 titles
Since this list is personal, the classification of something as "drama" is also personal (in particular, much narrower than most definitions). My definition of drama is a film which focuses on character development and interaction and that can be made to work on a stage without removing the essence of the film. It should also be on a more intimate scale than an epic.
 
a list of 12 titles
To be considered a romance film, romance must be a major aspect of the events surrounding the main character(s).

There is a slight bias towards films in which romance plays a greater role, so this ranking is not necessarily how they would appear in my list of all films. If all you wanted was a list of my favorite films that happen to be romances, just look through my ratings and pick out the ones you consider romance.
 
a list of 14 titles
Since this list is personal, the classification of something as "sci-fi/fantasy" is also personal. My definition of sci-fi/fantasy requires at least one of the following: (i) a human is placed in a scenario that conflicts with the natural laws of the world the human is accustomed to; (ii) the film takes place primarily in an imaginary land where magic or other unusual events that violate natural laws occur; or (iii) the film takes place primarily in a futuristic world where new technology enables events that are currently not possible. So Toy Story, for example, does not count as fantasy in my book.

I've given up on trying to tell sci-fi and fantasy apart. (Why is Inception sci-fi and Pleasantville fantasy? Both use a small amount of futuristic technology, and both rely on the imagination.) So here's a combined list of both genres.
 
a list of 12 titles
I consider all three of these to be very similar genres with a lot of overlap, so I'm going to lump them together.

It will be a personal ordering of all action/thriller/crime films I rated 7-/10 or higher.
 
a list of 100 titles
Please see http://www.imdb.com/list/ia_Y40nc7Do (Version A) for my first attempt. The methodology is basically the same; only the sources and their weightings have been modified slightly.

Notably, the RT top critics % was removed for being too variable (due to the small number of top critics on RT), and a new source, MRQE, was introduced. Since an average rating tends to be a better measure of quality than a mere approval percentage, the weight of the RT critics rating has been increased at the expense of the RT critics % (Tomatometer). However, percentages are still useful for several reasons:
1) For users, percentages help prevent fanboying: instead of giving a movie a 10 when they should have given it an 8, all a user can do is cast a single "fresh" vote.
2) For critics, reviewers sometimes do not give a rating, but it is clear from the review whether they consider the film "fresh" or "rotten." Moreover, the same 2.5/4 rating from two critics might mean different things; one might be mostly positive with a few major gripes, while the other might be mostly negative while readily acknowledging the film's strong points.
So they are here to stay, just weighted less.

Now the weightings are:
IMDb: 30% (3x)
RT: 40%
- RT Critics % aka Tomatometer: 10% (0.1x)
- RT Audience %: 10% (0.1x)
- RT Critics Rating: 15% (1.5x)
- RT Users Rating: 5% (1x)
MC: 20%
- MC Critics aka Metascore: 15% (0.15x)
- MC Users: 5% (0.5x)
MRQE: 10% (0.1x)
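Under these weightings, a film's score is the weighted sum of its component ratings, with each multiplier converting the component's native scale (out of 5, 10, or 100) to its share of 100 points. A sketch with hypothetical component values (note that the MC user score is out of 10, so its 5% weight corresponds to a x0.5 multiplier):

```python
def version_b_score(imdb, tomatometer, rt_audience_pct, rt_critics_rating,
                    rt_users_rating, metascore, mc_users, mrqe):
    """Version B weighted sum; each multiplier maps the component's
    native scale (out of 5, 10, or 100) onto its share of 100 points."""
    return (3.0  * imdb               # IMDb (out of 10)        -> 30%
          + 0.1  * tomatometer        # RT critics %            -> 10%
          + 0.1  * rt_audience_pct    # RT audience %           -> 10%
          + 1.5  * rt_critics_rating  # RT critics avg (of 10)  -> 15%
          + 1.0  * rt_users_rating    # RT users avg (of 5)     ->  5%
          + 0.15 * metascore          # Metascore (of 100)      -> 15%
          + 0.5  * mc_users           # MC user score (of 10)   ->  5%
          + 0.1  * mrqe)              # MRQE (of 100)           -> 10%

# Hypothetical component values, for illustration only:
sample = version_b_score(9.2, 100, 98, 9.1, 4.4, 100, 8.6, 95)
```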
 
a list of 75 titles
This list ranks the 75 animated films with the highest scores according to the methodology described in http://www.imdb.com/list/Loo6LIkilf8 (Version A), with the following differences:

- Besides the nine sources on Version A, I used essentially all the lists I could find in the first 30 pages of Google results.
- To allow as many sources as possible, certain exclusions were permitted, in particular exclusion by time: even a 2003 list (namely OCFS) was allowed. Categorical exclusions based on nationality or studio were still prohibited, with the sole exception of AFI, due to its importance.
- Because reliability varied greatly among sources, I gave more weight to more reliable ones, such as official content on movie websites, than to blogs and the like. OCFS and AFI were also downweighted despite their strong reputations, because they exclude films for recentness or foreignness, respectively.
- Because there were more sources used, I was able to expand the list to 75 films.
 
a list of 50 titles
This list ranks the 50 animated films with the highest scores according to the methodology listed below.

Methodology:
I searched on Google for "best animated films" (without quotes). I looked at all the lists I could find in the first 30 pages of results and selected the ones that were

(i) reliable, i.e. from a well-known entertainment website, film critic society, etc.
(ii) inclusive, i.e. not restricted to just American, Disney, etc. films
(iii) recent, i.e. from 2008 or later (this goes along with being inclusive)

This ensures that films from every country and every year up to WALL-E (2008) are represented fairly in this compilation. Films released after 2008 are also included, but they may deserve a higher ranking.

The actual sources used were:
IGN
TimeOut
IMDb
TIME
TotalFilm
Chlotrudis
Rottentomatoes
Film4
Worlds Best Films

I then assigned a score to each rank. For example, in a list of 50 films, 1st place might get 120 points, 2nd place = 118, ..., 50th place = 22, and everything not listed = 0. In general, the more films that were on a list, the more it was weighted, in order to maintain a constant step of 2 points between places. Then I added everything up.
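The rank-to-points rule above can be sketched in Python; the floor of 22 points for last place and the step of 2 come from the worked example, and the function name is mine:

```python
def points(rank, list_length, floor=22, step=2):
    """Points awarded for holding `rank` on a list of `list_length` films.
    Last place gets `floor`; each place higher adds `step`, so longer
    lists carry more total weight. Unranked films score 0."""
    if rank is None:  # film not on this source's list
        return 0
    return floor + step * (list_length - rank)

# A film ranked 1st on a 50-item list, 7th on a 25-item list,
# and absent from a 10-item list:
total = sum(points(r, n) for r, n in [(1, 50), (7, 25), (None, 10)])
```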

See http://www.imdb.com/list/jGULmr_PHPM (Version B) for a list with more sources and more films.
 
a list of 9 titles
 
a list of 10 titles
Since this list is personal, the classification of something as "epic" is also personal.

Also, I am ranking them with an emphasis on the epic part, so this ranking is not necessarily how they would appear in my list of all films. How "great" or "epic" I consider the film will be as important as how much I liked it. If all you wanted was a list of my favorite films that happen to be epics, just look through my ratings and pick out the ones you consider epics.
 
a list of 100 titles
This list ranks the 100 films with the highest scores according to the methodology listed below.

Methodology in layman's terms:
I took an average of ratings from IMDb, Rottentomatoes, and Metacritic. I have designed it so that critics' and users' voices are heard equally. Everything after this point will be slightly technical stuff.

Methodology in more detail:
For selecting movies whose scores are to be computed, I have used the IMDb top 250, the top 100 out of the Empire 500, the AFI 100 (10th anniversary edition), and the Empire 100 of world cinema. Moreover, a film must be classified as a Feature Film to qualify. (I would say Documentary as well, but there happen to be none in the top 100, so this point is moot.) In particular, The Decalogue (with a score of 94.71) and Heimat (with a score of 90.73) have been excluded even though they are on the Empire world cinema list. They would have been #2 and #18, respectively.

For computing scores, data from IMDb, RottenTomatoes, and MetaCritic have been used. The weights are selected to reflect a fair balance between critics' and users' ratings, as well as across websites. Another goal is to give more weight to more reliable sources of information. The weightings (and multipliers, since the raw score may be out of 5, 10, or 100) are as follows:

IMDb: 40% (4x)
RT: 45%
- RT Critics % aka Tomatometer: 20% (0.2x)
- RT Top Critics %: 5% (0.05x)
- RT Audience %: 10% (0.1x)
- RT Critics Rating: 5% (0.5x)
- RT Audience Rating: 5% (1x)
MC: 15%
- MC Critics aka Metascore: 10% (0.1x)
- MC User Score: 5% (0.5x)

To use The Godfather as an example, we have 4(9.2) + 0.2(100) + 0.05(100) + 0.1(97) + 0.5(9.1) + 1(4.4) + 0.1(100) + 0.5(8.6) = 94.75.
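The same computation in Python, reproducing The Godfather example with the component values quoted above (since the MC user score is out of 10, its 5% weight corresponds to a x0.5 multiplier):

```python
def version_a_score(imdb, tomatometer, top_critics_pct, rt_audience_pct,
                    rt_critics_rating, rt_audience_rating, metascore, mc_users):
    """Version A weighted sum (40% IMDb, 45% RT, 15% MC)."""
    return (4.0  * imdb                 # IMDb (out of 10)        -> 40%
          + 0.2  * tomatometer          # RT critics %            -> 20%
          + 0.05 * top_critics_pct      # RT top critics %        ->  5%
          + 0.1  * rt_audience_pct      # RT audience %           -> 10%
          + 0.5  * rt_critics_rating    # RT critics avg (of 10)  ->  5%
          + 1.0  * rt_audience_rating   # RT audience avg (of 5)  ->  5%
          + 0.1  * metascore            # Metascore (of 100)      -> 10%
          + 0.5  * mc_users)            # MC user score (of 10)   ->  5%

# Matches the 94.75 from the worked example:
godfather = version_a_score(9.2, 100, 100, 97, 9.1, 4.4, 100, 8.6)
```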

However, certain scores have a larger relative standard deviation than others, so effectively the weights are 25%, 20%, 10%, 10%, 5%, 5%, 20%, 5%, respectively. (This is a rough estimate based on raw standard deviation times weight, then adding all of those up and computing percentages. It is not especially accurate because of its failure to account for correlation, but I just wanted to make it simple.) This gives 25% to IMDb and MC apiece and 50% to RT. The reason RT gets more is because it has 5 different ratings that each measure different aspects. Also, this weighting means that critics get 55% of the voting power while users get 45%, which is approximately 50-50.
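The effective-weight estimate described above (nominal weight times standard deviation, renormalized) can be sketched as follows; the standard deviations in the usage line are placeholders, not measured values:

```python
def effective_weights(weights, stds):
    """Rough effective influence of each component: nominal weight times
    that component's standard deviation, renormalized to sum to 1.
    Ignores correlation between components, as noted in the text."""
    raw = [w * s for w, s in zip(weights, stds)]
    total = sum(raw)
    return [r / total for r in raw]

# Two components with equal nominal weight; the noisier one dominates:
pair = effective_weights([0.5, 0.5], [2.0, 6.0])  # -> [0.25, 0.75]
```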

Occasionally for old or obscure films, some ratings (especially Metacritic) tend to be unavailable. In that case, those components are simply removed from the average and everything else is weighted up accordingly. For example, for Citizen Kane, we have (4(8.6) + 0.2(100) + 0.05(100) + 0.1(91) + 0.5(9.4) + 1(4.1)) / 0.85 = 90.94, where the 0.85 comes from the fact that 15% of the points (i.e. Metacritic) are missing. Now, different ratings have different distributions, so this isn't entirely correct, but I'm too lazy to care, and besides, it doesn't seem to mess up my list too much.
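The missing-component adjustment can be written as one function: each component carries its nominal weight share, and the divisor is the share of weight actually present. The triples below use the Citizen Kane numbers from the text, with Metacritic unavailable:

```python
def score_with_missing(components):
    """components: (multiplier, raw_value_or_None, weight_share) triples.
    Missing ratings are dropped and the rest are scaled up so the
    available weight shares still act as 100%."""
    present = [(m, v, w) for m, v, w in components if v is not None]
    total = sum(m * v for m, v, _ in present)
    available = sum(w for _, _, w in present)
    return total / available

citizen_kane = score_with_missing([
    (4.0,  8.6,  0.40),   # IMDb
    (0.2,  100,  0.20),   # RT critics %
    (0.05, 100,  0.05),   # RT top critics %
    (0.1,  91,   0.10),   # RT audience %
    (0.5,  9.4,  0.05),   # RT critics rating
    (1.0,  4.1,  0.05),   # RT audience rating
    (0.1,  None, 0.10),   # Metascore (unavailable)
    (0.5,  None, 0.05),   # MC user score (unavailable)
])  # -> roughly 90.94, as in the text
```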

An interesting thing to note is that when comparing two ratings in the same category (critics or users), you get about 60% positive correlation. When comparing across categories, you get only about 10%. Of course, this is very rough and not statistically valid because I only sampled the lists of top films of all time, so clumping at the top is expected. The true correlation for all films ever made should be much, much higher.

Also, using the IMDb top 250 is definitely going to cause some bias (to the correlation figure) since the IMDb rating is being used both to select candidates and to evaluate them, but it is the only systematic way to get in new films like Toy Story 3 or A Separation. However, note that it does not actually bias the final scores; it only affects which movies are selected to be included in my data set. (It is my hope that I computed scores for all movies that should be in the top 100.)

Fortunately, "junk" (at least according to everyone but IMDb) like Shutter Island gets weeded out by poor RT/MC scores. This goes to show you the importance of including both critics' and users' ratings. In this list my aim is to have films that are highly acclaimed by both critics and audiences alike.