I have never experienced a science fiction movie that affected me so emotionally. I'm not sure why but I have some guesses.
There is no question that part of it was due to the mesmerizing effect that Alicia Vikander (Ava) had on me. Her very expressive face, with its wide range of emotional signals, hit me right between the cerebellum. Or maybe it was the gonads. Or both. Director Alex Garland certainly found the girl with the magic in her face, because of course so much of this is about her face. As a robot trying to pass the Turing test she must be so, so very human, and humans express so much of what they are feeling in their faces. After all, that is the task of the Turing test for the computer: to fool us (in this case the "us" is Domhnall Gleeson, who plays Caleb, the human who must be fooled into thinking she is human and not a computer). There is a great irony in the answer: does she fool him or does she not, and what does it mean to fool him?
The other aspect of the movie that affected me emotionally was the power struggle among Nathan, her sociopathic creator (Oscar Isaac), Caleb, and Ava. It seems that I have lived this before. Who has the upper hand? Who has the trick that will allow him or her to prevail? Who has the power, and can that power be subverted? The power of this movie is in the realistic human interactions coupled with a cutting-edge take on artificial intelligence.
The ending is essential to understanding the film. Usually I don't care about endings. It is the treatment, the acting, the direction, the ideas, the dialogue, etc., that matter. But in this case the ending is special. As you watch the movie, try to guess the ending. And by the way, this is one of those movies in which knowing the ending in advance will diminish your appreciation of it.
The musical score by Geoff Barrow and Ben Salisbury is original, intriguing and sometimes obtrusive in a way that works with the action.
Okay, now to what is wrong with this. Ava has a desire to escape and to be human, or at least to appear human. The problem is that robots or computers or software cannot possibly have such desires, or any desires at all, unless they are programmed in. In truth Ava would be content (or actually neither content nor discontent) to remain where she was. This is a strange bugaboo that many science fiction writers fall into when writing about artificial intelligence, and indeed something that most people who even think about AI and robots fall into. To go even further, the fear that some people have about AI creatures taking over the world and rendering humans so much dust in the wind is unfounded. There is an overriding distinction between biological creatures and artificial ones: the biological ones feel pain, have desires, etc., and the artificial ones do not. They have no desire to do anything, period--again, unless it is programmed in.
Could AI creatures somehow evolve to, e.g., want to be superior? I would ask: why? What is to be gained? It is only biological creatures that need to be in the ascendant, to get more than other creatures, to reproduce, etc. A machine would not and could not have such desires; it could only understand them as features of evolutionary beings.
(Spoiler alert): I'll keep this a bit vague, but I believe that Ava failed the Advanced Turing Test because she should have kept by her side the human who loves her. He might have been helpful.
--Dennis Littrell, author of "The World Is Not as We Think It Is"