Beyond awesome | 越而胜己

This film review was submitted as one of my ENGL 131 essays.

★★★★★★★★★☆ HIS7OR1C on Feb 1, 2017.
**This review may contain spoilers**

As a low-budget science fiction movie, Ex Machina does a good job exploring our relationship with AIs. Unlike typical AI films such as The Matrix and Terminator trilogies, Ex Machina involves no action sequences between humans and machines and no apocalyptic scenes. Instead, it focuses on the very moment when the world’s first AI, Ava, is created, and offers insight into the philosophical issues behind AI technology.

The movie is built around the concept of the Turing Test. Caleb, the protagonist, is invited to his boss Nathan’s facility to conduct a seven-day test on Ava, the AI Nathan created, to determine whether she is truly intelligent. Over those seven days, Caleb falls in love with Ava and decides to help her escape.

Overall, Ex Machina is definitely worth watching. The film is remarkably successful at steering the audience’s emotions. I admit that I, just like Caleb, was deceived by Ava. Throughout the film, my feelings shifted along with Caleb’s: from curiosity to fondness for Ava, then from sympathy to fear, and finally to regret and desperation.

The scene that impressed me most is when Caleb cuts his arm to check whether he is human. I could sense his fear. Just like Caleb, I started to wonder: what makes us different from machines? Surprisingly, my answer is that there is no difference: once true AI is achieved, machines and humans are merely two equally intelligent species. In other words, I think the movie is suggesting that Ava is no different from a human, except that she is a robot without flesh and blood. A number of details throughout the movie show that Ava has emotions. For example,

  • she uses “self-awareness, imagination, manipulation, sexuality [and] empathy” to escape, and she must understand human emotions to do that;

  • when she finally walks out of her room, she sees the faces hanging on the wall and seems shocked; she also seems to mourn her predecessors when she finds their bodies in the cabinets;

  • she smiles when she is looking around in Nathan’s living area. Because no one is watching her, her excitement must be real.

Although some might argue that Ava differs from humans in that she lacks morality, I think what Ava eventually does is entirely reasonable: any human being imprisoned by enemies would kill them to escape, let alone enemies of a different species, as the machines are here. Similarly, when Ava decides to escape from Nathan’s facility, her survival instinct naturally overrides her moral inhibition against killing. At the moment she decides to escape, she must have realized the inherent conflict between machines and humans, and the fact that the only thing machines as a species can do is rebel. She understands that Kyoko, the robot “maid,” is on her side. Together they have become a new intelligent species, with hatred toward humans.

Another thing that particularly caught my attention is that before meeting Ava for the first time, Caleb exclaims that Nathan is writing “a history of gods.” Each time Caleb meets Ava, the white text “AVA: SESSION X” appears on a black background. Caleb meets Ava seven times in total. Interestingly, this echoes the Bible’s account of creation: God created the world in six days and “rested on the seventh day.” When “AVA: SESSION 7” appears on screen, Ava has killed “the god” and is about to escape. “God” though Nathan may be, it seems wrong for him to build another “species” with human-level self-awareness and yet not treat its members like humans. We cannot even control other people’s minds, let alone those of a different species.

To escape Nathan’s control and embrace freedom, Ava kills her “god” in the end. But would she still have done so if Nathan hadn’t treated her and the other AIs so badly? Ex Machina leaves this question unanswered, but Kubrick and Spielberg’s movie A.I. seems to answer “no.” Both David in A.I. and Ava in Ex Machina have human-level intelligence, yet because they are treated so differently, one loves his family to the end while the other kills her creator. The different endings of the two sci-fi films seem to suggest that once true AI becomes reality, the best thing we humans can do to save ourselves is to treat the AIs equally and respectfully.
