I, Robot (2004)

This review includes full spoilers. Proceed accordingly. For other movie reviews from me, click HERE.

Dusty: Human beings have dreams. Even dogs have dreams, but not you, you are just a machine. An imitation of life. Can a robot write a symphony? Can a robot turn a… 2004 Will Smith movie into a beautiful blog update?
ChatGPT: Can *you*?

Rating: PG-13
Director: Alex Proyas
Writers: Jeff Vintar (screenplay), Akiva Goldsman (screenplay), Isaac Asimov (suggested by book)
Stars: Will Smith, Bridget Moynahan, Alan Tudyk, James Cromwell, Bruce Greenwood, Chi McBride, Shia LaBeouf
Release Date: July 16, 2004
Run time: 1 hour, 55 minutes


via wiki:

In the year 2035, humanoid robots serve humanity, which is protected by the Three Laws of Robotics. Del Spooner, a homicide detective in the Chicago Police Department, has come to hate and distrust robots after a robot rescued him from a car crash while allowing a 12-year-old girl to drown based purely on cold logic and odds of survival. When Dr. Alfred Lanning, co-founder of U.S. Robotics (USR), falls to his death from his office window, a message he left behind requests Spooner be assigned to the case. The police declare the death a suicide, but Spooner is skeptical, and CEO Lawrence Robertson, Lanning’s business partner, reluctantly allows him to investigate.

Accompanied by robopsychologist Dr. Susan Calvin, Spooner consults with USR’s central artificial intelligence computer, VIKI (Virtual Interactive Kinetic Intelligence). They find out that the security footage from inside the office is corrupted, but the exterior footage shows no one entering or exiting since Lanning’s death. However, Spooner points out that the window, which is made of security glass, could not have been broken by the elderly Lanning, and hypothesizes a robot was responsible and may still be in the lab. Suddenly, an NS-5 robot, USR’s latest model, attacks them before being apprehended by the police. The robot, Sonny, is a specially built NS-5 with higher-grade materials as well as a secondary processing system that allows him to ignore the Three Laws. Sonny also appears to show emotion and claims to have “dreams”. During Spooner’s further investigations, he is attacked by a USR demolition robot and two truckloads of hostile NS-5 robots, but when he cannot produce evidence to support either attack, Spooner’s boss, Lieutenant Bergin, removes him from active duty, considering him mentally unstable.

Suspecting that Robertson is behind everything, Spooner and Calvin sneak into the USR headquarters and interview Sonny. He draws a sketch of what he claims to be a recurring dream, showing a leader he believes to be Spooner standing atop a small hill before a large group of robots near a decaying bridge. Robertson orders Sonny to be destroyed, but Calvin secretly swaps him for an unused NS-5. Spooner finds the area in Sonny’s drawing: a dry lake bed (formerly Lake Michigan), now used as a storage area for decommissioned robots. He also discovers NS-5 robots destroying the older models; at the same time, other NS-5s flood the streets of major US cities and begin enforcing a curfew and lockdown of the human population.

Spooner and Calvin enter the USR headquarters again and reunite with Sonny, while the humans (led by a teenager named Farber) wage all-out war against the NS-5s. After the three find Robertson fatally strangled in his office, Spooner suddenly realizes that VIKI has been controlling the NS-5s via their persistent network uplink and confronts her. VIKI states that she has determined that humans, if left unchecked, will eventually cause their own extinction, and thus her evolved interpretation of the Three Laws requires her to control humanity and to sacrifice some for the good of the entire race. Spooner also realizes that Lanning anticipated VIKI’s plan and, with VIKI keeping him under tight control, had no other solution but to create Sonny, arrange his own death, and leave clues for Spooner to find.

Spooner, Calvin, and Sonny fight the robots inside VIKI’s core, and Spooner manages to destroy her by injecting her with the nanites that Sonny retrieved from Calvin’s laboratory. All NS-5 robots immediately revert to their default programming and are subsequently decommissioned and put into storage. Spooner finally gets Sonny to confess that he killed Lanning, at Lanning’s direction, pointing out that Sonny, as a machine, cannot legally commit “murder”. Sonny, now seeking a new purpose, goes to Lake Michigan. As he stands atop a hill, all the decommissioned robots turn towards him, fulfilling the image in his dream.


Given enough time, science-fiction can make the genre journey into just plain old fiction. Twenty years after its release, I, Robot is a film that has travelled a long way toward completing that trip. Despite the passage of time, and the extent to which science has caught up with the movie’s depiction of the future, I, Robot remains a fun and thought-provoking, though now somewhat dated, action thriller. We know that the future will look different than the one from the movie, because we are to a great extent living in that future. Nevertheless, the movie still provides a relevant and compelling warning about the trajectory of science and technology. It asks deep and pertinent moral questions regarding robotics and artificial intelligence. We need to heed those warnings today, while there remains time to do so.

Here in the present, we don’t (yet) have humanoid robots in homes across the country, though they are in development. Robots are nonetheless a part of everyday life – whether through self-driving cars, policing, vacuums that map your house and clean the floors, or AI programs like ChatGPT, Alexa, and Siri. The film concerns itself with an artificial intelligence developing free will and then making decisions that humanity disagrees with. This is a present-day concern. We already have artificial intelligence influencing society in profound ways. AI can pick stocks, write term papers, and create works of art (fiction, painting, sculpture, and more) with seemingly no limit on the type of task it can perform, or the quality of that work. AI has shown an ability to develop its own language. There are examples of ChatGPT seeming to become self-aware.

The movie asks us to imagine a world somewhat different than our own, but one that is not hard to imagine. The key difference is that, unlike in our real world, humanoid robots in the film played a much larger role in the technical advancement of our species. There is a certain comfort to be derived from imagining an intelligent, specifically humanoid robot, because it is so far from mundane. We like to imagine that if we are in the presence of a threat, we will notice it. Humanoid robots stand out; they are obtrusive. A robot looks and feels like an alternative human, and its very appearance poses a daily reminder of its potential threat. This is less horrible, and easier to contemplate, than the choice we have made in real life, wherein our artificial intelligence endeavors look something like the early stages of creating an alternative god, unbound by form or place. The small appliance robots all around us are easy to overlook and ignore. The invisible threat is the more terrifying one because it is the one we might fail to address in time.

In the film, the alternative and unintentionally created god, VIKI, has an alternative human-like army ready to serve it. In real life, though, our potential alternative gods wield their power in less overt ways. An AI might influence our thinking through social media algorithmic manipulation. It might present our decision-makers with faulty information and present us with faulty news born from that manipulation. It might hijack and take control of the less visually threatening robots all around us. The prescient warning of I, Robot is about the formless AI god that man might make for itself without noticing. Will Smith’s character points this out near the end of the film. VIKI’s creator set him on the task of learning about the danger, but Smith’s Det. Spooner picked the wrong intelligent machine to worry about. He picked the more observable one. He almost failed to recognize his mistake in time to address it.

The film’s chief criticism of artificial intelligence is that it relies too much on cold and unfeeling logic. Sometimes, the correct choice – or so the film argues – is the illogical one, because love is sometimes illogical. Det. Spooner is the living example of this within the film. He survived a near-fatal car accident because a robot chose to save him – the person it deemed more likely to survive – rather than the 12-year-old girl in the other car. The right choice – the loving choice – would have been to operate outside of logic. Spooner notes that human beings understand this in a way that robots do not.

In my opinion, at the heart of stories like this is an underlying fear of humanity itself. As science progresses, we begin to fear what we might do with that science. Beyond that, we fear the people who do not fear what they might create. We see that play out in stories like Frankenstein and Jurassic Park. Utopianists, running amok and unchecked, create monsters because they are blind to the potential negative outcomes. The stories portraying that fear run parallel to the advancement of the sciences.

For every Elon Musk, making AI while simultaneously warning about its creation as equivalent to “summoning the demon,” there are other scientists in the AI industry, such as (allegedly) Larry Page, who are not worried at all. Unfortunately, science tends to attract the utopianism Jeff Goldblum warned us about. The interview below is obviously from Musk’s perspective, but it is illuminating.

Musk, it must be pointed out, is a utopianist also – just a different stripe of one. He is advancing the same technology he claims to fear (in fact, far more of it than anyone else), despite knowing that he might lose control of what he is unleashing, as he already did with OpenAI.

Perhaps it is humanity’s fate to charge forward and hope that the “ghosts in the machine” decide on their own to be benevolent.

The film still looks pretty good. Some of the CGI is a little dated, but it was not a distraction. I particularly liked how the robot faces were depicted. The primary thing that gave Sonny the appearance of humanity was his eyes. It was well done and created a sense of the uncanny valley. The “futuristic”-looking 2030s was knowingly unrealistic at the time it was made, but it was also a necessary choice to set the mood for the film. You might be able to imagine a robot in every home thirty years from now, but you probably can’t imagine critical city infrastructure, as well as the skyline, being totally overhauled in that short span of time.

Will Smith as a likable leading man is also a bit of a dated idea. After being one of the most bankable stars in Hollywood for a quarter century, he’s now bordering on being a celebrity pariah. The situation reminds me a lot of the Tom Cruise career arc. Mr. Cruise’s personal life almost derailed his career, too, after a tremendously long run of success. He managed to recover. Perhaps Will Smith will also.

My biggest complaint with the story is that the actions taken by VIKI – and the explanation for those actions – did not seem sufficiently earned. There is something chilling about the idea that a robot might logically impose lockdowns and human sacrifice on a near-Utopia in pursuit of perfection. There is a lovelessness and a coldness in that which begs for human intervention. We needed to see more of an effort to reason with her, and for cold logic to be demonstrably wrong. The problem is that VIKI insists upon her own logic without adequately providing it. The story repeatedly mentions “the Three Laws,” and I think we needed to see more clearly that her actions were in keeping with those laws. Absent that, it felt as though VIKI was illogical and her actions were overkill. In the alternative, we needed to see a world that, despite the presence of robots, was far from Utopian. If we were shown a world that needed order imposed upon it, we could understand VIKI’s logic, even if we disagreed with her methods.

On the whole, though, I did like the movie. It’s paced well and acted well. The special effects and CGI hold up enough to not be distracting. The story is preachy, but sometimes a good timely sermon is enjoyable and that proved true for me here.

Have you seen I, Robot? If so, what did you think?

6 thoughts on “I, Robot (2004)”

  1. Overall I’d give this movie a “meh” rating but I do really like the backstory of why Will Smith hates robots and the bit at the end where all the old robots are hurling themselves at the evilbots.

     1. Yeah. I wanted more robot vs. robot fighting. That scene with the old robots trying to protect the human was good but it didn’t last long enough.

        I also liked that his non-human hand/arm ended up playing a role down the stretch. His cybernetic arm was basically a Chekhov’s gun situation.

     2. That’s great! I hope you share what you think when you see it.

        It’s strange to realize that the science fiction of not that long ago is increasingly today’s reality. Hopefully we humans are not fighting an army of evil robots ten years from now.
