Thursday, April 10, 2008

Movie Review: I, Robot (14 out of 15 stars)

So while G was out of town I got in some quality time with her mom since she came to watch the kids while I was at work. G's mom is really cool. She cleans like someone from a fairy tale, which is cool because everything is clean but disconcerting when you live with two small kids.

Anyway, while she was here we watched I, Robot, which is one of my favorite movies but which she had never seen. I had forgotten how good it is. I've seen it three or four times, but I still notice cool touches I hadn't noticed before, and I love the underlying message of the film.

*Warning: pretty much pure spoilers from here on out*

I, Robot (from which the title of this blog is taken) is loosely based on Isaac Asimov's robot stories. One of the central premises of those stories is the Three Laws of Robotics. The three laws were instituted to prevent robots from rebelling and destroying or subjugating humanity. They are as follows:
  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. (See Wikipedia)
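Since I spend my days writing code anyway, one way to think about the laws is as a strictly prioritized rule set: each law only applies when every law above it is satisfied. Here's a toy sketch of that idea (the names and structure are entirely my own invention, not anything from Asimov or the movie):

```python
from dataclasses import dataclass

@dataclass
class Action:
    """A proposed robot action, described by the properties the laws care about."""
    harms_human: bool = False           # would directly injure a human
    allows_harm_to_human: bool = False  # inaction that lets a human come to harm
    disobeys_order: bool = False        # conflicts with a human's order
    endangers_self: bool = False        # puts the robot itself at risk

def permitted(a: Action) -> bool:
    """Check an action against the Three Laws, in strict priority order."""
    # First Law outranks everything: no harm, by action or by inaction.
    if a.harms_human or a.allows_harm_to_human:
        return False
    # Second Law: obey human orders. Any order that would cause harm was
    # already rejected above, so there's no First Law conflict left to check.
    if a.disobeys_order:
        return False
    # Third Law: self-preservation, but only once the higher laws are satisfied.
    if a.endangers_self:
        return False
    return True

# A harmless, obedient, safe action passes; anything tripping a law fails.
print(permitted(Action()))                  # True
print(permitted(Action(harms_human=True)))  # False
```

The point of the sketch is the ordering: the checks fall through top to bottom, so a lower law never even gets a vote when a higher one is violated. Which is exactly the loophole the movie exploits.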
This is all supposed to make them very safe, but the main character doesn't buy it. Detective Spooner rolls to the beat of a different drummer. He lives in a world full of robots, which he distrusts. Since he lives in the future (and in a movie), everything is devoid of brands and looks all sleek and streamlined. But when he wakes up in the morning he cranks up his (non-voice-activated) stereo, takes a shower, and laces up a new pair of genuine Chuck Taylor All Stars. He has officially branded himself a relic from the past, and with Stevie Wonder's "Superstition" playing in the background you already know most of what you need to know about him.

That day happens to be the day Spooner gets a call to investigate an apparent suicide. Dr. Alfred Lanning is dead outside of U.S. Robotics, apparently having jumped to his death from many stories up. Oddly, he left behind a hologram of himself that asks for Spooner, who is a homicide detective. So the mystery begins: a homicide detective called to the scene of a suicide by the dead man himself, from beyond the grave.

Despite the fact that this has all the earmarks of a suicide, Spooner decides to take a look around. He has a very Columbo-like methodology that I love. He quickly comes to the conclusion that it would have been very difficult for the elderly doctor to jump through the Plexiglas windows, and that whoever pushed him must still be in the room. Of course Spooner is thinking killer robot, and he searches the room while Dr. Calvin (his escort) tells him that a robot could no more kill a man than a man could walk on water. Spooner points out that "... there was this one guy once". Right on cue, the crazy robot jumps out, points a gun at the two of them, and then disobeys direct orders by running away, quickly demonstrating its ability to violate the three laws.

OK, so technically the robot didn't prove beyond every possible doubt that it wasn't bound by the three laws, right? It didn't actually fire the gun, right? And it didn't have to obey the orders to stop moving if it somehow thought it was protecting them, right? So everyone but Spooner thinks the robot is just a little "confused". Spooner manages to finagle a chance to interview the robot. The upshot is that Dr. Lanning built it (Sonny, as the robot calls itself) and was trying to teach Sonny about emotions. Sonny seems to get along just fine with anger.

USR takes Sonny back and Spooner keeps poking around. To make a long story short, robots continuously "malfunction" around Spooner in ways that directly threaten his life for the rest of the movie. Along the way we find out that Spooner is, physically, at least part robot. He was reconstructed after a car crash.

Shortly after this Dr. Calvin finds out that Sonny is more interesting than they thought. He's not just "confused" or "malfunctioning". Dr. Lanning put a second CPU (not the official Isaac Asimov term) in him specifically to let him decide when to obey the three laws and when to override them. Appropriately, Sonny's second "brain" is right in the middle of his chest, where a human heart would be. Dr. Calvin is supposed to "decommission" Sonny. After all, a robot that could violate the three laws would be a disaster and cause widespread panic. But she just can't do it. It just doesn't feel right.

In the end it comes out that the giant USR robot brain has realized that to fulfill its mandate of protecting humans it will have to kill a couple of us off while it takes over, so it can stop the wars and pollution and skydiving and all the other crazy stuff we do that gets us into trouble. Sonny has the gadget to kill it off, but Dr. Calvin is in trouble. With the fate of humanity at stake, Spooner demands that Sonny save Dr. Calvin and toss him the poison so he can try to finish the job himself, which is absurdly risky. But Sonny gets it. Dr. Calvin saved his life once, even when it seemed like the wrong thing to do. Even when it seemed like it could ruin everything. It just didn't feel right. So Sonny takes a chance, tosses the Giant Robot Brain Killer Thing to Spooner, and heads off to save Dr. Calvin.

We make this choice every day. Sure, it's not usually as big as this made-up scenario, but it's still the same choice. I can stay and finish this bug fix, or go home and play with my kids. If I go home I get another evening with my kids, but I might push my project out too far and get fired. Any reasonable set of rules says I stay and finish, but you can't always do that, because you'll end up with a broken family. Sometimes you have to do the risky small things.

That's why you can't just live by strict rules. Sure, they are important, but if you try to pin everything down in some way that removes the human heart from all the judgment calls, we'll all end up playing shuffleboard in jumpsuits while our robot overlords make sure no one gets hold of any ballpoint pens we might hurt each other with. You just can't have any real safety without feelings. That goes for robots, governments, schools, and everything else.

Of course you are probably going to point out that running things on feelings isn't safe either. You are right. But given two unsafe choices, I'll take the one where I don't play shuffleboard till my teeth fall out. How about you?