I, Robot - Isaac Asimov

I was this many years old when I realized I, Robot was not actually a novel and doesn't actually have a coherent plot with a car chase. That being said, I, really enjoyed I, Robot, despite thinking Asimov's techno-optimism is pretty off base. The collection is stitched together by the conceit that Dr. Susan Calvin is recounting these events in the history of robotics and so even the ones where she was not personally present, like in all of the programmer bros stories, she's telling the story based off of their reports. Apparently this is called a 'fixup' collection, where the short stories are later collected in a compendium and given a through-line via the addition of introductory material and retconning. Also apparently the title I, Robot is stolen from another, earlier sci-fi work but the publisher made him change it from Mind and Iron, which honestly kind of slaps.

Robbie (1940, revised 1950)

A little girl gets too attached to her robot nanny/friend, and her parents, who have the most toxic 50s relationship ever (yes, yes, fuck off with your dates, you know what I mean), alternately give her the robot, take it away, and finally re-give it to her. Introduces the term 'Frankenstein Syndrome' to mock the 'irrational' fear of the girl's mother that she's getting too attached to the robot, and the mother definitely has a point despite her dickish husband: it's fucked up to have an object that can emotionally manipulate a child *cough*good thing we avoided that hellscape*cough*.

Runaround (1942)

First of the programmer bros stories, which I really liked in general. Why are there always 12 Donovans to every Powell? Probably because U.S. Robotics and Mechanical Men, Inc. pays shit. This is the first indication that there are multitudes contained within the laws of robotics, specifically that each law can be 'tuned'. So the story involves a 'drunk' robot, and the debuggerers have to figure out what's wrong with it. B.B. Rodriguez owes a lot to this story. Anyway, it turns out the issue was a conflict between laws 2 and 3. The robot was told to do something mildly dangerous on the planet surface but not given much urgency about it (weak law 2: obey humans), while it was also tuned for particularly strong self-preservation (strong law 3) because it was a newer, more expensive model. So as the robot approached the danger, it fell into a kind of equilibrium rut: law 2 pulled it weakly toward the danger, law 3 pushed it strongly away, and at a certain distance from the danger the two happened to balance out, leaving it stuck circling there. The solution, of course, was for Powell to risk his life to jolt the robot out of the rut (strong law 1: don't harm a human or let a human come to harm).
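If it helps to picture the rut, here's a toy sketch of the idea (the shapes of the "pull" and "push" and all the numbers are made up by me for illustration, not anything from Asimov): treat the weak order as a constant pull toward the selenium pool and the beefed-up Third Law as a push that grows as the robot closes in, and the robot parks wherever the two cancel.

```python
def second_law_pull(order_weight: float) -> float:
    """Constant pull toward the goal; a casually-worded order gives a small weight."""
    return order_weight

def third_law_push(distance: float, self_preservation: float) -> float:
    """Push away from the danger, growing rapidly as the robot closes in."""
    return self_preservation / distance ** 2

def net_drive(distance: float, order_weight: float, self_preservation: float) -> float:
    """Positive drives the robot toward the danger, negative drives it away."""
    return second_law_pull(order_weight) - third_law_push(distance, self_preservation)

# Weak order (0.1) vs. beefed-up self-preservation (10.0): the drives cancel
# at distance 10, so Speedy just circles there instead of finishing the job.
for d in (20.0, 10.0, 5.0):
    print(d, round(net_drive(d, order_weight=0.1, self_preservation=10.0), 3))
# 20.0 0.075   -> still ambling toward the pool
# 10.0 0.0     -> stuck: the equilibrium rut
#  5.0 -0.3    -> shoved back out

# Powell's fix, per the story: manufacture a First Law emergency (a human in
# danger), which outranks both lower laws and snaps the robot out of the loop.
```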

Reason (1941)

"Well I hear she's a brain. You can't reason with brains." - Quinn Morgendorffer

This one is also a banger of a programmer bros story. Basically they get this new robot, QT-1, who gets cute by denying the existence of Earth and basically anything not directly observable on the station where they're working (robots are banned from Earth because of 'Frankenstein Syndrome' or something). Both Roberto and HAL owe a lot to QT-1, the first insane robot. Fortunately the robot kind of independently decides to do what they were training it to do, but does so for 'religious' rather than rational reasons. And, as you do, the programmer bros say 'good enough' and establish some robotic monasteries so that QT-1 can train acolytes in the art of whatever the hell they were out there to do. Surely nothing sinister can come of this (no, but really, it doesn't? Which is some techno-optimism if I've ever seen it).

Catch That Rabbit (1944)

Another programmer bros tale, but this one wasn't particularly interesting to me. Basically, in the course of testing an 'overseer' type robot, the programmer bros notice some weird behavior with the drones it's supposed to be controlling. It turns out it was getting overwhelmed and was "twiddling its fingers", which is a metaphor they used to describe the overseer-drone relationship earlier in the story. Meh.

Liar! (1941)

This one was pretty ok. The U.S. Robotics and Mechanical Men corporation, in its infinite wisdom, accidentally creates a telepathic robot, but it turns out that, because it interprets law 1 as also covering "don't hurt people's feelings", the robot lies to everyone it talks to and tells them whatever they want to hear, apparently unaware of the social problems doing so would cause...

Little Lost Robot (1947)

U.S. Robotics and Mechanical Men decides to make a robot that's missing part of law 1, the "cannot allow a human to come to harm" rule. The reason: the robots kept trying to 'save' humans from a particular type of radiation that humans can actually endure for a while with minimal harmful effects but which instantly destroys a robot, and since law 1 beats law 3, the suits got sick of wasting robots and ordered a new model. The new model also had a personality quirk that manifested as a kind of superiority complex, so when a human worker tells one of the new robots to "get lost" in frustration, the robot hides among a bunch of other robots with all three laws intact. To keep the rogue bot from getting into general circulation, they have to devise a test to figure out which robot has the modified laws. Ultimately they use a hack: the other robots had not yet been trained about the radiation, so Dr. Calvin exposes herself to it, and the only one to respond is the Nestor 10, since it was trying to imitate the other robots but didn't realize they didn't know the radiation was harmful to humans. I feel like maybe they just should have told the robots the radiation was only harmful after x time... seems like an easy solution.

Escape! (1945)

This one was weird. So basically USR&MM gets contracted to put the hyperspace plans into their positronic 'brain' computer (same tech as their robot brains, but more powerful and not mobile, as was the style at the time), after the same thing had been tried on a rival's presumably normal electric computer and it blew the fuck up. For some reason, Susan Calvin tells the computer that "potential deaths" don't bother humans, which apparently allowed the jump to hyperspace, which involves the crew ceasing to exist temporarily. Because this was a technical violation of law 1, the brain designed the ship with only beans and milk for food and no showers or beds or manual controls. It's almost an accidental metaphor for the current moment...

Evidence (1946)

This one was legitimately good. It's about a political campaign in which one candidate, Stephen Byerley, is accused of being a robot. Not many people believe the accusation, but if true it would end his campaign, since robots cannot hold office. Conveniently, Byerley is also a staunch privacy advocate, so he refuses all attempts to 'prove' his human-ness, wears an x-ray-blocking field, etc. The only way for him to prove he's human, essentially, is to break one of the laws of robotics. All the while he's gaining fame from the controversy. Finally he punches out a heckler at a campaign event, but Doc Calvin thinks the heckler was also a robot, which wouldn't violate law 1. She's also not particularly concerned about this, because she thinks the robots should be in charge, for some reason.

The Evitable Conflict (1950)

A continuation of Evidence: Byerley has been elected to a second term as world coordinator. Earth is effectively managed in four regions by economic machines, but there have recently been errors. Byerley believes this is due to the efforts of some humans who want to return control of the economy to humanity, but Susan decides that it's actually the machines acting to cloak their own control of humanity? Or something. They now believe the 1st law (apparently later the zeroth law) is really that robots cannot let humanity come to harm, and therefore they must guide the course of humans through wild cloak-and-dagger shit, because humans would not allow robots to be overtly in control. Sue says this is fine because humans have never been in control anyway, because hurricanes exist or something.
