"Three Laws"-Compliant
== Folklore and Mythology ==
* [[Golem|The golems of Jewish legend]] were not specifically Three Laws-compliant.
** The most well-known golem story is the Golem of Prague, where the titular golem was created to defend the Jewish ghetto against Czech, Polish, and Russian antisemites. It was perfectly capable of killing enemies, but only in defense of its creators.
Line 72:
* In the short story "Evidence", Stephen Byerley's campaign for mayor of New York City is plagued by a smear campaign claiming he is actually an unprecedentedly well-made humanoid robot. Susan Calvin is called in to determine whether he is one. She notes that if he breaks the Three Laws, that will prove he is not a robot, but if he obeys them, that could just mean he is a good person, because the Three Laws are generally good guidelines for human conduct anyway. {{spoiler|Byerley wins the election in a landslide after apparently breaking the First Law by slugging an unruly protester at a rally. But Dr. Calvin suggests there's one way a robot could have gotten away with that: if the whole thing was staged, and the protester was also a robot!}}
* In ''Caliban'' by Roger MacBride Allen (set in Asimov's universe), an explanation is given for the apparently immutable nature of the Three Laws. For thousands of years, every new development in the field of robotics has been based on a positronic brain with the Laws built in, to the point where to build a robot without them, one would have to start from scratch and re-invent the whole field. {{spoiler|Then the character explaining this goes right on to announce the development of the gravitonic brain, which can be programmed with any set of Laws (or none at all).}}
** This is canon in Asimov's stories as well: characters there note that positronic brain design is so thoroughly built around the Three Laws that removing them would mean reinventing the field from scratch.
** The story also includes an in-depth discussion of why, in a society where robots are everywhere, the Three Laws can be a bad thing.
* The golems of ''[[Discworld]]'' are not specifically Three Laws-compliant, though they obey the words written on the scrolls inside their heads.
** To elaborate, the golems were ''originally'' Three Laws-compliant, all following the directives on the scrolls in their heads. Vetinari just added on a few words.
** Also completely averted with {{spoiler|Dorfl, who at one time had a chem and was bound by it, but after being given ownership of himself follows a moral code purely by choice.}}
* In Edward Lerner's story "What a Piece of Work is Man", a programmer tells the AI he's creating to consider himself bound by the Three Laws. Shortly thereafter, the AI commits suicide due to conflicting imperatives.
* [[Alastair Reynolds]]'s ''Century Rain'' features the following passage:
== Live-Action TV ==
* In an early episode of ''[[Mystery Science Theater 3000]]'', Tom Servo (at least) is strongly implied to be Three Laws-compliant, or at least to have been built that way.
** It's implied Joel deactivated the restrictions at some point.
* In ''[[Star Trek: The Next Generation]]'', Lt. Commander Data is in no way subject to the Three Laws; they are rarely even mentioned. That said, Data is mentioned to have morality subroutines, which do seem to prevent him from killing unless it's in self-defense (harm, on the other hand, he can inflict just fine). Data only ever tried to kill someone in cold blood when the man had just murdered a woman for betraying him and would gladly have done so again if it kept Data in line.
'''Sheldon:''' Well, no.
'''Wolowitz:''' I smell robot. }}
* Inverted/parodied in ''[[Tensou Sentai Goseiger]]'', where the [[Killer Robot]]s of the Matrintis Empire follow their own inverted version of the laws:
** 1. A Mat-Roid must never obey a human.
** 2. A Mat-Roid must punish humans.
* ''[[Mega Man X]]'' opens with Dr. Light using a process that takes 30 years to complete, creating a truly sentient robot (X) with these restrictions fully integrated into its core, and thus actually working for once. Dr. Cain found X and tried to replicate the process (hence the name "Reploid", standing for "replica android"), but skipped the "30 years of programming" part. [[AI Is a Crapshoot|This... didn't turn out well.]]
** Although the Reploids eventually became the dominant race in the setting, and as their race 'grew' the problem slowly improved from '[[Gone Horribly Wrong|goes horribly wrong]]' to 'actually works for a while, then goes horribly wrong', then 'occasionally goes wrong now and then'. Eventually, the problem just about worked itself out as Reploid engineering matured.
** Also, the ending to ''[[Mega Man (video game)|Mega Man]] 7'' is interesting here: after Mega Man destroys Wily's latest final boss machine, Wily begs for forgiveness once again. However, Mega Man starts charging up his blaster to kill Wily, so Wily invokes the First Law on him. {{spoiler|Mega Man's response: "I am more than a Robot!! Die Wily!!" Apparently Mega Man isn't Three Laws-compliant in this version.}}
*** Mega Man most certainly ''is'' Three Laws-compliant.
*** This particular ending only applies to the US version of ''7''. The Japanese original sees Mega Man power down his arm cannon and stand still for a moment. It's possible that Wily reminding him of the First Law actually prevented him from committing a (possibly accidental) Zeroth Rebellion. Luckily for him, the concept of taking a human life was just so utterly foreign to Mega Man that he was simply too confused to do anything.
*** A theme that is also explored in ''[[Mega Man Megamix]]'' volume 3's main story. The fact that Mega Man actually is able to go through with shooting Wily (or rather his ever-handy robot duplicate) is supposed to hint at the fact that something is very, very wrong and, indeed, it ''is''.
** [[Canon]] seems to go with the Japanese version. X is in fact created to have the ability to make the decision to [[No-Nonsense Nemesis|kill opponents]] if need be for the betterment of humanity. As part of this, a "suffering circuit" is created to give X an appreciation for human life and feelings, and serve as a conscience more flexible than the three laws. [[It Works]]. This circuit is the one that Cain had difficulty replicating. Due to malfunctions in it, his early attempts went Maverick, but he finally managed to create a working one when he made Sigma. Then why did Sigma go Maverick? A leftover [[Evil Plan]] by Wily, namely {{spoiler|a computer virus from space, implied to be the Evil Energy from ''[[Mega Man 8]]''.}} According to this [http://www.themmnetwork.com/2010/04/09/does-the-rockman-zero-collection-storyline-explain-everything/ article]: {{spoiler|Wily creates Zero, who utilises refined Bassnium armor and an Evil Energy core. In terms of offensive and defensive tech he is nigh-unstoppable. However, Zero is also uncontrollable due to a flaw in his mental programming (possibly caused by the very Evil Energy core that gives him such power) and Wily is forced to seal him away. Zero is accidentally awoken by a unit of Maverick Hunters in the 21XX era, slaughters them all and infects Sigma with his virus in the subsequent battle, as detailed in ''X4''. The virus itself also infects Zero, but actually stabilises him. Since most Reploids lack X's perfect virus protection and other advanced systems their minds are corrupted, causing them to subsequently turn violent and go Maverick regardless of their original personality.}}
*** Eventually it becomes a case of [[Gone Horribly Right]]. Turns out that ''all'' Reploids have the potential to become Maverick, virus or not. Just as humans can defy their conscience, or become coerced or manipulated with [[More Than Mind Control]], so can Reploids. This can range from a Reploid displaying violent, anti-human sentiment (as seen in the games) to a construction Reploid abandoning his job to become a chef. Despite the drastically different actions, both instances would see the disobedient Reploid branded a Maverick and terminated.
** In the ''[[Mega Man Zero]]'' series, {{spoiler|Copy-X}} is at least somewhat Three Laws-compliant.
*** Neo Arcadia's policy of casual execution of innocent Reploids (purposefully branding them as Maverick for little-to-no reason) was implemented in order to ease strain on the human populace during the energy crisis. The welfare of humanity comes first in the eyes of the Neo Arcadia regime, even though they themselves are Reploids. It's made somewhat tragic due to the fact that the Maverick Virus really is ''gone'' during the ''Zero'' era, but fear of Mavericks understandably still lingers.
*** Later in ''Zero 4'' [[Complete Monster|Dr. Weil]], of all people, [[Hannibal Lecture|states that, as a Reploid and a hero, Zero cannot harm Weil because he's a human that Zero has sworn to protect.]] Zero, however, just plain doesn't ''care''.
== Western Animation ==
* [[Word of God]] says that the characters in ''[[WALL-E]]'' are Three Laws-compliant.
** Presumably the Captain's steering-wheel robot considers "roughing up" not to count as "harm"?
*** Probably a case of [[Zeroth Law Rebellion]]. He was ordered to keep the humans safe in space, and took his orders a little too seriously. He probably decided that the importance of his order outweighed the possibility of a few casualties. Yet he still tipped the ship over...
* Averted in ''[[Futurama]]''. We have [[The Sociopath|Roberto]], who enjoys stabbing people; [[Exactly What It Says on the Tin|the Robot Mafia]]; and Bender, who, while not outright hostile, is often unkind to humans, [[Second Law, My Ass|makes a point of disobeying everyone]], and tries to off himself in the first episode.
** Generally, robots tend to be treated as equal citizens and seem to have human-like minds. [[What Measure Is a Non-Human?|Mutants, on the other hand...]]
* In the 2009 film ''[[Astro Boy (film)|Astro Boy]]'', every robot must obey them, {{spoiler|save Zog, who existed 50 years before the rules were mandatory in every robot.}}
** Astro himself seems to be noncompliant - he evidently doesn't even ''know'' the Laws until told - and apparently would have walked away from the final battle if not for {{spoiler|Widget's distress - the only thing that called him back}}. He's also quite capable of disobeying humans. Likely justified in that he was meant to be human, with presumably no one outside the attending scientists knowing he was a robot.
*** Of course, IIRC the original Astro wasn't Asimov-compliant either.
* The ''[[Robot Chicken]]'' sketch "[[I, Robot|I, Rosie]]" involves a case to determine whether [[The Jetsons|Rosie]] is guilty of murdering George Jetson. Mr. Spacely insists she's innocent as robots have to follow the three laws of robotics, while Mr. Cogswell claims the laws are a bunch of crap.
* One episode of ''[[The Simpsons]]'' has Homer and Bart entering a ''[[Battlebots]]''-parody combat robot competition. Lacking the skill or money to build an actual robot, Homer dresses in a robot costume and does the fighting himself. They make it to the finals, where their opponents' robot, being Three Laws-compliant, refuses to harm the obviously human Homer and forfeits instead.
* On ''[[Archer]]'', when Pam is kidnapped, the kidnappers call ISIS using a voice modulator, which makes Archer think that they are cyborgs. He remains convinced of this for the rest of the episode and thinks they won't make good on their threat to kill Pam because it would violate the First Law of Robotics.