"Three Laws"-Compliant: Difference between revisions

Before around 1940, almost every [[Speculative Fiction]] story involving robots followed the Frankenstein model, i.e., [[Crush! Kill! Destroy!]]. Fed up with this, a young [[Isaac Asimov]] decided to write stories about ''sympathetic'' robots, with [[Morality Chip|programmed safeguards]] that prevented them from going on Robot Rampages. A conversation with Editor of Editors [[John W. Campbell]] helped him boil those safeguards down into '''The Three Laws of Robotics:'''
{{quote| 1. [[Thou Shalt Not Kill|A robot may not injure a human being or, through inaction, allow a human being to come to harm.]]<br />
2. [[Robot Maid|A robot must obey orders given to it by human beings]], except where such orders would conflict with the First Law.<br />
3. [[I Cannot Self-Terminate|A robot must protect its own existence]], as long as such protection does not conflict with the First or Second Laws. }}
== Comics ==
* It's implied in the [[Judge Dredd]] story ''Mechanismo'' that robots can't harm humans. A group of criminals holding people hostage start panicking when a Robo-Judge approaches them, only for one of them to point out that [[Tempting Fate|"Robots ain't allowed to hurt people"]].
{{quote| '''Robot Judge:''' In that case, what's about to happen will come as something of a shock to you. ''(Blasts said criminal in the face with a rocket launcher)''}}
* In ''[[ABC Warriors]]'', many robots venerate Asimov, and the more moral ones live by the Three Laws. This is not absolute, however; Steelhorn, for example, obeys a version that essentially replaces ''human'' with ''Mars'', and members of the [[Robot Religion|Church of Judas]] explicitly reject the first two laws. Rejecting them conflicts with their programming, though, producing profound feelings of guilt, which they erase by praying to Judas Iscariot.
* In ''[[All Fall Down]]'', AIQ Squared, the A.I. model of its inventor, is designed to be this. {{spoiler|It finds a loophole: Sophie Mitchell is no longer human.}}
== Literature ==
* In Edward Lerner's story "What a Piece of Work is Man", a programmer tells the AI he's creating to consider itself bound by the Three Laws. Shortly thereafter, the AI commits suicide due to conflicting imperatives.
* [[Alastair Reynolds]]'s ''Century Rain'' features the following passage:
{{quote| ''She snapped her attention back to the snake. "Are you Asimov compliant?"''<br />
''"No," the robot said, with a sting of indignation.''<br />
''"Thank God, because you may actually have to hurt some people."'' }}
* In the novel ''Captain French, or the Quest for Paradise'' by Mikhail Akhmanov and Christopher Nicholas Gilmore, the titular hero muses on how people used to believe that robots could not harm humans because of some silly laws, while his own robots will do anything he orders them to do, including maiming and killing.
== Live-Action TV ==
* In ''[[The Middleman]]'', the titular character invokes the First Law on Ida, his robot secretary. {{spoiler|Nanobots were messing with her programming.}} She responds with [[Getting Crap Past the Radar|"Kiss my Asimov."]]
* [[Conversed Trope]] in ''[[The Big Bang Theory]]'', when Sheldon is asked "If you were a robot and didn't know it, would you like to know?":
{{quote| '''Sheldon:''' Uh, let me ask you this: when I learn that I'm a robot, would I be bound by Asimov's Three Laws of Robotics?<br />
'''Koothrappali:''' You might be bound by them right now.<br />
'''Wolowitz:''' That's true. Have you ever harmed a human being, or, through inaction, allowed a human being to come to harm?<br />
'''Sheldon:''' Of course not.<br />
'''Koothrappali:''' Have you ever harmed yourself or allowed yourself to be harmed except in cases where a human being would've been endangered?<br />
'''Sheldon:''' Well, no.<br />
'''Wolowitz:''' I smell robot. }}
* Inverted/parodied in ''[[Tensou Sentai Goseiger]]'', where the [[Killer Robot|Killer Robots]] of the mechanical Matrintis Empire follow the Three Laws of Mat-Roids.<ref>Matroid = Matrintis android. No relation to [[Metroid]].</ref>
== Other ==
* At a 1985 convention, [[David Langford]] gave [http://www.ansible.co.uk/writing/crystal.html a guest of honour speech] in which he detailed what he suspected the Three Laws would actually be:
{{quote| 1. A robot will not harm authorised Government personnel but will terminate intruders with extreme prejudice.<br />
2. A robot will obey the orders of authorised personnel except where such orders conflict with the Third Law.<br />
3. A robot will guard its own existence with lethal antipersonnel weaponry, because a robot is bloody expensive. }}