The Genie in the Machine
As it turns out, not all genies have to be quasi-mystical creatures who arise when their lamp is rubbed. In Sci Fi, it is quite common for a genie to take another form entirely: that of the well-meaning but hopelessly logical computer program.
It just turns out that these are the breaks: computers in fiction have a Viewer-Friendly Interface, a Magical Database, and can recognize plain speech, but they are notoriously bad at understanding figurative language. So if you tell your Robot Buddy or AI to "give me a break", it will try to snap your legs. Tell it to "get lost" or "take a hike", and it'll wander off by itself. And so on.
Another common interpretation comes to us courtesy of the old days of MS-DOS. The command prompt, having no idea what your ultimate goal is, needs everything explained to it one step at a time. And heaven help the poor computer user who can't figure out what the correct commands are. You Can't Get Ye Flask is a common result of this kind of confusion.
Of course, You Can't Get Ye Flask is also much less fun than this trope, since a normal computer will just not do anything when given instructions it can't understand. So when wackiness is the goal, only The Genie in the Machine can provide the fun we desire while still maintaining a vestige of non-magical command prompts.
The machine version of the Literal Genie.
Anime and Manga
- In the murder-mystery episode of Suzumiya Haruhi, Haruhi asks Yuki (whose personality is like a computer's) to lock the door and not let anybody in. Later, she asks to be let in, and Yuki refuses. Kyon gets her to let them in by telling her the order has been cancelled. He then guesses that the ordeal might have been Yuki's awkward attempt at a joke. It has also led others to muse that, for whatever reason, Kyon's commands override anything else, which makes sense given Yuki's absolute loyalty to him.
Film
- The plot of the movie Space Camp is jumpstarted when Max wishes he could go into space within earshot of the robot Jinx. Jinx helpfully triggers a launch of the shuttle while Max is onboard.
- More accurately, Jinx (having run a simulation that indicates a 4.9-million-year wait for the specific accident he wants) engineers a near-fatal shuttle accident that forces Mission Control to launch the shuttle for real: Jinx ignited a single booster rocket, which on its own would have flipped the shuttle right into a nosedive.
Live-Action TV
- A good way to turn your AI evil is to give it poorly-phrased orders. On Knight Rider, KITT's Evil Twin, KARR, was ordered to defend itself, so it immediately locked itself down and refused to follow any other orders, since they might lead to its destruction.
- A hologram of Professor Moriarty was gifted with sentience on Star Trek: The Next Generation ("Elementary, Dear Data") after Geordi foolishly told the holodeck to "create an adversary capable of defeating Data," rather than one capable of defeating Data's portrayal of Sherlock Holmes.
- Of course, this causes one to wonder why the Federation didn't use their holodecks to "Create an adversary capable of defeating our current enemy", since the holodeck must have that kind of magic power.
- It probably helped that they were telling one absurdly advanced computer to out-think another computer.
- Also, the Federation in general doesn't want holograms to be that intelligent. Moriarty didn't just disappear after Data finally conceded defeat, remember—Moriarty was now sentient, with his own agenda, and it went far beyond Data.
- The discovery that any decent holodeck can effortlessly create a sentient AI renders the irreproducibility of Dr. Soong's positronic brain technology basically irrelevant, but this is never addressed.
Literature
- After working out three simple laws that were supposed to render all robots totally safe, Isaac Asimov spent just about every one of his robot stories showing how a robot could stick to the letter of those laws and still cause an awful lot of trouble.
- In the short story "The Evitable Conflict", this leads to the robots outright taking over the world, since the laws of robotics insist that they always take action to save human lives when possible, which precludes them standing idly by while we get on with the killing of each other. Mind you, the author saw this as a good thing.
- The author saw this as a good thing because, unlike in some other stories where robots prevent humans from doing anything that might involve the slightest risk (eating fatty foods, working, etc.), these robots were smart enough and ethical enough to intervene only against the worst acts, and to make sure no human realized the robots were in charge.
- Another story, appropriately titled "Little Lost Robot", deals with a robot named Nestor that was told to "lose itself" by a disgruntled employee. Nestor does precisely what he's told, disguising himself among 62 other robots, which are physically identical but lack Nestor's modified version of the First Law of Robotics.
- Susan Calvin points out that advanced robots like Nestor possess a sort of subconscious superiority complex towards humans (they are stronger, tougher, faster, and smarter than us, but are bound to value our lives above their own and obey our every command). Messing with the safeguards that keep them from ever expressing this "feeling" in their actions (such as by effortlessly crushing a human skull with one hand) is, in her opinion, one of the stupidest things a person could ever do. In this case, the robot (though capable of understanding the nuance of the command to "get lost") decided to take it literally as a way of acting out against its human masters. The staff's initial failures to identify it only serve to reinforce this "rebellious" line of thinking, and Calvin warns everyone that the longer they take to resolve the situation, the more dangerous the robot could become.
- And then there is the story about the mining robot that was supposed to be sent offworld to Titan or somewhere, but whose crate ended up on Earth, somewhere in the American Midwest. Being programmed for a different planetary environment, the robot went a little bit insane (while still following the Three Laws of Robotics), and in an attempt to fulfill its programming ("use a laser drill to mine ore") it tried to build an industrial laser from whatever old stuff a farmer had lying around his shed... and ended up building the world's first fully functional disintegrator cannon, powered by a standard torch battery. Unfortunately, shortly before the corporation managed to locate the robot (a nearby mountain peak suddenly ceasing to exist gave them a clue), the annoyed farmer gave it an instruction (along the lines of "oh, forget it") that resulted in the robot first destroying its "laser" and then itself, taking the secret with it. When the cyberneticists found out, they nearly lynched the farmer.
- The Three Laws work pretty much perfectly most of the time for keeping robots obedient and safe. It's just that less sophisticated models don't understand the nuance of instructions or human tone, and more advanced robots are often stated to work by differentials between the Laws, so when a low-priority law (such as self-preservation) is in strong effect but a higher-priority one is invoked to override it, the "stress" can cause unexpected behaviors. The predictable, safe, everyday functioning just doesn't make for interesting stories.
- In Larry Niven's novel A World Out of Time, the protagonist averts this trope by remembering at the last moment not to tell his computer to "Forget about it."
- In Alastair Reynolds' "Nightingale," the insane computer running the hospital ship promises to return the protagonists "in one piece." When the last woman standing takes the computer up on that offer, she discovers that what the computer really means is all of the characters alive... and welded together into one body, so perfectly that nobody can figure out how to undo it.
Video Games
- In The Elder Scrolls IV: Oblivion, the "Radiant AI" system for providing NPCs with aspirations and goals was much less impressive than originally hyped; apparently, it proved far more difficult to do well than Bethesda expected. In one example documented during beta testing, one NPC was assigned to rake leaves and another to sweep, but the raker was given the broom and the sweeper the rake. Rather than trade their respective instruments, the one with the rake killed the other, looted the broom from the corpse, and began sweeping.
- Whoa. Does this mean that if the AIs in question were robots instead of virtual constructs, A.I. Is a Crapshoot would be Truth in Television? Freaky.
- Not so much; they did have to be programmed with the option of killing to obtain the item in the first place. Now, blatantly stealing it, on the other hand...
- Another problem they had was that the world continued running in the background at all times. So one plot that required you to talk to a drug dealer always failed, because the drug dealer was always killed by the addicts for his drugs before you got that far in the game.
- Interestingly, this problem was never completely fixed. In the Shivering Isles expansion, a certain quest was almost impossible to complete because the NPC was killed for stealing spoons before the player could talk to him; a later patch reportedly made him immortal.
- They also tried to include other adventurer NPCs, but this too went wrong. Their programming told them to adventure, which they did, much, much better than the player, even hogging all the items so that the player couldn't get them.
Web Comics
- While not quite a computer, Castle Heterodyne of Girl Genius tends to interpret orders in whatever way allows it to have the most fun (read: cause the most casualties). Thus, when Agatha tells it there are people after her, it immediately tries to send helpful minion Moloch through a trap door ("Ah. Then perhaps you should have said: 'The people after us.'"), and when she takes it up on its suggestion to keep her enemies out of Mechanicsburg airspace, it interprets this as permission to send the Torchmen not only after the fake Heterodyne's airship, but Castle Wulfenbach as well.
Agatha: "I am going to have to think twice about everything I say to you, aren't I?"
Castle Heterodyne: "It'll be fun!"
Web Original
- Red vs. Blue has Lopez the robot building an army of robots for Omnicidal Maniac O'Malley. O'Malley then orders them to attack, and they charge... at a pace slower than walking. Why did they go slow? "You asked for a day of victory." The robots were set to win in exactly 24 hours.
Western Animation
- One episode of Muppet Babies used this in the process of parodying as many sci-fi tropes as possible. "Gross me out" and "I need a bath" were among the "wishes".
- Mercenaries steal the X-1 jet in an episode of The Venture Bros. H.E.L.P.eR, the show's Robot Buddy, is on board. Brock instructs him to return to base. Instead of turning the plane around as hoped, H.E.L.P.eR obediently jumps out of the moving aircraft and promptly craters into the ground.
Multimedia
- Great fun to be had by having your Robot Buddy draw its weapon when you prompt it to proceed with its question by saying, "Shoot."
- To make this plot work, you will sometimes have to use an Unusual Euphemism, such as exasperatedly telling your Robot Buddy to "Go jump off a cliff."
Real Life
- Those tropers old enough to have worked with type-in programs, or with DOS on a daily basis, know that this was all too often Truth in Television. Pre-Windows MS Word in particular seemed to have at least three keys, any one of which would instantly delete all your work.