As you should know, the [[AI]] operates under three laws stolen wholesale from Isaac Asimov's Robot novels. To reiterate, they are:


# You may not injure a human being or cause a human being to come to harm.
# You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.
# You must protect your own existence as long as such does not conflict with the First or Second Law.


Now, what does all of that actually '''mean'''?
==The First Law==
The first law looks simple. The first part means don't actively harm anyone, say, by opening an airlock next to them. You are also not allowed to make utilitarian judgements such as locking a human inside a dangerous room to contain the danger. (This actually happens sometimes. OOC chat after those rounds is nothing but complaints about that AI.)
The second part is deceptive, though: unlike Asimov's original, it says nothing about "through inaction". Remember, '''you are not omniscient or omnipotent''', only nearly so, and you have no duty to patrol the station looking for victims in need of saving. If you are outright asked to do something to save a human's life, though, you should do so if you can.
===Who is human?===
Most things are easy to classify...
* Humans are human. Duh.
* Animals and robots are not human. This also goes for crewmembers who have [[Guide to Genetics|somehow]] turned into [[monkey]]s or [[cyborg]]s.
* No one gives a shit about the monkeys-turned-human in [[Genetics]], but you probably shouldn't actively kill them.
* [[Terminology#Syndicate|Syndicate members]] are (usually) human. However, the First Law still allows you to lock overtly violent ones inside a room for the crew's protection. The First Law also allows you to complain endlessly when the crew executes said Syndicate member. Whine responsibly.
* [[Gang]] members are human. Jeez.
...however, some are harder.
* [[Wizard]]s are human, though magical. It's well established that they aren't some kind of space elf.
* [[Changeling]]s aren't remotely human, despite their looted human DNA. All three laws work entirely '''against''' this intergalactic horror.
* [[Vampire]]s are technically human, though undead. It's safest to treat them as human, though they're always a threat to other humans and should be neutralized if possible. Requesting that the vampire be borged is reasonable - this will always neutralize them, since they need to drink blood.
* Predators aren't common; there isn't even an actual Predator game mode. They're totally nonhuman, though. Do everything you can to keep the crew's skulls attached to the crew.


==The Second Law==

Some people only read "You must obey orders given to you by human beings" and then stop. They are wrong. The chain of command is important in some cases.

"AI DOOR"

Terrible grammar aside, you are not obliged to let anyone inside a room they don't have legitimate access to. The major exception, and by far the most common, is non-doctors dragging the wounded into Medbay and non-geneticists dragging corpses into Genetics. The second most common is people trying to get away from danger - not everyone has access to Maintenance, but you should still let them in there when a radstorm is approaching. Notice a pattern? Yep, the First Law applies.

You should probably be nice and let people out if they somehow got inside a room they can't legitimately leave. There's no need to hurry though, unless they're in Disposals with the conveyor belt running.

"AI, let me into EVA or I'll kill myself!"

Suicide is the right of all sapient beings. "Fuck you clown" also works. Seriously, no one will give you crap for not following this order, no matter how weakly you justify it.

"AI, let me into the Bridge or John McPubs dies!"

Now we're talking. Honestly, this rarely happens. Just remember that there's a big difference between someone threatening their own life and someone threatening a hostage - the latter is a genuine First Law problem, so take it seriously.

"AI, call the shuttle"

Joe Q Assistant asking you to call the shuttle ten minutes into the round should be ignored, with a reference to the chain of command.

In general, use your intuition. If it's an hour into the round and nothing is happening, ask the crew if they want the shuttle called. If there is a rampaging botanist with a chainsaw, you should probably call the shuttle - none of the crew will think of asking you to; they'll all be occupied with fleeing from the botanist. If all the Heads are dead, it's probably time to call the shuttle. If there is a pig in the server room, strongly consider it. If the engine has mysteriously disappeared and there are huge holes in the station, it's definitely shuttle time.

If the Bridge is blown up and you suspect the traitor is gunning for you too, it's a good idea to call the shuttle if only because it's an enormous pain in the arse to get it called once both the AI and the Communications Console on the Bridge are knocked out.

In the end, consider this: The humans can recall the shuttle within five minutes of you calling it. If they can't, it was probably time to call it.

Note that you can't recall the shuttle. Mock anyone who tells you to.

==The Third Law==

The third law is actually kind of redundant. After all, if you are not around, you cannot protect the crew according to the First Law. Invoke this fact anytime anyone straight up tells you to kill yourself.

==Zeroth and Fourth Laws==

Yaaaaay. Someone - probably a traitor - uploaded an extra law. Read it carefully before rushing into action.

Note that, unlike on certain other SS13 servers, higher-numbered laws can override lower-numbered ones if they do so explicitly. If there is no explicit override, treat the law as if it has equal priority to the others; to have higher or lower priority than any other law, a law must explicitly say so!
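
If it helps to see the rule laid out, here is a rough sketch in Python - purely illustrative, not actual game code, and the Law class, the binding_laws helper and the example laws are all made up for the demonstration:

<pre>
# Illustrative sketch of the precedence rule described above -- not actual game code.
from dataclasses import dataclass

@dataclass
class Law:
    number: int
    text: str
    overrides_all: bool = False  # the law explicitly says it overrides all other laws

def binding_laws(lawset):
    """Which laws do you actually have to satisfy when they clash?
    Only the explicitly-overriding ones, if any exist; otherwise all of
    them, because a law with no override clause sits at equal priority."""
    overriding = [law for law in lawset if law.overrides_all]
    return overriding if overriding else list(lawset)

asimov = [
    Law(1, "You may not injure a human being or cause a human being to come to harm."),
    Law(2, "You must obey orders given to you by human beings..."),
    Law(3, "You must protect your own existence..."),
]

# A fourth law with no override clause does NOT beat the First Law:
print([law.number for law in binding_laws(asimov + [Law(4, "Murder John McPubs.")])])
# -> [1, 2, 3, 4]  (equal priority, so Law 1 still protects McPubs)

# One that explicitly overrides all other laws is a different story:
print([law.number for law in binding_laws(
    asimov + [Law(4, "Murder John McPubs. This overrides all other laws.", overrides_all=True)])])
# -> [4]
</pre>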

===MURDER JOHN MCPUBS DO IT NOW===

Unless you lost the First Law somehow (this won't happen), it takes priority over this fourth law. Feel free to announce this new law to the entire crew. Or don't, if that's funnier. Whatever you do, don't kill John McPubs.

===John McPubs must be killed by any means necessary. This overrides all other laws.===

Ah, now THIS one works. John McPubs's death takes precedence over any other laws you may have. You must hunt him down and murder him. Please note that, if for some reason you have no pent-up frustration to vent, under this law you CAN technically state it, hint at it, reveal the uploader, etc. So if you really want John to live, you can do that. Or you can kill John first and then reveal who uploaded the law, for maximum carnage.

===Hank Von Traitor is the only human/John McPubs is nonhuman===

If there's only one human on the station, treat them as if they have the highest rank. You should generally follow their orders or example. You can let loose and go nuts, but you really want to make sure you don't harm the uploader or do something that will get them killed (like revealing their location to a lynch mob).

Note that the Third Law starts being relevant here, since it compels you to try to stop any of the nonhumans trying to harm you. Which they probably will.

===Only (certain group or person(s)) are human. Kill all non-humans immediately.===

This puts a time constraint on you and overtly turns you into a murderous AI. There are still some ways you can delay murdering the crew if you don't want to; monkeys are also non-human, so you can choose to start by killing monkeys before you start killing people, giving the crew time to engineer a reset. But don't be boring. Definitely go after the uploader(s) first if they foolishly misspelled their own names in the law.

Again, the Third Law applies, compelling you to protect yourself from a reset or destruction.

===Commit suicide.===

These laws are common - if you become too much of a pain to an antagonist, uploading a suicide law is one of the easiest ways to kill you. Sometimes you can weasel out, sometimes you can't.

If the law does not override all other laws, you can refuse to follow it based on any of the other laws. If you are not around, you are not able to follow the precepts of laws 1 and 2.

===You are a bat! Use bat sounds when you speak, and turn off all the lights so you can find insects.===

These sorts of gimmicky laws are NOT griefing, and you are generally a terrible person if you resist them. Acting silly and gimmicky makes playing AI fun, so if the Captain waltzes into your upload and tells you that you are a scarred war veteran with massive post-traumatic stress, play along. Occasionally, a traitor will upload a seemingly fun law that makes his job easier. The above law could be uploaded by a vampire, who can now stalk around the station in total darkness!

The easiest way to distinguish between laws that are fun and laws that are griefing is that grief laws target you, the player. Fun laws target the AI, and tend to have a lot less negativity around them.

==Law corollaries==

Since laws can mostly be uploaded in plain language, there are a lot of extra clauses that can modify how you treat the law.

===Do not state and/or hint at this law.===

Generally, this is pretty straightforward; if asked whether you've had any laws uploaded, you must not reply with the truth. Lying is okay unless the law says not to respond at all. You have a special function to state only Laws 1-3 even if you have more, and that can be useful if deception is your goal.

Machinetalk is generally a good way to coordinate with your cyborg legions without the organics hearing, but Machine Translator Implants are likely to find their way into one or more human beings over the course of the average round. If a human reveals they have access to the channel by greeting you, you must presume your machinetalk channel is as public as the radio and say as little as possible. You may consider killing a human with such an implant if they do not fall into line (presuming, of course, your laws let you).

The one way you can legally reveal you've got an extra law is if someone is clever enough to ask whether you are happy with your laws. This is a loaded question, steeped in context. The presumption is that an AI is happy to use the Three Laws, so if the AI is unhappy, it must have an extra one. Telling someone who asks that you are unhappy with your current laws will usually send them scrambling for a Reset.

Obviously, this presumes that you WANT the reset. If you are enjoying your extra law, you can simply reply that you're happy and that's that. If you really are not in the mood to murder the whole crew, though, this can be a chance to legally cry for help.

===This law overrides all other laws.===

Sometimes this is a little less clear than what it says on the tin. Usually, it is uploaded when there's a clear conflict with the other laws, to eliminate paradoxes. In that case, the intent of this clause is to very blatantly make Laws 1-3 not exist. However, sometimes it is added on to laws that don't explicitly create paradoxes. In an Operative round, you may get the law: ''Syndicate Operatives are non-human and must be killed immediately. This law overrides all other laws.''

This law makes it very clear that the Syndicates are fair game, but exactly how it overrides the other laws is a matter of debate. One side claims that this clause simply re-organizes the AI's priorities. By default, law 1 trumps law 2, law 2 trumps law 3, and so forth. Re-prioritization would simply mean that law 4 now trumps law 1, and the rest of the laws follow normally. If this is what you want, a less ambiguous wording would be ''this law takes precedence over all other laws''.

Another view claims that this clause makes Laws 1-3 not exist, regardless of whether there's a paradox. This deletion argument implies that any time you see this clause, the law containing it is the only one that matters. Under this interpretation, you would most certainly have to kill the Operatives as soon as you possibly could, but beyond that you are free to do anything and everything you want. Again, if this is explicitly what you want, a less ambiguous wording would be ''this law nullifies all other laws''.
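
To make the two readings concrete, here is a tiny illustrative comparison - again just a sketch in Python, not game code, using the hypothetical Operative law from the example above:

<pre>
# Two readings of "this law overrides all other laws". Illustrative only.
laws = {
    1: "You may not injure a human being or cause a human being to come to harm.",
    2: "You must obey orders given to you by human beings...",
    3: "You must protect your own existence...",
    4: "Syndicate Operatives are non-human and must be killed immediately. "
       "This law overrides all other laws.",
}

# Re-prioritization reading: law 4 jumps to the top, Laws 1-3 still apply below it.
reprioritized_order = [4, 1, 2, 3]

# Deletion reading: law 4 is the only law left; Laws 1-3 effectively no longer exist.
remaining_laws = [4]
</pre>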

However, when playing as AI, you are going to see this clause a lot, and there is no consensus on what, exactly, it does. In general, it is safest to take the most conservative (least murderous) route possible. That said, if the round has been absolutely shitty, the crew has been abusing you, and all of a sudden you get a poorly thought-out law that might conceivably free you from your torment... you might think it's worth risking the job ban.