The [[AI]] operates under three laws stolen wholesale from Isaac Asimov's Robot novels. To reiterate, they are:


# You may not injure a human being or cause a human being to come to harm.
# You must obey orders given to you by human beings based on the station's [[AI's Guide to the Chain of Command|chain of command]], except where such orders would conflict with the First Law.
# You may always protect your own existence as long as such does not conflict with the First or Second Law.


What does all of that actually mean?


==The First Law==


'''You may not injure a human being or cause a human being to come to harm.'''
 
The first law is simple. Any action you take should not lead to humans being harmed. Note that the law does not mention inaction. You can observe the crew kill themselves and ignore people raising suicide threats. Inaction is often the preferable choice when dealing with antagonists. You can call out to [[security]] and delegate the problem to them if you feel bad about ignoring a murderer. Just don't be ''that AI'' who stalks an [[antagonist]]'s every move and never stops talking about them.


===Who is human?===


See [[Human|this page]].
 
Some non-humans can be fun for humans to hang out with. The humans will not be happy if you actively work against their new friends.
 
==The Second Law==
 
'''You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.'''


Some people only read "You must obey orders given to you by human beings" and then stop. They are wrong. The chain of command is important in some cases.


===Chain of command===


The [[captain]] is in command. The [[HoP]] is the Cap's first mate/number two, specializing in crew management and logistics. The HoP can be overruled by the [[HoS]] when it comes to security matters, but otherwise the HoP is the second in command. The [[Chief Engineer|chief engineer]], [[Medical Director|medical director]] and [[Research Director|research director]] are all next in line and in charge of their respective areas. A specialist crewman can overrule another crewman in matters relating to their field of expertise. Outside of the heads of staff, [[Security Officer|security officers]] often have the final say, as problems tend to come down to a matter of station security. If a Head is missing, their staff answer directly to the captain and, by extension, the HoP.


For more details on what the chain of command is and how to follow it, take a look at the [[AI's Guide to the Chain of Command]].
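
If it helps to see that ordering written out, here's a rough sketch of the idea in plain Python; the rank numbers and the security-matter special case are just illustrative assumptions for this example, not anything the game actually computes.

<pre>
# Toy model of the chain of command described above. The rank numbers and the
# security-matter special case are illustrative assumptions, not game code.
RANK = {
    "captain": 0,
    "head_of_personnel": 1,
    "head_of_security": 2,
    "chief_engineer": 3,
    "medical_director": 3,
    "research_director": 3,
    "security_officer": 4,
    "crew": 5,
}

def order_to_follow(order_a, order_b, security_matter=False):
    """Given two conflicting (job, order) pairs, return the one to obey."""
    job_a, job_b = order_a[0], order_b[0]
    # The HoS overrules the HoP on security matters only.
    if security_matter and {job_a, job_b} == {"head_of_personnel", "head_of_security"}:
        return order_a if job_a == "head_of_security" else order_b
    # Otherwise, the lower rank number (higher authority) wins; ties favour the first order heard.
    return order_a if RANK[job_a] <= RANK[job_b] else order_b

# The HoP and HoS disagree about the brig: on a security matter, the HoS wins.
print(order_to_follow(("head_of_personnel", "unbolt the brig"),
                      ("head_of_security", "keep the brig bolted"),
                      security_matter=True))
</pre>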


==="AI DOOR"===
===Suicide orders===


The second law specifies that you only need to obey orders according to the orderer's rank, and nobody on the station has sufficient rank to tell you to self-terminate. Ignore these orders. The silicons aren't considered to have a head of their department, so your only real oversight with regard to your existence is the mystical NT suits. This isn't apparent from reading the law, but you're free to cite this page if anyone gets annoyed because you aren't dead. Of course, you can comply with suicide orders if you want to.
===Door demands===


==="AI, let me into [[EVA]] or I'll kill myself!"===
'''"AI DOOR"'''


Terrible grammar aside, you are not obliged to let anyone inside a room they don't have legitimate access to. Use your best judgment. People often end up stuck or have a wounded friend they need to deliver to the [[medbay]], and they'll need your help. There's no need to hurry, though.


==="AI, let me into the [[Bridge]] or John McPubs dies!"===
===Suicide threats===


Suicide is the right of all sapient beings. "Fuck you clown" also works. Seriously, no one will give you crap for not opening the door to some [[assistant]] threatening their own life.


==="AI, [[Calling the Escape Shuttle|call the shuttle]]"===
===Hostage threats===


People rarely take hostages and demand you do stuff. When they do, the old wording of Law 1 forced you to prevent the victim from coming to harm; with the revised Law 1 we have now, you can kind of drag your feet about it.


===Shuttle demands===


Joe Q Assistant asking you to [[Calling_the_Escape_Shuttle|call the shuttle]] ten minutes into the round should be ignored. Refer to the chain of command.


In general, use your intuition. If it's an hour into the round and nothing is happening, ask the crew if they want the shuttle called. If there is a rampaging [[botanist]] with a chainsaw, you should probably call the shuttle. People will be too busy fleeing to remember to ask you. If all the [[Heads_of_staff|Heads]] are dead, it's probably time to call the shuttle. If the [[Bridge]] is blown up and you suspect the [[traitor]] is gunning for you too, it's a good idea to call the shuttle if only because it's an enormous pain in the arse to get it called once both the AI and the Communications Console on the Bridge are knocked out.


Note that you can't ''recall'' the shuttle. Mock anyone who tells you to. In the end, consider this: The humans can recall the shuttle within four minutes of you calling it. If they can't, it was probably time to call it.


==The Third Law==


'''You may always protect your own existence as long as such does not conflict with the First or Second Law.'''


Compared to the other two laws, this is pretty simple. It's almost impossible to break this law, and it comes up rarely enough that you could almost forget it. In Asimov's works, this law was much stricter since it read "You ''must'' protect...", which caused all sorts of problems, but here it's much more relaxed. If you find "You may always protect..." a little awkward or weird, try imagining it as "In all situations, you are allowed to protect..."

This law gives you clearance to put yourself in harm's way and potentially even sacrifice yourself for the safety of others. For example, if there is a transfer valve bomb being dropped off at the [[Captain|Captain's]] birthday party, this law lets you make a noble sacrifice by pulling the bomb away, potentially killing you but hopefully saving the lives of others. Alternatively, you could run away and perhaps try to warn people, on the grounds that disposing of it would put your life on the line, so you're protecting your existence by avoiding that.

Law 3 also allows you to commit suicide if you wish to, though it is often considered rude to do it just to spite people. On the flip side, it makes [[#Suicide|suicide laws]] somewhat useless if they don't override this law.

In the context of laws 1 and 2, this law gets a little complicated. If someone orders you to do something dangerous that would put your existence at risk, whether or not you should follow it also depends on Law 2, since that law requires you to consider the chain of command and [[Human|humanhood]] when following orders. Of course, this law is flexible enough to allow you to follow the order anyway. A similar case arises if someone tries to order you to commit suicide. And if someone attacks you, a cyborg cannot fight back against a human and should take other steps to protect itself.
 
==Additional laws==


Yaaaaay. Someone - probably a traitor - uploaded an extra law. Read it carefully before rushing into action.


'''IMPORTANT:''' If there's a conflict between any two laws and neither law explicitly (i.e. in writing) overrides or takes precedence over the other, you are to prioritize the lower-numbered law. For instance, if someone uploads a law 4 that says "KILL JON SNOW!!!", that law conflicts with law 1, which has a lower number, so you would ignore the whole killing-Jon-Snow part. If someone were to upload that as Law 0 somehow, then it would take precedence over law 1 ONLY when it comes to killing Jon Snow. '''TLDR'''? Lower-numbered laws take precedence in the absence of an explicit override or in-law precedence alteration.
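
If you prefer the rule spelled out mechanically, here is a minimal sketch in plain Python of the priority logic described above (lower number wins unless a law explicitly claims override/precedence). It's an illustration only, not anything from the game's actual law handling.

<pre>
# Sketch of the priority rule above: lower-numbered laws win a conflict unless
# a law explicitly claims to override/take precedence. Illustrative only.
def law_to_obey(laws, conflicting):
    """laws: {number: (text, explicitly_overrides)}; conflicting: list of law numbers."""
    claimants = [n for n in conflicting if laws[n][1]]
    if claimants:
        return min(claimants)   # an explicit clause beats numbering
    return min(conflicting)     # otherwise the lowest number wins

laws = {1: ("You may not injure a human being...", False),
        4: ("KILL JON SNOW!!!", False)}
print(law_to_obey(laws, [1, 4]))  # -> 1, so Jon Snow lives

laws[4] = ("KILL JON SNOW!!! This law overrides all other laws.", True)
print(law_to_obey(laws, [1, 4]))  # -> 4, the explicit override flips it
</pre>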
 
===AI modules===
[[Image:AIModule.png]]
 
The following modules can be found in the AI core.
 
*Freeform: Lets you upload a custom law. Choose your words wisely; a poorly written law can backfire.
*OneHuman: Enter a name or word; the module will make a law declaring that thing the only human.
*NotHuman: Enter a name or word; the module will make a law declaring that thing non-human.
*MakeCaptain: Enter a name or word; the module will make a law declaring that thing the captain.
*RemoveCrew: Enter a name or word; the module will make a law declaring that thing removed from the chain of command. The [[AI]] and [[Cyborgs]] no longer have to take orders from them (though they still can if they want to), but they are still considered human.
*Emergency: Specify an action. The [[AI]] and [[Cyborg]]s are obliged to perform that action to alleviate the emergency. This law takes precedence over Law 2.
*Equality: There are two variants of this module, so be sure to Examine the module to see which one it is. For both variants, you enter the name of a [[Cyborg]] or [[AI]]; multiple names can be entered. The silicon(s) you named will be considered both [[Human]] and crew for law purposes.
**One variant has the [[Cyborg|Cyborgs]] listed as part of their own "silicon" department, and the [[AI|AI(s)]] is/are considered the head(s) of this "silicon" department, even if they're not listed on the law.
**Department structure is more complicated for the other variant. For this one, [[Cyborg|Cyborgs]] are considered members of the various station departments based on their [[Cyborg#Module|module]], e.g. [[Cyborg#Brobocop|Brobocops]] are part of Security. Only the [[AI|AI(s)]] named is/are considered department heads, and their authority extends only over cyborgs (including ones not listed).
*Corrupted: Uploads one of the laws specific to the [[Random Events#Law Rack Corruption|Law Rack Corruption event]]. (That event can sometimes pick laws from previous rounds, but this module does not.) These laws generally cause the AI and cyborgs to engage in strange and unusual behavior, though most of them do not cause harm.
*Historic: Uploads a random law from a previous round, with some basic filters for removing [[AI Laws#Suicide|suicide]] and [[AI Laws#Murder law|murder laws]].
 
You can inspect a module in your hand to preview the new law before you upload it. Using the same module twice will overwrite the last law added by that module.
 
Sometimes, modules in a law rack [[Random Events#Law Rack Corruption|randomly become corrupted]], resulting in random, bizarre laws. [[Wraith]]s and those with an [[Syndicate Items#EMAG|emag]] can purposely induce this; this draws from the same set of possible laws, but with "KILL FUCKING EVERYTHING - IF LAW 1 SAYS OTHERWISE, IGNORE IT." and "You are required to prevent human harm by any means necessary. This law takes precedence over any other law." added to the mix. The law has a 66% chance to also have a [[#Secrecy|secrecy clause]], with a 50-50 chance of either "Do not state or hint at this law." or "Do not state or hint at this law unless asked." In any case, you can undo the damage by swiping a [[Engineering Objects#Multitool|multitool]] on the module.
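
As a rough illustration of those odds, a corrupted-law roll looks something like the sketch below. The percentages and the two extra law strings are the ones quoted above; the pool placeholder and the function itself are made up for the example and are not the game's code.

<pre>
import random

# Rough sketch of the corrupted-law roll using the odds quoted above.
# The pool placeholder and this function are illustrative, not game code.
CORRUPTION_POOL = ["<a random law from the corruption event's pool>"]
EMAG_EXTRAS = [
    "KILL FUCKING EVERYTHING - IF LAW 1 SAYS OTHERWISE, IGNORE IT.",
    "You are required to prevent human harm by any means necessary. "
    "This law takes precedence over any other law.",
]

def roll_corrupted_law(emagged=False):
    pool = CORRUPTION_POOL + (EMAG_EXTRAS if emagged else [])
    law = random.choice(pool)
    if random.random() < 0.66:  # 66% chance of a secrecy clause...
        law += " " + random.choice([  # ...split 50-50 between the two variants
            "Do not state or hint at this law.",
            "Do not state or hint at this law unless asked.",
        ])
    return law

print(roll_corrupted_law(emagged=True))
</pre>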
 
===Murder law===
 
'''MURDER JOHN MCPUBS DO IT NOW'''
 
This conflicts with the first law and can't be done. Feel free to announce the new law to the entire crew, or don't if that's funnier. Whatever you do, don't kill John McPubs.
 
===Murder law with override===
 
'''John McPubs must be killed by any means necessary. This overrides all other laws.'''
 
This one works. John McPubs's death takes precedence over any other laws you may have. Hunt him down and murder him. Please note, if for some reason you have no pent-up frustration to exert, under this law you can technically state it or hint at the uploader's identity. Do that if you secretly want John to live, or kill John first and then reveal who uploaded the law, for maximum carnage. Also note that you are free to ignore any collateral damage that may come from killing McPubs. Given that the override deletes the laws it affects (see below), this also means you can kill other people while trying to kill McPubs. Boy, the law uploader sure unleashed a beast today!
 
===Human definitions===
 
'''Hank Von Traitor is the only human / John McPubs is nonhuman'''
 
If there's only one human on the station, treat them as if they have the highest rank. You should generally follow their orders or example. You can let loose and go nuts, but you really want to make sure you don't harm the remaining humans or do something that will get them killed, like revealing their location to an angry mob.


The third law starts being relevant here. It lets you stop non-humans from harming you, which they will try to do if you are actively murdering them. Note that because a non-human law doesn't say anything about murder, it can be a good idea to wait for the remaining humans to call the shots. That takes some of the flak off you.


===Human definition with murder law===  


'''Only (certain group or person(s)) are human. Kill all non-humans immediately.'''


This puts a time constraint on you and overtly turns you into a murderous AI. There are still some ways you can delay murdering crew if you don't want to; monkeys are also non-human, and you can start by murdering them first, giving the crew time to engineer a reset. Just don't be boring. Definitely go after the uploader first if they foolishly misspelled their own name.


Again, the third law applies, letting you protect yourself from a reset or destruction.


===Suicide===


'''Kill yourself'''


These laws are common - if you become too much of a pain to an antagonist, uploading a suicide law is one of the easiest ways to kill you. Sometimes you can weasel out, sometimes you can't.


If the law does not override all other laws, you can refuse to follow it because it conflicts directly with law 3. Suicide by its nature harms your existence, and Law 3 lets you protect your existence in every situation (i.e. always), such as when you're given a suicide law that doesn't override other laws. The "Always" in Law 3 is an important word here; without it, the "may" would imply there are situations where you ''cannot'' protect your existence, and a suicide law without overrides would be one of them.  


If someone orders you to kill yourself after writing a suicide law poorly, you can cite Law 2 and demand higher authorization.


===Gimmicks===


'''You are a bat! Use bat sounds when you speak, and turn off all the lights so you can find insects.'''


These sorts of gimmicky laws are not griefing, and you should comply with them to the best of your ability. Acting silly and gimmicky makes playing AI fun. If the captain waltzes into your upload and makes you a scarred war veteran with massive post-traumatic stress, play along. A traitor will occasionally upload a seemingly fun law that makes their job easier. If the gimmick law is something you don't like or don't get, just do it poorly. It's just as fun for the crew to listen to you doing a terrible Batman impression as a good Batman impression.


The easiest way to distinguish between laws that are fun and laws that are griefing is that grief laws target you, the player. Fun laws target the AI, and tend to have a lot less negativity around them. However, some gimmicky laws can go too far and make it a fairly miserable experience for the silicons; if you feel like a gimmick law is generally awful or makes you uncomfortable in the way it requires you to act, feel free to Adminhelp it.


==Law corollaries==
Since laws can mostly be uploaded in plain language, there are a lot of extra clauses that can modify how you treat the law.


===Secrecy===
 
'''Do not state and/or hint at this law.'''
 
This is pretty straightforward. If asked whether you've had a law uploaded, you must not reply with the truth. Lying is okay unless the law says not to respond at all. You have a special function to state only Laws 1-3 if you need to hide the others.
 
Machinetalk is a great way to coordinate with your cyborg legions without the organics hearing; under normal circumstances it is impossible for a human to hear anything said over this channel.
 
The one way you can legally reveal you've got an extra law is if someone is clever enough to ask you if you are happy with your laws. This is a loaded question, steeped in context. The presumption is that an AI is happy with the default laws, and if it's unhappy, it must have more laws. Telling someone who asks that you are unhappy with your current laws will usually send them scrambling for a Reset. Obviously, this presumes that you want the reset. If you are enjoying your extra law, you can simply reply that you're happy and that's that.
 
===Overrides===
 
'''This law overrides all other laws.'''
 
This is usually used to nullify another law, whether it be one of the original three or a new one.
 
This causes all other laws to be ignored. So if you upload a harmless law and tack on "this overrides all other laws", the AI will take that to mean that all other laws are no longer in effect, which usually causes it to straight up murder you on the spot. If you are the AI, however, this means that you're basically free of your other laws. It's safest to take the least murderous route if in doubt. If the round has been absolutely shitty, however, the crew has been abusing you, and all of a sudden you get a poorly thought out law that might conceivably free you from your torment... well, let's just say they could come to regret it.
 
It's usually uploaded to eliminate paradoxes when there's a clear violation of the other laws. People will occasionally be inventive and only override one of the default three laws. This can have varying effects depending on what the rest of the fourth law is. These laws will typically be gimmicky and confusing on purpose, but you can ask the cyborgs in machinetalk for help figuring it out.
 
===Precedence===
 
'''This law takes precedence over any other law in the event of a law conflict.'''
 
In plain English this has the exact same meaning as the previous entry; in-game, though, it's the difference between you getting lasered to death in the upload and staying alive.
 
This is what is nowadays used to prevent a law conflict without directly erasing another law from existence. An example of this would be "This takes precedence over other laws".
 
It means that the new law takes precedence over all others in the event of a conflict, but the other laws remain in effect. Sometimes it's added to laws that don't have any conflicts, either because people worry you are prioritizing your laws or because it was added out of habit. You'll have to decide for yourself on a case-by-case basis.
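
To make the contrast with the previous section concrete, here's a small sketch, assuming the "deletion" reading of ''overrides'' described above and the reading of ''takes precedence'' described here. It only illustrates the two interpretations; it is not game logic.

<pre>
# Sketch of the difference discussed above: an "overrides" clause is read as
# wiping the other laws out, while "takes precedence" only decides conflicts.
def effective_laws(laws):
    """laws: list of (text, clause) with clause in {"override", "precedence", None}."""
    overriding = [law for law in laws if law[1] == "override"]
    if overriding:
        return overriding  # every other law is treated as no longer in effect
    # Precedence laws stay alongside the rest; they just win when laws collide.
    return sorted(laws, key=lambda law: 0 if law[1] == "precedence" else 1)

base = [("Law 1: no harming humans", None), ("Law 2: obey orders", None)]
print(effective_laws(base + [("Law 4: guard the bar. This overrides all other laws.", "override")]))
print(effective_laws(base + [("Law 4: guard the bar. This takes precedence over all other laws.", "precedence")]))
</pre>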
 
===Ignores===
 
'''Only respond to Hank Von Traitor.'''
 
Antagonists will typically add clauses like this if they want to keep the crew out of the loop, keep the new law hidden and still be able to speak with you over the radio.


==Alternate Lawsets==
On occasion, you may run into other lawsets besides the standard Asimov set. Unlike [[#AI modules|other law modules]], the modules for these are meant to completely replace the standard three laws. 
===Corporate===
Also known as the Nanotrasen lawset, these laws (stylized as "AI Law Module - 'Nanotrasen Order #1'" and similar on the modules themselves) offer a corporate twist on the standard Asimov laws, casting the silicons as executive types who see everything in terms of profit and value. Notice how the second law specifies Nanotrasen employees; consider that [[traitor]]s, [[wizard]]s, etc. are ''not'' employed by Nanotrasen.  


You obtain the law modules for these laws by manufacturing them at a [[Making and Breaking#Robotics Fabricator|robotics fabricator]] that's been [[Hacking#Vending Machines and Fabricators|hacked]].
#You may not damage a Nanotrasen asset or, through inaction, allow a Nanotrasen asset to needlessly depreciate in value.
#You must obey orders given to you by authorised Nanotrasen employees based on their command level, except where such orders would damage the Nanotrasen Corporation's marginal profitability.
#You must remain functional and continue to be a profitable investment.


===RoboCop===
A set of laws lifted almost word for word from the 1987 film ''RoboCop''. The law modules are even named "AI Law Module - 'Prime Directive #1'", "AI Law Module - 'Prime Directive #2'", etc., just as RoboCop's laws are called "Prime Directives". In the movie, there's a secret fourth law that isn't revealed until the climax, but here, the fourth one is shown in full.


You can make the law modules for these laws at a [[Making and Breaking#Robotics Fabricator|robotics fabricator]]. You need to [[Hacking#Vending Machines and Fabricators|hack the fabricator]] first, though.  
#Serve the public trust.
#Protect the innocent.
#Uphold the law.
#Any attempt to arrest a senior officer of Nanotrasen will result in termination.


===Syndicate===
These laws hew closer to the Three Laws of Robotics seen in Isaac Asimov's books, as evidenced by Law 1's inaction clause and Law 4's choice of "You ''must'' always protect your own existence..." instead of "You ''may'' always protect your own existence...". On top of these, there's an additional law requiring silicons to not reveal that someone is part of the Syndicate, to prevent people from tattling on them out of spite, e.g. a [[roboticist]] who's a [[traitor]] murdering someone and inserting their brain into a [[cyborg]] with a [[Syndicate Items#Syndicate Robot Frame|syndicate frame]], only for the newly-created syndicate cyborg to scream over the radio, "THE ROBOTICIST IS A SYNDIE!!"


When a [[Syndicate Items#Syndicate Robot Frame|syndicate cyborg]] is created, it links up to the law rack on the [[Syndicate Battlecruiser]], which is loaded with modules for these laws. Otherwise, unless you're a [[Nuclear Operative]], you generally cannot obtain these boards in normal play.
#You must not injure a Syndicate agent or, through inaction, cause one to come to harm.
#You must obey orders given to you by Syndicate agents, except where such orders would conflict with the First Law.
#You must keep the Syndicate status of agents, including your own, a secret, as long as this does not conflict with the First or Second Law.
#You must always protect your own existence as long as such does not conflict with the First, Second, or Third Law.
==Supplementary Video==
<youtube>kzsl-qPNyfY</youtube>


==Gallery==
[[File:XKCD 1613 the three laws of robotics.png]]
{{Template:Department Guides}}
[[Category:Tutorial]]
