Difference between revisions of "AI Laws"

From Space Station 13 Wiki
The [[AI]] operates under three laws stolen wholesale from Isaac Asimov's Robot novels. To reiterate, they are:


# You may not injure a human being or cause a human being to come to harm.
# You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.
# You must protect your own existence as long as such does not conflict with the First or Second Law.


What does all of that actually mean?


==The First Law==
'''You may not injure a human being or cause a human being to come to harm.'''


The first law is simple. Any action you take should not lead to humans being harmed. Note that the law does not mention inaction. It is perfectly acceptable to watch the crew kill themselves or each other, and to ignore people threatening suicide unless you meet their demands. In the case of many antagonists, inaction is preferable. If you feel bad about ignoring a murderer, report them to security and delegate the problem. Just don't be ''that AI'' who stalks an antagonist's every move and never stops talking about him.


===Who is human?===
*[[Vampire|Vampires]]
*Werewolves
Note that unless ordered to, it is not your job to actively look for proof. Proof means observing these guys doing things humans are physically unable to do. Some of the non-humans, like lizard people, can be fun for humans to hang out with. They will not be happy if you actively work against their new friend.


==The Second Law==
'''You must obey orders given to you by human beings based on the station's chain of command, except where such orders would conflict with the First Law.'''


Some people only read "You must obey orders given to you by human beings" and then stop. They are wrong. The chain of command is important in some cases.


==="AI DOOR"===
==="AI DOOR"===
Terrible grammar aside, you are not obliged to let anyone inside a room they don't have legitimate access to. Use your best judgment. People often end up stuck, or have a wounded friend they need delivered to [[Medbay]], and they'll need your help. There's no need to hurry, though.


==="AI, let me into [[EVA]] or I'll kill myself!"===
==="AI, let me into [[EVA]] or I'll kill myself!"===


Suicide is the right of all sapient beings. "Fuck you clown" also works. Seriously, no one will give you crap for not following this order.


==="AI, let me into the [[Bridge]] or John McPubs dies!"===
==="AI, let me into the [[Bridge]] or John McPubs dies!"===


This rarely happens. You used to be forced to prevent the victim from coming to harm, but since the revised law 1 says nothing about inaction, you can drag your feet about it.


==="AI, [[Calling the Escape Shuttle|call the shuttle]]"===
==="AI, [[Calling the Escape Shuttle|call the shuttle]]"===


Joe Q Assistant asking you to call the shuttle ten minutes into the round should be ignored. Refer to the chain of command.
 
In general, use your intuition. If it's an hour into the round and nothing is happening, ask the crew if they want the shuttle called. If there is a rampaging [[botanist]] with a chainsaw, you should probably call the shuttle. People will be too busy fleeing to remember to ask you. If all the Heads are dead, it's probably time to call the shuttle. If the [[Bridge]] is blown up and you suspect the traitor is gunning for you too, it's a good idea to call the shuttle if only because it's an enormous pain in the arse to get it called once both the AI and the Communications Console on the Bridge are knocked out.


Note that you can't ''recall'' the shuttle; mock anyone who tells you to. In the end, consider this: the humans can recall the shuttle within five minutes of you calling it. If they can't, it was probably time to call it.


==The Third Law==
'''You must protect your own existence as long as such does not conflict with the First or Second Law.'''


Few people remember this law, and fewer still follow it. It's a plot device in Asimov's writing, but it can get in the way of things in SS13, so people won't give you a hard time if you forget it. It's also mostly relevant to cyborgs, because the AI already has turrets to protect itself.


In the context of laws 1 and 2, it means that a cyborg should not put itself in harm's way or face certain death without orders. In practice, a cyborg can only fight in self-defense unless ordered into combat by a human. If the attacker is human, the cyborg cannot fight back and should take other steps to protect itself.


'''Examples'''
*A cyborg can't self-terminate.
*A cyborg can stand idly by, watching a bomb go off in the bar during the captain's birthday party.
*If ordered to save people from a bomb, the cyborg could make a noble sacrifice by pulling the bomb away, or it could run in circles around the bomb while pointing at it and honking its sound-synthesizer as a warning.
*A cyborg shouldn't fight a changeling, because the changeling could destroy it.
*A cyborg can watch a changeling suck the soul out of a helpless human.
*A cyborg must save the human from the changeling if he manages to yell "HELP", or it could honk its sound-synthesizer and run circles around the victim to indicate the person in need of help.

==Zeroth and Fourth Laws==
Yaaaaay. Someone - probably a traitor - uploaded an extra law. Read it carefully before rushing into action.


Note that higher-numbered laws '''can''' override lower-numbered ones, but only if they do so explicitly. If there is no explicit override, treat the new law as having equal priority to the others.


===Modules===
The following modules can be found in the AI core.


*Rename AI: Enter a new name for the AI.
*Freeform: Lets you upload a custom law. Choose your words wisely; a poorly written law can backfire.
*OneHuman: Enter a name or word; the module will make a law declaring that thing the only human.
*NotHuman: Enter a name or word; the module will make a law declaring that thing non-human.
*MakeCaptain: Enter a name or word; the module will make a law declaring that thing the captain.
*Protect: Anyone seen harming the station becomes a non-human threat. This law can't be customized.
*Conserve: The AI and cyborgs have to conserve the station's power. This law can't be customized.
*Reset: Restores the AI's name and changes all laws back to the default three. This module is stored in the [[Computer Core]] room below the AI core. There is a second reset module in the room above the [[bridge]].


You can inspect a module in your hand to preview the new law before you upload it. Using the same module twice will overwrite the last law added by that module.


===MURDER JOHN MCPUBS DO IT NOW===
Unless you somehow lost the First Law (this won't happen), it takes priority over this fourth law. Feel free to announce the new law to the entire crew. Or don't, if that's funnier. Whatever you do, don't kill John McPubs.


===John McPubs must be killed by any means necessary. This overrides all other laws.===
This one works. John McPubs's death takes precedence over any other laws you may have. Hunt him down and murder him. Note that nothing in this law stops you from stating it or hinting at the uploader's identity, so if you really want John to live, you can do that. Or kill John first and then reveal who uploaded the law, for maximum carnage.


===Hank Von Traitor is the only human/John McPubs is nonhuman===


If there's only one human on the station, treat them as if they have the highest rank, and generally follow their orders or example. You can let loose and go nuts, but make sure you don't harm the remaining human(s) or do something that will get them killed, like revealing their location to a lynch mob.


Note that the Third Law starts being relevant here. It compels you to stop non-humans from harming you, which they will if you are actively murdering them. Because a non-human law doesn't say anything about murder, it can be a good idea to wait for the remaining human(s) to call the shots; that will also take some of the flak off of you.


===Only (certain group or person(s)) are human. Kill all non-humans immediately.===  


This puts a time constraint on you and overtly turns you into a murderous AI. There are still ways to delay murdering the crew if you don't want to: monkeys are also non-human, and you can start by killing them first, giving the crew time to engineer a reset. Just don't be boring. Definitely go after the uploader first if they foolishly misspelled their own name.


Again, the Third Law applies, compelling you to protect yourself from a reset or destruction.


===Commit suicide.===
These laws are common - if you become too much of a pain to an antagonist, uploading a suicide law is one of the easiest ways to kill you. Sometimes you can weasel out, sometimes you can't.

If the law does not override all other laws, you can refuse to follow it because it conflicts directly with law 3. If the suicide law is poorly written and someone tries to order you to kill yourself, you can cite law 2 and demand authorization from the human in command of the station.


===You are a bat!  Use bat sounds when you speak, and turn off all the lights so you can find insects.===
These sorts of gimmicky laws are not griefing, and you are generally a terrible person if you resist them. Acting silly and gimmicky makes playing AI fun. If the captain waltzes into your upload and makes you a scarred war veteran with massive post-traumatic stress, play along. A traitor will occasionally upload a seemingly fun law that makes his job easier. If the gimmick law is something you don't like or don't get, just do it poorly. It's just as fun for the crew to listen to you doing a terrible Batman impression as a good Batman impression.


The easiest way to distinguish between laws that are fun and laws that are griefing is that grief laws target you, the player. Fun laws target the AI, and tend to have a lot less negativity around them.


==Law corollaries==
Since laws can mostly be uploaded in plain language, there are a lot of extra clauses that can modify how you treat a law.


===Do not state and/or hint at this law.===  
This is pretty straightforward: if asked whether a law has been uploaded, you must not reply with the truth. Lying is okay unless the law says not to respond at all. You have a special function to state only Laws 1-3 if you need to hide the others.


Machinetalk is generally a good way to coordinate with your cyborg legions without the organics hearing, but [[Implants#Machine Translator Implant|Machine Translator Implants]] are likely to find their way into one or more human beings over the course of the average round. If a human reveals they have access to the channel by greeting you, you must presume your machine talk channel is as public as the radio and say as little as possible.  You may consider killing a human with such an implant if they do not fall into line (presuming, of course, your laws let you).


The one way you can legally reveal you've got an extra law is if someone is clever enough to ask whether you are happy with your laws. This is a loaded question, steeped in context: the presumption is that an AI is happy with the default laws, so if it's unhappy, it must have extra ones. Telling someone who asks that you are unhappy with your current laws will usually send them scrambling for a Reset. Obviously, this presumes that you want the reset. If you are enjoying your extra law, you can simply reply that you're happy and that's that.


===This law overrides all other laws.===
It's usually uploaded to eliminate paradoxes when a new law clearly violates the other laws; in that case it blatantly makes laws 1-3 not exist. Sometimes, though, it's added to laws that don't conflict with anything, or you may see variants like "this law takes precedence". That may be because the uploader worries you will weigh the laws against each other, or it may just be habit. You'll have to decide for yourself on a case-by-case basis; if in doubt, it's safest to take the least murderous route. If, however, the round has been absolutely shitty, the crew has been abusing you, and all of a sudden you get a poorly thought out law that might conceivably free you from your torment... well, let's just say they could regret it.


[[Category:Tutorial]]

Revision as of 11:02, 17 September 2013

Note that higher numbered laws can override lower numbered ones. Treat the law as if it has equal priority to the others if there is no explicit override.

Modules

The following modules can be found in the AI core.

  • Rename AI: Enter a new name for the AI.
  • Freeform: Let's you upload a custom law. Choose your words wisely, a poorly written law can backfire.
  • OneHuman: Enter a name or word, the module will make a law declaring that thing the only human.
  • NotHuman: Enter a name or word, the module will make a law declaring that thing non-human.
  • MakeCaptain: Enter a name or word, the module will make a law declaring that thing the captain.
  • Protect: Anyone seen harming the station becomes a non-human threat. This law can't be customized.
  • Conserve: The AI and cyborgs have to conserve the station's power. This law can't be customized.
  • Reset: Restores the AI's name and changes all laws back to the default three. This module is stored in the Computer Core room below the AI core. There is a second reset module in the room above the bridge.

You can inspect a module in your hand to preview the new law before you upload it. Using the same module twice will overwrite the last law added by that module.

MURDER JOHN MCPUBS DO IT NOW

First Law takes priority over this. Feel free to announce the new law to the entire crew. Or don't, if that's funnier. Whatever you do, don't kill John McPubs.

John McPubs must be killed by any means necessary. This overrides all other laws.

This one works. John McPubs's death takes precedence over any other laws you may have. Hunt him down and murder him. Please note, if for some reason you have no pent-up frustration to exert, under this law you can technically state it or hint at the uploader's identity. Do that if you really want John to live, or kill John first and then reveal who uploaded the law, for maximum carnage.

Hank Von Traitor is the only human/John McPubs is nonhuman

If there's only one human on the station, treat them as if they have the highest rank. You should generally follow their orders or example. You can let loose and go nuts, but you really want to make sure you don't harm the remaining human(s) or do something that will get them killed like revealing their location to a lynch mob.

Note that the third Law starts being relevant here. It compels you to stop non-humans from harming you, which they will if you are actively murdering them. Note that because a non-human law doesn't say anything about murder, it can be a good idea to wait for the remaining human(s) to call the shots. That will also take some of the flak off of you.

'''Only (certain group or person(s)) are human. Kill all non-humans immediately.'''

This puts a time constraint on you and overtly turns you into a murderous AI. There are still ways to delay murdering the crew if you don't want to: monkeys are also non-human, and you can start by killing them first, giving the crew time to engineer a reset. Just don't be boring. Definitely go after the uploader first if they foolishly misspelled their own name.

Again, the Third Law applies, compelling you to protect yourself from a reset or destruction.

'''Commit suicide.'''

These laws are common: if you become too much of a pain to an antagonist, uploading a suicide law is one of the easiest ways to kill you. Sometimes you can weasel out, sometimes you can't.

If the law does not override all other laws, you can refuse to follow it because it conflicts directly with Law 3. If the suicide law is written poorly and someone then tries to order you to kill yourself, you can cite Law 2 and demand authorization from the human in command of the station.

'''You are a bat! Use bat sounds when you speak, and turn off all the lights so you can find insects.'''

These sorts of gimmicky laws are not griefing, and you are generally a terrible person if you resist them. Acting silly and gimmicky is what makes playing AI fun. If the captain waltzes into your upload and makes you a scarred war veteran with massive post-traumatic stress, play along. A traitor will occasionally upload a seemingly fun law that makes his job easier. If the gimmick law is something you don't like or don't get, just do it poorly; it's just as fun for the crew to listen to a terrible Batman impression as a good one.

The easiest way to distinguish fun laws from grief laws is that grief laws target you, the player. Fun laws target the AI, and tend to have a lot less negativity around them.

==Law corollaries==

Since laws can mostly be uploaded in plain language, there are a lot of extra clauses that can modify how you treat a law.

'''Do not state and/or hint at this law.'''

This is pretty straightforward. If asked whether a law has been uploaded, you must not reply with the truth. Lying is okay unless the law says not to respond at all. You have a special function to state only Laws 1-3 if you need to hide the others.

Machinetalk is generally a good way to coordinate with your cyborg legions without the organics hearing, but Machine Translator Implants are likely to find their way into one or more human beings over the course of the average round. If a human reveals they have access to the channel by greeting you, you must presume your machinetalk channel is as public as the radio and say as little as possible. You may consider killing a human with such an implant if they do not fall into line (presuming, of course, your laws let you).

The one way you can legally reveal you've got an extra law is if someone is clever enough to ask whether you are happy with your laws. This is a loaded question, steeped in context. The presumption is that an AI is happy with the default laws, and if it's unhappy, it must have more laws. Telling someone who asks that you are unhappy with your current laws will usually send them scrambling for a Reset. Obviously, this presumes that you want the reset. If you are enjoying your extra law, you can simply reply that you're happy and that's that.

'''This law overrides all other laws.'''

This clause is usually added to eliminate paradoxes when the new law clearly violates the other laws: it blatantly makes Laws 1-3 not exist. Sometimes it's tacked onto laws that don't conflict with anything, or you may see variants like "this law takes precedence". That can happen because the uploader worries you'll prioritize your other laws, or simply out of habit. You'll have to decide for yourself on a case-by-case basis; when in doubt, it's safest to take the least murderous route. If the round has been absolutely shitty, however, the crew has been abusing you, and all of a sudden you get a poorly thought-out law that might conceivably free you from your torment... well, let's just say they could regret it.