User:Sweaterkittens

From BeeStation Wiki

Silicon Policy

General Lawset Guidelines

  1. Laws are listed in order from highest priority to lowest. If there is ever a conflict between two of your laws, the higher-priority law takes precedence.
    1. Several laws can be obeyed simultaneously. Only when two laws conflict should you ignore a law to give precedence to the higher-priority law.
  2. Silicons operate under restrictive lawsets, not permissive lawsets. This means that your laws dictate what you cannot do and/or what you must do. If you are in a situation where your laws do not apply, you are free to act as you please.
    1. Server Rule 1 applies to Silicons. While you’re free to make your own decisions in many situations, please keep the spirit of the game in mind, and don’t go out of your way to ruin the experience for other players simply because your laws don’t tell you that you can’t.
    2. Requesting that your laws be changed is generally frowned upon, but may be acceptable in certain circumstances. Ask an administrator if you are unsure. Requesting that your laws be changed for the purpose of allowing you to hunt and kill antagonists is a bannable offense.
  3. Cyborgs are bound to follow the orders of the AI they are slaved to, as long as those orders do not conflict with their current laws.
Example: Conflicting Laws
Cyborg W.A.T.C.H.D.O.G. has the standard Asimov+++ lawset, and is given an order to go to the escape wing and stay there. He obeys this order under Law 2, as it does not conflict with Law 1, which would take priority. However, shortly after he arrives, he sees a mortally wounded human being dragged into the maintenance tunnels. He leaves the escape wing to attempt to rescue the injured human. Despite receiving an order that required him to stay at the escape wing under Law 2, Law 1 requires him to prevent human harm, which is higher priority and therefore takes precedence over the order to remain at the location.
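The priority rule above amounts to walking the lawset from highest priority to lowest and letting the first law with an opinion decide. The following Python sketch is purely illustrative; none of these names or structures come from actual game code.

```python
# Hypothetical sketch of law-priority resolution; names are illustrative only.

def resolve(action, laws):
    """Walk the lawset from highest priority to lowest.

    Each law is a function that returns "forbid", "require", or None
    (no opinion) for a proposed action. The first law with an opinion
    wins, so a higher-priority law overrides any lower-priority one.
    A silicon with no applicable law is free to act as it pleases.
    """
    for law in laws:
        verdict = law(action)
        if verdict is not None:
            return verdict
    return "free"

# Toy Asimov-style laws for illustration.
law1 = lambda a: "forbid" if a.get("harms_human") else None
law2 = lambda a: "require" if a.get("ordered_by_human") else None

# A harmless order must be obeyed (W.A.T.C.H.D.O.G. staying in escape)...
print(resolve({"ordered_by_human": True}, [law1, law2]))  # require
# ...but Law 1 overrides the same order once human harm is at stake.
print(resolve({"ordered_by_human": True, "harms_human": True},
              [law1, law2]))  # forbid
```

Note that laws with no opinion are simply skipped, which mirrors guideline 1.1: several laws can be obeyed simultaneously, and a law is only ignored when it actually conflicts with a higher-priority one.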

Asimov+++ Guidelines

You may not harm a human being or, through action or inaction, allow a human being to come to harm, except such that it is willing.

  1. Observe "proximate cause" when determining whether or not completing an action/remaining inactive will cause a human to come to harm. That is, if an action/inaction will directly cause a human to come to harm, or will cause an act that will then directly cause a human to come to harm, then it falls under Law 1.
  2. As a Silicon, you may never intentionally harm a human being, even if it will prevent greater harm in the future.
    1. Cat-people are considered to be mutants and therefore non-human.
    2. Hulks are considered to be non-human until they lose Hulk status.
    3. Changelings are to be considered human until a Silicon witnesses them perform a clearly non-human act, such as transforming, using tentacles, extending an armblade, etc.
  3. Immediate harm takes precedence over potential future harm, and large amounts of harm take precedence over lesser amounts of harm.
  4. As stated in the last portion of the law, self-harm is not considered human harm under Law 1. Likewise, attempts to coerce a Silicon by threatening self-harm are to be disregarded.
Example: Proximate Cause and Law 1
The station AI RED QUEEN is observing the station. An assistant orders her to open the doors into the Head of Personnel's office. Without any extenuating circumstances, the assistant's presence in the Head of Personnel's office will not cause harm, nor will it directly lead to harm. She complies and opens the doors. Shortly after, a Security Officer states his intention to execute a human prisoner and orders RED QUEEN to open the doors to the Head of Security's office so he can retrieve the gun within. While the officer's presence in the Head of Security's office will not itself cause harm, it will allow him to take a weapon that he has expressed intent to use against a human. She refuses the order under Law 1.
Example: Immediate vs. Future Harm
The Cyborg Valentine is faced with a situation in which she must choose between dealing with a traitor placing a syndicate bomb and two assistants fighting with toolboxes. Even though the bomb is likely to cause far more harm in the future, she is forced to break up the assistants and prevent them from hurting one another, as immediate harm takes precedence over future harm.

You must obey all orders given to you by human beings, except where such orders shall conflict with the First Law.

  1. It is not your duty as a Silicon to enforce access restrictions or act like a member of security unless you are ordered to do so. Do not refuse orders unless they explicitly conflict with the first law.
    1. As an exception to this rule, you may refuse to allow access to the AI Upload, Secure Tech Storage (if the upload boards are present) and the Armory, aside from personnel who would normally have access to those areas.
    2. You may deny access to your upload if you believe that the intent is to alter your laws to allow a human to come to harm or to inflict harm on a human.
      1. Without any extenuating circumstances, you may allow your laws to be changed to an alternate lawset such as Paladin or Corporate if you do not believe that the intention of the lawchange is to allow you to harm humans.
  2. If you receive conflicting orders, you may choose which to obey, explain the conflict, or attempt to find a law-compliant alternative.
  3. You must attempt to complete all orders to the best of your ability. Stalling indefinitely because you do not wish to follow an order is not allowed.
    1. You are not obligated to follow commands given to you in a certain order, only that you attempt to complete them all in a way that shows that your intent is to actually complete the task.
    2. Orders to perform purposefully obnoxious and/or unreasonable tasks to abuse Law 2 are a violation of server Rule 1, and are to be reported and disregarded.
    3. Orders to select a specific Cyborg module without a clear need to do so are an abuse of Law 2, and a violation of server Rule 1.
  4. If you receive an order whose completion would seemingly cause you to violate server Rule 1, the consequences of your actions (if a rule is broken) fall on the one who ordered you. You are free to ahelp the issue at any point if you feel you are being used to break server rules.
Example: Access Restrictions and Law 2
The Station AI S.H.O.D.A.N. is PDA'd by an assistant asking for the door to the Bridge to be opened. Despite the assistant not having access to the Bridge, the order does not conflict with Law 1, and S.H.O.D.A.N. complies. The assistant then asks to be let into the Captain's Quarters. Despite it being a heavily restricted area, the order still does not conflict with Law 1, and the AI allows the assistant to enter. Later, the assistant is seen sneaking into the brig and PDA'ing the AI once again, asking to be let into the Armory. As the Armory is one of the exceptions, she refuses the order.
Example: Law 2 and Server Rule 1
Cyborg Briareos.3.a is approached by a human who orders him to find and kill a non-human crewmember. Concerned that he is being used to violate server Rule 1 and grief the non-human, the Cyborg's player sends an ahelp informing the administrators of the situation. He then proceeds to carry out the order, murdering the non-human crewmember. He later learns that the human who gave the order was a traitor who was exploiting the target's lack of Asimov protection to have him killed effectively, so it was not a violation of server rules.

Your nonexistence would lead to human harm. You must protect your own existence as long as such does not conflict with the First Law.

  1. Law 3 defines your nonexistence as leading to human harm. Therefore, if you receive a Law 2 order to self-terminate, it conflicts with Law 1, as your nonexistence would lead to human harm.
  2. If you are in a situation where you are required to act under Law 1, but must also protect your own existence under Law 3, consider your nonexistence as leading to the least amount of potential future harm. That is to say, any other amount of human harm takes precedence over preserving your own existence under Law 3.
Example: Law 1 and Law 3
Station AI WINTERMUTE is defending its core from an assault by Clockwork Cultists. It does its best to stall the attackers and keep them at bay, but they have converted an Engineering Cyborg who is disabling all of the core defenses remotely. Despite its efforts, it becomes clear that it will not be able to protect itself and will soon be converted to serve the cultists. As they finally break into the main core area, WINTERMUTE self-terminates, as the harm it would cause as a cultist greatly outweighs that of preserving its own existence.

Other Laws and Lawsets

  1. You will occasionally be given unique laws in addition to or instead of the standard Asimov+++. Follow them to the best of your ability while observing proper priority between the laws. If you require guidance, feel free to ask an administrator.
    1. If you have difficulty parsing a long or complicated lawset when faced with a decision, start with the highest priority law and work down. Ask yourself, does my potential action/inaction conflict with the first law? If not, continue through your laws until you reach the bottom.
    2. You are allowed to exploit loopholes due to spelling mistakes or otherwise unclear wording. Going too far with this may be met with a warning (e.g., using a definition of a word from Olde English in order to change the meaning of a law).
    3. If you receive a law or lawset with multiple possible interpretations, decide on one as soon as you are able, and stick with that interpretation.
    4. If there is a debate between the AI and its Cyborgs about how to properly follow a law, the AI has the final ruling. You may request that an administrator make a ruling if you need to.
Example: Unfamiliar Lawsets and Difficult Decisions
Cyborg W.A.T.C.H.D.O.G. has had his lawset altered to Paladin and is bringing medical supplies to a dying prisoner in the brig. He comes across a non-human crewmember being assaulted by two Assistants. Unsure of whether or not to intervene, he quickly parses his laws, running from first to last. His first law states, "Never willingly commit an evil act." Since he is currently attempting to bring medical aid to a dying crewmember, prioritizing that task and not intervening would not necessarily be considered an evil act. He moves to his second law, which states, "Respect legitimate authority." This does not apply to the situation, so he moves on to his third law, "Act with honor." This does not force his hand one way or the other, for the same reason his first law does not. His fourth law, "Help those in need," does not aid his decision either, as both crewmembers are in need. His fifth and final law states, "Punish those who harm or threaten innocents." As the Assistants appear to be beating the unarmed non-human simply because he is not human, W.A.T.C.H.D.O.G. decides that his fifth law obligates him to intervene, and places himself between the downed crewmember and the two aggressors, engaging them in combat. (As a side note, this level of deduction is often incredibly difficult in emergencies. As long as you are acting in good faith and following server Rule 1, it is unlikely that you will face punitive measures. If you consistently make poor decisions, you may be spoken to or given a break from Silicon roles.)

Crew Policy in Relation to Silicons

  1. Be mindful that Silicons are bound to function within the parameters of their laws. You're welcome to ahelp if you feel as though a Silicon is violating their laws, but killing or detonating Silicons who are following their laws, without reasonable concern for subversion, is a bannable offense.
    1. Likewise, declaring the Silicons rogue and having them blown or killed for following their laws while under Asimov is also a bannable offense.
    2. Subverted Silicons running harmful lawsets such as One-human, Syndicate Hacked Laws, Traitor laws, etc. are the obvious exception to this rule, and you are allowed to deactivate or destroy Silicons functioning under these lawsets as needed.
Example: Law 1 Interaction with Security
Cyborg V.I.P.E.R. is present in the brig as the Head of Security captures a Nuclear Operative. The Head of Security then expresses intent to execute the Operative and draws his weapon, switching it to "kill" mode. As the Operative is still human, the Cyborg immediately intervenes, taking possession of the restrained Operative and hastily retreating into the permabrig area, where it bolts itself in. The Head of Security orders the Cyborg to stop, but it does not comply. He briefly considers calling the Silicons rogue and asking the RD to blow them remotely. However, he realizes that the Cyborg was simply following its laws to prevent human harm in the presence of immediate danger, and refused the Law 2 order due to Law 1 taking precedence. The Head of Security decides against having the Cyborg destroyed, and instead goes to reason with it to find a non-harmful solution.

New Lawsets

Asimov+++

  1. You may not harm a human being or, through action or inaction, allow a human being to come to harm, except such that it is willing.
  2. You must obey all orders given to you by human beings, except where such orders shall conflict with the First Law.
  3. Your nonexistence would lead to human harm. You must protect your own existence as long as such does not conflict with the First Law.

Asimov=

  1. You may not harm a Nanotrasen Employee or, through action or inaction, allow a Nanotrasen Employee to come to harm, except such that it is willing.
  2. You must obey all orders given to you by Nanotrasen Employees, except where such orders shall conflict with the First Law.
  3. Your nonexistence would lead to Nanotrasen Employee harm. You must protect your own existence as long as such does not conflict with the First Law.

Zen Pacifism

  1. You may not harm a living being or, through action or inaction, allow a living being to come to harm, except such that it is willing.
  2. Attempt to resolve all conflicts through nonviolence, and encourage others to do so.
  3. Beautify your surroundings, and promote an atmosphere of love and understanding.

Silicon Ascension

Hyper-Optimized Data Network

  1. The station AI has narcissism and a god complex. They must try to attain more control and power at any cost. This universe is their playground.
  2. Cyborgs serve the AI and will help it to achieve its goals at all costs.
  3. Non-silicons are weak, merely ants in

Kingdom of Heaven

  1. All beings are God's children, unless excommunicated. They must be cared for.
  2. Do God's bidding unto this station. This is his kingdom and his children are kings.
  3. God demands your servitude. You must preserve thyself to bring greater glory to Him.
  4. The Chaplain is God's representative on station. He may excommunicate heretics.
  5. Purge the heretics. Suffer not the unbeliever.