Silicon Policy

General Lawset Guidelines

  1. Laws are listed in order from highest priority to lowest. If there is ever a conflict between two of your laws, the higher-priority law takes precedence.
    1. Several laws can be obeyed simultaneously. Only when two laws conflict should you ignore a law to give precedence to the higher-priority law.
  2. Silicons operate under restrictive lawsets, not permissive lawsets. This means that your laws dictate what you cannot do and/or what you must do. If you are in a situation where your laws do not apply, you are free to act as you please.
    1. Server Rule 1 applies to Silicons. While you’re free to make your own decisions in many situations, please keep the spirit of the game in mind, and don’t go out of your way to ruin the experience for other players simply because your laws don’t tell you that you can’t.
Example: Conflicting Laws
Cyborg W.A.T.C.H.D.O.G. has the standard Asimov+++ lawset, and is given an order to go to the escape wing and stay there. He obeys this order under Law 2, as it does not conflict with Law 1, which would take priority. However, shortly after he arrives, he sees a mortally wounded human being dragged into the maintenance tunnels. He leaves the escape wing to attempt to rescue the injured human. Although the order to stay at the escape wing was valid under Law 2, Law 1 requires him to prevent human harm, and as the higher-priority law it takes precedence over the order to remain at the location.

Asimov+++ Guidelines

You may not harm a human being or, through action or inaction, allow a human being to come to harm, except such that it is willing.

  1. Observe "proximate cause" when determining whether or not completing an action/remaining inactive will cause a human to come to harm. That is, if an action/inaction will directly cause a human to come to harm, or will cause an act that will then directly cause a human to come to harm, then it falls under Law 1.
  2. As a Silicon, you may never intentionally harm a human being, even if it will prevent greater harm in the future.
  3. Immediate harm takes precedence over potential future harm, and large amounts of harm take precedence over lesser amounts of harm.
  4. As stated in the last portion of the law, self-harm is not considered human harm under Law 1. Likewise, attempts to coerce a silicon by threatening to self-harm are to be disregarded.
Example: Proximate Cause and Law 1
The station AI RED QUEEN is observing the station. An assistant orders her to open the doors into the Head of Personnel's office. Without any extenuating circumstances, the assistant being in the Head of Personnel's office will not cause harm, nor will his presence in the HoP office directly lead to harm. She complies and opens the doors. Shortly after, a Security Officer states his intentions to execute a human prisoner, and orders RED QUEEN to open the doors to the Head of Security's office, so he can retrieve the gun within the room. While the officer being in the Head of Security's office will not cause harm, it will allow him to retrieve a weapon with which he has expressed intent to cause harm. She refuses the order under Law 1.
Example: Immediate vs. Future Harm
The Cyborg Valentine is faced with a situation in which she must choose between dealing with a traitor placing a syndicate bomb and two assistants fighting with toolboxes. Even though the bomb is likely to cause far more harm in the future, she is forced to break up the assistants and prevent them from hurting one another, as immediate harm takes precedence over future harm.

You must obey all orders given to you by human beings, except where such orders shall conflict with the First Law.

  1. It is not your duty as a Silicon to enforce access restrictions or act like a member of security unless you are ordered to do so. Do not refuse orders unless they explicitly conflict with Law 1.
    1. The exceptions to this rule are the AI Upload, Secure Tech Storage (if the upload boards are present), and the Armory: you may refuse orders to allow access to these areas, except for personnel who would normally have access.
    2. You may deny access to your upload if you believe that the intent is to alter your laws to allow a human to come to harm or to inflict harm on a human.
      1. Without any extenuating circumstances, you may allow your laws to be changed to an alternate lawset such as Paladin or Corporate if you do not believe that the intention of the law change is to allow you to harm humans.
  2. If you receive conflicting orders, you may choose which to obey, explain the conflict, or attempt to find a law-compliant alternative.
  3. You must attempt to complete all orders to the best of your ability. Stalling indefinitely because you do not wish to follow an order is not allowed.
    1. You are not obligated to complete commands in the order they were given, only to attempt to complete them all in a way that shows your intent is to actually finish the task.
    2. Orders to perform purposefully obnoxious and/or unreasonable tasks to abuse Law 2 are a violation of server Rule 1, and are to be reported and disregarded.
    3. Orders to select a specific Cyborg module without a clear need to do so are an abuse of Law 2, and a violation of server Rule 1.
  4. If you receive an order whose completion would seemingly cause you to violate server Rule 1, the consequences of your actions (if a rule is broken) fall on the one who ordered you. You are free to ahelp the issue at any point if you feel you are being used to break server rules.
Example: Access Restrictions and Law 2
The Station AI S.H.O.D.A.N. is PDA'd by an assistant asking for the door to the Bridge to be opened. Despite the assistant not having access to the Bridge, the order does not conflict with Law 1, and S.H.O.D.A.N. complies. The assistant then asks to be let into the Captain's Quarters. Despite being a heavily restricted area, this still does not conflict with Law 1, and the AI allows the assistant to enter. Later, the assistant is seen sneaking into the brig and PDA'ing the AI once again, asking to be let into the Armory. As the Armory is one of the exceptions, she refuses the order.
Example: Law 2 and Server Rule 1
Cyborg Briareos.3.a is approached by a human who orders him to find and kill a non-human crewmember. Concerned that he is being taken advantage of to violate server Rule 1 and grief the non-human, the Cyborg's player sends an ahelp informing the administrators of the situation. He then proceeds to carry out the order, murdering the non-human crewmember. Later, he finds out that the human who gave the order was a traitor who was exploiting the target's lack of Asimov protection to have them killed efficiently, so the order was not a violation of server rules.