User talk:Sweaterkittens

How should suicide laws be handled? I feel like it's never fun for a silicon to suddenly be required to press the suicide button, without any other interaction; especially if suiciding makes you unrevivable (although I'm not sure whether it does for silicons).
:Suicide laws are shitty but valid. It's not fun, but if someone uploads a hacked law that says "off yourself", your hands are tied. [[User:Sweaterkittens|Sweaterkittens]] ([[User talk:Sweaterkittens|talk]]) 08:59, 14 February 2017 (UTC)


Also related to the Law 3 example: ruling that AIs should suicide if a cultist draws near means that cultists will have to change the AI's laws first, negating any other tactic for getting to it and making the conversion mostly useless in the first place, since you'd already have law upload access at that point.
:The example was mostly to highlight the fact that you CAN do this, not that you'll necessarily get in trouble if you don't or aren't able to. Several AIs have done this, and it was decided that it was absolutely acceptable within the bounds of their laws. Yes, it's shitty to work your way into the AI chamber only for them to give you a big ol' fuck you. But if you want to convert an AI, you're going to have to work hard for both it and the AI core as a base. [[User:Sweaterkittens|Sweaterkittens]] ([[User talk:Sweaterkittens|talk]]) 08:59, 14 February 2017 (UTC)
