Say a positronic robot, programmed with the standard Three Laws, became aware that a human being had a skin condition -- no, better, something actually dangerous, like an ulcer -- that was aggravated by stress. Would that robot then not be compelled by the First Law to place that human in a minimum-stress, maximum-comfort environment?
no subject
Date: 2006-05-27 07:11 pm (UTC)

Preventing suffering is too complicated.
no subject
Date: 2006-05-28 04:21 am (UTC)

Oddly, I don't think I've *read* this, but it's seeped into my brain anyway.