Say a positronic robot, programmed with the standard Three Laws, became aware that a human being had a skin condition -- no, better, something actually dangerous, like an ulcer -- that was aggravated by stress. Would that robot then not be compelled by the First Law to place that human in a minimum-stress, maximum-comfort environment?