dmaze ([personal profile] dmaze) wrote, 2006-05-27 12:48 pm

You've been reading too much Asimov when...

Say a positronic robot, programmed with the standard Three Laws, became aware that a human being had a skin condition -- no, better, something actually dangerous, like an ulcer -- that was aggravated by stress. Would that robot then not be compelled by the First Law to place that human in a minimum-stress, maximum-comfort environment?
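
If you squint, the Three Laws are just a lexicographic priority ordering over actions, which makes the logic easy to sketch. Here's a toy Python model (every name and number here is made up for illustration, not anything from Asimov): since inaction that allows stress-aggravated harm scores against the First Law, the "comfortable environment" option wins even over the human's explicit orders.

    # Toy sketch: the Three Laws as lexicographic tie-breakers over actions.
    # All scores are invented for the ulcer scenario above.
    from dataclasses import dataclass

    @dataclass
    class Action:
        name: str
        human_harm: float    # First Law: expected harm to humans (lower wins)
        disobedience: float  # Second Law: how much this defies human orders
        self_harm: float     # Third Law: risk to the robot itself

    def choose(actions: list[Action]) -> Action:
        # First Law dominates Second, Second dominates Third.
        return min(actions, key=lambda a: (a.human_harm, a.disobedience, a.self_harm))

    options = [
        Action("do nothing", human_harm=0.4, disobedience=0.0, self_harm=0.0),
        Action("enforce minimum-stress, maximum-comfort environment",
               human_harm=0.0, disobedience=1.0, self_harm=0.0),
    ]
    print(choose(options).name)
    # -> enforce minimum-stress, maximum-comfort environment

The point being: as long as "harm through inaction" carries any weight at all, the First Law term swamps everything else, orders included.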
