Lost Earth wrote:Wait, wait, wait... What are you saying? That the rights of any "being" are based on its sentience, and that if a being bearing similarity to a sentient being has sentience to a much lesser degree, that less intelligent being has fewer rights, to the extent that it may be killed if deemed necessary or convenient?
With new capacities comes new rights. It's a discrete thing, though. The capacity to feel pain implies the right to be free from unnecessary suffering. The capacities of self-awareness and self-valuation imply the right to not be killed.
Interesting... I disagree. I also have another question. Would you consider a race of "beings", or even computers, that attained a more efficient or capable intelligence than humans to have more fundamental rights than, or ethical superiority over, humans in a utilitarian system?
Only if these beings have a new capacity that induces a new right beyond the right to life.