A quick trip to Wikipedia reveals that you're right, elene: sapience is a lot closer than sentience to what I'm trying to describe.
So am I correct in thinking that you'd afford a being rights based on its ability to enter into a social contract? That seems like a pretty good criterion, but, to play devil's advocate, what about humans who cannot do so because of severe cognitive impairment, brain injury, or the like?
This is the stance of some animal rights activists, who argue that many non-human primates demonstrate greater intelligence than some cognitively impaired humans, and thus should be afforded certain legal protections of their rights. Compare, for example, the UN's Universal Declaration of Human Rights with the Great Ape Project's Declaration on Great Apes.
Should there perhaps be some sort of multi-tiered system? If a species exhibits intelligence at level X, it receives rights A, B, and C; at level Y, all of the above plus D and E; at level Z, all of those plus F, G, and H...and so on?
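Just to make the cumulative structure I have in mind concrete, here's a toy sketch in Python. The tier names (X, Y, Z) and rights labels (A through H) are the placeholders from the paragraph above, not a real proposal for how intelligence would be measured:

```python
# Toy sketch of a cumulative, tiered rights scheme. Each tier pairs a
# minimum demonstrated intelligence level with the rights it adds.
TIERS = [
    ("X", ["A", "B", "C"]),
    ("Y", ["D", "E"]),
    ("Z", ["F", "G", "H"]),
]

def rights_for(demonstrated_tier: str) -> list[str]:
    """Return all rights accumulated up to and including the given tier."""
    granted: list[str] = []
    for tier, rights in TIERS:
        granted.extend(rights)  # higher tiers inherit everything below
        if tier == demonstrated_tier:
            return granted
    raise ValueError(f"unknown tier: {demonstrated_tier!r}")

print(rights_for("Y"))  # ['A', 'B', 'C', 'D', 'E']
```

The point of the cumulative lookup is that each higher tier inherits every right granted at the tiers below it, which is all the "plus" phrasing above really amounts to.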
Further, how would these rights be protected for something like an artificial intelligence? A lot of the protections in the declarations I mentioned rest on the notion of pain, but what happens with a system that, by design, is incapable of experiencing pain? And how would one interpret Article 4 of the UN's Declaration, which prohibits slavery, and Article 17, which preserves the right to own property, in light of a self-aware computer system?