It seems to me that there would have to be 'shared meaning': concepts that refer to the same existential data.
"shared meaning" is a lot trickier than it seems. the complexities of the relationship between signifier and signified have been hacked over for decades, with a sort of general pessimism, i think, towards the possibility of ever sharing meaning. i'm thinking of Wittgenstein in the Philosophical Investigations, Lyotard in The Postmodern Condition and Derrida in general.
basically, i think AI researchers really ought to bone up on their French PoMo theorists before they start kicking around "shared meaning."
i think the key here, for me, is that language and communication in humans were adaptations to particular needs. a number of factors, like the complex relationships involved in primate grooming behavior and concealed ovulation, are thought to have created really complex social relationships, which in turn exerted selection pressure for an intelligence capable of arbitrating equally complex communication between humans. issues of trust and deception and hierarchy and whatnot forced us to develop the capacity to understand language about more and more abstract things.
bees don't need to understand that much about us. really, they only need to stay out of our way, for the most part, and protect their hives from us. accordingly, developing relationships between the hive and individual humans can't be that big a priority for them, since we don't play that sort of role in the life of the hive in a way that the hive intelligence can understand.
let's look at concealed ovulation. if a male can't tell when a female is ovulating, he has to be able to understand what's going on inside her head; otherwise he may be blindsided by a rival who impregnates her on the DL (among other issues). there's a pressing need for something that allows complex communication and sheds some light on what the female is thinking. language and abstract thought in humans evolved, at least in part, to fill that gap.
bees don't need to know what we're thinking, beyond recognizing broad physical cues, like sudden movements that may signal aggressive intent. hence, all the common points of reference in the world wouldn't help, because the bees have no reason to care what we think about anything.
maybe AI research should focus on guessing games: trying to predict the ways humans will answer questions.
wow, that might be really hard.
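just to make the guessing-game idea concrete, here's a toy sketch of what i mean: a stand-in "model" guesses how a human will answer a question and gets scored against the recorded answer. everything in it (the questions, the answers, the model itself) is made up for illustration, not any real system or dataset.

# toy sketch of a "guessing game": a model tries to predict how a human
# will answer a question, and we score how often it guesses right.
# the questions, answers, and "model" here are all invented for illustration.

import random

# hypothetical question/answer pairs: what a human actually said
human_answers = [
    ("is a tomato a fruit or a vegetable?", "vegetable"),
    ("would you trust a stranger who smiles a lot?", "no"),
    ("is it rude to ignore a text for a day?", "yes"),
]

def toy_model(question: str) -> str:
    """stand-in for a real predictive model: it just guesses at random."""
    guesses = ["yes", "no", "fruit", "vegetable"]
    return random.choice(guesses)

def play_guessing_game(rounds: int = 1000) -> float:
    """play many rounds and return how often the model's guess
    matched what the human actually answered."""
    hits = 0
    for _ in range(rounds):
        question, human_answer = random.choice(human_answers)
        if toy_model(question) == human_answer:
            hits += 1
    return hits / rounds

if __name__ == "__main__":
    print(f"agreement with human answers: {play_guessing_game():.1%}")

the interesting part, obviously, isn't the scoring loop; it's building a model whose guesses track human answers better than chance, which is exactly where all the "shared meaning" trouble comes back in.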