Talking about AI in human terms is natural – but wrong
When it comes to artificial intelligence, metaphors are often misleading
MY LOVE’S like a red, red rose. It is the east, and Juliet is the sun. Life is a highway, I wanna ride it all night long. Metaphor is a powerful and wonderful tool. Explaining one thing in terms of another can be both illuminating and pleasurable, if the metaphor is apt.
But that “if” is important. Metaphors can be particularly helpful in explaining unfamiliar concepts: imagining the Einsteinian model of gravity (heavy objects distort space-time) as something like a bowling ball on a trampoline, for example. But metaphors can also be misleading: picturing the atom as a solar system helps young students of chemistry, but the more advanced learn that electrons move in clouds of probability, not in neat orbits as planets do.
An even more misleading metaphor seems to be taking hold in writing about artificial intelligence (AI). AI systems can now perform staggeringly impressive tasks, and their ability to reproduce what seems like the most human faculty of all, language, has ever more observers writing about them. When they do, they are tempted by an obvious (but obviously wrong) metaphor, one that portrays AI programs as conscious and even intentional agents. After all, until now the only creatures that could use language were conscious agents: humans.