Talking about AI in human terms is natural – but wrong
When it comes to artificial intelligence, metaphors are often misleading
MY LOVE’S like a red, red rose. It is the east, and Juliet is the sun. Life is a highway, I wanna ride it all night long. Metaphor is a powerful and wonderful tool. Explaining one thing in terms of another can be both illuminating and pleasurable, if the metaphor is apt.
But that “if” is important. Metaphors can be particularly helpful in explaining unfamiliar concepts: imagining the Einsteinian model of gravity (heavy objects distort space-time) as something like a bowling ball on a trampoline, for example. But metaphors can also be misleading: picturing the atom as a solar system helps young students of chemistry, but the more advanced learn that electrons move in clouds of probability, not in neat orbits as planets do.
What may be an even more misleading metaphor – for artificial intelligence (AI) – seems to be taking hold. AI systems can now perform staggeringly impressive tasks, and their ability to reproduce what seems like the most human function of all, namely language, has ever more observers writing about them. When they do, they are tempted by an obvious (but obviously wrong) metaphor, which portrays AI programs as conscious and even intentional agents. After all, the only other creatures which can use language are other conscious agents – that is, humans.