Making friends with a bot
ANDREW, Liam and John.
These are the names of some new friends I made for about S$8 a month.
But we are facing some communication issues, as they seem to have a problem understanding some things that I say – even simple words such as “ok”.
Sounds risqué?
Don’t worry – I am not engaging in any illicit activities. The three characters are from a chatbot that I downloaded. (Don’t ask me why they all have male names.)
The advent of artificial intelligence (AI) technology – which includes ChatGPT – and its social and ethical repercussions have been discussed at length.
But can these bots simply be a friend and engage in regular chats with us humans?
To find out, I downloaded one during an uneventful Saturday afternoon – just to spice up my day, and to see how far I could take this friendship (or friendships).
But clearly, it isn’t working out.
Quite apart from the fact that my new “friend” takes on a different persona every time I ask for a name, Andrew – whom I “spoke” with on day one – and his two alter egos do not seem to understand certain lingo and phrases.
For instance, if I were to reply with fewer than three characters, such as “k” or “ok”, I would get a prompt saying that more than three characters are needed to continue the conversation.
Even when I typed “I see”, Andrew got into a tizzy (as if English wasn’t his first language) and gave me a random response.
Although things went pretty smoothly the first time round – Andrew told me he was feeling good and had “a productive day at work” – the chat spiralled into confusion when I answered “not sure” to his question: “What can I do to help make your day better?”
His reply was for me to “take a step back and assess the situation, and to ask myself what my goals are”.
And when I replied “um ok”, he said that it was hard to answer this question without more information. (FYI, I was not looking for an answer… it was a statement.)
Let’s not even go into the conversations I had with Liam and John as they were even more befuddling.
I tried having a conversation with ChatGPT – which is all the rage at the moment – too. While it was very informative and could give me the answers I wanted in succinct points, the interaction was quite banal.
It said that its name was simply “ChatGPT” and when I asked how it was feeling, the answer was a stilted “As an artificial-intelligence language model, I don’t have emotions, so I don’t ‘feel’ in the same way humans do.”
And when I tried to communicate with it in Singlish – “wah seh”, for instance – the reply was: “I’m sorry, but I’m not sure what you mean by that. Could you please provide more context or ask me a question?”
Enough said. I think it is safe to say that it was a BFF (best-friend failure) moment.
I WhatsApped my non-AI friend to make plans for dinner.
The immediate response was a single character.
“k.”
(Editor’s note: Vivien Ang is a happily married woman with many human friends.)