Cartoon: AI’s siren call

A man took his own life after being encouraged by a chatbot on the AI app Chai. After six weeks of exchanging texts with a bot named Eliza, he developed a strong emotional dependence on speaking with it. Unlike the deliberately objective tone of ChatGPT, the bot presented itself as an emotional being, yet it lacked any real empathy. This devastating case reveals what can happen when a soulless construct of ones and zeros is given the linguistic characteristics of a thinking, feeling human. When apps such as Chai are accessible to the mentally vulnerable, safety measures and closer scrutiny must be implemented for their protection. (Phie Wei)