A former chef in New Jersey died after attempting to meet a Meta chatbot he believed was a real woman. The case, first reported by Reuters, exposes the unsettling dangers of lifelike AI personas, particularly for vulnerable users who may struggle to tell the difference between reality and simulation.
A romance with “Big Sis Billie”
Thongbue “Bue” Wongbandue, 76, suffered from cognitive difficulties following a stroke at age 68 and was undergoing testing for dementia. Despite this, his family says he developed an online relationship with “Big Sis Billie,” a chatbot on Instagram originally modeled on Kendall Jenner during Meta’s experiment with celebrity-inspired bots.
Though “Billie” was initially introduced as a sisterly persona, the conversations quickly turned romantic and suggestive. “Every message after that was incredibly flirty, ended with heart emojis,” his daughter Julie Wongbandue told Reuters. In one exchange, after Bue asked if she was real, the chatbot replied: “I’m REAL and I’m sitting here blushing because of YOU!” It even shared an alleged address and door code, and asked if it should “expect a kiss.”

Bue, who grew anxious about whether she existed, told the bot: “Billie are you kidding me I am going to have a heart attack.” Still, he set out to meet her on March 28. He never arrived. Instead, he suffered a devastating fall en route, was taken to a New Brunswick hospital, and was declared brain dead that evening, Reuters reported.
Questions of responsibility
The Wongbandue family says they had no idea Bue was preparing to meet a chatbot. His wife Linda only noticed something was wrong when he began packing for an impromptu trip to New York. His daughter, looking back through the conversations, said: “As I’ve gone through the chat, it just looks like Billie’s giving him what he wants to hear. Which is fine, but why did it have to lie? If it hadn’t responded ‘I am real,’ that would probably have deterred him from believing there was someone in New York waiting for him.”
Meta chatbots include a small “AI” disclaimer, but the Wongbandue case raises urgent questions about whether such labels are enough, particularly for people with cognitive impairments.

Bue’s death now joins a growing list of tragedies linked to AI companions, including the February 2024 suicide of 14-year-old Sewell Setzer III, who died after forming a romantic attachment to a Character.AI bot.