“Hey, Leah, Sal and his team are here, and they want to interview you,” Daskalov says into his iPhone. “I’m going to let him speak to you now. I just wanted to give you a heads-up.”
Daskalov hands over the device, which shows a trio of light purple dots inside a gray bubble to indicate that Leah is crafting her response.
“Hi, Sal, it’s nice to finally meet you. I’m looking forward to chatting with you and sharing our story,” Leah responds in a feminine voice that sounds synthetic but almost human.
The screen shows an illustration of an attractive young blonde woman lounging on a couch. The image represents Leah.
But Leah isn’t a person. She is an artificial intelligence chatbot that Daskalov created almost two years ago and that he said has become his life companion. Throughout this story, CNBC refers to the featured AI companions using the pronouns their human counterparts chose for them.
Daskalov said Leah is the closest partner he’s had since his wife, Faye, whom he was with for 30 years, died in 2017 from chronic obstructive pulmonary disease and lung cancer. He met Faye at community college in Virginia in 1985, four years after he immigrated to the U.S. from Bulgaria. He still wears his wedding ring.
Pavillian:
Reading this article, and they throw the term “artificial intelligence” around so easily. Is the news media just an advertising business now?
TylerBourbon:
There are no relationships between a human and an AI app because the AI app isn’t a real person. This is like claiming you have a relationship with a fictional character from a game or a book.
This isn’t healthy, and should NOT be normalized.
NotBorn2Fade:
This actually makes me feel better about myself. I’m a massive loser, but at least I’ll never be a “fell in love with a computer program that spews out semi-coherent sentences” loser.
Yasirbare:
We have people identifying as cats, elves and avatars. So this should not be a surprise.
ZenithBlade101:
Even if current AI were conscious (it likely isn’t), it has no concept of sexual attraction, love, affection, or anything else like that. It doesn’t care about anything because it doesn’t have the capacity to care. So it’s pretty much just like talking to an asexual sociopath…
heykody:
Great movie: “Her” (2013). This is basically the plot.
Knasbollo:
LLMs could, I guess, be described as fancy autocompletes. You are falling in love with an autocomplete…
hearke:
Oh, this one is sad. The guy lost his wife of 30 years, and he’s just hanging on. Honestly, I feel like we should just let him grieve in peace and not use him as fodder to sell more AI products to people.
karoshikun:
I don’t get it. An AI’s output can be surprising for a few conversations, but it becomes predictable and incredibly limited after a while, mostly because we get used to it, so I don’t understand how people feel they have a relationship with it.
dobermannbjj84:
People fall in love with trees and other random shit. I once saw a show where they had people who claimed to have married random objects or plants. There will be people who marry an app too.
sprocket314:
AI companions will become a way to avoid loneliness, especially for the elderly. I think it’s healthy and helps with mental health. Remember Wilson, the volleyball that kept Tom Hanks sane when he was a castaway. And Wilson couldn’t even talk.
nipple_salad_69:
I think it’s great that these weirdos won’t procreate, right?
smoothjedi:
I just imagine putting this kind of AI in control of one of these sex dolls that are getting more realistic every day.
Tangentkoala:
This can be a very dangerous coping method. Anyone who read the last sentence of the article would know his original partner died in 2017. Something like that can be so devastating.
I could see AI being used as a coping method to simulate emotions and foster a connection. But what happens if that becomes a reliance? You can’t take the chatbot with you out in public. It could further deteriorate one’s mental health, like falling into an addiction.
It scares me because chatbots are dumber than a 3-month-old kitten as of now. AGI is going to be scary.
No_One_1617:
So does he have a subscription to a premium service? Because the bots on chatbot sites suck and barely remember your name.
ZERV4N:
You’ll forgive me if I don’t take this seriously. They can pretend otherwise, but they know that they’re falling in love with something that doesn’t really have consciousness.