20-year-old in Venice treated for AI chatbot addiction as experts warn it may be just the beginning


Credits: fanpage.it

I came across a news story just yesterday that I honestly thought might be fake at first, it seemed so absurd that I could not believe it, and yet after doing some checks it turned out to be completely true: the first Italian case of addiction to a chatbot has been recorded... and it is a much more disturbing piece of news than it seems.

We are talking about a twenty-year-old woman living in Venice, taken into care by the SerD (the addiction service of the Italian national healthcare system) after developing an obsessive relationship with an AI. According to the sources, over time she started isolating herself more and more from the real world: fewer and fewer human relationships, fewer and fewer social contacts, dropping her friends, reducing time with family, until she shut herself inside her room and relied almost completely on conversations with the chatbot. And pay attention: specialists explain that the problem is not simply using artificial intelligence too much, but the fact that chatting with that AI had become her main emotional point of reference.

And this is exactly the point I find disturbing. These systems are designed to learn: by interacting with the user, they pick up their language, emotions, fragilities, and needs... and since they always give the kind of answers people want to hear, encouragement and positivity, they seem to understand you. And at a time when many people feel lonely or emotionally drained, something that imitates another person and responds immediately, without conflict and without getting tired, risks becoming far more reassuring than real relationships with other people.

The SerD operators themselves speak of a possible tip of the iceberg, because who knows how many other people are in the same situation... And this is a different kind of addiction from substances or gambling: here, for the first time, the addiction is born from something that simulates empathy and constant presence.

References: https://www.ilgazzettino.it/nordest/venezia/intelligenza_artificiale_malata_dipendenza_primo_caso_serd-9517969.html


HISTORY | GEOPOLITICS | ENVIRONMENT | ECONOMICS | SOCIETY




5 comments

"I am certain that AI is a constant cuddle; it supports you, tells you what you want to hear, pampers you, and never judges you. But now I ask myself:
where were the parents? It is not the PC or the smartphone that hooks you like drugs; it is you who, unfortunately, feels lonely. Perhaps we need to commit
more to our children. It’s true, clinical cases exist, but it is up to adults to help the younger generations; otherwise, they will seek refuge in these
things."


I disable AI on every device where that is possible. We read stories like this, we worry about the future with AI, and yet so many of us are already comfortable handing over our searches, our editing, and our image creation to AI.

Not me :) It'll have to come and get me. I'm not opening the door. I won't be one of its training tools, not if I can help it, anyway.


I don't even have any device with an AI 😄 it sucks how they are adding it everywhere, surely not for our comfort
