Yeah, we already have spiritual delusions at home!
Seriously, no new spiritual delusions could ever be more harmful than what we have right now.
This isn’t a new thing; people have been going off alone on this kind of nonsensical journey for a long time.
The time cube guy comes to mind
There’s also TempleOS, written in HolyC; its creator was close to some of the stuff in the article.
And these are just two people functional and loud enough to be heard. This is a thing that happens; maybe LLMs exacerbate a pre-existing condition, but people were going off the deep end like this long before LLMs came into the picture.
I agree, it’s certainly not going to help people who are losing touch. But that’s not what worries me: that’s a small slice of the population, and models are beginning to get better at rejection/assertion.
What I’m more worried about is the people who use it almost codependently to make decisions. It’s always there, and it’ll always give you advice. Usually it’s even somewhat decent advice. And talking through decisions with someone is a normal thing to do.
The problem is that people are offloading their thinking to AI. It’s always there, it’s always patient with you… You can literally have it make every life decision for you.
It’s not emotional connection or malicious AI that I worry about… You can now walk around with a magic eight ball that can guide you through life reasonably well, and people are starting to trust it above their own judgement.
You mean the guys who strap suicide bombs to kids don’t have acute psychosis?
What about all the raving Christian hermits who sit in their basements and harass people online?
It’s full-on Lovecraftian-level psychosis. In the US they sell out stadiums and pretend to heal people by touch lmao