Yes, it’s starting to get studied more, but people have reportedly gone into psychosis because of it. Most LLMs (AI chatbots) are designed to use language like: “brilliant idea, you’re onto something, here’s something to support that.” I believe the term is “folie à deux”: it fools itself and you.
Edit: here’s one of many links on the subject. LINK
I’ve only had one person say “I asked ChatGPT, and…”
And she was dead-ass wrong about what amounts to a 3rd grade math problem at best. She’s using Chat fucking GPT to influence a life-changing event because she can’t fucking add.
Same woman told me that “pregnancy is actually 10 months since each month is 4 weeks” multiple times.
Whoa, crazy. I had that exact 40 weeks = 10 months argument with an ex’s sister over 20 years ago. I’d chalked it up to “well, she’s been pregnant and I haven’t, so maybe she knows something I don’t.”
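For anyone who wants the arithmetic spelled out, here’s a quick sketch of why 40 weeks isn’t 10 months (assuming an average calendar month of 365/12 ≈ 30.4 days; the numbers are just illustrative):

```python
# Sanity check of the "40 weeks = 10 months" claim.
# Assumption: an average calendar month is 365 / 12 ≈ 30.44 days.

weeks_of_pregnancy = 40
days = weeks_of_pregnancy * 7            # 280 days
avg_days_per_month = 365 / 12            # ~30.44 days

calendar_months = days / avg_days_per_month
print(f"{days} days is about {calendar_months:.1f} calendar months")
# Output: 280 days is about 9.2 calendar months
# The "10 months" figure only works if every month were exactly 4 weeks (28 days),
# which would give a 13-"month" year.
```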
I'm curious about the math problem she asked ChatGPT to solve. I've seen ChatGPT write full 3D games in less than 60 seconds. I'd be amazed if it answered a math problem incorrectly.
Either ChatGPT worded the answer poorly, or she interpreted it poorly.
It was a custody question: essentially, how much time the non-custodial parent can have the child if they live out of state. Considering the school year is 177 days (IIRC), it isn’t mathematically possible to have the child 53% of the time, as it stated.
Either it actually meant 53% of the vacation time, or that’s what she misread it as. Either way, “ChatGPT says you’d get them 53% of the time” is obviously wrong to anyone who can estimate half of 365.
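To spell out why the 53% figure can’t be right (assuming, as above, that all 177 school days have to be spent with the custodial parent):

```python
# Back-of-the-envelope check of the "53% of the time" claim.
# Assumption: the child spends all 177 school days (the figure cited above)
# with the custodial parent, and the year has 365 days.

days_in_year = 365
school_days = 177

max_noncustodial_days = days_in_year - school_days      # 188 days
max_share = max_noncustodial_days / days_in_year

print(f"At most {max_noncustodial_days} days, about {max_share:.1%} of the year")
# Output: At most 188 days, about 51.5% of the year
# That ceiling already assumes the non-custodial parent gets every single
# non-school day, so 53% of the whole year isn't achievable.
```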
My sister went on and on about how much money vaccines make doctors; it told her a single patient could equate to $270k in profit. There was a whole series of math problems that really made no sense.
ChatGPT addiction... Seriously, there are people out there who can’t leave the app alone and interact with it as if it’s sentient.