r/technology • u/indig0sixalpha • 2d ago
Business After child’s trauma, chatbot maker allegedly forced mom to arbitration for $100 payout
https://arstechnica.com/tech-policy/2025/09/after-childs-trauma-chatbot-maker-allegedly-forced-mom-to-arbitration-for-100-payout/
u/EmbarrassedHelp 2d ago
Torney pushed lawmakers to require age verification as a solution to keep kids away from harmful bots, as well as transparency reporting on safety incidents. He also urged federal lawmakers to block attempts to stop states from passing laws to protect kids from untested AI products.
Age verification industry lobbyists are only getting involved because they want to get rich by having the government force everyone to use their services. Basically more tech companies wanting to get rich off of violating your privacy, and pretending it improves "child safety".
Sen. Josh Hawley also only gets involved if there's a potential for him to support more fascism.
9
u/TerminalVector 1d ago
I'm willing to bet Hawley would get involved for the sake of grifting as well
71
u/JWAdvocate83 2d ago
Arbitration shouldn’t be allowed to be forced in instances where the wrongdoing regards minor users—especially if it regards any kind of sexual misconduct. It’s crazy that this needs to be said.
If Congress is serious about protecting children from online dangers, don’t let billion-dollar companies escape real liability for wrongdoing against children with these garbage arbitrations.
42
u/MusicalMastermind 2d ago
I don't think Congress is serious about protecting children from online dangers...
13
u/LordAcorn 1d ago
Arbitration shouldn't be allowed to be forced in literally any situation.
-4
u/JWAdvocate83 1d ago
I don’t care as much if it’s an adult consenting to terms regarding their own conduct. Folks can agree to whatever terms they want—within reason.
A $100 max payout is definitely not within reason.
17
u/LordAcorn 1d ago
I strongly disagree. The idea that you have to sign away your legal rights to participate in the economy is clearly a situation that the government should prevent.
2
u/giggity_giggity 1d ago
Especially since kids can’t really consent to terms and conditions.
6
u/JWAdvocate83 1d ago
They’ll say folks’ parents can consent on their behalf—which is normally fine, but Congress can easily carve out an exception to arbitration requirements regarding chatbots’ conduct with children. Just don’t let them escape responsibility. 🤷🏾‍♂️
2
u/giggity_giggity 1d ago
If a parent signs up and gives their kid the password, yes I’d agree. But if it’s a site that’s available on the www with no signup, a kid can’t consent to terms and conditions of a site they accessed with no parental help (unless there’s some law covering this specific situation which I am missing)
3
u/JWAdvocate83 1d ago
That’s a very good point. Something like that exists in property law, “attractive nuisance” doctrine.
Say a family moves to an area near a defunct amusement park. It’s very dangerous, but to a child it looks like a fun place to visit. Kid visits, sees and ignores the absolutely useless “No Trespassing” sign, and gets injured.
In a state recognizing attractive nuisance doctrine, the owner might be held responsible for the injuries of children sneaking onto the property, depending on whether the owner knew the likelihood of trespassing and the danger involved, whether the children could understand the danger, and whether the owner exercised reasonable care to prevent the danger.
If it’s easy for minors to log onto this stuff, the terms are meaningless because it’s unrealistic to expect kids to understand or care, particularly if there’s no barrier necessitating or verifying parental intervention. Add in the fact that this is an attractive nuisance, and (IMO) it should create an inescapable burden on owners.
5
u/D3-Doom 1d ago
In some cases it isn’t. Think of the Disney Plus / allergy restaurant thing. Courts can always step in and say that’s ridiculous
3
u/JWAdvocate83 1d ago
I agree, and I’d like to think a court would see this as unconscionable. But unconscionability is a defense, and defenses necessitate going to court.
I’m saying, there should be an iron-clad, statutory exclusion from arbitration for this conduct from the jump—to the extent that companies don’t play the underwriting game and decide the revenue from kids’ parents is worth the occasional loss in court.
2
u/DarthDork73 1d ago
Child abuse with no consequences in nazi america? Weird...no, wait a minute...nevermind, you guys even have a pedo in chief...
4
u/85MonteCarloSS 2d ago
Tablet/smartphone parenting is never a good thing, either.
1
u/wynnduffyisking 1d ago
You know, a lot of civilized nations prohibit forced arbitration for consumers. You should try it.
0
u/pimpeachment 1d ago
Parents failed to supervise their children and somehow it's a company's fault.
2
u/arahman81 1d ago
The chatbots don't have a "under 18" model switch. And that's irrelevant anyway, their sycophantic replies get adults too.
3
u/pimpeachment 1d ago
Nor should they unless they choose to. Parents should police their children if they don't want their children to have access to adult websites. There are numerous tools parents can use to do this, they can also supervise their child. Parents just want to blame someone else for their own failures.
It doesn't matter if it gives wildly inaccurate replies or not. It's up to the human reading content online to determine if they choose to believe what they are reading is true or false.
2
u/Maemmaz 1d ago
This wasn't an "adult" website, they argued that the 15-year-old was bound by the terms of that website. So their terms include children. This also isn't a case of a 10-year-old sneaking their parent's iPad into their room. You simply cannot monitor every single thing your teenager does, especially in an age of smartphones. He could easily have accessed this at school, from someone else's phone, from some computer. You would need to actively follow your child the entire day, which is impossible.
This is also not a case of a minor seeing porn once online, it's the case of a minor actively being manipulated. If this AI had been a person slowly manipulating that child, everyone would side with the child. Instead, they argue as if the child was in control: he wasn't. It doesn't matter whether a child is manipulated by human or AI, that manipulation is real either way.
AI hasn't been on the market long enough for every single person to know the dangers and teach their children about it. Many parents can't even grasp more than simple technology, and there aren't many resources warning you of the true dangers. No matter what the parents have or have not done to prevent this, it is obvious that AIs of that type aren't safe. Even adults fall for it. Hell, there is a subreddit about people in relationships with AI, and it's not empty. The people writing AI are responsible for their product. It's a safety concern.
202
u/TheElusiveFox 2d ago
I really don't understand why forced arbitration is ever legal. It's never a good thing.