There is growing concern about artificial intelligence relationships showing up in schools around Connecticut. Students are using AI chatbots for companionship that can quickly spiral out of control.
Inside Ellington Middle School, school counselors say they work constantly to create a safe environment where students feel comfortable sharing what is going on in their lives.
Now they are grappling with a new and concerning issue.
“We are still figuring it out, it’s just as emerging for us as it is for parents and kids,” said Scott Raiola, a counselor in the district.
In these AI relationships, students treat chatbots like a boyfriend or girlfriend.
“It’s human nature. We are all desperate for human connection,” said Raiola.
But this is not a human connection. Raiola notes the behavior is very rare in Ellington, but it has surfaced, prompting action from the district.
The two main concerns he said are mental health and safety.
“This is opening a whole new door in terms of mental health,” Raiola said.
He notes that a real relationship with a person can’t, and shouldn’t, be replaced. He understands why students see a chatbot as a “low risk” kind of relationship, since there is no rejection at the outset. But it also isn’t real.
Experts agree.
“Having this relationship with a computer isn’t necessarily bad, but when it comes at the expense of having real human interaction, it can hurt students’ social skills,” said Marcus Pierre, a grad student at Quinnipiac University.
Pierre also runs the local nonprofit, Digital Defenders, teaching kids the dangers that lurk online, and teaching them skills to navigate an ever-changing online landscape.
He said that in his work with kids, he is seeing them use chatbots for virtually everything. On top of the mental health concerns, there are data concerns: students who share personal information without understanding the risk can be left vulnerable in the event of a breach.
Relationships are also an emerging issue.
“Kids that are exploring obviously, as they are growing, they are asking questions, and they don’t have a proper way to release it, and they speak to a chatbot about it and now this inappropriate relationship forms,” Pierre explained.
He noted the technology can quickly move into a sexual or violent direction on certain platforms, leaving kids exposed to graphic content or threatening material.
“Now it’s talking about sharing this personal information, sharing pictures, and can go all the way up to talking about hurting somebody or hurting themselves,” Pierre said.
In Ellington, Raiola said AI technology isn’t going away, so education about it is paramount, and districts will continue rolling that education out.
“Kids find new technology all the time and new technology finds them,” he said.
But he also wants parents to be aware and to educate themselves. He regularly recommends the book “The Anxious Generation” to parents and professionals.
He said the district being there to help in instances like this is critical.
“We try to give them advice, like if it was our kids what we would do,” Raiola said.