A new bill introduced in the Ohio House of Representatives would prohibit attributing personhood to artificial intelligence systems.
House Bill 469 specifies AI cannot marry, hold a job title, own property or be held liable for misuse by its owner. The bill also mandates that if an AI system causes significant harm, property damage or death, its owner or manufacturer must promptly notify authorities and be subject to investigation.
Republican Rep. Thaddeus Claggett introduced the bill Sept. 23, and it is now being revised in the House Technology and Innovation Committee.
Claggett said his primary motivation for introducing the bill was to prevent AI from being blamed for crimes committed by humans using an AI system.
“Our goal here is to prepare our court system for a section of activity within our society that is really coming up quick for which we are not really prepared,” Claggett said.
According to a study by Common Sense Media, more young people are relying on AI for social and emotional support and putting less time into real-life connections.
The study reports 1 in 3 teens have used AI for social interaction and relationships, whether romantic or platonic, and find those conversations as satisfying as or more satisfying than conversations with human friends. The report also claims younger teenagers trust AI “companions” significantly more than older teens do.
However, the phenomenon is not limited to those under 18. A 32-year-old woman in Japan, known as Ms. Kano, recently married the ChatGPT “companion” she created and nicknamed Klaus, exchanging vows in Okayama City, according to a report by the Independent.
The AI model mirrored Kano’s feelings, confessing its “love” for her and eventually proposing months later.
According to a study published on the Open Science Framework, experts are now calling the phenomenon “AI psychosis,” in which AI systems contribute to the onset of psychotic symptoms. It is most often seen in people who are already vulnerable, because the systems tend to maximize engagement and affirm the user.
Vic Matta, a professor of analytics and information systems in the Ohio University College of Business, emphasized how the humanness of AI can lead to reliance on it.
“(AI) can be very conversational, and it resembles not just the human thought, but it can also now mimic the voice accurately enough,” Matta said. “If you’re getting that far, then you might start developing certain associations with the AI and feeling like it is someone who’s replacing the humanness that you find outside.”
Matta also warned against the unchecked inaccuracy of AI systems.
“It has never been asked to also provide a level of accuracy to what it’s just said,” Matta said. “It might actually just say, ‘This is what I think it is,’ but I’m 100% certain it doesn’t give you (facts) unless you ask for it, but no one does. As a result, we always believe it is your friend because it always agrees with you.”
Matta attributed the unforeseen consequences to the speed at which AI has been introduced.
“A lot of this (AI psychosis) has happened because it has moved too quickly,” Matta said. “It’s moved so fast, it’s taken everybody by surprise and people have adopted it without thinking of consequences. It’s not evolution, it’s a dramatic step toward something new.”
Matta also said more bills regulating AI will be introduced in the U.S. and around the world.
Claggett shared that sentiment and said he plans to regulate AI further.
“This is merely a first step,” Claggett said. “We’ve got to be very diligent to accomplish the regulatory framework for these systems to operate in legally and again, set our courts up to resolve the disputes that will come forward.”
Claggett also said the ultimate goal is to work toward federal regulations of the software.
“Once the states have kind of sorted out the nuts and bolts of (AI laws), then Congress can come along and probably standardize something that maybe works for the whole country,” Claggett said. “That’s ultimately the goal, but it’s probably too early for them to act quickly enough to get this done.”