A new legislative bill will require Ohio schools to adopt a policy addressing the appropriate use of artificial intelligence by students and staff.
Ohio House Bill 96, passed Aug. 20, 2025, states that all Ohio public school districts, community schools and STEM schools must adopt their own AI policy by July 1 or adopt the model policy created by the Department of Education.
The Ohio Department of Education collaborated with the Ohio AI in Education Coalition to develop the potential model policy, according to a press release.
The Ohio AI in Education Coalition encourages schools to form an AI workgroup to shape policy and implementation, establish a policy governing the use of AI and offer professional support for district personnel on how to use it.
Ohio University has established the Center for Teaching, Learning and Assessment as a professional support to outline the restrictions and possibilities for AI in the classroom, according to OU’s website.
“A one-size-fits-all approach to AI in higher education is counter-productive," the website states. “These concerns (AI implication) may provoke a desire for an institutional response that clearly defines how and when AI should be used at the university. However, we believe positions regarding AI use are better developed within specific contexts by the stakeholders most closely engaged with those domains.”
Jennifer Lisy, assistant professor of instruction for Teacher Education, is one of the CTLA Faculty Fellows at OU. Fellows need to be full-time faculty members and work with CTLA to teach and learn more about instituting AI in education, according to OU’s website.
Lisy said the Asynchronous Institute, think tanks and learning communities are offered through the program.
“The university, I think, has done a really nice job of not setting really stringent policies that are one-size-fits-all, because it's not a one-size-fits-all thing,” Lisy said. “It looks very different in coding than in chemistry, than in a history class, than in an education class.”
In Lisy’s education courses, students fill out AI use statements with certain assignments. In the statements, students explain why they chose to use or not use AI and whether they would make the same decision again. If a student used AI, they must write how it helped them, how they used it, which AI tool they used and what was changed in the assignment.
“Taking that moment to pause, and it’s better cognition is like thinking about thinking,” Lisy said. “At the end of the day, they're going off into their own classrooms on their own, so they have to figure out what works for them.”
Brian Hoyt, OU professor of management, described setting precise policies around an evolving technology as an impossibility. Hoyt said he agrees with OU allowing professors to create individual course policies in alignment with the school’s protocol.
“The learning objectives have to be connected to the knowledge and content that they have to know to use AI,” Hoyt said. “They have to have the skills and tools in place to use AI.”
AI is used in three of Hoyt’s courses. In his lower-level classes, AI is prohibited for all work except extra credit. Hoyt encourages students to learn to use AI prompts and reflect on the tool’s efficiency.
In Hoyt’s advanced classes, he allows the use of AI as a teaching tool. Hoyt uses the program Kritik to compare students’ original work with AI-assisted work.
“Students don't have to feel like they're using AI inappropriately,” Hoyt said. “They have to use it. They have to use it well, that's part of the assignment and the evaluation, but they're able to use it without thinking, ‘Oh boy, I don't want them to know that this is AI, this is my work.’ It just lays it all out, and they get to see the effectiveness of AI.”
Students must be aware of the responsibilities and consequences of using AI in school, according to Ohio’s AI in Education Coalition. One factor students and faculty should remember, as stated in the “AI Strategy" outline the coalition made in 2024, is that AI cannot replicate emotions or ethics.
“There are concerns about the ecological impacts, there are concerns about security and privacy, and those are legitimate,” Lisy said.