To the Jordan School District, AI is not the enemy. But they still want guardrails
With students now having easy access to chatbots that can answer math problems or write essays in seconds, school districts are grappling with how to handle generative artificial intelligence programs like the popular ChatGPT.
Utah’s Jordan School District has blocked ChatGPT for students and teachers on all school devices and networks. But the district is still exploring the possibilities of AI with a program tailored specifically to educators.
Kasey Chambers is a digital learning specialist for the Jordan School District and focuses on AI. She described SchoolAI, the program the district has been piloting, as a curated, filtered version of ChatGPT designed to help with educational tasks. It gives teachers sample queries they can input, like “brainstorm first day icebreakers” or “write a weekly email.”
Braxton Thornley, a language arts teacher and digital instructional coach at Bingham High School, has used ChatGPT to help him write sample paragraphs to illustrate writing styles to his classes. He used it from home since the site is blocked by the district.
Thornley was part of the pilot program over the summer and said he hasn’t noticed a big difference between SchoolAI and ChatGPT, since he has been using them for the same purposes. But one benefit Thornley sees in a school-focused AI program is that it addresses data security and privacy concerns, a common worry in school settings.
SchoolAI says it complies with applicable federal student privacy laws.
The district has been pleased with the pilot and has opened it to more teachers, Chambers said. It is now looking to purchase a permanent AI program, whether that’s from SchoolAI or another company.
“We’re hoping that by the end of September, into October, we have something that we can roll out and begin to train our teachers on implementing,” Chambers said.
Looking forward, Thornley said he’s excited for the potential in developing a program for students.
He would like to see a program that can give students tailored practice questions if they miss something on a quiz — or individualized feedback on an essay before turning it in.
“Or maybe you have a bunch of students in class and every student is receiving individualized instruction for a concept that kind of meets where their current level is,” Thornley said. “Things like that, where we can individualize for students, is what’s really exciting to me.”
SchoolAI does have an option where teachers can let students use the program, but the teacher decides what the chatbot can and cannot do.
“That's a feature that we're really excited about,” Chambers said. “We're hoping to get it rolling soon.”
Chambers said the district needs a program that students can use safely and appropriately. For example, if a student asks it to write an essay, the program will instead give them tips on getting started.
Chambers said in order for AI to have a positive impact, schools need to teach students what it is and how to use it productively.
If schools simply block and ignore AI, Chambers said, “I feel like students are more likely to use it to cheat, because they’ll think it’s just this solution to all of their problems.”