How Schools Are Coaching — or Coaxing — Teachers to Use ChatGPT

"If educators can reclaim their time by using generative AI to do the mundane tasks, they can get back to doing some of the fun stuff that made them get into teaching,” says one school district director of technology.

By Olina Banerji     Aug 3, 2023

Six months after it broke the internet, ChatGPT and its numerous clones and adaptations have drummed up great interest, and plenty of concern, among teachers, school leaders and districts.

The introduction of generative AI into society shines a bright spotlight on these educators. Soon, they will have to understand it, regulate its use and also implement it in their own pedagogy.

So education leaders are investing in new training and professional development for teachers on the best use cases for AI. Most crucially, educators want to get a handle on what generative AI is so they know which skills their students will need to be proficient in as they leave school and enter the workforce.

“They were born into this technology. We weren’t. They’re going to figure it out before we figure it out,” says Tracy Daniel-Hardy, director of technology at the Gulfport School District in Mississippi. “We’ll be doing them a disservice if we don’t figure it out.”

For leaders like Daniel-Hardy, the introduction of generative AI to the teaching-learning process “feels different,” even though they’ve seen multiple waves of disruptive technology cycle in and out of classrooms. What’s unprecedented about ChatGPT and its clones is access, says Brian Stamford, program director for accountability and innovative practices for Allegheny Intermediate Unit in Pennsylvania, a regional public education body that provides services like professional development for educators in suburban Allegheny County.

“When we roll out hardware or one-to-one edtech in schools, we need to purchase laptops and carts and wireless access points. These generative AI tools work on the web, and quite a few of them are available at no or very low cost. Students and teachers in schools rich and poor will have access to these tools,” Stamford explains.

He says the second big reason why things feel different this time around is the AI’s ability to think through tasks at a pace never seen before.

“Educators may find their assignments and assessments become obsolete quickly,” he says.

The massive disruptive potential of generative AI is clearly not lost on teachers. It makes sense, then, that in a recent report published by PowerSchool, a provider of edtech solutions for schools, most educators were only “neutral” about the value that AI would bring to their classrooms.

The district-level machinery, as well as school leaders, is more hopeful that educators will see this value quickly and adopt AI tools into their teaching. The divide between administrators and teachers is even starker in a survey conducted by Clever: 49 percent of educators said they believe that AI will make their jobs more challenging, while a similar proportion of administrators, 46 percent, said they believe AI will ease the educator workload.

The skepticism comes from the fear of the unknown, Daniel-Hardy says. Some educators think students will use ChatGPT to generate work that’s not authentically theirs, she says. Or they are jaded about the way that new tech is introduced into their classrooms every five years with claims that it will drastically change the way things are taught. Every time this happens, Daniel-Hardy says, educators have to learn how to use a new tool, only for it to be replaced by something else.

But she is hopeful that generative AI will break this cycle, along with any opposition to its use.

“I do hope educators are not spending too much time listening to the naysayers, and being too cautious and nervous about using it, because that will be such a detriment to education,” Daniel-Hardy says.

Bridging that gap, and easing those fears, starts with getting educators acquainted with AI, a training need underscored by the fact that 96 percent of the 1,000 educators Clever surveyed said they have not received professional development on the topic. Schools have recognized the need, although training on generative AI specifically is still nascent.

Stamford, for instance, has created two types of seminars for teachers in the Allegheny school districts he serves: a general introduction, and a subject-specific session that brings together educators from the same discipline to share the activities they use AI tools for.

Training educators is top of mind for leaders like Daniel-Hardy, though the Gulfport district hasn’t launched any formal coaching yet.

“We have to infuse AI into the regular tech training we do,” she says.

The first few wins for teachers through the use of generative AI should be about “reclaiming time,” she adds. “If educators can reclaim their time by using generative AI to do the mundane tasks, they can get back to doing some of the fun stuff that made them get into teaching.”

Ready or Not

Whether they are coaching or coaxing educators into using generative AI tools, the trainers agree that the technology first needs to be demystified.

Steve Dembo believes this demystification should come soon, because educators don’t have the luxury of time before yet another version of ChatGPT is launched. Dembo is the director of digital innovation for Western Springs School District 101 in Illinois, and he’s created a new training module for teachers in his district.

Using AI in teaching is like learning a new skill, says Dembo, so it has to start with applying it to something familiar. For example, the use of AI “can start with a two-week lesson plan. Then we experiment with making a rubric for it,” he explains.

At each step, Dembo shows educators that they can modify the content being created. “It’s important to show them that this is a chat engine, that it is malleable. We can go back and forth with it, change three assessment points to five,” says Dembo.

Once educators are comfortable with this step, Dembo introduces a faux student essay relevant to the lesson plan to show them how the AI can grade the paper against the rubric and provide feedback, an end-to-end process that could save time and cut down on drudgery.

What some educators still can’t wrap their heads around is that they’re working with a large language model. They use ChatGPT like Google, treating each query as a fresh search, unrelated to the one before. “This honing in on doing small modifications, and then having to do it again until you get the result that you want, that's sort of a skill that needs to be modeled and demonstrated for them,” says Dembo.
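For readers curious what that back-and-forth looks like under the hood, here is a minimal sketch using the OpenAI Python client. The model name, prompts and rubric details are illustrative assumptions, not part of Dembo’s training; the point is simply that each follow-up message is appended to the same conversation history, so the model refines its earlier answer instead of starting over the way a search engine would.

```python
# Minimal sketch of iterative refinement with a chat model, assuming the
# official OpenAI Python client (pip install openai) and an API key set in
# the OPENAI_API_KEY environment variable. Model and prompts are illustrative.
from openai import OpenAI

client = OpenAI()
history = []  # the running conversation; this is what makes refinement possible

def ask(prompt: str) -> str:
    """Send a prompt along with everything said so far, and keep the reply."""
    history.append({"role": "user", "content": prompt})
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # hypothetical model choice
        messages=history,
    )
    answer = reply.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

# Each call builds on the previous answer rather than launching a new "search."
draft = ask("Draft a grading rubric with three assessment points for a "
            "two-week unit on persuasive essays.")
revised = ask("Expand that rubric from three assessment points to five.")
print(revised)
```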

Stamford, in Pennsylvania, is trying to do this by getting educators to use ChatGPT for their everyday tasks. Educators are already used to editing their inputs to other types of AI, such as voice assistants.

“I ask them to think about everyday tasks they’re struggling with, from planning a dinner with gluten-free options, to identifying problems with their cars or trucks. This tinkering gives educators insights on how they might use ChatGPT for professional uses,” he says.

Stamford has introduced a number of free, easy-to-use text generators and generative AI art tools in these workshops. He is also piloting a second type of workshop, which brings together teachers of the same subjects, giving them a chance to discuss which AI tools might be useful for teaching particular topics.

A foreign language teacher in his workshop used an AI tool to create a scene with different forms of transportation. A series of prompts like “Munich town square, bus, airplane flying overhead, train station” — all vocabulary words from a German lesson — helped the teacher create an innovative way to practice fluency in a foreign language.

In another example, educators have asked their students to generate something on ChatGPT and tracked their prompts to figure out if the students understand the content. “Teachers can actually use this as part of their assessments,” says Stamford.

In the fall, he plans to expand these hour-long workshops to full-day ones.

Stamford believes educators should pick up prompt engineering, the ability to give ChatGPT input that yields the needed result, as a bona fide skill.

Dembo disagrees.

“Just because we have a new gizmo doesn’t mean all of us have to look under the hood. Generative AI is just going to be part of the tech tools educators use in the future,” he says.

In these early phases, though, educators do need to understand how it works.

Existential Concerns

Through their training, both Dembo and Stamford are also trying to quell existential worries. In an egocentric sort of way, Dembo says, teachers are worried about the relevance of what they’re teaching and how they’re teaching it. Educators may no longer be able to get away with simply handing out assessments without explaining why learning a concept is still relevant.

Dembo says he faced that challenge too, when he taught his own computer science class in a previous role. “Students can produce decent code from ChatGPT that I won’t be able to spot any problems with,” he says.

Dembo says students in a computer science class may no longer need to master a computing language like Python, but rather know just enough to guide an AI tool to create something in that language, or enough to edit the code it produces. It changes the bar for knowledge, and in turn the bar for assessment.

It also changes what teachers expect from their students, Dembo says. “To be honest, it’s very easy to say this. But as a teacher, walking into the classroom, looking at 20 students, and trying to figure all this out is scary,” he adds.

But it's not just about being duped by ChatGPT (and students); it’s also about reevaluating how much students really need to master.

“I think students aren’t necessarily trying to be deceptive. They want to be more productive and use their time on things that they think are worthwhile,” says Daniel-Hardy, of Gulfport. Memorizing historical dates may not make the cut in the era of post-AI learning.

Beyond cheating, of course, there is another common fear: that ChatGPT will diminish students’ critical thinking abilities. If the machine is doing the thinking, then what are students really learning?

Dembo pushes back on this claim right out of the gate.

“Students are just as worried about this [critical thinking loss]. Teachers will have to be more transparent about what they’re expecting students to learn from an assignment,” says Dembo.

If it’s a routine task, like writing a five-paragraph summary, AI can easily reproduce it, so assessments will now have to be designed differently. Students, for their part, also need to know when it’s OK to use ChatGPT for learning and what constitutes cheating. Dembo says he explored some of these ethical concerns around the use of ChatGPT with students in his previous computer science classes. His former students, in grades six to eight, have now charted an “ethical use policy” that covers their whole school.

As teachers grapple with these big questions about what AI means for their profession, they need to have access to frequent training about it, Dembo says: “You need to give teachers time to experiment with it, and preferably learn in small cohorts, where they can share what they’re discovering.”
