Tech giants Google, Microsoft and OpenAI have unintentionally assigned educators around the world major homework for the summer: adjusting their assignments and teaching methods to adapt to a fresh batch of AI features that students will bring into classrooms in the fall.
Educators at schools and colleges were already struggling to keep up with ChatGPT and other AI tools this academic year, but a fresh round of announcements last month by major AI companies may require even greater adjustments to preserve academic integrity and accurately assess student learning, teaching experts say.
Meanwhile, educators also have scores of new edtech products to review that promise to save them time on lesson planning and administrative tasks thanks to AI.
One of the most significant changes was OpenAI’s announcement that it would make its latest chatbot model, dubbed GPT-4o, free to anyone. Previously, only an older version of the tool, GPT-3.5, was free, and people had to pay at least $20 a month for access to the state-of-the-art model. The new model also accepts not just text but voice and visual inputs, so users can do things like share a still photo or an image of their screen with the chatbot to get feedback.
“It’s a game-changing shift,” says Marc Watkins, a lecturer of writing and rhetoric at the University of Mississippi and director of the university’s AI Summer Institute for Teachers of Writing. When educators experimented with the previous free version of ChatGPT, he says, many came away unimpressed, but the new version will be a “huge wake-up call” for how powerful the technology is.
And now that students and professors can talk to these next-generation chatbots instead of just typing to them, there’s fresh concern that the so-called “homework apocalypse” unleashed by earlier versions of ChatGPT will get worse, as professors may find it even harder to design assignments that students can’t simply have these AI bots complete for them.
“I think that’s going to really challenge what it means to be an educator this fall,” Watkins adds, noting that professors and teachers may not only need to change the kinds of assignments they give but also rethink how they deliver material, now that students can use AI tools to do things like summarize lecture videos for them.
And education appears to be an area tech companies have identified as a “killer application” of AI chatbots, a use case that helps drive adoption of the technology. Several demos last month by OpenAI, Google and other companies homed in on educational uses of their latest chatbots. And just last week OpenAI unveiled a new partnership program aimed at colleges called ChatGPT Edu.
“Both Google and OpenAI are gunning for education,” says José Bowen, a longtime higher ed leader and consultant who co-wrote a new book called “Teaching with AI.” “They see this both as a great use case and also as a tremendous market.”
Changing Classes
Tech giants aren’t the only ones changing the equation for educators.
Many smaller companies have put out tools in recent months targeted at educational uses, and they are marketing them heavily on TikTok, Instagram and other social media platforms to students and teachers.
A company called Turbolearn, for instance, has pushed out a video on TikTok titled “Why I stopped taking notes during class,” which has been viewed more than 100,000 times. In it, a young woman says that she discovered a “trick” when she was a student at Harvard University. She describes opening up the company’s tool on her laptop during class and clicking a record button. “The software will automatically use your recording to make notes, flashcards and quiz questions,” she says in the promotional video.
While the company markets this as a way to free students so they can focus on listening in class, Watkins worries that skipping notetaking will mean students will tune out and not do the work of processing what they hear in a lecture.
Now that such tools are out there, Watkins suggests that professors look for more ways to do active learning in their classes, and put more of what he calls “intentional friction” into student learning so that students are forced to stop and participate or reflect on what is being said.
“Try pausing your lecture and start having debates with your students — get into small group discussions,” he says. “Encourage students to do annotations — to read with pen or pencil or highlighter. We want to slow things down and make sure they’re pausing for a little while,” even as the advertisements for AI tools promise a way to make learning speedier and more efficient.
Slowing down is the advice that Bonni Stachowiak has for educators as well. Stachowiak, who is dean of teaching and learning at Vanguard University, points to recent advice by teaching guru James Lang to “slow walk” the use of AI in classrooms, by keeping in mind fundamental principles of teaching as educators experiment with new AI tools.
“I don’t mean resisting — I don’t think we should stick our head in the sand,” says Stachowiak. “But it’s OK to be slowly reflecting and slowly experimenting” with these new tools in classrooms, she adds. That’s especially true because keeping up with all the new AI announcements is not realistic considering all the other demands of teaching jobs.
The tools are coming fast, though.
“The maddening thing about all of this is that these tools are being deployed publicly in a grand experiment nobody asked for,” says Watkins, of the University of Mississippi. “And I know how hard it is for faculty to carve out time for anything outside of their workload.”
For that reason, he says college and school leaders need to be driving efforts to make more systematic changes in teaching and assessment. “We’re going to have to really dig in and start thinking about how we approach teaching and how students approach learning. It’s something that the entire university is going to have to think about.”
The new tools will likely mean new financial investments for schools and colleges as well.
“At some point AI is going to become the next big expense,” Bowen, the education consultant, told EdSurge.
Even though many tools are free at the moment, Bowen predicts they will end up costing colleges money at a time when budgets are already tight.
Saving Time?
Plenty of the newest AI tools for education are aimed at educators, promising to save them time.
Several new products, for instance, let teachers use AI to quickly recraft worksheets, test questions and other teaching materials for a different reading level, so that a teacher could take an article from a newspaper and have it revised so younger students can better understand it.
“They will literally rewrite your words to that audience or that purpose,” says Watkins.
Such features are in several commercial products, as well as in free AI tools — just last month, the nonprofit Khan Academy announced that it would make its AI tools for teachers free to all educators.
“There’s good and bad with these things,” Watkins adds. On a positive note, such tools could greatly assist students with learning disabilities. “But the problem is when we tested this,” he adds, “it helped those students, but it got to the point where other students said, ‘I don’t have to read anything ever again,’ because the tool could also summarize and turn any text into a series of bullet points.”
Another popular feature of new AI services is personalizing assignments by adapting educational materials to a student’s interests, says Dan Meyer, vice president of user growth at Amplify, a curriculum and assessment company, who writes a newsletter about teaching mathematics.
Meyer worries that such tools are being overhyped, and that they may have limited effectiveness in classrooms.
“You just can't take the same dull word problems that students are doing every day and change them all to be about baseball,” he says. “Kids will wind up hating baseball, not loving math.”
He summed up his view in a recent post he titled, “Generative AI is Best at Something Teachers Need Least.”
Meyer worries that many companies start with what generative AI can do and push out products based on that, rather than starting with what educators need and designing tools to address those challenges.
At the college level, Bowen sees potential wins for faculty in the near future, if, say, tools like learning management systems add AI features that can do tasks like build a course website after the instructor feeds it a syllabus. “That’s going to be a real time saver for faculty,” he predicts.
But teaching experts note that the biggest challenge will be finding ways to keep students learning while also preparing them for a workplace that seems to be rapidly adopting AI tools.
Bowen hopes that colleges can find a way to focus on teaching students the skills that make us most human, as AI takes over routine tasks in many white-collar industries.
“Maybe,” he says, “this time we’ll realize that the liberal arts really do matter.”