It seems hardly a day goes by without the topic of artificial intelligence being raised, whether in warnings or questions about AI’s future use and how it will change life as we know it.
For educators and students, adapting to AI is less theoretical and more immediate following the rollout of ChatGPT in November 2022. The AI-enabled writing tool was quickly put to use for schoolwork, from writing essays to taking tests. An estimated 100 million people used it in its first two months, setting records.
“ChatGPT is sort of the breakthrough moment in terms of the public’s understanding of artificial intelligence,” says Jamie Merisotis, president and CEO of Lumina Foundation, and the author of Human Work in the Age of Smart Machines.
“It’s not as if we didn’t know this was coming, but now that it’s here and people understand it, you see people grappling with the reality of what are the implications of that for K-12 education, for higher education, for workforce training, for all of the different elements of the learning ecosystem.”
ChatGPT, says Merisotis, now has educators thinking about the broader movement toward generative AI.
“I think that’s a good thing. AI is going to perform tasks that humans can’t or shouldn’t do – the speed, the repetition, the ability to analyze massive amounts of data. Generative AI tools are going to change the nature of how we learn and how we work,” he explains.
Exactly what that will look like is a work in progress.
When ChatGPT first rolled out, one major concern in schools was cheating, prompting some school districts to ban the AI tool, including New York City schools, the nation’s largest school system. In May, that changed. The city’s schools chancellor cited an initial reaction of “knee-jerk fear and risk” to generative artificial intelligence, acknowledged being caught off guard, and now vows to embrace the technology.
The Need for Policies and Guardrails
As educators explore just how to embrace it, the U.S. Department of Education is making the case for them to act now and show leadership and engagement with AI.
“We cannot afford to be passive recipients of AI-enabled tools in our schools and in our classrooms and in students’ lives and learning,” says Roberto J. Rodriguez, assistant secretary, Office of Planning, Evaluation, and Policy Development at the U.S. Department of Education.
“We have to be active and our educators need to be active in using their professional judgment, our decision makers need to be active in setting policies and responsible guardrails around how AI is used.”
While the DOE hasn’t set policies, it is recommending some and offering insights in a report focusing on AI and the future of teaching and learning.
Rodriguez explains that the report follows listening sessions conducted last summer by the DOE’s Office of Educational Technology that explored AI’s use in education. More than 700 people participated with attendees ranging from students to researchers, policymakers to parents.
The result is an acknowledgement of both opportunities and risks linked to using AI in classrooms. Some of those opportunities include using AI to better assess students’ skill development and to support students in special education classes. Among the risks: algorithms that lead to racial bias, as well as threats to the privacy and security of student data.
One key takeaway: AI won’t replace teachers. “There was a really clear sentiment that we should make sure that, as we say, humans are in the loop,” explains Rodriguez.
Freeing Up Teachers to Teach
Bird says he believes AI can’t replace the role humans play in all aspects of our society, and that includes education.
“I look at generative AI as a tool you put into your toolbox like, back in the day, chalk and/or an eraser. And then it evolved into computers. These are tools for faculty, for teachers. What we are focused on is improving the toolbox to enhance the student experience,” he explains.
“The more time that you can give to teachers to interact and utilize these tools to enhance their productivity, as it were, then the more time they can spend actually teaching and interacting and doing what we do now, which is human conversation,” adds Bird.
Mark Schneider is the director of the Institute of Education Sciences (IES), a science agency within the Department of Education. He shares an example of a project that is examining how AI could be used to ease some of the administrative burdens on teachers.
“We burden speech pathologists and all special education teachers with incredible amounts of paperwork through preparing IEPs (individualized education plans), for example.”
He adds, “There are many, many students that need help with regard to speech and language pathology, and the number of speech pathologists in schools is small and declining because there are much better opportunities in the real market. So, what happens is that students with speech and language pathologies get drive-by assessments.”
Schneider says the idea being explored is to create an AI-driven universal screening tool, with students sitting in front of a monitor doing an assessment assisted by AI. Additionally, he says, ChatGPT could be used to write initial IEPs, eliminating some of the other paperwork.
“This is the most critical part of this deal. The idea is not replacing teachers, but lessening the burden on them, so that they could do the things that only humans and only teachers could do. They could work much more closely with students. They could provide the support, the emotional support. They could provide all the kinds of teaching that teachers do, right? So we’re tapping AI to do that,” explains Schneider.
The Role of EdTech
With teachers at the forefront of using AI, Rodriguez says, they also need more support.
“Our educators need more information about how AI is evolving, how it’s used in the lives of learners. They need more capacity and support around utilizing AI as another edtech tool in helping support the success of their students,” he adds.
Rodriguez stresses that education technology, or edtech, companies, which have grown rapidly in recent years, also carry a responsibility to make sure AI is safe and effective and protects students’ privacy.
“Let’s make sure that those systems are trustworthy, they’re safe, they’re secure, they’re used equitably with students before they are deployed in classrooms, before they are deployed in the teaching and learning process,” he adds.
One effort to make sure the technology is used responsibly and equitably is The Edtech Equity Project. It provides guidance for both companies and schools to fight racial bias in education technologies.
Edtech companies are also providing teachers with tools to detect AI writing among students. Turnitin, which describes itself as an academic integrity company, is among them.
Turnitin already offers a product called the Similarity Report, which lets teachers check student submissions against other content found online, in journals, or in books. In April, that tool was extended to detect AI writing, including text generated by ChatGPT.
“So that report the teachers already use, we turned on a button that will show them whether we detect a hint of AI writing that they can then learn more about,” explains Annie Chechitelli, chief product officer at Turnitin.
She says the technology had been in the works for a while. “We had started working to understand the statistical fingerprints of AI writing and then when ChatGPT was released, it was just a really nice application that put it at the fingertips of everyone.”
Chechitelli adds: “This is all very new to us. And it’s really to help teachers start a conversation with students.”
By mid-May, Turnitin reported that of 38.5 million submissions processed for AI writing, 3.5% were flagged as containing 80% to 100% AI-generated writing. The company, however, acknowledges that it has found cases of false positives and is working to reduce those instances.
It also calls this “uncharted territory” for the education community and is providing resources for teachers on how to navigate conversations with students if AI writing is detected.
Apart from AI detection, says Chechitelli, there is the challenge of educating teachers.
“The thing that keeps me up, that I ponder on and that we want to keep improving is how we help educators get ready for this future, how we give them information, how we help them learn, how we help them think through their assessments in a new light, in a way that’s not fearful, and also in a way that’s not overly taxing,” explains Chechitelli.
AI as an Idea Generator
While there are steep challenges, the case is being made for an upside for teachers using AI technologies in the classroom.
“As an idea generator, they help you brainstorm ideas. I think it is great. I’ve heard of teachers using it to brainstorm some ideas for their lesson plans,” says Tony Wan, head of platform at Reach Capital, an early-stage venture capital firm investing in education and workforce development technologies.
While much focus has been on the question of whether AI will eliminate jobs, Wan also sees the potential to improve the day-to-day role of teachers. “I hope that if this is output that a machine can create, I’d like to think that humans will be pushed to do more valuable work or spend their time more productively,” stresses Wan.
Lumina’s Merisotis also sees upsides.
“At the end of the day, I think it will free us up to do some of the things that we are better at: human interpersonal communication, our ability to wrestle with ethical questions, our compassion, our empathy,” stresses Merisotis. He adds: “I think that the breakthrough moment is an important opportunity for us to understand what AI can do.”