Q&A: How are teachers reckoning with AI in schools?

University of Washington News
A UW-led team of researchers interviewed 22 teachers about AI use. Photo: iStock

Artificial intelligence has swept into American schools, and more is sure to come. This year, both Google and Microsoft — the two biggest companies at the forefront of the AI boom — announced major investments in AI training for teachers. But what do teachers think of this transformation of their work?

Katie Davis, a University of Washington professor in the Information School and co-director of the Center for Digital Youth, studies how technology affects young people’s learning and development. Davis has also been teaching for over two decades — first as an elementary school teacher and now as a professor — so she’s acutely aware of how earlier technological revolutions in teaching have not always played out as hoped.

Davis and a UW-led team of researchers interviewed 22 teachers in Aurora Public Schools in Colorado, a district that’s investing heavily in AI through systems like Google’s Gemini and MagicSchool, an AI tool that helps teachers plan. Overall, teachers were ambivalent about the technology. They liked that it could reduce their workload, especially for rote tasks, but worried that it could erode the social aspects of teaching. The team presented its research April 15 at the Association for Computing Machinery Conference on Human Factors in Computing Systems in Barcelona.

UW News talked with Davis about the study and how ostensibly democratizing technologies can widen disparities in schools.

Why did you want to study AI adoption by schools?

Katie Davis: At least since the introduction of the radio, every new technological invention has been hyped for how it will change teaching and learning. Computers are the prototypical example. They were pushed into schools only to start collecting dust, because they didn’t really change anything. We saw it with massive open online courses, too.
Ten or 15 years ago, these courses were supposed to transform education and put colleges and universities out of business. But that hasn’t happened.

Often the hype centers on closing educational inequities, but these new technologies actually tend to aggravate existing ones. The schools serving the most affluent students have the resources to think carefully about how to incorporate technologies into their curriculum so that they support student learning goals and outcomes, whereas under-resourced schools don’t have the resources or the time to do that kind of work. So they end up incorporating technologies in ways that don’t necessarily help students learn; instead, they make things more efficient or keep track of students.

When AI started being intensely pushed into schools, I thought, here we go again. AI is here and it’s not going anywhere, so I would love for us to understand how it’s being taken up in schools and, ideally, to prevent this recurring pattern.

What did you hear from teachers about AI?

KD: Teachers expressed a deep ambivalence toward AI. It wasn’t as if any one teacher said it’s all great or it’s all terrible. I think the single strongest driver for teachers to use AI was to prevent burnout. Teachers are being asked to do more and more — not just teach, but care for students’ entire emotional, cognitive and academic lives. It really weighs on them. So a lot of them talked about turning to AI as a thought partner, to help them brainstorm lesson ideas, create assessments and differentiate lessons for different learners.

Another really big benefit for this particular school district was multilingual support. The district serves students who speak more than 160 languages. One teacher we spoke with said she had four main languages represented in her classroom but spoke only English, so she was turning to AI to help her translate materials for her students and their families so that she could communicate with them.
I think it’s really important to note that this district is going all in on AI. They’re encouraging teachers to use it and providing professional development, and teachers are talking among themselves and sharing ideas. This kind of institutional support, along with more informal teacher conversations, is also encouraging teachers to use AI and to explore how they might incorporate it into their teaching practice.

AI is often presented as a democratizing technology, but a Financial Times story recently showed that higher wage earners are using AI more than lower wage earners in the same industry — possibly increasing disparities. Are you seeing anything like that playing out in education?

KD: The way that manifests in education is in the kinds of support that students have access to. Better-resourced schools are more likely to provide some form of AI literacy instruction — to really engage students in thoughtful reflection about what AI is and how it may or may not be useful for their learning, and to get them to think about these issues in a deep way. In under-resourced schools, by contrast, the easiest thing to do is to just block AI. That’s not going to prevent students from using it, but they will end up using it in a communication vacuum, without any adult guidance. You can see how that would create disparities in how well students can use it.

I was really interested in the finding that teachers are concerned that students will know they’re using AI.

KD: That is one of the most interesting findings for me. Teachers are definitely aware that if their students think they’ve used AI, students and their parents will feel that their teachers are cheating them out of a proper education. Teachers are very worried about both students and their more AI-resistant colleagues seeing them that way. I don’t think this is unique to teachers — I feel it in university jobs, too. Many people have the perception that using AI is cheating or taking the easy way out.
But there’s another layer: Teachers are personally worried about their own authentic voice and professional identity. They’re asking, “If I am using AI, at what point am I no longer a teacher? Where’s the line between using AI as a thought partner to augment my professional practice and it replacing my professional practice?”

What are ways schools might amplify the positive parts of using AI while mitigating some of these negative effects?

KD: One of the first things is to bring AI out of the shadows and talk about it. Since we published this piece, I’ve been engaging with groups of teachers around the country in professional development experiences around AI, and they really enjoy having a community of practice. They feel that those spaces don’t necessarily exist in their schools. It’s like there’s this vacuum of communication — students don’t talk about it because they’re implicitly getting the message that it’s not OK to use it, and it’s the same with teachers.

Professional development is also very important. But a lot of professional development for teachers is just one-off PowerPoint presentations that don’t really connect to what’s going on in the classroom. Professional development needs to be done in a sustained way that meaningfully connects AI to teachers’ immediate classroom experiences.

School leaders also need to communicate AI policies, so that teachers are aware of them and understand how they apply in their specific schools. Take Washington state as an example: the Office of Superintendent of Public Instruction has a really great blueprint and guidance for using AI. But my sense is that not many teachers are aware of it, and even if they are, there hasn’t been any concerted effort to say, “OK, this is what that means in our school.” We need to be working at many levels to make sure that AI is integrated into education well.

Is there anything you want to add?
KD: Something I hold very dear as a teacher is that teaching is relational. Kids don’t learn in isolation. The CEO of Khan Academy gave a TED Talk saying the ideal vision is for every kid on the planet to have their own personal AI tutor and for every teacher to have their own personal AI teaching assistant. Maybe that would be great, but I worry that this push toward AI will erode the relationships between teachers and students. Teaching and learning are social processes. It’s not just about putting information into a student’s brain. Students learn through dialogue and through participation in cultural practices. Removing that element of learning really concerns me.

Co-authors include Aayushi Dangol, a UW doctoral student in human centered design and engineering; Smriti Kotiyal of Artech and Robert Wolfe of Rutgers University, both of whom contributed to this research as UW graduate students in the Information School; Alex J. Bowers of Columbia University; Antonio Vigil of Aurora Public Schools; Jason Yip, a UW associate professor in the Information School; Julie A. Kientz, a UW professor and chair of human centered design and engineering; Suleman Shahid of Lahore University of Management Sciences; Tom Yeh of the University of Colorado Boulder; and Vincent Cho of Boston College.

This research was supported by a Spencer Foundation Vision Grant and the AI Research Institutes program of the National Science Foundation and the Institute of Education Sciences.

For more information, contact Davis at kdavis78@uw.edu.
Original story: University of Washington News, www.washington.edu/news