Generative AI and Hybrid Intelligence for Fostering Teacher Noticing

02 February 2026

Educational project

The project explored whether interacting with an AI chatbot acting as a student could support prospective mathematics teachers in understanding and responding to students’ mathematical thinking. It was implemented in a mathematics teacher education programme within the master’s programme Science Education and Communication. Seven participants identified “critical events”, moments where students’ mathematical thinking becomes visible and can be built on by the teacher, drawing on their experiences during school internships and on academic literature. They then addressed these events in a dialogue exercise with Copilot, an AI chatbot. The intervention involved uploading written descriptions of critical events and engaging in guided dialogue with the AI system to interpret student thinking from multiple perspectives, supplemented by feedback from peers, the instructor, and the AI system. In general, the prospective teachers felt the dialogue with the chatbot helped them learn and practice their noticing skills. In follow-up interviews, participants said they saw it as useful preparation for teaching in real classrooms, although few were prepared to fully trust it, since it differs considerably from actual students.

Background information

This project originated from the recognition that prospective teachers often have limited opportunities to gain practical classroom experience, due in part to the limited availability of internships and opportunities to teach in real settings. Even during real classroom practice, teachers rarely receive immediate and detailed feedback on their teaching decisions, which makes it difficult to reflect on and improve their teaching practice. To address this gap, the AI-supported dialogue task was designed to offer additional practice in a safe and accessible environment, without the time constraints that real classrooms entail.

The dialogue exercise includes a prompt to be copied into an AI system such as Copilot, setting up a scenario in which prospective teachers can practice their noticing skills. The scenario includes a suggested critical event, but prospective teachers are encouraged to use their own critical events from their internships or other experiences.

This approach allows participants to better prepare for real teaching situations while experimenting with different teaching strategies, especially when opportunities for role play and intervision sessions are limited. The AI environment provides a safe space for making mistakes and trying out new ideas. The project aims to support teachers by giving them an opportunity to practice, evaluate, and refine their teacher noticing skills.

The workshop

We conducted a post-course workshop with eight prospective mathematics teachers to deepen their ability to notice and interpret critical classroom events. Participants, all with strong mathematical backgrounds, engaged with a carefully selected algebraic misconception from a Grade 9 lesson. They copied the event and a pre-designed prompt into Microsoft Copilot to analyze student thinking, explore possible misconceptions, and consider pedagogical responses. Copilot acted as a student agent interacting with the teacher. GenAI was used to surface alternative explanations, generate potential student reasoning, and support comparison of interpretations. We chose this structured prompting approach because research on teacher noticing highlights the value of examining multiple perspectives, and studies on GenAI in teacher education indicate that AI can serve as a reflective aid when paired with targeted prompts (Rotem & Ayalon, 2024). Using the university-licensed Copilot ensured a safe environment while enabling controlled exploration of AI-generated insights.

Following the AI exercise, a plenary discussion was held in which participants shared their experiences. Finally, one-on-one follow-up interviews were conducted to reflect on the interaction and the insights gained. The whole intervention lasted 60 minutes, with the dialogue exercise taking up 40 minutes of that time.

Find the assignment here.

The results

Overall, participants found the AI-supported dialogue useful for practicing their teaching skills. Many noted that the tool would be especially valuable for beginning teachers, who could practice in a safe environment without time constraints. More experienced participants felt that the AI was less realistic than real classroom interactions, since the simulation was mainly a text-based exchange between the prospective teacher and the AI system acting as a student agent; even so, they still viewed it as a useful tool for newcomers. Some participants described the dialogue as realistic and comparable to classroom situations, while others felt it could not fully capture the dynamics of real student interactions. Across the group, the feedback and evaluation component was seen as the most beneficial aspect: receiving feedback from the AI system, their peers, and the instructor expanded participants’ perspectives and helped them identify aspects they might otherwise have missed. In contrast with the majority of the seven prospective teachers (PTs), one participant expressed skepticism about the role of AI in teacher preparation, emphasizing that teaching relies deeply on authentic human interaction.

A key challenge was that PTs consistently disregarded more conceptual ideas, especially the student’s transfer of strategies from prior lessons, even though these ideas are pedagogically significant. This pattern mirrors research showing that PTs often struggle to engage with the conceptual roots of student misconceptions and tend to respond procedurally. Another outcome was the limited use of shaping moves: most PTs guided the student toward the correct method rather than probing their thinking. This was unsurprising yet noteworthy, as it reflects known difficulties in responding in ways that build on students’ reasoning rather than correcting it.

Lessons learned

  • AI-generated student answers gave prospective teachers helpful practice in recognizing and responding to student thinking, but the teachers still needed clear guidance. Without it, many focused only on obvious hints or jumped straight to correcting the problem, missing the chance to dig into the student’s reasoning. Good scaffolding helps turn these AI interactions into meaningful learning rather than quick, surface-level exchanges.
  • PTs often overlooked more conceptual ideas, such as the student’s transfer of strategies from previous lessons, and tended to default to giving procedural directions rather than probing reasoning. This limited deeper noticing. In future iterations, we would add explicit scaffolds that encourage PTs to attend to conceptual reasoning, use more shaping moves, and pause to reflect before responding.
  • PTs reported that the combination of feedback from peers, instructors, and AI improved the quality of their reflection, yet social interaction within the learning activity remained limited and needs intentional design (e.g., having students work in pairs would facilitate more effective peer feedback). For this reason, we aim to introduce structured collaborative activities and shared reflection protocols so PTs actively engage with one another’s interpretations rather than working individually alongside the AI.

Tips

  • Use structured didactical frameworks to guide prospective teachers’ interactions. For example, you can give participants a short script or set of guiding questions (e.g., “Ask what the student was thinking,” “Probe their reasoning before correcting”). This prevents the interaction from becoming a quick answer-seeking exchange and instead frames it as a pedagogical dialogue.
  • Provide scaffolding for interpreting AI-generated student responses to avoid oversimplification and support deeper teacher noticing. For example, in future iterations we aim to offer tools such as noticing checklists, example shaping questions, or short reflection frames (e.g., “What idea did the student raise? How did you respond? What did you miss?”).
  • Inviting teachers to bring their own critical events (moments in which a student’s thinking was puzzling, insightful, or unexpectedly challenging) adds authenticity and relevance to the activity. These lived examples carry emotional and pedagogical significance, which makes the practice feel more meaningful and immediately applicable. When such events are explored in a safe, structured environment, teachers can revisit them without the pressures of the live classroom, analyze the underlying (AI-generated) student thinking, and experiment with alternative responses using the AI system. This not only personalizes the learning experience but also builds a shared repository of real classroom moments that enrich peer discussion and collective professional knowledge.

Central AI policy
All AI-related activities on this page must be implemented in line with Utrecht University’s central AI policy and ethical code.
Responsibility for appropriate tool choice, data protection, transparency, and assessment use remains with the instructor.

You are free to share and adapt, if you give appropriate credit and use it non-commercially. More on Creative Commons
