Transitioning from general-purpose to education-oriented Generative AI: Maintaining teacher autonomy.




This chapter proposes a teacher-centred approach for transitioning from general-purpose generative Artificial Intelligence (GenAI) to education-oriented AI that emphasises contextual relevance, pedagogical grounding, and professional autonomy. Drawing on Human-Centred Design principles and Molenaar’s Human-AI Automation Model, the chapter conceptualises two levels of teacher autonomy. It then presents insights from the life cycle of ideating and co-creating a GenAI prototype that supports instructional design and enactment, illustrating how teacher involvement can support the development of AI tools that serve pedagogical goals and real classroom needs.

Advancements in Artificial Intelligence (AI) are impacting everyday life in a variety of domains, including education, healthcare and industry. In the educational sector, the current and rapid (r)evolution of generative AI (GenAI) promises more accurate and personalised learning support, automation of some teaching routines and augmentation of teachers’ pedagogical actions. At the same time, the use of AI tools involves potential risks to human-led teaching, such as a lack of human interaction between students and teachers, threats to teacher autonomy, and data privacy and ethical issues. Against this backdrop, it is important to distinguish between general-purpose AI tools and AI tools specifically designed for education. Educational AI tools, such as intelligent tutoring systems or adaptive learning technologies, are developed with the explicit aim of supporting teaching and learning, offering functions like personalised tutoring, automated feedback, and curriculum-aligned content generation. General-purpose AI, by contrast, often encountered as GenAI-powered chatbots and conversational agents, addresses a wide range of tasks across domains. This distinction underscores that while the responsible integration of AI into education should always be approached with care, particular attention is needed when adopting general-purpose tools, as they are not designed with pedagogical goals in mind.

General-purpose AI, such as ChatGPT or Claude, has increasingly found its way into educational settings. Yet, as general-purpose tools, these systems are not designed for educational use. Significant concerns with these tools are their lack of:

1. connection with the educational context and curriculum
2. grounding in pedagogical and learning theories
3. personalisation to learners’ needs
4. support for teacher autonomy.

Connecting the characteristics of the educational context (e.g. delivery setting, learning objectives and activities, teaching and assessment methods) with the technology is critical for ensuring that teaching genuinely supports learning. These characteristics are reflected in both instructional and learning design processes. Instructional design provides systematic guidance for translating curriculum and subject knowledge into coherent learning pathways and aligning objectives, methods, and assessments. Learning design, in turn, emphasises the creation of meaningful learning experiences grounded in pedagogical theory, accounting for students’ prior knowledge, motivation, collaboration, and the multiple ways learners interact with content, teachers, and peers. However, general-purpose AI is not sensitive to differences in course structures, teaching methods, evaluation procedures, and content curricula, risking AI output that is irrelevant or confusing. For example, a history teacher may struggle to ensure that GenAI-generated explanations follow the national curriculum’s emphasis on critical source analysis, while a mathematics teacher may find that GenAI responses skip over foundational concepts needed by students who lack prerequisite knowledge. Effective learning also relies on pedagogical principles that guide how knowledge is delivered, practised, and assessed. General-purpose AI tools lack such grounding and typically cannot be configured to match the pedagogical intentions of a course (e.g. scaffolding knowledge step by step, fostering inquiry-based learning). As a result, they produce content that may seem accurate or plausible but does not necessarily promote deep understanding or meaningful learning. While GenAI tools excel at providing immediate, direct answers to user prompts, the learning process itself often benefits from strategically delayed and less direct feedback, since meaningful learning is rooted in inquiry, not just in receiving correct answers. Moreover, general-purpose AI offers limited personalisation and adaptation to learners’ individual needs. While GenAI may appear adaptive, it typically lacks genuine learner modelling, which takes account of prior knowledge or enables appropriate responses to affective, motivational, or cognitive differences among students. Finally, general-purpose AI challenges teacher autonomy over the educational triangle that connects the teacher, the learning environment, and the student. That is, by mediating interactions within this triadic relationship, AI risks shifting the balance of agency away from teachers and learners toward algorithmic systems whose operations and decision-making processes are largely opaque. This can lead to a reconfiguration of pedagogical control, where instructional decisions, feedback, and even the pacing of learning lack grounding in human interpretation, empathy, and contextual understanding. Such a shift raises critical questions about autonomy, accountability, and the preservation of human-centred education in technologically mediated learning environments. Building on the abovementioned context, we raise a critical question for research and practice: How can general-purpose AI tools be effectively transitioned into educational AI that connects to the educational context in classrooms, aligns with pedagogical theories, supports personalisation to learners’ needs and supports teachers’ autonomy?
Addressing this question is key to ensuring that AI not only generates content but also meaningfully enhances teaching and learning. Prior research has stressed the importance of Human-Centred Design (HCD) in the co-creation of technological solutions to achieve a careful consideration of instructional and learning design aspects, learning theories and teacher perspectives. These approaches actively position educational stakeholders (e.g. teachers, curriculum designers) as co-design partners to achieve a synergy between their needs, the technological innovations and the pedagogical context. Nevertheless, existing literature reviews on Human-Centred Design in AI note that its adoption in actual cases is still scarce and mainly concerns stakeholder involvement in initial brainstorming about user needs rather than actual participation in the design and development of tools. We discuss the use of Human-Centred Design to enhance teacher autonomy, i.e. the degree of professional freedom granted to teachers to make decisions about learning, assessment and the tools available to mediate these processes. Within this frame we explore how to enhance teacher autonomy via Human-Centred Design with teachers participating as:

 1. Co-design partners during the design of an AI tool (e.g. interface, tool features, infrastructure) expected to be integrated for a particular educational purpose
2. Co-orchestration partners where the teachers plan, execute and reflect on each lesson and apply the tool to their own teaching by deciding, for example, how the educational tasks can be divided between the teacher and the AI. 

This chapter introduces a teacher-centred approach to the transition from generative AI as a general-purpose tool to educational AI, and stresses the importance of maintaining connection to the educational context, pedagogical objectives and the autonomy of teachers during this transition. The chapter draws upon the Human-AI Automation Model proposed by Molenaar to examine how teachers can be meaningfully positioned as co-actors in the design and integration of GenAI technologies. Building upon this theoretical foundation and the limited evidence on Human-Centred Design in AI, we offer empirical insights into the development lifecycle of educational GenAI systems, with teachers as co-designers, following Human-Centred Design principles (i.e. gathering teachers’ initial needs, eliciting co-design requirements and resulting in co-development). We conceptualised and designed a prototype to support teacher instructional/learning design (e.g. feedback design) and classroom enactment (e.g. capturing student-generated GenAI analytics), according to teacher needs and the current limitations of GenAI tools. Our takeaways aim to highlight the key role of educators as co-design partners in ensuring that GenAI tools support pedagogical goals, classroom needs and teacher autonomy during the learning process.


Within the landscape of Technology-Enhanced Learning, researchers stress the need to adopt Human-Centred Design approaches in the design of technological innovations, so that the end products meet user needs (e.g. of teachers and students). Human-Centred Design places people’s needs, values, and rights at the core of digital design. As far as education-oriented AI is concerned, Human-Centred Design views AI-driven tools as a means of empowering learners and educators, supporting efficiency, active learning, critical thinking, and creativity. The approach also offers frameworks, such as Value Sensitive Design, that explicitly integrate human values into technology design. Yet, there are limited examples of adopting Human-Centred Design in the design of AI and GenAI solutions. Examples include studies by Holstein, McLaren, and Aleven, who positioned teachers as co-designers of a wearable AI tool to augment the monitoring of students in K-12 classrooms. Likewise, Lister et al. followed a participatory approach to design a virtual agent with visually impaired students to assist them in the context of distance learning. When it comes to general-purpose GenAI, a few studies have implemented Human-Centred Design at the initial design stage by understanding stakeholders’ perspectives in K-12 education. For instance, Han et al. interviewed primary school teachers and students, and found that GenAI could be beneficial in personalising learning experiences and providing instant feedback. The authors also indicated concerns over data authorship, the lack of critical thinking in the case of hallucinations, and students’ and teachers’ autonomy. Likewise, Hays, Jurkowski, and Sims, Kaplan-Rakowski et al., Laak and Aru and Monteiro et al. shed light on teachers’ viewpoints on the use of ChatGPT for educational purposes.
In most studies, teachers reported that ChatGPT might be beneficial for students, but they regretted their lack of control over the answers given, and had concerns about the privacy of the data, who can analyse it, and the lack of contextualisation. For instance, Prestridge, Fry, and Kim interviewed ten secondary school teachers to understand the potential added value of GenAI in their courses, and the teachers stressed the importance of considering the different course contexts to use GenAI meaningfully. These studies also discussed the need for teachers' professional development, both to provide guidance on how to use GenAI for education and as a means to address feelings of fear and of replacement by AI. The above studies provide important insights about teacher and student needs regarding the use of GenAI for teaching and learning, such as the importance of teacher control over GenAI-generated learning content. However, most studies focus only on teachers’ general perceptions about the use of ChatGPT, gathered via surveys with Likert-scale items or short-answer formats. They do not extract design guidelines for the development of education-oriented GenAI tools and do not position the stakeholders as co-partners in such development and integration processes. One exception is the study by Han et al., which limits its focus to a GenAI writing tool for primary education. Table 8.1 summarises the current state of the literature in this regard. To the best of our knowledge, across all these studies there is a lack of evidence that Human-Centred Design processes informed the design of a GenAI system based on participants’ requirements and actively involved them as co-designers.




Given the limited application of the full cycle of Human-Centred Design in general-purpose GenAI, we deem that GenAI tools designed without teachers’ pedagogical considerations may inadvertently hinder their autonomy, since teachers have to deal with AI-generated outputs that do not align with their pedagogical intentions. Indeed, prior research has indicated that while ChatGPT and other GenAI tools are nowadays used by most students in higher education, they limit teacher control during classroom practices. For instance, if a teacher proposes that students use ChatGPT to help create an essay, the teacher may not be able to monitor what the students are asking the chatbot, what feedback is given to them and/or when such prompts or answers need to be corrected, thus leaving teacher knowledge and expertise out of the learning situation. Although initially introduced to discuss how educational AI affects teacher control, the Human-AI Automation Model can also be applied to understand different levels of automation in learning situations using GenAI systems. This model articulates the transition of control between teacher and intelligent technology through six levels of automation during teaching and learning practice (Figure 8.1), i.e. from ‘teacher has full control’ (left) to ‘technology has full control’ (right). Projecting the model onto GenAI usage, at the second level (Teacher Assistance) we have cases where teachers have full control over the learning situation (e.g. teaching methods, feedback on course assignments) and GenAI tools propose additional information, explanations, examples, or text snippets for the teacher to implement in their lessons, improving teachers’ existing instructional design. At the third level of the model (Partial Automation), teachers cede partial control of specific tasks to GenAI. For instance, GenAI highlights common student errors and provides students with an overview indicating which errors they have made.
The teacher can then use this list in a class discussion of errors and elaborate on how to resolve them. At the next levels of the model, advancing towards the right side, the AI takes almost full control, largely acting independently. Examples are platforms like Synthesis Tutor, where GenAI generates content, feedback, and assessments dynamically for each student, without expecting any teacher intervention. An apparent limitation of this model is that it does not integrate the entire instructional cycle in which teachers plan, execute, reflect on, and revise their lessons. The model focuses mainly on actions performed during teaching, while it is equally important to consider teacher autonomy and control during course design and refinement. In the planning phase, critical instructional and learning design decisions are taken about sequencing, scaffolding, and alignment with learning objectives, pedagogical tasks and teaching methods. In the refinement process, teachers continually improve their lessons.




Let us consider the case of a GenAI-feedback tool: during the instructional cycle, the teacher specifies the assessment criteria, the type of feedback that should be emphasised (e.g. formative comments on argumentation rather than grammar), and the depth or timing of responses that are appropriate for their students. Based on this instructional plan, the functioning of the GenAI can be refined. Now the general-purpose AI tool is embedded in an instructional cycle, which predetermines how the GenAI tool operates. This allows for a division of labour in which the teacher retains complete control over the instructional plan and functioning of the GenAI, while the GenAI supports the teacher with high automation during the enactment of the lesson, generating feedback for student submissions in real time without the teacher’s direct intervention. In this scenario, teacher autonomy is exercised during the orchestration cycle of the instruction (in this context, orchestration refers to the general management of the learning activities). This means that the teachers can adjust the system requirements of the GenAI tool given their educational context to ensure that they continue to determine the pedagogical framing of the feedback, while automation during enactment reduces teacher workload and ensures consistency. Drawing upon these reflections, the Human-AI Automation model presented above can be extended to understand teachers’ autonomy within GenAI-powered systems with respect to design and orchestration decisions. Teacher autonomy is not only exercised through control of AI during the course enactment, but also during the course design and instructional cycle. Accordingly, we conceptualise two different levels of autonomy where teachers act as: 
1. Co-design partners during the AI tool design (e.g. voicing their needs regarding the interface, tool features, and tool infrastructure)
2. Co-orchestration partners in the instruction cycle, where they plan, execute and reflect on each lesson and apply the tool to their own teaching, deciding for example how the educational tasks will be divided between the teacher and the AI.
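To make this division of labour concrete, the GenAI-feedback scenario described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the plan structure, the field names and the templated output are our own assumptions rather than an actual system from this chapter, and a real tool would call a GenAI model where the stub indicates.

```python
# Hypothetical sketch of the GenAI-feedback scenario: the teacher fixes the
# instructional plan up front (assessment criteria, emphasis, timing), and the
# tool applies it automatically to each submission during lesson enactment.
# All names and fields below are illustrative assumptions.

FEEDBACK_PLAN = {
    "criteria": ["argumentation", "use of sources"],
    "emphasis": "formative comments on argumentation rather than grammar",
    "timing": "after_submission",  # teacher-chosen moment for showing feedback
}

def generate_feedback(submission: str, plan: dict) -> str:
    """Apply the teacher's plan to one student submission.

    A real system would send the plan and submission to a GenAI model here;
    this stub only returns a templated placeholder."""
    focus = ", ".join(plan["criteria"])
    return (f"Feedback (focus: {focus}; {plan['emphasis']}): "
            f"reviewed {len(submission.split())} words.")

print(generate_feedback("The revolution began because of rising taxation ...", FEEDBACK_PLAN))
```

The point of the sketch is the separation of concerns: the teacher retains full control over `FEEDBACK_PLAN` during the planning phase, while `generate_feedback` runs with high automation during enactment.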



Reflecting on these different levels of autonomy for teachers and the constraints of general-purpose AI tools, we aimed to explore how to support the transition from general AI technologies into education-oriented tools. The study presented in this chapter is part of a project addressing the following research question: To what extent can we enhance teacher autonomy by positioning them as co-design partners in the transition from general-purpose AI tools into educational AI? To answer the research question, we followed Human-Centred Design, giving teachers the possibility to voice their needs with respect to GenAI use, as well as their pedagogical goals and classroom practices. This emphasis on teacher input supports the development of GenAI as a pedagogical partner rather than a one-size-fits-all assistant.

This section first presents the process of understanding teachers’ and students’ needs in GenAI usage and then the ideation and low-fidelity development of a GenAI chatbot with extended functionality to foster teacher autonomy for secondary education. To guarantee that the GenAI system satisfies the learning and teaching needs, we followed a Design-based Research (DBR) methodological approach (Figure 8.2). Accordingly, we further divided the research question into the following sub-questions that were addressed in each design-based research phase (see Figure 8.2): 
1. How do teachers and students use GenAI?
2. What limitations and challenges do teachers and students face when using GenAI?
3. How do teachers perceive the requirements for GenAI tools identified from prior interviews? 
4. How do teachers perceive the added value and potential adoption of the proposed GenAI tool features in their educational contexts?


Our research design followed a qualitative phenomenological process moving from: 1) a general understanding of the problems in GenAI usage (questions 1 and 2, Figure 8.2) to 2) validation of the requirements reflecting teachers’ main needs (question 3, Figure 8.2) and 3) low-fidelity prototyping of a GenAI system satisfying those needs (question 4, Figure 8.3). During Phase 1, we conducted a set of face-to-face, semi-structured, one-to-one interviews with ten secondary school teachers and 12 students. In Phase 2, we conducted a focus group with nine teachers to reflect upon teachers’ requirements (as collected in Phase 1) for using GenAI in secondary education. These data collection approaches were chosen to help us understand in depth teachers’ and students’ needs when using GenAI. Lastly, in Phase 2 we also conducted three co-design events with eight teachers, each time working with the requirements gathered from the focus group towards the design of a low-fidelity prototype of a GenAI system (see Table 8.4). During Phase 2, teachers worked with three different scenarios for the prototype addressing different teaching moments (i.e. course design, course enactment, after-course reflection and assessment), taking into account the complexity and nature of the teaching process. A prerequisite for participating in this study was previous use of GenAI tools for formal teaching and learning purposes. Teachers often face difficulties in connecting the course learning design and their teaching needs with the desired data-driven information about the student (Mangaroska and Giannakos, 2019[33]). Thus, we employed a set of techniques to better guide our teachers in their role as co-designers and support them during this process. For example, we conducted interviews and a focus group to understand teachers’ current use of existing GenAI tools.
Additionally, we followed the ‘superpower’ approach proposed by Holstein, McLaren and Aleven, asking teachers about the ‘superpowers’ that an ideal GenAI tool could support. During the co-design events we used “speed-dating” processes and prototype simulation exercises to discuss the use of GenAI in relation to the teachers’ actual learning scenarios. The data sources employed in the interviews were recordings of the stakeholders’ views on perceived GenAI challenges, added value and actual use cases, answers to a profiling questionnaire, and artifacts (post-it notes with additional ideas). In the focus group, the data sources consisted mainly of a profiling questionnaire and the generated artifacts (post-it notes with challenges and superpowers). Content analysis of the collected data was employed using inductive coding, i.e. categories derived from participants’ answers. For instance, we extracted the following categories based on participants’ answers about their current use of GenAI:

1. GenAI for replacing current learning and teaching tasks [Replacement]
2. GenAI for complementing current learning and teaching tasks [Complementarity]
3. GenAI for supporting learning both for teachers and students [Learning].

Information about the stakeholder groups and their characteristics is presented in Annex 8.A.




In the initial interviews, the majority of the teachers (N=8) reported ChatGPT as their main GenAI tool, and two teachers reported using other tools, such as Snippet for code generation or Microsoft Copilot, which they found more accurate than ChatGPT. Half of the teachers (N=5) used these tools for course design (e.g. for outlining the course structure), and the other half during course enactment, either asking students to use them to conduct learning assignments or as an annotation tool used by the teachers to comment and reflect upon student answers. Specifically, most teachers (N=8) reported using GenAI tools for replacement purposes (see Table 8.1, category “replacement”, A and B) to conduct orchestration tasks, such as the creation of learning materials, which previously had to be done manually. Some teachers (N=3) used GenAI tools to conduct educational tasks more effectively, for example by introducing GenAI tools as learning options for students to support their learning activities (see Table 8.1, category “complementarity”, A and B). Finally, some teachers (N=3) used GenAI as a learning tool to enhance their teaching methods (see Table 8.1, category “Learning”, A and B). The teacher group thus represented a variety of views.


Most students (N=9) reported ChatGPT as the main GenAI tool used, and three of them reported the use of other tools, such as Wombo Dream and DALL-E, for generating photos and artwork; or Deep AI and Microsoft Copilot for text generation. Also, most students (N=9) used GenAI at home for assignment preparation, while only three used GenAI either at home or at school following the teacher's recommendations and guidance. The use of these tools was related to a wide variety of topics: History (N=4), Computer Science (N=2), Geography (N=1), Economics (N=1), Literature (N=1), English (N=1), Sex Education (N=1), Physics (N=1). Unlike the teachers, most students described using GenAI as a resource for gathering information for their assignments (N=8, Table 8.2 [Complementarity] C, D).

In several cases GenAI was found to act as a partner for students to test their knowledge, to provide explanations or to practice and improve their writing skills (N=4, Table 8.2 [Learning] C, D). At the same time, three students stated they used GenAI tools by copy-pasting their outputs to prepare their assignments (Table 8.2 [Replacement] C, D). In that case, there is cognitive offloading as AI is replacing the work done previously by the students themselves.




When it comes to GenAI-associated fears (which can guide the design of the new system), most of the teachers’ concerns (N=16) were related to students’ cognitive offloading and the negative impact on learning (see Table 8.3, [Cognitive Offloading] A, B). Another GenAI disadvantage was related to overreliance on the AI-generated outcome. Many teachers (N=13) were concerned about students becoming over-dependent on AI without questioning the results (see Table 8.3, [Overreliance] A, B), and about how they themselves might also rely too much on the GenAI output (N=7) (see Table 8.3, [Overreliance] C). Additionally, eight teachers connected such overreliance with the lack of quality of AI answers, given the hallucinations that an AI model can produce (see Table 8.3, [Overreliance] B). Furthermore, many teachers (N=8) were concerned about how GenAI impacts their own role and control in the teaching process. Two teachers expressed worry regarding the student-teacher relationship (see Table 8.3, [Teacher Replacement] A-C). Lastly, many teachers (N=7) were concerned about how to monitor students’ use of GenAI (see Table 8.3, [Monitoring] A-C). Reflecting the teachers’ concerns, most of the students (N=7) indeed characterised GenAI as their personal “24-hour teacher” (Table 8.3, [Teacher Replacement] D, E), and five students expressed unquestioning trust in its output (Table 8.3, [Overreliance] D). Also, a few students admitted being sceptical about how GenAI use affects their cognitive development and the evaluation of their work. Concretely, four students mentioned that the use of GenAI may hinder their learning growth because they tend to simply use the output with no further thought or work on their part (see Table 8.3, [Cognitive Offloading] C, D).




While teachers and students had identified potential benefits of GenAI tools such as their usefulness for assignment support and efficiency, the interviews indicated that when GenAI is not pedagogically oriented, it can decrease teachers’ awareness and capacity to exert control over the students’ learning progress, i.e. it can decrease teacher autonomy during the instructional cycle. A loss of autonomy can potentially impact effective and contextualised teaching, especially important in primary and secondary education. Consequently, the results helped in the identification of design requirements for GenAI systems in education to foster teacher autonomy (see Figure 8.3).


Teachers raised the need to control the GenAI output in order to enhance students’ cognitive skills (e.g. “I am afraid students have stopped thinking or brainstorming; this impacts the development of critical thinking”). Currently, students (and teachers) use general-purpose GenAI models which are unlikely to be contextualised. Thus, the envisioned tool should provide teachers with the opportunity to train the models with their own documentation, and to place some “controls” on the responses these models provide to students. For instance, teachers might increase the level of hallucinations to stimulate students’ critical thinking when using these tools, or adjust the depth and timing of the GenAI’s output. Han et al. also mentioned the need to create GenAI tools that permit teachers to maintain their agency and control through fine-tuning options in the GenAI system.
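As a rough illustration of this first requirement, such teacher-side “controls” could take the form of a configuration object that is translated into instructions for the underlying model. The class, its fields and the prompt wording below are purely hypothetical assumptions for illustration, not an existing API from the prototype.

```python
from dataclasses import dataclass, field

@dataclass
class TeacherOutputConfig:
    """Hypothetical teacher-facing controls on GenAI output (requirement R1)."""
    course_documents: list = field(default_factory=list)  # teacher's own materials for grounding
    max_answer_depth: str = "hint"      # assumed options: "hint" | "outline" | "full_solution"
    feedback_delay_s: int = 0           # strategic delay before showing feedback
    allow_direct_answers: bool = False  # if False, force inquiry-style responses

    def system_prompt(self) -> str:
        """Translate the pedagogical settings into a system prompt for a GenAI model."""
        docs = ", ".join(self.course_documents) or "none provided"
        rules = [f"Ground every answer in these course documents: {docs}."]
        if not self.allow_direct_answers:
            rules.append("Never give the final answer directly; respond with guiding questions.")
        rules.append(f"Limit answer depth to: {self.max_answer_depth}.")
        return " ".join(rules)

config = TeacherOutputConfig(course_documents=["history_unit3.pdf"], max_answer_depth="outline")
print(config.system_prompt())
```

The design choice the sketch illustrates is that pedagogical intent lives in the teacher-owned configuration, not in the model itself, so the same general-purpose model can behave differently in different courses.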




Tracking students’ interaction with the GenAI tool was an important aspect raised by the teachers, in order to assess whether these tools support learning progress or merely replace task completion (e.g. “I want to follow the students’ progress more closely using GenAI”). Students also reported bad practices of copy-pasting the GenAI output directly (e.g. “I know that this is not the proper way to proceed but I normally copy-paste the information that ChatGPT gives me”). Monitoring the student interactions with GenAI tools permits the identification of potential knowledge gaps and the shaping of pedagogically informed interventions, readjusting the learning objectives, the lesson plan and the GenAI use. Thus, the desired GenAI tool should permit teachers to follow student interactions with GenAI and provide pedagogical guidance accordingly.
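This monitoring requirement could be supported by logging each student-GenAI interaction and aggregating the log into simple analytics for the teacher. The event structure and the copy-paste flag below are assumptions for illustration; in practice such a flag might come from, say, submission-similarity checks rather than being recorded directly.

```python
from dataclasses import dataclass
from collections import Counter

@dataclass
class Interaction:
    """One hypothetical student-GenAI exchange logged by the envisioned tool."""
    student: str
    prompt: str
    copied_output: bool  # assumed to be detected, e.g. via submission similarity

def analytics(log):
    """Aggregate the log into the kind of GenAI analytics a teacher dashboard could show."""
    prompts_per_student = Counter(i.student for i in log)
    copy_paste_alerts = sorted({i.student for i in log if i.copied_output})
    return {"prompts": dict(prompts_per_student), "copy_paste_alerts": copy_paste_alerts}

log = [
    Interaction("alice", "Explain source criticism", False),
    Interaction("bob", "Write my essay introduction", True),
    Interaction("bob", "Make it longer", True),
]
report = analytics(log)
print(report)
```

Even this toy aggregation distinguishes a student who uses GenAI for inquiry from one who copy-pastes output, which is exactly the knowledge-gap signal the teachers asked to see.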


Teachers employ GenAI tools for replacement and complementarity purposes depending on the nature of different educational tasks. Therefore, there was agreement that it is desirable to let teachers define the level of autonomy they wish to retain with the tool, as proposed by Molenaar. For instance, some teachers might prefer to programme semi-automatic reactions when students overuse the GenAI system; others might prefer to get an alert; and others may not care about such issues.
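This kind of configurable autonomy could be expressed as a mapping from a teacher-chosen level to a system reaction, loosely echoing the automation levels discussed earlier. The level names and reactions below are illustrative assumptions only, not features of the actual prototype.

```python
# Hypothetical mapping from a teacher-chosen autonomy level to the system's
# reaction when a student overuses the GenAI tool (requirement R3).
LEVEL_ACTIONS = {
    "teacher_only": "log_silently",        # teacher retains full control; nothing happens
    "alert": "notify_teacher",             # teacher is alerted and decides what to do
    "semi_automatic": "pause_and_notify",  # system pauses the chatbot, teacher confirms
}

def on_overuse(level: str) -> str:
    """Return the configured reaction for the given autonomy level."""
    if level not in LEVEL_ACTIONS:
        raise ValueError(f"Unknown autonomy level: {level}")
    return LEVEL_ACTIONS[level]

print(on_overuse("alert"))
```

The teacher, not the system designer, selects the key into this mapping, which is what keeps the division of labour between teacher and AI a pedagogical decision rather than a technical default.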


Overview of the envisioned system according to the retrieved requirements: R1: Tuning the GenAI output; R2: Monitoring GenAI use; R3: Providing configurable options. During the Co-Design Phase, the teachers provide configurable options to semi-automate different educational tasks and train the GenAI model according to their course documents, curriculum, etc. During the Co-Orchestration Phase, students interact with the different GenAI models (e.g. ChatGPT models, Copilot) according to the prior teacher configurations, and teachers can monitor student interactions (i.e. GenAI analytics) and get alerts on student progress.


The design requirements derived from the interviews led to the development of a low-fidelity prototype under three different scenarios (see Table 8.4 and below). The GenAI could be fine-tuned by developers to better support classroom use while maintaining teacher autonomy. This prototype was then used in further co-design sessions with additional teachers to confirm the elicited requirements and to modify it according to their preferences and needs. Prior to these sessions, the prototypes were fed with fictional data supporting the different design requirements described above (e.g. GenAI analytics about students who asked for more exercises, or who copy-pasted the given answers into course assignments). During the sessions, teachers interacted with the prototype simulating three scenarios at different teaching moments: pre-course design, configuring and contextualising the GenAI chatbot; course monitoring and reflection on the GenAI analytics captured from the students’ interactions with the chatbot; and after-course reflection and assessment of students’ submissions and of the employed teaching methods. Afterwards, teachers were asked to complete several surveys with both closed- and open-answer questions to assess the integration of the prototype, its usefulness, its usability and its potential adoption in their regular practice. Further information about the low-fidelity prototype and one of the co-design sessions can be found in Ortega-Arranz et al. Each scenario (see Table 8.4) aimed to address a different teaching moment (i.e. course design, course enactment, after-course reflection and assessment), taking into account the complexity and nature of the teaching process. This also addressed a gap: many AI tools focus mainly on course enactment and offer limited support for teachers in course design and after-course assessment, or for automatically generating learning tasks and feedback interventions that are context-aware.










The teachers actively contributed to shaping the prototype according to real-case scenarios and their own teaching contexts. They offered valuable feedback on what felt promising as well as on areas that needed further refinement. Overall, their impressions were positive, and the collaborative process highlighted both the opportunities and challenges of co-designing GenAI for educational contexts. A key outcome of the GenAI co-design approach was that teachers wanted to remain actively present as co-orchestration partners with the GenAI tool throughout the instruction process. This co-orchestration unfolds through a deliberate division of labour over time. During the lesson planning and preparation phase, teachers set the pedagogical parameters within which the GenAI operates, defining learning objectives, instructional strategies and assessment criteria. This ensures that the system's functions remain grounded in the teacher's pedagogical judgment and contextual understanding of the learners. During the lesson enactment phase, the GenAI can execute predefined tasks, such as monitoring student progress or providing adaptive feedback, allowing teachers to redirect their attention toward more advanced pedagogical responsibilities, including facilitation of critical discussion, individualised mentoring and emotional support.




Regarding concrete findings, although we explored all three scenarios, this section focuses only on Scenario 1, which helps teachers monitor student-GenAI interactions.

The identified findings were:
 • The prototype could help teachers understand their students' moments of progress and adjust their feedback accordingly. Seven teachers mentioned that such a tool would support their effectiveness by providing insights into student progress, the methods students are applying and the moments when they struggle. It would help them know when and what type of support to provide (Table 8.5 [Insights]). Two teachers stressed the added value of personalising the support to individual student needs (Table 8.5 [Personalisation]).
 • The prototype could enhance teachers' feeling of autonomy and control over AI. Five teachers noted that the envisaged tool could foster their autonomy and increase their level of control within learning situations involving GenAI tools (Table 8.5 [Autonomy]). During the co-design sessions they proposed further ideas for GenAI analytics to support their pedagogical oversight of students, such as "an overview of cohorts of students that are and are not efficient in working with AI", "group report on students who work together using GenAI", "aspects that are changing notably about the student progress while they are using GenAI for a period. If you could get that in a file you could also share and re-use", and "a report on GenAI prompt use or progress throughout the week".
 • Providing more insights to teachers could add complexity. Four teachers weighed the tool's added value in terms of awareness of student progress against the extra complexity or workload it introduces (Table 8.5 [Complexity]). Several teachers stated that the tool might add cognitive load or time to their tasks, yet it would also offer insights that help them understand their students and stay in control over AI.
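One of the analytics teachers proposed, "an overview of cohorts of students that are and are not efficient in working with AI", could be derived from prompt logs along the lines sketched below. The notion of a "productive" prompt and the 0.5 threshold are assumptions for illustration only; nothing here reflects how the prototype actually computes such a report.

```python
# Illustrative sketch of one teacher-proposed analytic: splitting students
# into cohorts by how efficiently they work with GenAI. The "productive"
# flag and the 0.5 threshold are hypothetical assumptions.
def cohort_overview(prompt_logs, threshold=0.5):
    """Group students by the share of their prompts marked as productive."""
    cohorts = {"efficient": [], "not_efficient": []}
    for student, prompts in prompt_logs.items():
        ratio = sum(p["productive"] for p in prompts) / len(prompts)
        key = "efficient" if ratio >= threshold else "not_efficient"
        cohorts[key].append(student)
    return cohorts
```

Such a cohort view would give teachers the class-level oversight they asked for without requiring them to inspect every individual student-GenAI exchange.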

During our Design-Based Research process, we examined how teachers and students actually use GenAI, what challenges they face and how they envision an ideal GenAI tool. Their main challenges concerned overreliance on AI outputs, limited critical evaluation, and uncertainty about the appropriate use of GenAI. Teachers emphasised the importance of and need for context-sensitive, transparent and customisable GenAI tools that complement rather than replace their professional judgment. Concretely, we found that teachers use GenAI for three different purposes, but especially for replacement tasks such as lesson planning, content creation or teaching recommendations. Secondary school students use GenAI in assignments that require text generation, to get further explanations in STEM-related lessons, to get additional information on a given topic and to improve their writing skills. These results are in line with those obtained by Laak and Aru (2024[25]) regarding GenAI use cases for both teachers (e.g. recommendations on teaching methods) and students (e.g. GenAI seemed more helpful in some courses than in others). Moreover, our study suggested that in practice teachers and students desire different AI use and automation levels depending on the nature of the learning and teaching tasks. Likewise, Brandão et al. described that different activities can lead to different GenAI use; GenAI can serve to automate trivial tasks and can be used as a critical partner for cognitively demanding activities. This aligns with Cukurova et al. and Cukurova, who emphasise that AI in education should augment human capabilities rather than replace them, a finding echoed in our own results. In our interviews, teachers' concerns about GenAI focused on students' cognitive offloading of core learning processes and on GenAI overreliance that fails to develop students' critical thinking.
Similarly, prior studies, including Zhai, Wibowo and Li, highlighted that overreliance on AI can undermine students' critical thinking. Buckingham-Shum discussed the short-term productivity benefits that GenAI offers in minor tasks and how its uncritical use within learning practices may impact foundational learning skills, e.g. critical thinking. Further research is needed on how to reshape teaching and assessment processes, questioning what new skills and processes are important to foster in the AI age. At the same time, many teachers expressed concerns about the impact of GenAI on their own role in the teaching process. Such concerns have also been acknowledged in prior work that detected several barriers to the adoption of AI, including the lack of teachers' skills and technological competences, the potential loss of teaching autonomy and of teachers' agentic role within the learning process, and data ownership. In line with these fears, more than half of the students in our study reported perceiving GenAI as a 'second teacher', always available to provide instant answers and feedback. Chan and Tsi and Giannakos et al. stressed the need to foster human expertise, which encompasses students' emotional and contextual knowledge as well as pedagogical/didactical proficiency, and which is essential to provide more holistic learning interventions. Framing teachers' and students' GenAI use and concerns within the Human-AI automation model, we observed that teachers' current practices in our sample corresponded to high levels of automation, where GenAI often operates as an autonomous agent. In such cases, teachers have limited ability to control, adapt or critically mediate GenAI outputs to align with their pedagogical goals or specific classroom contexts. This dynamic positions teachers more as end-users of pre-configured GenAI tools than as decision-makers exercising their pedagogical judgment.

The evidence gathered from the prototype in Scenario 2, which allowed teachers to configure how students interact with GenAI, helped to address several of the limitations of general-purpose GenAI tools discussed above, such as the lack of educational contextualisation, personalisation, pedagogical foundation and teacher autonomy. First, teachers reported that such a tool would allow them to monitor students' progress, identify learning difficulties and adjust feedback accordingly, emphasising the value of personalisation to individual learners. Second, several teachers highlighted that the prototype enhanced their sense of autonomy and control, as it enabled them to interpret and act on data according to their pedagogical judgment, reinforcing rather than diminishing their professional autonomy. In our next steps, we plan to engage in the co-design of the GenAI tool in connection with teachers' lesson plans and classroom activities, with the aim of contextualising the tool to specific curriculum and pedagogical needs. Our study indicated that teacher autonomy is not a static condition but can be fostered through participation in both tool design and classroom co-orchestration with GenAI. We acknowledge that an important dimension of teacher autonomy also lies in teachers' decision-making actions, e.g. rejecting or overriding GenAI suggestions, such as discarding outputs that do not align with their pedagogical context. However, our analysis in this chapter centres on the structural forms of autonomy embedded in the design and orchestration of GenAI systems. When teachers are actively involved in the tool design phase, they help embed their pedagogical values, contextual knowledge and ethical considerations into the system's parameters, ensuring that AI aligns with curricular intentions and classroom realities.
When teachers act as co-orchestration partners during the instructional cycle, the foundation for a division of labour with GenAI over time is established: teachers determine the pedagogical tasks and when and how automation can happen. Consequently, these two levels create a matrix (see Figure 8.8). A GenAI system may involve low AI automation in enactment but still limit teacher autonomy if its design is predefined. Another GenAI system might support high automation during enactment yet still allow high teacher autonomy because its elements and aspects were co-designed. Within this dynamic, the role and autonomy of teachers can evolve in multiple directions. In some scenarios, GenAI systems risk replacing teachers by taking over core instructional functions such as content delivery, assessment and feedback without permitting teachers to control or monitor the GenAI output, thereby marginalising human judgment and pedagogical expertise. In other scenarios, GenAI may complement teachers by handling some instructional tasks, enabling teachers to focus on higher-order teaching activities such as critical discussion and socio-emotional support. In still other scenarios, GenAI could augment teachers, enhancing their capabilities and insights by performing tasks and generating learning insights that were previously difficult or impossible to achieve, such as real-time identification of the misconceptions of multiple individual students. This spectrum from replacement to augmentation highlights the ethical and pedagogical imperative to design and orchestrate GenAI systems that reinforce the central role of teachers in shaping meaningful learning experiences.
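The two-level matrix described above — whether a system's design is teacher-co-designed, crossed with whether its enactment is highly automated — can be sketched as a simple lookup. The quadrant labels below are our own shorthand for this illustration, not terms taken from Figure 8.8, and the mapping of quadrants onto replace/complement/augment is one possible reading rather than a fixed correspondence.

```python
# Sketch of the two-level teacher-autonomy matrix: design-time autonomy
# (co-designed or not) crossed with enactment-time AI automation.
# Quadrant labels are illustrative shorthand, not terms from Figure 8.8.
def autonomy_quadrant(co_designed: bool, high_automation: bool) -> str:
    if co_designed and high_automation:
        return "augment: high automation within teacher-shaped design"
    if co_designed and not high_automation:
        return "complement: teacher leads, AI assists with some tasks"
    if not co_designed and high_automation:
        return "replace: autonomous AI with limited teacher control"
    return "constrain: low automation, but a predefined design still limits teachers"
```

The point of the sketch is that the two axes vary independently: low automation during enactment does not by itself guarantee teacher autonomy if the design was predefined without teachers.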





The acceptance and adoption of GenAI-powered systems in formal educational activities must balance automation with teacher autonomy. Current GenAI tools rarely account for the autonomy teachers require in the earlier stages of course design. Without meaningful involvement of teachers in the design of these systems, GenAI risks reinforcing predefined models of teaching and learning that may conflict with instructional and learning design principles. This section discussed the importance of maintaining teacher autonomy during the design and use of GenAI in educational settings. Human-Centred Design and Design-Based Research offer promising approaches to address this challenge: Human-Centred Design ensures that teachers and students are engaged as active participants in the design of educational technologies, while Design-Based Research involves iterative cycles of development and testing in authentic contexts. The chapter showed how these approaches can support the transition from general-purpose GenAI to educational GenAI, highlighting the need to involve stakeholders as co-design and co-orchestration partners when developing and using such systems. We expanded the Human-AI automation model proposed by Molenaar to frame teacher autonomy in designing and using GenAI systems in education. Accordingly, we distinguish between two key forms of teacher autonomy in the context of AI-supported education. The first involves teachers acting as co-design partners in the creation of the AI tools themselves. In this role, they contribute their insights and express their needs regarding elements such as the user interface, functional features and underlying infrastructure, ensuring that the tools are purposefully aligned with educational goals. The second form positions teachers as co-orchestration partners throughout the instructional cycle.
Here, teachers plan, enact and reflect on each lesson while meaningfully integrating the AI tool into teaching practice. This level of autonomy allows them to determine how instructional tasks are distributed between themselves and the AI, shaping how the technology supports learning in their own classrooms. Our study provides tangible evidence of how teachers and students can be involved in the design of educational GenAI systems and of what kinds of systems meet their expectations. We hope it can inspire policy-makers, EdTech developers and companies.


Figure 8.8 depicts the demographics of the teachers who participated in the different sessions. Although all of them had previously used GenAI tools in their teaching practice, 63.2% of the teachers reported limited AI competence and only 36.8% felt confident enough with GenAI. Students (N=6 male, N=6 female), also from the Netherlands, were between 12-14 (N=7) and 15-17 (N=5) years old.



