In this study, we drew from the broad idea that teachers are designers of learning experiences (Henriksen et al., 2020; Kirschner, 2015; Warr & Mishra, 2021), including experiences that integrate the use of technology to benefit student learning (Goodyear & Retalis, 2010). Broadly, thinking like a designer means using a “strategic approach to analyzing and finding solutions to messy real-world problems” (Henriksen & Richardson, 2017, p. 61). Teachers as designers consider goals related to student outcomes and construct learning activities that simultaneously aim to meet those goals and provide evidence that enables them to assess whether learners have met those goals after instruction (Wiggins & McTighe, 2005).
For this study, the teachers of focus were teacher educators (TEs) who prepared preservice teachers (PSTs) and whose pedagogy was a “pedagogy of teacher education” that was bound up in the “complex interplay” of teaching and learning about teaching and learning (as in Loughran, 2006, p. 3). Loughran asserted that a pedagogy of teacher education was much more than teaching procedures and practices, and that developing a pedagogy of teacher education would involve the purposeful examination, description, articulation, and portrayal of the aforementioned complex interplay.
More recent work has echoed and extended Loughran’s (2006) ideas, with one extension being that a pedagogy of teacher education “requires being perceptive of and willing to explore the complexities, nuances, tensions, and ambivalences of teaching and teacher education” and “steers clear of prescriptive orientations toward teacher educators’ practice and blueprints for action” (Vanassche et al., 2024, p. 199), such as mandating set curricula. Thus, although the core ideas of a pedagogy of teacher education proposed by Loughran remain, the manifestation of a pedagogy of teacher education is emergent (Butler & Bullock, 2024), shifting as does the complex landscape of teacher education that includes increasing the use of technology and positioning TEs as designers of instruction.
In our study, TEs prepared PSTs within methods and content courses that attended to the middle school level, representing grades six to eight, or ages 11 to 14 in the United States educational system. More specifically, we examined three science and three mathematics TEs as they designed and implemented instructional experiences for PSTs before and after the PSTs engaged in two different digital teaching simulations. These simulations aimed to help PSTs learn to facilitate argumentation-focused discussions.
Research Questions
The first research question that guided this study was as follows: How did TEs engage in instructional design around the two digital simulations, that is, (a) what were their instructional goals, (b) what was the nature and extent of the instruction they implemented, and (c) how did they evaluate the success of their instructional implementation? The second research question was as follows: What did TEs report they would do differently if given the opportunity to iterate and implement the digital simulations again in a future semester?
Literature Review
Argumentation Discussions
Learning to facilitate argumentation discussions is one of the many teaching practices that PSTs preparing to teach math or science must develop to support student learning (Kuhn, 2010; Osborne et al., 2013; Smith et al., 2008). Engaging students in argumentation discussions involves encouraging them to formulate claims, share those claims, support them with evidence-based reasoning or justification, critique others’ ideas, and possibly change their thinking during the discussion.
Prominent standards in math and science feature argumentation as a key practice for students to learn. The Next Generation Science Standards (NGSS Lead States, 2013) specify that middle school students should be able to construct “a convincing argument that supports or refutes claims for either explanations or solutions about the natural and designed world(s),” using evidence to support and refute those claims (p. 63). The Common Core State Standards for math assert that students should be able to “justify their conclusions, communicate them to others, and respond to the arguments of others” as they refer to and reason about data (National Governors Association Center for Best Practices & Council of Chief State School Officers, 2010, pp. 6-7).
Mikeska et al. (2019) identified five dimensions of facilitating argumentation-focused discussions:
- Attending to student ideas …
- Facilitating a coherent and connected discussion,
- Encouraging student-to-student interaction,
- Developing students’ conceptual understanding, and
- Engaging students in argumentation. (p. 138)
Facilitating discussions that include argumentation is challenging (Bieda, 2010; Davis et al., 2006; Simon et al., 2006). Several studies have suggested that PSTs can learn to facilitate these and similarly complex discussions in math and science by practicing them (Dotger et al., 2015; Mikeska et al., 2023a; Straub et al., 2015; Thompson et al., 2022). For example, Dotger et al. explored the instructional moves that PSTs practiced when they responded to an actor playing the role of a student who had difficulty with their math homework. Using digital simulations with student avatars, Straub et al. found that the high school science teachers in their study “significantly increased targeted teaching practices in the simulator” and those “improvements transferred into the teachers’ original classroom settings” (p. 30). Further, TEs can help PSTs learn to ask purposeful questions and prompt students to engage with one another’s ideas toward orchestrating a productive discussion (Cartier et al., 2013; Smith & Stein, 2018).
Simulations to Support PST Learning
Teacher preparation programs provide PSTs with early opportunities to practice teaching with increased support and simplified experiences (Reich, 2022). Examples of these “approximations of practice” (Grossman, 2018) include peer teaching experiences and student teaching.
Over the last decade, digital simulations have emerged as a means to approximate practice, with growing evidence that their use leads to gains in teacher knowledge and improved practice (Bondie et al., 2021; Girod & Girod, 2008; Hillaire et al., 2022; Mikeska et al., 2023a; Straub et al., 2015; Thompson et al., 2022). The digital simulation systems used in this study are Teacher Moments (TM) and what our project calls the Avatar-Based Simulation (ABS), which uses the Mursion® simulated classroom.
In TM, PSTs practice responding to students by constructing questions and other prompts (i.e., statements that are not questions that aim to elicit a response from students). In the ABS, PSTs facilitate discussions with a group of five student avatars. This is an interactive experience whereby the avatars can respond to PSTs’ questions in the moment.
TEs have increasing access to simulated approximations of practice. Some of these, like TM, are freely available, with TEs having access to develop and use this simulation without monetary cost. Others, like the Mursion® system, have an associated cost yet are available for TEs at many institutions across the country to use. A lack of knowledge of and access to these digital simulations are barriers to their incorporation within PST education; we removed these barriers in the study described here. Another challenge is how TEs can integrate these tools within their methods coursework to support PST learning. In this way, they are challenged to design around these digital simulations.
TE Integration of Simulations Into Methods Coursework
There is growing research on how TEs can design instructional experiences around face-to-face approximations of practice, especially rehearsals. Rehearsals involve a PST acting as a teacher (the “rehearsing novice”), while other PSTs act as students (Davis et al., 2017; Lampert et al., 2013). The TE is not only present, but a key facilitator of the experience. Lampert et al. identified four TE roles during rehearsals in elementary math methods classes, including the TE (a) providing feedback to direct the rehearsing novice with respect to their next move(s); (b) providing evaluative feedback about the teaching, including what was going well or not well; (c) scaffolding the rehearsal by temporarily playing the role of the teacher or a student; and (d) asking the rehearsing novice and other PSTs to reflect on teaching moves and student responses.
The TE may take on all these roles during pauses in the rehearsal or may wait until after the rehearsal to provide feedback or facilitate reflection. The work done while pausing during rehearsal was a focus of Davis et al. (2017): the secondary science TEs in their study provided feedback and facilitated discussions about student sensemaking during these pauses. Additionally, prior to the rehearsals, the TEs modeled a sensemaking discussion for their PSTs.
Work by Kumar (2022) described how the author implemented a practice-based course design in her early childhood course, which included preparing for, facilitating, and debriefing after a rehearsal to engage children in science talk about plant growth. Preparation strategies included the TE having PSTs (a) review a relevant lesson plan and consider what questions they might pose to children and (b) watch a video of a teacher interacting with children in a classroom garden and “reason about the effect of the teacher’s behavior on children’s engagement” (p. 337). Kumar used moves during the rehearsal similar to those previously described and debriefed after the rehearsal using the question, “How did your teaching moves (within the rehearsed activity) engage students in science talk?” (p. 342).
Kumar (2022) offered that a motivation for her work was that a “complete picture of how … [pedagogies of practice] can be enacted [by TEs] within a learning cycle of instructional activity involving multiple practice pedagogies has not been documented” (p. 334). We would add that little research has provided such a picture and done so in the context of digital approximations of practice.
A recent study by Mikeska et al. (2023b), however, provides some insight. These researchers explored how eight elementary TEs, four science and four math, each integrated one Mursion® simulated approximation of practice into their instruction. This digital simulation involved PSTs in the TEs’ courses, each facilitating an argumentation discussion with five student avatars in the Mursion® elementary classroom. The TEs used a wide range of pedagogical activities to (a) prepare PSTs to facilitate the discussion (e.g., by examining video exemplars) and (b) debrief after the discussion (e.g., through whole-class discussions). Mikeska et al. wrote the following:
The main implication [of the study] is that it is not the simulated teaching session in and of itself that supports PST learning, but it is the simulated teaching session coupled with strategically scaffolded and intentional pedagogical activities prior to and after the simulated teaching session that work in concert to support PSTs in learning how to enact core teaching practices. (p. 13)
This view builds from the seminal work of McDonald et al. (2013), who called for PST education to involve a cycle of collective pedagogical activity focused on building PSTs’ core teaching practices. Pedagogical activities in this cycle include those that prepare PSTs for and help them reflect after simulated or actual instruction. One outcome of this work was a descriptive taxonomy of such pedagogical activities. Mikeska et al. (2023b) used and augmented this taxonomy, originally developed for real classrooms, to account for pedagogies visible when TEs engaged PSTs with simulated classrooms.
An example of a pedagogical activity is transcript analysis, which was described in an action research study (Lottero-Perdue et al., 2022). In this study, a TE taught PSTs how to analyze transcripts to help them prepare for and reflect upon argumentation discussions they facilitated in the Mursion® simulated classroom. PSTs identified prompts in others’ transcripts that they planned to use in their own discussions and analyzed their own discussion transcripts to identify productive prompts and prompts that could be improved to support their facilitation of future argumentation discussions.
There is a need for the research community to better understand how TEs use digital approximations of practice in their courses, including how they create instructional experiences that prepare PSTs for digital simulations and then offer reflective debriefing after PSTs have engaged with those digital simulations. This need arises from two circumstances: (a) Digital simulations are becoming more widespread in use in teacher education, and advances in generative artificial intelligence (AI) are likely to make their use even more widespread; and (b) even as simulations are used more broadly, there is no extant curriculum, to our knowledge, for TEs using digital simulations and few examples of how other TEs have designed instruction around the use of digital simulations.
The pedagogical activities that surround the use of digital simulations lie nearly entirely in the hands of the TE to conceptualize, plan, and implement. Better understanding how TEs approach this design task will inform how TEs think about the use of digital tools and, more generally, how the field can best support TEs in creating, reenvisioning, and recreating activities within contexts that require adaptation and creativity at every turn.
This relates to the educational research community’s understanding of what constitutes a pedagogy of teacher education (Butler & Bullock, 2024; Loughran, 2006). This study informs this understanding and is unique in that, as described in the sections that follow, we examined the use of not only a single simulation or multiple iterations of the same simulation but rather an intentionally scaffolded pair of different simulations.
Further, our findings and implications lie on a continuum between two extremes: (a) prescribing the use of technological tools without consideration for how these tools integrate into instruction and (b) prescribing a single best approach or curriculum for incorporating these technologies into PST instruction. Between these extremes, we instead foregrounded features of the approaches the TEs took in designing instruction around the scaffolded pair of simulations.
Conceptual Framework
We framed this work around the conceptualization that teachers are designers (Henriksen et al., 2020; Kirschner, 2015; Norton & Hathaway, 2015; Warr & Mishra, 2021) and so, too, are teacher educators (Cutri & Whiting, 2018; Snow et al., 2023). Design is a complex set of processes that is relevant to and applied differently across fields including architecture, engineering, and fashion, but also culinary arts, education, science, and choreography (Daly et al., 2012). Daly et al. studied design as experienced by individuals across various disciplines and created a spectrum of approaches that ranged from design as an “evidence-based decision-making endeavor” to “design as freedom” with the ability to explore “endless” possible outcomes (p. 199). In the middle of the spectrum was design as “personal synthesis,” in which individuals blended past experiences, knowledge, and information from other sources.
We built our framework on traditions in both instructional design (Goodyear & Retalis, 2010; Henriksen et al., 2020; Wiggins & McTighe, 2005) and engineering design (Crismond & Adams, 2012; Dym et al., 2005; Petroski, 1996). As instructional designers, TEs have goals, develop instructional activities to meet those goals, and use assessment to evaluate how their PSTs learned from those activities. This approach is similar to backward design, as expressed by Wiggins and McTighe, although their model emphasized a particular order for instructional design (goals > assessments > activity design) more than we did in our project.
Further, the process of design can be applied across teaching tasks of various scales, from planning a short activity or unit project to designing a course or program. TEs must be strategic when solving complex problems in instruction, including but not limited to the integration of technology and novel approximations of practice.
For example, Cutri and Whiting (2018) described their self-study of how they as TEs developed and taught blended courses, that is, those that had online and face-to-face components, for the first time. One of the three themes from their study was their routine engagement in “iterative course design” (p. 137). Working as instructional designers, these TEs integrated technology that was initially unfamiliar to them, facilitating for their PSTs what Goodyear and Retalis (2010) called “technology-enhanced learning” (p. 1). Snow et al. (2023) identified design as a key activity of TEs and examined how TEs designed course activities, professional learning experiences, and programs related to teacher education, with the idea being that design is useful across teaching tasks of various scales.
When positioning TEs as designers, we find utility in referencing three key aspects that are central to engineering design: design as (a) a means to solve problems and address goals associated with those problems, (b) work that occurs within constraints, and (c) an iterative process (Crismond & Adams, 2012; Petroski, 1996). By participating in the Online Practice Suite project, of which this study is part (hereafter, “the larger project,” as described on the website, https://tarheels.live/onlinepracticesuite/), the TEs addressed the problem that PSTs need support to learn how to engage in key teaching competencies, with a goal of using simulations to address the problem. This problem and goal are complex for a variety of reasons, including that facilitating argumentation discussions is a challenging practice, and simulations are largely new to the work that TEs do to support teacher learning.
In engineering design, constraints must be followed. These constraints might come from science (e.g., we must account for gravity on Earth) or from logistics (e.g., money is limited). In this study, there were multiple constraints imposed by (a) the larger project (e.g., use of particular simulations); (b) the PSTs (e.g., variability among students); (c) the context (e.g., class format); and (d) the TEs (e.g., comfort with technologies).
Another ubiquitous idea in design is iteration. Designs are created, tested, compared against criteria, and revised in a recurring loop. Inherent in this iteration is an understanding of the importance of design failure, diagnostic troubleshooting, and improvement (Crismond & Adams, 2012; Lottero-Perdue & Parry, 2017; Simpson et al., 2019). Cutri and Whiting (2018) described how they iterated their instructional design attempts over multiple courses. In our study, TEs had an opportunity to reflect on their experiences, take what they learned from designing activities to support their PSTs in two cycles of digital simulations, and consider possible future use of simulations in their courses.
Method
This study was part of a larger project that gathered evidence on both (a) PST engagement, perspectives, and learning outcomes from engaging in a scaffolded series of simulations, each of which was bookended by preparation and debrief activities designed and facilitated by TEs; and (b) TEs’ instructional implementation, perspectives, and reflections with respect to those simulations. We received institutional review board approval from the project’s lead institution, ETS (IRB Study No. JamieM2020-10-01T121923).
This study focused on the second aspect of the larger project, specifically examining how TEs teaching middle school content or methods courses designed instruction to support PSTs learning with two digital simulations: Teacher Moments (TM) and the Avatar-Based Simulation (ABS). PST learning is not the focus of this study and is reported elsewhere (Howell et al., 2024; Mikeska et al., 2024). In the following sections, we describe the TE participants; the simulations and tasks; project-related constraints and supports for the TEs; and data sources, instruments, and analysis.
Participants
We recruited potential participants by posting information about the larger study on professional listservs and emailing information to TEs within project leaders’ professional networks. We selected participants based on several factors, including that (a) they taught a mathematics or science content or methods course in the fall 2022 semester, and (b) the collective group of participants we ultimately selected varied with respect to their and their institutions’ demographic characteristics.
Participating TEs agreed to (a) incorporate the digital simulations, including preparing and debriefing PSTs before and after each simulation; (b) support observation of preparation and debrief activities; (c) provide copies of relevant artifacts; (d) complete surveys and a final interview; and (e) participate in a community of practice (Table 1). This community met approximately monthly, included TEs who participated in other parts of the larger project, and provided a venue for learning about the digital simulations and sharing instructional ideas. It is similar to other TE communities of practice described in Butler and Bullock (2024), in which TEs examined an aspect of the pedagogy of teacher education (Loughran, 2006). Examples included a community of practice of doctoral students preparing to become TEs at one institution (Williams et al., 2024) and another focused on supporting current TEs as they integrated practice-based teacher education into their courses (Finkelstein et al., 2024; Grossman, 2018).
For this study, participants included three science TEs (Angela, Barry, and Christine) and three math TEs (Dawn, Erin, and Francisco); each had more than 10 years of experience. All names are pseudonyms.
Table 1
Teacher Educator Participant Demographics
| Pseudonym | Gender | Race/Ethnicity [a] | Discipline | University Setting | Prior Approximation Use | Prior Digital Simulated Approximation Use |
|---|---|---|---|---|---|---|
| Angela | Female | White/Caucasian | Science | Rural | Often | Rarely |
| Barry | Male | White/Caucasian | Science | Urban | Sometimes | Never |
| Christine | Female | White/Caucasian | Science | Suburban | Rarely | Never |
| Dawn | Female | White/Caucasian | Math | Suburban | Rarely | Sometimes |
| Erin | Female | White/Caucasian | Math | Suburban | Often | Never |
| Francisco | Male | Hispanic/Latino | Math | Rural | Often | Pilot [b] |
*Note.* [a] Participants did not select other descriptors for race/ethnicity but had the option to do so. [b] Francisco had not used digital simulations prior to the larger project of which this study is a part. One year prior to the semester of focus for this study, Francisco participated as a pilot teacher educator and used Teacher Moments and the Avatar-Based Simulation one time each.
Digital Simulations and Tasks
Teacher Moments (TM)
TM is a free digital simulation that enables PSTs to rehearse discrete challenging moments in teaching (Reich, 2022). It immerses PSTs in “vignettes of professional practice through video, images, and text, and they are called upon to improvisationally make difficult decisions through recorded audio and text” (Hillaire et al., 2022, p. 212). The vignette and prompts together constitute what we call a task.
We crafted two middle school tasks, one for math and one for science (Online Practice Suite, 2021c, 2021d). In each vignette, two students had different arguments (about a math problem or science investigation) and needed prompting by the PST to engage with one another’s ideas. In the science TM task, “Keeping the Heat,” two students, Victor and Rosa, each responded to a question about whether a paper or foam cup did a better job of keeping hot chocolate warm and why. The pair was asked to share and critique one another’s arguments. See Appendix A for a summary of the Keeping the Heat task.
The TM session began by presenting the vignette and proceeded in two rounds during which the PST was asked to provide multiple questions or other prompts (collectively and hereafter, “prompts”) to encourage the students to share and critique one another’s arguments. PSTs typed these prompts into the TM online interface as if they were speaking directly to students.
In the second round, the PST repeated this process after reviewing two to four teaching suggestions. An example of a suggestion is as follows: “Victor could be asked to critique Rosa’s ideas about heat being a substance itself rather than being the property of a substance.” The TM session was untimed, but PSTs typically completed it in 20 to 30 minutes. TEs could ask PSTs to engage in TM in or outside of class. TEs had access to responses from all PSTs to use for instructional purposes.
ABS Using Mursion
The ABS is a digital simulation that uses Mursion® technology that enables PSTs to practice facilitating discussions with a group of five avatars through the Zoom online meeting application (Mursion, 2024; Straub, 2018). It is a mixed-reality simulation in that the avatars represent a combination of “virtual and real where the virtual environment presents a detailed digital context with realistic avatars controlled by a human puppeteer” (Bondie et al., 2021, p. 108). We call this puppeteer a simulation specialist or sim.
When a PST facilitates a discussion with the avatars, the sim can see and hear the PST and respond in real time to the PST. See Figure 1 for a representation of a PST engaging in TM and, separately, ABS. The figure aims to depict that while both involve the use of a computer interface, TM is text-based, and ABS involves engaging with avatars.
Sim training is essential to enable sims to act appropriately as the student avatars (Bondie et al., 2021). The sim, often having prior experience as an actor, receives extensive training to operate the Mursion® software and affiliated tools and to accurately portray the students involved in the project’s math and science tasks. Project training prepares sims to respond as the avatars in consistent, but not scripted, ways with respect to the math or science task. The aim is for the sim to improvise within the constraints of the task and based upon the prompts posed by the PST (Bondie et al., 2021). More details about the sim training employed in this project can be found in our prior work (Mikeska et al., 2023a).
Figure 1
Preservice Teacher Using Teacher Moments and the Avatar-Based Simulations

For both the math and science tasks (Online Practice Suite, 2021a, 2021b), the five avatars were grouped together into two teams, each of which had a different argument related to a math problem or science investigation. The job of the PST was to facilitate an up-to-20-minute discussion to enable the teams to share and critique one another’s arguments, with the aim of arriving at consensus by the end of the discussion. The goal of the ABS Keep It Cold science task was for the PST to support the student avatars (hereafter, “students”) to “discuss and come to consensus on a model describing heat transfer between warm air and two separate cups of cold water that are made of different materials [paper and foam]” (Online Practice Suite, 2021b).
The task materials provided to PSTs included a description of the Keep It Cold investigation, as well as relevant background science activities (including the Keeping the Heat investigation from TM). The task materials also presented each team’s drawn model and associated explanation about why a paper cup of water had a larger increase in temperature over a 30-minute period than a foam cup of water. See a summary of the investigation in Appendix B.
When PSTs signed onto Zoom for their scheduled ABS session, which took place outside of class time, they entered the Mursion® simulated environment, received assistance from the sim through an adult host avatar, practiced interacting with the student avatars in a short scripted activity, and then started their Keep It Cold discussion (or mathematics discussion). The discussion was video recorded and sent to the PST and their TE.
Reasons for Using TM and ABS
These simulations were chosen both on their individual merits and on their hypothesized merit as a coupled set of simulations. TM is open source, easily accessible, widely used, and free of cost to the user. It also has demonstrated success in various contexts (Hillaire et al., 2022; Reich, 2022). It provides a place to practice discrete skills and, due in part to its text-based interface, is a less complex simulation than the Mursion®-based ABS.
Mursion® is not open source, requires a license to access, and involves the additional cost of an actor to serve as the sim. Yet it is a mixed-reality simulation used within TE programs and endorsed by professional TE organizations such as the American Association of Colleges for Teacher Education (AACTE, 2020).
The inherent complexity of the ABS, which requires PSTs to facilitate a discussion with five avatars, makes it a closer approximation to discussions with real students than TM; this closeness is both a benefit (a more authentic approximation) and a challenge (a greater cognitive demand on the PSTs). In the larger study, and thus in this one, we were curious about how beginning with TM and then moving to the ABS might (a) create a scaffolded series of simulations, moving from lesser to greater complexity; (b) reduce the overall cost and time required when compared to using two ABS simulations; and (c) offer benefits similar to those observed in prior work when PSTs engaged in multiple ABS simulations (Mikeska et al., 2023a).
TE Instructional Constraints and Supports
Instructional constraints imposed by the project included that TEs would (a) engage their PSTs in TM and the ABS, in that order, and (b) help PSTs prepare for and debrief after each digital simulation, with PSTs engaging in each preparation and debrief for a minimum of 60 minutes in or out of class time. Also, TEs were not able to alter the TM or ABS tasks or the digital simulation platforms (e.g., aspects of the Mursion® avatars).
TEs were provided with multiple resources from the project to assist with their design process but had agency in how they decided to design their instruction. Prior to the semester, we surveyed TEs for their preferred timing of the simulations. We accommodated most preferences, with minor adjustments to handle workload constraints, while ensuring all six TEs could participate in the ABS sessions. The resources provided were not a curriculum. Rather, we provided guidebooks, one each for TM and the ABS in both math and science (Online Practice Suite, 2021e, 2021f, 2021g, 2021h), as well as an electronic bank of resources shared within the community of practice.
We strongly recommended that TEs read the guidebook, which included information about the digital simulations and the five dimensions of facilitating argumentation discussions (Mikeska et al., 2019), and consider using other electronic resources we provided, including articles, activities other TEs had developed, and video examples of ABS discussions. Part of each guidebook emphasized TEs’ agency as instructional designers:
There are many productive ways to integrate [TM/ABS] into the course. We have designed the [tasks] with a clear goal in mind, but they may serve secondary goals for you as well. You will have flexibility over the nature and amount of support you give your PSTs …
The community of practice meetings mirrored the guidebook’s topics and allowed TEs to exchange approaches and experiences, fostering an environment for mutual learning and sharing. Moreover, TEs had the opportunity to experiment with TM and the ABS before their PSTs did.
Data Sources and Instruments
The sources of data used were class observations, TE reflections, post-TM surveys, post-ABS surveys, and end-of-semester interviews. We used a background information survey to collect demographic data. See Figure 2 for data collection over the semester and Table 2 for how the data sources aligned with the research questions.
Figure 2
Data Collection Instruments Used during the Semester

Table 2
Data Sources by Research Question (RQ)
| Research Question | Data Source | | | | |
|---|---|---|---|---|---|
| | Class Observations | Teacher Educator Reflections | Post-Teacher Moments TE Surveys | Post-Avatar-Based Simulation TE Surveys | End-of-Semester Interviews |
| RQ1a Goals | | x | | | |
| RQ1b Implementation | x | x | x | ||
| RQ1c Evaluation | x | x | x | ||
| RQ2 Considering Iteration | x | x | |||
Each TE had a project team member assigned to them (hereafter, “observer”) who took detailed field notes for each class session in which TEs prepared for or debriefed from TM or the ABS or that involved instruction around discussion or argumentation. For five of the six TEs, observations were conducted via Zoom; the TE positioned a laptop such that the observer was able to see and hear most of the instruction via the laptop’s microphone and camera. For one TE, the observer was located close enough to the TE’s course site to conduct the observations in person. Observation notes were consolidated into one or more summaries for each of TM preparation, TM debrief, ABS preparation, and ABS debrief. The number of summaries per activity depended on the duration of the activity.
Shortly after observations were made, the observer shared the summary with the TE to confirm its accuracy. At the end of the summary, the TE was asked to respond to questions related to the observation. Each TE responded to between five and seven sets of reflection questions across preparation and debrief activities. In addition, each TE responded to a postsimulation survey (post-TM and post-ABS) after completing all debrief activities and grading and participated in a semistructured interview after completing all project-related activities. See Appendix C for reflection, survey, and interview questions used in the study.
Data Coding and Analysis
Our study utilized a convergent parallel mixed methods (QUAL-quan) design, meaning that we concurrently gathered mostly qualitative (QUAL) and some quantitative (quan) data, analyzed those data separately, and interpreted and drew from both analyses to help answer our research questions (as in Creswell, 2014). We approached the analysis of data from observations, reflections, qualitative survey items, and interview responses similarly. The first author began the process by using iterative qualitative analysis to develop a code book and assign codes to the data. For observations and some survey items, an existing code book from a prior project was used as a starting point (Mikeska et al., 2023b). Emergent codes were added to these a priori codes as needed, with additional iterations to refine the code list and assignment of codes.
After the first author developed an initial code list, including a priori and emergent codes, and applied those codes to the observation, reflection, and survey data, one of the other authors used the same list to code the data independently. The author pairs met to reconcile coding, altering the code list and application of codes as necessary. A third author reviewed and commented on the final coding of each pair. In short, every coding activity was double-coded and then reviewed by another member of the team. None of the authors were TE participants in this study. Five of the coauthors, and an additional project team member who is not a coauthor, were each assigned to observe one of the TEs. Coding was done to categorize what was recorded in the observation, not to evaluate the quality of the observation.
Regarding our analysis of observations, we coded each activity within the observation summary. There were 74 activities coded across the six TEs. Activities lasted from 3 to 91 minutes (M = 22, SD = 16). Often, more than one code was used to describe the substance or the pedagogical approach of the activity. Thus, we do not report the total minutes of each substance or approach code, but rather report the frequency with which activities we observed were coded for substance and approach.
After coding, we analyzed sets of coded data in response to each research question to identify larger patterns and themes, including the most frequently mentioned goals, observed instructional foci, and reported PST key learnings and ideas related to iterations. In our tables, we report not only the number of TEs assigned each code but also which specific TEs were assigned it. This approach allowed us to track each TE’s involvement in various aspects of instructional design and provided information that aided us in selecting a TE to highlight in our findings section.
Our study emphasized qualitative data, with quantitative data supporting our findings. For example, as part of describing what instruction TEs implemented, we also considered how much instructional time they devoted to implementation. For this analysis, we used the times reported on observation summaries to calculate the total amount of synchronous instructional time for TM preparation, TM debrief, ABS preparation, and ABS debrief. We also reported the frequency of responses to Likert questions where relevant to the research questions.
Findings
We organized this section by research question. Tables demonstrate the range of ways the six TEs engaged as instructional designers, identifying goals, implementing activities, and evaluating the success of those activities and of their PSTs’ key learnings from them. In each of the subsections, we thread descriptions about how one of the TEs, Christine, engaged as an instructional designer. We use data from other TEs to describe approaches to instructional design not captured in our descriptions of Christine. Also, we present the results in aggregate, not by content area (i.e., science or math). In this study and our prior work (Mikeska et al., 2023b), while we observed differences in TEs’ contexts, choices, and experiences, that variability tended to be more individual in nature; we did not observe trends across math and science sites.
We chose Christine for several reasons. Like many TEs, Christine had never used digital simulations as approximations of practice. She engaged PSTs in somewhat more in- and out-of-class time than was required (i.e., 60 minutes for preparation and 60 minutes for debrief) for both TM and the ABS, representing a commitment that met but did not far exceed our expectations (see Tables 3 and 4). Compared with the other TEs, Christine spent relatively little time engaging her PSTs in TM and ABS preparation and debrief activities.
Nine PSTs were enrolled in her course, which was the average number across the TE participants. Christine was a fully engaged participant and thorough in her responses to survey, reflection, and interview questions, offering complimentary and constructive feedback about her instruction and PSTs’ responses to it and to elements of the project. Furthermore, Christine’s goals, implementation, and evaluation as an instructional designer were largely representative of the major themes we observed across the TEs.
Table 3
Synchronous Instructional Time Measured During Observations
| Activity | Angela | Barry | Christine | Dawn | Erin | Francisco | Total (All TEs) |
|---|---|---|---|---|---|---|---|
| TM Prep | 120 | 155 | 80 | 255 | 55 | 185 | 850 |
| TM Debrief | 35 | 35 | 40 | 50 | 55 | 30 | 245 |
| ABS Prep | 65 | 0 | 105 | 190 | 325 | 60 | 745 |
| ABS Debrief | 70 | 140 | 70 | 65 | 65 | 35 | 445 |
| Total TM + ABS Synchronous Time | 290 | 330 | 295 | 560 | 500 | 310 | 2,285 |
| Total Available Synchronous Time for the Semester | 1,380 | 2,540 | 2,270 | 3,300 | 2,270 | 2,220 | 13,980 |
| Percentage of Synchronous Instructional Time [a] | 21% | 13% | 13% | 17% | 22% | 14% | 16% |
| Note. The observed time is rounded to the nearest 5 minutes. Total Available Synchronous Time is calculated based on approximate class meeting time during the semester and rounded to the nearest 10 minutes. Percentage of Synchronous Instructional Time is the Total TM + ABS Synchronous Time divided by the Total Available Synchronous Time for the Semester. TM is for Teacher Moments. ABS is for Avatar-Based Simulation. [a] Mean = 16.7% and standard deviation 4.0%. | |||||||
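For readers who wish to reproduce the table note’s arithmetic, the per-TE percentages and the footnote statistics can be sketched in a few lines of Python. The values are transcribed from Table 3; the script is illustrative only and was not part of the study’s analysis.

```python
# Illustrative check of Table 3's percentage arithmetic (values transcribed from the table).
from statistics import mean, stdev

tes = ["Angela", "Barry", "Christine", "Dawn", "Erin", "Francisco"]
synchronous = [290, 330, 295, 560, 500, 310]      # Total TM + ABS synchronous minutes
available = [1380, 2540, 2270, 3300, 2270, 2220]  # Total available synchronous minutes

# Percentage of Synchronous Instructional Time = synchronous / available, per TE
percentages = [100 * s / a for s, a in zip(synchronous, available)]
for te, pct in zip(tes, percentages):
    print(f"{te}: {pct:.0f}%")  # 21%, 13%, 13%, 17%, 22%, 14%

# Footnote [a]: mean and sample standard deviation of the rounded percentages
rounded = [round(p) for p in percentages]
print(f"Mean = {mean(rounded):.1f}%, SD = {stdev(rounded):.1f}%")  # Mean = 16.7%, SD = 4.0%
```

Using the sample (n − 1) standard deviation reproduces the 4.0% reported in footnote [a].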
Table 4
Preservice Teachers’ Asynchronous Out-of-Class Time Estimated by Teacher Educators
| Online Practice Suite Activity | Angela | Barry | Christine | Dawn | Erin | Francisco | Total |
|---|---|---|---|---|---|---|---|
| TM Prep | 60 | 60 | 15 | 90 | 60 | 90 | 375 |
| TM Debrief | 30 | 30 | 15 | 120 | 30 | 45 | 270 |
| ABS Prep | 60 | [a] | 30 | 100 | 60 | 120 | [370] |
| ABS Debrief | 0 | 30 | 30 | 150 | 60 | 30 | 300 |
| Total Asynchronous Time [b] | 150 | [120] | 90 | 460 | 210 | 285 | [1,315] |
| Note. Values in brackets are approximate because some information was not reported. TM is for Teacher Moments. ABS is for Avatar-Based Simulation. [a] Not reported. [b] As a percentage of each TE’s Total Available Synchronous Time (Table 3): Mean = 9.3% and standard deviation 4.1% (calculated with 5% for Barry). | |||||||
Research Question 1a: Instructional Design Goals
The TEs articulated their instructional design goals related to preparation and debrief activities in their reflections. One major pattern was that all six TEs identified goals related to questioning, and five TEs identified goals related to argumentation, disciplinary content, and attending to student ideas (Table 5). Other patterns included that goals related to content appeared only in preparation activities; argumentation and questioning goals were mentioned more frequently for TM than for the ABS; and there was a wider range of goals for the ABS as compared to TM. Christine mentioned goals relating to argumentation, questioning, disciplinary content, attending to student ideas, and student-to-student interaction (Table 6).
Table 5
Goals for TM and ABS Preparation and Debrief Activities
| Goal Code | Number of Teacher Educators (n = 6) and TE First Initials (A-F) | |||
|---|---|---|---|---|
| TM Preparation | TM Debrief | ABS Preparation | ABS Debrief | |
| For preservice teachers (PSTs) to learn about or practice: | ||||
| Argumentation | 3 A C E | 3 A D F | | 2 C D |
| Questioning | 4 B D E F | 5 A C D E F | 2 E F | |
| Disciplinary content | 4 B C D E | | 4 C D E F | |
| Attending to student ideas | 2 C F | 1 B | 3 C D E | |
| Encouraging student-to-student interactions | 1 E | | | 1 C |
| The dimensions (in general) [a] | 2 D E | 2 B E | ||
| Formative assessment | 1 F | 1 E | ||
| For PSTs to: | ||||
| Engage with their peers | 1 A | 1 D | 1 D | |
| Compare digital simulated and real classrooms | 2 B E | |||
| For TEs to model discussion strategies | 1 D | 1 E | ||
| Note. Codes included in this table are those for which goals were mentioned for two or more TEs. A blank cell means that no code was applied for any of the TEs for a particular activity. Not shown here are codes we captured as “other” or “unclear/vague.” [a] Goals included here broadly referenced attending to “the dimensions,” i.e., of high-quality argumentation focused discussions (Mikeska et al., 2019). | ||||
Table 6
Christine’s Specific Instructional Goals
| Instructional Component | Goal | Code(s) |
|---|---|---|
| TM Preparation | “… to introduce this practice (making a claim supported by evidence) as an important part of effective science teaching and to give an example of how to structure the process of students making claims and getting peer feedback on their claims” | Argumentation |
| “... to help students think about/learn how to facilitate argument-based discussion between students” | Argumentation | |
| “... to use student work [in the TM task] to identify important ideas students had that would help move the discussion toward correct understanding as well as identifying key misconceptions that some students had that would need to be addressed” | Attending to Student Ideas Disciplinary Content | |
| TM Debrief | “... [to look] at the prompts they had written and talked about which ones they thought we be most effective – with the goal of helping them identify what made the questions effective” | Questioning |
| ABS Preparation | “... to help the PSTs understand heat conduction and the concepts of the “keep it cold” activity” | Disciplinary Content |
| “... to go in having identified the key correct and incorrect ideas from the students’ models as well as the evidence from the previous activities that they should try to link to the current activity” | Disciplinary Content Attending to Student Ideas | |
| ABS Debrief | “to … help [PSTs] identify what strategies are more effective in getting students to talk to each other, provide evidence for their ideas, and reach consensus on correct understanding” | Argumentation Student-to-Student Interactions |
| Note. TM is for Teacher Moments. ABS is for Avatar-Based Simulation. | ||
Research Question 1b: Instructional Design Implementation
To integrate TM and the ABS, TEs needed to redesign aspects of their courses. Integrating the simulations required a significant reallocation of course time, as reflected in Tables 3 and 4. In the post-ABS survey, we asked how TEs changed their course to integrate TM and the ABS and why they made those changes.
Christine offered: “I devoted four full days to the prep or debrief … [and] I eliminated groups working together to design a unit during class.” We coded this aspect of her course redesign as replacing content. She and another TE who replaced content said that they did so because they had committed to the project. Other redesign strategies included that TEs made explicit connections between the project and other course activities (four TEs), added course activities (three TEs), and/or rearranged course activities (two TEs). Reasons for redesign beyond project commitment included wanting to integrate TM and the ABS within their courses (three TEs), meet PSTs’ needs (three TEs), and stress the importance of argumentation (one TE).
Observed Implementation Across the TEs
We coded observation summaries for the substance and pedagogical approach of the activities TEs implemented during synchronous preparation and debrief for TM and the ABS. Table 7 summarizes the substance codes, the first five of which overlap with goals codes, suggesting some consistency between the aims and focus of TEs’ instruction. We used “discussion, including questioning” in Table 7 rather than simply “questioning,” as we did in the goals section; this broader substance code largely included questioning and less frequently included topics such as discussion norms. Table 8 summarizes the pedagogical approaches the TEs employed as they helped their PSTs prepare for and debrief after TM and the ABS.
Table 7
Substance Codes for Learning Cycle Preparation and Debrief Activities
| Substance Code | Number of Teacher Educators (n = 6) and TE First Initials (A-F) | |||
|---|---|---|---|---|
| TM Preparation | TM Debrief | ABS Preparation | ABS Debrief | |
| TEs guided the preservice teachers (PSTs) to learn about, practice, or otherwise address: | ||||
| Argumentation | 5 A C D E F | 6 A B C D E F | 4 C D E F | 2 C E |
| Discussion, including questioning | 6 A B C D E F | 6 A B C D E F | 4 C D E F | 4 B C D E |
| Disciplinary content | 5 B C D E F | | 4 C D E F | 1 E |
| Attending to student ideas | 4 C D E F | 5 A C D E F | 3 C D E | 3 B D E |
| Encouraging student-to-student interactions | 3 C D F | 2 A E | 2 D E | 4 B C D E |
| Simulation logistics | 3 A C F | | 5 A C D E F | 2 E F |
| TEs asked PSTs to consider: | ||||
| Strengths or weaknesses of their own/others’ TM or ABS discussions | 1 E | 4 C D E F | 2 D F | 4 A C E F |
| Their emotions as related to TM or ABS | 1 A | 1 D | 2 E F | |
| Note. Codes included in this table are those we observed within the instruction of two or more TEs. A blank cell means that no code was applied for any of the TEs for a particular activity. Not shown here are codes we captured as “other” or “unclear/vague.” TM is for Teacher Moments. ABS is for Avatar-Based Simulation. | ||||
Table 8
Pedagogical Approach Codes for Learning Cycle Preparation and Debrief Activities
| Pedagogical Approach Code | Number of Teacher Educators (n = 6) and TE First Initials (A-F) | |||
|---|---|---|---|---|
| TM Preparation | TM Debrief | ABS Preparation | ABS Debrief | |
| Preservice teachers (PSTs) engage in discussion | 6 A B C D E F | 4 A D E F | 5 A C D E F | 5 B C D E F |
| TEs provide direct instruction | 4 C D E F | 1 E | 3 C E F | 1 E |
| PSTs analyze transcripts or videos | 2 D F | 5 B C D E F | 1 E | 4 A B D E |
| PSTs do the work of students | 5 B C D E F | | 2 C E | |
| PSTs plan to facilitate their discussions | 3 C D F | | 3 C D E | |
| TEs share exemplars or cases | 3 D E F | 2 D E | ||
| PSTs analyze student work | 2 C F | | 2 C E | |
| PSTs anticipate how middle school students might respond | 2 D F | | 2 D E | |
| TEs model instructional strategies for PSTs | 3 A D F | |||
| Note. Codes included in this table are those we observed within the instruction of two or more TEs. Blank cells mean that no code was applied for any of the TEs for a particular activity. Not shown here are codes we captured as “other” or “unclear/vague.” TM is for Teacher Moments. ABS is for Avatar-Based Simulation. | ||||
Christine’s Implementation
TM Preparation. Christine began to talk about argumentation by referencing the call in Ambitious Science Teaching (Windschitl et al., 2020) to press for evidence-based explanations. She explained that this entailed having students state their claim, evidence, and reasoning, and included teacher facilitation for students to review one another’s ideas. Christine elicited from PSTs what this facilitation might look like, then offered multiple sentence frames that teachers might use during peer review.
Next, Christine presented slides that described argument-based discussion and the five dimensions of high-quality argumentation discussions (Mikeska et al., 2019). She explained that educational research shows that argumentation discussion leads to dramatic learning gains for students and that the discursive aspect of this approach is important. She used direct instruction to share the purposes of productive discussions, what it looks like when students are engaged in these discussions, and the responsibilities of the teacher during these discussions.
Christine explained the TM task, including that PSTs would reference student work and type prompts to encourage discussion between two students. She said that their engagement in TM would not be graded and that she wanted it to be low stress as it was for her when she did it. She presented a slide showing three dimensions of high-quality argumentation discussions relevant to TM: attending to student ideas, student-to-student interaction, and argumentation.
The remainder of Christine’s preparation for TM with her PSTs was related not to the Keeping the Heat TM task content but rather to a scenario that she designed, called the Tree Mass Task. Like Keeping the Heat, this task included two students who had different arguments and needed a teacher to encourage them to share and critique one another’s ideas. However, the Tree Mass Task that Christine developed focused on content about how trees gain mass as they grow (considering variables related to the soil, water uptake, and CO2 uptake). After some direct instruction to explain the background for this task, she asked the PSTs in her class to first do the work as the middle school students in the task would do, that is, to come up with an argument about why the tree gained mass over time. She elicited her PSTs’ ideas and then shared the work of the two middle school students in the Tree Mass Task. She asked PSTs to work in groups to analyze the student work, identifying the claim, evidence, and reasoning for each student. She facilitated a class discussion to identify the students’ arguments and determine the correctness of their arguments.
Finally, Christine asked PSTs to work together to record prompts on a shared document that would help the students in the task to share and critique their ideas with one another. She organized the prompts from the groups and then facilitated a whole-class discussion asking what the PSTs thought of the prompts the class generated. During the discussion, she emphasized the importance of having the students both share and critique one another’s ideas. She also emphasized the importance of evidence in the discussion. Christine concluded by asking the PSTs to review the Keeping the Heat task before doing the TM simulation, which she assigned for homework. No other homework was assigned.
TM Debrief. After the PSTs had completed the TM Keeping the Heat task and prior to meeting with her PSTs, Christine compiled the PSTs’ TM responses. She created two lists with PSTs’ prompts to encourage the students (Victor and Rosa) to share their ideas (List 1) and critique one another’s ideas (List 2). During class, she asked each PST to identify what they considered to be the three most effective prompts in each list and respond to a poll indicating their selections. She then presented the most frequently selected prompts on a slide — six for sharing ideas and four for critiquing ideas.
Christine then asked the PSTs to talk in groups and as a whole class about why the posted prompts were the most effective. After eliciting and responding to PSTs’ ideas in the class discussion, Christine offered summary points. These included that they as a class would spend more time discussing these ideas; it is possible for students to engage in these discussions; teacher questions should not be too guiding; and the discussion between Victor and Rosa, “based on our best prompts,” should result in the students having a better understanding of the phenomenon.
Last, Christine assigned PSTs to complete a reflection about TM outside of class. In this five-question assignment, PSTs were asked to reflect on (a) how important they thought it was to know student thinking prior to facilitating TM, (b) how important it was to require students to provide evidence for claims, (c) whether a discussion between Victor and Rosa would have led to them having better content knowledge, (d) their confidence in their own content knowledge, and (e) whether they thought that TM was a valuable preparation tool for a future science teacher.
ABS Preparation. Christine began ABS preparation with a slide-based lecture about topics related to heat transfer. This focus on content was a different approach than she had used for TM. Reflecting on the TM learning cycle (i.e., preparation, simulation, and debrief), she said that she felt that the PSTs “weren’t prepared enough” for TM with respect to content since she had used her Tree Mass Task to prepare them for TM. Therefore, she decided to use the ABS Keep it Cold scenario, with its associated heat transfer content, to prepare PSTs for the ABS.
Two other TEs, Angela and Erin, also mentioned how they considered what they learned during the TM cycle to inform their instructional design related to the ABS cycle. Angela, like Christine, decided to use the ABS task explicitly to prepare her PSTs for the ABS. Angela also aimed to emphasize the critique part of argumentation more in the ABS than she had in TM.
Christine then asked the PSTs to do the same work as the students in the ABS Keep it Cold task, which included drawing and labeling a model to represent the temperature data in the task for cold water in a paper cup and in a Styrofoam cup. PSTs developed their models and discussed ideas with one another as Christine monitored and talked with PSTs. She asked PSTs to turn in their drawn models and reviewed them in between classes.
In the next class, she guided PSTs in a collaborative effort to create an accurately drawn and labeled model. She elicited ideas from the PSTs, provided guidance, and constructed a drawn model for the class. She referred to this as a potential consensus model, that is, a model that all five of the students might be able to agree upon after a successful Keep it Cold argumentation discussion.
She also had PSTs work in groups to analyze three aspects of each team’s model: (a) correct ideas, (b) partially correct ideas, and (c) misconceptions. She posted the consensus model as a means of comparison to assist the PSTs in this process. Christine then facilitated a discussion in which she elicited the three aspects within each team’s model, offering insights to elaborate on PSTs’ contributions.
The final major part of ABS preparation involved the PSTs working together to plan their discussions. Christine asked the PSTs to brainstorm and post (to a shared document) three categories of questions or prompts; that is, those that they would use to (a) start their discussions, (b) pull out good ideas, and (c) address incorrect ideas. She reminded them to draw from the students’ prior activities (e.g., the Keeping the Heat investigation). After group work, Christine prompted PSTs to reflect on posted questions across three categories. She concluded by suggesting that the PSTs utilize these questions in their ABS discussions.
Throughout ABS preparation, Christine answered PSTs’ questions about logistics (e.g., whether they could bring notes into the discussion and how well-behaved the student avatars are) and offered other logistical suggestions to PSTs. Those suggestions arose from her experience of facilitating the Keep it Cold discussion. From this experience, she shared what she brought to the session (i.e., a marker, clipboard, printouts of the task, and blank paper), that it is acceptable not to reach a consensus model, and the need for student turn-and-talks to be brief.
Christine framed the ABS discussion as a tool to help PSTs learn and not as a tool to evaluate the PSTs. She shared critiques of her own discussion facilitation, admitting challenges in fostering student interaction and addressing particles in the consensus model. She reflected on trying to create a summary table during the discussion but that “went south”; she offered that she liked the idea of the summary table but would have asked questions differently if she had to do it again.
ABS Debrief. After PSTs facilitated their ABS discussions and before their ABS debrief class session, they completed an out-of-class ABS reflection assignment. There were five questions on the assignment, which asked (a) how they began their session and if that was effective, (b) to type out a transcript of a minute of the session that went well and to describe what made it effective, (c) to type out a transcript of a minute of the session that did not go so well and what they would do differently to improve the interaction, (d) how they felt about the activity and if it was fun or if they felt nervous, and (e) their overall opinion of the effectiveness of the activity in helping them (and other PSTs) prepare to facilitate argumentation discussions.
Christine began her ABS debrief session by asking PSTs for their general perceptions of their ABS experiences. She then showed the class three shared documents in which she listed PSTs’ responses (without names) to questions 1, 2, and 3 from their reflection homework. This was followed by discussions during which PSTs reflected on each document. The first two discussions, in groups then as a whole class, focused on evaluating the effectiveness of discussion prompts. The final discussion, in groups, focused on enhancing ineffective prompts identified by PSTs.
Research Question 1c: Instructional Design Evaluation
Instructional designers evaluate the success or failure of their implementation. The TEs evaluated their preparation and debrief activities both positively and negatively, with the balance tilted toward positive insights. They also reflected on what the PSTs learned from their participation in TM and the ABS together.
Evaluation Across the TEs
TEs generally found their preparation and debrief activities successful. Most TEs summarized their reflections with statements indicating that their activities went as planned, went well, or left them satisfied. This general positive code was applied to TM preparation (five TEs), TM debrief (four TEs), ABS preparation (four TEs), and ABS debrief (three TEs). By contrast, there were no general negative comments (e.g., that the activity went poorly or that the TE was not happy with the activity) across the activities.
TEs often indicated that specific activities they used were effective. For example, Erin offered that it was “helpful to have sample student work and roleplay as a teacher and students” as part of TM preparation. Overall, five of the six TEs commented on effective specific activities with respect to TM preparation (four TEs), TM debrief (two TEs), ABS preparation (two TEs), and ABS debrief (five TEs). Two TEs stated that specific activities they used were not effective. Finally, we identified several topics that characterized TEs’ positive or negative evaluations of their activities (Table 9).
Table 9
Teacher Educators’ (TEs’) Positive and Negative Evaluations of Preparation/Debrief Activities
| Major Codes | Number of TEs (n = 6) and TE First Initials (A-F) | |
|---|---|---|
| TE Positive Evaluations of their TM Preparation, TM Debrief, ABS Preparation, and/or ABS Debrief | TE Negative Evaluations of their TM Preparation, TM Debrief, ABS Preparation, and/or ABS Debrief | |
| Related to preservice teacher (PST) attributes or behaviors | ||
| PST engagement in activity | 5 2A 3B 3C 4E 2F | 4 2A 3D 2E 1F |
| PST work quality during activity | 4 1A 3D 3E 2F | 5 1A 3C 2D 3E 1F |
| PST characteristics | 4 2C 1D 1E 1F | 5 4A 1B 1C 2D 1E |
| PST preparedness for the activity | 2 1E 1F | 2 1A 1C |
| PST emotions | 4 2A 1C 1D 1E | |
| Related to Aspects of the Simulation | ||
| Argumentation | 2 1C 1E | |
| Discussion, including questioning | 2 1E 1F | 1 1E |
| Disciplinary content | 5 1A 1B 1C 1D 1E | 3 1C 1D 1F |
| Simulation logistics | 1 1A | 1 2C |
| Related to TEs | ||
| TE time management | 1 1A | 4 1A 2C 4D 1E |
| TE expresses what they would do differently in the future | 4 1A 2C 1D 1E | |
| Note. Codes included in this table are those we applied with respect to TEs’ evaluations of their activities by two or more TEs. A blank cell means that no code was applied for any of the TEs for a particular code and positive or negative evaluation. Not shown here are codes we captured as “other” or “unclear/vague.” The number preceding each letter refers to the number of activities (out of four possible, i.e., TM Preparation, TM Debrief, ABS Preparation, ABS Debrief) for which this code was applied for each TE initial listed. TM is for Teacher Moments. ABS is for Avatar-Based Simulation. | ||
Christine’s Evaluation
TM Preparation. Christine said that her TM preparation activities “went mostly according to plan.” Positive comments were about PST characteristics and engagement (e.g., “this group of students is great at participating in discussions, so they eagerly shared ideas”). Negative comments about TM preparation included TE time management, TE teacher direction, and PST work quality. Christine felt that she ran out of time to fully engage PSTs in the activities she planned, and as a result, “took over more than I intended and started suggesting questioning strategies I thought could be effective, rather than using their ideas.” This idea of being too teacher directed was a code we observed only in Christine’s responses; thus, it was not included in Table 9. Christine had also expected better prompts from her PSTs for encouraging students to share and critique ideas about tree mass; the prompts’ quality made it difficult for her to reference PST-generated prompts during instruction.
TM Debrief. As with her TM preparation, Christine said that her TM debrief “went mostly according to plan.” She commented that her PSTs “engaged well in the discussion.” If she had to do TM debrief over again, she would not have PSTs analyze both rounds of questions from TM during the debrief. Negative evaluations of her TM debrief also focused on PST work quality in that Christine would have picked different questions as the most effective and “had a little trouble trying to steer [PSTs] away from some question types I didn’t like without being really direct.”
ABS Preparation. Overall, for Christine, ABS preparation activities related to disciplinary content “went well.” She said that PSTs “understood heat conduction [content] better than they did before” and “well enough to facilitate the discussion.” That said, she noted that some PSTs struggled with the content, including one who shut down and became upset upon realizing they had a misconception. Christine said that the PSTs’ attempted models were “worse than I thought they should be.”
She also reflected on positive and negative aspects of PSTs, including that one PST was “extremely open about her … understanding of the concepts … [and] was eager to learn [and] asked good questions.” Another PST, however, seemed to try “to take over the explanation of the concepts in a way that was dismissive to the other students.” Christine wondered how she might have handled this more effectively, turning it into a “teachable moment.” Otherwise, Christine said that her PSTs “engaged well” with the activities that helped to prepare PSTs for the ABS discussion.
After using the ABS task to prepare her PSTs to facilitate the discussion, Christine wondered if she had been too leading, questioning whether she should have crafted a task akin to her development of the Tree Mass Task, thus reevaluating her deliberate emphasis on task content. She was concerned that the PSTs were “somewhat distracted by their concerns about the logistics of the ABS activity rather than on the task of facilitating an evidence-based discussion.” If she had to do ABS preparation over again, she “would make a reading assignment from the packet and have students come to class with the summary table filled out;” she suspected they had not prepared for class by reading the task as she had requested.
ABS Debrief. Christine said that her ABS debrief “went well.” Positive evaluations focused on PSTs’ learning about argumentation: “I think there was strong consensus that getting the students to critique each other and ask each other questions was difficult but important.” She mentioned that one PST may have been less student focused than was ideal. Christine also offered as a negative comment that PSTs were “a little too focused on the logistics of the simulation” during the debrief.
Reflections on PST Learning Across All TEs
The six TEs collectively shared 21 key learnings by PSTs; each TE listed two to five. Our codes identified main ideas across the key learnings. Five codes (argumentation, questioning, attending to student ideas, student-to-student interaction, and disciplinary content) were applied to two or more TEs and are shown in Table 10 alongside TEs’ reported goals and activity substance. Evidence TEs used to support PSTs’ key learnings came from debrief activities (six TEs), preparation activities (three TEs), the simulations themselves (two TEs), and course activities unrelated to TM or the ABS (e.g., lesson plans written for a field placement) (three TEs).
While Christine attended to the five codes listed in Table 10 in her goals and when we observed her instruction, she mentioned just three in her list of key learnings: argumentation, disciplinary content, and student-to-student interaction (Table 11). An example of the questioning code was articulated by Erin, who said that PSTs learned that they could support student learning through questioning “because by planning for their conversations [in TM and the ABS] … [PSTs] had the experience of helping students move along in their thinking by asking questions.” An example of attending to student ideas was from Barry when he said that TM helped PSTs to learn “to listen carefully, listen intently to what the students are saying.”
Table 10
Preservice Teacher (PST) Key Learnings Codes Compared to Goals and Implementation
| Major Codes | Number of Teacher Educators (n = 6) and TE First Initials (A-F) | ||
|---|---|---|---|
| | TE Reported Goals for TM Preparation, TM Debrief, ABS Preparation, and/or ABS Debrief | Substance of TE Implementation as Observed During TM Preparation, TM Debrief, ABS Preparation, and/or ABS Debrief | TE Assessment of Key Learnings by PSTs Across the Semester |
| Argumentation | 5 A C D E F | 6 A B C D E F | 5 A C D E F |
| Questioning | 6 A B C D E F | 6 [a] A B C D E F | 3 A D E |
| Disciplinary content | 5 B C D E F | 5 B C D E F | 3 C E F |
| Attending to student ideas | 5 B C D E F | 6 A B C D E F | 3 B D F |
| Encouraging student-to-student interaction | 2 C E | 6 A B C D E F | 3 A C D |
| Note. Codes included in this table are those we applied with respect to TEs’ assessments of PSTs’ Key Learnings by two or more TEs. “-” means that no code was applied for any of the TEs for a particular activity. Not shown here are codes we captured as “other” or “unclear/vague.” TM is for Teacher Moments. ABS is for Avatar-Based Simulation. [a] This code (for substance) included some other aspects of discussion (e.g., how to start a discussion), yet questioning was a major theme within this code. | |||
Table 11
Christine’s Key Learnings for Preservice Teachers (PSTs)
| PST Key Learnings | Key Learning Code | Evidence Teacher Educator Cited for Key Learning |
|---|---|---|
| “Learned the importance of requiring students to provide evidence for their claims” | Argumentation | Unclear (from “discussion”) |
| “Became more aware of how important it is to have strong content knowledge” | Disciplinary Content | Debrief activities |
| “Learned how difficult it is for them as teachers to facilitate student-student interaction” | Student-to-Student Interaction | Preparation and debrief activities |
Research Question 2: Considering Future Iterations
In end-of-semester interviews, all TEs provided ideas for what they might do differently if they were to execute another iteration of TM and the ABS in a future semester. They would add activities or assignments (four TEs), remove them (two TEs), or otherwise alter them (three TEs). Christine suggested both adding and removing. About adding, she said that she would “implement the ABS pretty much as I did this semester” and if possible, “I would want them to have that experience again.” She elaborated that this second ABS experience would be “probably the same discussion, the same content, to give them a chance to go back in and try it again using what they learned the first time to see if they could improve.” TEs who proposed additions suggested incorporating practice discussions to support PST learning, including peer teaching, pre-ABS experiences in the Mursion® environment, or a second ABS experience.
Christine indicated that she would remove the TM simulation:
I wouldn’t do the [TM] activity, but I would … do all the parts of it except for actually going in and being online. So, reading through the students’ work and thinking about what questions are you going to ask … [and how you will] facilitate them asking each other questions about their work. And so, just as a written assignment and then discussion in my class about how are we going to do this.
Barry, another TE, suggested omitting the ABS simulation in his interview, critiquing its ability to accurately represent diverse students portrayed in the Mursion® system.
One of the three who suggested an alteration was Erin:
I would set up [TM] differently in a way that it’s seen as a building block towards hosting a small group discussion that would then eventually lead towards leading a whole class discussion and frame it around some of the dimensions and explain that they’ll get more work with more of the dimensions in other situations.
Erin also said she would do the simulations earlier in the semester. Two other TEs, Dawn and Francisco, said they would aim to make stronger connections between TM and ABS.
Post-ABS survey responses aligned closely with interview data. Five of six TEs recommended TM for methods courses; five also found it very appropriate, while one found it somewhat inappropriate. All six TEs, including Barry, who initially questioned the ABS, recommended its use; four deemed it very appropriate for methods courses, while two found it somewhat appropriate.
Discussion
Our first research question focused on TEs’ engagement as instructional designers within the project (Cutri & Whiting, 2018; Henriksen et al., 2020; Snow et al., 2023; Warr & Mishra, 2021). This included design as it related to goals, implementation, and the assessment of success (Wiggins & McTighe, 2005). Of note, all the common goals identified by TEs (argumentation, disciplinary content, questioning, and attending to student ideas) were well aligned to the design focus of the TM and ABS tasks and to the five dimensions of facilitating argumentation-focused discussions (Mikeska et al., 2019). This result suggests that TEs understood the project’s objectives and instructional purposes, internalizing them as their own goals.
Also, while there were commonalities in stated goals, there were also differences even though all participants were (a) using the same digital simulations and tasks, (b) had to manage the same project-imposed constraints, and (c) were engaged in the same community of practice. We conclude from this result that each TE was solving a unique design challenge as well as facing unique configurations of design constraints, including but not limited to varying amounts of instructional time they could devote to TM and the ABS activities. Also, given that their goals were slightly different, the criteria for evaluation of success related to PSTs’ key learnings differed as well.
Our parallel work studying outcomes for PST participants in the TEs’ classes suggests that some of the PSTs’ key learnings were realized (Howell et al., 2024; Mikeska et al., in press). Rather than pursuing a shared objective, each TE pursued a slightly different aim under varying conditions. This finding resonates with Daly et al.’s (2012) characterization of design as “personal synthesis,” in which experiences, knowledge, and information (and likely individual constraints and criteria) come together to inform instructional design.
Answering Kumar’s (2022) aforementioned call to provide a more “complete picture” of learning cycles of instruction as enacted by TEs (p. 334), we analyzed the actions of the TEs during implementation of preparation and debrief activities through a set of complementary analyses. Structural modifications reported by the TEs included removing the previous semester’s activities, reordering course activities, and introducing new ones, specifically those related to project work on TM and the ABS. Certain TEs emphasized the integration of TM and the ABS with current coursework assignments or activities, rather than replacing them. This implies that integrating these innovations may have posed less of a challenge, as they discovered ways to include project activities without displacing the course content they typically address.
Analysis of the coding, informed by our prior work (Mikeska et al., 2023b), indicated that both substance and pedagogical approaches aligned well with the project’s goals. Although there were a few commonalities across TEs, no trajectories were identical. It is not surprising that the substance of all TEs’ instructional activities was related to argumentation (Osborne et al., 2013; Smith et al., 2008) — a key purpose for participating in the project. Nor is it surprising that TEs engaged in analysis of videos and transcripts since these were readily available.
The larger implication of comparing across TEs seemed to be the variability itself, suggesting that implementation of digital simulated approximations of practice, such as TM and the ABS, likely requires flexible approaches rather than a uniform curricular approach. This conclusion reinforces our conjecture that TEs must think of themselves as instructional designers and personal synthesizers working within their unique context and toward their unique goals (Daly et al., 2012).
In turn, this result suggests that designers of digital simulations would do well to think of ways to support TEs as designers and adapters of instruction by providing resources for productive adaptation and steering clear of guidance suggesting strict implementation protocols that may not accommodate the variability of teacher preparation contexts. These supports may take the form of various types of activities to employ, as in work by Davis et al. (2017) and Kumar (2022), or of possible roles for the TEs to take, as in work by Lampert et al. (2013).
Our second research question asked what TEs would change if they implemented TM and the ABS in the future, which relates to the idea of iteration in design (Crismond & Adams, 2012; Cutri & Whiting, 2018; Lottero-Perdue & Parry, 2017; Simpson et al., 2019). Most suggestions reflected things TEs would do differently to prepare the PSTs for the activities. Of note were the cases where two of the TEs, Christine and Barry, indicated they would exclude TM or the ABS, a more fundamental kind of instructional refinement. This idea suggests changing the constraints themselves rather than designing instruction within the project’s constraints. This finding, along with the variability in goals across TEs, suggests that the instructional design frame is useful but should also account for TEs having greater agency in controlling constraints and criteria than might be the case in design cases more generally. This call for greater agency aligns with an approach to design “as freedom,” enabling TEs to have more “flexible and fluid boundaries” in their instructional design (Daly et al., 2012, p. 199).
Comparing the TEs’ stated goals, the activities they undertook in class, and their evaluations of PST learning, we noted that TEs listed fewer takeaways than goals or activities. Although this result could be an artifact of how our questions were framed, it may also suggest that TEs do not systematically look for evidence of each goal they set for their PSTs, a place where design thinking could help. Attention to criteria for success suggests a more systematic evaluation process in service of improvement.
Limitations
One of the limitations of this study is that while we set out to describe the range of TEs’ instructional design practices across TE participants, we cannot extend our findings to suggest that the range would be similar for a more representative group of TEs. Second, like any study, we were limited to the data we collected. In retrospect, it would have been helpful to include a presemester interview on TEs’ instructional design practices. Third, due to our larger study design, we were not able to support the TEs to enact and then reflect on a second iteration of integrating TM and the ABS into their course; rather, we were limited to asking how TEs would redesign if given another opportunity to implement TM and the ABS.
Implications
Taken together, the main storyline is one of alignment on the broad areas of focus that make sense in the context of the project-created constraints coupled with notable variability in implementation. Our results also provide evidence of sincere and insightful evaluation, but not of a clear link between stated goals and assessment of success. One implication is that while we can offer general guidance on using digital tools such as those in TM and the ABS, the variability in local contexts makes creating a definitive list of best practices unrealistic. This conclusion is consistent with Vanassche et al.’s (2024) assertion that a pedagogy of teacher education involves being willing to explore complexities and uncertainties and that such a pedagogy should not be minimized into prescriptive orientations and “blueprints for action” (p. 199).
That said, while somewhat speculative with respect to our findings, we did notice several promising TE practices that emanated from TEs’ instructional designs. First, instructional design often, but not always, began with modification (that is, altering resources others had developed to fit individual needs and contexts) rather than creating instructional activities from a blank page. This was because we encouraged TEs who participated in this study and in the larger project to share resources with one another, creating an online shared folder, and they did so. This kind of sharing among TEs engaged in communities of practice is supported in other recent work (Finkelstein et al., 2024; Williams et al., 2024).
Second, the fact that we did not supply a curriculum but provided resources to the TEs in our study enabled TEs to consider connections between their unique instructional contexts and what the simulations had to offer. They chose what to emphasize as they prepared PSTs for and debriefed after each simulation. For example, the TEs typically focused on a subset of the dimensions of facilitating argumentation-focused discussions, not all five (Mikeska et al., 2019), as a way of narrowing focus. As described in the findings section, they also considered what of their original course they removed or altered to make space for the learning opportunities provided through the integration of TM and the ABS into their course.
Third, it is worth describing some instructional practices that the TEs found successful and that we saw as promising for supporting PSTs as they learn to facilitate argumentation discussions. These practices, all evident in some way within Christine’s instructional design journey and included in Table 8, connect to other literature: drawing explicit attention to prompts and questions that PSTs can use as they prepare for their argumentation discussions (Cartier et al., 2013; Masters et al., 2024; Smith & Stein, 2018); using transcript analysis for PSTs to reflect on their own and one another’s argumentation discussions (Lottero-Perdue et al., 2022); and providing other structured reflection assignments to support the process of debriefing after PSTs facilitate their discussions (Snider et al., 2023).
Further, instructional design framing is a useful way for TEs to talk with one another about their common work because it draws attention to the need to clearly articulate goals, constraints, and criteria, which could enable better communication. It is also a place where designers of tools such as digital simulations can be thoughtful with respect to the supports they provide for integrating those tools into instruction.
A valuable future contribution might be to develop a tool to help TEs make their instructional design framing more explicit; that is, to help them identify and evaluate their goals, constraints, and criteria systematically, perhaps using backward design, as suggested by Wiggins and McTighe (2005). Such a framework might help TEs make clearer links between their stated goals and their criteria for evaluating success; an emphasis on finding viable rather than optimal solutions to a design challenge would also be a useful framing. After all, there is no single right way to best support PST learning; however, this does not mean that TEs cannot evaluate success relative to their own instructional goals and improvement ideas. Our hope is that TEs might see themselves as instructional designers, reflected in the work that these six TEs undertook, see instructional design as part of teacher education, and find these outcomes useful as a way of organizing thinking toward incremental improvement over time.
Conclusion
What we have both proposed through our focus on instructional design in teacher education and learned from our findings is consistent with two conclusions drawn by Bullock and Butler (2024) in the final chapter of their edited book. Their conclusions were as follows:
- Developing an understanding of a pedagogy of teacher education is complex and deeply embedded in practice — it cannot be viewed as propositional knowledge.
- Developing an understanding of the nature and form of a pedagogy of teacher education is iterative and often features moments of learning what is unknown. (Bullock & Butler, 2024, p. 215)
Each of the courses in which the TEs in our study integrated simulations represented a different and complex context for instructional design, a key position for TEs to take up as part of their pedagogical practice. As they and we as a community venture into the increased use of simulation to support teacher education, including what will undoubtedly be driven by artificial intelligence in the near future, we are all stepping into the unknown. A pedagogy of teacher education is not stagnant but requires constant revision and iteration as we convert unknown to known, unfamiliar to familiar.
Our investigation of the ways TEs designed instruction for a novel sequence of simulations contributes to the field’s collective understanding of this process and may provide useful lessons for TEs, researchers, and simulation designers about how to support implementations that respond, in creative and grounded ways, to varying constraints.
Financial Support for Work
This work was funded by the National Science Foundation (Grant No. 2037983). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Acknowledgements
We would like to thank the study participants, as well as others whose contributions made this study possible, including Denise Bressler, Devon Kinsey, Camila Lee, Justin Reich, and Meredith Thompson.
References
American Association of Colleges for Teacher Education. (2020). Teaching in the time of Covid-19: State recommendations for educator preparation programs and new teachers. https://www.aacteconnect360.org/communities/community-home/librarydocuments/viewdocument?DocumentKey=a5171ce3-b847-4722-bd74-5b18f353f5ca
Bieda, K. N. (2010). Enacting proof-related tasks in middle school mathematics: Challenges and opportunities. Journal for Research in Mathematics Education, 41(4), 351-382.
Bondie, R., Mancenido, Z., & Dede, C. (2021). Interaction principles for digital puppeteering to promote teacher learning. Journal of Research on Technology in Education, 53(1), 107-123. https://doi.org/10.1080/15391523.2020.1823284
Bullock, S. M., & Butler, B. M. (2024). Signaling new directions: Lessons for understanding the pedagogy of teacher education. In B. M. Butler & S. M. Bullock (Eds.), Understanding a pedagogy of teacher education: Contexts for teaching and learning about your educational practice (pp. 214-218). Routledge. https://doi.org/10.4324/9781003365129
Butler, B. M., & Bullock, S. M. (2024). Understanding a pedagogy of teacher education: Contexts for teaching and learning about your educational practice. Routledge. https://doi.org/10.4324/9781003365129
Cartier, J. L., Smith, M. S., Stein, M. K., & Ross, D. (2013). Five practices for orchestrating productive task-based discussion in science. NSTA Corwin Press.
Creswell, J. W. (2014). Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). SAGE Publications, Inc.
Crismond, D. P., & Adams, R. S. (2012). The informed design teaching and learning matrix. Journal of Engineering Education, 101(4), 738-797. https://doi.org/10.1002/j.2168-9830.2012.tb01127.x
Cutri, R. M., & Whiting, E. F. (2018). Opening spaces for teacher educator knowledge in a faculty development program on blended learning course development. Studying Teacher Education, 14(2), 125-140. https://doi.org/10.1080/17425964.2018.1447920
Daly, S. R., Adams, R. S., & Bodner, G. M. (2012). What does it mean to design? A qualitative investigation of design professionals’ experiences. Journal of Engineering Education, 101(2), 187-219. https://doi.org/10.1002/j.2168-9830.2012.tb00048.x
Davis, E. A., Kloser, M., Wells, A., Windschitl, M., Carlson, J., & Marino, J.-C. (2017). Teaching the practice of leading sense-making discussions in science: Science teacher educators using rehearsals. Journal of Science Teacher Education, 28(3), 275-293. https://doi.org/10.1080/1046560X.2017.1302729
Davis, E. A., Petish, D., & Smithey, J. (2006). Challenges new science teachers face. Review of Educational Research, 76(4), 607-651. https://www.jstor.org/stable/4124416
Dotger, B., Masingila, J., Bearkland, M., & Dotger, S. (2015). Exploring iconic interpretation and mathematics teacher development through clinical simulations. Journal of Mathematics Teacher Education, 18(6), 577-601. https://doi.org/10.1007/s10857-014-9290-7
Dym, C. L., Agogino, A. M., Eris, O., Frey, D. D., & Leifer, L. J. (2005). Engineering design thinking, teaching, and learning. Journal of Engineering Education, 94(1), 103-120. https://doi.org/10.1002/j.2168-9830.2005.tb00832.x
Finkelstein, C., Jin, L., Liwanag, M. P., Lottero-Perdue, P. S., McQuitty, V., Monacelli, S., Moody, S. M., & Mullen, L. (2024). A collaborative faculty approach to a practice-based pedagogy of teacher education. In B.M. Butler & S.M. Bullock (Eds.), Understanding a pedagogy of teacher education: Contexts for teaching and learning about your educational practice (pp. 181-196). Routledge. https://doi.org/10.4324/9781003365129
Girod, M., & Girod, G. R. (2008). Simulation and the need for practice in teacher preparation. Journal of Technology and Teacher Education, 16(3), 307-337.
Goodyear, P., & Retalis, S. (2010). Technology-enhanced learning: Design patterns and pattern languages. Sense.
Grossman, P. L. (2018). Teaching core practices in teacher education. Harvard Education Press.
Henriksen, D., Gretter, S., & Richardson, C. (2020). Design thinking and the practicing teacher: Addressing problems of practice in teacher education. Teaching Education, 31(2), 209-229. https://doi.org/10.1080/10476210.2018.1531841
Henriksen, D., & Richardson, C. (2017). Teachers are designers. Phi Delta Kappan, 99(2), 60-64. https://doi.org/10.1177/0031721717734192
Hillaire, G., Waldron, R., Littenberg-Tobias, J., Thompson, M., O’Brien, S., Marvez, G. R., & Reich, J. (2022). Digital Clinical Simulation Suite: Specifications and Architecture for Simulation-Based Pedagogy at Scale. Proceedings of the Ninth ACM Conference on Learning @ Scale, New York City, NY, USA. https://doi.org/10.1145/3491140.3528276
Howell, H., Mikeska, J. N., Lottero-Perdue, P. S., Shekell, C., Kinsey, D., Lee, C., Reich, J., Maltese, A. V., Park Rogers, M., Cross Francis, D., Kaur Bharaj, P., & Halder, S. (2024). Exploring the use of simulations in secondary mathematics and science methods courses [Manuscript submitted for publication]. ETS.
Kirschner, P. (2015). Do we need teachers as designers of technology enhanced learning? Instructional Science, 43(2), 309-322. https://doi.org/10.1007/s11251-015-9346-9
Kuhn, D. (2010). Teaching and learning science as argument. Science Education, 94(5), 810-824. https://doi.org/10.1002/sce.20395
Kumar, A. R. (2022). Facilitating engagement with practice: Using a practice-based course model for pre-service early childhood teachers. Journal of Early Childhood Teacher Education, 43(3), 327-346. https://doi.org/10.1080/10901027.2020.1784322
Lampert, M., Franke, M. L., Kazemi, E., Ghousseini, H., Turrou, A. C., Beasley, H., Cunard, A., & Crowe, K. (2013). Keeping it complex: Using rehearsals to support novice teacher learning of ambitious teaching. Journal of Teacher Education, 64(3), 226-243. https://doi.org/10.1177/002248711247383
Lottero-Perdue, P. S., Mikeska, J. N., & Nester, M. S. (2022). Using preservice teachers’ transcript coding of simulated argumentation discussions to characterize aspects of their noticing about argument construction and critique. Contemporary Issues in Technology and Teacher Education (CITE Journal), 22(1). https://citejournal.org/volume-22/issue-1-22/science/using-preservice-teachers-transcript-coding-of-simulated-argumentation-discussions-to-characterize-aspects-of-their-noticing-about-argument-construction-and-critique/
Lottero-Perdue, P. S., & Parry, E. A. (2017). Perspectives on failure in the classroom by elementary teachers new to teaching engineering. Journal of Pre-College Engineering Education Research, 7, 1-21. https://doi.org/10.7771/2157-9288.1158
Loughran, J. (2006). Developing a pedagogy of teacher education: Understanding teaching and learning about teaching. Routledge.
Masters, H., Lottero-Perdue, P. S., Placa, N., Galindo, E., Howell, H., & Mikeska, J. N. (2024). Elementary preservice teachers’ use of prompts to encourage student-to-student talk during simulated argumentation discussions. School Science and Mathematics. https://doi.org/10.1111/ssm.12685
McDonald, M., Kazemi, E., & Kavanagh, S. S. (2013). Core practices and pedagogies of teacher education: a call for a common language and collective activity. Journal of Teacher Education, 64(5), 378-386. https://doi.org/10.1177/0022487113493807
Mikeska, J. N., Cross Francis, D., Lottero-Perdue, P. S., Park Rogers, M., Shekell, C., Kaur Bharaj, P., Howell, H., Maltese, A., Thompson, M., & Reich, J. (in press). Promoting preservice teachers’ facilitation of argumentation in mathematics and science through digital simulations. Teaching and Teacher Education.
Mikeska, J. N., Howell, H., & Kinsey, D. (2023a). Do simulated teaching experiences impact elementary preservice teachers’ ability to facilitate argumentation-focused discussions in mathematics and science? Journal of Teacher Education, 74(5), 422-436. https://doi.org/10.1177/00224871221142842
Mikeska, J. N., Howell, H., & Kinsey, D. (2023b). Inside the black box: How elementary teacher educators support preservice teachers in preparing for and learning from online simulated teaching experiences. Teaching and Teacher Education, 122. https://doi.org/10.1016/j.tate.2022.103979
Mikeska, J. N., Howell, H., & Straub, C. (2019). Using performance tasks within simulated environments to assess teachers’ ability to engage in coordinated, accumulated, and dynamic (CAD) competencies. International Journal of Testing, 19(2), 128-147.
Mursion. (2024). Explore Mursion: How it works. https://www.mursion.com
National Governors Association for Best Practices, & Council of Chief State School Officers. (2010). Common core state standards (Mathematics).
NGSS Lead States. (2013). Appendix F: Science and engineering practices in the Next Generation Science Standards. In Next Generation Science Standards: For states, by states. The National Academies Press. https://doi.org/10.17226/18290
Norton, P., & Hathaway, D. (2015). In search of a teacher education curriculum: Appropriating a design lens to solve problems of practice. Educational Technology, 55(6), 3-14. https://www.jstor.org/stable/44430419
Online Practice Suite. (2021a). Hungry, Hungry Huskies Performance Task. Pre-release task version developed under grant no. 2037983. https://doi.org/10.17910/b7.1110
Online Practice Suite. (2021b). Keep it Cold Performance Task. Pre-release task version developed under grant no. 2037983. https://doi.org/10.17910/b7.1110
Online Practice Suite. (2021c). Keeping the Heat Discussion Task. Pre-release task version developed under grant no. 2037983. https://doi.org/10.17910/b7.1110
Online Practice Suite. (2021d). Strawberry Picking Discussion Task. Pre-release task version developed under grant no. 2037983. https://doi.org/10.17910/b7.1110
Online Practice Suite. (2021e). Teacher educator guidelines for using the secondary mathematics Online Practice Suite: Avatar Based Simulations (ABS): Facilitating student to student discussion on the Hungry, Hungry Huskies Activity. Pre-release Task Version developed under Grant No. 2037983. https://doi.org/10.17910/b7.1110
Online Practice Suite. (2021f). Teacher educator guidelines for using the secondary mathematics Online Practice Suite: Teacher Moments (TM): Facilitating student to student discussion on the Rate of Strawberry Picking Activity. Pre-release Task Version developed under Grant No. 2037983. https://doi.org/10.17910/b7.1110
Online Practice Suite. (2021g). Teacher educator guidelines for using the secondary science Online Practice Suite: Avatar Based Simulations (ABS): Facilitating student to student discussion on the Keep it Cold Activity. Pre-release Task Version developed under Grant No. 2037983. https://doi.org/10.17910/b7.1110
Online Practice Suite. (2021h). Teacher educator guidelines for using the secondary science Online Practice Suite: Teacher Moments (TM): Facilitating student to student discussion on the Keeping the Heat Activity. Pre-release Task Version developed under Grant No. 2037983. https://doi.org/10.17910/b7.1110
Osborne, J., Simon, S., Christodoulou, A., Howell-Richardson, C., & Richardson, K. (2013). Learning to argue: A study of four schools and their attempt to develop the use of argumentation as a common instructional practice and its impact on students. Journal of Research in Science Teaching, 50(3), 315-347.
Petroski, H. (1996). Invention by design: How engineers get from thought to thing. Harvard University Press.
Reich, J. (2022). Teaching drills: Advancing practice-based teacher education through short, low-stakes, high-frequency practice. Journal of Technology and Teacher Education, 30(2), 217-228. https://www.learntechlib.org/primary/p/221208/
Simon, S., Erduran, S., & Osborne, J. (2006). Learning to teach argumentation: Research and development in the science classroom. International Journal of Science Education, 28(2-3), 235-260. https://doi.org/10.1080/09500690500336957
Simpson, A., Anderson, A., & Maltese, A. V. (2019). Caught on camera: Youth and educators’ noticing of and responding to failure within making contexts. Journal of Science Education and Technology, 28(5), 480-492. https://doi.org/10.1007/s10956-019-09780-0
Smith, M., & Stein, M. K. (2018). Five practices for orchestrating productive mathematics discussion (2nd ed.). National Council of Teachers of Mathematics.
Smith, M. S., Bill, V., & Hughes, E. K. (2008). Thinking through a lesson: Successfully implementing high-level tasks. Mathematics Teaching in the Middle School, 14(3), 132-138.
Snider, R., Shekell, C., & Cross Francis, D. (2023). What do we miss: What do preservice teacher self reflections conceal from teacher educators? Association of Mathematics Teacher Educators.
Snow, J. L., Jacobs, J., Pignatosi, F., Norman, P., Rust, F., Yendol-Hoppey, D., Naiditch, F., Nepstad, C., Roosevelt, D., Pointer-Mace, D. H., Kosnick, C., & Warner, C. (2023). Making the invisible visible: Identifying shared functions that enable the complex work of university-based teacher educators. Studying Teacher Education, 19(3), 351-375. https://doi.org/10.1080/17425964.2023.2213717
Straub, C. (2018). Best in class leadership development: How virtual reality and avatars are changing the learning landscape. https://www.denasamuels.com/wp-content/uploads/2020/03/How-Virtual-Reality-and-Avatars-are-Changing-the-Learning-Landscape.pdf
Straub, C., Dieker, L., Hynes, M., & Hughes, C. (2015). Using virtual rehearsal in TLE TeachLivE™ mixed reality classroom simulator to determine the effects on the performance of science teachers: A follow-up study. In [EDITOR NAME(S)?] Proceedings of TeachLivE 3rd National Conference, Orlando, FL.
Thompson, M., Leonard, G., Mikeska, J. N., Lottero-Perdue, P. S., Maltese, A. V., Pereira, G., Hillaire, G., Waldron, R., Slama, R., & Reich, J. (2022). Eliciting learner knowledge: Enabling focused practice through an open-source online tool. Behavioral Sciences, 12(9), 324. https://doi.org/10.3390/bs12090324
Vanassche, E., Meijer, P., Oolbekkink-Marchand, H., & Vanderlinde, R. (2024). A pedagogy of teacher educator development. In B. M. Butler & S. M. Bullock (Eds.), Understanding a pedagogy of teacher education: Contexts for teaching and learning about your educational practice (pp. 197-213). Routledge. https://doi.org/10.4324/9781003365129
Warr, M., & Mishra, P. (2021). Integrating the discourse on teachers and design: An analysis of ten years of scholarship. Teaching and Teacher Education, 99. https://doi.org/10.1016/j.tate.2020.103274
Wiggins, G., & McTighe, J. (2005). Understanding by design (Expanded 2nd ed.). ASCD.
Williams, O., Gannon, C., Holden, S., & Burris, J. (2024). New positions, new pedagogies: Learning and becoming a novice teacher educator community of practice. In B. M. Butler & S. M. Bullock (Eds.), Understanding a pedagogy of teacher education: Contexts for teaching and learning about your educational practice (pp. 30-47). Routledge. https://doi.org/10.4324/9781003365129
Windschitl, M., Thompson, J., & Braaten, M. (2020). Ambitious science teaching. Harvard Education Press.
Appendix A
Summary of Teacher Moments Task, “Keeping the Heat”
Appendix B
Summary of the Avatar-Based Simulation Task, “Keeping the Heat”
Appendix C
Reflection, Survey, and Interview Questions used in the Study