Memory	2.2096	.50518	23

Significant main effects were also found on mean scores from three of the six CIR sections: what, emotions, and perspectives. For those means only, LSD post hoc analysis was used to determine the degree of difference between the means of each set of treatment scores. CIR data excerpts for these sections are provided as scoring examples to further illustrate the findings.

What (Descriptions)

A significant main effect was found for scores on the what section of the CIR: F(1,22) = 3.60, p = .046. Partial eta squared (effect size) was .26, and power was .60. Participants' descriptions of their critical incidents when referring to video were scored significantly higher on average (p < .05) than those written by participants when using audio. See Table 4 for descriptive data.

Table 4
Descriptive Statistics for CIR Scores on the What Section

Treatment	Mean	SD	N
Video	2.3478	.64728	23
Audio	1.9565	.63806	23
Memory	2.0000	.67420	23

The what section of the CIR guides the participants' focus on the event that occurred. The highest score for this section was a 3 (Dialogic), and the lowest was a 1 (Routine). To receive a score of 3, the reflective writing had to contain a focus on students, informal or formal assessments, and interactions that would help the teacher to interpret whether and how students were learning about the content (Ward & McCotter, 2004).
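The analysis reported above — a repeated-measures ANOVA on each CIR section followed by Fisher's LSD (i.e., uncorrected pairwise paired t-tests run only after a significant omnibus F) — can be sketched as follows. This is a minimal illustration on simulated scores, not the authors' analysis script; the data, column names, and effect sizes are hypothetical.

```python
# Sketch of a repeated-measures ANOVA with LSD post hoc comparisons,
# using simulated CIR-style scores (one score per participant per treatment).
import numpy as np
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
n = 23  # participants who completed all three treatments
offsets = {"video": 0.35, "audio": 0.0, "memory": 0.05}  # hypothetical effects

rows = []
for pid in range(n):
    base = rng.normal(2.0, 0.3)  # each participant's baseline reflection score
    for treatment, offset in offsets.items():
        rows.append({"participant": pid, "treatment": treatment,
                     "score": base + offset + rng.normal(0, 0.4)})
df = pd.DataFrame(rows)

# Omnibus repeated-measures ANOVA (requires one observation per cell)
res = AnovaRM(df, depvar="score", subject="participant",
              within=["treatment"]).fit()
print(res.anova_table)

# Fisher's LSD for a within-subjects design: uncorrected paired t-tests,
# justified only because the omnibus F was significant
wide = df.pivot(index="participant", columns="treatment", values="score")
for a, b in [("video", "audio"), ("video", "memory"), ("audio", "memory")]:
    t, p = stats.ttest_rel(wide[a], wide[b])
    print(f"{a} vs {b}: t = {t:.2f}, p = {p:.3f}")
```

With 23 participants and 3 conditions, the omnibus test has 2 and 44 degrees of freedom; the study's reported F(1,22) values suggest a different (e.g., contrast-based) parameterization, so this sketch shows the general procedure rather than reproducing the reported statistics.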
The following is an example of a Dialogic reflection from a participant in the video treatment:

Students are learning about gene splicing and using paper DNA to splice together firefly and plant DNA to make a glowing plant. At this part, students are identifying cut points, isolating the "glowing" gene, and inserting it into the plant DNA. Of the three students, one has done this before and is moving quickly, one has not but is familiar with the concept and is moving at the pace I expected, and one is not familiar or for whatever reason moving a little slower. As they isolate and cut, the third student starts falling a little behind, partly because it takes her longer to find the cut points on the firefly DNA and partly because she doesn't seem as interested in doing it. Because of time, I begin giving the next instructions before she finishes cutting. Then the students use the same enzyme to cut open the plant DNA. At this point, the third student is struggling to find the location and does not show that she understands the concept. I demonstrate using another student's DNA and that student leans over to help her. As they finish, I ask a few extension questions about why this technique is expensive and why some people object to it. Students 1 and 2 volunteer answers but Student 3 tunes out. (Participant 23; Reflection 2; Video Treatment)

Rather than focusing on the outcome of the learning, the participant reflects on the learning process, particularly for the struggling student.
She pays attention to the students' levels of participation and understanding of the concepts.

In this section, reflections again received a score of 2 (Technical) when the participant focused on teaching tasks such as asking questions, as in the following example:

At the beginning of my microteaching lecture, I provided pictures of cars and asked my students what made the cars operate, to which they replied "gasoline." I said, "Correct, and we use oil to make gasoline." I then asked them if they knew of another word to describe oil, to which they replied, "fossil fuel." I asked them if fossil fuels were good or bad for the environment, to which they replied bad, and I asked them why. The students told me that fossil fuels pollute the air and the environment. (Participant 14; Reflection 1; Memory Treatment)

Typical of a Technical reflection, Participant 14 did not extend the reflection to include the quality of students' responses, nor did she try to make connections between the specific instructional strategy and student learning.

A score of 1 (Routine) indicated that the participant was focusing on self or analyzing practice without a personal response. This type of lower level reflection does not focus on a particular problem or the complexity of the situation. The following is an example of a Routine reflection from a participant in the audio treatment:

During the beginning of my presentation, one of the students was using the restroom and did not join the presentation before 3 minutes in. When she came in she was somewhat disruptive while moving her chair around when she sat down and the other students were paying attention to her instead of me. (Participant 61; Reflection 2; Audio Treatment)

Rather than taking responsibility for the incident or questioning what role he might play in changing the situation, the participant places blame on the student who enters late.
This was typical of a Routine reflection in the what section.

Emotions

A significant main effect was found for scores on the emotions section of the CIR: F(1,22) = 4.97, p = .017. Partial eta squared (effect size) was .32, and power was .75. In other words, participants' descriptions of the feelings they had during the critical incident experience when referring to video were scored significantly higher on average (p < .05) than those written by participants when using audio. See Table 5 for descriptive data.

Table 5
Descriptive Statistics for CIR Scores on the Emotions Section

Treatment	Mean	SD	N
Video	2.3478	.57277	23
Audio	1.8696	.81488	23
Memory	2.2174	.73587	23

As in the previous section, the highest level for the emotions section was 3 (Dialogic) and the lowest was 1 (Routine). Using the reflective process to gain new insights into teaching is another dimension of the Dialogic exemplar (Ward & McCotter, 2004). Through inquiry, participants engage in ongoing questions about their practice to facilitate changes in their beliefs or professional practice. The following example represents such a Dialogic score:

I felt much more confident and relaxed with this particular incident than any of the previous Micro-teachings. I also tried to hold back and give the students the opportunity to answer my questions and respond to each other which was difficult/frustrating at times (especially because of time constraints); however, I realize that giving students the time to answer questions is extremely important.
Holding back and giving the students time to answer was especially difficult because I could tell one student was struggling more with the concept and examples of adaptation than the other. (Participant 15; Reflection 3; Video Treatment)

The focus on students leads the participant to change how she approaches classroom discourse. She pays particular attention to the struggling student.

A score of 2 (Technical) was assigned when participants illustrated concern about a specific teaching task rather than examining their emotions, gaining insights into improving practice, or questioning their instructional solutions. The following is an example:

Although my glass lesson was not nearly as exciting as the previous week's lesson on water, I was glad that my group of students was able to gain something from the lesson. Also, I only used 11 of my allotted 15 minutes, and really wish I had planned more to utilize that time. (Participant 35; Reflection 2; Memory Treatment)

A score of 1 (Routine) generally illustrated a lack of analysis. For example, in the passage below, the PST used "snack ingredients" to help students conceptualize the parts of a cell: "I was excited to teach this lesson. I wanted to make the students feel comfortable with exploring the cell in an interactive way" (Participant 54; Reflection 3; Audio Treatment).

Due in part to a lack of detail, the PST is unable to make any connections among her emotions, what she paid attention to during her teaching, and any changes in her approach that might be necessary.

Perspectives

A significant main effect was found for scores on the perspectives section of the CIR: F(1,22) = 4.40, p = .025. Partial eta squared (effect size) was .30, and power was .70.
In other words, participants' descriptions of their critical incidents written from the perspective of each actor in the incident (done when referring to video) were scored significantly higher on average (p < .05) than those written by participants when using audio. See Table 6 for descriptive data.

Table 6
Descriptive Statistics for CIR Scores on the Perspectives Section

Treatment	Mean	SD	N
Video	2.0870	.79275	23
Audio	1.4783	.66535	23
Memory	1.9130	.73318	23

The highest score in this section was 3 (Dialogic), and the lowest was 1 (Routine). As the term implies, an aspect of a Dialogic reflection is a dialogue with others or with self in which the participant considers the views of others. The perspectives section of the CIR renders this internal dialogue explicit. The following is a typical Dialogic example from the video treatment group. In this example, the PST reflected on an incident that occurred after a minilecture on waves and light, in which a student posed a question about dreaming in color. The PST focuses on what multiple students may be thinking as he facilitated the discussion:

Jazmin (student): This is kind of interesting, I've heard this thing about dreams before. I wonder if it's true. The purpose of this experience is for us to learn new scientific information.

Spencer (student): I always dream in black and white.
That matches up with my life experience. It reminds me of this one dream that I had, which was crazy!

Matt (teacher): I'm happy to answer questions and give you guys some new scientific insights; we're starting to get off track here, and I feel like the balance of our time is swinging away from topical learning into relaxed, social time. I need to bring things back under control if I'm going get anything done… (Participant 53; Reflection 1; Video Treatment)

Reflections receiving a score of 2 (Technical) often had a narrow focus on a teaching task but did not demonstrate thinking about the situation from multiple perspectives. In the following example, Participant 54 conducted a microteaching about cells and focused her critical incident on an instructional method used during the lesson:

I choose to be very interactive with students because I wanted to engage them in during the lesson. In addition, I wanted to keep their attention. The students seemed to be very receptive. The students commented that they wanted to come over because they were interested in knowing what the activity would be. (Participant 54; Reflection 3; Audio Treatment)

A score of 1 (Routine) was typically assigned to reflections that were short and written as if the PST completed the analysis for its own sake (Ward & McCotter, 2004). For example, after listening to the audio recording of the lesson, this PST reflected on what a student thought about the lesson on why ice floats: "From a student's perspective: The lesson went well. You were able to make water interesting" (Participant 35; Reflection 1; Audio Treatment).

As illustrated in these examples, these reflections lacked insights into students' thinking as well as the participant's own thinking.
Thus, the PST forfeited any problem solving or insights that might have occurred had she considered all participants' perspectives.

Limitations

Participants wrote their reflections while together in a large classroom. Thus, even though they used headsets to listen to recordings of their critical incidents, they may have experienced some distraction. All participants video-recorded themselves teaching and edited the video for critical incidents before writing their reflection papers. This may have influenced their reflective writing in different ways. For example, a small number of the participants who were in the memory-only treatment group still seemed to reference video in their guided reflection papers. All three raters in this study were authors of this paper; for this reason, multiple procedures, including blind initial rating, were put in place to reduce researcher bias as much as possible. As mentioned earlier in the paper, not all participants completed all three treatments, which is why the N reported in the results section is 23 rather than 28. The small sample size also adversely affected statistical power. Future studies will include a larger sample.

Finally, the Ward and McCotter rubric has a specific focus on student learning and teacher practice. The CIR was not designed specifically for that, but rather to help the PSTs focus on the meaning of any incident that made them take notice of their practice rather than on only the experience of it (Griffin, 2003). Thus, if the focus of participants' written reflection papers did not align with the intent of the Ward and McCotter instrument, it could have resulted in a lower score.
Although this may be viewed as a limitation, we were interested in fostering the kind of reflective practice promoted by Ward and McCotter (2004) and their rubric.

Discussion

With regard to individual sections of the CIR, it is noteworthy that the significant differences were found on the first three sections, which were developed to help PSTs describe, focus on, and recall the incidents. This result may reflect the fact that video provides a richer and more accurate record of a teaching incident from which to reflect than audio does.

Also of note, the Emotions and Perspectives sections of the CIR were specifically designed to elicit participants' emotions and to shift participants' focus away from themselves. This is consistent with studies showing that working with video of one's own teaching can elicit a combination of cognitive and emotional processes that may influence teacher learning (Seidel et al., 2011). Researchers have also found that viewing video of one's own teaching can help PSTs shift focus from themselves to others (Calandra, Gurvitch, & Lund, 2008; Santagata, 2009; Sherin & van Es, 2009; van Es & Sherin, 2008).

No significant difference was found between treatments on the last three individual sections of the CIR. This may have been a result of the small sample size, or because the medium (video, audio, none) had less of an influence when participants wrote about Cultural Relevance, (Teacher) Position, or (Future) Actions.
Similar studies have also reported varying levels of reflection resulting from media conditions, but dependent on the type of question asked in the reflection instrument (Bergman, 2015; Welsch & Devlin, 2007).

Media in and of itself made less of a difference in this study; rather, certain attributes of media may have been more or less supportive of different types of learning from PSTs' own teaching (Clark, 1983; Kozma, 1994). Regardless of where one stands on the importance of media for learning, these results are important because video is currently being used on such a large scale in teacher education.

Finally, while the literature contains many reports that support the use of digital video for PST development, some in the past have warned that the process can feel difficult, cumbersome, and intrusive for participants and their students, which, in turn, adversely affects levels of participation in video-aided teacher reflection exercises. To a large extent, this effect has been a result of the seemingly large amount of effort required to record footage of classroom teaching and, in some cases, a steep learning curve for video editing software. We have found that using mobile devices for teacher education purposes has made the process easier because (a) using a mobile device to capture video is now a rather commonplace activity, (b) mobile devices are comparatively more familiar and less obtrusive in classroom contexts than professional cameras or recorders, and (c) users can capture, edit, and share video footage with relative ease, all on the same mobile device.

The purpose of this study was to examine what happened when a group of PSTs used video prompts, audio prompts, or memory alone during a guided reflective writing exercise.
In conclusion, we found that reflection papers written while referencing video of critical teaching incidents were of significantly higher quality than those written while referencing audio.

References

Bergman, D. (2015). Comparing the effects of classroom audio-recording and video-recording on preservice teachers' reflection of practice. The Teacher Educator, 50(2), 127-144.

Brantley-Dias, L., Calandra, B., & Fox, D. (2007). Teacher candidates' experiences with digital video editing for reflection: How much scaffolding do they need? Paper presented at the annual meeting of the American Educational Research Association, Chicago, IL.

Brophy, J. (Ed.). (2004). Using video in teacher education. San Francisco, CA: Elsevier.

Calandra, B. (2014). A process of guided, video-based reflection. In B. Calandra & P. J. Rich (Eds.), Digital video for teacher education: Research and practice. New York, NY: Routledge.

Calandra, B., Brantley-Dias, L., Lee, J. K., & Fox, D. L. (2009). Using video editing to cultivate novice teachers' practice. Journal of Research on Technology in Education, 42(1), 73-94.

Calandra, B., Gurvitch, R., & Lund, J. (2008). An exploratory study of digital video editing as a tool for teacher preparation. Journal of Technology and Teacher Education, 16(2), 137-153.

Calandra, B., & Rich, P. J. (Eds.). (2014). Digital video for teacher education: Research and practice. New York, NY: Routledge.

Calandra, B., Sun, Y., & Puvirajah, A. (2014). A new perspective on teachers' video-aided reflection. Journal of Digital Learning in Teacher Education, 30(3), 104-109.

Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53(4), 445-449.

Dewey, J. (1933). How we think: A restatement of the relation of reflective thinking to the educative process.
Boston, MA: D. C. Heath.

Gaudin, C., & Chaliès, S. (2015). Video viewing in teacher education and professional development: A literature review. Educational Research Review, 16, 41-67.

Griffin, M. L. (2003). Using critical incidents to promote and assess reflective thinking in preservice teachers. Reflective Practice, 4(2), 207-220.

Hofer, M., & Grandgenett, N. (2012). TPACK development in teacher education: A longitudinal study of preservice teachers in a secondary MA Ed. program. Journal of Research on Technology in Education, 45(1), 83-106.

Jong, O. D., Van Driel, J. H., & Verloop, N. (2005). Preservice teachers' pedagogical content knowledge of using particle models in teaching chemistry. Journal of Research in Science Teaching, 42(8), 947-964.

Kozma, R. (1994). Will media influence learning? Reframing the debate. Educational Technology Research and Development, 42(2), 7-19.

Lai, G., & Calandra, B. (2010). Examining the effects of computer-based scaffolds on novice teachers' reflective journal writing. Educational Technology Research and Development, 58(4), 421-437.

Lai, G., Calandra, B., & Ma, Y. (2008). Leveraging the potential of design-based research to enhance preservice teachers' online reflective practice: A case study. In C. Crawford et al. (Eds.), Proceedings of Society for Information Technology and Teacher Education International Conference 2008 (pp. 1132-1139). Chesapeake, VA: AACE.

Lee, H. J. (2005). Understanding and assessing preservice teachers' reflective thinking. Teaching and Teacher Education, 21(6), 699-715.

Mayer, R. E. (2005). Introduction to multimedia learning. In R. E. Mayer (Ed.), The Cambridge handbook of multimedia learning (pp. 1-16). New York, NY: Cambridge University Press.

Paivio, A. (1986).
Mental representations: A dual coding approach. Oxford, UK: Oxford University Press.

Putnam, R. T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4-15.

Rich, P., & Hannafin, M. (2008). Capturing and assessing evidence of student teacher inquiry: A case study. Teaching and Teacher Education, 24(6), 1426-1440.

Rosaen, C. L., Lundeberg, M., Cooper, M., Fritzen, A., & Terpstra, M. (2008). Noticing noticing: How does investigation of video records change how teachers reflect on their experiences? Journal of Teacher Education, 59, 347-360.

Santagata, R. (2009). Designing video-based professional development for mathematics teachers in low-performing schools. Journal of Teacher Education, 60(1), 38-51.

Schön, D. A. (1983). The reflective practitioner: How professionals think in action. New York, NY: Basic Books.

Seidel, T., Blomberg, G., & Renkl, A. (2013). Instructional strategies for using video in teacher education. Teaching and Teacher Education, 34, 56-65.

Seidel, T., Sturmer, K., Blomberg, G., Kobarg, M., & Schwindt, K. (2011). Teacher learning from analysis of classroom situations: Does it make a difference whether teachers observe their own teaching or that of others? Teaching and Teacher Education, 27, 259-267.

Sherin, M. G., & van Es, E. A. (2009). Effects of video club participation on teachers' professional vision. Journal of Teacher Education, 60(1), 20-37.

Shrout, P. E., & Fleiss, J. L. (1979). Intraclass correlations: Uses in assessing rater reliability. Psychological Bulletin, 86(2), 420.

Shulman, L. S. (1987). Knowledge and teaching: Foundations of the new reform.
Harvard Educational Review, 57(1), 1-22.

Sekeres, M. J., Bonasia, K., St-Laurent, M., Pishdadian, S., Winocur, G., Grady, C., & Moscovitch, M. (2016). Recovering and preventing loss of detailed memory: Differential rates of forgetting for detail types in episodic memory. Learning & Memory, 23(2), 72-82.

Sun, J., & van Es, E. A. (2015). An exploratory study of the influence that analyzing teaching has on preservice teachers' classroom practice. Journal of Teacher Education, 66(3), 201-214.

van Es, E. A., & Sherin, M. G. (2008). Mathematics teachers' "learning to notice" in the context of a video club. Teaching and Teacher Education, 24(2), 244