{"id":6382,"date":"2016-05-10T14:22:44","date_gmt":"2016-05-10T14:22:44","guid":{"rendered":"https:\/\/citejournal.org\/\/\/"},"modified":"2016-11-07T21:53:13","modified_gmt":"2016-11-07T21:53:13","slug":"enabling-collaboration-and-video-assessment-exposing-trends-in-science-preservice-teachers-assessments","status":"publish","type":"post","link":"https:\/\/citejournal.org\/volume-16\/issue-2-16\/science\/enabling-collaboration-and-video-assessment-exposing-trends-in-science-preservice-teachers-assessments","title":{"rendered":"Enabling Collaboration and Video Assessment: Exposing Trends in Science Preservice Teachers\u2019 Assessments"},"content":{"rendered":"

Teacher reflection, practice, and improvement are important concepts within the broader context of improving education in the science, technology, engineering, and mathematics (STEM) fields, supporting underrepresented groups in STEM, building K-20 and industry partnerships, using social media in education, and assessment. As such, Rich and Hannafin (2009) called for "evidence of impact" of using video and reflection with preservice teachers (p. 64). Preservice and newer in-service teachers often struggle with reflection and, inherently, self-assessment. Interestingly, a significant discrepancy also arises between the peer assessment and self-assessment of end products. In developing culminating products such as videos and self-reflection documents, which are required by many teacher licensure programs, preservice teachers could benefit from additional peer support in order to improve their self-assessment and reflection skills.

Currently, online video feedback systems consist of a binary like or dislike judgment, supplemented by disjointed and unfocused open-response comments. These systems of peer assessment and feedback offer little constructive benefit to video creators. Viewers also face a challenge when providing summative assessment of videos. While binary feedback on a video is a snap judgment, defining, describing, and justifying the reasons behind that judgment is challenging due to the number of variables and the multiple points of reference under consideration throughout the video's duration.

Aggregated binary assessments, the type typically available on video sharing sites, are analogous to students' receiving a set of pass/fail grades from all of their teachers, each of whom uses a private and unique scoring rubric. While popular social media sites such as YouTube and Facebook, as well as online learning sites like Coursera and Khan Academy, all allow for discussions of video content, their decoupled free-response structures do not allow for a continuous formative assessment of the original content but, rather, a highly variable summative assessment based on the final opinion of the content viewer.

A different approach or tool that provides continuous feedback could promote positive attitudes toward technology use, which Cullen and Greene (2011) showed to predict intrinsic and extrinsic motivation. With this in mind, we created YouDemo, an online tool, and used it with preservice teachers to examine several aspects of their assessment practice. This study focuses on the discrepancies between peer assessment and self-assessment, the relationship and bias between formative and summative assessment abilities, and the impact of assessing the work of peers and comparing it to one's self-assessment of similar work.

Purpose/Problem/Gap in Literature

Within the context of educational assessment, a binary rating system provides a weak summative and nonconstructive evaluation of the overall product. The evaluation becomes a function of an individual viewer's personal lens and is not based on a precisely defined metric (characteristic or quality) or metrics over the course of the entire work. Currently, video annotation is predominantly composed of tools that allow for nonaggregating, text-based markup of videos. These tools include standalone PC applications such as VCode (http://social.cs.uiuc.edu/projects/vcode.html) and ANVIL (http://www.anvil-software.org/), as well as online applications that are not typically freely accessible to teachers, such as VideoPaper (https://vpb.concord.org/) and MediaNotes (http://www.cali.org/content/medianotes/). Presently, the only known free annotation tool is VideoANT (https://ant.umn.edu/), which allows text-based annotations to YouTube videos (Hosack, 2010).

In an educational setting, in the age of traditional online courses and massive open online courses (MOOCs), online video-based critiques and assessment by peers and mentors can lack the depth and richness of in-person critiques and debates (Rich & Hannafin, 2009). Practice with video assessment and self-reflection is critical, because many preservice teachers are now subject to edTPA requirements (Barron, 2015) and must submit teaching videos and showcase their ability to reflect and self-assess. Their video submissions are critical to their final edTPA scores.

The tool presented here, YouDemo.org, targets preservice teachers, their K-12 mentor teachers, and university professors who are interested in critiquing peer videos and receiving aggregated evaluation feedback on their own videos. The tool links to existing YouTube videos, allows continuous critique of two metrics (or qualities), and provides a user access to the aggregated assessment.

YouDemo enables the continuous assessment of two video-creator-defined metrics. For the remainder of this article, "video creators" include users who create or upload videos, while "video evaluators" or "assessors" are those who provide feedback for the videos. Creators can view the results of aggregated quantitative metric assessment as well as qualitative feedback provided by evaluators. Creators can then evaluate, reflect, and compare their own self-assessment with an aggregate of their peers' anonymous assessment of their work. This process allows video creators to gain authentic summative and formative feedback on their videos, which promotes reflection and pedagogical questioning.

YouDemo provides a teaching mechanism for both formative and summative assessment that can support and enable learning at all levels of education. Additionally, the validity and reliability of tools or assignments used in the classroom are important assessment aspects, and YouDemo underwent this scrutiny. As stated by Mertler (2003),

Evidence must be continually gathered and examined in order to determine the degree of validity possessed by decisions. Three formal sources of evidence that support the existence of validity include content, criterion, and construct evidence. Content evidence relies on professional judgment; whereas, criterion and construct evidence rely on statistical analyses. Content evidence of validity is the most important source of evidence for classroom assessments. As with validity, reliability addresses assessment scores and their ensuing use. (p. 66)

Over the course of 5 years, we trialed the continuous evaluation and video data aggregation at three universities in North America. In order to assess the impact of the tool, we conducted a mixed-methods study in which a subset of the trial participants' feedback on their own videos and on those of their peers was captured before and after the tool's use.

Although continuous rating and evaluation of a target source is not a new concept, having been used in election debates (Yang & Park, 2014), behavior coding practices (Messinger, Mattson, Mahoor, & Cohn, 2012), and even emotional response to music videos (Soleymani, Pantic, & Pun, 2012), we found no teaching connection. Thus, the new technology used during our study provided preservice teachers with the means to collect peer assessment of any two instructor-selected video content qualities (such as content clarity, sound level, humor, evidence of data collection, evidence of data analysis, and others).

Other potential use cases include K-20 teachers collecting critique feedback on student work from a class of students, K-20 students collecting critique feedback on their own or a group's work from a class or panel of teachers, or administrators collecting feedback on their own work, teacher work, or student work.

To the best of our knowledge, no other online or free tool exists that allows continuous assessment of videos. Furthermore, no tools exist that allow users to specify and enforce the metric, or criteria, that they wish to have evaluated. The tool presented in this study, YouDemo, is a free tool for continuous, metric-focused evaluation of videos, enabling formative, anonymous peer assessment as well as experience in self-reflective practice.

Theoretical Framework and Literature Review

In using the video assessment technology, we embraced a social constructivist view (Vygotsky, 1978) as a theoretical framework. Focusing on the social process of learning while the preservice teachers critiqued the videos of themselves and their peers, rather than only on the final product produced (the video itself), was paramount. Since STEM education is currently in the US national spotlight (Air Force Studies Board National Research Council, 2010; Bush, Karp, Popelka, & Bennett, 2012; National Governors Association Center for Best Practices and the Council of Chief State School Officers, 2010; National Science Board, 2012; National Council of Teachers of Mathematics, 2012; NGSS Lead States, 2013; National Science Teachers Association, 2012), gaining insights into STEM education video production and critique using a social constructivist perspective is important to consider while emphasizing critical content. Additionally, it is important to include the perspectives and assessments of currently underrepresented groups in STEM fields, such as minorities, students from low socioeconomic backgrounds, and women (Lehming, Gawalt, Cohen, & Bell, 2013), and to consider how a technology implementation (like the tool presented here) might engage these groups with STEM content and purpose.

Partnership building and sustained collaboration are extremely important for mutually beneficial interaction between STEM and educational partners (Borowczak, 2015; Burrows, 2011, 2015). Through the use of video technology that provides explicit feedback, teacher-to-student and student-to-student dyads can strengthen their collaboration efforts and partnerships through directed and focused reflection. Over the years, video has been used to assess pre- and in-service teachers (Hannafin, Shepherd, & Polly, 2009), enhance learning (Clarke, Flaherty, & Mottner, 2001; Williams, Farmer, & Manwaring, 2008), build technical skills for careers (Clarke et al., 2001; Hunt, Eagle, & Kitchen, 2004), promote more efficient teaching and better learning (Hunt et al., 2004; Kpanja, 2001), increase student understanding (Dillon & Gabbard, 1998), and increase student participation and teamwork (Sweeney & Ingram, 2001; Ueltschy, 2001), amongst other outcomes.

Thus, we embraced the whole scene of learning, that is, the process that leads to the product, as expressed in sociocultural theory. The individual parts in isolation do not create the scene. Using the whole scene within context will sharpen the understanding of how STEM education videos and their peer and instructor critiques can affect learning and understanding for the K-20 student audience.

Building partnerships and collaborations through interaction is not limited to face-to-face meetings, as technology interactions can build partnerships and learning as well. McCabe and Meuter (2011) examined the seven principles for good relationship practices: (a) encouraging contact between faculty and students, (b) encouraging reciprocity and cooperation among students, (c) encouraging active learning, (d) giving prompt feedback, (e) emphasizing time on task, (f) communicating high expectations, and (g) respecting diverse talents and ways of learning (Chickering & Gamson, 1987).

Determining whether a tool enhances one or more of the seven principles is vital, as technology is one method to augment learning (McCabe & Meuter, 2011). Examining technologies and choosing the right one enables instructors to differentiate student instruction (Jones & Cuthrell, 2011).

With 83% of young adults using social networking sites (McCabe & Meuter, 2011; Taylor & Keeter, 2010; Zickuhr, 2010), video is already a part of the daily life of most in-service teachers. "Video adds a new dimension to the ways in which teaching and learning can be viewed, described, and interpreted. In particular, the literature emphasizes that video footage enables data collection and analysis to be an ongoing and iterative process" (Fitzgerald, Hackling, & Dawson, 2013, p. 61). Web 2.0 technologies are infiltrating schools of every level (Jones & Cuthrell, 2011). "The 21st century science classroom now contains nontraditional teaching tools, including laptops, personal digital assistants, and digital measuring devices" (Bang & Luft, 2013, p. 118).

University faculty members are utilizing YouTube and other social networking sites to distribute details of events and ideas (Haase, 2009). "YouTube can be used as a tool to inform and display and as a forum for critical analysis and commentary" (Jones & Cuthrell, 2011, p. 76). K-20 students are producing YouTube videos and displaying their own work in various settings, such as art and science classrooms (Sweeney & Ingram, 2001).

As Liberatore (2010) stated, "It is clear that the tech-savvy students of the net generation enjoy finding and sharing the videos" (p. 215). Acknowledging, then, that students would also like sharing self-produced videos is not a huge leap, and those self-produced projects allow for an authentic learning experience (Kearney & Schuck, 2006).

Preservice teachers can benefit from recording and analyzing their own lessons (Friend & Militello, 2014; Star, Lynch, & Perova, 2011; Van Es & Sherin, 2008). However, preservice teachers who are new to video self-observation tend to hyperfocus on their teaching methods (Fadde & Sullivan, 2013b). While coding videos can be daunting (de Mesquita, Dean, & Young, 2010), peer critique with classroom partners using video sharing and Web 2.0 technologies can generate discussion and learning with preservice teachers (Fadde & Sullivan, 2013b; Heintz, Borsheim, Caughlan, & Juzwik, 2010; Star et al., 2011).

Providing preservice teachers with opportunities to practice analyzing videos of other peer preservice teachers may help the video creators eventually to evaluate video recordings of themselves (Fadde & Sullivan, 2013b). Research shows that well-defined and challenging but achievable tasks with immediate feedback are critical for skill improvement. The opportunity to correct errors and repeat the process until skills become more routine is also vital (Williams et al., 2008).

There are limitations to technology use such as video assessment, and some tools will work better than others in different situations (McCabe & Meuter, 2011). Preservice teachers who are new to video self-observation tend to notice only their teaching delivery (Fadde & Sullivan, 2013a; Kagan & Tippins, 1991; Wang & Hartley, 2003). Using peer critique first is beneficial, since the focus is on peer teaching and the delivery is only one piece to assess (Kagan & Tippins, 1991).

Methods

To determine the usefulness of YouDemo in real-world preservice teacher applications, we tracked responses and solicited feedback on the tool itself. YouDemo, used in the study presented here as well as in prior studies (Borowczak & Burrows, 2011; Burrows & Borowczak, 2014), enabled the assessment of online videos. To date, YouDemo has been utilized in over 900 assessments of 170 videos by over 131 unique users. The tool does not edit, store, or manipulate videos in any way; rather, it links to videos already hosted on the Internet (e.g., YouTube).

YouDemo targets three main users: a video creator (e.g., a preservice student), peer evaluators (e.g., a student's peers), and an expert assessor (e.g., a student's instructor). Each of these users plays a different role in any assessment cycle. Figure 1 shows the five main stages within the continuous video assessment cycle, as well as the user associated with each stage: video linkage (video creator), assessment requests (video creator and expert assessor), peer assessment (peer evaluators), aggregate assessment review (video creator and expert assessor), and sharing of results (video creator).

\"Figure
Figure 1.<\/strong> The continuous cycle of video assessment: Creation, sharing, assessment, assessment aggregation, and sharing of assessment results.<\/figcaption><\/figure>\n

Stage A: Video Linkage for Assessment

YouDemo does not store any videos; rather, it relies on an existing video sharing site such as YouTube to store and play back videos. Users wishing to add a video to YouDemo simply link to an existing online video. During this process, the user has the opportunity to provide additional details about the video: the class it pertains to, summary details, and, most importantly, the two metrics that they want continuously assessed throughout the video playback. Figure 2 shows the linking process.

\"Figure
Figure 2.<\/strong> The linking process consists of five required fields including a YouTube address, a video name, two metrics and a video summary. Optionally, the video creator can select a course and enter a course PIN (personal identification number) as defined by the course instructor.<\/figcaption><\/figure>\n
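Since YouDemo's implementation is not published, the following is a minimal sketch of what the Stage A linking step might look like. The field names and the VideoLink structure are illustrative assumptions, though the required fields mirror those shown in Figure 2.

```typescript
// Hypothetical sketch of the Stage A linking step; names are assumptions.
interface VideoLink {
  youtubeId: string;          // parsed from the submitted YouTube address
  name: string;               // video name
  metrics: [string, string];  // the two creator-defined metrics
  summary: string;            // video summary
  coursePin?: string;         // optional instructor-defined course PIN
}

// Extract the 11-character YouTube video ID from common URL forms.
function parseYouTubeId(url: string): string | null {
  const match = url.match(/(?:youtube\.com\/watch\?v=|youtu\.be\/)([\w-]{11})/);
  return match ? match[1] : null;
}

function linkVideo(url: string, name: string, metricA: string, metricB: string,
                   summary: string, coursePin?: string): VideoLink {
  const youtubeId = parseYouTubeId(url);
  // All five fields shown in Figure 2 are required.
  if (!youtubeId || !name || !metricA || !metricB || !summary) {
    throw new Error("All required fields must be provided.");
  }
  return { youtubeId, name, metrics: [metricA, metricB], summary, coursePin };
}
```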


Stage B: Disseminating Assessment Requests Using Social Media

Recognizing that today's students enjoy sharing online videos (Liberatore, 2010), the YouDemo implementation connects to several popular social media platforms, including Facebook, Twitter, and Google+ (see Figure 3). This feature allows both creators and evaluators to share, promote, and comment on the videos that they have added or previously assessed. This type of propagation allows for an increased assessment population sample beyond the traditional confines of the typical classroom. A sketch of how such share links can be constructed follows Figure 3.

\"Figure
Figure 3.<\/strong> An example of the social media integration available to video creators.<\/figcaption><\/figure>\n
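Share integrations of this kind are commonly built on the platforms' public share URLs rather than on authenticated APIs. The sketch below uses the well-known share endpoints for Facebook, Twitter, and Google+ (the last now defunct, but current at the time of the study); whether YouDemo used exactly these endpoints is an assumption.

```typescript
// Construct share URLs for a YouDemo assessment request.
// These are the platforms' public share endpoints; YouDemo's exact
// integration details are an assumption.
function shareLinks(videoUrl: string, message: string): Record<string, string> {
  const url = encodeURIComponent(videoUrl);
  const text = encodeURIComponent(message);
  return {
    facebook: `https://www.facebook.com/sharer/sharer.php?u=${url}`,
    twitter: `https://twitter.com/intent/tweet?url=${url}&text=${text}`,
    googlePlus: `https://plus.google.com/share?url=${url}`, // defunct since 2019
  };
}

// Example (hypothetical URL):
// shareLinks("https://youdemo.org/video/123", "Please rate my demo video!");
```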

Stage C: Video Assessment

The video assessment portion of YouDemo consists of three main areas: the video playback panel, a live assessment stream panel, and an information panel with video details and statistics. The video playback occurs using an interface similar to other online video sites, with a play/pause button. As seen in Figure 4, the assessment stream (and the data collected from it) is controlled by the evaluator throughout the entire video using either the four directional keyboard arrows or, on mobile devices, the four onscreen arrows. Evaluators see both a historical summary of their ratings and an instantaneous qualitative mapping of the current rating using the mappings in Table 1. Since the primary objective of the tool is to gather information in real time, an evaluator can pause the video without restarting the entire video and rating process. Upon completion of the video playback, the collected live assessment scores are stored by the YouDemo tool.

\"Figure
Figure 4.<\/strong> The assessment page containing three separate panels:\u00a0 the video panel, the assessment streams, and information panel.<\/figcaption><\/figure>\n
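A minimal sketch of how such an arrow-key rating stream could be captured follows. The sampling model (timestamped value changes), the key-to-metric mapping, and the mid-scale starting value are all assumptions, since the article describes the interaction but not the underlying data model.

```typescript
// Sketch of a Stage C rating stream: timestamped Likert values per metric.
type RatingSample = { videoTime: number; metric: 0 | 1; value: number };

class RatingStream {
  private samples: RatingSample[] = [];
  private values: [number, number] = [5, 5]; // assume both metrics start mid-scale

  // Assumed mapping: up/down adjust metric 0, right/left adjust metric 1.
  handleKey(key: string, videoTime: number): void {
    const deltas: Record<string, [0 | 1, number]> = {
      ArrowUp: [0, 1], ArrowDown: [0, -1],
      ArrowRight: [1, 1], ArrowLeft: [1, -1],
    };
    const entry = deltas[key];
    if (!entry) return;
    const [metric, delta] = entry;
    // Clamp to the 0-10 scale used by Table 1.
    this.values[metric] = Math.max(0, Math.min(10, this.values[metric] + delta));
    this.samples.push({ videoTime, metric, value: this.values[metric] });
  }

  // Called when playback completes; YouDemo stores the collected stream.
  finish(): RatingSample[] {
    return this.samples;
  }
}
```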

Table 1
The Likert-Scale to Qualitative Text Mapping in the Current Implementation

Likert Value     Qualitative Text
0-1              Non-Existent
2-3              Lacking
4-6              Average
7-8              Good
9-10             Excellent
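The mapping in Table 1 reduces to a simple threshold function; a minimal sketch:

```typescript
// Map a 0-10 Likert value to the qualitative labels of Table 1.
function likertToText(value: number): string {
  if (value <= 1) return "Non-Existent";
  if (value <= 3) return "Lacking";
  if (value <= 6) return "Average";
  if (value <= 8) return "Good";
  return "Excellent";
}
```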

Stage D: Aggregated Video Assessment Results

To collect meaningful and useful peer-assessment data, a process should guarantee assessor anonymity while providing aggregated summative assessment. This process is at the core of YouDemo. YouDemo enables the collection of assessment data on any two video metrics, as defined by the video creator. The tool allows video creators to view assessment results only in aggregate across all assessors. Figure 5 shows an example of the aggregation process, where the average of all the individual evaluator ratings forms an aggregate rating, ultimately shown to the video creator. This convention fundamentally handles the key hurdles of anonymity and aggregation.

\"Figure
Figure 5<\/strong>. Aggregation of evaluators\u2019 assessment scores anonymizes individual assessment scores.<\/figcaption><\/figure>\n
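Reusing the RatingSample type from the Stage C sketch, the averaging step of Figure 5 might look like the following. Binning samples by one-second intervals of video time is an assumption, as the article does not specify the aggregation granularity.

```typescript
// Average many evaluators' rating streams into one anonymized aggregate.
// Samples are grouped into time bins and averaged across all evaluators.
function aggregate(
  streams: RatingSample[][],
  binSeconds = 1,
): Map<number, number> {
  const sums = new Map<number, { total: number; count: number }>();
  for (const stream of streams) {
    for (const s of stream) {
      const bin = Math.floor(s.videoTime / binSeconds);
      const acc = sums.get(bin) ?? { total: 0, count: 0 };
      acc.total += s.value;
      acc.count += 1;
      sums.set(bin, acc);
    }
  }
  // Only the per-bin averages are exposed to the video creator (Figure 5).
  const averages = new Map<number, number>();
  for (const [bin, { total, count }] of sums) {
    averages.set(bin, total / count);
  }
  return averages;
}
```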

Figure 6 shows the aggregated results as presented to video creators in YouDemo. In the current implementation of the technology, the video creator has access to a graphical representation of the metric score over time, as well as what was "liked" and what needs "potential change." Once a video is assessed, the collected data are stored, processed, and used to derive a new aggregated assessment summary, which includes both the two continuous quantitative metrics and several qualitative open-response questions that follow the video. These mixed (quantitative and qualitative) data provide the creator insight into how the video is perceived by others in both the context of the metrics selected and the evaluator's personal lens.

Stage E: Sharing Video Assessment Results

The ability to share aggregated evaluations allows video creators such as preservice teachers to disseminate results to an instructor, an interviewer, a mentor teacher, or even their peers in order to understand more global trends. While the ability to disseminate results is not central to the scope of this work, it may be of particular interest in the context of classroom and online instruction when the number of students makes individual assessment infeasible.

The continuous video rating technology allows for peer critique of teaching videos. While the focus of this discussion is on its use in university-level secondary science methods courses, implementation of this technology in other K-20 classrooms might require modification of the ways video creators add videos and metrics.

While the tool has been presented as a peer-to-peer assessment tool, another expected use is as an instructor-to-student tool in which a classroom instructor could upload a video for a flipped classroom and have metrics of "Does this make sense?" and "Are you learning?" Students would be required not only to watch the video before class but to engage actively in rating it. A teacher could then easily view the students' overall self-assessment of the material, as well as how engaging its presentation was, before meeting the students in class.

Study

While we have been using YouDemo for 5 years with 76 preservice science teachers, this study focused on 27 preservice science teachers' use of YouDemo over 2 years as they provided feedback to us in written and electronic forms. The preservice science teachers were a mixed group of undergraduates and graduates obtaining degrees in both a STEM subject and science education. As part of their degree requirements, they took a course on how to teach science within the context of STEM integration. The course required them to create two videos per class and post them to YouTube. The videos took the form of STEM demonstrations directed at a K-12 student audience, STEM hot-topic commercials, and practice teaching sessions (micro-teaches). The instructor (second author Burrows) provided guidelines that the videos should run between 2 and 10 minutes, highlight specific STEM content, and relate to real-world STEM applications in an engaging manner.

The study relies on three datasets: (a) participant self-assessment before and after their use of YouDemo (pre/post self-assessment), (b) written peer assessments of participant videos, and (c) YouDemo assessment data of participant videos. The three datasets contained both quantitative and qualitative data.

The participant self-assessment consisted of summative assessment of the participant's own video with respect to two metrics. The self-assessment also contained open-response questions asking why the participant chose that self-assessment score. The written peer assessment asked the same questions as the self-assessment (summative assessment of two metrics per video and open response); each video was peer assessed by two fellow students. Finally, the YouDemo assessment data contained formative assessment data, tracking two assessment metrics throughout the entirety of the video, as well as open-response data concerning the specific qualities of the video. The questions were as follows: