Purmensky, K., Xiong, Y., Nutta, J., Mihai, F., & Mendez, L. (2020). Microcredentialing of English learner teaching skills: An exploratory study of digital badges as an assessment tool. Contemporary Issues in Technology and Teacher Education, 20(1). https://citejournal.org/volume-20/issue-1-20/current-practice/microcredentialing-of-english-learner-teaching-skills-an-exploratory-study-of-digital-badges-as-an-assessment-tool

Microcredentialing of English Learner Teaching Skills: An Exploratory Study of Digital Badges as an Assessment Tool

by Kerry Purmensky, University of Central Florida; Ying Xiong, University of Central Florida; Joyce Nutta, University of Central Florida; Florin Mihai, University of Central Florida; & Leslie Mendez, University of Central Florida

Abstract

Digital badges are a promising innovative tool to support teacher candidates’ instructional skill development. Although digital badges are increasingly utilized in online teaching and learning, their effectiveness is still under investigation. This exploratory study reports on 151 elementary level teacher candidates’ participation and success rate in a digital badge system named MELTS, which was specifically designed for cultivating, assessing, and recognizing 10 specific English learner teaching skills. To earn a digital badge, participants in the study were required to (a) pass online module assessments, (b) participate in coached skill practices, and (c) effectively demonstrate mastery of targeted teaching skills before an expert panel. Findings show that participants who completed the online modules and skills practices were successful in demonstrating the targeted teaching skills to receive MELTS badges. Although participants reported a positive experience in the skill practice sessions, the participation rate in the badging sessions was lower than expected. Implications and challenges are discussed.

Teacher preparation programs in the U.S. need to be proactive about meeting the needs of English learners (ELs) in the K-12 system by including targeted coursework and training for future teachers (Niehaus & Adelson, 2014; Nutta, Mokhtari, & Strebel, 2012). Data from the National Center for Education Statistics (NCES, 2019) show that the percentage of public school students who were ELs was higher in fall 2016 than in fall 2000 for all but seven states and the District of Columbia, with nine states having 10% or more ELs in public schools. Federal policy requires that states assess the English language proficiency of ELs annually, provide reasonable accommodations on state assessments, and develop statewide accountability systems that include goals and measures of progress for ELs (U.S. Department of Education, 2018).

Despite this growing need for teachers to develop EL teaching skills, gaps remain in the quality and effectiveness of training provided by teacher preparation programs. For example, Johannessen, Thorsos, and Dickinson (2016), authors of the 2014 report for the National Council on Teacher Quality, found that only 24% of teacher preparation programs train elementary teacher candidates (TCs) in EL support strategies. Research shows that TCs report feeling less prepared to teach ELs than native speaker students in their future classrooms (Nutta et al., 2012; Pappamihiel, 2007). Strong arguments support teacher preparation that provides generalist TCs with concrete theoretical knowledge, as well as practical teaching skills and experiences, in supporting ELs’ language development and academic achievement (Nutta et al., 2012).

Past years have also witnessed innovative efforts put forth by teacher educators who recognize and advocate for EL needs. For instance, the EL infusion approach, exemplified by “the addition of EL content into a general teacher preparation program in an interconnected, cohesive, and interdisciplinary manner” (Nutta et al., 2012, p. 20), has been adopted by teacher education institutions voluntarily or under state mandate in the United States and Canada to help generalist TCs become knowledgeable and skilled at teaching ELs. A study by Lavery, Youngblood, and Nutta (2015) demonstrated that TCs trained by a multileveled EL-infusion preparation program achieved comparable impact when teaching ELs and native-speaking students. As such, EL infusion in generalist teacher education holds promise for narrowing the performance gap between ELs and native speakers in the K-12 classroom.

In addition, various forms of educational technology are being developed and introduced into teacher education programs to enhance teacher skill development. Incorporating digital badging and microcredentialing into teacher education is one such novel practice designed to enhance teacher development and learning.

Enabled by digital badges (hereafter referred to as “badges”), microcredentialing is increasing as a method of assessing teacher learning. As with conventional credentials, microcredentialing is an evidence-based approach to recognize the formal or informal learning of a particular microskill or practice (Hurst, 2015). However, microcredentialing also brings added value to the traditional assessment system by incorporating badge earner control, digitalization, accumulative progress, and public circulation through social networks (Casilli & Hickey, 2016). Microcredentials are designed to give educators, especially new teachers, “clear guidance as to the most critical skills they can begin to develop, demonstrate, and add to their portfolios” (Brown & Rhodes, 2017, p. 41).

Recent years have also witnessed the rising adoption of badges as an educational tool for offering an innovative means of motivating, scaffolding, recognizing, assessing, and microcredentialing learning (Devedžić & Jovanović, 2015; Jovanović & Devedžić, 2015). Commonly taking the form of a graphic or icon, a badge not only functions as a validated online record that rewards the recipient’s accomplishment, skill, competency, interest or affiliation, it also provides metadata that reflects the context, interaction, and meaning of the badge-earning process (Gibson, Ostashewski, Flintoff, Grant, & Knight, 2015). See Figure 1 for a visualization of a digital badge from our study.

Figure 1. Question wizard badge.

To date, however, the concept and practice of badging and microcredentialing within teacher education pertaining to TCs is in early stages (Lang, 2016). Empirical research is lacking in scholarly journals related to the best practices in developing and implementing badges and microcredentials in teacher education programs.

In an EL-infusion teacher education program at a large, urban research university, the Microcredentialing of English Learner Teaching Skills (MELTS) grant research team is embarking on a 5-year project using badges to develop and credential elementary level TCs’ EL teaching skills. Through MELTS, TCs who demonstrate mastery of EL teaching skills for scaffolding the learning of ELs at beginning, intermediate, and advanced levels of English proficiency are awarded badges. In doing so, the MELTS project opens doors to exploring how badging and microcredentialing technology can be effectively utilized in teacher education. This paper introduces the MELTS badging program and reports on elementary level TCs’ participation and completion rate in earning five MELTS badges.

Background

The accelerating development of badging platforms and open badges systems (see Casilli & Hickey, 2016, for a detailed introduction of Open Badges) has been evident in recent years. Even though a wide range of educational settings have undertaken real-world implementations of badges and badging ecosystems (Jovanović & Devedžić, 2015), the rapidly growing popularity of these systems is not yet evident in the field of research. Researchers’ exploration of the educational affordance of badges is largely conceptual, typically linking digital badges to the following usages: motivation, scaffolding, lifelong learning, assessment, recognition and credentialing (Jovanović & Devedžić, 2015).

Digital Badging as a Motivational Tool

Badges as rewards for learners’ engagement or achievements have closest relevance to stakeholders such as learners and teachers. A badge is believed to have the potential to incite and sustain students’ motivation and engagement through its gaming mechanism (Antin & Churchill, 2011; Dominguez et al., 2013; Kopcha, Ding, Neumann, & Choi, 2016); its participatory learning approaches and peer-based learning communities (Williams, Karousou, & Mackness, 2011); and its visual pathway of learning that helps students set goals and envision success (Bell, Bricker, Reeve, Zimmerman, & Tzou, 2013). As such, several empirical studies on digital badging and learner motivation yielded positive findings (Davis & Singh, 2015; Denny, 2013). They also opened a pivotal debate on how digital badging works with extrinsic and intrinsic motivation to engage learners.

Denny (2013) conducted a randomized controlled experimental study on the impact of badges on college students’ online learning participation. Data from over 1,000 students indicated that badges have a highly positive effect on students’ levels of participation in terms of quantity and length of time without compromising the quality of their contributions. Moreover, students expressed a preference for having badges incorporated into the interface of the online tool used in the study, and they reported high levels of enjoyment through the badge-earning process. 

In the high school sphere, Davis and Singh (2015) studied learner engagement with a digital badge system that awarded school credit for students’ participation in a network of afterschool programs serving youth from low income, immigrant backgrounds. Through interviews and focus groups with 43 students and 24 teachers, the study showed that participants recognized both the potential and challenges of using badges to motivate learners. They linked the motivational factor of badges to the promise of gaining reward or recognition, as well as the badges’ potential to unlock real opportunities for students if the badges were valued by external audiences. On the other hand, 32% of the adult participants expressed the concern that digital badges, operating as an extrinsic motivator, could mitigate students’ intrinsic motivation to learn. 

The fear expressed by some of Davis and Singh’s (2015) participants, that badges would decrease levels of intrinsic motivation by emphasizing extrinsic rewards, is referred to as motivation displacement (Deterding, 2011). Abramovich, Schunn, and Higashi (2013) explored this issue in their study in a middle school classroom. They found that different badge types interacted differently with learner types in affecting motivation. Low-performing students tended to earn participatory badges that provided extrinsic reward-based motivation, whereas high-achieving students responded negatively to participatory badges and instead chose to earn mastery badges that highlighted intrinsic motivation to learn content. The researchers suggested that educational digital badge and credentialing design must be based on considerations of the ability and motivations of learners. Nonetheless, the researchers remained positive about the future of digital badges in education.

Apart from boosting learner motivation and engagement in a discrete activity, digital badging and microcredentialing also play a role in supporting personalized lifelong learning through their embedded design principle (Devedžić & Jovanović, 2015; Lang, 2016). This support is especially relevant to learners who want to exercise learner agency and autonomy to pursue lifelong learning experiences.

Digital badges can increase the visibility of learning pathways in formal and informal contexts. A learning pathway is defined as a series of linked actions in which learners demonstrate progressively deeper participation in a personally consequential learning domain (Bell et al., 2013; Lave & Wenger, 1991). An opaque learning pathway would result in learners’ poor understanding of the steps needed to develop proficiency and achieve success (Davis & Singh, 2015).

A well-defined, discernible learning pathway, on the other hand, acts as a road map, through which learners engage in metacognitive activities such as goal setting, planning, and self-reflection. While a specific acquired skill recognized by a microcredential can be seen as a final goal or product, badges are the sequential steppingstones learners go through that add up to their learning gains and experiences.

In this sense, badging-supported learning can offer learners a sense of direction, control and ownership as they develop expertise in a specific learning domain (Riconscente, Kamarainen, & Honey, 2013). Research in this aspect is limited, yet Davis and Singh’s (2015) study provides some empirical confirmation. One participant stated that by using badges “individuals of any age can put together, curate and then follow their own highly individualized learning pathways with their own goals and outcomes in mind” (p. 78).

Digital Badging as an Assessment Tool

Badging advocates believe that badging offers a powerful response to the call for forward-thinking assessments that do not lag behind the steady growth of new methods of learning inspired by open education in the information age (Casilli & Hickey, 2016). In the context of higher education, Abramovich (2016) illustrated how badges as assessment tools can meet learner needs in face-to-face and online courses. Specifically, he emphasized the merit of badges for both formative and summative assessment. Badges designed as formative assessments provide feedback to the learner by indicating the time and effort that have been put into learning and the necessary requirements to achieve success. Summative assessment badges specify the exact knowledge or skill that was gained by the learner.

Abramovich (2016) cited two studies to support his arguments, one in a face-to-face class setting (Reid, Paster, & Abramovich, 2015) and the other in an online course setting (Auvinen, Hakulinen, & Malmi, 2015). Participants in Reid et al.’s (2015) study reported a positive experience with the badge framework as a form of assessment and agreed that badges adequately reflected their learning in class. This study also concluded that badges are feasible assessment tools and can especially benefit learners with a high intrinsic motivation to learn.

In Auvinen et al.’s (2015) study, badges were used to offer visualized formative feedback on time management, learning, and carefulness in an undergraduate computer science course. Findings suggested that badges were effective in informing students of their learning progress. Similar to Reid et al. (2015), high achieving learners in the Auvinen study were found to be more interested in the formative feedback offered by badging.

Digital Badging and Microcredentialing in Teacher Learning

Teacher professional development (PD) is about in-service teachers’ learning. Teachers participate in learning activities to develop skills and knowledge, which they can transform into practice for the benefit of their students’ growth (Avalos, 2011) and use as part of their regulatory workplace requirements (Gamrat, Zimmerman, Dudek, & Peck, 2014). 

Research, however, has documented the hard truth of PD: Currently recognized learning options do not fully satisfy teachers’ needs, and few school districts organize PD according to best practices (Berry, Airhart, & Byrd, 2016; Grunwald Associates & Digital Promise, 2015). Traditional learning methods that rely on short, one-size-fits-all workshops fall short in providing job-embedded, inquiry-driven, collaborative, personalized, and well-recognized learning experiences that teachers and school district leaders thirst for (Darling-Hammond, 2012). Faced with this scenario, the educational sector started to recognize the significant opportunity to provide alternative teacher learning and assessment experiences through digital badges and microcredentials.

A recent national survey by Digital Promise (Grunwald Associates & Digital Promise, 2015) investigated 856 K-12 teachers from public and private schools in terms of their satisfaction with PD and attitudes toward competency-based microcredentials. The results revealed that teachers’ satisfaction rate with formal learning was approximately three times lower than their participation rate. In order to satisfy their PD needs, almost three fourths of the teachers sought informal nonrequired learning opportunities.

In terms of the concept of microcredentials, only 15% of the teachers were somewhat familiar with it, whereas 65% of the teachers indicated interest in earning a microcredential as a part of their ongoing learning after being introduced to it. Teachers perceived microcredentials as appealing because of their easily accessible, personalized, and competency-based features. Overall, this survey showed great promise for utilizing microcredentials as an emerging professional learning option. 

Another timely empirical study that reflected overall positive attitudes toward using microcredentialing and badges in PD was by Jones, Hope, and Adams (2017). In this mixed-method study, survey and interview data were collected from 99 K-12 teachers who were awarded badges as recognition of learning in a PD course to become mentors to student teachers. Data suggested that teachers, especially those from the elementary level, were found to have a favorable view of receiving digital badges. Participants also shared their badges through digital media and indicated they would be more likely to share their badges with their administration than with colleagues.

In order to explore badging and microcredentialing in customizing workplace learning opportunities, Gamrat et al. (2014) examined data from 36 self-selected science teachers who completed 154 PD activities over a 3-month period using a digital badge system, Teacher Learning Journeys (TLJ). This badging system was collaboratively designed by a university, a governmental agency, and a national professional association with the purpose of providing and assessing teachers’ implementation of online PD.

Through an analysis of the 36 participants’ TLJ artifacts (i.e., goal statements, interviews, and reflective activity logs) and an in-depth collaborative case study of eight teachers, findings pointed to success with badging for the participants. The key themes identified relating to the flexibility of the digital badging that the teacher participants appreciated were (a) flexible goal setting as personally relevant, (b) customized level of assessment and depth of content learned, and (c) archiving and sharing PD artifacts. The researchers came to the conclusion that personalization and customization are the key features of digital badges that benefit teacher learning.

Digital Badging and Microcredentialing in Teacher Education

A TC has dual roles and mindsets as a learner and a future teacher. However, the search for literature directly relevant to TCs and microcredentialing with badges indicated a strong need for further research (Lang, 2016). Chou and He’s (2017) study was one of a few that offered insights into the effectiveness of badges in a graduate program for teacher education. Specifically, this mixed method research focused on the impact of a badge system on students’ class participation and interaction depending on pedagogical orientation (i.e., read-write-reflect-comment and activity-based design) and course delivery format (i.e., online and face-to-face). Badges were designed to reward those who contributed to high-quality class discussion and peer project comments.

The findings indicated that badges were effective in enhancing student interaction but not student participation. Although this study addressed participants as learners in general rather than as TCs specifically, it indicated the usefulness of badges in small graduate-level teacher education courses.

Pytash and Ferdig (2014) explored how badges could be conceptualized and utilized in a teacher education program. The preliminary results of their study highlighted how badges impacted the goal-setting and motivation of TCs as learners while also inspiring them as future teachers to seek ways of integrating this technology in their future classrooms. At Iowa State University, TCs learned educational technology and professional development topics through badges (Schmidt-Crawford, Thompson, & Lindstrom, 2014). The researchers suggested badging is a “very effective and efficient way” for them to (a) evaluate students’ competence-based technology knowledge with predetermined learning outcomes and (b) motivate learners through friendly competitions among peers (p. 111).

In summary, the current literature in the general domain of education points to the multiple roles and anticipated merits of badges and microcredentials. For TCs, studies demonstrate the potential for badging and microcredentialing in promoting, assessing, and recognizing TCs’ learning.

Berry, Airhart, and Byrd (2016) suggested that microcredentialing is well positioned for transforming teacher learning because it readily fits into existing systems of teacher certification and recertification and has great potential to receive funding from districts and states. Despite such findings, Lang (2016) pointed out that the concept and practice of badging and microcredentialing within teacher education pertaining to TCs is still in its early stages. Existing studies may only reflect participants’ surface-level understanding and perception of digital badges and microcredentials because participants in these studies either were newly introduced to badging and microcredentialing without real experience with them, or they had limited experience with one specific badging system at an early stage of implementation. Nevertheless, together these studies projected microcredentialing as a promising solution for long-term teacher learning.

The study described here was an attempt to fill the gap in research as to the impact of badges on TCs’ teaching skill development focused on the critical area of EL teaching skills. We investigated how many TCs completed the necessary requirements for obtaining a badge and whether TCs reported that the badges helped them achieve a higher level of EL teaching skill through the following research questions:

  • What is the completion rate of teacher candidates for obtaining a digital badge for English learner teaching skills?
  • Do teacher candidates indicate that the badging skill practice sessions prepared them to pass the MELTS badging assessment and to teach EL students in the future?

The MELTS Digital Badge Design

Overview

The MELTS badging project was designed over a 1-year period. The system contains 13 badges, with 10 skill badges and three excellence badges (see https://ccie.ucf.edu/melts/). With technical support from the university Center for Distributed Learning, the MELTS grant research team designed the badges, the badge issuing criteria and process, and all supporting materials (e.g., readings [digests], video modules, quizzes, and skill practice protocols). The MELTS grant research team consisted of three faculty member grant investigators, a grant coordinator, 12 education instructors, and five doctoral candidates.

Including both formative and summative assessment components, MELTS badges are designed as microcredentialing badges (Gibson et al., 2015) to provide a finer level of granularity to EL teaching skills assessment. The MELTS badges entail multifaceted, rigorous online and face-to-face assessment components. For each MELTS badge, TCs are required to (a) complete an online instructional module with digests, videos, and quizzes; (b) participate in a coached skill practice session; and (c) successfully demonstrate mastery in the teaching skill before an educational panel formed by the grant research team.

MELTS Skills

The MELTS badges feature 10 specific EL teaching skills based on best practices in teaching ELs and are associated with specific courses within an education program. Table 1 presents the skill selections and associated courses. Note that some courses support more than one MELTS skill development.

Table 1
MELTS Skills Selection and Associated Courses

Skill Name | Associated Courses
1. Leading a questioning sequence in social studies | Theory and Practice of Teaching ESOL Students in Schools
2. Teaching a classroom procedure | Teaching Strategies and Classroom Management
3. Pre-teaching key vocabulary of a science lesson | Teaching Social Science in the Elementary School
4. Direct teaching of a mathematics lesson segment | How Children Learn Mathematics
5. Direct teaching of a language arts lesson segment, including small group questioning | Foundations of Reading
6. Leading a follow-up discussion of the language arts lesson | Foundations of Reading
7. Using templates, sentence frames, and sentence starters to scaffold a writing assignment | Language Arts in the Elementary School
8. Conducting an informal reading inventory | Practicum for Assessment and Instruction of Reading
9. Conducting a writing assessment and providing level-appropriate feedback | Issues in Second Language Acquisition
10. Discussing student progress at a parent conference | Theory and Practice of Teaching ESOL Students in Schools
Note: The selection of the 10 EL teaching skills was based on the What Works Clearinghouse Educator’s Practice Guides (Institute of Education Sciences; https://ies.ed.gov/ncee/wwc), per grant guidelines.

The purpose of developing TCs’ EL teaching skills is to provide them with specific, rather than general, strategies for teaching English to speakers of other languages (TESOL). The MELTS project not only cultivates the 10 skills but requires the TCs to differentiate how they apply the skills to teach ELs at different proficiency levels according to recognized English language development standards provided by WIDA (WIDA Consortium, 2014; see also https://wida.wisc.edu/).

The National WIDA English Language Development Standards (WIDA Consortium, 2019) are recognized and utilized for EL instruction and assessment in 40 states across the United States, including the state in which this study took place. WIDA English language proficiency standards reflect the social and academic dimensions of acquiring a second language that are expected of ELs in grade levels P-12 attending schools in the United States at six levels of proficiency (1–Entering, 2–Emerging, 3–Developing, 4–Expanding, 5–Bridging, 6–Reaching). For example, TCs in MELTS not only have to practice how to work with ELs in leading a sequence involving questioning (Skill 1), but also how to form and ask questions differently for ELs who are at WIDA English proficiency Levels 1 (Starting), 3 (Developing), and 5 (Bridging).

The grant team determined that addressing three of the WIDA proficiency levels would provide a strong understanding of working with ELs at the beginning, intermediate, and advanced levels of language. TeachLivE™ is limited to three EL interactors at these levels; therefore, the skills were developed with these levels in mind.

MELTS Online Modules

Ten self-contained modules based on the teaching skills are available to TCs during their courses through the university Canvas system. Each module contains the following:

  • an introduction to the skill;
  • a brief digest and digest quiz;
  • a video and video quiz; and
  • schedules and sign-up links for skill practice and final badging assessment.

Most instructors in MELTS make the modules an optional, extra-credit component of their coursework, while some instructors include MELTS as part (usually 1-2%) of their grade. In the end, TCs can choose whether to complete the badges, allowing for student autonomy and individual ownership over the process.

One unique feature of the MELTS online module is the 5- to 10-minute demonstration video introducing TCs to the target skill. Research has shown that teaching skill videos improve preservice teacher learning and acquisition of pedagogical knowledge (Beilstein, Perry, & Bates, 2017; Plöger, Scholl, & Seifert, 2018). Video demonstrations are, therefore, a critical part of the instructional process of MELTS, in which TCs can view the target skill with focused captions when important EL instructional strategies are used.

The 10 demonstration videos were professionally created by the grant research team and the university Center for Distributed Learning, showcasing how expert teachers use the target skill in a real-world setting. A sample video demonstrating Skill 3 (Pre-teaching Key Vocabulary of a Science Lesson) can be seen in Video 1 at https://vimeo.com/cdlvideo/review/213525987/eb576f4613

To complete the modules successfully, TCs must read the digests, watch the videos, and pass the module quizzes with a minimum score of 80%. Once they successfully complete the modules, they develop a plan for practicing the activity following the activity guidelines and sign up for a skill practice session.

Skill Practice Sessions

Attending a skill practice session for each target skill is the next step in the MELTS badging process. Real-time coaching is utilized during the skill practice sessions to foster skill development and teacher confidence (as recommended in Morphis, 2018; Stahl, Sharplin, & Kehrwald, 2016). Specifically, each participant practices a skill with a coach alongside, who provides real-time guidance while the TC is teaching and gives formative assessment feedback afterward.

Coaching protocols were created to guide all the coaches in providing consistent, focused, skill-oriented training to all TCs (see the appendix for an example of the coaching protocol for Skill 1). The practice sessions were listed in the online modules, and TCs signed up for sessions at specific times. Most sessions were scheduled during their regular class time, so TCs would come one-by-one to a room next to their classroom to receive the skill practice coaching from one of the grant research team members. These sessions were not graded individually; TCs either received extra credit or had the work count toward their course grade if they completed all the steps to earn the MELTS badge.

The coached skill practice sessions were conducted in either a high-tech or low-tech instructional environment. As one of the research objectives of the MELTS grant, the two instructional environments were included to determine if one type of skill practice was more effective than the other.

These coaching sessions were unique in that the TC was exposed to simulated ELs at various levels of English proficiency. This approach required the TC to demonstrate teaching skill, not for a general concept of an EL, but targeted to the specific level of English for that learner. As pointed out by Lavery et al. (2015), when teachers adjust their instruction to meet the needs of ELs at varying levels of proficiency, ELs experience the most gains in their language proficiency.

The high-tech instructional environment builds upon 2 years of field-testing of a mixed-reality teaching environment, known as TeachLivE™ (http://teachlive.org). Developed by a team of professors of engineering and education at the University of Central Florida, TeachLivE™ provides TCs with the opportunity to practice new skills in a simulated classroom. It is currently available at 85 campuses and in various schools throughout the United States, with its use depending on local needs.

The TeachLivE™ classroom is limited to five avatars and includes three EL avatars at WIDA English proficiency Levels 1 (Starting), 3 (Developing), and 5 (Bridging) and two native speaker avatars. In a TeachLivE™ coaching session, the TCs demonstrate their prepared activity for 5 minutes with their coach in attendance providing formative assessment feedback. Figure 2 shows the TeachLivE™ virtual classroom.

Figure 2. TeachLivE™ virtual classroom.

In the low-tech skill practice sessions, TCs took turns practicing the teaching skills in a group of 4-7 students, with one TC being the teacher and the other three to six TCs playing the roles of ELs at varying levels of proficiency. Before skill practice started, at least three TCs were trained to act the part of ELs at three WIDA levels. A full group session, in which each TC took a turn at the skill practice, typically took about 1 hour to complete.

Final Badging Assessment

During MELTS badging, TCs studied targeted EL teaching microskills through online modules and actively practiced the microskills through coached skill practice sessions. However, in order to receive a MELTS badge, TCs had to pass the final badging assessment by effectively demonstrating the skill in 5 minutes through TeachLivE™ before an expert panel, which unanimously determined success based on a rubric for that skill. The expert panel consisted of two to five experts in the field of TESOL.

At least one member of the panel was a grant team member who trained the other panelists. TCs had to receive scores of 1 and above on all categories on the rubric based on a 4-point scale of 0 (Below Expectation), 1 (At Expectation), 2 (Above Expectation), and 3 (Outstanding). Panelists discussed results at the end of each session to ensure interrater reliability.
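For readers who want the panel's decision rule at a glance, the short Python sketch below restates the pass criterion described above: every rubric category must be scored 1 (At Expectation) or above on the 0-3 scale, and the panel's judgment must be unanimous. The function and variable names (panelist_passes, passes_badging, the example categories) are hypothetical illustrations, not part of the actual MELTS system.

```python
# Minimal sketch of the final badging pass rule described above.
# Names and example rubric categories are hypothetical, not MELTS code.

RUBRIC_SCALE = {0: "Below Expectation", 1: "At Expectation",
                2: "Above Expectation", 3: "Outstanding"}

def panelist_passes(scores_by_category):
    """A single panelist passes the candidate only if every rubric
    category is scored 1 (At Expectation) or above."""
    return all(score >= 1 for score in scores_by_category.values())

def passes_badging(panel_scores):
    """The badge is awarded only if the expert panel (two to five
    members) unanimously judges the demonstration successful."""
    return all(panelist_passes(scores) for scores in panel_scores)

# Example: a three-member panel scoring two hypothetical rubric categories.
panel_scores = [
    {"questioning technique": 2, "differentiation by WIDA level": 1},
    {"questioning technique": 3, "differentiation by WIDA level": 2},
    {"questioning technique": 1, "differentiation by WIDA level": 1},
]
print(passes_badging(panel_scores))  # True: every category >= 1 for every panelist
```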

When TCs passed the final badging assessment for a specific EL teaching skill, they received the corresponding badge as a recognition and credential. All MELTS badges, including 10 skill badges and three excellence badges, are issued through Credly (https://info.credly.com). The three excellence badges recognize TCs who participated in earning all 10 skill badges to varying degrees of success. Badges can be updated and renewed every 7 years to keep them current.

Methods

Research Context

The research context of this exploratory, mixed-method study (Morse, 2003) of TC EL teaching skill development through MELTS badging was a 4-year, EL-infused, undergraduate teacher education program in a large public, urban research university in the Southeastern United States. After receiving IRB approval, the study included undergraduate Elementary Education (K-6) TCs enrolled in three ESOL-infused education courses in fall 2017, during which five of 13 MELTS badges (Skills 1, 2, 5, 6 and 10) were available to them. The three courses were Teaching Strategies and Classroom Management, Theory and Practice of Teaching ESOL Students in Schools, and Foundations of Reading. In total, 151 TCs enrolled in the three courses, and 147 enrolled themselves in at least one MELTS module.

Participants

Participants enrolled in the MELTS modules and completed a survey about their experience. A detailed breakdown of the number of participants and their participation at different levels of the project can be seen in Table 2.

Table 2
A Breakdown of the Number of Participants by Course, Skill, Skill Practice Type, and Survey

Course and Skill (n = 151) [a] | Skill Practice Participation | Skill Practice Type: High Tech | Skill Practice Type: Low Tech | Survey Completion: High Tech | Survey Completion: Low Tech
Course 1 (Skill 2), n = 70 | 68 | 35 | 33 | 2 | 4
Course 2 (Skill 1), n = 99 | 87 | 42 | 45 | 1 | 4
Course 2 (Skill 10), n = 99 | 72 | 35 | 37 | 1 | 4
Course 3 (Skills 5 and 6), n = 93 | 82 | 42 | 40 | 4 | 6
All 3 Courses, n = 40 | 27 | 14 | 13 | — | —
[a] The numbers do not reflect students who are enrolled in two courses simultaneously. For this reason, the totals are not indicated for the columns.

Data Collection Procedure

The MELTS grant coordinator recorded participants’ completion of each required component (i.e., module, skill practice, and badging) of the MELTS badging by skill, in an Excel spreadsheet. First, participants navigated through the MELTS online modules as they would any other web-based course. The university online class management system, Canvas, automatically recorded participants’ engagement with the module content (e.g., digest or video) and their performance on the assessments (i.e., digest quiz and video quiz).

The data collection at this stage was automatic without direct contact between the participants, the instructors, and the researchers. All quantitative data regarding participants’ completion of online modules, skill practice, and final badging assessment by skill was compiled into one spreadsheet for analysis.

Participants who successfully completed the online modules had the choice of signing up for the face-to-face skill practice session. At the end of each MELTS skill practice session, participants were given a voluntary survey about their perception of the skill practice session. Originally, the survey was located within the online module, but the grant coordinator noticed that students were not completing the survey. The grant research team then decided to use iPads in person at the skill practice sessions to encourage completion. At this point it was late in the semester and few students completed the survey. This low number of participants in the skill practice evaluation survey is a noted limitation to the study and one that has been addressed in subsequent semesters.

Survey Instrument

The 14-item survey instrument included both quantitative and open-ended questions. It was designed and analyzed by the university Program Evaluation and Educational Research (PEER) group to investigate whether the participants believed the practice session prepared them to master the target microskill so that they could successfully pass the final badging assessment and serve EL students in the future (Matthews, Swan, & Peluso, 2018). The data collection was anonymous.

The survey was first administered online through the modules, but participation was so low that the PEER group started utilizing iPads so that coaches could administer the survey to participants after individual coaching sessions. This lack of participation is noted in the limitations of the study and has since been resolved, increasing participation to almost 100% in succeeding semesters. The survey prompted the participants to report the format of their practice session (high-tech or low-tech) and reflect on its length, difficulty, and effectiveness for preparing them to work with ELs.

Open-ended questions prompted participants to describe what they liked most about the session and to make suggestions for improvement. The following open-ended questions guided the survey design and analysis:

  • How do the preservice teachers rate the effectiveness of the practice sessions?
  • What did the preservice teachers like most about the practice sessions?
  • What recommendations, if any, did the preservice teachers have to improve the practice sessions?
  • What differences exist, if any, between microteaching and TLE TeachLivE™?

The survey was validated by the MELTS grant research team and the Program Evaluation and Educational Research Group. Its reliability will be evaluated as it is implemented each year for a new cohort of student participants.

Data Analysis

We collected both quantitative and qualitative data. Quantitative data consisted of participant completion rates and were recorded on a spreadsheet. Data analysis involved descriptive statistics (frequency and percentages). Quantitative and qualitative data were also collected via survey to investigate participants’ perceptions about the skill practice session. Of the 33 who completed the survey, seven respondents reported only the session format. Those surveys were excluded from the final analysis, resulting in a final sample of 26. Of those, 18 (69%) were evaluations of the low-tech practice session, while the other eight were for the high-tech format. Qualitative responses to open-ended questions in the survey were analyzed systematically using open coding (Corbin & Strauss, 2015). To increase reliability, data were independently coded by two university Program Evaluation and Educational Research analysts who then discussed discrepancies in order to reach a consensus and apply a final coding scheme (as in Creswell & Poth, 2018).
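To illustrate the descriptive analysis, the following Python sketch computes the kind of frequency-and-percentage summary reported in the Results (Table 3) from per-skill completion counts. The counts are copied from Table 3 for illustration only; the data structure and function names are assumptions and do not reproduce the project's actual spreadsheet or analysis procedure.

```python
# Illustrative sketch of the descriptive analysis (frequencies and percentages).
# The counts below are taken from Table 3; names and structure are hypothetical.

counts = {
    "Skill 2 (Course 1)":      {"enrolled": 70, "module": 63, "practice": 68, "badging": 23},
    "Skill 1 (Course 2)":      {"enrolled": 99, "module": 79, "practice": 87, "badging": 38},
    "Skill 10 (Course 2)":     {"enrolled": 99, "module": 62, "practice": 72, "badging": 36},
    "Skills 5 & 6 (Course 3)": {"enrolled": 93, "module": 80, "practice": 82, "badging": 32},
}

def completion_rates(row):
    """Return each component's completion count as a percentage of enrollment."""
    enrolled = row["enrolled"]
    return {step: round(100 * n / enrolled) for step, n in row.items() if step != "enrolled"}

for skill, row in counts.items():
    print(skill, completion_rates(row))
# e.g., Skill 2 (Course 1) {'module': 90, 'practice': 97, 'badging': 33}
```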

Results

MELTS Badging Completion Rates

  Of the 151 TCs who enrolled in the three participating courses in fall 2017, 147 participants completed at least one module. Forty of those students were enrolled in all three courses, and 99 were enrolled in at least two courses. The breakdown of participants enrolled in each class and their completion rates at each level of the badges is described in Table 3.

Table 3
Cross Tabulation of TCs’ Successful Completion of the MELTS Components by Skill

Component | Course 1 (Teaching Strategies and Classroom Management): Skill 2 | Course 2 (Theory and Practice of Teaching ESOL Students in Schools): Skill 1 | Course 2 (Theory and Practice of Teaching ESOL Students in Schools): Skill 10 | Course 3 (Foundations of Reading): Skills 5 & 6 | All Courses: All Skills
Students Enrolled | n = 70 | n = 99 | n = 99 | n = 93 | n = 40
Module Completion | 63 (90%) | 79 (80%) | 62 (63%) | 80 (86%) | 23 (58%)
Skill Practice | 68 (97%) | 87 (88%) | 72 (73%) | 82 (88%) | 27 (68%)
Badging | 23 (33%) | 38 (38%) | 36 (36%) | 32 (34%) | 17 (43%)

The counts and percentages in Table 3 show participants’ completion rate at each step (i.e., enrollment, module completion, skill practice, and badging) of the MELTS project by skill and, in turn, the success rate of participants earning badges (Research Question 1). The percentage of participants who completed the MELTS modules (formative assessment) ranged from 58% to 90%, the percentage who completed the skill practice session (formative assessment) ranged from 68% to 97%, and the percentage who completed the final badging (summative assessment) ranged from 33% to 43%. Seventeen participants obtained all five badges.

Table 4 displays the participants who chose to complete all badging requirements for each skill and shows that 100% were successful in obtaining the MELTS badges. Seventeen participants who were enrolled in all three courses chose to complete all five badges. Two candidates who did not pass the Skill 5/6 badging evaluation the first time were allowed to return and obtain their badges the second time they were assessed.

Table 4
Cross Tabulation of Badges Earned by TCs by Skill

 | Skill 2 | Skill 1 | Skill 10 | Skill 5 & Skill 6 | All Skills
Badge Participation | 23 | 38 | 36 | 32 | 17
Badge Earned | 23 | 38 | 36 | 32 | 17
Badge Success Rate | 100% | 100% | 100% | 100% | 100%

Participant Perception of Skill Practice Sessions

Research Question 2 guided us in exploring the participants’ perceived practice session effectiveness as determined by the survey, particularly in terms of whether the sessions prepared them to pass the MELTS badging assessment successfully and serve EL students in the future. Issues with online survey completion resulted in a lower than expected n for this portion of the research, limiting our conclusions from these data. This issue was resolved in subsequent semesters with face-to-face surveys, resulting in more robust data, but for this semester usable survey responses were limited to n = 26.

Quantitative responses. In aggregate, 24 (92%) of the respondents of the survey rated the coaching sessions in both low- and high-tech environments effective. None disagreed, but two (8%) neither agreed nor disagreed. Among those who practiced in a low-tech format (n = 18), all but one (94%) agreed it was effective, while all but one (88%) rated the high-tech session effective (n = 8).

Ninety-two percent of TCs reported that after their practice session they felt more comfortable working with ELs, managing their behavior, and supporting their needs, regardless of practice session format. They also enjoyed the experience and agreed it should continue to be a part of the program. All the teacher candidates who practiced with TeachLivE™ agreed they were no longer conscious of the fact they were working with virtual students after a few moments, compared with 56% of those who practiced in the low-tech format.

The TCs were also asked to rate their coaching session’s difficulty and length. One TC rated the low-tech practice session as too difficult. All others in both formats rated the difficulty about right. Among TCs who practiced in the low-tech format, six (33%) reported the session was too long, one too short, and the rest (n = 11, 61%) about right. A majority (n = 6, 75%) of those who practiced in the high-tech TeachLivE™ format rated it about right, with two rating it too short.

Qualitative responses. In a free-response format, participants were invited to share what they most liked, suggestions for improvement, and anything that surprised them about the session. Ten TCs responded to this prompt. Forty-five percent of participants reported that what they liked most was how realistic the TeachLivE™ session was. Among those who participated in a low-tech format (n = 3), 33% noted how realistic the experience was.

In discussing the low-tech skill practice session, one participant commented, “I liked practicing realistic things that would be useful in the classroom.” Another TC stated, “I enjoyed that we were able to practice the skill before the assessment. I liked that the coaches have been flexible and helped us a lot with a new program.”

More participants commented on the TeachLivE™ experience, with one TC noting, “Getting the experience of planning for and working with EL students, so that I do not have to worry about negatively impacting an actual student if my idea or execution fails me.” Another TC stated, “What I liked more about this experience [was interacting] with the students through reading the books and answering their questions. It felt like I was actually teaching real students.” A third candidate was even more personally reflective, writing, “This experience showed me my weaknesses when it comes to teaching English learners and helped me adapt new strategies to better help my students.”

Discussion

The present study yielded meaningful implications for using badges as an educational tool for assessing TCs’ EL teaching skills.

Success of the Digital Badging Process

Because badges can be earned with varying levels of difficulty of assessment (Hills & Hughes, 2016), one aspect of this study was to determine if TCs would choose to participate in and be successful at a rigorous and time-consuming voluntary badging assessment process. The results demonstrated that the small percentage of participants (33-43%) who completed all of the MELTS badging components, including utilizing the modules, finishing the quizzes, and practicing the skills with coaches, were able to pass the summative badging assessments successfully and obtain the badges.

While only a small number of participants completed the entire badging process for each skill, these positive results suggest that the assessment process was promising and demonstrated learning for those who did. The challenges in completing the badging noted in the survey (i.e., scheduling, understanding the value of the badges, and inconsistent expectations in the courses) have been addressed in subsequent semesters and should make a difference in future completion rates. Therefore, while the badging process was successful for those who chose to complete it, the low participation rate was unexpected.

Badges are designed to make learning visible through the steps of assessment that take place. In this process, participants who chose to participate in MELTS and earn the badges openly demonstrated that they can perform the EL teaching microskill at a high level of competence before a review panel of experts while teaching ELs at three levels of English proficiency.

Participants who have earned these badges through microcredentialing have demonstrated a high level of skill attainment and have made their learning discernible not only to themselves and the review panel but, through the badges, to all who view them. The badges the TCs earned and can display online provide documentation of the rigorous process they completed to earn the badge for each targeted EL teaching skill. Through Credly, TCs can display their badges on Facebook, LinkedIn, Instagram, or any linkable website. These badges can also be listed on resumes and linked to the online microcredential.

The positive results for the participants who completed MELTS badging demonstrate enormous potential for employing badging technology and microcredentialing in teacher education programs. The early results show badges have great potential as an effective EL teaching skill development approach prior to entering the classroom experience because they (a) provide TCs with a specific, unambiguous, and goal-driven pathway of learning, (b) track TCs’ learning trajectory in a visible and motivational manner, and (c) allow systematic, standardized, evidence-based assessments of TCs’ demonstration of one specific skill at a time.

Microcredentialing through badges demonstrates great promise to revolutionize the traditional assessment system by incorporating badge earner control, digitalization, accumulative progress, and public circulation through social networks (Casilli & Hickey, 2016). At the same time, while we were surprised by the low badging rate, the results mirrored what was found in a previous study. For instance, teacher participants in Diamond and Gonzales’s (2014) study, despite their interest and engagement with the teaching materials and training content, had a badge-earning rate of only 7%.

Skill Practice Session Perceptions

In addition to documenting the completion rate of the TCs in the MELTS badging project, their perception of the usefulness of the skill practice session for improving their EL teaching skill was surveyed. Overall, TCs expressed satisfaction with the skill practice sessions, in both low-tech and high-tech formats, with the high-tech TeachLivE™ environment perceived slightly more favorably. One conjecture for this more positive experience is that participants who played the role of mock ELs could not perform the WIDA levels of students as accurately and effectively as the trained avatars in the TeachLivE™ system. Participation in the final badging assessment sessions was not affected by whether students participated in either style of skill practice session.

Skill practice coaching can be an effective formative assessment opportunity to provide participants with focused feedback while they actively demonstrate EL teaching skills. Real-time coaching has been shown to foster skill development and teacher confidence, particularly for new teachers (Morphis, 2018; Stahl et al., 2016). The participation rate in the skill practice coaching was high (68-97% for each skill), and 92% of survey respondents indicated that the skill practice coaching was valuable and helpful (although this finding was limited to the 26 participants who completed the survey).

TCs rarely get this kind of focused, one-on-one coaching in their regular education courses because of the personnel required to provide such feedback, but it has proved valuable in the MELTS badging process. Compared to more traditional preservice teacher evaluation methods, in real-time coaching the feedback is immediate, giving students the opportunity to adjust their practice in the moment.

This type of feedback has been shown to be more effective when it is not deferred until after the lesson (Stahl et al., 2016). Coaching is also considered a nonthreatening evaluation process, more of a collaboration to help develop positive teacher behaviors and skills (Britton & Anderson, 2010). During the MELTS skill practice sessions, coaches were able to form a positive relationship with the participants, spending time speaking with them before, during, and after the session to understand their approach and help them improve their EL teaching techniques.

While participants may have studied EL teaching skills in their courses or seen them modeled in the module videos, this opportunity was their first to demonstrate those skills and receive immediate feedback. Participants reported through the survey that they enjoyed these sessions and they had greater confidence to use these techniques in their future classrooms.

This design requires sufficient human resources (e.g., instructional coaches who are experts) as well as financial sustainability, which is perhaps why it is rarely seen among the badging designs documented in the literature (e.g., Randall, Harrison, & West, 2013). However, this approach is a strength of the project and the reason why so many participants indicated positive experiences in the skill practice sessions.

Motivation

Issues were found with both intrinsic and extrinsic motivation for participants completing MELTS badging. As the results from Research Question 1 demonstrate, while the success rate of participants who completed the MELTS badging assessments was high (100%), the percentage of TCs who completed the entire MELTS badging project was low, with only 33-43% of TCs opting to complete the badging assessment to obtain the badges.

A core element of earning badges is learner autonomy, so TCs were allowed to choose whether to participate in the MELTS badging project or not. To provide TCs with positive, extrinsic motivation, though, most instructors gave extra credit or a small percentage of grade credit to TCs for participating in MELTS. The points were generally about 1-2% of the final grade, and participants reported to instructors that, while completing the modules online and participating in skill practice coaching (during or near class time) was easy and worthwhile, the points were not enough to motivate them to complete the digital badge assessment outside of class time at the end of the term.

Additionally, participants told instructors that, even though the modules explained the extrinsic value of the MELTS badging project, focusing on the school systems seeking out candidates with skill in working with ELs, particularly in Florida where some schools have a majority of ELs, they still felt unsure about the positive impact of obtaining a digital badge on their future teaching career.

These mixed results mirror other studies about badges (Fanfarelli & McDaniel, 2017). In that study about the value students placed on badging, researchers discovered both positive and negative feelings. Some students thought the badges aligned well with the course in general, whereas students who expressed more negativity indicated that the badges were difficult and time-consuming. It is important to take steps to address these issues and ensure that TCs understand the value of badging for their future careers and for their future students.

Recommendations for Teacher Education Programs  

Badges are increasingly recognizable as potential alternative assessments for supplementing course activity, but both instructors and students have given them mixed reviews (Fanfarelli & McDaniel, 2017). Based on this exploratory research study of the first iteration of the MELTS badges, the following are recommendations for future practices and research into the effectiveness of badging for improving TCs’ EL instructional skill development.

Effective Badging Practices

As noted by other researchers (e.g., Abramovich et al., 2013; Casilli & Hickey, 2016), addressing logistics and value within any badging project is key to ensuring successful participation. In this study, the low percentage of TCs who completed the MELTS badging was unexpected. TCs were taking advantage of the digital badge formative assessment opportunities (modules, quizzes, coaching) but in the end were not choosing to obtain the badges at the same rate. Feedback from instructors who communicated with TCs about their lack of participation indicated issues with logistics (such as timing and scheduling) and with understanding the value of the MELTS badges.

To ensure higher levels of participation in badging projects and research, future practitioners should anticipate the extra effort needed by participants to obtain badges and address those issues through transparent communication and informed scheduling. A careful look at participants’ schedules early is critical, as is communication with participants about what will be expected and when it will be expected.

Additionally, assumptions cannot be made that participants will automatically see the value of obtaining badges. Educators should be proactive in sharing the anticipated value of their own badging projects. Within our own MELTS badging project, we now visit classrooms to discuss the purpose of the badges and share videos from highly regarded instructors and principals of schools about the value of the project. As participants prepare to graduate, our grant team will provide further concrete proof of the value of digital badges for EL teaching skills by giving out cords at graduation to high-level participants, recommendation letters to schools, and connections to principals looking for teachers with demonstrated EL teaching experience.

Last, instructors now require completion of the MELTS badging as part of the course, and it counts toward the course grade (a consistent 10% across sections). This change should significantly increase the percentage of participants who complete the entire MELTS badging process and obtain the badge.

Return on Investment

Considering the time and effort required of educators and their students to earn digital badges, is it worth adding this assessment component to a teacher education program? The initial data suggest that students will participate in activities that improve their EL teaching skills (such as the modules and coaching) and help them feel more confident in their ability to work with ELs.

While badging numbers in this first semester reflected a lower than expected completion rate, addressing logistical issues in subsequent terms has already improved those numbers. As educators who work with these students, we see immediate improvement in EL teaching skill during the coaching and badging sessions. Preservice teachers who earn the ESOL endorsement at the university demonstrate not just knowledge but concrete skill in working with this vulnerable population, and for schools looking to hire qualified, skilled teachers, the digital badges offer a meaningful representation of that work.

Digital Badging Research

More research is needed to determine the impact of badging on learning and assessment (Devedžić & Jovanović, 2015), particularly on teacher skill development and how it translates into long-term classroom instructional techniques (Gamrat et al., 2014; Gibson et al., 2015). While this exploratory study demonstrated great promise for TCs to develop their EL teaching skills through a process of badging, more studies are needed to determine whether the learning translates into classroom practice. Will TCs who obtain badges demonstrate a higher level of technique when they have their own classrooms with ELs? One goal of the MELTS grant is to study this critical leap from learning to teaching, and future research will address it. This study has touched on this impact, but more research must be done if badges are to gain a validated foothold in the assessment process within educational programs.

Limitations

Several limitations were noted in this study. First, published research on this topic is limited; while research evaluating the effectiveness of badges is growing, little foundation was available to build on in the area of microcredentialing EL teaching skills through badging. Second, the qualitative data on participants' experience with the skill practice sessions were based on a limited number of responses, which limits the implications that can be drawn from that data set.

Last, due to the exploratory nature of this study, the quantitative results were descriptive, with summary data reported on success rates and participant perceptions of the skill practice sessions. The results, therefore, will not generalize to other populations or other badging projects. Future studies will collect quantitative data that enable more advanced research designs and inferential statistical procedures. Once participants are in their school internships, participants and nonparticipants will be compared on their demonstration of EL teaching skills to look for significant differences between the two groups.

Conclusion

We offered transparency in this exploratory study so that future research can build on the MELTS model and gain insights from what was learned, both beneficial and detrimental, about incorporating badges in teacher education. Even though there is a dearth of scholarly, peer-reviewed empirical studies specifically devoted to badges and microcredentials in teacher education, Casilli and Hickey (2016) pointed out three areas for optimism: (a) the transformations that badges have already effected in some educational systems, (b) the establishment of stable badging organizations and approaches, and (c) the growth of an organic community dedicated to them (p. 127). In the design and early implementation of MELTS badging, all three of these points were evident.

In addition, the design and implementation of a badging project is unlikely to be without challenges. Researchers (Casilli & Hickey, 2016; Davis & Singh, 2015; Jovanović & Devedžić, 2015) have identified technical, pedagogical, and logistical difficulties involved in creating and utilizing these new technologies. Hickey et al. (2014) pointed out in their report on the Badges Design Principles Documentation Project that nearly all the projects they reviewed found new practices were needed beyond their original vision or plan. We found this to be true in our case as well. Although the first round of implementing the MELTS badging project yielded positive results for those who completed the process, numerous changes were implemented, based on what was learned, to improve participation and understanding of the value of badges.

Challenges remain to be addressed, including increasing motivation among participants to complete the entire sequence and obtain the badge. Motivating completion requires improved communication between instructors and students about the value of badges. Several empirical studies on badging and learner motivation (Davis & Singh, 2015; Denny, 2013) acknowledged the merits of badging and yielded positive findings: students expressed a preference for having badges incorporated into the interface of the online tool used in the study and reported high levels of enjoyment in the badge-earning process.

On the other hand, debate continues about how badging interacts with extrinsic and intrinsic motivation to engage learners. Although some researchers worry that the extrinsic motivation provided by badges may undermine learners' intrinsic motivation, others argue that extrinsic and intrinsic motivation each play a part in engaging learners to participate meaningfully in a learning activity (Deterding, 2012; Hickey & McCaslin, 2001).

Logistically, removing barriers to participation is another challenge that must be resolved. This includes improving the timing of badging sessions so that they better fit busy students' schedules and providing more support from the grant research team within classrooms to explain the badge-earning process. In doing so, collaboration among key stakeholders, such as experts in the field, TCs, TC instructors, researchers, and technology experts, is critical to the sustainability of badging.

The MELTS project will continue to engage all stakeholders in refining badging to promote the value of EL teaching skills TCs must develop. To bring about transformations in the field of teacher education focused on EL teaching skills, as well as education in general, primary stakeholders must collaborate to improve the design and implementation of badging and microcredentialing.

Author Note

This study was funded through a grant provided by the U.S. Department of Education, Office of English Language Acquisition, Grant ID: 14236011

The research team that studied the MELTS digital badge development and achievement process included the following:

Joyce Nutta, Principal Investigator
Florin Mihai, Co-Principal Investigator
Kerry Purmensky, Co-Principal Investigator
Leslie Davis, Project Manager
Ying Xiong, Graduate Research Assistant
Alex Davies, Graduate Research Assistant
Shizhong Zhang, Graduate Research Assistant
Nirmal Ghimire, Graduate Research Assistant

We would like to acknowledge the many instructors in the Department of Education, especially Cyndi Walters, and the in-service teachers in Pinellas County who provided valuable feedback throughout the process; the university CDL team, including Aaron Hose, Tim Reid, and George Lopez, who helped create the videos, and Beth Nettles, who worked with us on the modules; and our wonderful graduate students, who assisted with the IRB and many details since then.

References

Abramovich, S. (2016). Understanding digital badges in higher education through assessment. On the Horizon, 24(1), 126-131.

Abramovich, S., Schunn, C., & Higashi, R. M. (2013). Are badges useful in education? It depends upon the type of badge and expertise of learner. Educational Technology Research and Development, 61(2), 217-232.

Antin, J., & Churchill, E. (2011). Badges in social media: A social psychological perspective. In G. Fitz-Patrick, C. Gutwin, B. Begole, & W. A. Kellogg (Eds.), Proceedings of the ACM CHI 2011 (pp. 1–4). New York, NY: ACM.

Auvinen, T., Hakulinen, L., & Malmi, L. (2015). Increasing students’ awareness of their behavior in online learning environments with visualizations and achievement badges. IEEE Transactions on Learning Technologies, 8(3), 261-273.

Avalos, B. (2011). Teacher professional development in teaching and teacher education over ten years. Teaching and Teacher Education, 27(1), 10-20.

Beilstein, S. O., Perry, M., & Bates, M. S. (2017). Prompting meaningful analysis from pre-service teachers using elementary mathematics video vignettes. Teaching and Teacher Education, 63, 285-295. https://doi.org/10.1016/j.tate.2017.01.005

Bell, P., Bricker, L., Reeve, S., Zimmerman, H. T., & Tzou, C. (2013). Discovering and supporting successful learning pathways of youth in and out of school: Accounting for the development of everyday expertise across settings. In B. Bevan, P. Bell, R. Stevens, & A. Razfar (Eds.), LOST opportunities: Learning in out-of-school time (pp. 119-140). New York, NY: Springer.

Berry, B., Airhart, K. M., & Byrd, P. A. (2016). Microcredentials: Teacher learning transformed. Phi Delta Kappan, 98(3), 34-40.

Britton, L. R., & Anderson, K. A. (2010). Peer coaching and pre-service teachers: Examining an underutilised concept. Teaching and Teacher Education, 26(2), 306-314.

Brown, D., & Rhodes, D. E. (2017). Show what you know. Phi Delta Kappan, 98(8), 38–42. https://doi.org/10.1177/0031721717708293

Casilli, C., & Hickey, D. (2016). Transcending conventional credentialing and assessment paradigms with information-rich digital badges. The Information Society, 32(2), 117-129.

Chou, C. C., & He, S. J. (2017). The effectiveness of digital badges on student online contributions. Journal of Educational Computing Research, 54(8), 1092-1116.

Corbin, J., & Strauss, A. (2015). Basics of qualitative research: Techniques and procedures for developing grounded theory (4th ed.). Thousand Oaks, CA: SAGE.

Creswell, J. W., & Poth, C. N. (2018). Qualitative inquiry & research design: Choosing among five approaches (4th ed.). Thousand Oaks, CA: SAGE.

Darling-Hammond, L. (2012). Creating a comprehensive system for evaluating and supporting effective teaching. Stanford, CA: Stanford Center for Opportunity Policy in Education (SCOPE).

Davis, K., & Singh, S. (2015). Digital badges in afterschool learning: Documenting the perspectives and experiences of students and educators. Computers & Education, 88, 72–83.

Denny, P. (2013). The effect of virtual achievements on student engagement. In W. E. Mackay, S. Brewster, & S. Bødker (Eds.), Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 763-772). New York, NY: ACM. https://doi.org/10.1145/2470654.2470763

Deterding, S. (2012). Gamification: Designing for motivation. Interactions, 19(4), 14–17.

Devedžić, V., & Jovanović, J. (2015). Developing open badges: A comprehensive approach. Educational Technology Research and Development, 63(4), 603-620.

Diamond, J., & Gonzalez, P. C. (2014). Digital badges for teacher mastery: An exploratory study of a competency-based professional development badge system. CCT Reports. New York, NY: Education Development Center/ Center for Children & Technology.

Domínguez, A., Saenz-de-Navarrete, J., de-Marcos, L., Fernandez-Sanz, L., Pagés, C., & Martínez-Herráiz, J.J. (2013). Gamifying learning experiences: Practical implications and outcomes. Computers & Education, 63, 380-392. Retrieved from https://www.researchgate.net/publication/256194365_Gamifying_Learning_Experiences_Practical_Implications_and_Outcomes

Fanfarelli, J. R., & McDaniel, R. (2017). Exploring digital badges in university courses: Relationships between quantity, engagement, and performance. Online Learning, 21(2), 144-165. doi:10.24059/olj.v21i2.1007

Gamrat, C., Zimmerman, H. T., Dudek, J., & Peck, K. (2014). Personalized workplace learning: An exploratory study on digital badging within a teacher professional development program. British Journal of Educational Technology, 45(6), 1136-1148.

Gibson, D., Ostashewski, N., Flintoff, K., Grant, S., & Knight, E. (2015). Digital badges in education. Education and Information Technologies, 20(2), 403-410.

Grunwald Associates LLC & Digital Promise. (2015). Making professional learning count: Recognizing educators’ skills with micro-credentials. Retrieved from https://digitalpromise.org/wp-content/uploads/2014/05/making_professional_learning_count.pdf

Hickey, D. T., & McCaslin, M. (2001). Comparative and sociocultural analyses of context and motivation. In S. S. Volet & S. Jarvela (Eds.), Motivation in learning contexts: Theoretical and methodological implications (pp. 33-56). Amsterdam, The Netherlands: Pergamon/Elsevier.

Hickey, D. T., Otto, N., Itow, R., Schenke, K., Tran, C., & Chow, C. (2014). Badges design principles documentation (DPD). Interim project report. Retrieved from Indiana University, Center for Research on Learning and Technology website: http://dpdproject.info/files/2014/05/DPD-interim-report-v4-january.pdf

Hills, L., & Hughes, J. (2016). Assessment worlds colliding? Negotiating between discourses of assessment on an online open course. Open Learning, 31(2), 108–115. https://doi.org/10.1080/02680513.2016.1194747

Hurst, E. J. (2015). Digital badges: Beyond learning incentives. Journal of Electronic Resources in Medical Libraries, 12(3), 182-189.

Johannessen, B. G., Thorsos, N., & Dickinson, G. (2016). Current conditions of bilingual teacher preparation programs in public universities in USA. Education and Society, 34(2), 27-48.

Jones, W. M., Hope, S., & Adams, B. (2017). Teachers’ perceptions of digital badges as recognition of professional development. British Journal of Educational Technology, 49(3), 427-438. doi:10.1111/bjet.12557

Jovanović, J., & Devedžić V. (2015). Open badges: Novel means to motivate, scaffold and recognize learning. Technology, Knowledge and Learning, 20(1), 115-122.

Kopcha, T. J., Ding, L., Neumann, K. L., & Choi, I. (2016). Teaching technology integration to K-12 educators: A ‘gamified’ approach. TechTrends, 60(1), 62-69.

Lang, J. R. (2016). Digital credentialing: Does it offer a meaningful response to initial teacher education reform? In R. Brandenburg, S. McDonough, J. Burke, & S. White (Eds.), Teacher Education (pp. 49-62). Singapore: Springer.

Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. Cambridge, UK: Cambridge University Press.

Lavery, M., Youngblood, A., & Nutta, J. (2015, June). Analyzing K-12 student learning gains: Preparing teacher candidates to reach English learners. Paper presented at the annual conference of the American Educational Research Association, Chicago, IL.

Matthews, K., Swan, B., & Peluso, F. (2018). Microcredentialing of English learner teaching skills practice session feedback from exit survey fall 2017. (Report No. 43UCFEDOELA2016.Y2F8). Orlando, FL: University of Central Florida, Program Evaluation and Educational Research Group (PEER).

Morphis, E. A. (2018). The power of one-to-one coaching: Preparing pre-service teachers for the early childhood literacy classroom. Contemporary Issues in Early Childhood, 19(1), 85-87.

Morse, J. M. (2003). Principles of mixed methods and multimethod research design. In A. Tashakkori, & C. Teddlie (Eds.), Handbook of mixed methods in social & behavioral research (pp. 189-208). Thousand Oaks, CA: Sage.

National Center for Education Statistics. (2019). English language learners in public schools. Retrieved from https://nces.ed.gov/programs/coe/indicator_cgf.asp#f3

Niehaus, K., & Adelson, J. L. (2014). School support, parental involvement, and academic and social-emotional outcomes for English language learners. American Educational Research Journal, 51(4), 810-844.

Nutta, J., Mokhtari, K., & Strebel, C. (2012). Preparing every teacher to reach English learners: A practical guide for teacher educators. Cambridge, MA: Harvard Education Press.

Pappamihiel, E. (2007). Helping preservice content-area teachers relate to English language learners: An investigation of attitudes and beliefs. TESL Canada Journal, 24(2), 42-60.

Plöger, W., Scholl, D., & Seifert, A. (2018). Bridging the gap between theory and practice: The effective use of videos to assist the acquisition and application of pedagogical knowledge in pre-service teacher education. Studies in Educational Evaluation, 58, 197-204. doi:10.1016/j.stueduc.2017.12.009

Pytash, K., & Ferdig, R. (2014). Implementing digital badges in a teacher education program. In M. Searson & M. Ochoa (Eds.), Proceedings of SITE 2014-Society for Information Technology & Teacher Education International Conference (pp. 1779-1780). Jacksonville, FL: Association for the Advancement of Computing in Education (AACE). Retrieved from https://www.learntechlib.org/primary/p/131033/

Randall, D. L., Harrison, J. B., & West, R. E. (2013). Giving credit where credit is due: Designing open badges for a technology integration course. TechTrends, 57(6), 88–95.

Reid, A. J., & Paster, D. (2016). A case-study of digital badges in composition courses. In L. Y. Muilenburg & Z. L. Berge (Eds.), Digital badges in education: Trends, issues, and cases (pp. 189-202). New York, NY: Routledge.

Reid, A. J., Paster, D., & Abramovich, S. (2015). Digital badges in undergraduate composition courses: Effects on intrinsic motivation. Journal of Computers in Education, 2(4), 377-398.

Riconscente, M. M., Kamarainen, A., & Honey, M. (2013, August 4). STEM badges: Current terrain and the road ahead. Retrieved from http://badgesnysci.files.wordpress.com/2013/08/nsf_stembadges_final_report.pdf

Schmidt-Crawford, D., Thompson, A. D., & Lindstrom, D. (2014). Leveling up: Modeling digital badging for preservice teachers. Journal of Digital Learning in Teacher Education, 30(4), 111.

Stahl, G., Sharplin, E., & Kehrwald, B. (2016). Developing pre-service teachers’ confidence: Real-time coaching in teacher education. Reflective Practice, 17(6), 724-738. doi:10.1080/14623943.2016.1206882

U.S. Department of Education. (2018). Our nation’s English learners: What are their characteristics? Retrieved from https://www2.ed.gov/datastory/el-characteristics/index.html

WIDA Consortium. (2019). English language development standards. Retrieved from https://wida.wisc.edu/teach/standards/eld

Williams, R., Karousou, R., & Mackness, J. (2011). Emergent learning and learning ecologies in Web 2.0. The International Review of Research in Open and Distributed Learning, 12(3), 39-59.


Appendix
Coaching Protocol Example for Skill 1: Leveled Questioning Badge
