{"id":8488,"date":"2019-05-21T14:37:14","date_gmt":"2019-05-21T14:37:14","guid":{"rendered":"https:\/\/citejournal.org\/\/\/"},"modified":"2019-08-30T20:12:49","modified_gmt":"2019-08-30T20:12:49","slug":"research-methods-for-the-people-by-the-people-of-the-people-using-a-highly-collaborative-multimethod-approach-to-promote-change","status":"publish","type":"post","link":"https:\/\/citejournal.org\/volume-19\/issue-2-19\/general\/research-methods-for-the-people-by-the-people-of-the-people-using-a-highly-collaborative-multimethod-approach-to-promote-change","title":{"rendered":"Research Methods for the People, by the People, of the People: Using a Highly Collaborative, Multimethod Approach to Promote Change"},"content":{"rendered":"\n


The theory and practice of preparing teacher candidates to teach with technology is inconsistent at best and ineffective at worst (Angeli & Valanides, 2009; Ertmer & Ottenbreit-Leftwich, 2010; Tondeur, Roblin, van Braak, Fisser, & Voogt, 2013). Some researchers have noted that the quantity and quality of technology experiences that teacher candidates encounter during their preparation programs influence their adoption of technology (Agyei & Voogt, 2011; Tondeur et al., 2012), while others have identified a gap between what teacher candidates are taught in preparation courses and how PK–12 teachers are actually using technology in classrooms (Ottenbreit-Leftwich, Glazewski, Newby, & Ertmer, 2010; Tondeur et al., 2012).

To address this gap, the ways teacher candidates are being prepared to integrate technology within the context of their preparation programs must be continually examined. Those who are preparing teacher candidates — teacher educators — must begin to examine and reflect on their own practices to determine whether they are, indeed, designing and modeling instructional opportunities that are preparing teacher candidates to use technology effectively in PK–12 classrooms.

The U.S. Department of Education (2017) has highlighted this concern, as well, and has called for teacher certification programs to devise methods that address a technology integration curriculum in a program-deep, program-wide manner. The challenge, then, becomes determining what technology knowledge and skills all teacher educators would need in order to design high-quality technology experiences for teacher candidates in their courses.

With a goal of building consensus in the field of teacher education, our research team embarked on an 18-month journey to bring focus and intentionality to efforts that prepare teacher candidates to use technology for teaching and learning. This research process solicited ideas from national and international experts on technology competencies that all teacher educators should possess; these ideas were then presented to the field for further comment and refinement, all while being guided by an expert review panel.

The Teacher Educator Technology Competencies (TETCs) were developed using a unique consensus-building and highly collaborative research methodology. Specific results from this study are described in detail in Foulger, Graziano, Schmidt-Crawford, and Slykhuis (2017; see also http://site.aace.org/tetc/). The purpose of this article is to focus on and provide more detail around the three distinct collaborative research approaches (crowdsourcing, Delphi, and public comment) used to develop the TETCs.

The development of the TETCs was motivated by a call from the 2017 National Education Technology Plan authored by the U.S. Department of Education (2017), Office of Educational Technology, which recommended that teacher preparation programs “develop a common set of technology competency expectations for university professors and candidates exiting teacher preparation programs for teaching in technologically enabled schools and postsecondary education institutions” (p. 40).

The 2017 National Education Technology Plan purposefully shifted the idea of technology integration from the PK–12 focus of the prior plan to one that included commitment from every educational level, PK–20 (U.S. Department of Education, 2017). Specifically, the plan called for teacher preparation institutions to assure their graduates know that “effective use of technology is not an optional add-on or a skill that [they] can simply … pick up once they get into the classroom” (p. 32).

If all teacher preparation programs, in the United States and around the world, are charged with the need to prepare teacher candidates to use technology in powerful ways, then all teacher educators who are responsible for preparing these candidates must establish a curriculum for teaching with technology, serve as role models for using technology in teaching, and provide support to teacher candidates for developing their ability to teach with technology (Borthwick & Hansen, 2017; Goktas, Yildirim, & Yildirim, 2009; Tondeur et al., 2012).

The technological pedagogical content knowledge framework (or technology, pedagogy, and content knowledge [TPACK]; Mishra & Koehler, 2006) has been used extensively across teacher education to guide and inform teacher preparation programs and to measure teacher candidates’ learning outcomes (Mouza, 2016). Although this conceptual framework identifies seven knowledge constructs teachers need to integrate technology into instruction effectively, it does not offer specific solutions for developing TPACK among teacher candidates (Mouza, 2016; Niess, 2012). Thus, ascertaining and defining the role all teacher educators are expected to play in the process of preparing teacher candidates to teach with technology is often difficult.

To address this challenge, four teacher education faculty members with educational technology expertise from different teacher preparation programs across the United States used a multimethod research approach to identify a set of technology competencies for teacher educators, in hopes of prompting a paradigm shift in how teacher education prepares teacher candidates to use technology. The result was an 18-month, process-oriented approach that involved national and international experts in the field providing input on the development of a set of TETCs (Foulger et al., 2017).

The goal of this article is to focus on and describe the research project’s multimethod approach (Morse, 2003), which emphasized a highly collaborative and participatory set of processes used to build consensus. By sharing our research process in more detail, we hope to encourage others to consider applying similar collaborative and participatory research processes in their own work.

Collectively, this article documents the methodological decisions made by the research team in order to answer the call to develop a common set of technology competencies specific for teacher educators (U.S. Department of Education, 2017). Teacher educators are those individuals who “provide instruction or who give guidance and support to student teachers [teacher candidates], and who thus render a substantial contribution to the development of students into competent teachers” (Koster, Brekelmans, Korthagen, & Wubbels, 2005, p. 157). Research decisions throughout the process were also framed and guided by taking steps to include existing research to guide competency content, involve educational technology experts who work in teacher preparation, and address varied stakeholder needs.

The research team designed the project using a series of three highly collaborative research methods for developing the TETCs. First, a crowdsourcing method was used to gather literature on existing technology competencies specific to teacher educators. After an initial list of technology competencies was extracted from the crowdsourced literature, a Delphi method was used to elicit, distill, and determine the opinions of a panel of experts (Nworie, 2011).

Following six rounds of Delphi input and feedback from educational technology experts, a list of 12 TETCs with related criteria was developed that represented the knowledge, skills, and attitudes all teacher educators need in order to prepare teacher candidates who enter PK–12 classrooms ready to integrate technology to support their teaching and student learning (Foulger et al., 2017). Last, the TETCs were presented to the field at conferences, and a public comment period was used to gather additional feedback related to suitability, allowing more teacher educators additional opportunity to critically appraise the TETCs (Gopalakrishnan & Udayshankar, 2014).

Collectively, these research methods were carefully constructed and highly collaborative, and they contributed to building participant consensus throughout the entire 18-month research process. The next sections of this article discuss specific details that describe the implementation of the multimethod research approach used for this project.

Implementing a Multimethod Research Approach

The TETCs project was intentionally designed to incorporate a multimethod approach that fostered a high degree of collaboration among stakeholders during multiple points of data collection and analysis. Because the overarching goal was to identify technology competencies for all teacher educators, a multimethod research process was implemented and included multiple opportunities for stakeholders’ input and feedback throughout the project. As Morse (2003) noted, “Multiple methods are used in a research program when a series of projects are interrelated within a broad topic and designed to solve an overall research problem” (p. 196). A multimethod design can include separate projects that are conducted sequentially in order to inform the research study as a comprehensive whole (Morse, 2003).

The described research project used the methods of crowdsourcing, Delphi, and public comment to identify the TETCs (Figure 1). These multiple methods were conducted sequentially because the crowdsourcing results were used to plan the Delphi process, while the Delphi process findings informed the public comment phase of the research project. Every member of the research team was highly involved with all phases of the research project, compiling and interpreting feedback, while being active and continual facilitators of the communication and feedback aspects of the project. Next, each research method is described briefly and includes a summary of major strengths and challenges for each method.

\"Figure
Figure 1. <\/strong>Implementation of multimethod approaches used to inform research project.<\/em><\/figcaption><\/figure>\n\n\n\n

Crowdsourcing

Crowdsourcing is a Web 2.0 form of outsourcing a task or function to an undefined group of people in the form of an open call (Howe, 2006). Although crowdsourcing started in the business world (Brabham, 2008), it has gained considerable attention and popularity in the academic community (Solemon, Ariffin, Din, & Anwar, 2013). Crowdsourcing facilitates the connectivity and collaboration of many individuals to participate in knowledge generation and seeks to mobilize competence and expertise that are distributed among the crowd (Zhao & Zhu, 2014).

In particular, technology enables a process that is highly collaborative and incorporates research perspectives and opinions from individuals who work together across great distances, including across countries and continents. The product of a crowdsourcing process is often shared freely and has strong agreement due to the participation of many (Morris & McDuff, 2015).

Collective intelligence, or crowd wisdom, is a primary strength of crowdsourcing (Brabham, 2008; Howe, 2008). Such a strategy involves sharing the wisdom, knowledge, and ideas of a “crowd” in order to solve problems or predict outcomes. It utilizes “collective brain power and energy to complete what they can’t do on their own” (Solemon et al., 2013, pp. 2067–2068). Thus, crowdsourcing is a mechanism used to gather opinion and judgment from a large group of individuals in a fraction of the time it might take one individual to complete the task.

Today’s technology can easily facilitate user-generated content and the exchange of ideas and opinions, so individuals can complete a crowdsourcing task asynchronously and work at their own pace (Brabham, 2008). One challenge associated with the crowdsourcing approach is guaranteeing that all who want to participate can do so and that the crowd that participates represents a diversity of opinion and thought.

Delphi

A Delphi method is a research approach used to validate and refine ideas because it “is designed to both obtain and identify areas of consensus and divergence of opinion” (Nworie, 2011, p. 29). This method allows “a group of individuals, as a whole, to deal with a complex problem” (Linstone & Turoff, 2002, p. 3). The Delphi method involves experts who are carefully selected to share their opinions on an important idea or issue, and then their ideas are synthesized and incorporated into the outcomes (Skulmoski, Hartman, & Krahn, 2007). The process is highly interactive and includes iterative rounds of data collection in order to build reliability, determine suitability, and ultimately yield consensus (Linstone & Turoff, 2002).

Questionnaires are typically constructed for each round of the Delphi process to obtain feedback from a panel of experts. Panelists’ responses from each round are analyzed and then used to construct the questionnaire for the next round. This iterative process continues until consensus among the panelists is reached. Consensus is achieved “when a certain percentage of responses fall within a prescribed range for the value being estimated” (Dajani, Sincoff, & Talley, 1979, p. 83).
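To make this stopping rule concrete, the following minimal sketch (in Python) checks a percentage-based consensus criterion for a single questionnaire item. The 5-point scale, the 4–5 agreement range, and the 80% threshold are illustrative assumptions for demonstration only, not values reported by Dajani et al. (1979) or used in the TETCs study.

```python
# Minimal sketch of a percentage-based Delphi consensus check.
# Assumption: panelists rate an item on a 5-point scale, and consensus means
# at least 80% of ratings fall within the prescribed 4-5 "agree" range.

def has_consensus(ratings, value_range=(4, 5), threshold=0.80):
    """Return True when the share of ratings inside value_range meets threshold."""
    low, high = value_range
    within = sum(1 for r in ratings if low <= r <= high)
    return within / len(ratings) >= threshold

# Hypothetical round: 14 of 17 panelists rate a draft competency 4 or 5.
ratings = [5, 4, 4, 5, 3, 4, 5, 4, 4, 2, 5, 4, 4, 5, 4, 3, 5]
print(has_consensus(ratings))  # True, because 14/17 (about 82%) >= 80%
```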

The Delphi method offers a unique research approach for investigating critical issues, defining problem areas, and identifying best practices and skill sets (Nworie, 2011). Strengths of using the Delphi method include obtaining expert opinion, building consensus, forecasting trends, and interacting with research subjects. The approach is conducive to bringing geographically dispersed individuals together to serve as a panel of experts who share their expertise about the topic under investigation. When using this technique, researchers are able to analyze data based upon the panelists’ expert opinions. There are also challenges associated with the Delphi method that are worth noting.

Delphi studies typically involve multiple rounds of data collection and feedback; therefore, the process can become lengthy and result in the attrition of participants (Nworie, 2011). Slow response or nonresponse by participants to a questionnaire during a Delphi round is also a related concern. Another challenge relates to the assumptions that can be made about the expertise and experience of individuals who are selected for the Delphi panel. It is assumed that all individuals selected will have a thorough understanding of the topic under investigation and that personal biases will not influence their responses.

Public Comment

Public comment is used in a variety of contexts to assure a goal will be met before finalization of a product, document, or decision. Successful approaches to public comment depend on reliable information or, in the case of human opinion, on people who are well informed on the subject matter. Public comment processes are typically used in high-stakes assessment practices, such as those employed in medical schools and government. Public comment addressing questions posed by a review committee assures specified criteria are met, potential flaw areas are identified, and possible edits are noted, with the goal of improving the validity of items.

Modifications are often adopted with the goal of making sure questions are correct, fair, valid, and reliable (Gopalakrishnan & Udayshankar, 2014). The technology industry frequently solicits public comment prior to establishing manufacturing and distribution, to minimize any vulnerabilities, make known any unavoidable risks to consumers, and ensure maximum security. This type of public comment requires both human analysis and technology-based analysis (Quirolgico, Voas, & Kuhn, 2011).

The public comment approach was applied to this research project for the purpose of increasing the visibility of the TETCs with yet another set of stakeholders before the final version of the competencies was released. Thus, one strength of using public comment is gathering additional insight or thought about a topic, rule, or regulation with the understanding that comments might “have substantial effect” on the final outcome of what is being proposed (Balla, 2014, para. 1). Another strength associated with using public comment involves bringing legitimacy to the process; the public is given a chance to provide feedback so the process appears “democratic and legitimate” (Innes & Booher, 2004, p. 423). One challenge commonly associated with the public comment process is determining whose voice is being heard. Although broad-based participation is typically encouraged, ascertaining who provides the comments and to what extent those comments benefit individual interests or community interests as a whole is often difficult.

To provide a broad-based international perspective, the public comment phase of the TETCs project, as well as the call for literature used during the crowdsourcing phase and the call for Delphi participants, utilized international teacher educator networks (i.e., Society for Information Technology and Teacher Education [SITE] and International Society for Technology in Education [ISTE]) and social media networks (e.g., LinkedIn and Twitter) that reached a global audience. Diverse educational technology faculty members from around the world participated in the calls for both literature and Delphi participants and provided feedback during the public comment. For a complete list of literature used during the crowdsourcing phase and a list of Delphi participants, see Foulger et al. (2017).

Additionally, an advisory group was established to inform the research team and the research methodology. Membership on the advisory group consisted of leaders from national and international organizations. The advisory group met periodically with the researchers to provide insight on how to strengthen the methodology. The ultimate goal of the group was to help the researchers devise a research methodology that would prompt change in the field.

By using all of these methods, the research team sought to create a research methodology that would result in technology competencies for teacher educators, that would be representative of teacher educators, and that would be created with input by teacher educators, so the resulting competencies would be embraced by, and useful to, all teacher educators. The next section provides specific details on the multimethod approaches used to encourage collaboration and build consensus among stakeholders.

Collaborative Multimethod Approaches Used to Build Consensus

The multimethod research approach used to develop the TETCs was designed to be highly collaborative and build consensus during and across each phase of the entire research project. Each phase (i.e., crowdsourcing, Delphi, and public comment) of this multimethod approach is described in more detail in the following sections. Special attention is given to explaining how the method contributed to the research project as a whole, the strengths of the research methods from each phase as experienced by the research team of this study, and the ways the results of each phase informed the project’s next steps. Specifically, an iterative research process was designed that offered multiple opportunities for stakeholders within and around teacher education to provide input and expert opinion to shape the development of the TETCs.

Phase I: Crowdsourcing

Phase I of the development of the TETCs involved the crowdsourcing of existing literature. The goal of the crowdsourcing process was to identify an initial list of technology competencies for teacher educators that could be extracted from existing literature and then use that list of competencies (grounded in research literature) as a starting point for the Delphi phase. An open call targeting teacher educators and educational technology experts sought literature addressing technology competencies needed by teacher educators who support the development of teacher candidates as they learn to teach with technology. The call for literature was sent through various teacher educator networks (e.g., SITE and ISTE) and social media networks (e.g., LinkedIn and Twitter).

Respondents to the call uploaded 93 related articles and book chapters to a web portal, which was developed and managed by the research team. To assure a comprehensive review of the literature, the research team also searched for articles and uploaded additional literature to the web portal. After a thorough review of the crowdsourced literature by the research team, literature not specific to teacher educators was eliminated. In the end, 43 articles were selected as a starting point to begin extracting a list of possible technology competencies for teacher educators.

Guidelines for writing an effective competency statement (European Commission: Education and Training, 2013; Sturgis, 2012; University of Texas School of Public Health, 2012) were utilized by the research team to draft a list of initial competencies that stemmed from the crowdsourced literature. This list of technology competencies for teacher educators from the crowdsourced literature included 31 competencies, related criteria aligned with each competency, and references for each competency connected back to the crowdsourced articles.

The research team carefully reviewed the 31 technology competencies with a focus on relevancy, duplication, wording, and quality assurance, according to the guidelines used for writing an effective competency. Several competencies were combined, while others were revised. As a result, an initial list of 24 TETCs was extracted from the crowdsourced literature.

A strength of using the crowdsourcing technique to begin this research project was the ability to reach a large number of national and international experts with related knowledge and research that would have been unknown or otherwise unavailable (Brabham, 2008; Howe, 2008). One challenge the research team encountered with the crowdsourcing phase was sourcing relevant literature and articles that focused on teacher educators. More than half of the articles submitted to the open call were not used because the content was not specific to teacher educators. Phase II of the research project involved using a Delphi method that assisted with the identification and further refinement of the 24 competencies identified from the crowdsourced literature.

Phase II: Delphi Method

To identify participants for the Delphi phase of the research project, an application was developed that included questions about participants’ educational organization affiliation, department or college affiliation, role in preparing PK–12 teachers, and country of residence. A broad-based call for participation was posted on the same online networks as the call for literature during the crowdsourcing phase. Forty-six applications were received from individuals who wanted to participate in the Delphi phase of the project. Nworie (2011) recommended selecting divergent experts to help account for future developments in technology, the rapid expansion of pedagogy due to technology use, and any potential or probable changes in policy. Given that the Delphi process was conducted virtually and was not limited by the time and location of the experts, divergence of content expertise, geographic location, organizational affiliation, and college/university setting was considered while selecting the panel participants.

Eighteen participants were selected with the intention of providing a broad perspective as a team through complementary individual expertise, experience, and affiliation. Of the 18 participants selected, 17 agreed to participate in the Delphi phase and signed the Institutional Review Board agreement. During this phase, participants were asked to complete six rounds of data collection and were never made aware of the identity of the other participants.

For each of the six rounds, the Delphi participants were sent a questionnaire with a preamble to guide their thinking, and then a series of questions about the teacher educator competencies or criteria asking them to either provide rankings or an open-ended response to document their thoughts and ideas. The research team compiled and analyzed the responses after each Delphi round, formed the next iteration of the TETCs, and then sent another questionnaire to the participants. This iterative feedback loop allowed the research team to build both quantitative and qualitative consensus on the content of the TETCs and their associated criteria (Dajani et al., 1979).
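As an illustration of the shape of this feedback loop, the following self-contained toy sketch (in Python) simulates successive rounds that stop once a percentage-based consensus rule like the one sketched earlier is met. The panel size of 17 and the cap of six rounds mirror this project, but the simulated ratings, the 5-point scale, and the stopping values are assumptions made purely for demonstration; the actual rounds also involved human analysis of open-ended comments.

```python
import random

# Toy simulation of an iterative Delphi feedback loop. All rating behavior
# here is invented for illustration; it is not data from the TETCs study.
random.seed(1)

def collect_round(n_panelists, round_number):
    """Simulate 5-point ratings for one round; later rounds converge upward."""
    return [min(5, random.randint(2, 4) + round_number // 2)
            for _ in range(n_panelists)]

def consensus(ratings, low=4, high=5, threshold=0.80):
    """Assumed rule: at least 80% of ratings fall in the 4-5 agreement range."""
    return sum(low <= r <= high for r in ratings) / len(ratings) >= threshold

for round_number in range(1, 7):  # at most six rounds, as in this project
    ratings = collect_round(17, round_number)
    print(f"Round {round_number}: {ratings}")
    if consensus(ratings):
        print("Consensus reached; finalize the item.")
        break
    # Otherwise: analyze responses, revise the competency, send the next round.
```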

One strength of the Delphi process used for this research project was the lack of attrition of our Delphi participants. While not all 17 Delphi participants contributed to each of the six rounds, no participant asked to be removed from the study, and all contributed throughout the duration of the process. It is important for researchers to develop strategies that encourage participation because a low response rate during the Delphi process can impact the study’s validity (Hsu & Sandford, 2007). In addition, the research team attributes the high participant retention during the Delphi phase to the perceived value of the TETCs and related criteria by the panel participants.

The participants knew they were helping develop a list of competencies to address an identified need within the teacher education community, and most expressed that they planned to use the TETCs within their universities to guide technology integration efforts at their institutions. A related strength involved gaining six rounds of expert opinion specifically on the competencies, while building consensus with the Delphi participants during and after each round (Nworie, 2011); Delphi studies typically include only three or four rounds of expert opinion.

One clear challenge with the Delphi process was the extended time that was necessary to complete this phase of the research project. Designing and sending the questionnaires, allowing time for panel responses, compiling and analyzing the results, and changing the competencies and criteria accordingly took 4–6 weeks of elapsed time for each of the six Delphi rounds. As noted by Nworie (2011), Delphi studies involving multiple rounds of data collection and feedback can take a significant time to complete. Although the Delphi phase of this research project took 9 months to complete, each round was deemed necessary and important in providing the time needed for input. With the Delphi phase of the research completed, the list of 12 TETCs was ready for public comment.

Phase III: Public Comment

Once the research team was assured the Delphi process had run its course and the Delphi participants were in agreement that the TETCs were indicative of the knowledge, skills, and attitudes all teacher educators needed to support the development of teacher candidates’ abilities to teach with technology, the multimethod research approach transitioned to Phase III, public comment. The research-related purpose of using public comment was to provide one final opportunity for additional stakeholders in educational technology and teacher education to offer input on the TETCs. Thus, the research team sought to influence change in the field by (a) distributing the TETCs to as many teacher educators as possible, (b) increasing anticipation for the release of the final TETCs, (c) soliciting input for further refinement, and (d) helping teacher educators begin to reflect on how the TETCs might be used in their college/university.

A brief questionnaire designed by the research team gathered broad-based input from additional stakeholders and organizations in the teacher education community about the perceived usefulness and usability of the TETCs. The questionnaire was sent through the same channels as were used for the crowdsourcing and Delphi phases. The questionnaire included an explanation of the research project process, a draft copy of the TETCs for participant review, and three questions:

1. What aspects of the TETCs do you/does your organization find most useful?
2. How would you/your organization make use of the TETCs?
3. What concerns do you/does your organization have about the TETCs?

A space for additional comments was also provided so participants could provide insight and input beyond the questions listed on the questionnaire. In this process, anyone (the public) could contribute comments about the TETCs; however, these comments were not made available for other commenters to view. The comments were used by the research team to further refine the TETCs.

Several national and international teacher educators and stakeholders viewed a draft copy of the TETCs during the public comment phase. The public comment process increased awareness in the field about the TETCs and justified the need for the TETCs. Providing a draft copy of the TETCs to the public also allowed those in teacher education who were anticipating the release to begin planning how they might use the TETCs in their colleges and schools of education. In sum, 31 individuals completed the questionnaire on the TETCs during the public comment phase of the project. Twenty-nine responses were from individuals, and two responses were from organizations. All responses originated from either the United States or Australia.

Respondents during the public comment phase stated that the TETCs were targeted, helpful, and fitting for the field. Several respondents noted that the TETCs were aligned with the ISTE (2018) Standards for Educators, and one respondent said there was redundancy with the ISTE standards. Some respondents commented that they wanted to share the TETCs with senior faculty and administrators at their institutions.

The TETCs seemed to overwhelm a few respondents, who noted concerns such as “could be misinterpreted as more standards” and “too many.” One respondent questioned the fit of the terminology (e.g., suggesting that technology is an outdated term), and another noted a lack of alignment to other educational organizations, such as libraries and museums. Because the TETCs are specific to teacher educators who prepare teacher candidates for licensure positions, such comments were noted to be outside the scope of the study and were not included for analysis.

All told, the results and feedback collected from the public comment phase warranted no significant changes to the TETCs; however, the research team opted to modify the initial stem of each competency to include the words “teacher educator” to help clarify the intended audience. The research team hoped this approach would continually remind readers that the TETCs are intended for teacher educators specifically and not for PK–12 teachers.

The public comment phase of this research project provided the research team with additional insight into the development process of the TETCs. Although the TETCs did not change substantially because of any comments received, this phase provided another chance for teacher educators and interested stakeholders to provide feedback about the TETCs and their possible use in teacher education institutions. Because public comment was allowed and considered, it brought more legitimacy and clarity to the development of the final version of the TETCs (Innes & Booher, 2004).

Originally, the goal of the public comment phase was to obtain additional feedback from the field to improve the TETCs before publication. However, once the process began, the research team realized that this phase could be used to meet more far-reaching goals related to individual and organizational usability of the TETCs.

Still, it was challenging to use public comment to promote the TETCs by encouraging additional stakeholders to react and provide feedback on the competencies. Although the research team constantly looked for ways to promote collaboration and solicit feedback about the TETCs, only 31 comments were received during this phase of the research project. It was unclear how many viewed the draft TETCs but did not provide comments. Broad-based participation during the public comment phase was encouraged, yet only a small percentage of individuals chose to participate and provide comments (as also observed by Innes & Booher, 2004). For a list of the findings from the project, including the 12 competencies and related criteria, and a more detailed description of the data collection and data analysis, see Foulger et al. (2017).

Implications for Research

In order to respond to the need to develop a set of technology competencies for teacher educators (U.S. Department of Education, 2017), the research team designed a research project that used a highly collaborative, multimethod approach. Each method (crowdsourcing, Delphi, and public comment) was conducted separately with a specific purpose in mind, and each was planned sequentially as one approach informed the next (Morse, 2003). Eventually, a list of technology competencies was developed identifying the knowledge, skills, and attitudes all teacher educators need for preparing teacher candidates to use and integrate technology for teaching and learning (Foulger et al., 2017).

Professional organizations have typically taken the lead in developing standards to guide the professional development required for an organization’s membership (e.g., Association of Mathematics Teacher Educators, 2017; ISTE, 2018; National Science Teachers Association, 2012; Thomas & Knezek, 2008). Large projects like these are usually funded, seek experts in the field to assist in the development of such standards, and go through multiple iterations of draft documents to reach consensus.

Since this task was similar to what organizations have instituted in the past, the research team carefully designed the project by replicating methods that would be highly inclusive and collaborative, including multiple opportunities throughout the project for expert opinion and comment. It was a process-oriented approach designed to include as many experts (i.e., national and international teacher educators with expertise in educational technology and educational technology experts) as possible in each phase of the research project.

All three methods selected and incorporated into this multimethod design — crowdsourcing, Delphi, and public comment — encouraged gathering collective wisdom and knowledge from a crowd or panel of experts (Brabham, 2008; Howe, 2008; Nworie, 2011; Okoli & Pawlowski, 2004; Rice, 2009; Shelton & Creghan, 2015). As a result of these efforts, other researchers may see the value of combining multiple methods for research projects designed for investigating critical issues or developing skill sets requiring divergence of opinion and the building of consensus.

In order to successfully develop the list of TETCs, the research team placed emphasis on keeping the stakeholders actively involved and engaged in all research activities during each phase and throughout the entire project. Since the target audience for the TETCs was teacher educators, requests encouraging stakeholders to help with various research tasks were posted using digital and social media outlets. These outlets proved successful for recruiting participants for each phase of the project. For example, 46 individuals applied to participate in the Delphi phase, and 17 (11 females and 6 males) from this strong and diverse pool of experts agreed to participate. Okoli and Pawlowski (2004) recommended recruiting a panel of at least 10–18 experts for a Delphi study.

Even though a larger panel of experts can present logistical and time investment challenges (Nworie, 2011), the Delphi participants were committed to assisting with the development of the TETCs and remained highly engaged during the 9 months it took to complete six rounds of data collection and analysis. Not all participants completed each of the six rounds, but no participant dropped out entirely. Every round of the Delphi process received feedback from at least 14 participants. This type of active involvement and engagement during each phase of the project was noted and appreciated by the research team.

The stakeholders’ commitment during each phase kept the multimethod approach highly collaborative and informative, especially when used as a sequential research process in which different stakeholders became involved with each phase. Other researchers might consider using a multimethod approach when constant feedback and public comment are essential to the research process, especially when gathering iterative phases of data is necessary.

This research project was designed using a multimethod approach, with the primary intent of creating change in teacher education, specifically to impact teaching practices used to prepare teacher candidates who will ultimately use technology appropriately in their future classrooms. Perhaps the research outcomes from this project will initiate a new paradigm of thought, establish strong buy-in, and begin a synergistic movement to impact how teacher candidates are prepared at national and international teacher preparation institutions.

Findings from the research project should encourage teacher educators to review their own practice and make use of the TETCs. In time, teacher educators’ practice might change, and then some will embrace an action research approach to systematically examine their own teaching practices with technology (Mertler, 2016). Likewise, administrators in colleges and schools of education may see merit in the findings and create a new vision for preparing teacher candidates to teach with technology within their programs. Using three specific research methods collectively within the framework of one research project permitted the research team to receive opinion, input, and comment from a variety of stakeholders who were committed to promoting change within the teacher education community and, ultimately, to develop a set of TETCs that did not exist in the field prior to the research project.

Conclusion

As a result of using a highly collaborative, multimethod research approach, the research team responded to the call for developing a common set of technology competencies for teacher educators (U.S. Department of Education, 2017). The TETCs, the outcomes of using this multimethod research approach, have initiated conversations within the teacher education community for promoting change in how teacher educators use and integrate technology.

This multimethod research approach was designed with the intent of fostering and encouraging collaboration and consensus among stakeholders for the purpose of promoting change in teacher education. Three specific research methods were used in a sequential and iterative manner with the aim of informing the development of a set of technology competencies for all teacher educators.

Critical to the research design was the deliberate attempt to offer multiple opportunities for stakeholders to provide input and feedback, hence the need for using three research methods. These specific methods were selected because each method complemented and built upon the others in terms of obtaining expert opinion, receiving multiple rounds of feedback, and creating consensus in order to have substantial effect on the outcome — change in teacher education and the preparation of teacher candidates. Using any of the three methods in isolation would not have generated the same breadth of results and collaborative feedback.

Author Note

All authors contributed equally to the research and writing process.

References

Agyei, D. D., & Voogt, J. M. (2011). Exploring the potential of the will, skill, tool model in Ghana: Predicting prospective and practicing teachers’ use of technology. Computers & Education, 56, 91–100. https://doi.org/10.1016/j.compedu.2010.08.017

Angeli, C., & Valanides, N. (2009). Epistemological and methodological issues for the conceptualization, development, and assessment of ICT–TPCK: Advances in technological pedagogical content knowledge (TPCK). Computers & Education, 52, 154–168. https://doi.org/10.1016/j.compedu.2008.07.006

Association of Mathematics Teacher Educators. (2017). Standards for preparing teachers of mathematics. Retrieved from https://amte.net/standards

Balla, S. J. (2014, April 7). Measuring the impact of public comments. Retrieved from George Washington University Regulatory Studies Center website: https://regulatorystudies.columbian.gwu.edu/measuring-impact-public-comments

Borthwick, A. C., & Hansen, R. (2017). Digital literacy in teacher education: Are teacher educators competent? Journal of Digital Learning in Teacher Education, 33, 46–48. https://doi.org/10.1080/21532974.2017.1291249

Brabham, D. C. (2008). Crowdsourcing as a model for problem solving: An introduction and cases. Convergence, 14, 75–90. https://doi.org/10.1177/1354856507084420

Dajani, J. S., Sincoff, M. Z., & Talley, W. K. (1979). Stability and agreement criteria for the termination of Delphi studies. Technological Forecasting and Social Change, 13, 83–90. https://doi.org/10.1016/0040-1625(79)90007-6

Ertmer, P. A., & Ottenbreit-Leftwich, A. T. (2010). Teacher technology change: How knowledge, confidence, beliefs, and culture intersect. Journal of Research on Technology in Education, 42, 255–284. https://doi.org/10.1080/15391523.2010.10782551

European Commission: Education and Training. (2013, July). Supporting teacher competence development for better learning outcomes. Retrieved from http://ec.europa.eu/assets/eac/education/experts-groups/2011-2013/teacher/teachercomp_en.pdf

Foulger, T. S., Graziano, K. J., Schmidt-Crawford, D. A., & Slykhuis, D. A. (2017). Teacher educator technology competencies. Journal of Technology and Teacher Education, 25, 413–448. Retrieved from http://site.aace.org/tetc/

Gopalakrishnan, S., & Udayshankar, P. M. (2014). Question vetting: The process to ensure quality in assessment of medical students. Journal of Clinical and Diagnostic Research, 8(9), XM01–XM03. https://doi.org/10.7860/JCDR/2014/9914.4793

Goktas, Y., Yildirim, S., & Yildirim, Z. (2009). Main barriers and possible enablers of ICTs integration into pre-service teacher education programs. Educational Technology & Society, 12(1), 193–204.

Howe, J. (2006). The rise of crowdsourcing. Wired Magazine, 14(6), 1–4.

Howe, J. (2008). Crowdsourcing: How the power of the crowd is driving the future of business. New York, NY: Random House.

Hsu, C.-C., & Sandford, B. A. (2007). Minimizing non-response in the Delphi process: How to respond to non-response. Practical Assessment, Research & Evaluation, 12(17), 1–6.

Innes, J. E., & Booher, D. E. (2004). Reframing public participation: Strategies for the 21st century. Planning Theory & Practice, 5, 419–436.

International Society for Technology in Education. (2018). ISTE standards for educators. Retrieved from https://www.iste.org/standards/for-educators

Koster, B., Brekelmans, M., Korthagen, F., & Wubbels, T. (2005). Quality requirements for teacher educators. Teaching and Teacher Education, 21, 157–176. https://doi.org/10.1016/j.tate.2004.12.004

Linstone, H. A., & Turoff, M. (Eds.). (2002). The Delphi method: Techniques and applications. Newark, NJ: New Jersey Institute of Technology. Retrieved from https://web.njit.edu/~turoff/pubs/delphibook/delphibook.pdf

Mertler, C. A. (2016). Action research: Improving schools and empowering educators (5th ed.). Thousand Oaks, CA: Sage.

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108, 1017–1054.

Morris, R. R., & McDuff, D. (2015). Crowdsourcing techniques for affective computing. In R. A. Calvo, S. D’Mello, J. Gratch, & A. Kappas (Eds.), Oxford handbook of affective computing (pp. 384–395). https://doi.org/10.1093/oxfordhb/9780199942237.013.003

Morse, J. M. (2003). Principles of mixed methods and multimethod research design. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social & behavioral research (pp. 189–208). Thousand Oaks, CA: Sage.

Mouza, C. (2016). Developing and assessing TPACK among pre-service teachers: A synthesis of research. In M. C. Herring, M. J. Koehler, & P. Mishra (Eds.), Handbook of technological pedagogical content knowledge (TPACK) for educators (2nd ed., pp. 169–190). New York, NY: Routledge.

National Science Teachers Association. (2012). 2012 NSTA standards for science teacher preparation. Retrieved from https://www.nsta.org/preservice/

Niess, M. L. (2012). Teacher knowledge for teaching with technology: A TPACK lens. In R. N. Ronau, C. R. Rakes, & M. L. Niess (Eds.), Educational technology, teacher knowledge, and classroom impact: A research handbook on frameworks and approaches (pp. 1–15). https://doi.org/10.4018/978-1-60960-750-0.ch001

Nworie, J. (2011). Using the Delphi technique in educational technology research. TechTrends, 55(5), 24–30. https://doi.org/10.1007/s11528-011-0524-6

Okoli, C., & Pawlowski, S. D. (2004). The Delphi method as a research tool: An example, design considerations and applications. Information & Management, 42, 15–29. https://doi.org/10.1016/j.im.2003.11.002

Ottenbreit-Leftwich, A. T., Glazewski, K. D., Newby, T. J., & Ertmer, P. A. (2010). Teacher value beliefs associated with using technology: Addressing professional and student needs. Computers & Education, 55, 1321–1335. https://doi.org/10.1016/j.compedu.2010.06.002

Quirolgico, S., Voas, J., & Kuhn, R. (2011). Vetting mobile apps. IT Professional, 13(4), 9–11. https://doi.org/10.1109/MITP.2011.73

Rice, K. (2009). Priorities in K–12 distance education: A Delphi study examining multiple perspectives on policy, practice, and research. Educational Technology & Society, 12(3), 163–177.

Shelton, K., & Creghan, K. A. (2015). Demystifying the Delphi method. In Research methods: Concepts, methodologies, tools, and applications (pp. 84–104). https://doi.org/10.4018/978-1-4666-7456-1.ch005

Skulmoski, G. J., Hartman, F. T., & Krahn, J. (2007). The Delphi method for graduate research. Journal of Information Technology Education, 6, 1–21.

Solemon, B., Ariffin, I., Din, M. M., & Anwar, R. M. (2013). A review of the uses of crowdsourcing in higher education. International Journal of Asian Social Science, 3, 2066–2073.

Sturgis, C. (2012, July). The art and science of designing competencies. Retrieved from International Association for K–12 Online Learning website: https://www.inacol.org/resource/the-art-and-science-of-designing-competencies/

Thomas, L. G., & Knezek, D. G. (2008). Information, communications, and educational technology standards for students, teachers, and school leaders. In J. Voogt & G. Knezek (Eds.), International handbook of information technology in primary and secondary education (pp. 333–348). https://doi.org/10.1007/978-0-387-73315-9_20

Tondeur, J., van Braak, J., Sang, G., Voogt, J., Fisser, P., & Ottenbreit-Leftwich, A. (2012). Preparing pre-service teachers to integrate technology in education: A synthesis of qualitative evidence. Computers & Education, 59, 134–144. https://doi.org/10.1016/j.compedu.2011.10.009

Tondeur, J., Roblin, N. P., van Braak, J., Fisser, P., & Voogt, J. (2013). Technological pedagogical content knowledge in teacher education: In search of a new curriculum. Educational Studies, 39, 239–243. https://doi.org/10.1080/03055698.2012.713548

University of Texas School of Public Health. (2012). Competencies and learning objectives. Retrieved from https://sph.uth.edu/content/uploads/2012/01/Competencies-and-Learning-Objectives.pdf

U.S. Department of Education. (2017, January). Reimagining the role of technology in education: 2017 National Education Technology Plan update. Retrieved from https://tech.ed.gov/files/2017/01/NETP17.pdf

Zhao, Y., & Zhu, Q. (2014). Evaluation on crowdsourcing research: Current status and future direction. Information Systems Frontiers, 16, 417–434. https://doi.org/10.1007/s10796-012-9350-4
