{"id":8147,"date":"2018-09-28T19:47:23","date_gmt":"2018-09-28T19:47:23","guid":{"rendered":"https:\/\/citejournal.org\/\/\/"},"modified":"2019-03-13T14:04:47","modified_gmt":"2019-03-13T14:04:47","slug":"the-impact-of-a-teacher-education-program-redesign-on-technology-integration-in-elementary-preservice-teachers","status":"publish","type":"post","link":"https:\/\/citejournal.org\/volume-18\/issue-4-18\/general\/the-impact-of-a-teacher-education-program-redesign-on-technology-integration-in-elementary-preservice-teachers","title":{"rendered":"The Impact of a Teacher Education Program Redesign on Technology Integration in Elementary Preservice Teachers"},"content":{"rendered":"

The recognition of the need for 21st-century student learning has spurred teacher education programs to purchase technology and adapt coursework to meet new demands from school districts and accreditation agencies (Bos, 2011; Male & Burden, 2014). Purchasing technologies for use without providing ongoing professional development for teacher educators and cooperating teachers has often resulted in little impact on the use of technology for teaching and learning (e.g., Hutchison, 2012; Tondeur, Pareja Roblin, van Braak, Voogt, & Prestridge, 2017).<\/p>\n

Ball and Cohen (1999) set an ambitious agenda for teacher education, contending that to prepare teachers who move beyond the status quo, teacher educators need to present a coherent and compelling vision. The challenge is to balance the reproductive nature of current classroom practice with knowledge and vision of 21st-century student learning.<\/p>\n

For change to take place in technology integration, preservice teachers must be scaffolded to use technology effectively (Carpenter, Graziano, Borthwick, DeBacker, & Finsness, 2016; Wright & Wilson, 2005). In reality, not all preservice teachers observe state-of-the-art technology integration in method courses and field experiences; as a result, teacher education programs need a transformation to encourage meaningful integration by instructors and cooperating teachers (Ertmer & Ottenbreit-Leftwich, 2010; Martin, 2015; Tondeur et al., 2017).<\/p>\n

The purpose of the multicohort study described here was to examine the growth of preservice teachers' technology integration in response to a teacher education program redesign that aimed to create a leading-edge technology integration experience as part of a 21st-century alignment. We also investigated how elements in the program were related to preservice teachers' technology knowledge, integration motivation, and classroom action.<\/p>\n

The program redesign followed Ball and Cohen\u2019s (1999) suggestions to make sure that preservice teachers are positioned to be innovative and future ready<\/em>. Building on the concept of laboratories of practice (Latta & Wunder, 2012), the program focused on understanding subject matter, learners, and pedagogy with technology as an integrated feature. The redesign was based on the framework of Technological, Pedagogical, and Content Knowledge (TPACK), following the recommendation by Darling-Hammond and Bransford (2005):<\/p>\n

If teachers are to develop a curricular vision with respect to the use of technology for learning, teacher education programs need to think of their responsibilities as including the production of technically literate teaching professionals who have a set of ideas about how their students should be able to use technology within particular disciplines. (p. 199)<\/p>\n

Theoretical Model<\/h2>\n

The theoretical model in Figure 1 conceptualizes how the different elements must come together to produce successful technology integration and, subsequently, to impact K-12 student achievement. The teacher education program impacts TPACK Efficacy and Technology Knowledge, two key components leading to successful, educationally relevant technology integration. The term TPACK Efficacy refers to teachers' sense of efficacy about their ability to integrate technology and subject areas to teach meaningful lessons (a sample item: "I can design lessons that combine literacy and technology effectively").<\/p>\n

Environmental supports moderate the impact of preservice teachers' learning and motivation on their use of technology in the classroom. Simply put, the availability of resources such as devices, reliable broadband connections, and technical support has a substantial impact on teachers' sustained engagement with technology, as does cooperating teacher modeling of effective technology integration for preservice teachers (Chaliès, Bruno-Méard, Méard, & Bertone, 2010; Tondeur et al., 2017; Whittier, 2007). These cooperating teachers also need professional development to use the devices in student-centered ways, going beyond the assessment uses for which many are initially purchased (Sheninger & Murray, 2017; Walser, 2011).<\/p>\n

The resulting instructional change should lead to K-12 student achievement, conceived broadly to include subject-specific knowledge, technology knowledge, and learning strategies. Following the logic expressed by Guskey (2002), implementation creates a feedback loop in which K-12 student success further impacts TPACK Efficacy and Technology Knowledge. When preservice teachers, teacher education faculty, and cooperating teachers all integrate technology as both a teaching and learning tool, teacher education programs impact schools and K-12 students in positive ways.<\/p>\n

\"\"<\/a>
Figure 1.<\/strong> Theoretical model for effective teacher education impacting technology integration.<\/em><\/figcaption><\/figure>\n


Literature Review<\/h2>\n

TPACK<\/h3>\n

TPACK builds on Shulman's (1986) pedagogical content knowledge framework. Shulman argued that the most effective teaching takes place when teachers merge their understanding of content and pedagogy to plan learning experiences that overcome teaching challenges. TPACK refers to "an emergent form of knowledge that goes beyond all three components (content, pedagogy, and technology)" (Mishra & Koehler, 2006, p. 1028). It is an understanding that emerges from the interaction of these bodies of knowledge, both theoretically and in practice, producing the flexible knowledge necessary to successfully integrate technology into teaching (Carpenter et al., 2016; Koehler & Mishra, 2009). Teachers need to understand "not just the subject matter they teach, but also the manner in which the subject matter can be changed by the application of technology" (Mishra & Koehler, 2006, p. 1028).<\/p>\n

These three components are more than the sum of their parts, empowering teachers to facilitate lessons where technology advances student learning to a new level. As devices and uses for technology in schools increase, the TPACK framework adds a technological knowledge component highlighting the need for teachers to know how technology can influence content and pedagogy.<\/p>\n

The TPACK framework has become ubiquitous in the educational technology field and is supported by the American Association of Colleges for Teacher Education (AACTE; Carpenter et al., 2016). The existing literature on this topic has come from work with both established teachers (e.g., Bruce & Chiu, 2015; Graham et al., 2009; Harris & Hofer, 2017) and preservice teachers (e.g., Niess, 2008).<\/p>\n

At the same time, AACTE has embraced the TPACK model for preservice teachers so they learn how and why to integrate technology as they begin planning and teaching (Herring, Koehler, & Mishra, 2016). With constantly evolving technologies, teacher education must prepare preservice teachers to teach in ways that prepare students to learn using these digital tools (Niess, 2008).<\/p>\n

TPACK as a Basis for Program Redesign<\/h3>\n

As researchers have begun to focus on techniques to aid TPACK growth in preservice and in-service teachers (e.g., Cavin, 2008; Graham et al., 2009), modifications in courses and fieldwork are emerging (Koehler et al., 2012). Our program redesign began with the three primary foci for developing TPACK in teacher preparation programs, as outlined by Hofer and Grandgenett (2012): \u201ca dedicated educational technology course; content-specific teaching methods, or practicum courses; or through the duration of coursework in a teacher preparation program\u201d (p. 87). We changed the \u201cor\u201d to \u201cand,\u201d however, to layer opportunities and capacity.<\/p>\n

Empirical studies on developing TPACK have mainly focused on one or two of these components. For example, Chai, Koh, and Tsai (2010) focused on the first component by teaching TPACK in an educational technology course with a cohort of 889 preservice teachers in a postgraduate secondary education program in Singapore. The technology course focused on pedagogical and technological knowledge. The instructors presented a technology tool and its pedagogical use to students, who were organized by subject area and created a final thematic unit composed of technology-enhanced lessons in their area. Findings showed that technology courses that directly taught technology tools along with pedagogy raised preservice teachers' technological and pedagogical knowledge with moderate to large effect sizes.<\/p>\n

Similarly, Maor (2017) studied two consecutive versions of a mainly graduate technology course in Australia that used blended learning, with instructors modeling and students participating collaboratively with technology, to explore the effect of TPACK on digital pedagogies. Maor found significant TPACK growth in each domain, along with greater confidence and understanding of TPACK application, leading to implementation in the classroom.<\/p>\n

Harris and Hofer (2011) utilized content-specific teaching methods (the second component) for professional development to help teachers go beyond self-evaluating TPACK to putting TPACK-in-Action<\/em>. Seven classroom teachers participated in the study of TPACK professional development. The instructor presented examples, descriptions, and suggested technologies to accomplish curriculum goals. Participants then planned a unit by incorporating a variety of learning activities into the content and pedagogy. Teachers noted that adding selected activities and technologies allowed them to effect deeper, more self-directed learning in the classroom. Five of the seven teachers commented on how the activities facilitated the fit between the TPACK domains, teaching requirements, and time.<\/p>\n

Mouza, Karchmer-Klein, Nadakumar, Yilmaz Ozdem, and Hu (2014) combined the first two components for building TPACK in teacher education. They built on the idea that, when the technology course is integrated with method courses and field experience, preservice teachers benefit by applying their learning directly to teaching with technology (Niess, 2005, 2012). Their study examined 88 preservice teachers enrolled in the technology course and related method courses during one semester. All preservice teachers showed significant growth in each TPACK area and applied their knowledge during field experience. However, Mouza et al. noted that it was difficult to place preservice teachers in classrooms where teachers effectively modeled technology integration. Cooperating teachers used technology for teaching and learning in a very limited way, so preservice teachers mainly learned pedagogy (pedagogical content knowledge and pedagogical knowledge), not technology integration, from cooperating teachers.<\/p>\n

Hofer and Grandgenett (2012) added the third component of technology integration throughout a program as they examined TPACK integration through a three-semester graduate teaching program with eight participants. Results indicated growth in TPACK throughout the program, but the largest gains occurred when preservice teachers were concurrently enrolled in the educational technology course and their first method course, where they discussed teaching strategies, lesson planning, and technology integration.<\/p>\n

Preservice teachers\u2019 TPACK in lesson plans fell slightly during student teaching, and the authors suggested that the demands of classroom practice may have negatively impacted technology integration. Hofer and Grandgenett (2012) suggested a need for more longitudinal studies of TPACK across teacher education programs.<\/p>\n

Current research demonstrates that the three TPACK components are being used successfully in teacher preparation programs; however, it also indicates the need for further investigation of sustainable, longitudinal, program-wide approaches. The current study included all three components (a technology course, technology infused into method courses and field experiences, and integration across the program) implemented in consecutive iterations.<\/p>\n

The Role of Teacher Efficacy<\/h3>\n

Ertmer and Ottenbreit-Leftwich (2010) suggested that to change and sustain teachers' technology practices, teacher educators need to focus on knowledge, self-efficacy, pedagogical beliefs, and culture in both teacher education programs and teacher professional development. Research on motivation emphasizes the role beliefs play in influencing persistence, behaviors, and achievement.<\/p>\n

The motivational construct of self-efficacy<\/em> (Bandura, 1986) has become the focus of educational research in varied domains, such as mathematics, science, reading, writing, and sports (Bandura, 1997; Pajares, 1997; Pajares & Miller, 1994; Schunk & Zimmerman, 2007). Self-efficacy is a person\u2019s estimation of the probability of success if they attempt to organize and execute actions required to accomplish a task (Bandura, 1986). In education, self-efficacy has been shown to be a powerful predictor of students\u2019 motivation and academic achievement (e.g., see Schunk & Pajares, 2009).<\/p>\n

Teacher self-efficacy refers to teachers' beliefs about their capacity to accomplish pedagogical tasks (Bandura, 1986). It is the basis for understanding teachers' beliefs about their ability to translate their knowledge into successful action. For example, Abbitt (2011) found that teacher efficacy for technology integration interacted with TPACK in predicting change in technology integration.<\/p>\n

Teacher efficacy is crucial in making sure that the capacity teachers acquire will actually be used in the classroom. As illustrated in Figure 1, successful implementation of educational change, in our case technology integration, requires the confluence of knowledge, motivation, and resources. TPACK alone may not translate into sustained integration into teaching and student learning without teachers believing they can do it (Bauer & Kenton, 2005; Corkin, Ekmekci, White, & Fisher, 2016; Ertmer & Ottenbreit-Leftwich, 2010; Wozney, Venkatesh, & Abrami, 2006).<\/p>\n

Teachers and preservice teachers need multiple experiences integrating technology in classrooms and practicum situations to build confidence through personal mastery and vicarious learning, the strongest sources of self-efficacy (Bandura, 1997).<\/p>\n

The Role of Modeling<\/h3>\n

Preservice teachers have been learning from their own teachers throughout their K-12 schooling in a process Lortie (1975) called "the apprenticeship of observation." However, as students, they do not always have access to the knowledge, skills, and reasoning behind the myriad procedures they observe, sometimes causing misconceptions about teaching. Modeling, on the other hand, is a high-leverage activity that can scaffold vicarious learning into personal mastery when teacher educators and teachers share their thought processes to support actions and move preservice teachers into the role of teacher (Grossman, Hammerness, & McDonald, 2009).<\/p>\n

Ertmer (2003) found that when teacher educators, cooperating teachers, and preservice teachers collaborate to plan technology integrated lessons, modeling happens naturally as teachers each demonstrate their area of expertise. Ertmer further noted that some teacher education programs explicitly model what a meaningful technology integrated lesson looks like before preservice teachers try to create lessons themselves. In such ways, teachers at all levels tend to benefit from observing a variety of expert performance as they move toward more advanced levels of technology use.<\/p>\n

Baran, Canbazoglu Bilici, Albayrak Sari, and Tondeur (2017) showed that instructor modeling in three teacher education programs in Turkey was a significant predictor of preservice teachers' TPACK perceptions. Angeli (2005) used explicit modeling by teacher educators to explain and demonstrate their process of integrating lessons with technology to prepare preservice teachers. After building confidence by observing an expert, preservice teachers created their own technology-integrated science lessons for elementary students, guided by teacher educators. Findings showed that along with modeling, teacher educators also need to explain the pedagogical reasoning so preservice teachers see "how the teacher's role changes, how the subject matter gets transformed, and how the learning process is enhanced" (Angeli, 2005, p. 395). What's more, teacher educators should explicitly teach how to apply the unique features of a tool to transform a specific content domain in ways not possible without the tool.<\/p>\n

Summary<\/h3>\n

To create meaningful change in the ways teachers use technology in their classrooms, knowledge and self-efficacy have to be purposefully attended to, while making sure that resources are available so technology can be used. To move the field forward, all stakeholders in a teacher education program need to move together. University faculty need to model effective use of technology in courses and empower preservice teachers to utilize these tools in coursework and beyond. Cooperating teachers need professional development on adding technology to instruction as personal digital devices become ubiquitous in education.<\/p>\n

The model presented in Figure 1 was the basis of the redesign in our teacher education program. We progressively added components that supported all aspects of TPACK efficacy, technology knowledge, and resources to create optimal conditions for developing teachers ready to teach in the 21st century.<\/p>\n

We focused on three questions:<\/p>\n

    \n
  1. How do preservice teachers\u2019 TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency change over time in response to integration of technology practices into the teacher education program?<\/li>\n
  2. What is the contribution of TPACK Efficacy and Technology Knowledge to Technology Integration Frequency in the classroom?<\/li>\n
  3. What is the impact of modeling on TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency?<\/li>\n<\/ol>\n

    Methods<\/h2>\n

    Participants<\/h3>\n

The participants were 891 preservice teachers (801 female, 90%) from 11 cohorts across consecutive semesters (n1 = 92, n2 = 75, n3 = 82, n4 = 82, n5 = 81, n6 = 83, n7 = 80, n8 = 65, n9 = 107, n10 = 64, n11 = 80) from fall 2011 to fall 2016 at a large Midwestern university. All participants were undergraduate students. Most were traditional students aged 19 to 25 (n = 846, 95%), with 24 students aged 26 to 30 (3%) and 21 students aged 31 to 50 (2%). They were enrolled in an elementary education program, with 58% focusing on elementary only, 24% on elementary special education, 12% on inclusive P-3 education, 5% on early elementary education, and 1% on elementary and English learners. The majority of the participants were Caucasian (n = 864, 97%), with some Hispanic (n = 13), African American (n = 9), and Asian American (n = 5) participants. At the time the data were collected, the participants were at the end of student teaching in their final semester.<\/p>\n

    Measures<\/h3>\n

    An online survey was administered to all student teachers in their last semester in the program (student teaching; see Appendix). After responding to demographic questions, preservice teachers were introduced to three instruments.<\/p>\n

Technology Knowledge.<\/em><\/strong> The first instrument measured Technology Knowledge and was adapted from the Survey of Preservice Teachers' Knowledge of Teaching and Technology (Schmidt et al., 2009). The seven Technology Knowledge items used a scale from 1 (strongly disagree<\/em>) to 5 (strongly agree<\/em>). We adapted six items and replaced the item with the lowest reported factor loading (.65), "I have had sufficient opportunities to work with different technologies," with a researcher-developed item, "Colleagues often ask me to help them with technology." This item focused on preservice teachers' mastery experience working with technology and was validated with technology coaches from across the state. The reliability of the seven items in this study was .88 using Cronbach's alpha, slightly higher than the value of .82 reported previously (Schmidt et al., 2009).<\/p>\n
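For readers who want to reproduce this kind of reliability check, the sketch below shows one standard way to compute Cronbach's alpha from item responses in Python; the DataFrame and column names are hypothetical illustrations, not the study's actual data files.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (rows = respondents, columns = items)."""
    k = items.shape[1]                              # number of items
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical usage with seven Technology Knowledge columns named tk1-tk7:
# tk_items = survey[["tk1", "tk2", "tk3", "tk4", "tk5", "tk6", "tk7"]]
# print(round(cronbach_alpha(tk_items), 2))  # the study reports .88 for these items
```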

TPACK Efficacy.<\/em><\/strong> The second instrument included (a) measurement of preservice teachers' TPACK Efficacy in designing and teaching lessons that combine subject matter and technology to reach objectives (adapted from Schmidt et al.'s, 2009, TPACK knowledge domain) on a Likert scale from 1 (highly ineffectively<\/em>) to 5 (highly effectively<\/em>), (b) the frequency of such lessons on a Likert scale from 1 (never<\/em>) to 4 (in all of my classes<\/em>), and (c) three open-ended questions soliciting preservice teachers' detailed descriptions of a lesson in which they integrated content and technology effectively to reach their lesson objectives:<\/p>\n

      \n
    1. What was the content?<\/li>\n
    2. What technology did you use? What did you use it for?<\/li>\n
3. What technology did students use? What did they use it for?<\/li>\n<\/ol>\n

      For TPACK Efficacy, we adapted only four items measuring preservice teachers\u2019 efficacy to integrate subject areas relevant to our teacher education program of interest: literacy, mathematics, science, and social studies. The reliability was .87 for the adapted four items, compared to the original scale with a reliability of .92 for nine items (Schmidt et al., 2009).<\/p>\n

Modeling<\/em><\/strong>.<\/strong> The third instrument focused on modeling and was adapted from Schmidt et al.'s (2009) measure of Models of TPACK (faculty, PK-6 teachers). Preservice teachers were first asked to name one individual who was an exceptional model of technology integration and to describe his or her role. Seven items then asked preservice teachers to rate how effectively university classes had modeled technology integration (i.e., Literacy Methods, Mathematics Methods, Science Methods, Social Studies Methods, Technology Methods, Practicum\/Student Teaching, and Reading Center) on a Likert scale from 1 (highly ineffectively<\/em>) to 5 (highly effectively<\/em>). Six items were adapted from Schmidt et al. (2009), and we added one item to address modeling at the Reading Center, which is an integral part of our teacher education program. The reliability for these items in this study was .57. Schmidt et al. (2009) did not provide reliability for the items measuring modeling.<\/p>\n

      Data Analysis Procedures<\/h3>\n

Our first research question focused on how preservice teachers' TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency changed across cohorts. To address this question, we used one-way ANOVAs to determine any differences in preservice teachers' reported scores across cohorts. Before conducting the ANOVAs, we conducted chi-square tests to examine the characteristics of participants across cohorts regarding their gender, age, and program focus. Participants' age was categorized into three groups (19 to 25, 26 to 30, and 31 to 50). Initial analysis also examined potential outliers and the normality and homogeneity of the measured variables. Because Technology Knowledge was calculated as the mean of its items, Cronbach's alpha was computed to examine its internal consistency.<\/p>\n
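As an illustration of this analysis plan, the sketch below runs a chi-square test of cohort-by-gender independence and a one-way ANOVA across cohorts using scipy; the data are randomly generated stand-ins (the study's survey file is not public), and the column names are hypothetical.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Stand-in data: one row per preservice teacher (the study had N = 891 across 11 cohorts).
rng = np.random.default_rng(0)
survey = pd.DataFrame({
    "cohort": rng.integers(1, 12, size=891),
    "gender": rng.choice(["female", "male"], size=891, p=[0.9, 0.1]),
    "tpack_literacy": rng.integers(1, 6, size=891),  # 1-5 Likert rating
})

# Chi-square test: are cohorts comparable on gender before comparing means?
chi2, p_gender, dof, _ = stats.chi2_contingency(pd.crosstab(survey["cohort"], survey["gender"]))

# One-way ANOVA: does TPACK Efficacy for literacy differ across the 11 cohorts?
groups = [g["tpack_literacy"].to_numpy() for _, g in survey.groupby("cohort")]
f_stat, p_anova = stats.f_oneway(*groups)

print(f"gender: chi2 = {chi2:.2f}, p = {p_gender:.3f}; literacy ANOVA: F = {f_stat:.2f}, p = {p_anova:.3f}")
```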

For the second and third research questions, we applied structural equation modeling (SEM) to examine the relationships among teacher program Modeling, TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency. SEM is a statistical technique that models the relationships among latent factors. We applied a two-step process: (a) a confirmatory factor analysis (CFA) to confirm that the measurement model fit the data and (b) a structural regression model to examine the relationships among the latent factors (Thompson, 2000).<\/p>\n
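The authors do not name their SEM software. Purely to illustrate the two-step approach, the sketch below specifies a comparable measurement and structural model in the Python package semopy (one possible tool), using lavaan-style syntax; all manifest variable names are hypothetical stand-ins for the survey items.

```python
import semopy  # assumes the semopy package is installed

# Step 1 (measurement model / CFA): each latent factor is defined by its manifest items.
# Step 2 (structural model): regressions among the latent factors, mirroring Figure 1.
model_desc = """
Modeling  =~ model1 + model2 + model3 + model4 + model5 + model6
Efficacy  =~ eff_lit + eff_math + eff_sci + eff_ss
TechKnow  =~ tk1 + tk2 + tk3 + tk4 + tk5 + tk6 + tk7 + tk8
Frequency =~ freq_lit + freq_math + freq_sci + freq_ss

Efficacy  ~ Modeling
TechKnow  ~ Modeling
Frequency ~ TechKnow + Efficacy + Modeling
"""

model = semopy.Model(model_desc)
# model.fit(survey)               # survey: DataFrame of item responses (robust estimators vary by tool)
# print(model.inspect())          # factor loadings and path coefficients
# print(semopy.calc_stats(model)) # fit indices such as chi-square, CFI, and RMSEA
```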

      Additionally, for Questions 2 and 3 we analyzed narrative data from the open-ended questions in the survey using Miles and Huberman\u2019s (1994) thematic coding. After reading through all data to get a sense of the content, we reread using open coding, assigning codes created initially as well as adding axial codes as needed. We then read the data a third time looking for patterns and answers to research questions. Rich and thick quotes (Creswell, 1998) were selected to express how preservice teachers explained actual lessons they taught integrating technology as well as how their best teaching models integrated technology.<\/p>\n

      Program Development<\/h3>\n

Over a period of 5 years, the teacher education program was redesigned to strengthen preservice teachers' TPACK (Trainin & Friedrich, 2014; Trainin, Friedrich, & Deng, 2013). Each component built upon the five elements of professional development that Desimone (2009) has shown to be effective: focused content, collective participation, active learning, duration, and coherence.<\/p>\n

      In addition, Dagen and Bean (2014) noted a new wave of research emphasizing collaborative learning as a key feature, taking into consideration the teacher\u2019s organization. They maintained that \u201ceffective professional development would encompass as many of those features as appropriate for a specific professional development initiative\u201d (p. 47). We discuss each component of the transformed teacher education program in relation to these core features of effective professional development (see Figure 2).<\/p>\n

      \"\"<\/a>
Figure 2.<\/strong> Teacher education program redesign components as rolled out across cohorts.<\/em><\/figcaption><\/figure>\n


      Technology Integration Planning and Baseline Data Collection.<\/em><\/strong> The transformation began with a University Reading Center pilot, where we had full control over devices and apps, one-to-one usage with students, and supervision to allow teacher educators to model in class and preservice teachers to enact TPACK in real time. The course content focused on strategies to assist striving readers and writers and was designed to engage preservice teachers in collaborative learning to plan lessons and share student results.<\/p>\n

At the same time the pilot was enacted, we collected baseline data from the preservice teachers who were then in student teaching. Cohort 1 completed the adapted Survey of Preservice Teachers' TPACK to provide a baseline measure of TPACK Efficacy to plan and teach TPACK lessons, as well as frequency of actual implementation, technology knowledge, and effectiveness of teacher educators in modeling technology integration. Each following cohort completed the same survey during student teaching.<\/p>\n

Technology Pilots in Method Classes.<\/em><\/strong> The literacy methods course demonstrated how technology fit into content and pedagogy (TPACK) as iPads were integrated into teaching and learning. The methods course redesign began with preservice teachers using a class set of first-generation iPads, cameras, and software. Instructors modeled a variety of apps and discussed uses for teaching literacy components, which preservice teachers then used to teach elementary students in the associated practicum. The program built upon this learning to integrate the technology component into focused content in mathematics, science, and social studies method courses in successive semesters using an active learning format, in which preservice teachers observed and participated in class and then taught in practicum.<\/p>\n

      Professional Development Conferences.<\/em><\/strong> One professional development conference per semester offered preservice teachers, cooperating teachers, and teacher educators opportunities to learn and collaborate around technology. The goal of the conferences was to help all three teacher groups develop as professionals integrating technology through collective participation with each other. The program required preservice teachers and encouraged cooperating teachers and teacher educators to attend university-planned conferences that provided hands-on technology practice through active learning using real classroom examples shared by peers from all groups.<\/p>\n

      The conferences assisted cooperating teachers in integrating technology in meaningful ways to assist their schools and to provide locations where preservice teachers could experience effective integration in action. Wepner et al. (2012) found that school-university partnerships can expose teachers to new methodologies, provide innovative and cutting-edge ideas for the classroom, encourage collaborative inquiry about practice, renew the love of teaching, and develop teacher leadership. Building upon collaborative partnerships with the local school districts, all teachers were invited to attend the professional development conferences along with the preservice teachers. As we observed teachers grow in technology integration, we invited them to present at upcoming professional development conferences.<\/p>\n

The format of the conference frequently began with a keynote that challenged participants to consider emerging issues in education, including 1:1 technology integration in classrooms, innovative learning spaces, classrooms of the future, makerspaces, and project-based learning with STEAM (science, technology, engineering, art, and mathematics) curriculum. Participants then attended self-selected sectionals to meet individual goals.<\/p>\n

      For example, an elementary teacher modeled how she used Green Screen technology to empower students to make videos to demonstrate learning about systems of the human body. This presenter modeled the process and showed student sample projects before inviting participants to collaborate with a partner to create a video during the sectional.<\/p>\n

      Although the conference lasted one day, the duration of the learning continued as preservice teachers collaborated throughout the semester with cooperating teachers, peers, and supervisors (Friedrich & Trainin, 2016). A prototypical 5-hour conference offered fifteen 45-minute sessions plus a keynote. Classroom teachers presented 11 sessions with university instructors and State Department of Education personnel presenting two sessions each. All sessions utilized a bring-your-own-device hands-on format.<\/p>\n

Faculty Training<\/em><\/strong>.<\/em> Parallel to the professional development conferences, teacher educators received ongoing professional development through the redesigned program. Instructors were invited to attend monthly collaborative learning meetings, where all attendees shared new tools and uses and answered questions. A university-focused professional development conference each summer challenged teacher educators to innovate teaching methods and share their learning with other teacher educators from across the state.<\/p>\n

Sectionals supported instructor needs ranging from novice to expert (e.g., online teaching and feedback, mobile devices in the classroom, collaborating, Google tools for teacher productivity and student learning, and updates on technology integration at the elementary, secondary, and university levels). Through collective participation in an active learning format, instructors encountered tools and strategies used in their content focus area.<\/p>\n

Technology Integration Class Redesign<\/em><\/strong>.<\/strong> The technology integration course was reimagined to fit the new vision for preservice teachers. The first step was to move the class to the beginning of the professional program. In this way preservice teachers gained pedagogical knowledge along with the accompanying technological and integration skills that could be used over the duration of the program. The curriculum was changed to build on the availability of mobile devices and, eventually, district one-to-one integration. The course itself was split so that later in the program we could add a practicum in technology integration during the literacy methods course and practicum.<\/p>\n

Tablet Requirement<\/em><\/strong>. <\/strong>The redesigned program required preservice teachers to have a tablet for entrance into the teacher education program. This intentional decision provided environmental support, ensuring that each preservice teacher had equal access to technology for teaching even when schools and cooperating teachers differed in their access to and uses of technology. The college supported purchases for students with financial difficulty. Device availability in class and practicum allowed full participation in courses that were redesigned for learning in and through technology.<\/p>\n

      Technology Practicum.<\/em><\/strong> Preservice teachers engaged in a technology practicum during literacy methods semester. The program provided coaching by university supervisors in practicum classrooms as a model of technology integration, an environmental support to sustain instructional change by scaffolding meaningful technology integration by preservice teacher\/cooperating teacher teams. When appropriate, these coaches suggested learning activities where technology could allow K-5 students to learn using digital sources in addition to print sources and, when needed, assisted with teaching lessons that involved using technology to teach and learn.<\/p>\n

Professional Development Class for Cooperating Teachers.<\/em><\/strong> The program offered a parallel course for interested cooperating teachers to learn the same uses for technology in the classroom that their preservice teacher was learning in the technology practicum. This course supported cooperating teachers as they explored tools and designed lessons, implementing them in their classrooms with their preservice teachers, supported by a university coach.<\/p>\n

      Makerspace.<\/em><\/strong> The program continues to add components in an effort to prepare preservice teachers for the rapidly emerging technologies and pedagogies entering schools. The most recent addition is a Makerspace component integrated into the technology integration class. Effective technology integration today empowers students as creators using technology, and the Makerspace is an effort to make sure that all preservice teachers have the capacity to engage with making (Sheninger & Murray, 2017). Learning in a supportive environment where trial and error is encouraged, preservice teachers ask questions and create projects to solve real problems.<\/p>\n

      Results<\/h2>\n

Before answering the research questions, we conducted initial analyses to examine whether the assumptions for multivariate analyses were met. Kolmogorov-Smirnov tests indicated violations of normality for preservice teachers' TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency. We also inspected Q-Q plots, skewness (ranging from -1.11 to .57), and kurtosis (ranging from -1.27 to 1.35), which indicated reasonable normality for all variables. The homogeneity of variance assumption was met for all variables in TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency (p > .05). Examination of demographic factors as potential covariates revealed no significant differences across cohorts in preservice teachers' gender, χ2(10, N = 891) = 3.11, p = .927; age, χ2(20, N = 891) = 22.63, p = .066; or program focus, χ2(40, N = 891) = 38.16, p = .10.<\/p>\n
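For illustration, the checks described above map onto standard scipy routines; the sketch below is a minimal version assuming the scores for one variable are in a NumPy array, and it uses Levene's test as one common homogeneity check (the article does not name the specific homogeneity test used).

```python
import numpy as np
from scipy import stats

def check_assumptions(scores: np.ndarray, cohort_groups: list) -> None:
    """Normality and homogeneity-of-variance checks of the kind reported above."""
    # Kolmogorov-Smirnov test against a normal distribution with the sample's mean and SD
    ks_stat, ks_p = stats.kstest(scores, "norm", args=(scores.mean(), scores.std(ddof=1)))
    print(f"K-S p = {ks_p:.3f}, skewness = {stats.skew(scores):.2f}, kurtosis = {stats.kurtosis(scores):.2f}")

    # Homogeneity of variance across cohorts (Levene's test; p > .05 supports the assumption)
    lev_stat, lev_p = stats.levene(*cohort_groups)
    print(f"Levene p = {lev_p:.3f}")

# Hypothetical usage: check_assumptions(survey["tpack_literacy"].to_numpy(), groups)
```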

      Growth<\/h3>\n

      TPACK Efficacy<\/em><\/strong>. Four separate one-way ANOVAs were conducted to examine any differences across cohorts in preservice teachers\u2019 TPACK Efficacy in four content areas, including literacy, mathematics, science, and social studies. The results suggested significant differences across cohorts for technology integration with literacy, F<\/em>(10,881) = 73.08, p < <\/em>.001; mathematics, F<\/em>(10,881) = 53.59, p < <\/em>.001; science, F<\/em>(10,881) = 33.87, p < <\/em>.001; and social studies, F<\/em>(10,881) = 35.25, p < <\/em>.001. Detailed descriptive statistics are presented in Table 1.<\/p>\n

Table 1<\/strong>
Means and Standard Deviations by Cohort Number for TPACK Efficacy Subject Areas and Technology Knowledge<\/em>

Variable | Cohort 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | d (11 to 1)
TE in Literacy | 1.79 (.74) | 3.95 (.87) | 3.94 (.87) | 4.00 (.88) | 4.00 (.87) | 4.11 (.76) | 4.31 (.69) | 4.29 (.67) | 4.07 (.93) | 4.13 (.97) | 4.15 (.78) | 3.10
TE in Math | 1.75 (.62) | 3.91 (1.02) | 3.90 (1.02) | 4.04 (.97) | 4.00 (.92) | 4.05 (.89) | 4.13 (.80) | 4.09 (.81) | 3.87 (1.06) | 3.84 (1.08) | 4.05 (.91) | 2.95
TE in Science | 2.11 (.89) | 3.74 (.86) | 3.74 (.86) | 3.70 (.96) | 3.86 (.92) | 3.96 (.78) | 3.92 (.81) | 3.92 (.79) | 3.75 (.98) | 3.52 (1.00) | 3.95 (.85) | 2.11
TE in Social Studies | 2.04 (.98) | 3.71 (.91) | 3.72 (.89) | 3.79 (.97) | 3.76 (.89) | 3.96 (.82) | 3.87 (.80) | 3.88 (.78) | 3.75 (.98) | 3.69 (1.08) | 3.96 (.86) | 2.08
TK | 3.94 (.80) | 3.52 (.68) | 3.85 (.65) | 3.85 (.51) | 3.82 (.62) | 3.84 (.55) | 3.82 (.47) | 3.93 (.58) | 3.87 (.60) | 3.84 (.49) | 4.07 (.48) |

Notes:<\/em> TE = TPACK Efficacy, TK = Technology Knowledge. Values are M (SD) on a 5-point scale. For TE, 1 = Strongly Disagree, 2 = Disagree, 3 = Neither Agree Nor Disagree, 4 = Agree, 5 = Strongly Agree. For TK, 1 = Highly Ineffectively, 2 = Somewhat Ineffectively, 3 = Neutral, 4 = Somewhat Effectively, 5 = Highly Effectively. d (11 to 1) = Cohen's d comparing Cohort 11 with Cohort 1.


We conducted follow-up procedures with a Tukey HSD post hoc test to compare differences among cohorts. The findings revealed incremental improvement that became significant over multiple cohorts. For example, Technology Integration Frequency for literacy in Cohort 1 was significantly lower than that in all other cohorts (p < .001 for all comparisons). Technology Integration Frequency for literacy in Cohort 2 (M = 2.10, SD = .64) was significantly lower than that in most following cohorts; for example, Cohort 7 (M = 2.44, SD = .71, p = .04) and Cohort 9 (M = 2.63, SD = .71, p < .001).<\/p>\n

For mathematics, Technology Integration Frequency in Cohort 1 was significantly lower than in Cohort 2 (p = .031) and the rest of the cohorts (p < .001 for all comparisons). Technology Integration Frequency in Cohort 2 was significantly lower than that in Cohort 9 and beyond. For science, Technology Integration Frequency in Cohort 1 was significantly lower than in Cohort 3 (p = .008), Cohort 6 (p = .011), and all subsequent cohorts.<\/p>\n

For social studies, Technology Integration Frequency in Cohort 1 was significantly lower than in Cohort 2 (p = .022), Cohort 3 (p < .001), and beyond. Technology Integration Frequency in Cohort 2 (M = 1.94, SD = .64) was significantly lower than that in Cohort 11 (M = 2.22, SD = .71, p = .002). Technology Integration Frequency in Cohort 5 (M = 1.90, SD = .64) was significantly lower than that in Cohort 10 (M = 2.27, SD = .72, p = .047) and Cohort 11 (M = 2.22, SD = .71, p < .001). Last, Cohort 8 (M = 1.92, SD = .69) was significantly lower than Cohort 11 (M = 2.22, SD = .71, p = .001). The change in social studies appears to have been more incremental than in the other domains.<\/p>\n
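A Tukey HSD comparison of this kind is available in statsmodels; the sketch below runs it on randomly generated stand-in data (the cohort labels and ratings are not the study's), purely to show the shape of the analysis.

```python
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# Stand-in data: one 1-4 integration-frequency rating and one cohort label per preservice teacher.
rng = np.random.default_rng(1)
cohort = rng.integers(1, 12, size=891)
freq_literacy = np.clip(rng.normal(1.5 + 0.1 * cohort, 0.65), 1, 4)

# Pairwise cohort comparisons with family-wise error control (alpha = .05)
tukey = pairwise_tukeyhsd(endog=freq_literacy, groups=cohort, alpha=0.05)
print(tukey.summary())
```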

      Overall, the effect sizes (Cohen\u2019s d<\/em>; Cohen, 1969) for the Technology Integration Frequency differences between Cohort 1 and Cohort 11 were large, with the values ranging from .90 to 1.72 (see Table 2), indicating that preservice teachers in Cohort 11 integrated technology more frequently in all four subject areas than those at baseline.<\/p>\n

Table 2<\/strong>
Means and Standard Deviations for the Frequency of Technology Integration by Subject Area<\/em>

Subject Area | Cohort 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | d (3 to 1) | d (11 to 8) | d (11 to 1)
Literacy | 1.61 (.60) | 2.10 (.64) | 2.36 (.61) | 2.36 (.62) | 2.23 (.63) | 2.41 (.56) | 2.44 (.71) | 2.42 (.71) | 2.63 (.71) | 2.81 (.71) | 2.71 (.68) | 1.24 | .42 | 1.72
Math | 1.75 (.71) | 2.14 (.60) | 2.49 (.79) | 2.47 (.70) | 2.41 (.77) | 2.34 (.72) | 2.40 (.86) | 2.37 (.86) | 2.52 (.76) | 2.59 (.81) | 2.63 (.82) | .99 | .31 | 1.15
Science | 1.65 (.63) | 1.94 (.76) | 2.06 (.71) | 2.07 (.72) | 1.95 (.77) | 2.05 (.76) | 2.06 (.69) | 2.06 (.68) | 2.24 (.76) | 2.11 (.69) | 2.29 (.78) | .61 | .76 | .90
Social Studies | 1.57 (.65) | 1.94 (.64) | 2.07 (.73) | 2.11 (.68) | 1.90 (.64) | 2.20 (.70) | 1.94 (.69) | 1.92 (.69) | 2.22 (.71) | 2.27 (.72) | 2.40 (.77) | .72 | .66 | 1.16

Notes:<\/em> Values are M (SD) on a 4-point scale: 1 = Never, 2 = In a few lessons, 3 = In most lessons, 4 = In all of my lessons. d = Cohen's d for the labeled cohort comparisons.
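As a rough check on the effect sizes above, the sketch below computes Cohen's d from two cohorts' means and standard deviations using a simple pooled SD (the article cites Cohen, 1969, but does not state its exact pooling formula); with the Table 2 literacy values it reproduces the reported 1.72.

```python
import math

def cohens_d(mean_1: float, sd_1: float, mean_2: float, sd_2: float) -> float:
    """Cohen's d: mean difference divided by the pooled standard deviation of two groups."""
    pooled_sd = math.sqrt((sd_1 ** 2 + sd_2 ** 2) / 2)
    return (mean_2 - mean_1) / pooled_sd

# Literacy Technology Integration Frequency, Cohort 1 vs. Cohort 11 (Table 2)
print(round(cohens_d(1.61, 0.60, 2.71, 0.68), 2))  # 1.72, matching the table
```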


      Interestingly, the data showed two waves of increase in Technology Integration Frequency for all four content areas (see Table 2 and Figure 3). The first increase happened between Cohort 1 and Cohort 3, with the effect sizes ranging from .61 to 1.24. In Cohort 1, preservice teachers reported they had integrated technology in very few lessons for all four subject areas. In Cohort 3, preservice teachers reported they had integrated technology in some lessons for literacy and mathematics and a few lessons for science and social studies. Preservice teachers\u2019 Technology Integration Frequency did not change significantly between Cohort 3 and Cohort 8.<\/p>\n

      The second increase happened between Cohort 8 and Cohort 11, with the effect sizes ranging from .31 to .76. In Cohort 11, preservice teachers reported that they had integrated technology in most lessons in literacy and mathematics and in some lessons in science and social studies. Narrative responses describing a lesson in which preservice teachers effectively integrated content and technology matched preservice teachers\u2019 self-efficacy ratings for integrating technology into these content areas, as well as their frequency of integrating technology in lessons in these areas. Preservice teachers most frequently described a literacy or math lesson, with science and social studies being mentioned less.<\/p>\n

      Relationships<\/h3>\n

To answer the second and third research questions, we used a structural equation model (Kline, 2011) to test the fit of the conceptualized theoretical model to the data. The model was tested using the robust maximum likelihood estimator (MLR) with standard errors that are robust to nonnormality. The criteria for model fit included the model χ2; the comparative fit index (CFI; values above .95 indicate good fit, and values at or above .90 indicate reasonable fit; Bentler, 1990); the root mean square error of approximation (RMSEA; values lower than .06 are desirable for good fit; Steiger, 1990); and the standardized root mean square residual (SRMR; values lower than .08 are considered a good fit; Hu & Bentler, 1999). The use of multiple fit indexes is recommended to evaluate the fit of a model with a more holistic view (Kline, 2011). Figure 4 presents parameters from the measurement model and structural model.<\/p>\n

      \"\"<\/a>
Figure 4.<\/strong> Results for the structural equation model. Boldface arrows indicate the structural component among model (i.e., Modeling), tpack_se (i.e., TPACK Efficacy), freq (i.e., Technology Integration Frequency), and tech (i.e., Technology Knowledge).<\/em><\/figcaption><\/figure>\n


Measurement Model<\/em><\/strong>.<\/em> The latent factor of Modeling had six manifest variables describing how effectively university classes modeled technology integration (Cronbach's alpha = .64). The latent factor of TPACK Efficacy had four manifest variables: preservice teachers' efficacy to integrate technology with each subject area (i.e., literacy, mathematics, science, and social studies) to reach lesson objectives (Cronbach's alpha = .87). The latent factor of Technology Knowledge was represented by eight manifest variables (Cronbach's alpha = .73). Last, the latent factor of Technology Integration Frequency had four manifest variables: preservice teachers' frequency of technology integration in literacy, mathematics, science, and social studies (Cronbach's alpha = .71).<\/p>\n

All manifest variables loaded significantly onto their respective latent factors (p < .001 for all standardized coefficient estimates; see Figure 4). The standardized coefficients ranged from .41 to .57 for Modeling, from .76 to .84 for TPACK Efficacy, from .53 to .79 for Technology Knowledge, and from .55 to .67 for Technology Integration Frequency. The overall model fit was good. The chi-square was statistically significant, χ2(200) = 551.15, p < .001, but other fit indices were in the expected range: CFI = .923, TLI = .911, RMSEA = .05 (90% CI [.047, .058]), and SRMR = .047.<\/p>\n

There were significant correlations between Modeling and Technology Knowledge (r = .30, p < .001), Modeling and TPACK Efficacy (r = .25, p < .001), Modeling and Technology Integration Frequency (r = .33, p < .001), TPACK Efficacy and Technology Knowledge (r = .13, p = .028), and Technology Knowledge and Technology Integration Frequency (r = .25, p < .001). The correlation between TPACK Efficacy and Technology Integration Frequency was not significant (r = .09, p = .194).<\/p>\n

Structural Model<\/em><\/strong>.<\/em> A structural model with all latent factors and their respective predictors was tested. Figure 4 presents the results, with significant paths shown as solid lines. The chi-square was statistically significant, χ2(202) = 568.58, p < .001, CFI = .920, TLI = .908, RMSEA = .053 (90% CI [.050, .059]), and SRMR = .053. We compared this model to an alternative model, following a modification suggestion, in which we added a path between Modeling and Technology Integration Frequency. An adjusted chi-square difference test yielded a significantly better fit for the alternative model, Δχ2(1, N = 891) = 15.32, p < .001. Therefore, the alternative model was retained as the final structural model (see Figure 4).<\/p>\n
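For reference, the p-value implied by the reported difference statistic can be checked against the chi-square distribution; the one-liner below does only that and does not recompute the MLR scaling correction that an adjusted (scaled) difference test involves.

```python
from scipy.stats import chi2

# Reported adjusted chi-square difference between the nested models: 15.32 with 1 df
print(chi2.sf(15.32, df=1))  # ~9e-05, consistent with the reported p < .001
```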

Standardized path coefficient values are presented along the paths. The chi-square was statistically significant, χ2(201) = 551.64, p < .001, but the model fit was good according to the other indices: CFI = .923, TLI = .912, RMSEA = .052 (90% CI [.047, .057]), and SRMR = .048. The significant chi-square statistic might be due to the large sample size in the study, as the chi-square statistic is sensitive to sample size and model complexity (Hu & Bentler, 1999).<\/p>\n

      Research Question 2 focused on the contribution of TPACK Efficacy and Technology Knowledge to Technology Integration Frequency in the classroom. Technology Knowledge significantly predicted preservice teachers\u2019 Technology Integration Frequency in the classroom. However, preservice teachers\u2019 TPACK Efficacy did not significantly predict their Technology Integration Frequency. The results suggested that Technology Knowledge contributed to preservice teachers\u2019 technology integration in classroom instruction; TPACK Efficacy, however, did not contribute to their technology integration.<\/p>\n

Research Question 3 examined the impact of teacher program Modeling on Technology Knowledge, TPACK Efficacy, and Technology Integration Frequency. Modeling significantly predicted preservice teachers' TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency, indicating that the modeling of technology integration in the teacher education program positively affected preservice teachers' development of Technology Knowledge and TPACK Efficacy in content areas and enhanced their frequency of effective technology integration in classroom instruction.<\/p>\n

      Description of Technology Integration<\/h3>\n

      We used students\u2019 self-reported technology integration to examine the enactment of TPACK. Narrative descriptions showed that preservice teachers\u2019 most effective lessons changed from teacher presentations (in early cohorts) to greater student use of technology in later cohorts. For example, Cohort 4 reported 14 occasions in which preservice teachers used technology to show a presentation, 20 occasions where the preservice teacher showed a presentation and students interacted using technology, and no occasions where students created a presentation to demonstrate learning. By Cohort 10 preservice teachers reported their effective technology lessons as five occasions where they created and showed a presentation in teaching, 14 occasions where students interacted with the teacher-made presentation, and five occasions where students created multimedia presentations to demonstrate learning. Two examples of these student-created presentations described by preservice teachers in Cohort 10 included the following:<\/p>\n