Figure 1.<\/strong> Theoretical model for effective teacher education impacting technology integration.<\/em><\/figcaption><\/figure>\n <\/p>\n
Literature Review<\/h2>\nTPACK<\/h3>\n TPACK builds on Shulman’s (1986) pedagogical content knowledge framework. Shulman argued that the most effective teaching takes place when teachers merge their understanding of content and pedagogy to plan learning experiences that overcome teaching challenges. TPACK refers to “an emergent form of knowledge that goes beyond all three components (content, pedagogy, and technology)” (Mishra & Koehler, 2006, p. 1028). It is an understanding that emerges from the interaction of these bodies of knowledge, both theoretically and in practice, producing flexible knowledge necessary to successfully integrate technology into teaching (Carpenter et al., 2016; Koehler & Mishra, 2009). Teachers need to understand “not just the subject matter they teach, but also the manner in which the subject matter can be changed by the application of technology” (Mishra & Koehler, 2006, p. 1028).<\/p>\n
These three components are more than the sum of their parts, empowering teachers to facilitate lessons where technology advances student learning to a new level. As devices and uses for technology in schools increase, the TPACK framework adds a technological knowledge component highlighting the need for teachers to know how technology can influence content and pedagogy.<\/p>\n
The TPACK framework has become ubiquitous in the educational technology field and is supported by the American Association of Colleges for Teacher Education (AACTE; Carpenter et al., 2016). The existing literature on this topic has come from work with both established teachers (e.g., Bruce & Chiu, 2015; Graham et al., 2009; Harris & Hofer, 2017) and preservice teachers (e.g., Niess, 2008).<\/p>\n
At the same time, AACTE has embraced the TPACK model for preservice teachers so they learn how and why to integrate technology as they begin planning and teaching (Herring, Koehler, & Mishra, 2016). With constantly evolving technologies, teacher education must prepare preservice teachers to teach in ways that prepare students to learn using these digital tools (Niess, 2008).<\/p>\n
TPACK as a Basis for Program Redesign<\/h3>\n As researchers have begun to focus on techniques to aid TPACK growth in preservice and in-service teachers (e.g., Cavin, 2008; Graham et al., 2009), modifications in courses and fieldwork are emerging (Koehler et al., 2012). Our program redesign began with the three primary foci for developing TPACK in teacher preparation programs, as outlined by Hofer and Grandgenett (2012): “a dedicated educational technology course; content-specific teaching methods, or practicum courses; or through the duration of coursework in a teacher preparation program” (p. 87). We changed the “or” to “and,” however, layering these opportunities to build capacity across the program.<\/p>\n
Empirical studies on developing TPACK have mainly focused on one or two of these components. For example, Chai, Koh, and Tsai (2010) focused on the first component by teaching TPACK in an educational technology course with a cohort of 889 preservice teachers in a postgraduate secondary education program in Singapore. The technology course focused on pedagogical and technological knowledge. The instructors presented a technology tool and its pedagogical use to students organized by subject area, who then created a final thematic unit composed of technology-enhanced lessons in their area. Findings showed that technology courses that directly taught technology tools along with pedagogy raised preservice teachers’ technological and pedagogical knowledge with moderate to large effect sizes.<\/p>\n
Similarly, Maor (2017) studied two consecutive versions of a mainly graduate technology course in Australia in which instructors used blended learning to model technology integration and students collaborated with technology to explore the effect of TPACK on digital pedagogies. Maor found significant TPACK growth in each domain, along with greater confidence and understanding of TPACK application, leading to implementation in the classroom.<\/p>\n
Harris and Hofer (2011) utilized content-specific teaching methods (the second component) in professional development to help teachers go beyond self-evaluating TPACK and put TPACK-in-Action<\/em>. Seven classroom teachers participated in the study of TPACK professional development. The instructor presented examples, descriptions, and suggested technologies to accomplish curriculum goals. Participants then planned a unit by incorporating a variety of learning activities into the content and pedagogy. Teachers noted that adding selected activities and technologies allowed them to effect deeper, more self-directed learning in the classroom. Five of the seven teachers commented on how the activities facilitated the fit among the TPACK domains, teaching requirements, and time.<\/p>\nMouza, Karchmer-Klein, Nandakumar, Yilmaz-Ozden, and Hu (2014) combined the first two components for building TPACK in teacher education. They built on the idea that, when the technology course is integrated with method courses and field experience, preservice teachers benefit by applying learning directly to teaching with technology (Niess, 2005, 2012). Their study examined 88 preservice teachers enrolled in the technology course and related method courses during one semester. All preservice teachers showed significant growth in each TPACK area and applied their knowledge during field experience. However, Mouza et al. noted that it was difficult to place preservice teachers in classrooms where teachers effectively modeled technology integration. Cooperating teachers used technology for teaching and learning in a very limited way, so preservice teachers mainly learned pedagogy (PCK and pedagogical knowledge), not technology integration, from cooperating teachers.<\/p>\n
Hofer and Grandgenett (2012) added the third component of technology integration throughout a program as they examined TPACK integration through a three-semester graduate teaching program with eight participants. Results indicated growth in TPACK throughout the program, but the largest gains occurred when preservice teachers were concurrently enrolled in the educational technology course and their first method course, where they discussed teaching strategies, lesson planning, and technology integration.<\/p>\n
Preservice teachers\u2019 TPACK in lesson plans fell slightly during student teaching, and the authors suggested that the demands of classroom practice may have negatively impacted technology integration. Hofer and Grandgenett (2012) suggested a need for more longitudinal studies of TPACK across teacher education programs.<\/p>\n
Current research demonstrates that the three TPACK components are being used successfully in teacher preparation programs; however, it also indicates the need for further investigations focusing on sustainable, longitudinal, programwide approaches. The current study integrated all three components (a technology course; technology infused into method courses and field experiences; and integration across the whole program) into consecutive iterations.<\/p>\n
The Role of Teacher Efficacy<\/h3>\n Ertmer and Ottenbreit-Leftwich (2010) suggested that to change and sustain teachers’ technology practices, teacher educators need to focus on knowledge, self-efficacy, pedagogical beliefs, and culture in both teacher education programs and teacher professional development. Research on motivation emphasizes the role beliefs play in influencing persistence, behaviors, and achievement.<\/p>\n
The motivational construct of self-efficacy<\/em> (Bandura, 1986) has become the focus of educational research in varied domains, such as mathematics, science, reading, writing, and sports (Bandura, 1997; Pajares, 1997; Pajares & Miller, 1994; Schunk & Zimmerman, 2007). Self-efficacy is a person’s estimation of the probability of success if they attempt to organize and execute the actions required to accomplish a task (Bandura, 1986). In education, self-efficacy has been shown to be a powerful predictor of students’ motivation and academic achievement (e.g., see Schunk & Pajares, 2009).<\/p>\nTeacher self-efficacy refers to teachers’ beliefs about their capacity to accomplish pedagogical tasks (Bandura, 1986). It is the basis for understanding teachers’ beliefs about their ability to translate their knowledge into successful action. For example, Abbitt (2011) found that teacher efficacy for technology integration interacted with TPACK in predicting change in technology integration.<\/p>\n
Teacher efficacy is crucial in making sure that the capacity teachers acquire will actually be used in the classroom. As illustrated in Figure 1, successful implementation of educational change, in our case technology integration, requires the confluence of knowledge, motivation, and resources. TPACK alone may not translate into sustained integration into teaching and student learning without teachers believing they can do it (Bauer & Kenton, 2005; Corkin, Ekmekci, White, & Fisher, 2016; Ertmer & Ottenbreit-Leftwich, 2010; Wozney, Venkatesh, & Abrami, 2006).<\/p>\n
Teachers and preservice teachers need multiple experiences integrating technology in classrooms and practicum situations to build confidence through personal mastery and vicarious learning, the strongest sources of self-efficacy (Bandura, 1997).<\/p>\n
The Role of Modeling<\/h3>\n Preservice teachers have been learning from their own teachers throughout their K-12 schooling in a process Lortie (1975) called “the apprenticeship of observation.” However, as students, they do not always have access to the knowledge, skills, and reasoning behind the myriad procedures they observe, sometimes causing misconceptions about teaching. Modeling, on the other hand, is a high-leverage activity that can scaffold vicarious learning into personal mastery when teacher educators and teachers share their thought processes to support actions and move preservice teachers into the role of teacher (Grossman, Hammerness, & McDonald, 2009).<\/p>\n
Ertmer (2003) found that when teacher educators, cooperating teachers, and preservice teachers collaborate to plan technology integrated lessons, modeling happens naturally as teachers each demonstrate their area of expertise. Ertmer further noted that some teacher education programs explicitly model what a meaningful technology integrated lesson looks like before preservice teachers try to create lessons themselves. In such ways, teachers at all levels tend to benefit from observing a variety of expert performance as they move toward more advanced levels of technology use.<\/p>\n
Baran, Canbazoglu Bilici, Albayrak Sari, and Tondeur (2017) showed that instructor modeling in three teacher education programs in Turkey was a significant predictor of preservice teachers’ TPACK perceptions. Angeli (2005) used explicit modeling by teacher educators to explain and demonstrate their process of integrating lessons with technology to prepare preservice teachers. After building confidence by observing an expert, preservice teachers created their own technology-integrated science lessons for elementary students, guided by teacher educators. Findings showed that along with modeling, teacher educators also need to explain the pedagogical reasoning so preservice teachers see “how the teacher’s role changes, how the subject matter gets transformed, and how the learning process is enhanced” (Angeli, 2005, p. 395). What’s more, teacher educators should explicitly teach how to apply the unique features of a tool to transform a specific content domain in ways not possible without the tool.<\/p>\n
Summary<\/h3>\n In order to create meaningful change in the ways teachers use technology in their classrooms, knowledge and self-efficacy have to be purposefully attended to, while making sure that resources are available so technology can be used. To move the field forward, all stakeholders in a teacher education program need to move together. University faculty need to model effective use of technology in courses and empower preservice teachers to utilize these tools in coursework and beyond. Cooperating teachers need professional development on integrating technology into instruction as personal digital devices become ubiquitous in education.<\/p>\n
The model presented in Figure 1 was the basis of the redesign in our teacher education program. We progressively added components supporting all aspects of the model (TPACK efficacy, technology knowledge, and resources) to create optimal conditions for developing teachers ready to teach in the 21st century.<\/p>\n
We focused on three questions:<\/p>\n
\nHow do preservice teachers’ TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency change over time in response to integration of technology practices into the teacher education program?<\/li>\n What is the contribution of TPACK Efficacy and Technology Knowledge to Technology Integration Frequency in the classroom?<\/li>\n What is the impact of modeling on TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency?<\/li>\n<\/ol>\nMethods<\/h2>\nParticipants<\/h3>\n The participants were 891 preservice teachers (801 female, 90%) from 11 cohorts across consecutive semesters (n<\/em>1<\/sub> = 92, n<\/em>2<\/sub> = 75, n<\/em>3<\/sub> = 82, n<\/em>4<\/sub> = 82, n<\/em>5<\/sub> = 81, n<\/em>6<\/sub> = 83, n<\/em>7<\/sub> = 80, n<\/em>8<\/sub> = 65, n<\/em>9<\/sub> = 107, n<\/em>10<\/sub> = 64, n<\/em>11<\/sub> = 80) from fall 2011 to fall 2016 at a large Midwestern university. All participants were undergraduate students. Most were traditional students aged 19 to 25 (n<\/em> = 846; 95%), in addition to 24 students aged 26 to 30 (3%) and 21 students aged 31 to 50 (2%). They were enrolled in an elementary education program, with 58% focusing on elementary-only, 24% on elementary special education, 12% on inclusive P-3 education, 5% on early elementary education, and 1% on elementary and English learners. The majority of the participants were Caucasian (n<\/em> = 864, 97%), with some Hispanic (n<\/em> = 13), African American (n<\/em> = 9), and Asian American (n<\/em> = 5) participants. At the time the data were collected, the participants were at the end of student teaching in their final semester.<\/p>\nMeasures<\/h3>\n An online survey was administered to all student teachers in their last semester in the program (student teaching; see Appendix). After responding to demographic questions, preservice teachers were introduced to three instruments.<\/p>\n
Technology Knowledge.<\/em><\/strong> The first instrument measured Technology Knowledge, adapted from the Survey of Preservice Teachers’ Knowledge of Teaching and Technology (Schmidt et al., 2009). There were seven Technology Knowledge items on a scale from 1 (strongly disagree<\/em>) to 5 (strongly agree<\/em>). We adapted six items and replaced the item with the lowest reported factor loading (.65), “I have had sufficient opportunities to work with different technologies,” with “Colleagues often ask me to help them with technology,” an item developed by the researchers. This item focused on preservice teachers’ mastery experience working with technology and was validated with technology coaches from across the state. The reliability of the seven items in this study was .88 using Cronbach’s alpha, slightly higher than the value of .82 reported previously (Schmidt et al., 2009).<\/p>\nTPACK Efficacy.<\/em><\/strong> The second instrument included (a) measurement of preservice teachers’ TPACK Efficacy in designing and teaching lessons that combine subject matter and technology to reach objectives (adapted from Schmidt et al.’s TPACK knowledge domain, 2009) on a Likert scale from 1 (highly ineffectively<\/em>) to 5 (highly effectively<\/em>), (b) the frequency of such lessons on a Likert scale from 1 (never<\/em>) to 4 (in all of my classes<\/em>), and (c) three open-ended questions soliciting preservice teachers’ detailed description of a lesson in which they integrated content and technology effectively to reach their lesson objectives:<\/p>\n\nWhat was the content?<\/li>\n What technology did you use? What did you use it for?<\/li>\n What technology did students use? 
What did they use it for?<\/li>\n<\/ol>\nFor TPACK Efficacy, we adapted only four items, measuring preservice teachers’ efficacy to integrate the subject areas relevant to our teacher education program: literacy, mathematics, science, and social studies. The reliability was .87 for the four adapted items, compared to .92 for the nine items of the original scale (Schmidt et al., 2009).<\/p>\n
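The reliabilities reported above (.88 and .87) are Cronbach's alpha values. As a minimal illustration of how alpha is computed from item and total-score variances, the sketch below uses hypothetical Likert responses, not the study's data:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 1-5 Likert responses: 6 respondents x 4 items
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 2],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])
alpha = cronbach_alpha(responses)  # high inter-item consistency -> alpha near 1
```

With highly consistent items, as here, alpha approaches 1; values near the reported .87–.88 indicate good internal consistency for a research scale.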
Modeling<\/em><\/strong>.<\/strong> The third instrument focused on modeling, adapted from Schmidt et al.’s (2009) measure of Models of TPACK (faculty, PK-6 teachers). Preservice teachers were first asked to name one individual who was an exceptional model of technology integration and describe his or her role. Seven items followed, asking preservice teachers to rate how effectively university classes modeled technology integration (i.e., Literacy Methods, Mathematics Methods, Science Methods, Social Studies Methods, Technology Methods, Practicum\/Student Teaching, and Reading Center) on a Likert scale from 1 (highly ineffectively<\/em>) to 5 (highly effectively<\/em>). We adapted six of the items from Schmidt et al. (2009) and added one item to address modeling at the Reading Center, which is an integral part of our teacher education program. The reliability for these items in this study was .57. Schmidt et al. (2009) did not report reliability for the items measuring modeling.<\/p>\nData Analysis Procedures<\/h3>\n Our first research question focused on how preservice teachers’ TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency changed across cohorts. To achieve this goal, we used one-way ANOVAs to determine any differences in preservice teachers’ reported scores across cohorts. Before conducting the ANOVAs, we conducted chi-square tests to examine the characteristics of participants across cohorts regarding their gender, age, and program focus. Participants’ age was categorized into three groups (19 to 25, 26 to 30, and 31 to 50). Initial analysis also examined potential outliers and the normality and homogeneity of measured variables. We used the mean of items to calculate Technology Knowledge; therefore, Cronbach’s alpha was computed to examine its internal consistency.<\/p>\n
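The cohort comparisons described above can be sketched with standard SciPy routines. The scores and counts below are simulated for illustration only; they echo the structure of the analyses (a one-way ANOVA across cohorts and a chi-square test on demographic counts), not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical TPACK Efficacy scores for three of the eleven cohorts;
# the baseline cohort is deliberately simulated with a lower mean.
cohort1 = rng.normal(1.8, 0.7, 92)
cohort2 = rng.normal(3.9, 0.9, 75)
cohort3 = rng.normal(3.9, 0.9, 82)

# One-way ANOVA: does mean efficacy differ across cohorts?
f_stat, p_value = stats.f_oneway(cohort1, cohort2, cohort3)

# Chi-square test of cohort-by-age-group counts (hypothetical 3x3 table,
# rows = cohorts, columns = age groups 19-25, 26-30, 31-50)
age_by_cohort = np.array([
    [85, 5, 2],
    [70, 3, 2],
    [78, 2, 2],
])
chi2, chi_p, dof, expected = stats.chi2_contingency(age_by_cohort)
```

Because the simulated baseline cohort sits far below the others, the ANOVA yields a very large F with p < .001, mirroring the pattern of results reported later; the chi-square test checks that cohorts do not differ demographically before the ANOVAs are interpreted.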
For the second and third research questions, we applied structural equation modeling (SEM) to examine the relationships among teacher program Modeling, TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency. SEM is a statistical technique that models the relationships among latent factors. We applied a two-step process: (a) a confirmatory factor analysis (CFA) to confirm that the measurement model fit respective data; and (b) a structural regression model to examine the relationships among latent factors (Thompson, 2000).<\/p>\n
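The two-step SEM approach can be expressed in the model syntax accepted by packages such as lavaan (R) or semopy (Python): `=~` defines latent factors from observed items (the CFA step), and `~` specifies structural regressions. The specification below is a hypothetical sketch; the item names (mod1, tk1, te_lit, etc.) are placeholders, not the survey's actual item labels:

```python
# Hypothetical lavaan/semopy-style specification mirroring the two-step
# process: a measurement model (CFA) plus structural regressions.
model_spec = """
# Measurement model (CFA step): latent factors defined by survey items
Modeling      =~ mod1 + mod2 + mod3 + mod4 + mod5 + mod6 + mod7
TechKnowledge =~ tk1 + tk2 + tk3 + tk4 + tk5 + tk6 + tk7
TPACKEfficacy =~ te_lit + te_math + te_sci + te_soc

# Structural step: regressions among factors
TPACKEfficacy ~ Modeling + TechKnowledge
IntegrationFrequency ~ TPACKEfficacy + TechKnowledge + Modeling
"""
```

In practice the CFA is fit first to confirm the measurement model, and only then is the structural model estimated, following the two-step logic the authors cite from Thompson (2000).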
Additionally, for Questions 2 and 3 we analyzed narrative data from the open-ended questions in the survey using Miles and Huberman’s (1994) thematic coding. After reading through all data to get a sense of the content, we reread using open coding, assigning initial codes and adding axial codes as needed. We then read the data a third time, looking for patterns and answers to the research questions. Rich, thick quotes (Creswell, 1998) were selected to express how preservice teachers explained actual lessons they taught integrating technology, as well as how their best teaching models integrated technology.<\/p>\n
Program Development<\/h3>\n Over a period of five years the teacher education program was redesigned to strengthen preservice teachers’ TPACK (Trainin & Friedrich, 2014; <\/span>Trainin, Friedrich, & Deng, 2013<\/span>). Each component built upon the five elements of professional development that Desimone (2009) has shown to be effective: focused content, collective participation, active learning, duration, and coherence.<\/p>\nIn addition, Dagen and Bean (2014) noted a new wave of research emphasizing collaborative learning as a key feature, taking the teacher’s organizational context into consideration. They maintained that “effective professional development would encompass as many of those features as appropriate for a specific professional development initiative” (p. 47). We discuss each component of the transformed teacher education program in relation to these core features of effective professional development (see Figure 2).<\/p>\n <\/a>Figure 2.<\/strong> Teacher education program redesign components as rolled out across cohorts.<\/em><\/figcaption><\/figure>\n <\/p>\n
Technology Integration Planning and Baseline Data Collection.<\/em><\/strong> The transformation began with a University Reading Center pilot, where we had full control over devices and apps, one-to-one usage with students, and supervision to allow teacher educators to model in class and preservice teachers to enact TPACK in real time. The course content focused on strategies to assist striving readers and writers and was designed to engage preservice teachers in collaborative learning to plan lessons and share student results.<\/p>\nAt the same time the pilot was enacted we collected baseline data from the preservice teachers who were then in student teaching. Cohort 1 completed the adapted Survey of Preservice Teachers\u2019 TPACK to provide a baseline measure of TPACK Efficacy to plan and teach TPACK lessons, as well as frequency of actual implementation, technology knowledge, and effectiveness of teacher educators in modeling technology integration. Each cohort following completed the same survey during student teaching.<\/p>\n
Technology Pilots in Method Classes.<\/em><\/strong>\u00a0 The literacy methods course demonstrated how technology fit into content and pedagogy (TPACK) as iPads were integrated into teaching and learning. Preservice teachers’ use of a class set of first-generation iPads, cameras, and software began the methods course redesign. Instructors modeled a variety of apps and discussed uses to teach literacy components, which preservice teachers then used to teach elementary students in the associated practicum. The program built upon this learning to integrate the technology component into focused content in mathematics, science, and social studies method courses in progressive semesters using an active learning format, where preservice teachers observed and participated in class and then taught in practicum.<\/p>\nProfessional Development Conferences.<\/em><\/strong> One professional development conference per semester offered preservice teachers, cooperating teachers, and teacher educators opportunities to learn and collaborate around technology. The goal of the conferences was to help all three teacher groups develop as professionals integrating technology through collective participation with each other. The program required preservice teachers, and encouraged cooperating teachers and teacher educators, to attend university-planned conferences that provided hands-on technology practice through active learning using real classroom examples shared by peers from all groups.<\/p>\nThe conferences assisted cooperating teachers in integrating technology in meaningful ways to assist their schools and to provide locations where preservice teachers could experience effective integration in action. Wepner et al. (2012) found that school-university partnerships can expose teachers to new methodologies, provide innovative and cutting-edge ideas for the classroom, encourage collaborative inquiry about practice, renew the love of teaching, and develop teacher leadership. 
Building upon collaborative partnerships with the local school districts, all teachers were invited to attend the professional development conferences along with the preservice teachers. As we observed teachers grow in technology integration, we invited them to present at upcoming professional development conferences.<\/p>\n
The format of the conference frequently began with a keynote that challenged participants to consider emerging issues in education, including one-to-one technology integration in classrooms, innovative learning spaces, classrooms of the future, makerspaces, and project-based STEAM (science, technology, engineering, art, and mathematics) curricula. Participants then attended self-selected sectionals to meet individual goals.<\/p>\n
For example, an elementary teacher modeled how she used Green Screen technology to empower students to make videos to demonstrate learning about systems of the human body. This presenter modeled the process and showed student sample projects before inviting participants to collaborate with a partner to create a video during the sectional.<\/p>\n
Although the conference lasted one day, the duration of the learning continued as preservice teachers collaborated throughout the semester with cooperating teachers, peers, and supervisors (Friedrich & Trainin, 2016). A prototypical 5-hour conference offered fifteen 45-minute sessions plus a keynote. Classroom teachers presented 11 sessions with university instructors and State Department of Education personnel presenting two sessions each. All sessions utilized a bring-your-own-device hands-on format.<\/p>\n
Faculty Training<\/em><\/strong>.<\/em> Parallel to the professional development conferences, Teacher Educators received ongoing professional development through the redesigned program. Instructors were invited to attend monthly collaborative learning meetings, where all attending shared new tools and uses and answered questions. A university-focused professional development conference each summer challenged teacher educators to innovate teaching methods and share their learning with other teacher educators from across the state.<\/p>\nSectionals supported instructor needs ranging from novice to expert (e.g., online teaching and feedback, mobile devices in the classroom, collaborating, Google tools for teacher productivity and student learning, and update on technology integration at the elementary, secondary, and university levels). Through collective participation in an active learning format, instructors encountered tools and strategies used in their content focus area.<\/p>\n
Technology Integration Class Redesign<\/em><\/strong>.<\/strong> The technology integration course was reimagined to fit the new vision for preservice teachers. The first step was to move the class to the beginning of the professional program. In this way, preservice teachers gained pedagogical knowledge alongside the technological and integration skills that could be used over the duration of the program. The curriculum was changed to build on the availability of mobile devices and, eventually, district one-to-one integration. The course itself was split so that later in the program we could add a practicum in technology integration during literacy methods and practicum.<\/p>\nTablet Requirement<\/em><\/strong>. <\/strong>The redesigned program required preservice teachers to have a tablet for entrance into the teacher education program. This intentional decision provided environmental support ensuring that each preservice teacher had equal access to technology for teaching when schools and cooperating teachers differed in their access to and uses of technology. The college supported purchases for students with financial difficulties. Device availability in class and practicum allowed full participation in courses that were redesigned for learning in and through technology.<\/p>\nTechnology Practicum.<\/em><\/strong> Preservice teachers engaged in a technology practicum during the literacy methods semester. The program provided coaching by university supervisors in practicum classrooms as a model of technology integration, an environmental support to sustain instructional change by scaffolding meaningful technology integration by preservice teacher\/cooperating teacher teams. 
When appropriate, these coaches suggested learning activities where technology could allow K-5 students to learn using digital sources in addition to print sources and, when needed, assisted with teaching lessons that involved using technology to teach and learn.<\/p>\nProfessional Development Class for Cooperating Teachers.<\/em><\/strong> The program offered a parallel course for interested cooperating teachers to learn the same uses for technology in the classroom that their preservice teacher was learning in technology practicum. This course supported cooperating teachers as they explored tools and designed lessons, implementing them in their classrooms with their preservice teachers, supported by a university coach.<\/p>\nMakerspace.<\/em><\/strong> The program continues to add components in an effort to prepare preservice teachers for the rapidly emerging technologies and pedagogies entering schools. The most recent addition is a Makerspace component integrated into the technology integration class. Effective technology integration today empowers students as creators using technology, and the Makerspace is an effort to make sure that all preservice teachers have the capacity to engage with making (Sheninger & Murray, 2017). Learning in a supportive environment where trial and error is encouraged, preservice teachers ask questions and create projects to solve real problems.<\/p>\nResults<\/h2>\n Before answering the research questions, we conducted initial analyses to examine whether assumptions for multivariate analyses were met. Kolmogorov-Smirnov tests indicated violations of normality for preservice teachers’ TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency. However, inspection of Q-Q plots, together with skewness (range: -1.11 to .57) and kurtosis (range: -1.27 to 1.35) values, indicated reasonable normality for all variables. 
The homogeneity of variance assumption was met for all variables in TPACK Efficacy, Technology Knowledge, and Technology Integration Frequency (p<\/em> > .05). We examined the need for covariates among demographic factors, which revealed no significant differences across cohorts in preservice teachers’ gender, χ2<\/sup>(10, N<\/em> = 891) = 3.11, p <\/em>= .927; age, χ2<\/sup>(20, N<\/em> = 891) = 22.63, p <\/em>= .066; or program focus, χ2<\/sup>(40, N<\/em> = 891) = 38.16, p <\/em>= .10.<\/p>\nGrowth<\/h3>\n TPACK Efficacy<\/em><\/strong>. Four separate one-way ANOVAs were conducted to examine differences across cohorts in preservice teachers’ TPACK Efficacy in four content areas: literacy, mathematics, science, and social studies. The results indicated significant differences across cohorts for technology integration with literacy, F<\/em>(10, 881) = 73.08, p <\/em>< .001; mathematics, F<\/em>(10, 881) = 53.59, p <\/em>< .001; science, F<\/em>(10, 881) = 33.87, p <\/em>< .001; and social studies, F<\/em>(10, 881) = 35.25, p <\/em>< .001. 
Detailed descriptive statistics are presented in Table 1.<\/p>\nTable 1<\/strong> \nMeans and Standard Deviations by Cohort Number for TPACK Efficacy Subject Areas and Technology Knowledge<\/p>\n\n\n\n\u00a0Variable<\/strong><\/td>\n1<\/strong><\/td>\n2<\/strong><\/td>\n3<\/strong><\/td>\n4<\/strong><\/td>\n5<\/strong><\/td>\n6<\/strong><\/td>\n7<\/strong><\/td>\n8<\/strong><\/td>\n9<\/strong><\/td>\n10<\/strong><\/td>\n11<\/strong><\/td>\nd3<\/em><\/strong> \n(11 to 1)<\/strong><\/td>\n<\/tr>\n\nTE in Literacy<\/td>\n 1.79 \n(.74)<\/td>\n 3.95 \n(.87)<\/td>\n 3.94 \n(.87)<\/td>\n 4.00 \n(.88)<\/td>\n 4.00 \n(.87)<\/td>\n 4.11 \n(.76)<\/td>\n 4.31 \n(.69)<\/td>\n 4.29 \n(.67)<\/td>\n 4.07 \n(.93)<\/td>\n 4.13 \n(.97)<\/td>\n 4.15 \n(.78)<\/td>\n 3.10<\/td>\n<\/tr>\n \nTE in Math<\/td>\n 1.75 \n(.62)<\/td>\n 3.91 \n(1.02)<\/td>\n 3.90 \n(1.02)<\/td>\n 4.04 \n(.97)<\/td>\n 4.00 \n(.92)<\/td>\n 4.05 \n(.89)<\/td>\n 4.13 \n(.80)<\/td>\n 4.09 \n(.81)<\/td>\n 3.87 \n(1.06)<\/td>\n 3.84 \n(1.08)<\/td>\n 4.05 \n(.91)<\/td>\n 2.95<\/td>\n<\/tr>\n \nTE in Science<\/td>\n 2.11 \n(.89)<\/td>\n 3.74 \n(.86)<\/td>\n 3.74 \n(.86)<\/td>\n 3.70 \n(.96)<\/td>\n 3.86 \n(.92)<\/td>\n 3.96 \n(.78)<\/td>\n 3.92 \n(.81)<\/td>\n 3.92 \n(.79)<\/td>\n 3.75 \n(.98)<\/td>\n 3.52 \n(1.00)<\/td>\n 3.95 \n(.85)<\/td>\n 2.11<\/td>\n<\/tr>\n \nTE in Social Studies<\/td>\n 2.04 \n(.98)<\/td>\n 3.71 \n(.91)<\/td>\n 3.72 \n(.89)<\/td>\n 3.79 \n(.97)<\/td>\n 3.76 \n(.89)<\/td>\n 3.96 \n(.82)<\/td>\n 3.87 \n(.80)<\/td>\n 3.88 \n(.78)<\/td>\n 3.75 \n(.98)<\/td>\n 3.69 \n(1.08)<\/td>\n 3.96 \n(.86)<\/td>\n 2.08<\/td>\n<\/tr>\n \nTK<\/td>\n 3.94 \n(.80)<\/td>\n 3.52 \n(.68)<\/td>\n 3.85 \n(.65)<\/td>\n 3.85 \n(.51)<\/td>\n 3.82 \n(.62)<\/td>\n 3.84 \n(.55)<\/td>\n 3.82 \n(.47)<\/td>\n 3.93 \n(.58)<\/td>\n 3.87 \n(.60)<\/td>\n 3.84 \n(.49)<\/td>\n 4.07 \n(.48)<\/td>\n <\/td>\n<\/tr>\n \nNotes:<\/em>\u00a0TE = TPACK Efficacy, TK = Technology Knowledge.\u00a0 5-point scale; For TE, 1 =\u00a0Strongly 
Disagree<\/em>; 2 =\u00a0Disagree<\/em>; 3 =\u00a0Neither Agree Nor Disagree<\/em>; 4 =\u00a0Agree<\/em>; 5 =\u00a0Strongly Agree<\/em>. For TK,\u00a01 =\u00a0Highly Ineffectively<\/em>; 2 =\u00a0Somewhat Ineffectively<\/em>; 3 =\u00a0Neutral<\/em>; 4 =\u00a0Somewhat Effectively<\/em>; 5 =\u00a0Highly Effectively.<\/em><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n <\/p>\n
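The analytic sequence reported in this section (a homogeneity-of-variance check, a one-way ANOVA across cohorts, a Tukey HSD post hoc, and a Cohen\u2019s d<\/em> effect size) can be sketched in Python with SciPy. The study\u2019s raw scores are not available, so the data below are simulated: the cohort sizes and means are loose illustrative approximations of Table 1, not the actual dataset.<\/p>\n

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Simulated TPACK Efficacy ratings on a 1-5 scale for 11 cohorts.
# Assumed values: 81 participants per cohort (total N = 891) and a lower
# mean for Cohort 1, loosely mirroring Table 1 -- not the study's data.
means = [1.79] + [4.0] * 10
scores = [np.clip(rng.normal(m, 0.8, 81), 1, 5) for m in means]

# Homogeneity of variance (Levene's test); p > .05 supports the ANOVA.
lev_stat, lev_p = stats.levene(*scores)

# One-way ANOVA across the 11 cohorts.
F, p = stats.f_oneway(*scores)

# Tukey HSD post hoc for pairwise cohort comparisons (SciPy >= 1.8).
tukey = stats.tukey_hsd(*scores)
p_1_vs_11 = tukey.pvalue[0, 10]  # Cohort 1 vs. Cohort 11

# Cohen's d (pooled SD) for the Cohort 11 vs. Cohort 1 contrast,
# analogous to the "11 to 1" effect sizes reported in the tables.
g1, g11 = scores[0], scores[10]
pooled_sd = np.sqrt(((g1.size - 1) * g1.var(ddof=1) +
                     (g11.size - 1) * g11.var(ddof=1)) /
                    (g1.size + g11.size - 2))
d = (g11.mean() - g1.mean()) / pooled_sd
print(F, p, p_1_vs_11, round(d, 2))
```

With a between-cohort gap this large, the omnibus F<\/em>, the Cohort 1 vs. Cohort 11 Tukey comparison, and d<\/em> all come out strongly significant and large, which is the pattern the tables above report.<\/p>\n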
We conducted follow-up procedures with a Tukey HSD post hoc test to compare differences among cohorts. The findings revealed incremental improvement that became significant over multiple cohorts. For example, Technology Integration Frequency for literacy in Cohort 1 was significantly lower than that in all other cohorts (p < .001 for all comparisons). Technology Integration Frequency for literacy in Cohort 2 (M <\/em>= 2.10; SD <\/em>= .64) was significantly lower than that in most subsequent cohorts; for example, Cohort 7 (M <\/em>= 2.44; SD <\/em>= .71; p = <\/em>.04) and Cohort 9 (M <\/em>= 2.63; SD <\/em>= .71; p <\/em>< .001).<\/p>\nFor mathematics, Technology Integration Frequency in Cohort 1 was significantly lower than in Cohort 2 (p <\/em>= .031) and all remaining cohorts (p <\/em>< .001 for all comparisons). Technology Integration Frequency in Cohort 2 was significantly lower than that in Cohort 9 and beyond.\u00a0For science, Technology Integration Frequency in Cohort 1 was significantly lower than in Cohort 3 (p <\/em>= .008), Cohort 6 (p <\/em>= .011), and all subsequent cohorts.<\/p>\nFor social studies, Technology Integration Frequency in Cohort 1 was significantly lower than in Cohort 2 (p <\/em>= .022), Cohort 3 (p <\/em>< .001), and beyond. Technology Integration Frequency in Cohort 2 (M <\/em>= 1.94, SD <\/em>= .64) was significantly lower than that in Cohort 11 (M <\/em>= 2.22, SD <\/em>= .71, p <\/em>= .002). Technology Integration Frequency in Cohort 5 (M<\/em> = 1.90, SD = <\/em>.64) was significantly lower than that in Cohort 10 (M <\/em>= 2.27, SD <\/em>= .72; p <\/em>= .047) and Cohort 11 (M <\/em>= 2.22, SD <\/em>= .71; p <\/em>< .001). Finally, Cohort 8 (M <\/em>= 1.92, SD <\/em>= .69) was significantly lower than Cohort 11 (M <\/em>= 2.22, SD <\/em>= .71; p <\/em>= .001). 
The change in social studies appears to have been more incremental than in other domains.<\/p>\nOverall, the effect sizes (Cohen\u2019s d<\/em>; Cohen, 1969) for the Technology Integration Frequency differences between Cohort 1 and Cohort 11 were large, with values ranging from .90 to 1.72 (see Table 2), indicating that preservice teachers in Cohort 11 integrated technology more frequently in all four subject areas than those at baseline.<\/p>\nTable 2<\/strong> \nMeans and Standard Deviations for Technology Integration Frequency by Subject Area<\/p>\n\n\n\nCohort<\/strong><\/td>\nCohen\u2019s d<\/em><\/strong><\/td>\n<\/tr>\n\nSubject Areas<\/strong><\/td>\n1<\/strong><\/td>\n2<\/strong><\/td>\n3<\/strong><\/td>\n4<\/strong><\/td>\n5<\/strong><\/td>\n6<\/strong><\/td>\n7<\/strong><\/td>\n8<\/strong><\/td>\n9<\/strong><\/td>\n10<\/strong><\/td>\n11<\/strong><\/td>\n3 \nto 1<\/strong><\/td>\n11