{"id":7580,"date":"2017-10-12T00:09:15","date_gmt":"2017-10-12T00:09:15","guid":{"rendered":"https:\/\/citejournal.org\/\/\/"},"modified":"2018-03-05T16:46:50","modified_gmt":"2018-03-05T16:46:50","slug":"incorporating-multiple-technologies-into-teacher-education-a-case-of-developing-preservice-teachers-understandings-in-teaching-statistics-with-technology","status":"publish","type":"post","link":"https:\/\/citejournal.org\/volume-17\/issue-4-17\/mathematics\/incorporating-multiple-technologies-into-teacher-education-a-case-of-developing-preservice-teachers-understandings-in-teaching-statistics-with-technology","title":{"rendered":"Incorporating Multiple Technologies Into Teacher Education: A Case of Developing Preservice Teachers\u2019 Understandings in Teaching Statistics With Technology"},"content":{"rendered":"

\"NTLI-FELLOWS-2011.cdr\"<\/p>\n

A prior version of this paper received the 2015 NTLI Fellowship Award from the Association for Mathematics Teacher Education.

The importance of using technology in the preparation of preservice mathematics teachers (PSTs) has been at the forefront of national conversations in mathematics teaching and teacher preparation for over 15 years (e.g., Association of Mathematics Teacher Educators, 2015; Garofalo, Drier, Harper, & Timmerman, 2000; National Council of Teachers of Mathematics, 2000). Mathematics teacher educators (MTEs) are charged with promoting PSTs' engagement with a variety of technological tools as well as mathematics-specific technologies that deepen understanding of mathematics and students' thinking with technology.

Such engagement may require MTEs to make changes to their teaching goals to provide learning opportunities that better foster the development of PSTs' technological pedagogical content knowledge (also referred to as technology, pedagogy, and content knowledge, or TPACK; Mishra & Koehler, 2006; Niess et al., 2009). Therefore, the purpose of this paper is to (a) present one approach for incorporating technology into a mathematics methods course that brings several types of technology together in one lesson, (b) highlight affordances and limitations of different technological choices, and (c) discuss implications for teacher education.

Specifically, our approach situated MTEs as having to draw upon their own TPACK, where the specialized content was mathematics education, to create opportunities for PSTs to develop their TPACK, where the specialized content was a topic in secondary mathematics. To do so, MTEs have often worked to increase PSTs' TPACK by engaging them in tasks similar to those they will be expected to use in their own classrooms and by having them examine classroom videos and students' work to make sense of students' reasoning about the task (e.g., Didis, Erbas, Cetinkaya, Cakiroglu, & Alacaci, 2016; Wilson, Lee, & Hollebrands, 2011). The task we chose is a data analysis activity we refer to as Mislabeled Variables (Appendix A), one that the instructor had previously taught to sixth-grade students (Lovett & Lee, 2016).

As students engaged in the Mislabeled Variables task, they investigated variable types (i.e., categorical and quantitative) and distributions generated from data gathered from a series of survey questions. Students used data collected from a class survey to make claims about which survey questions produced the data for each variable, where the variables were given letter names (e.g., A and B) rather than descriptive names (e.g., gender or shoe size).

A data analysis task was chosen in response to the need to better prepare PSTs to teach statistics, reflecting the increased emphasis on statistics in the K-12 curriculum in recent years (National Council of Teachers of Mathematics, 2000; National Governors Association Center for Best Practices & Council of Chief State School Officers, 2010). Research has shown that many preservice secondary teachers do not possess the statistical knowledge needed to teach statistics effectively and often are not provided enough learning opportunities related to statistics learning and teaching in their mathematics methods courses (Lovett, 2016). In this context we chose to pose the Mislabeled Variables task to our PSTs.

The lesson was designed and enacted in an undergraduate mathematics education course for middle school and high school mathematics teachers taught by the first author at a large southeastern university. This lesson consisted of two parts: (a) PSTs engaged in the Mislabeled Variables task, and (b) PSTs watched and discussed a video case of sixth-grade students' answers, reasoning, and misconceptions on the same task.

Video cases are a common tool used by MTEs and researchers to help PSTs reflect on their own knowledge and practice, and they allow PSTs opportunities to observe and understand professional practice and students' mathematical reasoning (Grossman et al., 2009; Lampert & Ball, 1998; van Es & Sherin, 2002, 2008). Short clips that highlight important classroom moments are recommended because they can be viewed repeatedly to focus attention on those moments (Borko, Jacobs, Eiteljorg, & Pittman, 2008; LeFevre, 2004).

We were unable to videotape the classroom when the task was enacted with sixth-grade students, so we chose to create a brief animated video to illustrate the classroom scenario, including samples of students' work. Animations allow MTEs and researchers to recreate a real classroom environment without introducing cameras and microphones into the classroom. They are also useful when it is not possible to capture high-quality audio of what teachers and students are saying (Chazan & Herbst, 2012).

Literature Review

Research exploring animations in mathematics teacher education is still in its infancy. The majority of the prior research has examined the use of LessonSketch, an online environment that enables users to create representations of classroom scenarios (Herbst, Aaron, & Chieu, 2013). While conducting professional development with in-service teachers, Chazan and Herbst (2012) found that participants were able to identify with the fictional teacher and that the animation did not inhibit participants' discussion of the instruction depicted in the video. Teachers were also able to project their previous classroom experiences onto the characters in the animation so they could have a discussion about their experiences.

In another study, Herbst, Aaron, and Erickson (2013) asked preservice teachers to rate videos and animations on their genuineness. In addition, the researchers explored preservice teachers' capacity to notice pedagogical and content knowledge features of the animations, asking them to reflect on alternate actions of the teacher while considering their previous experiences. The researchers found no significant differences in ratings in any of the measures except genuineness, concluding that animations could be just as effective as video case examples in teacher education. In light of these findings, we had evidence that an animation of student engagement with the Mislabeled Variables task might provide PSTs with an authentic glimpse of student strategies.

Framing the Task Design

In the design of the lesson, we extended Lee and Hollebrands' (2011) three aspects of teachers' knowledge related to teaching statistics with technology to characterize four types of teachers' knowledge: (a) statistical knowledge (SK), (b) technological statistical knowledge (TSK), (c) pedagogical statistical knowledge (PSK), and (d) technological pedagogical statistical knowledge (TPSK; Lee & Nickell, 2014).

Statistical knowledge is foundational for developing statistical knowledge for teaching and technological pedagogical statistical knowledge (Groth, 2013; Lee & Hollebrands, 2011). To assist in the development of statistical knowledge, PSTs should engage with technology-enabled tasks that allow exploratory data analysis (EDA). To develop TSK, PSTs should engage in tasks with dynamic statistical software that encourage the simultaneous development of statistical ideas and technological skills.

In terms of PSK, particular pedagogical decisions arise when teaching statistics that differ from those in other areas of mathematics, such as (a) planning for group projects and discussions about data, (b) supporting students in making statistical arguments based on appropriate evidence, and (c) considering the contexts used for teaching statistical ideas. Therefore, PSTs should learn to engage students in statistical investigations in a variety of contexts that require students to make decisions and arguments, and they should consider how to respond to different conclusions among groups during discussions (Shaughnessy, 2007).

The ultimate goal in the preparation of PSTs to teach statistics with technology is to develop the specialized subset of knowledge representing TPSK. This type of complex knowledge is likely to develop through extended experiences with technology and consideration of the impact of such tools on the teaching and learning of statistics. MTEs are tasked with designing technology-enabled learning environments that develop aspects of PSTs' TPSK. The technologies chosen will affect which aspects of TPSK PSTs have an opportunity to develop.

Madden (2011) suggested that tasks used with teachers can be provocative in the sense that they can excite and stimulate focused conversations and attention to statistics, context, and technology. Tasks can also be pedagogically provocative since they can stimulate a focus on pedagogical issues within statistics. Within SK, this study focused on engaging PSTs in statistical thinking through explorations of real data using TinkerPlots (Konold & Miller, 2011) and analyses of data to draw conclusions. In this way, the statistical knowledge that PSTs developed may have been interwoven with their technological statistical knowledge (TSK), which focused on using technology to explore and analyze data.

Within PSK, the Mislabeled Variables task focuses on providing PSTs with experiences with group work and supporting arguments with evidence that could be used as a foundation for planning their own lessons. Within TPSK, the goal is to provide PSTs with opportunities to reason about students' learning of statistical ideas with technology.

The Mislabeled Variables Task for Preservice Teachers

We identified learning objectives related to SK, TSK, PSK, and TPSK to guide our planning. The content objective (SK) was to increase PSTs' ability to reason about the context of data and measurement units; to engage with a multivariate data set with a dynamic statistical tool, TinkerPlots; and to make claims about reasonable contexts for different data distributions. In terms of PSK, objectives were to encourage PSTs to justify their reasoning with data-based evidence, to critique the reasoning of others, and to consider how to promote this type of reasoning and critique in students.

The TSK objective was for PSTs to learn how dynamic statistical software can support initial data exploration using dot plots, dynamic linking, and coordination of two or three variables. Lastly, for TPSK the objective was for PSTs to reason about students' statistical understandings and misunderstandings and their approaches to the task while using dynamic statistical software.

Engaging Preservice Teachers in the Task as Learners

It was important to engage PSTs in the Mislabeled Variables task first as learners (to develop SK and TSK), using the task as it was previously taught to sixth-grade students (Lovett & Lee, 2016). The Mislabeled Variables task was adapted from Garfield and Ben-Zvi's (2008) Variables on Backs, a task designed to assist introductory statistics students in developing an understanding that different statistical questions produce different types of variables. The Variables on Backs task was enhanced through the use of TinkerPlots to explore survey questions, types of variables that questions produce, measurement units, and expected data values.

To begin the Mislabeled Variables task, PSTs completed a personal information survey as a Google Form containing 16 questions (e.g., "What time did you go to bed last night?"). Questions were chosen intentionally to produce different data types (e.g., whole numbers, decimals, and time values). Survey responses were gathered online and used to create a data set in TinkerPlots, in which the 16 attributes were labeled A, B, C, and so forth, and the labels were randomized so as not to match the order of the questions from the survey. See Appendix B for the survey questions and the corresponding letter assignments. (The Mislabeled Variables task for teacher educators is also available as a resource in the Teaching Statistics Through Data Investigations MOOC-Ed, friday.institute/tsdi.)
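Although the class data set was constructed directly in TinkerPlots, the short sketch below illustrates the relabeling step described above, assuming the Google Form responses were exported to a CSV file. The Python/pandas code, file names, and random seed are hypothetical illustrations, not the actual workflow used in the lesson.

```python
# Hypothetical sketch: label the 16 survey columns A-P and randomize which
# letter goes with which question, so the letter order does not match the
# survey order. File names and the use of pandas are assumptions.
import random
import string

import pandas as pd

responses = pd.read_csv("survey_responses.csv")   # assumed Google Form export

letters = list(string.ascii_uppercase[: len(responses.columns)])  # A ... P
random.seed(2016)        # fixed seed so the answer key is reproducible
random.shuffle(letters)

# Answer key mapping each letter back to its original survey question.
answer_key = dict(zip(letters, responses.columns))

mislabeled = responses.copy()
mislabeled.columns = letters
mislabeled = mislabeled.sort_index(axis=1)        # present attributes as A, B, C, ...
mislabeled.to_csv("mislabeled_variables.csv", index=False)
```

In this sketch, answer_key plays the role of the instructor's key in Appendix B, recording which survey question produced each lettered attribute.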

In class, PSTs were seated in pairs, each pair sharing a single laptop with access to a practice file and the class survey data in TinkerPlots. The instructor provided a quick tutorial on using TinkerPlots, since PSTs had no prior experience with the software. This tutorial included an introduction to the data cards, an explanation of how TinkerPlots uses colors to represent quantitative and categorical variables, and step-by-step instructions for creating a plot, separating data, and stacking data. An identical tutorial had been provided to the sixth graders. Next, each pair of PSTs was assigned data from two attributes and asked to make a conjecture about the survey questions that most likely generated the data. Attributes were assigned so that each PST pair reasoned about different types of data (e.g., whole numbers, decimals, and time values). Attributes C, F, I, J, O, and P were assigned to at least two groups, since these data were featured in the animation used during the second portion of the lesson.
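For readers unfamiliar with TinkerPlots, the hedged sketch below approximates what separating and stacking a single attribute produces, namely a stacked dot plot of its values. The pandas/matplotlib code, file name, and choice of attribute F are illustrative assumptions only and do not reproduce the software the PSTs and sixth graders actually used.

```python
# Rough analogue of a TinkerPlots "separate and stack" display: a stacked
# dot plot of one mislabeled attribute. File and attribute names are
# assumptions; attribute F is treated as quantitative here.
from collections import Counter

import matplotlib.pyplot as plt
import pandas as pd

data = pd.read_csv("mislabeled_variables.csv")
values = data["F"].dropna().sort_values()

# Stack repeated values vertically, one dot per case.
counts = Counter()
xs, ys = [], []
for v in values:
    counts[v] += 1
    xs.append(v)
    ys.append(counts[v])

plt.scatter(xs, ys)
plt.yticks(range(1, max(ys) + 1))
plt.xlabel("Attribute F (survey question unknown to students)")
plt.title("Stacked dot plot of attribute F")
plt.show()
```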

Following the small group work, the instructor modeled Common Core Mathematical Practice 3, "Construct viable arguments and critique the reasoning of others" (National Governors Association Center for Best Practices & Council of Chief State School Officers, 2010), by engaging PSTs in a class discussion. PSTs presented claims and evidence to their peers about the match between each attribute and the source of its data and critiqued the reasoning of others. The instructor purposefully chose PSTs to present their group's approach, encouraging PSTs to examine specific attributes and demonstrate different ways of reasoning about the task.

Pedagogical Focus of the Lesson

Following their engagement in the task, the PSTs watched a video case of sixth-grade students' answers, reasoning, and misconceptions: specifically, a 3-minute animation created with GoAnimate to depict examples of how sixth-grade students reasoned when they completed the task (see Video 1). This animation provided the instructor with the opportunity to use authentic artifacts to engage PSTs in sense-making activities with student work (as in, e.g., Didis et al., 2016; Wilson et al., 2011).

The instructor showed the animation once to help PSTs understand its content. During a second viewing, the instructor directed PSTs to focus on the claims provided by the sixth graders and how those claims were similar to or different from the claims the PSTs themselves had made. The instructor showed the animation a third time and directed the PSTs to focus on the ways in which students reasoned about the data and engaged with TinkerPlots. The PSTs then compared and contrasted this engagement with their own use of the software.