{"id":6846,"date":"2016-07-06T13:18:15","date_gmt":"2016-07-06T13:18:15","guid":{"rendered":"https:\/\/citejournal.org\/\/\/"},"modified":"2016-12-19T14:38:45","modified_gmt":"2016-12-19T14:38:45","slug":"strategizing-teacher-professional-development-for-classroom-uses-of-geospatial-data-and-tools","status":"publish","type":"post","link":"https:\/\/citejournal.org\/volume-16\/issue-3-16\/science\/strategizing-teacher-professional-development-for-classroom-uses-of-geospatial-data-and-tools","title":{"rendered":"Strategizing Teacher Professional Development for Classroom Uses of Geospatial Data and Tools"},"content":{"rendered":"

Overview<\/h2>\n

Between 2010 and 2014, the project Studying Topography, Orographic Rainfall, and Ecosystems with Geospatial Information Technology (STORE) explored the strategies that stimulate teacher commitment to the project\u2019s driving innovation: having students use geospatial information technology (GIT) to learn about weather, climate, and ecosystems. The GIT in STORE was a combination of freely available place-based geospatial data sets and visualization tools. The goal was to structure the innovation and accompanying teacher professional development strategy so that participating teachers would find it appealing, plan for and enact effective instruction with it, achieve optimal impacts on student learning and engagement, and persist with it enough to carry out two implementations.<\/p>\n

A professional development strategy for bringing about these ends evolved iteratively. The article describes how STORE addressed the challenges of getting teachers to persist with the innovation and to use it skillfully. This work culminated in the positing of a new model\u2014Conceptualization, Iteration, Adoption, and Adaptation (CIAA)\u2014for developing scientific data-centered instructional resources and accompanying teacher professional development.<\/p>\n

CIAA emerged as the pathway to high teacher commitment to the project as codevelopers and users. Applying principles guiding the development of grounded theory (Glaser & Strauss, 1967), the project team developed the model from its analyses of teacher attitudes and behaviors, gathered through meetings and interviews with the teachers, observations of their classes, and studies of the curricula they implemented in those classes.<\/p>\n

Sustaining teacher commitment to a curricular innovation is problematic, and rates of attrition from innovative practices are high (Kubitskey, Johnson, Mawyer, Fishman, & Edelson, 2012). Reasons include lack of time to learn and pursue the innovation, lack of administrator support, lack of incentives, and absence of a professional learning community (DuFour, 2004; Dunne, Nave, & Lewis, 2000; Louis & Marks, 1998; Strahan, 2003). Another factor may be mismatch between the expectations of the project for teacher involvement and the extent to which the particular participating teachers can meet those expectations (Rogers, 1962; Schoenfeld, 2011).<\/p>\n

Commitment to an innovation does not necessarily bring about high-quality implementation (Darling-Hammond et al., 2009). Reasons cited in the literature include lack of teacher skill in enacting high-quality instruction, especially when the instruction calls for student-centered discussion, argumentation, and deep reasoning.<\/p>\n

These challenges may arise even when the teacher is a good planner and developer of high-quality lesson plans and assessments (Grossman, Hammerness, & McDonald, 2009). Teachers may lack sufficient content knowledge or background in the technologies needed for successful implementation. In the science classroom in particular, these inadequacies may be manifest in overly didactic and superficial treatment of content and, when there is hands-on technology involved, inordinate attention to procedure at the expense of scientific analysis and communication (Penuel et al., 2006). Hence, two types of barriers confront teacher success in implementing innovations: barriers to their acquiring the skills they need for successful planning and implementation and barriers to their persistence. This article describes how the STORE project addressed these barriers and what the results were. The results are a combination of attitudes expressed on the teacher survey, the record of products that the teachers developed, and the diversity of implementations that occurred.<\/p>\n

Background<\/h2>\n

The STORE project team compiled geospatial data sets from publicly available scientific portals and made them usable in ArcGIS Explorer Desktop and Google Earth. These GIT applications are freely available, and anyone can download them from the Web onto their computers. Free accessibility and downloading capability were the main reasons behind the project decision to use those two applications to host the data, even though they are less sophisticated than some other GIT applications, such as ArcMap, that cost money or require schools to obtain grants to purchase.<\/p>\n

To guide classroom use of the GIT applications and data sets, the project research team, with the help of six design partner teachers, developed during the first project year the first iterations of six thematically connected, hands-on exemplar lessons that provide students with the opportunity to see focal enduring understandings played out in \u201cstudy area\u201d regions of mid-California and the western part of New York State. These enduring understandings include orographic impacts on weather and climate, sustainment of plant species and ecosystems, and climate change. In the process, students were to learn about the nature of geospatial data, including how the data are collected and visually rendered in layers on maps.<\/p>\n

The lessons make use of parallel data sets about the two study areas. The data provide recent multidecadal averages of temperature, precipitation, and predominating land cover, as well as climate change model-based projections for temperature, precipitation, and land cover in 2050 and 2099. The model is the A2 scenario from the Intergovernmental Panel on Climate Change (Nakicenovic & Swart, 2000). This scenario assumes continued global growth of carbon dioxide levels in the atmosphere corresponding to the continuation of current levels of population growth.<\/p>\n

The STORE data tell a relatively clear narrative about orographic rainfall and how it influences the climate and predominating land cover. The California orographic rainfall is influenced by winter weather systems coming off of the Pacific Ocean, moving east over the Coast Range, across the Central Valley, and up the slopes of the Sierra Nevada Mountains. Occasional summer storms coming north from the Gulf of California and northwest from the Gulf of Mexico have a slightly different path. Yet, as with the winter storms, they bring strong orographic rainfall effects.<\/p>\n

The Western New York orographic rainfall is more subtle, influenced by less dramatic topographic variances. Also, Lake Ontario and Lake Erie exert a strong influence on the origins, magnitudes, and directions of storms. Hence, these data sets especially reinforce learnings about the water cycle, the characteristics of populations and ecosystems, heredity, differences between weather and climate, principles of meteorology, and the effects of projected temperature increases on precipitation and land cover.<\/p>\n

Lesson 1 introduces basic meteorological concepts about the relationship between weather systems and topography. Lessons 2 and 3 focus on recent climatological and land cover data (30-year averages). Lesson 4 brings in Excel as the technology for\u00a0graphing relationships between temperatures at weather stations in the study areas and elevation. Lesson 4 reinforces learnings about temperature lapse rates, a concept first introduced in Lesson 1. The last two lessons focus on model-based climate change projections in relation to the possible fates of different regional species of vegetation. Answer keys are provided for the various constructed response questions in each lesson.<\/p>\n
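The temperature lapse rate that Lesson 4 has students graph in Excel can also be estimated with a simple least-squares fit of station temperature against elevation. The following sketch is illustrative only, not STORE project code, and the five station values are hypothetical:

```python
# Illustrative sketch (not STORE code): estimating a temperature lapse rate
# from weather-station data, mirroring the temperature-vs-elevation graphing
# students do in Excel in Lesson 4. All station values below are hypothetical.
elevations_m = [20, 350, 900, 1600, 2400]   # station elevations (meters)
temps_c = [16.5, 14.8, 11.2, 7.0, 2.1]      # mean annual temperatures (deg C)

n = len(elevations_m)
mean_x = sum(elevations_m) / n
mean_y = sum(temps_c) / n

# Ordinary least-squares slope: deg C change per meter of elevation gain
slope = sum((x - mean_x) * (y - mean_y)
            for x, y in zip(elevations_m, temps_c)) \
        / sum((x - mean_x) ** 2 for x in elevations_m)

lapse_rate_per_km = slope * 1000  # deg C per kilometer
print(f"Estimated lapse rate: {lapse_rate_per_km:.1f} deg C per km")
```

For these hypothetical stations the fitted slope is negative, as expected: temperature falls with elevation, which is the pattern Lesson 4 asks students to discover in the study-area data.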

Teachers have in addition access to geospatial data files that display some storm systems that moved over California and New York between the winter of 2011 and summer of 2012. These storm data sets are especially useful for exploring the similarities and differences between weather and climate. Students can study relationships between the storm behavior, the topography of the study areas, and the extent to which the storms mimic the geospatial distributions of the 30-year climatology.<\/p>\n

Figure 1 shows an image of how some of the STORE data layers appear in Google Earth. In this image, precipitation accumulations from one particular day in a particular storm that passed through the California study area appear on top of a layer showing 30-year precipitation averages. Below the base map is a transect that crosses weather stations from which the precipitation data were collected for the 30-year averages. This image is rich in conceptual learning. It exemplifies relationships between precipitation and elevation and lends itself to comparing rainfall accumulations from the storm to determine the extent to which the weather on that day was consistent with the different topographically influenced microclimates in the study area. All the data for producing images such as this one are available from the project website, http:\/\/store.sri.com<\/a>.<\/p>\n
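One way to make the comparison described above concrete is to ask, station by station, whether a single storm's accumulations follow the same topographic pattern as the 30-year averages. The sketch below is illustrative only; the station names and precipitation values are hypothetical, not STORE data:

```python
# Illustrative sketch (not STORE code): comparing one storm's precipitation
# accumulation at each weather station to that station's 30-year average.
# Station names and values are hypothetical.
climatology_mm = {"valley": 300, "foothill": 650, "crest": 1200}  # 30-yr annual avg
storm_mm = {"valley": 8, "foothill": 22, "crest": 45}             # one-day storm totals

# Fraction of the annual average delivered by this one storm, per station
fractions = {s: storm_mm[s] / climatology_mm[s] for s in climatology_mm}

# If the storm mimics the climatology, stations should rank in the same
# order by storm accumulation as by 30-year average.
by_storm = sorted(storm_mm, key=storm_mm.get)
by_climo = sorted(climatology_mm, key=climatology_mm.get)
print(fractions, "same ranking:", by_storm == by_climo)
```

With these hypothetical numbers the rank orders match, suggesting a storm consistent with the topographically influenced microclimates; a mismatch would prompt the kind of weather-versus-climate discussion the storm data sets were designed to support.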

\"Figure
Figure 1.<\/strong> STORE data image showing storm precipitation accumulations against climatology averages and elevation profile.<\/em><\/figcaption><\/figure>\n


After the development of the first iterations during the first project year, the six exemplar\u00a0lessons were refined during subsequent years as needed after being used by students. During the second and third years of the project, the intention was to recruit six other teachers to implement and receive training for doing so. Design partner teachers were also asked to implement in Years 2 and 3, though one decided, instead, to start implementing in Year 1. Hence, the lessons, GIT, and data files were subject to multiple years of implementation and iteration. The 12 teachers were each asked to implement STORE twice in their classrooms. Usually, they implemented over the course of two academic years.<\/p>\n

Method<\/h2>\n

Participants<\/h3>\n

The project team intended to work with 12 interested teachers, which the team would recruit through informal networking. Appendix A<\/a> identifies the teachers\u2019 backgrounds, school (https:\/\/nces.ed.gov\/ccd\/<\/a>) and community (http:\/\/www.city-data.com<\/a>) characteristics, whether they came into the project as design partners or not, their prior familiarity with GIT, how they were recruited, and during which project years they implemented STORE lessons.<\/p>\n

We intended that six design partner teachers, who were recruited prior to the writing of the grant proposal for the project, would help the project team design the curricula and tools during the first project year. Half were recruited from locations near project researchers in Menlo Park, CA, and half near Geneva, NY. These six were supposed to begin implementation in Year 2 and then carry out another round of implementation in Year 3. One decided to start implementing in the latter half of Year 1, however. After Year 1, five of these teachers recruited colleagues from their schools, and these pairs (CB and TO, LS and JW, JR and LC, EL and KM, GW and JT) maintained collaborations during the life of the project.<\/p>\n

Of all the teachers, five had prior experience with GIT, owing to courses they had already been teaching or professional development they had already received, and seven had no prior experience with it. Of the six design partner teachers, the three from New York had already had experience with the technologies, but the three from California had not. This difference was unintended, however, and simply the result of who happened to show interest.<\/p>\n

Data were gathered in surveys and in semistructured one-on-one interviews to assess teacher attitudes, including their perceptions of challenges and the worth of the project. The interview protocol and survey instrument were developed by the research team. The team adapted items from instruments used on some prior research projects that also examined impacts of curricular innovations on teacher attitudes and classroom practices. With one exception (JR and LC), additional interviews were conducted with the collegial pairs who were teaching at the same schools. The one-on-one interview process was triangulated by the classroom observations (as in Merriam, 1995).<\/p>\n

In cases where the teacher was implementing with different classes in multiple school periods, the observer would observe the teacher implementing the same lesson each period during the observation day. The observer audiotaped the classes and took photos of key events (such as a screenshot of the failure for a map to be redrawn in Google Earth following a student interaction requiring the redrawing, or a particular set of interactions between teacher and students on a smartboard).\u00a0Observation of the teacher multiple times was a way to minimize threats to reliability of the observation data. The taking of photos was a way to validate the observation notes by providing another medium for communicating what happened in the classroom.<\/p>\n

During small group time at computers, the observer walked around the room watching students interact and occasionally asking them what they were doing and why. The observer also acted as teacher aide, answering student content and technology questions if a hand was raised and the teacher was busy with another student. The emphasis in the observations was to detect key student and teacher behaviors and artifacts, not rigorous quantification of numbers of times certain behaviors occurred or numbers of students who exhibited them. This approach was in keeping with grounded theory (Glaser & Strauss, 1967)\u2014to detect types of behaviors with less attention to exact frequencies. Yet, there was enough monitoring of frequencies in the broad sense to enable the application of a scoring rubric.<\/p>\n

Each one-on-one interview followed the observation on the same day or later the same week. Interviews with pairs would occur at convenient times for each teacher in the pair, not necessarily following an observation. The teachers\u2019 responses to what the interviewer noted in his observations constituted a validity check on the observations because the teachers had an opportunity to contest a particular observer perception or conclusion.<\/p>\n

The interviewer asked teachers to reflect back on how the lessons went and what they might do differently next time. The interviewer, equipped with his observation notes, asked them to comment on lesson aspects that he thought represented broad characteristics of the interactions between the teacher and the students and the interviewer\u2019s observations of the extent to which the students appeared engaged with the lessons. Then, teachers responded with perspectives and critiques of what they did, why they did what they did, and what they might do differently the next time.\u00a0This sharing of notes with the teachers and their responses to those notes constituted validation of the observations.<\/p>\n

Individual teacher summaries were prepared in audio or written form from the interview protocols and observations. The summaries were organized by coding categories that emerged as most salient for capturing the key characteristics of what occurred during the implementation: technology (e.g., whether to use the ArcGIS Explorer Desktop or Google Earth software applications, how to orient students to the software, and how to troubleshoot problems that could arise with the software or hardware), instructional practice (e.g., how to tie student investigation of the data to prior knowledge, whether to modify certain tasks in the interest of time, checking for understanding, and stimulating discussion), and lesson content (e.g., tying tasks to previously introduced content, the degree of open-endedness designed into the lessons, tying lesson content to the vocabulary of scientific inquiry, and including student field observation tasks and analysis of storm data). Then, in follow-up discussions, the teachers looked at results from student classroom products and assessments as stimuli for additional reflections.<\/p>\n

Ten teachers completed a survey after STORE implementation in their classrooms. The teacher surveys looked at the challenges teachers faced with the project, using a series of 4-point Likert scale questions (choices were totally agree, agree more than disagree, disagree more than agree, <\/em>and totally disagree<\/em>). Teachers who believed that an item was not applicable to them could instead select \u201cother\u201d and explain why in a comment field. Several other items used different selection scales that involved the teachers rating themselves on their perseverance with STORE.\u00a0For validation of the survey instrument, the teachers were given the opportunity to provide feedback and additional information in comments if the questions did not capture the characteristics of their attitudes and practices that the survey was intended to capture.<\/p>\n

A rubric was developed and used to score the characteristics of teacher instruction from observations of STORE implementations in the teachers\u2019 classrooms (see Appendix B<\/a>). The rubric was designed to capture by scale the key implementation characteristics of what was observed and corroborated in the postobservation interviews. The rubric was formulated after the first year\u2019s implementations as a tool to capture the domains of behaviors and practices observed during those implementations and was then used to rate the teachers during their second-year implementations.<\/p>\n

Data were also gathered on student engagement and learning outcomes through pre- and post-assessments. The student outcomes are not the subject of this article, however.<\/p>\n

Confronting Barriers to Acquiring the Skills Teachers Needed for Successful Planning and Enactment<\/em><\/h2>\n

Strategies<\/h3>\n

Providing supports that would meet the needs of different teachers with differing levels of geospatial technology literacy and science knowledge was a challenge. A lesson learned by the STORE team at the first meeting with teachers, 2 months into the project, was that such supports were needed to deepen the teachers\u2019 comfort with making good instructional uses of the GIT and data.<\/p>\n

All but four of the teachers started their involvement in STORE as novices with GIT. To meet their technological knowledge needs, the STORE staff, starting in the second month of the first year and ending in the 11<sup>th<\/sup> month of the third year, developed and posted supporting teacher resources, for example,<\/p>\n

    \n
  1. Decision support tutorial comparing the advantages and disadvantages of the two GIT applications.<\/li>\n
  2. Slides, presentations, and videos on YouTube and on the project website that described for both students and teachers how to use the STORE data in Google Earth, the GIT application that all the teachers chose to use (although two, for their technology classes, chose to also use ArcGIS Explorer).<\/li>\n<\/ol>\n

    In addition, both the novice and more experienced GIT-using teachers needed to build their knowledge about the data: how the data sets were collected, who collected them, and how they were visually represented on the GIT base map. They also needed to understand how the STORE technical team, in order to provide rich layers of projections of future outcomes in the face of climate change, took the climatology data and combined them with model data to make projections beyond the ones provided to the public by the government source agencies.<\/p>\n

    Three months into the project, the STORE team began developing and posting, first on a wiki, later on its website (http:\/\/store.sri.com<\/a>), documentation that provided this background information, including metadata descriptions of each layer, how the data were derived, and how the STORE team processed the data into map layers.<\/p>\n

    The assumption driving the decision to document was that with the documentation the teachers would deepen their understanding. That is, they would become better able to help their students understand how scientists and technicians use what they have in data and models to draw conclusions, knowing that the conclusions they draw are not necessarily definitive because the data and models are predicated on assumptions subject to critical scrutiny.<\/p>\n

    The project was also challenged to come up with appropriate resources for building the teachers\u2019 content knowledge about the scientific phenomena behind the data. Though observations, interviews, and group discussions that occurred in each project year revealed that their content knowledge was stronger than their knowledge of the technology and data, the project research team still treated content knowledge as a need and made content-oriented background information available.<\/p>\n

    Hence, overview documents from the National Climatic Data Center summarizing the climate characteristics of the participating teachers\u2019 and students\u2019 states were provided, as well as an answer key to the questions on the six exemplar lessons and adapted versions of the lessons that the teachers were creating and implementing, which showed how the teachers could be resources for each other. For example, an Advanced Placement biology teacher embedded in her adapted lessons content concerning plant adaptability to rapid environmental change, a topic that some of the other teachers knew less about but could learn about by studying this teacher\u2019s embellishments on that subject.<\/p>\n

    Research supports the value of making curricular materials educative as a vehicle for more effective teacher professional development (Bodzin, Anastasio, & Kulo, 2014). The educative components were more implicit than explicit. For example, the lessons were not embedded with annotations telling teachers why specific student tasks appeared in the lesson and what the teacher should be paying attention to when implementing those lessons. Rather, they were educative because they exposed the teachers to examples of scientific understandings that can be drawn from the data and the types of student tasks that can be posed to build these understandings.<\/p>\n

    We assumed at first that the design research partnership would produce a curriculum that all the teachers would use, with perhaps minor modifications that they would make individually. A pivotal moment occurred during the second meeting of the design partner teachers. In the fourth month of the first year, we realized that this initial strategy would not work and that flexibility was needed. Two of the teachers were conversing over the six lessons. One teacher who taught biology argued that she would not want to teach a particular lesson because it focused too much on student analysis of the orographic rainfall data, whereas another teacher who taught environmental science liked those parts. The biology teacher had nothing against the meteorology parts and only objected because meteorology was not a topic in her course. She, instead, wanted to expand the population and ecosystem components by tying them to what her biology students were learning about evolution of plant species.<\/p>\n

    Principal investigator and paper co-author Dr. Daniel R. Zalles, who was leading this session, responded by telling these teachers that their inability to reach consensus was not a problem and that they could do what they wanted with the GIT and data as long as they justified why. Hence, the teachers were allowed to use the six exemplar lessons as is, adapt them, or make up their own. Furthermore, they could decide how much class time to devote to the data and lessons. Once this policy of flexibility was articulated, these teachers became more productive as design partners. They saw themselves as partners in the design of exemplar lessons they did not have to implement if the exemplars were not a good fit for their current courses.<\/p>\n

    The teachers who were not design partners and who joined the project in Year 2 were told immediately that they could use the six lessons as is, adapt them, or make up their own. Furthermore, they could decide how much class time to devote to the data and lessons. This action had the effect of creating a community of practice. As part of their training, teachers went through the exemplar\u00a0lessons just like their students would, made judgments about their appropriateness, and then studied the adaptations that the other teachers made for the purpose of deciding on their own adaptations. For design partner teachers, this adapting began to be encouraged during their third group session in the fourth month of the project. For nondesign partner teachers, it was encouraged as soon as they began participating during Year 2.<\/p>\n

    This STORE strategy of developing a foundational curriculum of six exemplar lessons as a vehicle for building teachers\u2019 capacities to then develop or adapt innovative materials themselves resembles a teacher professional development study that looked at student outcomes of a science education innovation in three comparison groups: (a) teachers implementing a predeveloped curriculum with adaptations as needed, (b) teachers being taught how to carry out their own instructional designs and then developing their own curricula, and (c) a hybrid condition in which they were expected to do both. The results showed that attention to helping the teachers develop their own capacities for instructional design was the decisive ingredient, due to its high correlation with student learning outcomes on common assessments (DeBarger, Choppin, Beauvineau, & Moorthy, 2014; Penuel, Gallagher, & Moorthy, 2011).<\/p>\n

    Other flexible components of STORE included choice of GIT application and choice of data. Teachers could choose which application (Google Earth or ArcGIS Explorer) and which, if any, tools in those applications they would use beyond the viewing of data on map layers. Examples of tools were path making, measuring distances, generating point layers for locations such as the school, and generating elevation profiles. The teachers could also choose which phenomena (temperature, precipitation, or land cover) to focus on and which data to use about those phenomena.<\/p>\n

    Choices of data included whether they wanted to look at both California and New York State or only\u00a0one state, whether to look at climate change projections for 2050 or 2099, which type(s) of vegetation\/land cover data, and which type(s) of temperature data (e.g., average annual, average highest monthly, or average highest daily). The project assumption behind providing this flexibility was that it would ensure that greater numbers of teachers, teaching many different science courses, would find value in the essential STORE offering: the GIT and the data. Furthermore, by being presented with such a rich set of choices, they would have the opportunity to build their instructional decision making skills.<\/p>\n

    To enhance the teachers\u2019 ability to carry out deliberative decision-making about what precisely they would implement and why, it was requested from the outset that they precede implementations by completing content representation (CoRe) forms. On these forms, they would need to explain what they intended to teach, why, what student learning challenges they anticipated, and what they expected to do to meet the challenges (Mulhall, Berry, & Loughran, 2006). A CoRe is structured around questions related to the main science content ideas associated with a specific topic, teaching procedures and purposes, and knowledge about students\u2019 thinking. The concepts and topics are documented on a table\u2014one learning objective\/big idea per column.<\/p>\n

    This exercise in completing CoRe forms was useful for six of the 12 (50%) teachers. The following, for example, are misconceptions (1 and 2) and readiness characteristics (3) that teacher JW expected to encounter among his 11<sup>th<\/sup>-grade Earth Science students attending his rural high school:<\/p>\n

      \n
    1. Climate is the same as daily weather.<\/li>\n
    2. All parts of California experience the same weather, both temperature and rainfall.<\/li>\n
    3. Students are not familiar with the program, nor do they utilize computers regularly at our school (except for word processing or Internet research for papers or projects).<\/li>\n<\/ol>\n

      At the end of each classroom implementation, between Years 1 and 4, the teachers were asked to explain what they learned and what they might do differently next time. These reflections were captured in interviews, in changes to their CoRe documents, and to a lesser extent in emails and written memos. In addition, they were asked postimplementation to complete reflective surveys and to ask their students to do the same. The student surveys prompted feedback about student satisfaction with the STORE implementation, what (if anything) they felt they learned from their STORE lessons, and what (if anything) they might want to learn about those topics in the future. Teachers received stipends each year after they completed classroom implementations.<\/p>\n

      Results<\/h2>\n

      This section describes two types of results of this strategy. First, results concerning instructional planning are described, followed by results concerning classroom implementation. Concerning instructional planning, the flexibility afforded in defining what constituted STORE implementation yielded a wider variety of implementations than expected. The project team expected that STORE would be deemed appropriate only for high school students, but informal referrals among colleagues also created demand among middle school teachers and community college instructors. The result was a blooming of diverse instructional strategies and curricula for diverse students:<\/p>\n