http:\/\/store.sri.com<\/a>), documentation that provided this background information, including metadata descriptions of each layer, how the data were derived, and how the STORE team processed the data into map layers.<\/p>\nThe assumption driving the decision to document was that the documentation would deepen the teachers' understanding. That is, they would become better able to help their students understand how scientists and technicians use the data and models they have to draw conclusions, knowing that those conclusions are not necessarily definitive because the data and models are predicated on assumptions subject to critical scrutiny.<\/p>\n
The project was also challenged to identify appropriate resources for building the teachers' content knowledge about the scientific phenomena behind the data. Though observations, interviews, and group discussions in each project year revealed that the teachers' content knowledge was stronger than their knowledge of the technology and data, the project research team still treated content knowledge as a need and made content-oriented background information available.<\/p>\n
Hence, the project provided overview documents from the National Climatic Data Center summarizing the climate characteristics of the participating teachers' and students' states, an answer key to the questions in the six exemplar lessons, and adapted versions of the lessons that the teachers were creating and implementing, which showed how the teachers could serve as resources for one another. For example, an Advanced Placement Biology teacher embedded in her adapted lessons content concerning plant adaptability to rapid environmental change, a topic that some of the other teachers knew less about but could learn by studying her embellishments on that subject.<\/p>\n
Research supports the value of making curricular materials educative as a vehicle for more effective teacher professional development (Bodzin, Anastasio, & Kulo, 2014). The educative components were more implicit than explicit. For example, the lessons were not embedded with annotations telling teachers why specific student tasks appeared in the lesson and what the teacher should be paying attention to when implementing those lessons. Rather, they were educative because they exposed the teachers to examples of scientific understandings that can be drawn from the data and the types of student tasks that can be posed to build these understandings.<\/p>\n
We assumed at first that the design research partnership would produce a curriculum that all the teachers would use, with perhaps minor modifications made individually. A pivotal moment occurred during the second meeting of the design partner teachers, in the fourth month of the first year, when we realized that this initial strategy would not work and that flexibility was needed. Two of the teachers were conversing about the six lessons. One, who taught biology, argued that she would not want to teach a particular lesson because it focused too heavily on student analysis of the orographic rainfall data, whereas another, who taught environmental science, liked those parts. The biology teacher had nothing against the meteorology parts; she objected only because meteorology was not a topic in her course. Instead, she wanted to expand the population and ecosystem components by tying them to what her biology students were learning about the evolution of plant species.<\/p>\n
Principal investigator and paper co-author Dr. Daniel R. Zalles, who was leading this session, responded by telling these teachers that their inability to reach consensus was not a problem and that they could do what they wanted with the GIT and data as long as they justified why. Hence, the teachers were allowed to use the six exemplar lessons as is, adapt them, or make up their own. Furthermore, they could decide how much class time to devote to the data and lessons. Once this policy of flexibility was articulated, the teachers became more productive as design partners. They saw themselves as partners in the design of exemplar lessons that they did not have to implement if the exemplars were not a good fit for their current courses.<\/p>\n
The teachers who were not design partners and who joined the project in Year 2 were told immediately that they could use the six lessons as is, adapt them, or make up their own. Furthermore, they could decide how much class time to devote to the data and lessons. This action had the effect of creating a community of practice. As part of their training, teachers went through the exemplar lessons just as their students would, made judgments about their appropriateness, and then studied the adaptations that the other teachers had made in order to decide on their own adaptations. For design partner teachers, this adapting was encouraged beginning with their third group session in the fourth month of the project. For nondesign partner teachers, it was encouraged as soon as they began participating in Year 2.<\/p>\n
This STORE strategy of developing a foundational curriculum of six exemplar lessons as a vehicle for building teachers' capacities to then develop or adapt innovative materials themselves resembles a teacher professional development study that compared student outcomes of a science education innovation across three groups: (a) teachers implementing a predeveloped curriculum with adaptations as needed, (b) teachers being taught how to carry out their own instructional designs and then developing their own curricula, and (c) a hybrid condition in which they were expected to do both. The results showed that attention to helping the teachers develop their own instructional design capacities was the most effective ingredient, given its high correlation with student learning outcomes on common assessments (DeBarger, Choppin, Beauvineau, & Moorthy, 2014; Penuel, Gallagher, & Moorthy, 2011).<\/p>\n
Other flexible components of STORE included choice of GIT application and choice of data. Teachers could choose which application (Google Earth or ArcGIS Explorer) and which, if any, tools in those applications they would use beyond the viewing of data on map layers. Examples of tools were path making, measuring distances, generating point layers for locations such as the school, and generating elevation profiles. The teachers could also choose which phenomena (temperature, precipitation, or land cover) to focus on and which data to use about those phenomena.<\/p>\n
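The point layers and paths these tools produce are, under the hood, ordinary KML. As a rough illustration (not part of the STORE materials; the function name and all coordinates are hypothetical), the following sketch builds a minimal KML document, with a point layer for a school and a transect path, of the kind Google Earth or ArcGIS Explorer could open:

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"


def build_kml(school_lon, school_lat, path_coords):
    """Build a minimal KML document with one point placemark (the school)
    and one path placemark (a transect), as teachers generated in the GIT."""
    ET.register_namespace("", KML_NS)  # serialize without a namespace prefix
    kml = ET.Element(f"{{{KML_NS}}}kml")
    doc = ET.SubElement(kml, f"{{{KML_NS}}}Document")

    # Point layer marking the school location
    pm = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
    ET.SubElement(pm, f"{{{KML_NS}}}name").text = "School"
    point = ET.SubElement(pm, f"{{{KML_NS}}}Point")
    ET.SubElement(point, f"{{{KML_NS}}}coordinates").text = f"{school_lon},{school_lat},0"

    # Path, e.g., a student-drawn transect between two weather stations
    path = ET.SubElement(doc, f"{{{KML_NS}}}Placemark")
    ET.SubElement(path, f"{{{KML_NS}}}name").text = "Transect"
    line = ET.SubElement(path, f"{{{KML_NS}}}LineString")
    ET.SubElement(line, f"{{{KML_NS}}}coordinates").text = " ".join(
        f"{lon},{lat},0" for lon, lat in path_coords
    )
    return ET.tostring(kml, encoding="unicode")


# Hypothetical coordinates loosely in the California study area
kml_text = build_kml(-120.0, 38.5, [(-122.0, 37.0), (-119.5, 38.0)])
```

Saving the returned string to a `.kml` file makes it loadable as a layer alongside the project's map data.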
Choices of data included whether to look at both California and New York State or only one state, whether to look at climate change projections for 2050 or 2099, which type(s) of vegetation\/land cover data, and which type(s) of temperature data (e.g., average annual, average highest monthly, or average highest daily). The project assumption behind providing this flexibility was that it would ensure that greater numbers of teachers, teaching many different science courses, would find value in the essential STORE offering: the GIT and the data. Furthermore, by being presented with such a rich set of choices, they would have the opportunity to build their instructional decision-making skills.<\/p>\n
To enhance the teachers' ability to deliberate about what precisely they would implement and why, they were asked from the outset to complete content representation (CoRe) forms before each implementation. On these forms, they explained what they intended to teach, why, what student learning challenges they anticipated, and what they expected to do to meet those challenges (Mulhall, Berry, & Loughran, 2006). A CoRe is structured around questions related to the main science content ideas associated with a specific topic, teaching procedures and purposes, and knowledge about students' thinking. The concepts and topics are documented in a table\u2014one learning objective\/big idea per column.<\/p>\n
This exercise in completing CoRe forms was useful for six of the 12 (50%) teachers. The following, for example, are misconceptions (1 and 2) and a readiness characteristic (3) that teacher JW expected to encounter among his 11th<\/sup>-grade Earth Science students attending his rural high school:<\/p>\n\n- Climate is the same as daily weather.<\/li>\n
- All parts of California experience the same weather, both temperature and rainfall.<\/li>\n
- Students are not familiar with the program, nor do they utilize computers regularly at our school (except for word processing or Internet research for papers or projects).<\/li>\n<\/ol>\n
At the end of each classroom implementation, between Years 1 and 4, the teachers were asked to explain what they had learned and what they might do differently next time. These reflections were captured in interviews, in changes to their CoRe documents, and, to a lesser extent, in emails and written memos. In addition, they were asked postimplementation to complete reflective surveys and to have their students do the same. The student surveys prompted feedback about student satisfaction with the STORE implementation, what (if anything) they felt they learned from their STORE lessons, and what (if anything) they might want to learn about those topics in the future. Teachers received stipends each year after they completed classroom implementations.<\/p>\n
Results<\/h2>\n
This section describes two types of results of this strategy: results concerning instructional planning, followed by results concerning classroom implementation. Concerning instructional planning, the flexibility afforded in defining what constituted the essence of a STORE implementation yielded a wider variety of implementations than expected. The project team expected that STORE would be deemed appropriate only for high school students, but informal referrals among colleagues also created demand among middle school teachers and community college instructors. The result was a proliferation of diverse instructional strategies and curricula for diverse students:<\/p>\n
\n- Community settings: 4 urban, 7 rural<\/li>\n
- Schools: 7 regular public, 2 private, 1 charter<\/li>\n
- Subjects: 2 Integrated Math-Science, 1 General Science, 1 Biology, 4 Earth Science, 3 Environmental Science, 1 Physics, 2 Technology and Science<\/li>\n
- Grades: 6, 7, 9, 10, 11, 12, community college<\/li>\n
- Student levels: Special needs, nonspecial need but lower end achievers, regular heterogeneous mix, Advanced Placement<\/li>\n
- Primary student ethnicities: Asian, Hispanic, Anglo<\/li>\n<\/ul>\n
Seven teachers developed some of their own lessons, though they also made some use of the existing exemplar lessons. JT, for his community college Geospatial Technology class, developed a lesson that required students to produce an environmental impact report about proposed new construction near the campus. Among other things, his students assessed what drainage issues developers would need to consider. In this way, he introduced his students to the power of GIS.<\/p>\n
In partnership, CB and TO developed lessons about the data from New York State. They also had students compare the New York and California data and added more open-ended inquiry by having students draw their own transects and analyze the data along them, instead of using only the predeveloped transects in the master data sets that connect weather stations. GW, for his community college Earth Science class, developed lessons aimed at building his students' skills in drawing contour lines on maps so as to better understand how to analyze the STORE data, which are also contoured.<\/p>\n
JR, for his middle school course, developed lessons that engaged students in collecting their own predominating land cover data in the forest around their school, in order to better understand how scientists from the United States Geological Survey collected the predominating land cover data in the STORE master data sets. EL, for his Advanced Placement Environmental Science class, developed lessons around the storm data that the project staff had added to the master data sets but had not built lessons around.<\/p>\n
LC incorporated STORE lessons into a whole new high school science course that she submitted to the University of California for accreditation. The STORE lessons fit into the section of her course about Industry, Technology, and Politics.<\/p>\n
Major modifications that teachers made to the exemplar\u00a0lessons were classified as such if they took students in new directions; for example,<\/p>\n
\n- Framing the data analysis tasks from the exemplar\u00a0lessons in terms of hypothesis generating and testing, or adding new data to the inquiry, such as plate tectonics data (JT did this for his community college Geospatial Technology class).<\/li>\n
- Expanding the scope of the advanced lessons about climate change to bring in more attention to the characteristics of plant species impacting the prospects of their survivability (KM did this for her Advanced Placement Biology class).<\/li>\n
- Developing project-based lessons around endangered animal species, using the STORE climate model-based projections about temperature and precipitation as the basis for student research into whether a particular California animal species is likely to survive (LC did this for her Advanced Placement Biology class).<\/li>\n
- Taking the mathematics content about air pressure, dew point, relative humidity, and temperature lapse rate in the first exemplar STORE lesson (Basic Lesson 1), which was written for high school students, and adapting it for middle school students (JR did this in an integrated math-science course for seventh graders).<\/li>\n<\/ul>\n
Minor modifications included changes in phrasing, display, or directions. Teachers\u2019 final curricular products and CoRe\u00a0forms are posted on the project website.<\/p>\n
In the nomenclature of Rogers's (1962) taxonomy of innovation adopters, the STORE teachers, all self-selected, could be characterized as a combination of innovators and early adopters. Innovators are those who like to develop their own instructional materials, whereas early adopters are more disposed to using predeveloped materials.<\/p>\n
In one sense, all the STORE teachers were innovators with the primary STORE resources: the GIT and data. All shared this commitment to using these primary resources, and 10 of the 12 (83%) persisted over 2 years of implementation. Yet some were more like early adopters in how they decided which GIT data-centered lessons to use, some choosing more than others to change the lessons' contents. These outcomes suggest that the distinction between innovators and early adopters may need to be refined to become more sensitive to new paradigms of technology-based instructional innovation that are tool or data centered rather than curriculum centered.<\/p>\n
An element that factored into teachers' curricular decisions was the time available for STORE in their course schedules. For example, two colleagues at the same high school teaching Advanced Placement (AP) courses made different decisions about curricula. The AP Biology teacher decided she had little time in her crowded curriculum to spend implementing STORE lessons, so she took only 2 days. However, her colleague teaching AP Environmental Science was accountable for a less crowded curriculum and, hence, had time to implement a whole week's worth of lessons and activities around STORE.<\/p>\n
Concerning classroom implementation, ratings were gathered through classroom observations of 11 of the 12 implementing teachers during their final implementation year, using the observation rubric. These final-year ratings provide only a snapshot of teacher practices on the small number of days on which teachers were observed, so it would be wrong to overgeneralize from them. That said, the rating system was useful for articulating a range of teacher performances on the three traits of interest: instructional practice, technology challenge, and classroom management. The scores most related to the STORE professional development goals concerned instructional practices and, to a lesser extent, addressing technology implementation problems and managing student acclimation to using the GIT in hands-on tasks.<\/p>\n
On the instructional practice trait (mean = 2.18, standard deviation = .87), the findings reveal that six of the teachers only occasionally showed some ability to pose thought-provoking, open-ended questions about the scientific content to the students in discussions, which, in turn, enabled them to gauge student understanding. The relatively low frequency of such interactions is understandable given that the main objective of the observed lessons was giving students hands-on time with the data on their computers.<\/p>\n
Three other teachers also attended to the scientific content in their student interactions but only in terms of reviewing or lecturing about it. Two teachers interacted with students only on procedural technical issues. Hence, these findings, while limited, provide little evidence of high-level instructional enactment capacity. They are consistent with other teaching practice research, in which variance in teachers' ways of responding to students' learning needs illustrated individual differences in how they looked at teaching, what their objectives were, and what instructional resources they used (Schoenfeld, 2011).<\/p>\n
Concerning the classroom management trait (mean = 2.73, standard deviation = .47), eight teachers showed skill at keeping the students on task, and only three experienced management challenges that diverted them from teaching for short periods of time. Challenges to technology implementation, the third trait (mean = 2.18, standard deviation = .60), were common, however. Most of the technology problems were beyond the teachers' control. Only three teachers had no technology performance problems; seven had some, though only occasionally. One teacher had a severe problem: he had to terminate a STORE session in his computer lab because the district did not have enough bandwidth at the time to support all of his students interacting with Google Earth online. The principal asked him to terminate the lab session so that staff at a different school in the district could go online.<\/p>\n
There were instances when teachers showed special adeptness at proactively anticipating what types of implementation challenges might arise and planning ways to keep those challenges from getting in the way of learning. For example, one teacher, knowing that her set of student laptops could not be connected to the Internet in her classroom, used a thumb drive to install a local version of Google Earth on each laptop. Then, after being told by the STORE principal investigator that the local Google Earth access would prevent students from generating numerically accurate elevation profiles, she resegmented her lesson. First she asked students to rate locations in the New York study area as high, medium, or low, reserving the identification of real values for a class discussion. In the discussion, she projected the real values on her SmartBoard from her teacher computer, which was the only computer in the class with Internet access.<\/p>\n
A different teacher, after observing that his students were neither following textual directions about how to use the relevant Google Earth functionality nor watching him demonstrate it at his presentation station, decided instead to use a special application that allowed him to take control of all the students' computers while demonstrating. This caused more students to pay attention, because they could not continue to work on the computers on their own until he finished his demonstration.<\/p>\n
Some planning strategies were suggested by the principal investigator during discussions with the teacher; for example, having students collect vegetation data around the school grounds, embedding hypothesis-posing questions in lessons, and comparing weather and climate data. Other decisions were influenced by conversations between teachers during group professional development sessions. For example, in one session, a seventh- and eighth-grade teacher from one school got tips from a ninth-grade teacher at a different school about how better to scaffold tasks designed to get students to express conclusions about relationships between temperature and precipitation averages and climate model-derived projections.<\/p>\nThese implementation results are interesting from an exploratory perspective. The scores from the observations, however, represent only a glance at the teachers' experiences with implementing STORE and should not be taken as an indicator of what may have occurred in their classes on other days.<\/strong><\/p>\nConfronting Barriers to Persistence With the Innovation<\/h2>\nNeeds Assessment<\/h3>\n
The project team strove to enact professional development strategies that would maximize the likelihood of the teachers' persisting with the innovation over multiple years. The strategy was informed by a needs assessment: a teacher survey administered in the middle of the project, timed so that teachers could react to the strategies already in place while changes could still be made if needed. The following are highlights. On a scale of 4 to 1 (4 = very high<\/em>, 3 = somewhat high<\/em>, 2 = somewhat low<\/em>, 1 = very low<\/em>), teachers were asked to rate their perseverance with STORE. The mean rating was 2.9. Then, items asked participants to rate their agreement with statements about perseverance, again on a 4-point scale (4 = totally agree<\/em>, 3 = agree more than disagree<\/em>, 2 = disagree more than agree<\/em>, 1 = totally disagree<\/em>). The means on these items, from highest agreement to lowest, were as follows:<\/p>\n\n- \u201cIt would be easier to persevere with STORE if other teachers at my school were also participating.\u201d (3.3)<\/li>\n
- \u201cEven though there have been many e-mails about STORE features and there is now the STORE master website from where you can go to get to any of the STORE resources, I find it hard to commit the time I feel is needed to study all these resources independently and use them in my instructional planning.\u201d (2.9)<\/li>\n
- \u201cIt would be easier to persevere with STORE if there was a clear relationship between doing STORE activities with my students and the tests I\u2019m accountable to give to them.\u201d (2.7)<\/li>\n
- \u201cIt would be easier to persevere with STORE if my administrators (i.e., the principal, science department lead, or district administration) were committed to STORE and recommending that teachers use it.\u201d (2.7)<\/li>\n
- \u201cI came into the STORE project very motivated to participate but because I have so many other responsibilities, it is challenging to persevere with the project.\u201d (2.4)<\/li>\n<\/ul>\n
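Each parenthesized figure above is simply the average of the teachers' 4-point responses to that item. A minimal sketch of that computation follows; the response values here are invented for illustration and are not the project's data:

```python
from statistics import mean

# Hypothetical responses from 12 teachers to one 4-point agreement item
# (4 = totally agree ... 1 = totally disagree); not actual project data.
responses = [4, 3, 3, 4, 2, 4, 3, 4, 3, 3, 3, 2]

# Item mean, reported to one decimal place as in the survey results above
item_mean = round(mean(responses), 1)
```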
These responses suggest that the teachers saw challenges in persevering with the project and that, hypothetically speaking, perseverance would have been greater if they had worked with school colleagues who were also implementing and had had more time to build mastery through independent study of the project resources. These conditions would make bigger differences than would alignment of the resources to accountability tests or support from administrators. The following comments elaborate on what the teachers were thinking concerning available time:<\/p>\n