Regan, K., Evmenova, A. S., MacVittie, N. P., Leggett, A., Ives, S., Schwartzer, J., Mastropieri, M., & Rybicki-Newman, M. P. (2019). A case of early adopters of technology in a social studies classroom. Contemporary Issues in Technology and Teacher Education, 19(3). https://citejournal.org/volume-19/issue-3-19/social-studies/a-case-of-early-adopters-of-technology-in-a-social-studies-classroom

A Case of Early Adopters of Technology in a Social Studies Classroom

by Kelley Regan, George Mason University; Anya S. Evmenova, George Mason University; Nichole P. MacVittie, George Mason University; Alicia Leggett, George Mason University; Samantha Ives, George Mason University; Jessica Schwartzer, George Mason University; Margo Mastropieri, George Mason University; & Maria P. Rybicki-Newman, George Mason University

Abstract

Integrating unfamiliar technology in the classroom often requires ample technological resources and professional development. However, these resources are often not available. This case study, which combines qualitative data with pre- and posttest student data, illustrates how one pair of coteachers autonomously planned for and integrated a digital tool for persuasive writing into their fourth- and fifth-grade classrooms without external supports. Findings revealed the decisions the teachers made to integrate the tool into their social studies curriculum and what influenced those decisions, implementation, and student outcomes. Within the context of this case study, the authors provide suggestions for teachers to improve student learning when integrating technology in the classroom. Future research is also discussed.

Rapid technological developments and access to digital learning are changing today’s classrooms. Digital technology can help students become more active agents in their learning and provide teachers with accessible data, so they can make effective data-driven decisions when planning quality instruction. Teacher integration of technology can lead to increased student learning and engagement (Darling-Hammond, Zielezinski, & Goldman, 2014), and computers can be used to personalize instruction for students with diverse needs (Bouck, 2016).

Technology also has possible sociocultural implications because of its potential to function as a “third space” in education, the intersection of everyday knowledge and institutional knowledge thereby creating a “productive cultural space for learning” (Bhabha, 1994; McCarthey, Kennett, Smith, & West, 2017, p. 49). Additionally, technology has the potential to be motivating and enhance student learning in subject areas that students may perceive as less appealing, such as social studies (Heafner, 2004).

Meaningful use of technology to support social studies curriculum includes virtual field trips (Shriner, Clark, Nail, Schlee, & Libler, 2010), interactive websites, apps (Waters, Kenna, & Bruce, 2016), and web-based digital libraries. Digital videos can be used to encourage critical thinking or to build historical empathy (Bell & Bull, 2010) and blogs or photoblogs have been used to facilitate disciplinary literacy (e.g., Barrow, Anderson, & Horner, 2017).  

These tools may be used in conjunction with brief texts or historical fiction to support the acquisition of content knowledge (Shanahan & Shanahan, 2008). Some historical fiction selections can be content rich and less overwhelming for younger students and those who read below grade level. In addition, several studies have demonstrated the benefit of using cognitive organizers on the computer to enhance social studies learning (e.g., Boon, Burke, Fore, & Hagan-Burke, 2006; Boon, Burke, Fore, & Spencer, 2006).

The increasing integration of technology in social studies instruction is a practice embraced by the social studies education community and promoted in discipline books and journals (Hammond, 2014). With teacher scaffolding, technology can extend student learning of content knowledge and promote greater inclusion of local history. Technology, pedagogy, and content knowledge (TPACK) is a widely adopted conceptual framework for thinking about the integration of these three components (Gómez, 2015; Hammond & Manfra, 2009; Mishra & Koehler, 2006). A substantial gap exists, however, between conceptual understanding and utilization.

A focus on each of the components of TPACK helps to discern a teacher’s strengths and areas where improvement is needed.  Hammond and Manfra (2009) offered a practical model focusing on pedagogy and technology within the TPACK framework as a way to begin to address the implementation gap.  Hammond and Manfra described a teacher’s pedagogical techniques of giving, prompting, and making in social studies instruction and ways the latter promotes student-centered learning. Instead of using technological tools to give information to learners, the learner is prompted to engage actively with materials to develop new understandings and ultimately to make a product that represents this understanding.  Pedagogical knowledge (PK) is revealed when teachers design and implement a lesson (Mishra & Koehler, 2006).  Thus, the methods, strategies, behavior management techniques, and assessments used in the classroom to facilitate student learning are relevant to a teacher’s PK.

When teachers add technological knowledge (TK), they are considering other ways to accomplish the task at hand or enhance learning by selecting and using different technologies. In the content classroom specifically, a comprehensive model for how to effectively incorporate technology into one’s pedagogy is in the early stages of development (Gómez, 2015).

Despite the promising impact of teaching with technology, and a widely accepted conceptual model, a large body of literature has demonstrated the challenges with integrating technology across the curriculum.  Teachers may lack technological knowledge (Gorder, 2008; Mumtaz, 2000), question the usability and value of the technology for enhancing teaching and learning (Ertmer, 2005), and struggle to select and meaningfully connect the technology to the content and the expected educational outcomes (Gorder, 2008; Hutchison & Reinking, 2011).  For example, in a survey of middle school teachers of science, language arts, and social studies, most respondents reported that students never used technology for writing blogs, emails, autobiographies, biographies, or lab reports (Graham, Capizzi, Harris, Hebert, & Morphy, 2014).  Lack of consistent access to technology (McCarthey et al., 2017) and limited or no professional development opportunities related to technology are identified barriers to utilization (McCarthey et al., 2017; Waters et al., 2016). 

Additionally, if a technology tool provides data on students’ performance, teachers may struggle to make sense of those data and consequently use them inadequately (Mandinach & Gummer, 2013). Transforming the use of technology in the classroom to be more student-centered and aligned with the goals of the curriculum has proven to be challenging for teachers (Hutchison & Woodward, 2018). When technology is supplemental to instruction or included as an add-on and not aligned with the instructional purpose, it does not enhance student learning to the fullest. Teachers, for example, may assign students to play online games as part of center-based instruction or ask students to type a paper on the computer in the back of the room.

In these examples, teachers are using technology but not necessarily integrating technology to improve student learning.  Even when teachers attempt to integrate technology for learning, they tend to do so in a prescriptive and limiting way, thus discouraging collaboration and creativity (McCarthey et al., 2017).  In summary, evidence in the literature indicates that teachers typically have not integrated technology into their pedagogy, they have been using it for generic purposes such as displaying information to students (Funkhouser & Mouza, 2013), or they have not used it to its full potential to enhance student learning.

While research has identified the barriers teachers face when considering the integration of technology, less is known about how in-service teachers who are technologically savvy and open to facilitating and mediating student-centered learning experiences navigate the process of integrating technology in the classroom.  A case study of two social studies teachers’ effective use of literacy and technology illustrated the varying degrees of technology integration in a content classroom (Curry & Cherner, 2016).  One teacher, for example, used technology to do things she would normally do, but the technology allowed her to do it faster and easier. The second teacher facilitated student use of technology extensively in his instruction and assessment practices.  Similarly, teacher intentions and attitudes toward perceived successful integration and utilization of technology in the social studies classroom have been explored (Gómez, 2015) but, specifically, how they plan for its use, respond to student performance, and make instructional decisions remains unclear.  

The following report of our case study offers a unique perspective, in that it examines coteachers planning for use of a technology-based tool to improve the quantity (i.e., number of words and number of sentences), organization, and overall quality of writing (e.g., elements of persuasion and transition words) of fourth- and fifth-grade students during social studies instruction. The case study also revealed how these plans were executed in the classroom and how the teachers made instructional decisions during and after implementation. The purpose of this study was to better understand how teachers planned for, facilitated, and collaboratively problem-solved during implementation of technology to improve student learning in general, and particularly, in writing.

Our Study

Case study research design is led by a series of propositions within a conceptual framework to answer a logical problem (Yin, 2009). In this study, our case was a pair of coteachers in an urban elementary school who embarked on integrating a technology-based graphic organizer (TBGO) into their literacy instruction to improve the written expression of fourth and fifth graders. However, the current study deviated from previous investigations of the TBGO’s effectiveness (Evmenova et al., 2016; Regan et al., 2016a; Regan et al., 2017). We wanted to explore what happens when teachers were provided with the autonomy to plan for, construct, and deliver their own lessons to teach a technology-based intervention. To achieve this, coteachers were provided a 3-hour training by the third and fourth authors to learn and practice with the TBGO.

Following training, the coteachers had ongoing meetings with each other to plan instructional lessons collaboratively. Instead of using vetted, researcher-developed lesson plans (i.e., Regan et al., 2016b), teachers had the flexibility to integrate the tool into their classrooms in any way they wished to teach their students.

The school’s leadership and the overall culture of the school were well aligned with the technology-based project. For example, 1:1 laptops were accessible at the school site, and participating teachers were already reportedly using technologies for testing and for providing instruction. The teachers volunteered to employ the TBGO and were receptive to the use of technology for enhancing student learning. We provided onsite expertise to troubleshoot any technical needs or malfunctions within the tool. (There was limited, if any, need for such support throughout the study.)

We hypothesized that our case study would reflect the authentic decisions, discussions, and challenges of planning for and implementing the TBGO in a whole classroom environment without the use of vetted lesson plans. We also hypothesized based on previous research (e.g., Evmenova et al., 2016) that most students would benefit from the TBGO. This study sought to answer the following research question: How do two teachers in an urban elementary school integrate a technology-based intervention into their literacy instruction?

Methodology

To answer the research question, we used the case study approach (Yin, 2009) to collect and analyze data from observations, interviews, pre- and posttest writing samples, and documents generated over 5 weeks of implementation.

Participants    

A school district partnered with us in support of a grant proposal. After the grant funding was awarded, the school district liaison in central administration asked the school principals if any teachers in their designated schools may be interested in learning more about a writing-with-technology intervention. Subsequently, a school-based technology instructional coach and a principal contacted us with an interest in using the technology-based writing tool at their elementary school.  

The school leaders had identified two teachers in the school who were eager to use the technology in their classrooms.  The leaders of the school and the teacher volunteers were not familiar with the product other than the purpose to support students’ written expression.  One teacher (fourth grade) was interested in having her students be more independent when approaching writing tasks, and she hoped the technology tool would increase students’ written language. The other teacher (fifth grade) wanted to participate for similar reasons, and she thought that exposing her students to technology would be helpful given that standardized testing was moving toward an online format.

Of the three TBGO platforms (described below), the teacher participants opted for the computer-based graphic organizer, since the classes had enough laptops for every student and the teachers were comfortable using Microsoft Word®. They also thought it would be easier for students to type responses on a traditional keyboard. Since the teachers were beginning a unit on persuasion, they selected the persuasive TBGO. The teachers, along with 29 of their fourth- and fifth-grade students from two classrooms, participated in the study.

Coteachers. The two teachers involved in this study were both female and Caucasian. They had been coteaching their classes for a year prior to the onset of the study. Therefore, the fourth graders and fifth graders joined together during instruction delivered by both teachers. Both teachers entered teaching through an alternative placement program and held a master’s degree in education. The teachers had an average of 3 years of teaching experience. Both teachers were licensed in elementary education and English as a second language (ESL). Teacher A taught fourth-grade English language arts (ELA) and social studies (SS). She was fluent in Spanish. Teacher B taught fifth-grade ELA and SS. They described themselves as competent with technology and used it socially and daily in the classroom to support instruction.

Students. There were 29 fourth- and fifth-grade student participants. The group of students included English language learners (ELLs) and students who received special education services. All students received free or reduced-price lunch. The primary language spoken by the majority of ELL students was Spanish. Table 1 includes student demographic information.

Table 1
Demographic Information for Student Participants Across Two Classrooms

                               Fourth Graders    Fifth Graders    Total
N                              18                11               29
Gender (M/F)                   9/9               5/6              14/15
Hispanic Latino                10                8                18
African American               6                 3                9
Multi-Racial                   2                 0                2
Special Education Services     3 (SLD)           2 (SLD; SLI)     5
English Language Learners      11 [a]            6 [b]            17

[a] Six of these students were monitored as FLEP (formerly English Limited Proficient).
[b] Three of these students were monitored as FLEP (formerly English Limited Proficient).

Setting

The setting was a low-performing bilingual elementary school in an urban, mid-Atlantic city on the East Coast of the United States. Approximately 391 students were in Grades PK-5, of whom 84% were Hispanic/Latino, 12% African-American, 2% Caucasian, 1% Asian, and 1% multiracial. More than 13% of students were diagnosed with disabilities and received special education services. About 66% of students in the school had limited English proficiency.

The main classroom used in this study was located in a temporary trailer. The classroom was rectangular with two large windows and walls covered in instructional posters, charts, and student work. Desks were clustered in groups. Teacher desks were in the back next to shelves and containers filled with books and iPads. Two larger tables were used for small group activities. The room had a portable cart, an LCD projector, and a white board. Teachers reported daily student use of laptops and iPads for small group instruction. For whole group instruction, teachers reported daily use of a projector and a document camera.

The Technology-Based Graphic Organizer

Prior research has investigated TBGOs with self-regulated learning strategies to support students with and without disabilities in composing persuasive essays (Evmenova et al., 2016; Regan et al., 2016a) and argumentative essays (Boykin, 2015). The TBGO has improved students’ organization of writing, number of words, number of transitions, and writing quality.

Of the three platforms of TBGOs (i.e., computer-based, mobile-based, and web-based), teacher participants in this study used the computer-based graphic organizer (CBGO) platform created in Microsoft Word® (Evmenova & Regan, 2012). Of the three writing genres (i.e., persuasive, argumentative, and narrative), participants selected the persuasive genre. 

The CBGO includes five parts: (a) Pick your goal, (b) Fill in the chart/table below, (c) Copy the text in the orange box, (d) Paste the text into the box below, and (e) Evaluate (see Appendix A for a completed example).  Students begin when provided a writing prompt from the teacher. In Part 1, students read the prompt and then select a goal from the drop-down menu (e.g., “I will include three reasons and two examples”). Goal setting is one of the self-regulated learning strategies embedded in the CBGO.

In Part 2, students begin by writing words in the Brainstorm box to represent ideas they may have when considering the writing prompt. Then, students complete the table in Part 2. The first column in the table includes a vertical mnemonic, IDEAS, which in the persuasive genre stands for the following: Identify your opinion, Describe three reasons, Elaborate with examples, Add transition words as you go, and Summarize. The visual reminder found throughout the CBGO is a light bulb to represent the IDEAS mnemonic. In addition to the visual reminder, text hints and audio comments are embedded in the CBGO to support students’ self-regulated learning strategy of self-instruction. For example, text hints appear when students hover over each letter of the mnemonic (e.g., when hovering over “I = Identify your opinion,” a text hint would appear stating, “What do you think about the topic?”) or audio hints are played when students click on the light bulb icon located by each letter of the mnemonic.

The second column in the table, Main Points, provides space for students to take the words from the brainstorm box and organize them in the order of the IDEAS mnemonic. Students then write complete sentences in the third column based on the ideas generated in the brainstorm and main points columns. As students write sentences, they can select transition words from a pull-down menu.

The fourth column, titled Check Your Work, allows students to check a box to monitor whether each essay part was successfully included (e.g., “I included three reasons to support my opinion”). It is another self-regulated learning strategy embedded into the CBGO (i.e., self-monitoring). After students check their work, they move to Part 3: Cut. In this section, students copy the content of the orange box in Part 2 and paste the contents in the large white text box with an orange border found in Part 4: Paste.

The table-to-text feature of Microsoft Word® transforms all student sentences written in different rows of a table into a paragraph in the text box. Students can edit the contents, as needed. Students may produce a complete six- to eight-sentence paragraph depending on the selected goal.

Finally, students evaluate their work in Part 5. Self-evaluation, another self-regulated learning strategy, includes counting and then inserting the number of sentences, reasons, and elaborations. Students also indicate from a menu whether the sentences in the paragraph make sense, how they feel about their paragraph (by selecting one of three images), and their goals for their next writing. Space to insert specific teacher/peer feedback is included. Students save the completed organizer as a Word document or print out the organizer and completed essay. In this study, student participants saved their essays on individual thumb drives.

Data Sources

Multiple data sources were used in this study. Data sources included observations of planning sessions and instructional lessons, teacher participant lesson plans and teacher logs, pre- and postinterviews of the teacher participants, and student pre- and posttest data.

Observations of planning sessions.  Teacher participants had a scheduled planning period twice a week for a total of 90 minutes. The teachers were asked to audio/video record themselves when planning instruction. Two observers independently took notes on all discussion topics, teacher dialogue including quotes, and any decisions observed. They then debriefed and compared notes to verify interpretations and came to consensus on what was observed.    

Observations of instructional lessons. All teaching sessions related to the CBGO were video/audio recorded. The five 60- to 90-minute lessons were cotaught. From these recordings, two observers independently took notes on instructional activities and arrangements, curricular materials, and the length of the lessons. The two research members debriefed and compared notes to verify each other’s interpretations and came to consensus on what was observed. These observations allowed us to describe teachers’ and students’ actions during instruction.

Teacher lesson plans and logs. Teachers completed school-developed weekly instructional lesson plans. The four-page template included the following information per lesson: lesson standard, reading and writing objective, agenda, active engagement strategies, materials, and instructional literacy groupings. The template also included a checklist of instructional strategies and questions that teachers could ask students during instruction.

Teachers completed a log per instructional lesson. The lead researcher developed the electronic teacher log. The log included multiple choice and open-ended questions asking teachers to identify information such as the purpose of the day’s tasks, characteristics of student behavior, and descriptions of student engagement. The teacher log also asked them to brainstorm the challenges from the lesson, what went well, and any additional insights. 

Teacher interviews. Each teacher participant was interviewed before and after the study by the lead researcher. The semistructured interviews were essential to understanding teachers’ motivation for participating in the study, their personal and professional backgrounds, as well as their perspectives on the lesson creation, implementation, and perceived effectiveness of the CBGO. The pre-interview protocol was guided by previous research (Regan et al., 2016a) and was provided to teachers in a questionnaire format. 

The questionnaire included questions regarding educational and professional background, how the teacher participants taught writing, and how they incorporated technology into their lessons. The questionnaire was sent via email and the teachers emailed the researchers their responses.

The postinterviews, completed by phone at the end of the study, lasted 30 minutes, were audio recorded, and then transcribed verbatim. The postinterview included questions regarding the teachers’ overall experience with the CBGO. Sample questions included (a) Can you describe what went well? (b) Any challenges? and (c) How many opportunities/how often did you have students independently practice with the CBGO?

Writing performance measures. Student writing was evaluated three times: before the teachers taught the CBGO, after instruction and practice was provided with the CBGO, and again when students wrote without the CBGO. The data gleaned from student writing included four writing performance measures: number of words, number of sentences, number of transition words, and writing quality.

Writing measures were used in previous research (e.g., Evmenova et al., 2016). The number of words measure used Microsoft Word®’s word count tool to determine the total number of words that students used in their writing. A sentence was defined as a complete thought, inclusive of a noun and a verb, with ending punctuation. Transition words were counted if they came from the pull-down menu in the CBGO or were student-generated words that signaled a transition from one thought to another (e.g., “First” and “Furthermore”). The holistic writing quality measure was determined by an 8-point rubric used in previous research. The rubric included descriptors of a topic sentence, reasons, elaborations, transitions, and a summary.
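For readers interested in how the countable measures could be automated, the following is a minimal, hypothetical sketch; the study itself relied on Microsoft Word®’s word count and hand scoring. The transition-word list shown is an assumed subset of the CBGO menu, and the holistic quality score still requires trained human raters.

```python
# Illustrative sketch only: counts the three frequency-based writing measures.
import re

TRANSITION_WORDS = {"first", "second", "third", "next", "furthermore", "finally"}  # assumed subset

def score_essay(text: str) -> dict:
    # Words: alphabetic tokens, similar in spirit to a word-count tool.
    words = re.findall(r"[A-Za-z']+", text)
    # Sentences: the study required a complete thought with ending punctuation;
    # splitting on terminal punctuation is only a rough proxy for that judgment.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    # Transitions: menu-based or student-generated transition words.
    transitions = sum(1 for w in words if w.lower() in TRANSITION_WORDS)
    return {"words": len(words), "sentences": len(sentences), "transitions": transitions}

print(score_essay("First, I think dogs are great. Furthermore, they are loyal."))
# {'words': 10, 'sentences': 2, 'transitions': 2}
```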

Procedures

Prior to any data collection, Institutional Review Board (IRB) approval was obtained from both the university and the school district to interview and video-audio record teacher and student participants. Consent/assent forms were obtained.

Teacher training. After the teachers provided consent, two research team members met with them. The teachers received the following: (a) an overview of the CBGO for persuasive writing, (b) an introduction to the IDEAS strategy, and (c) practice writing with the CBGO for persuasive writing. The training lasted 3 hours.

Pre-post interviews. After providing consent, teachers responded to a prestudy questionnaire via email. After the study, each teacher participated in a phone interview with the first author.

Pretest. Students were provided individual laptops during ELA to respond to one of two validated persuasive writing prompts within 30 minutes (e.g., “Write an essay on whether or not schools should be separate for girls and boys”) using a locked Microsoft Word® document; the document allowed only typing, without the ability to change settings. The instructor followed a prescribed script to administer directions, which included reading both prompt choices aloud.

Planning sessions and instructional lessons. Teachers collaboratively planned instruction involving the CBGO a total of five times over 5 weeks. They were provided a camera to record planning sessions as well as any instructional sessions. The planning sessions were an average of 45 minutes, with the exception of Planning Session 2, which was reportedly less than 10 minutes. Data from planning sessions and teacher logs indicated that the first planning session took place a day after training and before any student instruction. Subsequent planning sessions were typically completed at the beginning of the week with both teachers present.

Due to standardized testing obligations, instructional lessons occurred no more than 2 to 3 days a week. The lessons were taught with the fourth and fifth graders present in the same room at the same time. Fourth-grade lessons were taught with Teacher A as the lead teacher and Teacher B as the support teacher.  Fifth-grade lessons were taught vice versa. Specific instruction related to writing and the CBGO was embedded within a 90-minute ELA class. Instructional time per lesson varied but was an average of 50 minutes.

Posttest. Following all instructional and practice lessons, students were provided individual laptops during ELA to respond to one of two persuasive writing prompts within 30 minutes using the CBGO (e.g., “Write an essay on whether or not students your age should make the rules for the classroom”). The instructor followed the same prescribed script as used at pretest.

Maintenance test. Approximately 3 weeks after posttest, one maintenance test was administered in which students responded within 30 minutes to one of two persuasive writing prompts using the locked Microsoft Word® document without the CBGO. Procedures were the same as posttest.

Interobserver Agreement and Fidelity of Testing

For pre-post tests and maintenance tests’ writing performance scores, interobserver agreement (IOA) was calculated. Prior to scoring, all members of the research team were trained by the primary researcher to score writing probes. Training included scoring of anchor essays from previous research studies. These anchor essays of varying length and quality were reviewed together, discussed, and independently scored over three 30-minute sessions.

Following training, two members of the research team independently scored each student participant’s essay (i.e., pretest, posttest with CBGO, and maintenance test) on all writing performance measures. Any discrepancies between the two raters were discussed and resolved until 100% agreement was reached. To establish IOA, an independent third rater scored 30% of the essays. IOA was 100% for number of words, sentences, and transition words and 86.2% for writing quality.

Two observers measured fidelity of testing. Fidelity of testing was calculated by dividing the number of steps that occurred by the number of steps planned (e.g., “Reads script from protocol”). Fidelity for all testing procedures was 100%, with 100% agreement on the fidelity of testing.
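As a simple illustration of these two percentages, the hypothetical sketch below computes point-by-point agreement between two raters and the fidelity ratio of steps occurred to steps planned; the scores and step counts shown are invented for the example, not data from this study.

```python
# Hypothetical sketch of the agreement and fidelity calculations reported above.

def percent_agreement(rater_a: list, rater_b: list) -> float:
    # Point-by-point agreement: proportion of items scored identically by both raters.
    agreements = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
    return 100 * agreements / len(rater_a)

def fidelity(steps_occurred: int, steps_planned: int) -> float:
    # Fidelity of testing: steps that occurred divided by steps planned.
    return 100 * steps_occurred / steps_planned

# Example with made-up holistic quality scores from two raters on five essays.
print(percent_agreement([5, 6, 4, 7, 3], [5, 6, 4, 6, 3]))  # 80.0
print(fidelity(steps_occurred=12, steps_planned=12))        # 100.0
```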

Data Analysis and Credibility

We first independently observed all planning and instructional lesson videos, wrote summaries of these observations, and reviewed all transcribed interviews to gain a general sense of the process that was followed and of teacher perceptions. Then, the lead researcher independently coded the transcripts of interviews and the written summaries of the planning sessions and instructional lessons by identifying segments in the data that were responsive to answering the research question (as in Merriam, 2009).

The lesson plans and teacher logs were used primarily as secondary data sources to triangulate the observational data and to refine the initial open codes. Additionally, memos by the lead author were taken throughout observations and interviews to better assess bias, develop initial categories, and further triangulate the data. We then met to review the initial codes, data, and preliminary categories in order to develop broader themes.

All themes emerged from the data. Initial themes included Challenges of Technology Use, Competing Agendas, Decision-Making, and Lack of Explicit Instruction. We examined each of these themes, compared results, and confirmed or considered any conflicting explanations before identifying three overarching themes.

Member checking of interview data promoted quality in our understanding of teacher experiences. These steps increased trustworthiness by incorporating multiple perspectives, opening communication around any biases, and increasing coding efficiency and reliability of coding (as suggested by Merriam, 2009).

This process also allowed multiple themes to emerge and be discussed, helping us better understand how the teachers integrated the new technology-based intervention into their literacy instruction. In addition to the qualitative analysis, pre-post and maintenance student writing performance data were scored and entered into the Statistical Package for the Social Sciences (SPSS) to run paired-samples t-tests.

Findings

Three broad themes emerged in the analysis of the data to inform the research question: (a) curricular integration, (b) explicit instruction, and (c) competing agendas.

Curricular Integration

When the teacher participants were first interviewed, they said that they used (a) iPads for small group literacy instruction 30 minutes a day and (b) computers in the classroom for small groups of students completing a computer-based reading program twice a week. Teacher A reported that she had used laptops for whole class instruction only once prior to the study. When describing writing instruction specifically, she stated,

We use paper-based graphic organizers and checklists…. We brainstorm, read essay prompts aloud, and discuss our brainstorms as a whole group. We revise our work…critique other students’ work. Students also write creatively for homework and receive feedback. Students are expected to write daily…using [an acronym that stands for] — restate question, answer, provide evidence, explain, summarize.

Teacher B added that her fifth graders used digital readers monthly and that the students typed their writing monthly for end-of-unit projects. She also described the schoolwide writing mnemonic strategy of RACES, an acronym for restate, answer, cite, explain, and summarize, a slight variation of the acronym shared by Teacher A.

Despite descriptions of little technology use for general instruction and the absence of technology for writing instruction specifically, teachers demonstrated more than using technology only as an add-on to their instruction. For example, the teachers integrated the CBGO into their instruction to teach particular literacy and social studies standards of learning. We refer to this use as curricular integration.

Appendix B represents the Common Core State Standards (CCSS) and district teaching objectives identified in teachers’ lesson plans. Curricular integration was observed in planning sessions and in instructional implementation.

Planning of instruction. Observations of planning sessions and the teacher logs revealed how the teachers planned their instruction. Throughout the planning sessions, a pattern developed: the teachers would describe observations from completed lessons to each other, discuss next steps for instruction, and then make a decision based on those observations. Teachers were not observed reviewing student work. Of the twice-weekly planning sessions in the teachers’ schedule, a total of five over the study’s duration were specific to the teaching of the CBGO.

Curricular integration was discussed during all planning sessions. In Planning Session 1, the teachers discussed how the CBGO would integrate into what they were currently doing. Teacher B stated that, currently, students’ written responses were based on a passage they were reading in social studies. They both wished to “…keep it [CBGO] aligned with what they are already doing now…” and “…use it to talk about what it means to have strong evidence [when writing].”

The teachers also recognized the added challenge to persuasive writing when using evidence from text on a topic that is less familiar and personally relevant to students. Further observations of the teachers’ discussions revealed that the 29 students were completing standardized testing for part of the school day during the first 2 weeks of the study and studying a unit on the American Revolution. During the discussion, Teacher A commented,

…It would be helpful to get to the point where you’re reading and responding with your writing, but maybe in the very, very, beginning it might make more sense just to be like, “Do you like a pizza party or an ice cream party and say why?”

The teachers collectively thought that the latter format would be easier for the students to accomplish. Teacher B agreed, and they decided that for Day 1 of instruction, they would be the ones to model use of the CBGO with a motivating and relevant topic, projecting the organizer for the students to see. Student ideas would be elicited and added to the organizer. Day 1 and subsequent plans were described in the teacher log as follows:

Day 1 – introduction and brainstorming around persuasive writing and IDEAS.
Day 2 – modeling of how to use the CBGO using a fun question prompt.
Day 3 – independent practice using the CBGO to respond to the same question prompt as previous lesson.
Day 4 – apply graphic organizer to a social studies content related prompt. Read aloud supporting text.
Day 5 – students collect evidence and details for question prompt.
Day 6 – students use CBGO to develop a paragraph in response to a social studies related prompt.

As indicated in the log, the plan was for students to be first exposed to the CBGO and to practice using it when responding to a prompt that did not require facts and evidence from text used to teach the social studies content. After doing so for approximately 2 weeks, students would be done with the standardized testing, and teachers could then fully integrate the CBGO into their social studies curriculum.

Implementation. Observations of teacher planning sessions, instructional lessons, lesson plans, and teacher logs informed us how the teachers integrated a technology-based intervention into their literacy instruction. Data indicated that a total of four instructional lessons over 3 weeks were provided, followed by three opportunities for students to practice writing with the CBGO over a subsequent 2-week time period. The practice sessions entailed students independently using laptops in social studies to respond to a given prompt using the CBGO. Students were observed to be on task, and some students were observed with a book from the social studies curriculum while completing the CBGO. During these practice sessions, the teacher participants observed students and provided individualized support, as needed. Teacher A shared her observations of how students were able to work independently:

…after initially working with the kids in a few practice sessions towards the end, the students were able to begin writing with very little support from teachers. So we could just give them the prompt and whatever prework we did – set them up for it. They were able to start writing promptly. There were relatively few hands up in the air wanting a teacher to get them started.

Observations of instructional lessons showed the teachers using a variety of methods for teaching students how to write a persuasive essay with technology. They used whole group and peer discussions, small group instruction, visuals to support student learning, Socratic seminars, literature to model exemplar persuasive prose, narrative and historical fiction novels (e.g., Samuel’s Choice by Richard Berleth, Pink and Say by Patricia Polacco, Independent Dames: What You Never Knew About the Women and Girls of the American Revolution by Laurie Halse Anderson), and interactive notebooks.

Lesson 1 illustrated some of these methods. For example, the lesson began with a teacher-directed discussion of what persuasive writing was; the teacher documented student ideas on chart paper for everyone to see. The teacher then read aloud the book Hey, Little Ant by Phillip and Hannah Hoose. Students were directed to listen to the story and decide if they would take the opinion of the young boy or the ant character and to provide reasons as to why they agreed with the character.

Soon after, the mnemonic IDEAS was introduced to the students. Students stapled a distributed photocopy of the IDEAS strategy into their interactive notebook. Students were eventually given a handout of text to read about wearing uniforms or no uniforms in school. The students were asked to use a highlighter to identify reasons and examples provided in the text. The objective of Lesson 1 was for students to identify the elements of a persuasive essay and to apply the IDEAS strategy to an exemplar text.

In subsequent lessons, the teachers provided students with the opportunity to have peer discussions (i.e., turn and talk) or interactive 1:1 debates to share ideas about a particular topic. The students were also able to write collaboratively or independently about various topics of persuasion while using the CBGO. As observed in Lessons 2 and 3, the teachers used a projector to display the CBGO on a white board as the lead teacher typed in students’ shared ideas when answering a prompt as a whole group.

The writing prompts included the following: “Should students take standardized tests – Why or why not?” “Should we have PE every day?” “Should children be allowed to watch as much TV as they want?” These relevant prompts required students to provide personal reasons to support their opinion.

In contrast, the teachers also gave students guided practice opportunities to support an opinion with facts or evidence drawn from text. The texts, in these cases, were typically short readings or books about the American Revolution read aloud by the teacher. The teachers referred to this as “add text evidence.” For example, when students were given practice opportunities with the CBGO, prompts included the following: “What do you believe was the single most important cause of the Revolutionary War?” “Did women play an important role in the American Revolution? Why or why not?” “Should Samuel (from Samuel’s Choice) have joined the colonists in their fight for independence? Why or why not?” Teacher A described her instruction when guiding students to respond to the latter prompt:

…I gathered student ideas…based on IDEAS. The poster was organized into “yes” (Samuel should have joined the colonists) and “no” (Samuel should not have joined the colonists). We then went through and labeled as a class where each piece of info went (Is this an opinion? Is this a reason? How do you know?). Students were constantly referencing the poster in order to help them, particularly those who are very low readers. I also gathered details from the text for some of the lower students to reference, which seemed to help drastically for some of the children who might have gotten lost looking for details in the book. The higher students already had relevant details underlined in the book from previous lessons, which helped them…

Video recordings showed that students engaged with the CBGO in Lesson 2 and all subsequent lessons. While data revealed the teachers’ curricular integration when planning instruction, data also presented the teachers’ challenges with implementation. These challenges were best represented by the theme referred to as explicit instruction.

 Explicit Instruction

Explicit instruction involves the teacher providing step-by-step demonstrations, clear and concise language, many examples, and corrective feedback. Observations of the planning sessions and interview data revealed that teachers recognized three areas of need during instruction: (a) students struggled with the IDEAS mnemonic, (b) students did not use specific components of the CBGO, and (c) students struggled with technology. Explicit instruction to address any of these areas was not observed, however. Student performance data revealed that students’ written expression improved when using the TBGO.

IDEAS mnemonic. Reflections in the teacher log revealed that teachers were monitoring student performance. For example, after Lesson 1, Teacher A stated that students were “enthusiastic about how the graphic organizer worked,” but students

…needed more structure and scaffolding throughout lesson…. Students had misconceptions about what IDEAS stood for…. Students struggled to articulate the words opinion, reasons, and elaboration. We need to do some more vocabulary pre-work and practice using those words.

The teacher log also disclosed Teacher A’s struggle to recall what the letters represented in the IDEAS strategy, and she had to correct herself during instruction. Observations of instructional lessons indicated that the teachers verbally referred to the E in IDEAS as evidence rather than elaborate with examples. The teachers’ notes from the second planning session clarified this instructional change to the mnemonic:

We struggled with IDEAS as a strategy for persuasive writing. We couldn’t seem to remember it as a mnemonic…. We used Common Core standards to create a checklist to clarify the parts of IDEAS…. We all agreed that we really like the graphic organizer, but we don’t like IDEAS very much as we can’t seem to remember the parts ourselves, so we have less hope for it being memorable and useful for the children. The most successful part of our planning was comparing IDEAS to the [Common Core] checklist and equating “examples” with facts or details, which is the Common Core language.

Observations of instruction showed teachers displaying the visual anchor chart of the IDEAS mnemonic on the white board during every lesson. Despite the visual support, the struggle with the “stickiness” of the mnemonic reportedly persisted. During the postinterview, some students forgot or recalled minimal parts of the IDEAS strategy. One student inaccurately reported the D as representing define and the E in IDEAS as elaborate evidence. Additionally, during the postinterview, Teacher B referenced the mnemonic incorrectly:

I think because you know IDEAS, the second step is opinions [Note that this is an error. The first letter, I stands for “Identify your opinion.”], but yet it starts with a D. So it’s determine your opinion [This is the error. The D stands for “Determine three reasons.”]. So there was that. So we had to do a lot more reinforcing with what IDEAS stood for and what it meant. We even did some playing around with it to see if we could come up with our own. I think we came up with ARG, like a pirate. So it was like answer, reason. State your reason once or up to three times. And then G was gather your ideas for summarize at the end. We were just trying to think of other ways we could do the same concept, but make it a little easier for the kids.

CBGO components. Despite being unable to recall elements of the IDEAS strategy, both teachers relayed during postinterviews that they appreciated the structure of the organizer:

I really liked how it had a place in the organizer to create an outline that was off to the left that wasn’t included in their final essay, but it allowed them to see that “I planned ahead, and then I go ahead and expand into sentences.” I liked how that worked. (Teacher A)

There were some students who were able to take it and really understand how an essay comes together in terms of structure and organization. I would say that was the biggest improvement. Not necessarily length or how can I say it, like, they still struggle with how to articulate their ideas. But they were much better at grouping related information and structuring an essay, especially that sort of introduction and conclusion piece is so important. (Teacher B)

Another consistent observation across the instructional lessons and lesson plans demonstrated that the teachers did not provide any additional instruction about the self-regulatory (i.e., goal setting, self-monitoring and self-evaluation) components embedded in the CBGO. In a lesson conducted during the third week of the study, Teacher B reminded individual students who were done with their CBGO to now complete the “bottom section where you count your parts” and to “check off all of those boxes” in the Check Your Work column, dismissing the iterative process of monitoring one’s work while writing and not afterwards.

In addition, while introducing the CBGO in Lesson 1, Teacher B did not explicitly address the technological features of the CBGO or the self-regulated learning strategies. In later lessons, students were not observed using headsets with the text-to-speech feature to listen to and edit their essays. Teacher B shared in her postinterview, “We didn’t use text-to-speech or the audio comments. We did use when you hover over the certain features of the organizer and it gave you a prompt like a reminder.”

Teachers also reported during Planning Session 3 that students were “struggling with the main idea bucket” in Part 2 of the CBGO, dismissing parts of the CBGO and directly writing sentences, and not representing brief one- to two-word thoughts in the brainstorm box or the main points column. Rather than providing additional instruction regarding how to brainstorm effectively and use the CBGO for planning, the teachers reported that students needed more practice with the tool.

For example, Teacher A made the following comment during Planning Session 3: “They [students] need practice with the graphic organizer…. [Students need] way more practice with different prompts for this to have an effect.” Further, teachers said that students should use the tool to respond to personal and relevant prompts before writing on topics discussed in social studies and using text to provide evidence. Specific teaching objectives were not explicitly identified, but 30-40 minutes of class time was provided for students to practice with the CBGO.

Finally, in the postinterview, Teacher B noted the value of the transition drop-down menu component of the CBGO. Students were responsive to this feature, and teachers recognized students’ generalization of this in other writing. She explained,

I liked the transition dropdown a lot. I noticed students really enjoyed that. It took some of the scariness of choosing transitions when they had some options, and I think they became more comfortable with transitions. I see them popping up in their writing more frequently now.

Additionally, Teacher A noted the support of the CBGO’s copy-and-paste feature. She commented, “Part of the appeal was the fact that they don’t have to rewrite it [the essay] you know. They’re copying and pasting…”

Technology. In the third planning session, teachers described their students’ struggle with technology, specifically noting students’ labored typing skills, lack of awareness for how to select and copy text, and ignoring the text hints that pop up on the CBGO when one hovers over a letter of the mnemonic. Teacher A depicted the struggle with student skill and logistics:

The biggest challenge was the technology piece, especially for the younger kids, the fourth graders. There was a lot of student anxiety and discomfort about using Word features and commands. …With twenty-one kids…it was sort of overwhelming to help them. You know copy and paste, then save properly, and even opening the jump drive file and selecting the correct file was stressful…. Although we did have laptops at the school that we could use we ran into a couple of times where, for whatever reason, we couldn’t use the laptops that day or the laptops hadn’t been charged…. So it was a couple of logistical elements that were challenging.

Furthermore, the fifth-grade teacher said during a planning session that she had to explain to her students to press the left side of the mouse pad and move the cursor up. She also said that some students did not know how to use the shift key as a simpler way to make a capital letter, but consistently used caps lock. Teacher B lamented,

I feel like the kids who really needed the [content] help – it was hard to get to them because it was so many other things they had to learn and the technology piece. It’s like they didn’t even know how to select something or copy something.

While discussing lessons, the teacher made a cue card on poster board to post in the classroom. On the cue card, she provided the keys to perform the following tasks on the computer:  Undo, Copy, Paste, Capitalize, and Indent. Although Teacher A commented, “I knew the technology was going to be an issue,” explicit instruction for use of the technology was not observed in planning sessions or instructional lessons.

Despite some of the students’ lack of technological skills, teacher interview data suggested that teachers were considering how both technology and paper-based pedagogical tools could support their writing instruction. Teacher B explained,

When we ask them to do brainstorming on paper where we allow them to do things like concept webs or other graphic organizers, we see a little bit more…. But the process of creating a final copy was so much faster using the computer. And the process of editing and revising was so much faster. The kids didn’t have to labor over rewriting, making a mistake, and then wanting to rewrite it again. And then for me – having it digital or electronic was really great for me doing track changes when I wanted to give them feedback…

Student performance. Despite teachers not providing explicit instruction to address the problematic student skills observed during CBGO instruction, students’ persuasive writing performance improved as a result of using the technology-based tool. Teacher interview data included the teachers’ own observations of students’ writing progress and how it aligned with their goal for students to be more independent:

So fourth grade was particularly very needy of teacher support [before the CBGO] and having the sense to take their own initiative. And I saw the improvement there. In fifth grade they were already pretty good, but it [writing instruction] became even more hands-off, which is great.

Pre-post test data from the 29 students were entered into paired-samples t-tests to compare student performance on all writing measures (i.e., number of words, number of sentences, number of transition words, and overall writing quality score). Comparisons were made between scores from the pretest (i.e., writing a persuasive essay without the CBGO), scores from the posttest (i.e., writing a persuasive essay with the CBGO), and maintenance scores (i.e., writing a persuasive essay again without the CBGO). These analyses yielded significant differences in all but one of the eight comparisons.

Significant differences were found between pretest and posttest for the number of words, t(28) = -4.60, p < .001; number of sentences, t(28) = -4.55, p < .001; number of transition words, t(28) = -21.30, p < .001; and the holistic writing quality, t(28) = -6.92, p < .001. Significant differences were also found between pretest and maintenance test for the number of words, t(28) = -2.12, p = .043; number of transition words, t(28) = -4.03, p < .001; and the holistic writing quality, t(28) = -2.21, p = .036. The difference in the number of sentences between pretest and maintenance was not significant, t(28) = -1.26, p = .219.

These findings were confirmed with follow-up nonparametric tests, since some standard deviations exceeded the means (see Table 2). Such findings lend support to the effectiveness of the treatment for all students in this classroom at posttest and maintenance (with the exception of sentences at maintenance testing), despite the instructional challenges described above.
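To make the reported analysis concrete, the sketch below shows how a paired-samples t-test and a nonparametric Wilcoxon signed-rank follow-up could be run in Python with SciPy. The study itself used SPSS, and the scores generated here are synthetic placeholders, not the participants’ data.

```python
# Sketch of a paired-samples t-test with a Wilcoxon signed-rank follow-up,
# using synthetic pretest/posttest scores for a single measure (e.g., words).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pretest = rng.normal(95, 44, size=29)             # hypothetical pretest word counts
posttest = pretest + rng.normal(36, 20, size=29)  # hypothetical gain at posttest

t_res = stats.ttest_rel(pretest, posttest)        # paired-samples t-test, df = n - 1 = 28
w_res = stats.wilcoxon(pretest, posttest)         # nonparametric follow-up

print(f"paired t(28) = {t_res.statistic:.2f}, p = {t_res.pvalue:.3f}")
print(f"Wilcoxon W = {w_res.statistic:.1f}, p = {w_res.pvalue:.3f}")
```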

Table 2
Student Writing Performance at Pretest, Posttest, and Maintenance

Measure             Pretest M (SD)     Posttest M (SD)     Maintenance M (SD)
Words               95.07 (43.95)      131.24 (50.91)      116.55 (44.46)
Sentences           4.28 (3.51)        6.72 (2.51)         4.93 (2.93)
Transition words    0.59 (0.73)        6.14 (1.36)         1.93 (2.07)
Quality             3.66 (1.47)        5.79 (1.82)         4.45 (2.15)

Teachers also reported positive input shared by their students over the course of the study.  Teacher B said that the students “thought it was easier to write with it (the CBGO)…. Maybe the writing process felt easier to them?” 

Competing Agendas

A recurring theme across multiple data sources was a competing agenda. Video observations and teacher postinterviews illustrated that students needed additional instruction to learn the mnemonic and the components of the CBGO. Students also needed support when using the technology. However, data revealed that competing curricular needs overshadowed these student needs.

Specifically, the participating teachers regularly communicated the need to maintain the pace of instruction rather than altering their instructional timeline. For example, Teacher B commented in Planning Session 3: “[The CBGO instruction] either needs to be with the content or I can’t do this.” Therefore, the CBGO was used to reinforce content knowledge of social studies curriculum.

Additionally, 3 weeks into the study, the teachers said that they were behind schedule in providing instruction for an upcoming narrative prose project. Teacher A remarked that per the pacing guide,

[We were] supposed to be starting the GRASP (a narrative prose project), and we aren’t even starting that this week…. This [persuasive writing with text evidence] is supposed to be the big unit, and I feel like I’ve had five days on it.

Teacher B added remarks about the time-consuming standardized testing that took place at the start of the study. In the midst of competing agendas, teachers reportedly valued the CBGO in their instruction and for student learning. Teacher B even commented how it was helpful with her own organization of student work throughout the writing process:

I really liked that for the CBGO it was all in one place. So doing all these steps in one document – they could be saved, and I think that was helpful; whereas, often times, where we go through the writing process we’ve got several different handouts with different versions, and I can’t find it and all that stuff. So that was nice to have…

Nevertheless, teacher decisions were driven more by the pace of instruction: “The writing for next week is not persuasive…it’s narrative. So, I’m with you. I don’t think we will really have time to do centers or small group instruction [with the CBGO]…”

The overwhelming urgency to advance the curricular agenda was especially pronounced in the final planning session of Week 5, which occurred after the students and teachers returned from spring break. The discussion, held in a teachers’ lounge area, was informal compared to the previous sessions. The teachers questioned whether students were ready for the posttest after completing three practice essays with the CBGO.

Teachers were not observed discussing specific student performance or looking at students’ writing. Both teachers quickly indicated that they believed the students were not ready for the posttest. The teachers added, “It’s a shame that IDEAS is not stickier.” They wanted to provide students with another practice opportunity but struggled to determine the context of the final practice writing session. Unable to settle on a new writing prompt, they decided to have students revise a CBGO they had already started. This plan would save students class time, and the posttest could then be administered at the end of the week.

Discussion

In this case study, we sought to unveil how a pair of teachers planned for and integrated a technology-based tool into instruction for fourth and fifth graders. The teachers influenced student learning as they planned for and integrated the writing technology tool into a social studies unit on the American Revolution. Social studies education not only promotes college and career readiness but also allows students the opportunity to engage in perspective taking and social problem solving.

The teachers in this study used pedagogical strategies to promote students’ critical thinking, inquiry, and effective communication. Specifically, they taught their students to use a technology-based tool for writing and then asked their students to use the tool and literature about the American Revolution when composing a persuasive writing response to a social studies writing prompt. The latter represents the pedagogical technique of using technology so that students are engaging with materials and making an original written product (Hammond & Manfra, 2009). Integrating literacy practices in the elementary grades can support student learning in the content area standards and student performance on the National Assessment of Educational Progress (Heafner & Fitchett, 2018).  

Overall, the teachers were optimistic and excited for their students to experience the CBGO. In contrast to the teachers in Graham et al.’s (2014) survey, the teachers in this study were willing to integrate technology into their social studies lessons. They were able to articulate and plan for how the CBGO would merge into their current and future instruction, and they did not display any apprehension about using the technology in their instruction.

When students practiced using the CBGO, the teachers recognized student receptivity to the use of technology for writing as well as the problematic areas in which students needed support. Similar to the case study of exemplar social studies teachers (Curry & Cherner, 2016), the teachers in this study valued how the technology could increase their own efficiency in providing feedback as well as student productivity.

The teachers also articulated considerations of when technology use might be more beneficial than paper and pencil, and vice versa. The CBGO was not just a paper-based graphic organizer in electronic form; the teachers recognized how using it improved their pedagogical strategies.

When viewed through the TPACK framework, areas of growth for the teachers were also observed. The teachers could have enhanced their use of technology and pedagogical techniques to support student learning. For example, teachers in this study recognized the technology features (e.g., organizer structure, drop-down menu of transition words, and copy and paste) that helped students to develop writing skills. However, use of additional verbal and visual features embedded in the tool (e.g., text-to-speech) may have provided further supports for students with disabilities and ELLs.

Additionally, the teachers wanted students to support their opinions with facts or evidence by using brief texts, including historical fiction. Engaging students in accessing historically relevant texts in the digital realm, as well as in hardcopy books, may have provided more varied sources of information and genres for students to draw from when developing their writing.

Using technology in this way goes beyond a simple replication of print materials and widens the resources available for students to gather evidence. This pattern of teachers not fully considering how technology connects to students’ needs or how it could be used to improve student outcomes is consistent with the literature (Leu et al., 2015).

Overall, data revealed that the teachers made decisions that seemed to embody preconceived assumptions about (a) how to mediate student use of the technology, (b) how students would engage with the technology, and (c) the effects of the technology tool on student learning (Judson, 2006).

Mediating Student Use of Technology

Mediating student use of technology is a part of both pedagogical and technological knowledge. First, the coteachers chose not to adjust instruction for the ways the users interacted with the technology. For example, the teachers elected not to teach or reinforce all aspects of the tool (e.g., text-to-speech), nor did they explicitly teach features of the technology when students were not using those features, were using them inaccurately (e.g., self-regulatory strategies, copy and paste, and capitalization), or were demonstrating a need to improve their writing performance. Students also displayed limited technology skills and struggled to recall the IDEAS strategy.

Rather than providing explicit instruction to teach the tool and the technology, the teachers prioritized providing students with opportunities to practice with the CBGO.  It was not clear why teachers did not provide the explicit instruction needed to address the observed student needs. Perhaps the teachers did not perceive this instruction as their role since these objectives were distant from any of their content curriculum objectives.

Teaching technology is a pedagogical shift (Glassett & Schrum, 2009), and “if the project requires the teacher to also cover new content or objectives, this increases the distance from current practice and, therefore, decreases the likelihood of success” (p. 32). Consistent with literature indicating that negative discussions can derail use of technology (Bauer & Kenton, 2005; Leonardi, 2009), the coteachers’ planning sessions became less formal over time, and the teachers readily discussed the inadequacies of the tool rather than ways to problem-solve.

Another explanation as to why the teachers decided not to mediate student use of the tool is their need to maintain the rapid instructional pace of the curriculum. During the 5-week study, the teachers needed not only to introduce the CBGO but also to have students write using text as evidence and to begin a unit on narrative prose. This external pressure to cover content negatively influenced the instructional time devoted to effective technology integration.

In hindsight, we should have considered how the brief 3-hour professional development (PD) might have foreshadowed the ways teachers may struggle to address both writing pedagogy and the technology. These aspects of integrating technology into instruction were not addressed during the PD or prior to implementation, and as a result, teachers’ preconceived notions of student technology skills potentially influenced their receptivity to the CBGO.

Student Engagement With the Technology

Consistent with prior research (Hutchison & Colwell, 2016; Hutchison & Woodward, 2018), this case study illustrates how teachers struggle to integrate technology in ways that fully align with curriculum standards and teacher goals. In this study, teachers decided to engage students with the CBGO by integrating the tool into the social studies curriculum intending for students to write an essay using facts from text as evidence to support their opinion. Specifically, teachers decided to focus on having students write about topics related to the American Revolution.

However, the students were first introduced to the CBGO with a persuasive writing prompt that did not require content background knowledge. Students were able to generate ideas based on experience. Although this content was not perceived as significant when the teachers planned for instruction, challenges surfaced during Lesson 1 and throughout the study.

The lack of “stickiness” of the IDEAS strategy was apparent for both teachers and students, perhaps because the project innovation introduced during the training and used throughout the study was specific to persuasive writing sans use of text to provide evidence. The IDEAS strategy for persuasive writing is taught with the following mnemonic: Identify your opinion, Describe three reasons, Elaborate with examples, Add transition words as you go, and Summarize.

When users employ text resources to provide facts or evidence to support a claim, the genre is known as argumentative writing.  For an argumentative essay, the IDEAS acronym is still used, but each letter represents a slightly different phrase: Identify your claim, Determine three facts, Elaborate with evidence, Add transition words as you go, and Summarize (see Boykin, 2015).  

This instructional shift was unforeseen, and when we recognized it, we decided not to provide an entirely new CBGO for implementation, out of respect for teacher and student time. Although we do not know whether the argumentative IDEAS strategy would have been “stickier” for the teachers and students, the latter acronym clearly aligns with the vernacular language used by the teachers throughout the study (i.e., evidence). In addition, “innovations that fit into an individual’s existing understanding or schema will be more easily adopted” (Straub, 2009, p. 631). As demonstrated in this case study, the finite characteristics of an innovation may have unintended consequences for implementation.
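
To make the distinction concrete, the two IDEAS variants described above can be thought of as interchangeable configurations of the same mnemonic. The sketch below is purely illustrative and does not reflect how the CBGO is actually implemented; it simply shows one hypothetical way a tool could store and switch between the persuasive and argumentative phrasings.

```python
# Hypothetical sketch: the two IDEAS variants as swappable configurations.
# This does not reflect the CBGO's actual implementation.
IDEAS_VARIANTS = {
    "persuasive": {
        "I": "Identify your opinion",
        "D": "Describe three reasons",
        "E": "Elaborate with examples",
        "A": "Add transition words as you go",
        "S": "Summarize",
    },
    "argumentative": {
        "I": "Identify your claim",
        "D": "Determine three facts",
        "E": "Elaborate with evidence",
        "A": "Add transition words as you go",
        "S": "Summarize",
    },
}

def mnemonic_prompts(genre: str) -> list[str]:
    """Return the step-by-step prompts for the selected writing genre."""
    steps = IDEAS_VARIANTS[genre]
    return [f"{letter}: {phrase}" for letter, phrase in steps.items()]

print("\n".join(mnemonic_prompts("argumentative")))
```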

Effects of Technology Tool on Student Learning

Regardless of teachers’ pedagogical decisions, student persuasive writing performance improved when the students used the CBGO. This finding suggests that barriers to effective integration should not necessarily prevent implementation.  Students were unable to recall the strategy accurately, but seemingly used the CBGO tool’s supports to successfully complete an organized persuasive essay.  

Findings from this case study regarding student outcomes build upon former research in several ways.  First, elementary students’ writing performance from this study was consistent with positive outcomes of previous research involving middle school students who used the CBGO in language arts classes (e.g., Regan et al., 2016a).  

As demonstrated in this study, when the CBGO was taken away during maintenance testing, students still demonstrated stronger writing performance than shown at pretest. This finding suggests that the practice opportunities with the CBGO provided by the teachers during social studies encouraged students to internalize the sequence of sentences and commit the transition words to memory.  

In addition, teacher interview data highlighted the appealing features of the CBGO, as identified by the teachers and the students. Specific attributes of student writing that persisted after students used the CBGO were also shared. These findings collectively suggest that the CBGO inherently scaffolded students’ needs when planning and composing a well-organized essay. This finding is particularly promising considering that the student participants at the bilingual school were largely ELLs.

Second, findings from this study confirmed that persuasive writing prompts specific to elementary social studies content can be used with the CBGO. This finding builds upon former research that demonstrated middle school students’ use of the mobile-based graphic organizer to respond to writing prompts that were aligned with seventh-grade social studies standards (e.g., “Write an essay about whether or not the Reconstruction Era would have been a good time for immigrants to settle in America”; Regan et al., 2017). Finally, this study suggests that the CBGO can be used without researcher-developed lesson plans, as were used in previous studies.

Fundamentally, this case study showed that teachers can take a technology tool and demonstrate effective curricular integration. The teachers connected the use of the technology to reading, writing, and social studies instructional goals. Given that testing and standards influence the time that teachers spend on nontested subject areas such as social studies, the initiative of the teacher participants is encouraging (Fitchett, Heafner, & Lambert, 2014).

In addition, the teachers in this study described how they used brief texts that were engaging and less overwhelming for struggling readers and how they modeled ways to generate and record the evidence (pros and cons) for argumentation. This practice is encouraged in the design of disciplinary literacy (Duhaylongsod, Snow, Selman, & Donovan, 2015). However, the coteachers reportedly did not use any instructional time to teach the students basic computer skills or how to use specific components of the technology tool.

Previous survey research of literacy and language arts teachers revealed that a key obstacle to teaching students basic computer skills is the lack of time to do so during a class period (Hutchison & Reinking, 2011). Throughout the case study, the dilemma of time consistently weighed on teachers’ decisions. The teachers struggled between the competing agendas of addressing student needs and meeting curricular goals.

It was unclear during the study whether the pressure to meet the demands of the curriculum was driven by the context of the school or by the teachers themselves. Regardless, the heightened demand of maintaining the pace of the curriculum should also mean addressing the digital technology skills that are mentioned in the anchor standards and in many individual grade-level standards of the Common Core State Standards. Hutchison and Colwell (2015) stated, “Teachers must be mindful of the explicit scaffolding that must accompany instructing students with digital tools…. Mini-lessons in using digital tools may be helpful and necessary” (p. 17).

Implications for Practice and Future Research

This case study helped to identify some broad considerations for teachers who use technology to improve student learning in the classroom and some considerations specific to the use of the CBGO. First, when bringing digital technology into the classroom, the teacher should identify an instructional goal and carefully select the digital tool(s) (Hutchison & Colwell, 2015).  The current study demonstrated the importance of identifying a clear instructional goal and carefully matching the technology to that goal.

The teachers in this study initially used the CBGO for persuasive writing to teach students to convince the reader of their personal opinion. They then had students practice with the CBGO in social studies and instructed students to use evidence from texts and their content knowledge to support an opinion, which is an argumentative essay. The teachers realized the mismatch of the tool with the objective and attempted to modify the embedded mnemonic. Intentional and iterative practice with the tool prior to instruction may have alleviated this misalignment and maximized opportunities for student learning.

Second, enabling teachers to customize the technology tool on demand may also be beneficial so that it best fits the individual’s instructional context. For example, if the teachers in this case study could have immediately adjusted the mnemonic in the CBGO to make it more memorable for them, they might have implemented it more effectively, or students’ writing performance might have been even greater.

An additional feature to consider embedding in the CBGO to support teacher analysis of student performance would be a customized reporting system that draws on the CBGO output to summarize individual student strengths and weaknesses and classroom-level instructional needs (as suggested in Cho & Wayman, 2014). Key elements influencing diffusion are the ease and compatibility of the innovation in the context of one’s classroom (Straub, 2009). When technology can provide such accessible, real-time data on student performance, it has a clear advantage over paper.
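
As a purely hypothetical illustration of such a reporting layer (not a feature of the CBGO, whose output format is not described here), the sketch below aggregates per-student writing metrics mirroring the measures in Table 2 into a simple class-level summary that flags students who may need reteaching.

```python
# Hypothetical sketch of a class-level report built from per-student writing metrics.
# The field names mirror the measures in Table 2; this is not the CBGO's actual output.
from dataclasses import dataclass
from statistics import mean

@dataclass
class StudentWritingRecord:
    student_id: str
    words: int
    sentences: int
    transition_words: int
    quality: float  # holistic quality score on whatever rubric is in use

def class_report(records: list[StudentWritingRecord], transition_goal: int = 5) -> dict:
    """Summarize classroom-level needs and flag students who may need reteaching."""
    flagged = [r.student_id for r in records if r.transition_words < transition_goal]
    return {
        "mean_words": round(mean(r.words for r in records), 1),
        "mean_quality": round(mean(r.quality for r in records), 1),
        "needs_transition_word_support": flagged,
    }

# Example with made-up students
records = [
    StudentWritingRecord("S01", words=120, sentences=6, transition_words=6, quality=6.0),
    StudentWritingRecord("S02", words=85, sentences=4, transition_words=2, quality=4.0),
]
print(class_report(records))
```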

Researchers suggest that teachers’ data-based decisions should be integrated into practice (Ingram, Louis, & Schroeder, 2004). Data-based decision making can not only improve the quality of student learning but can also help teachers to be more efficient with time. In Curry and Cherner’s (2016) case study, one of the exemplar social studies teachers had students log in to Schoology and complete assignments on their class website. The teacher then “helped students when requested, but otherwise sat at his desk grading submitted assignments from students in real time and calling them to his desk for one-on-one consultations” (p. 131).

Although providing such instant feedback is unique, responding to students’ writing performance data was not observed in the current study. Teachers’ instructional plans were more responsive to a predetermined instructional timeline rather than to students’ writing performance data. Student work products were not examined during the planning sessions to determine the next instructional decisions. Making instructional decisions on a limited number of observations is common practice (Ingram et al., 2004; Mandinach & Gummer, 2013) and, therefore, a critical area of concern for teacher preparation.

Also, Cho and Wayman (2014) pointed out that teachers’ data-based decisions depend on what teachers see as data, and that the “agency for change” in instruction rests in people, not in technology. The students in this study saved their final products and could print out the finished essay. Teachers in this study had access to the students’ writing, but the teachers did not refer to the written products specifically in teacher interviews beyond use of tracked changes for providing feedback. Specific data-use-related PD (e.g., Schildkamp & Poortman, 2015) may be a relevant element to include in PD of technology integration for teachers.

Finally, required prerequisite skills for using the technology innovation are important to understand; teachers and students must take the time to practice those skills. At the same time, simply by using technology, students can learn some of those basic skills through continuous exposure. For example, in our case study, while students struggled with basic word processing operations (e.g., copy, paste, and highlight), they were exposed to these skills and were able to use them successfully throughout the study.

More importantly, teachers need to reinforce the use of all features incorporated into a technology-based intervention based on students’ needs. Teachers need to understand how those features address students’ needs. The fact that our teacher participants did not focus on the self-regulated learning features and technological supports (e.g., audio comments) might have contributed to the decrease in students’ performance during maintenance. Current literature on PD and technology integration implies that teachers may benefit from using a facilitator to help teachers work together effectively in a professional learning community to integrate technology and to engage in discussions about technology, data, and student outcomes (Thoma, Hutchison, Johnson, Johnson, & Stromer, 2017).

Although this study provided evidence as to the constraints and outcomes of technology integration, it represents only one case and leaves more research questions to be answered. For example, did the curricular integration of writing argumentative prose in social studies during the practice sessions encourage student learning of the American Revolution unit? Also, a limitation of this study is that we did not capture teacher perspectives as to why the teachers did not emphasize all of the technology’s affordances or why they did not explicitly teach certain aspects of the technology tool.

More broadly, future research is needed to understand technology integration across varying platforms of technology and in content areas beyond writing. Another area of future research is to determine how to mitigate the dilemma of competing agendas that teachers are faced with daily and how to encourage teacher use of data to enhance instruction.

In order for students to meet the digital literacy demands of today’s classrooms, teachers need training, support, and practice in how to integrate innovations or technology-based tools effectively into the existing curriculum. Knowing the pitfalls ahead of time may be one way for teachers to mitigate the constraints of technology integration.

References

Barrow, E., Anderson, J., & Horner, M. (2017). The role of photoblogs in social studies classroom: Learning about the people of the Civil War. Contemporary Issues in Technology and Teacher Education, 17(4). Retrieved from https://citejournal.org/volume-17/issue-4-17/social-studies/the-role-of-photoblogs-in-social-studies-classroom-learning-about-the-people-of-the-civil-war

Bauer, J., & Kenton, J. (2005). Toward technology integration in the schools: Why it isn’t happening. Journal of Technology and Teacher Education, 13(4), 519-546.

Bell, L., & Bull, G. (2010). Digital video and teaching. Contemporary Issues in Technology and Teacher Education, 10(1), 1-6. Retrieved from https://citejournal.org/volume-10/issue-1-10/editorial/digital-video-and-teaching

Bhabha, H.K. (1994). The location of culture. New York, NY: Routledge.

Boon, R. T., Burke, M. D., Fore, C., III, & Hagan-Burke, S. (2006). Improving student content knowledge in inclusive social studies classrooms using technology-based cognitive organizers: A systematic replication. Learning Disabilities: A Contemporary Journal, 4, 1-17.

Boon, R. T., Burke, M. D., Fore, C., III., & Spencer, V. G. (2006). The impact of cognitive organizers and technology-based practices on student success in secondary social studies classrooms. Journal of Special Education Technology, 21, 5-15.

Bouck, E. C. (2016). A national snapshot of assistive technology for students with disabilities. Journal of Special Education Technology, 31, 4-13. doi: 10.1177/0162643416633330

Boykin, A. (2015). The impact of computer-based graphic organizers with embedded self-regulated learning strategies on the content area argumentative writing of typical and struggling writers (Doctoral dissertation). Available from ProQuest Dissertations and Theses Global database. (UMI No. 3720658)

Cho, V., & Wayman, J. C. (2014). Districts’ efforts for data use and computer data systems: The role of sensemaking in system use and implementation. Teachers College Record, 116(2), 1-45.

Curry, K., & Cherner, T. (2016). Social studies in the modern era: A case study of effective teachers’ use of technology and literacy. The Social Studies, 107(4), 123-136. doi: 10.1080/00377996.2016.1146650

Darling-Hammond, L., Zielezinski, M. B., & Goldman, S. (2014). Using technology to support at-risk students’ learning. Report retrieved from Stanford Center for Opportunity Policy in Education and the Alliance for Excellent Education website: https://edpolicy.stanford.edu/sites/default/files/scope-pub-using-technology-report.pdf

Duhaylongsod, L., Snow, C. E., Selman, R. L., & Donovan, M. S. (2015). Toward disciplinary literacy: Dilemmas and challenges in designing history curriculum to support middle school students. Harvard Educational Review, 85(4), 587-685.

Ertmer, P. A. (2005). Teacher pedagogical beliefs: The final frontier in our quest for technology integration. Educational Technology Research and Development, 53(4), 25–39.

Evmenova, A., & Regan, K. (2012). Project WeGotIT!: Writing efficiently with graphic organizers – Teachers integrating technology. Washington, DC: Technology and Media Services for Individuals with Disabilities—Stepping-Up Technology Implementation, U.S Department of Education, Office of Special Education and Rehabilitative Services.

Evmenova, A., Regan, K., Boykin, A., Good, K., Hughes, M., MacVittie, N., …, & Chirinos, D. (2016). Emphasizing planning for essay writing with a computer-based graphic organizer. Exceptional Children, 82(2), 144-169. doi: 10.1177/0014402915585483

 Fitchett, P. G., Heafner, T. L., & Lambert, R. G. (2014). Examining elementary social studies marginalization: A multilevel model. Educational Policy, 28(1), 40-68. doi: 10.1177/0895904812452998

Funkhouser, B. J., & Mouza, C. (2013). Drawing on technology: An investigation of preservice teacher beliefs in the context of an introductory educational technology course. Computers & Education, 62, 271-285.

Glassett, K., & Schrum, L. (2009). Teacher beliefs and student achievement in technology-rich classroom environments. International Journal of Technology in Teaching and Learning, 5(2), 138–153.

Gómez, M. (2015). When circles collide: Unpacking TPACK instruction in an eighth-grade social studies program. Computers in the Schools, 32(2), 278-299. doi: 10.1080/07380569.2015.1092473

Gorder, L. M. (2008). A study of teacher perceptions of instructional technology integration in the classroom. The Journal of Research in Business Education, 50(2), 63-76.

Graham, S., Capizzi, A., Harris, K., Hebert, M., & Morphy, P. (2014). Teaching writing to middle school students: A national survey. Reading and Writing, 27(6), 1015-1042. doi: 10.1007/s11145-013-9495-7

Groff, J., & Mouza, C. (2008). A framework for addressing challenges to classroom technology use. Association for the Advancement of Computing in Education Journal, 16(1), 21-46.

Hammond, T. (2014). Transforming the history curriculum with geospatial tools. Contemporary Issues in Technology and Teacher Education, 14(3), 266-287. Retrieved from https://citejournal.org/volume-14/issue-3-14/social-studies/transforming-the-history-curriculum-with-geospatial-tools

Hammond, T. C., & Manfra, M. M. (2009). Giving, prompting, making: Aligning technology and pedagogy within TPACK for social studies instruction. Contemporary Issues in Technology and Teacher Education, 9(2), 160-185. Retrieved from https://citejournal.org/volume-9/issue-2-09/social-studies/giving-prompting-making-aligning-technology-and-pedagogy-within-tpack-for-social-studies-instruction/

Heafner, T. (2004). Using technology to motivate students to learn social studies. Contemporary Issues in Technology and Teacher Education, 4(1). Retrieved from https://citejournal.org/volume-4/issue-1-04/social-studies/using-technology-to-motivate-students-to-learn-social-studies

Heafner, T. L., & Fitchett, P. G. (2018). US history content knowledge and associated effects of race, gender, wealth, and urbanity: Item response theory (IRT) modeling of NAEP-USH achievement. The Journal of Social Studies Research, 42, 11-25.

Hutchison, A., & Colwell, J. (2015). Bridging technology and literacy: Developing digital reading and writing practices in grades K-6. Lanham, MD: Rowman & Littlefield.

Hutchison, A., & Reinking, D. (2011). Teachers’ perceptions of integrating information and communication technologies into literacy instruction: A national survey in the U.S. Reading Research Quarterly, 46(4), 308-29.

Hutchison, A., & Woodward, L. (2014). A planning cycle for integrating technology into literacy instruction. Reading Teacher, 67(6), 455-64. doi: 10.1002/trtr.1225.

Hutchison, A., & Woodward, L. (2018). Examining the technology integration planning cycle model of professional development to support teachers’ instructional practices. Teachers College Record, 120(10).

Ingram, D., Louis, K. S., & Schroeder, R. G. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106, 1258-1287. doi: 10.1111/j.1467-9620.2004.00379.x

Judson, E. (2006). How teachers integrate technology and their beliefs about learning: Is there a connection? Journal of Technology and Teacher Education, 14(3), 581-597.

Leonardi, P. M. (2009). Why do people reject new technologies and stymie organizational changes of which they are in favor? Exploring misalignments between social interactions and materiality. Human Communication Research, 35, 407-441.

Leu, D. J., Forzani, E., Rhoads, C., Maykel, C., Kennedy, C., & Timbrell, N. (2015). The  new literacies of online research and comprehension: Rethinking the reading  achievement gap. Reading Research Quarterly, 50(1), 37–59.

Mandinach, E. B., & Gummer, E. S. (2013). A systematic view of implementing data literacy in educator preparation. Educational Researcher, 42(1), 30-37. doi: 10.3102%2F0013189X12459803

McCarthey, S. J., Kennett, K., Smith, A., & West, A. (2017). Facilitating students’ stances toward technology-enhanced reading and writing in the classroom. Journal of Literacy and Technology, 18(2). Retrieved from http://www.literacyandtechnology.org/uploads/1/3/6/8/136889/jlt_v18_2_mccarthy_kennett_smith_west.pdf

Merriam, S. B. (2009). Qualitative research: A guide to design and implementation. San Francisco, CA: Jossey-Bass.

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017-1054.

Mumtaz, S. (2000). Factors affecting teachers’ use of information and communications technology: A review of the literature. Journal of Information Technology for Teacher Education, 9(3), 319-342. doi: 10.1080/14759390000200096

Regan, K., Evmenova, A., Boykin, A., Sacco, D., Good, K., Ahn, S. Y., MacVittie, N., & Hughes, M. D. (2016a). Supporting struggling writers with class-wide teacher implementation of a computer-based graphic organizer. Reading and Writing Quarterly: Overcoming Learning Difficulties, 33(5), 428-448. doi: 10.1080/10573569.2016.1221781

Regan, K., Evmenova, A. S., Kurz, L. A., Hughes, M. D., Sacco, D., Ahn, S. Y., …, & Chirinos, D. S. (2016b). Researchers apply lesson study: A cycle of lesson planning, implementation, and revision. Learning Disabilities Research and Practice, 31(2), 113-122. doi: 10.1111/ldrp.12101

Regan, K., Evmenova, A., Good, K., Leggett, A., Ahn, S. Y., & Mastropieri, M. (2017). Persuasive writing with mobile-based graphic organizers in inclusive classrooms across the curriculum. Journal of Special Education Technology, 33(1), 3-14. doi: 10.1177/0162643417727292

Rogers, E. M. (2003). Diffusion of innovations (5th ed.). New York, NY: Free Press.

Schildkamp, K., & Poortman, C. (2015). Factors influencing the functioning of data teams. Teachers College Record, 117, 1-42.

Shanahan, T., & Shanahan, C. (2008). Teaching disciplinary literacy to adolescents: Rethinking content-area literacy. Harvard Educational Review, 78(1), 40-59. doi: 10.17763/haer.78.1.v62444321p602101

Shriner, M., Clark, D. A., Nail, M., Schlee, B. M., & Libler, R. (2010). Social studies instruction: Changing teacher confidence in classrooms enhanced by technology. The Social Studies, 101(2), 37-45.

Straub, E. T. (2009).  Understanding technology adoption: Theory and future directions for informal learning. Review of Educational Research, 79(2), 625-649.

Thoma, J., Hutchison, A., Johnson, D., Johnson, K., & Stromer, E. (2017). Planning for technology integration in a professional learning community. The Reading Teacher, 71(2), 167-175. doi: 10.1002/trtr.1604

Waters, S., Kenna, J., & Bruce, D. (2016).  Apps-olutely perfect! Apps to support common core in the history/social studies classroom. The Social Studies, 107(3), 1-7. doi: 10.1080/00377996.2016.1149046

Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Thousand Oaks, CA: Sage.


Appendix A
Completed Computer-Based Graphic Organizer



Appendix B
Curriculum Integration by Common Core State Standards (CCSS) and District Social Studies Teaching Objectives

Group              n    Original GO M (SD)   Edited GO M (SD)   Mean Gain   Within-Group ES   Between-Group ES
Overall
  Treatment        40   15.96 (1.99)         19.15 (1.12)       +3.19       1.96              2.27
  Control          36   15.65 (1.87)         15.65 (1.87)        0.00       0.00
By Content Area
Science
  Treatment         5   14.70 (1.99)         19.00 (1.00)       +4.30       2.47              1.88
  Control           2   15.00 (2.83)         15.00 (2.83)        0.00       0.00
Social Studies
  Treatment         8   14.50 (1.58)         18.75 (1.39)       +4.25       2.70              2.07
  Control           8   15.19 (1.83)         15.19 (1.83)        0.00       0.00
English
  Treatment         5   17.10 (2.07)         19.40 (0.89)       +2.30       1.30              2.54
  Control           5   16.00 (1.46)         16.00 (1.46)        0.00       0.00
Math
  Treatment         2   16.50 (0.71)         20.00 (0.00)       +3.50       3.98              NA
  Control           1   17.00 (0.00)         17.00 (0.00)        0.00       0.00
Music
  Treatment        10   16.60 (2.08)         19.20 (1.23)       +2.60       1.46              1.70
  Control           7   15.79 (2.55)         15.79 (2.55)        0.00       0.00
World Language
  Treatment         6   16.25 (2.14)         19.33 (1.21)       +3.08       1.64              2.69
  Control           6   14.83 (1.81)         14.83 (1.81)        0.00       0.00
Art
  Treatment         2   16.00 (0.71)         18.00 (0.00)       +2.00       2.28              1.03
  Control           3   16.00 (1.73)         16.00 (1.73)        0.00       0.00
CSD
  Treatment         2   17.50 (0.71)         20.00 (0.00)       +2.50       2.85              2.20
  Control           4   16.88 (1.31)         16.88 (1.31)        0.00       0.00
Note. GO = graphic organizer; ES = effect size; n = number; M = mean; SD = standard deviation; NA = not applicable; CSD = communication sciences and disorders.
