Friedman, A. (2006). State standards and digital primary sources: A divergence. Contemporary Issues in Technology and Teacher Education [Online serial], 6(3). https://citejournal.org/volume-6/issue-3-06/social-studies/state-standards-and-digital-primary-sources-a-divergence

State Standards and Digital Primary Sources: A Divergence

by Adam Friedman, University of North Carolina at Charlotte

Abstract

The past decade has witnessed a proliferation of state standards and associated tests in public schools, intended to hold schools, teachers, and students accountable for learning outcomes and achievement. This qualitative study of eight world history and world geography teachers examined the degree to which the Virginia Standards of Learning (SOLs) influenced the use of digital primary source materials. The SOLs had a negative influence on digital primary source use because of the large amount of material to be covered for the test, the test’s focus on fact-recall, and the intense pressure, noted by the majority of teachers, for their students to pass the test. Each finding, as well as its implications, is discussed.

In an effort to hold schools, teachers, and students accountable for learning results, the past decade has witnessed a movement toward standardized testing in public schools (Savage, 2003). In the mid-1990s, the Commonwealth of Virginia created specific content-based Standards of Learning (SOLs) for which each student would be responsible, as well as accompanying high-stakes end-of-course tests.

In secondary social studies, at the conclusion of each history course students are assessed on a 65- to 70-question, multiple-choice test that emphasizes fact-recall of content included in the standards. Although Virginia teachers’ pay is not tied to their students’ results, achieving a passing score is vital to both students and schools. Students who pass an exam receive what is termed a “verified” credit. In order to graduate from high school, students must receive at least one verified credit in history and social science (Virginia Department of Education, n.d.-a).

Additionally, in order for a school to become fully accredited by the state, a minimum of 70% of students must receive a passing grade on this exam (Virginia Department of Education, n.d.-c). The preponderance of specific standards and their accompanying high-stakes tests is not limited to the state level; in 2002, President Bush signed the No Child Left Behind Act (NCLB), which calls for annual testing of students in order to ensure that “those responsible [for education] are held accountable for producing results” (NCLB, 2002, Executive Summary section, ¶ 8).

Promises and Provisos for State Standards

Fueled by scathing indictments of U.S. schools, such as the 1983 report A Nation at Risk (National Commission on Excellence in Education, 1983), advocates of state standards and their associated tests have argued that public schools in the United States have graduated students who are ill-equipped to enter either the workforce or postsecondary education (Finn, 2002). A way to alleviate this problem, supporters have argued, is through the development of specific state standards (Finn, 2002; Ravitch, 1996). Ravitch (1996) argued that by establishing specific standards and a means of assessing them, both teachers and students would have “clear expectations” of what is necessary for success in school and, as such, academic achievement would rise for all students (¶ 3). Alongside these expectations, Finn (2002) contended that standardized tests would motivate students, encouraging them to “take their studies more seriously” (p. 11).

Advocates of specific standards point out that standards level the playing field in terms of educational opportunity by holding all students responsible for the same content and high expectations (Ravitch, 1996). Additionally, standardized testing offers an opportunity for outsiders to measure student achievement (Phelps, 1999).

In order to “represent a broad consensus” of what various stakeholders “believe schools should teach and students should learn,” Virginia instituted the SOLs in 1995 (Virginia Department of Education, n.d.-b, para. 2). The adoption of these standards, however, was not without controversy. Fore (1998) argued that two camps with diametrically opposed viewpoints disputed how the standards should be constructed, and that their dispute reflected a dichotomy in “the purpose of social studies education” (p. 559). Fore described this discord as a philosophical schism between those who view social studies as a static subject in which names, dates, and historical figures are recalled, and those who favor teaching a more abstract subject “organized around concepts” (p. 12), with an overarching goal that reflects an ambition of the National Council for the Social Studies (1994): to prepare students to become “active, informed citizens” (p. vii).

For a variety of reasons, however, the former camp prevailed in the Commonwealth of Virginia (Fore, 1998). To ensure that each student had met these standards, by 1998 high-stakes tests (with possible negative repercussions “for individual students and their schools”) had been created and implemented (Duke & Reck, 2003, p. 46). Indeed, 4 years after the adoption of these standards, Christie (1999) reflected that the “guiding vision” behind them had been “to raise student achievement through accountability for results” (p. 32).

Researchers have noted that standardized tests such as the Virginia SOLs have affected the teaching of all subjects and grade levels. These examinations are often given in a multiple-choice format and test at a low level of Bloom’s Taxonomy, rather than emphasizing higher order thinking skills (Maddux, 1998; Pahl, 2003). Additionally, the presence of standardized tests has led teachers to “teach to the test” (Meier, 2002, p. 195). Meier (2002) argued that teaching to the test is problematic in that it can lead to a prescribed way of teaching, in which little deviation is allowed and topics are covered on a cursory level to ensure that the entire curriculum is covered.

Teaching to the test is a particularly acute problem in social studies because the subject matter itself can be controversial, even resulting in differences of opinion about what is right and wrong (Meier, 2002). Writing test questions on subject matter for which there is no consensus correct answer is problematic; in order to ensure the reliability of the test itself, the questions must be “simple, straightforward, and factually based” (Pahl, 2003, p. 13). However, this manner of teaching social studies, in which students memorize and restate facts, is inconsistent with current literature on effective history teaching, which supports teaching social studies through the development of historical understandings (Levstik & Barton, 2001; Wineburg, 2001).

Historical Understandings

Historical understanding is not merely the recall of names and dates; rather, it is a higher order thinking process in which students examine and subsequently interpret different perspectives and historical contexts (Grant, 2003; National Center for History in the Schools, 2005; VanSledright, 2004). Students engage in historical thinking, which as Greene (1994) noted, is “an act of judgment made on the basis of historical evidence” (p. 92). Among these forms of evidence are primary sources, which Poulton (1972) described as firsthand accounts of historical events.

When students analyze and interpret historical primary sources, they should be taught to think as historians do (Cremer, 2001; Kobrin, 1996; Levstik & Barton, 2001; VanSledright, 2002). Since the advent of the World Wide Web, primary sources have become increasingly accessible in digital form (VanFossen & Shiveley, 2000; Warren, 2001). The use of the Internet for teaching and learning in social studies has been widely acclaimed. Braun and Risinger (1999) referred to the Internet as a “truly revolutionary development,” offering “access to library catalogues and historical archives” (p. 7). Additionally, an Internet connection can be found in virtually every school in the United States (Tabs, 2003).

Because of this potential to increase student understanding and achievement in social studies, a number of teacher educators have endorsed the infusion of technology into social studies methods courses (see, e.g., Diem, 2002; Martorella, 1997). Advocates of technology integration in social studies have suggested that technology should be used within the context of social studies content and should be used in such a way that it allows teachers and students to accomplish instructional objectives they would otherwise not be able to accomplish (Mason et al., 2000). Digital primary historical sources fulfill both of these conditions.

Digital primary historical sources allow students to analyze documents of the past and draw their own conclusions, tasks which the National Center for History in the Schools (2005) deems essential for historical understanding. These higher order thinking tasks can be challenging, as they force students to employ “a complex regimen of investigative techniques” in order to reach their own conclusion of how past events transpired (VanSledright, 2002, p. 6). However, the development of these skills is by no means impossible (Kobrin, 1996; Levstik, 1997; VanSledright, 2002).

Higher Order Thinking and Standards of Learning

The importance of students developing historical understandings through the analysis and interpretation of primary sources is reflected in the Virginia SOLs. The first standard for both World History I (world history until 1500) and II (world history post-1500) states that “the student will improve skills in historical research and geographical analysis by identifying, analyzing, and interpreting primary and secondary sources to make generalizations about events and life in world history” (Virginia Department of Education, 2001). The World Geography standards contain a focus on analysis and evaluation rather than memorization. The standards call for students to “think geographically” as they “analyze past and present trends in human migration and cultural interaction as they are influenced by social, economic, political, and environmental factors” (Virginia Department of Education, 2001, p. 32).

Although the Virginia SOLs emphasize the development of higher order thinking processes and skills among students, McMillan, Myran, and Workman (1999) reported that the standards impacted teachers’ instructional practices in such a way that they may have decreased this type of instruction. Many teachers in this study felt compelled to teach what was perceived to be a large amount of content on the specific SOLs, limiting instruction that focused on higher order thinking skills, such as student analysis of documents and generation of their own knowledge. Teachers were also concerned about covering all of the mandated content in the standards. As one social studies teacher commented, “The history SOLs are so numerous and detailed and they heavily emphasize rote memorization of facts. I fear that for my students to do well on the test I will have to forego teaching critical thinking processes” (McMillan et al., 1999, p. 7).

The effort among states to establish specific standards and tests is not limited to Virginia. In the mid-1990s, the New York State Education Department transformed its social studies curriculum in order to both increase student achievement and establish a method by which schools may be held responsible for student learning (New York State Education Department, 2004a). The New York State Social Studies Core Curriculum calls for students to develop historical understandings, stating that students “should learn to consult databases and a wide variety of primary sources,” as well as “take and defend positions on past and contemporary issues” (New York State Education Department, 2004b, p. 4). As part of their end-of-course test, students in New York are given a document-based essay question in order to “test [students’] ability to work with historical documents” (New York State Education Department, 1999, p. 18).

Numerous researchers have noted that although the use of primary sources is encouraged in state standards, this encouragement does not necessarily translate into classroom practice. In a study of global history teaching in New York, Grant et al. (2002) found that one teacher incorporated primary sources and historical thinking skills into her curriculum almost “in spite of” the state standards, rather than because they were mandated. The teacher consistently attempted to balance what she perceived to be conflicting demands: the state-mandated curriculum and her personal beliefs about teaching history. In her view of history teaching, there is no one correct, absolute answer, and in this spirit she directed students to analyze and interpret primary sources. However, when she utilized primary sources in the way she thought “the state wants to use them,” she kept the focus much narrower, as there was “a pretty specific answer to the question that goes along with the document” (Grant et al., 2002, pp. 244-245).

The Virginia SOLs clearly support the development of historical thinking skills among students, but unlike New York’s assessment that contains a writing component, students in Virginia are given only a multiple-choice exam. The seeming incongruence between the standards and the manner in which students are tested informs this research study. The research questions in this study seek to discover how standards and associated tests influence teachers’ encouragement of historical understanding through the use of primary sources. Specifically, the research focused on the extent to which world history and world geography teachers are influenced by the Virginia SOLs and/or the end-of-course test in terms of their use of digital primary source materials.

Methods

As part of a larger qualitative study that investigated world history and world geography teachers’ beliefs and practices regarding using digital primary sources and the factors facilitating or inhibiting their use (Friedman, 2004), participants were asked to describe the extent to which the Virginia SOLs and the associated tests impacted their instruction.

The crux of qualitative research is the in-depth study of a small group of people in their natural setting, as opposed to a more superficial study of a large group (Miles & Huberman, 1994). For this study, eight world history and world geography teachers were selected from a pool of 40 teachers who completed a survey, organized around the Virginia SOLs, regarding their frequency of primary source use (on a scale of 0-3) and their method of acquiring primary sources. Teachers who specified that they acquired primary sources from the Internet for a particular SOL were regarded as digital primary source users for that specific standard. From these data, three high-frequency digital primary source users (those who used digital primary sources on at least 90% of the SOLs), three low-frequency users (15% or less), and two middle-frequency users (16-89%) were chosen to be interviewed and observed. These eight participants came from five schools (two rural, two suburban, and one urban). The breakdown of teachers, their rates of digital primary source use, and their schools is provided in Table 1. The names of all teachers and schools are pseudonyms.

Table 1
Teachers’ Self-Reported Use of Digital Primary Sources

Digital Primary Source Use | Teacher | School | Type of School
HIGH | Mr. Lukas | Mountainview High School | Suburban
HIGH | Mr. Clark | Lakefront High School | Suburban
HIGH | Ms. Pullen | Eastside High School | Urban
MIDDLE | Mr. Wagner | Riverside High School | Rural
MIDDLE | Ms. Lewin | Lakefront High School | Suburban
LOW | Mr. Mancuso | Riverside High School | Rural
LOW | Mr. Mitchell | Eastside High School | Urban
LOW | Ms. Mather | Plains High School | Rural
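
To make the selection rule concrete, the short Python sketch below shows how survey responses could be sorted into the three frequency categories described above. It is an illustration only, not part of the original study’s analysis; the teacher names and percentages are invented, and only the thresholds (at least 90%, 15% or less, and 16-89%) come from the selection criteria stated in this section.

# Hypothetical illustration of the participant-selection rule; names and
# percentages are invented for demonstration purposes only.

def classify(pct_online: float) -> str:
    """Return the usage category for the percentage of SOLs (0-100) on which
    a teacher reported acquiring primary sources from the Internet."""
    if pct_online >= 90:
        return "HIGH"
    if pct_online <= 15:
        return "LOW"
    return "MIDDLE"  # 16-89%

# Example (fictional) survey summaries.
survey = {"Teacher A": 95.0, "Teacher B": 40.0, "Teacher C": 10.0}

for teacher, pct in survey.items():
    print(f"{teacher}: {classify(pct)}-frequency digital primary source user")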

The study was qualitative in nature and based on Erickson’s (1986) method of analytic induction. The researcher was the instrument, and data consisted of interviews, observations, field notes, and document analysis, with the ultimate goal of creating a believable account of what transpired in each teacher’s classroom, established not with absolute statistical proof but with “plausibility” (Erickson, 1986, p. 149). Erickson deemed having more than one type of data essential: any one type of data may be imperfect, but the combination of multiple types of data, or “triangulation,” yields more trustworthy results (p. 140).

Each teacher participated in one in-depth (1.5-hour) interview, a portion of which focused on the influence of the Virginia SOLs. The interview protocol was based on Patton’s (1990) general interview guide approach, in which a “list of questions or issues that are to be explored in the course of an interview” is generated (p. 283). The guide helped ensure that the information gathered from different participants was similar, yet allowed the researcher the flexibility to, as Patton (1990) put it, “explore, probe, and ask questions that will elucidate and illuminate that particular subject” (p. 283). For a complete list of the questions in the formal interview that pertained to the Virginia SOLs, see the appendix.

Each teacher also participated in three informal (15-20 minute) interviews and unrestricted, impromptu observations during which archival evidence (such as lesson plans, student handouts, and specific Web sites used) was collected. These data were used to generate an assertion about the effect of the Virginia SOLs on digital primary source use among world history and world geography teacher participants.

To generate this assertion, data were analyzed throughout the study. Erickson (1986) described the past as having an effect on the present, and this notion held true during data collection. In order to account for the researcher’s presence and for changes in the research context resulting from data collection, analytic memos were written from the earliest points of data collection and called upon later to assist in data analysis (as recommended by Maxwell, 1996).

Results

Teacher perceptions of, and concerns about, the Virginia SOLs and their associated end-of-year tests had a direct influence on instructional decision making. This effect manifested itself in two fundamental ways. First, each teacher perceived digital primary source use (and engaging students in historical thinking) as more time consuming than other methods of instruction; because the SOLs cover an enormous amount of material, these world history and world geography teachers did not believe they had the instructional time to explore any one area of the standards in depth. Second, the SOLs’ associated tests contain questions at the most basic fact-recall level, and the participant teachers did not view such assessments as compatible with the historical thinking skills fostered by the use of digital primary sources. Exacerbating this dichotomy was the intense pressure felt by nearly all of the teachers for their students to achieve a passing grade on the exam.

Limitations on Instructional Time

The state of Virginia offers two world history courses – one that covers events until 1500 and the other from 1500 to the present – as well as a world geography course that covers each of the world’s regions in terms of physical and cultural geography. Participants described a tremendous time crunch resulting from the sheer number of topics to be covered in these courses. Mr. Wagner, who teaches at Riverside High School, exemplified these concerns about time when he said that in order to cover all of the required material before the SOL test, he has to “find the quickest, best way to get something across.”

This was further illustrated by Plains High School’s Ms. Mather who, although she would have preferred to go into great detail about certain topics and use student-centered lessons and activities, no longer taught this way because of the time pressure she perceived from the SOLs. She stated, “We have to rush the kids and we don’t have time for certain activities,” and consequently altered her teaching style to place a greater emphasis on lecturing about factual content. Observations reflected this focus on lecture: neither of the two classroom visits to Ms. Mather included any type of inquiry or discovery learning; rather, each consisted of teacher-centered lectures and worksheets.

Ms. Pullen of Eastside High School also stated that as a result of the time pressure she felt the SOLs exerted, she used fewer digital primary sources. In her opinion, it was time consuming for students to look at digital primary sources on the Internet, and although it might be a more worthwhile learning experience for her students, the time pressure she believed she faced with the SOLs made her less likely to utilize these sources in her instruction. As she put it,

[Digital primary sources] involve them diving into the material and learning a lot of facts that are not going to be on the SOL, and that’s all time consuming, and we’re under such a time constraint that we can’t spare a 90-minute block for them to go on to a Webquest on World War I and have them learn all of these interesting facts about lice and rats. They have to know the Treaty of Versailles, the League of Nations, the 14 Points and all that, so I think we are all discouraged from letting them “surf around” the Internet, even with a structured Webquest. It just takes too much time from instruction.

The message in this quote is clear: Because there are so many facts on the SOL test, students need to be instructed, or told, about those facts. The large number of facts that are tested means, at least for teachers like Ms. Pullen, that there is not enough time for students to discover those facts on their own in a manner that the use of digital primary sources would facilitate.

In addition to teaching world geography, Mr. Mancuso coaches football, and he uses football terms to describe his approach to teaching world geography. In order to cover all of the required material in as little time as possible, he uses what he referred to as the “bump and run” method:

I call it the bump and run. I see some things that are wrong with that because it doesn’t give you a chance to really just focus in a particular area of study and really let your students investigate something. You may take a week, week and a half to do that. You really can’t veer off course too much. Instead, you’re bumping and running. Bump and run. Bump and run. When we’re done with this, let’s move on to the next one.

Mr. Mancuso’s “bump and run” teaching method entailed covering a content area quickly and superficially and then moving on to the next area in order to ensure that all of the material was covered. This approach was reflected in both observations of Mr. Mancuso, as he taught a teacher-centered class in which, in his words, he “made sure to cover all of the objectives from the SOL.” Again, the message is clear: there is no time for unstructured, discovery learning in Virginia classrooms because they are driven by standards-based assessments.

The Power of the Test

The SOL test is composed of multiple-choice questions that test students’ knowledge of historical facts, an assessment type at the lowest level of Bloom’s taxonomy of educational objectives. Not one of the participants expressed a belief that the test encouraged students to think on a higher level. As Mr. Clark said, “I just think the SOLs test rote knowledge.” It was Mr. Lukas’s view that “in reality, the SOLs don’t assess historical thinking.” This sentiment was echoed by Ms. Mather, who did not believe that “the SOLs are set up to let kids get past anything but memorization, so the part of learning that gets you to think is not really there.” Each of these teachers described the tests as inconsistent with thinking historically and with using digital primary sources.

Mr. Wagner, a world geography teacher, noted that the rote learning required for students to pass the SOL test had “steered [him] away from primary sources” of all types. There were more in-depth topics and learning activities that he would have liked his students to engage in, but, as Mr. Wagner put it, “they’re only tested on the basic information.” The resulting focus on low-level knowledge was reflected in each observation of Mr. Wagner’s classes: he typically gave his students low-level worksheets focused on geographic terms, which he later explained he did in order to “make sure they’ve heard of these words for the SOL.”

Ms. Pullen noted that since her students were responsible for rote knowledge, her assessment procedures likewise focused on rote learning. One of her instructional approaches was to show students digital primary source images from different time periods to supplement her instruction; she believed that her students would be able to “put a name with a face” and thus be better able to understand and remember the material. When she tested students, however, the images were never included. She explained this by saying that “when they sit down and take that standardized test, it’s all black and white—it’s all fact-specific.”

Too Much Heat and Not Much Light

The vast majority of the participants described an enormous amount of pressure from the SOL tests. Mr. Clark succinctly summarized common feelings on this topic, as he said, “The SOLs are the most important thing regardless of what we are told, and there is a lot of pressure.” This feeling was also expressed by Ms. Mather, who believed that her school placed such an emphasis on this exam that teachers were not “encouraged to do much of anything…except pass the SOLs.”

None of the schools in this study required specific pass rates for individual teachers, but all of the schools reported SOL test results by teacher. The implicit message of this reporting was that teachers were responsible for their students’ passing the test and that student performance was a measure not only of the students’ knowledge but also of the teacher’s ability. Mr. Clark noted this pressure and stated that he felt his skills as a teacher were being judged by his superiors:

When we get SOL scores back, every student is tagged with a teacher’s name beside that student, so it is pretty clear to me, I know every year how my SOL scores are, which means I am pretty sure that even if my colleague down the hall doesn’t know, that my department chair knows, and the principal knows, and the guidance staff knows.

Although he thought he was being judged, Mr. Clark admitted to being fortunate to have taught advanced-level classes for the past few years. Mr. Clark’s students had higher passing rates than did students in standard-level classes. It was his opinion that if he taught a standard-level class and taught in the same way as he does an advanced class, “instead of having a 90% pass rate I could have a 50% pass rate.” However, he did not think that his superiors would recognize the difference in student achievement level. When Mr. Clark imagined his SOL pass rate dropping to 50%, the first words out of his mouth were “you know I would take heat from that.” This pressure clearly discouraged his use of digital resources.

Although Mr. Clark was a high-frequency user (similar to Ms. Pullen, he showed his class digital images on a near daily basis and did so during each observation), he would have liked to have had his students create Web-based representations of their work with digital resources. Such work would have taken 2 days of instructional time. He thought that this time spent would be worth it, but was reluctant because “if my kids come back with low SOL scores [they will say] ‘You spent how many days with [computers]?’” When Mr. Clark did use digital resources, they were teacher-centered activities in which he displayed images on a computer projector for his students to see.

Ms. Lewin, who also teaches at Lakefront High School but teaches students with a lower achievement level, noticed similar pressure. Much of the pressure was due to the fact that she “know[s] they break down SOL scores by teacher.” There was another teacher in her school who taught a similar group of lower achievement level students, and she thought that it would be “only natural for them to compare us and to see how one person’s doing over another regardless of the type of class that you end up having.” This directly impacted her teaching.

Although Ms. Lewin realized that she could not control who was in her class, she could control how it was taught. As a result of the pressure to cover content in the SOLs, Ms. Lewin admitted to having a “tendency to teach toward the test a little more than I would like.” Her teaching was characterized by worksheets that contained vocabulary words. In one lesson Ms. Lewin gave students a worksheet with terms from the Roman Empire, telling the class to “make sure you know these.” After each class she mentioned that she gave students the worksheets in order to “prepare them for the SOL.”

Student performance on the SOL test was not tied to a review of each teacher’s job performance, per se; however, it was clear that individual teachers sensed pressure for their students to perform at a high level on this test. This pressure was not unique to any particular high school, as all participants described feeling some degree of pressure associated with this test.

The pressure that teachers faced did not always come from their supervisors. Instead, teachers seemed to sense more indirect pressure from their desire to be a “successful teacher,” which they appeared to define as guiding students toward passing grades on tests. Mr. Mancuso, a low-frequency user of digital resources, reported pressure “for kids to pass the test, and to teach what will be asked on those SOL tests.” Ms. Pullen, who was on the opposite end of the spectrum in terms of digital primary source use, expressed similar beliefs. She felt, in her words, “under so much pressure to get them to pass the test, [as] they have to have verified credits.”

Some teachers felt more direct pressure to make sure their students passed the SOL test. Mr. Mitchell of Eastside High School and Mr. Lukas of Mountainview High School both described feeling administrative pressure to cover all of the SOL material, but they did not experience the same indirect pressure regarding their students passing the test. This appeared to be related to the fact that their students were at the extremes of the tracking system: Mr. Lukas taught honors students, while Mr. Mitchell taught a class of lower achieving students. Because their students fell at the two extremes, neither thought that his teaching would make a difference in student SOL scores. Mr. Lukas indicated that he was “not worried about these kids—they’re going to do just fine on it, despite what I do in class.” Mr. Mitchell expressed a similar sentiment, saying that he was “not worried about their performance because I think their problem isn’t so much their knowledge of history, it is just their functional skills need so much work.”

The Virginia SOLs served as a major barrier to using digital resources for the world history and world geography teachers in this study. Due to the large amount of material included in world history and world geography standards, the teacher participants in this study felt rushed in terms of instructional time. They did not believe they could explore any one area of history or geography in depth. As a result, they were less likely to use digital resources. The SOL tests’ emphasis on basic knowledge led the teachers in the study to use digital resources less frequently. They did not think that their use would necessarily translate into higher scores for their students. In general, the majority of teachers described a tremendous amount of pressure to ensure that their students passed these exams.

Discussion, Implications, and Conclusions

This study has implications for both teacher educators and policy makers. Teacher educators must prepare future teachers for the realities of the classroom while simultaneously teaching effective methods for encouraging historical thinking among students. The results are also significant for policy makers, given that standards- and test-driven teaching negatively influenced teachers’ use of digital resources.

Preparing Highly Qualified Teachers

With the recent passage of NCLB legislation, it is doubtful that state standards and their associated exams will be going away anytime soon; in fact, the future could bring increased pressure on teachers and schools to achieve the “adequate yearly progress” (NCLB, 2002) required by NCLB. Teacher education programs thus have an arduous task: they must prepare highly qualified social studies teachers whose students will be successful on standardized tests, while at the same time teaching preservice teachers how to encourage historical thinking among their students. Such divergent teacher education goals might be accomplished in a number of ways.

Merryfield (1997) described an activity in which preservice teacher education students read opposing accounts (one from the African perspective, another from the White perspective) of a historical encounter in the late 19th century. As students began to discuss what happened, they initially considered only one perspective. As the discussion unfolded, it became apparent that there was more than one point of view on what had transpired over 100 years ago. When students came to this realization, they could begin to appreciate the ambiguities of the past.

This activity (or one similar) could be modified by encouraging preservice teachers to find and utilize digital primary historical sources and create questions and activities based on specific state standards. The primary sources themselves can likely be found online. Teacher education programs must teach future teachers how to use content-specific resources directed at promoting historical thinking and understandings associated with content highlighted on standardized tests.

Policy Makers

Although it is likely that standards for learning and their related tests will be in place for the foreseeable future, policy makers should recognize the incongruence between the standards-based exams and best practice in teaching history. A possible compromise between advocates and detractors of state standards and their associated high-stakes tests would be to divide the exam into two parts: the first in a multiple-choice format and the second focused more on historical thinking skills. The New York State Regents exam is composed in this manner, with students writing two essays: one on a theme in history and another based on historical documents (Grant et al., 2002). If this change were made, digital primary source use among social studies teachers in Virginia would likely increase. Although Grant et al. (2002, p. 245) illustrated that teachers might use primary historical documents in their own way “in spite of” the state standards, any association between primary historical documents, historical thinking skills, and state-mandated exams is encouraging.

Limitations

Although this was an in-depth study of eight world history and world geography teachers in terms of state standards and their associated tests, it does have limitations. First, because of the small sample size, the findings may not be generalizable to all teachers. Second, because the preponderance of the data were gathered through interviews, they reflect these teachers’ own perceptions rather than a more quantifiable measure. Finally, because this study is rooted in the Virginia Standards of Learning for World History and World Geography, the outcomes may not be applicable to teachers in other states, or even to other subjects within the social studies, such as United States History.

Conclusion

A clear incongruence was evident between the Virginia SOLs and digital primary source use. Best practice in history teaching calls for students to practice the skills of historians as they investigate and subsequently analyze documents in order to understand history and historical processes. Such historical work involves higher order thinking skills and often results in depth of study rather than breadth. In this qualitative study of eight world history and world geography teachers, the Virginia SOLs and their associated tests acted as a deterrent to these best practices. Teachers were pressed for instructional time and described the SOL test as basic fact-recall and, thus, incompatible with the fostering of in-depth historical thinking skills. The teachers in this study also experienced direct and indirect pressure for students to achieve a passing score on the test.

As the U.S. education system increases testing and accountability, it is necessary to study the effect that standards and their associated tests have on teaching and learning. In order to see whether the results of this study hold true on a wider scale further research is necessary. One such study might use a survey looking at how state standards influence teachers’ uses of digital resources directed at engaging students in historical understandings. It would be of particular interest to compare the results of teachers in states such as Virginia, which has a fact-recall test, and New York, which calls for students to interpret and analyze documents. Ultimately, research and related action should be directed at a reconciliation of the discord between advocates and detractors of standards and their associated tests so that the needs of all stakeholders in the educational process might be met.

References

Braun, J., & Risinger, F. (1999). Surfing social studies. Washington, DC: National Council for the Social Studies.

Christie, M. (1999). Standards of learning: Why Virginia’s education reform is working. Virginia Issues and Answers, 6(2), 32-37.

Cremer, D. J. (2001). Matter, method, and machine: The synergy of world history, active learning, and computer technology. In D. A. Trinkle & S. A. Merriman (Eds.), History.edu: Essays on teaching with technology (pp. 117-124). Armonk, NY: M.E. Sharpe.

Diem, R. A. (2002, April). An examination of the effect of technology instruction in social studies methods classes. Paper presented at the annual meeting of the American Educational Research Association, New Orleans, LA. (ERIC Document Reproduction Service No. ED465 691).

Duke, D. L., & Reck, B. L. (2003). The evolution of educational accountability in the Old Dominion. In D. L. Duke, M. Grogan, P. D. Tucker & W. F. Heinecke (Eds.), Educational leadership in an age of accountability: The Virginia experience (pp. 36-68). Albany, NY: SUNY Press.

Erickson, F. (1986). Qualitative methods in research on teaching. In M. C. Wittrock (Ed.), Handbook of research on teaching (pp. 119-161). New York: Macmillan.

Finn, C. E. (2002, April). What ails U.S. high schools? How should they be reformed? Is there a federal role? Paper presented at Preparing America’s Future: The High School Symposium, Washington, DC. (ERIC Document Reproduction Service No. ED467 037).

Fore, L. C. (1998). Curriculum control: Using discourse and structure to manage educational reform. Journal of Curriculum Studies, 30(5), 559-576.

Friedman, A. M. (2004). Digital primary source use in world history and world geography (Doctoral dissertation, University of Virginia, 2004). Dissertation Abstracts International, 65, 2958.

Grant, S. G. (2003). History lessons. Mahwah, NJ: Lawrence Erlbaum Associates, Inc.

Grant, S. G., Derme-Insinna, A., Gradwell, J., Lauricella, A. M., Pullano, L., & Tzetzo, K. (2002). Juggling two sets of books: A teacher responds to the New York State global history exam. Journal of Curriculum and Supervision, 17(3), 232-255.

Greene, S. (1994). The problems of learning to think like a historian: Writing history in the culture of the classroom. Educational Psychologist, 29(4), 89-96.

Kobrin, D. (1996). Beyond the textbook: Teaching history using documents and primary sources. Portsmouth, NH: Heinemann.

Levstik, L. (1997). “Any history is someone’s history:” Listening to multiple voices from the past. Social Education, 61(1), 48-51.

Levstik, L., & Barton, K. (2001). Doing history: Investigating with children in elementary and middle schools. Mahwah, NJ: Lawrence Erlbaum Associates.

Maddux, C. (1998). Barriers to the successful use of information technology in education. Computers in the Schools, 14(3/4), 5-11.

Martorella, P. (1997). Technology and social studies: Which way to the sleeping giant? Theory and Research in Social Education, 25(4), 511-514.

Mason, C., Berson, M., Diem, R., Hicks, D., Lee, J., & Dralle, T. (2000). Guidelines for using technology to prepare social studies teachers. Contemporary Issues in Technology and Teacher Education [Online serial], 1(1). Retrieved July 13, 2006, from https://citejournal.org/vol1/iss1/currentissues/socialstudies/article1.htm

Maxwell, J. (1996). Qualitative research design: An interactive approach. Thousand Oaks, CA: Sage Publications, Inc.

McMillan, J. H., Myran, S., & Workman, D. (1999, April). The impact of mandated statewide testing on teachers’ classroom assessment and instructional practices. Paper presented at the annual meeting of the American Educational Research Association, Montreal, QC. (ERIC Document Reproduction Service No. ED431041).

Merryfield, M. M. (1997). A framework for teacher education in global perspectives. In M. M. Merryfield, E. Jarchow. & S. Pickert (Eds.), Preparing teachers to teach global perspectives: A handbook for teacher educators (pp. 1-24). Thousand Oaks, CA: Corwin Press.

Meier, D. (2002). Standardization versus standards. Phi Delta Kappan, 84(3), 190-198.

Miles, M., & Huberman, A. (1994). Qualitative data analysis: An expanded sourcebook. Thousand Oaks, CA: Sage Publications.

National Center for History in the Schools. (2005). Overview of standards in historical thinking. Retrieved July 13, 2006, from http://nchs.ucla.edu/standards/thinking5-12.html

National Commission on Excellence in Education. (1983). A nation at risk: The imperative for educational reform. Retrieved July 13, 2006, from the United States Department of Education Web site: http://www.ed.gov/pubs/NatAtRisk/index.html

National Council for the Social Studies. (1994). Expectation of excellence: Curriculum standards for social studies. Silver Spring, MD: National Council for the Social Studies.

New York State Education Department. (1999). Global history and geography regents examination: Test sampler draft. Retrieved July 13, 2006, from http://www.emsc.nysed.gov/osa/socstre/socstudarch/ghist1.pdf

New York State Education Department. (2004a). New York State learning standards. Retrieved July 13, 2006, from http://www.emsc.nysed.gov/ciai/describe.html

New York State Education Department. (2004b). Social studies overview. Retrieved July 13, 2006, from http://www.emsc.nysed.gov/ciai/socst/pub/ssovervi.pdf

No Child Left Behind. (2002). Executive summary. Retrieved July 13, 2006, from The White House Web site: http://www.whitehouse.gov/news/reports/no-child-left-behind.html#1

Pahl, R. H. (2003). Assessment traps in K-12 social studies. The Social Studies, 94(5), 212-215.

Patton, M. (1990). Qualitative evaluation and research methods. Newbury Park, CA: Sage Publications.

Phelps, R. P. (1999). Why testing experts hate testing (Fordham Report, Vol. 3, No. 1). Washington, DC: Thomas B. Fordham Foundation. (ERIC Document Reproduction Service No. ED429089).

Poulton, H. (1972). The historian’s handbook. Norman, OK: University of Oklahoma Press.

Ravitch, D. (1996). The case for national standards and assessments [Electronic version]. Clearing House, 69(3), 134-135.

Savage, T. V. (2003). Assessment and quality social studies. The Social Studies, 94(5), 201-206.

Tabs, E. D. (2003). Internet access in U.S. public schools and classrooms: 1994-2002. Retrieved July 13, 2006, from the National Center for Educational Statistics Web site: http://nces.ed.gov/pubs2004/2004011.pdf

VanFossen, P. J., & Shiveley, J. M. (2000). Using the Internet to create primary source teaching packets. The Social Studies, 91(6), 244-252.

VanSledright, B. (2002). In search of America’s past. New York: Teachers College Press.

VanSledright, B. (2004). What does it mean to think historically…and how do you teach it? Social Education, 68(3), 230-233.

Virginia Department of Education. (n.d.-a). Project Graduation: Frequently asked questions about earning a Virginia high school diploma. Retrieved July 13, 2006, from http://www.pen.k12.va.us/2plus4in2004/faq.shtml

Virginia Department of Education. (n.d.-b). Standards of learning currently in effect for Virginia public schools. Retrieved July 13, 2006, from http://www.pen.k12.va.us/VDOE/Superintendent/Sols/home.shtml

Virginia Department of Education. (n.d.-c). Virginia school report card: School accreditation status for 2004-2005. Retrieved July 13, 2006, from http://www.pen.k12.va.us/VDOE/src/vasrc-accred-rate-descr.shtml

Virginia Department of Education. (2001). History and social science standards of learning. Retrieved July 13, 2006, from http://www.pen.k12.va.us/VDOE/Superintendent/Sols/historysecondary.pdf

Warren, W. J. (2001). Using the World Wide Web for primary source research in secondary history classes. In D. A. Trinkle & S. A. Merriman (Eds.), History.edu: Essays on teaching with technology (pp. 171-180). Armonk, NY: M.E. Sharpe.

Wineburg, S. (2001). Historical thinking and other unnatural acts. Philadelphia, PA: Temple University Press.

Author Note:

Adam Friedman
University of North Carolina at Charlotte
email: [email protected]

Appendix

Interview Protocol Regarding Virginia SOLs

What type of learning do you think the SOLs represent?

Do you feel any pressure from the SOLs? Please describe.

How does this pressure affect your teaching?

If the SOLs didn’t exist, would you teach the same way? If not, please describe how you would teach.
