{"id":812,"date":"2006-06-01T01:00:00","date_gmt":"2006-06-01T01:00:00","guid":{"rendered":"http:\/\/localhost:8888\/cite\/2016\/02\/09\/a-computer-based-instrument-that-identifies-common-science-misconceptions\/"},"modified":"2016-05-27T10:14:49","modified_gmt":"2016-05-27T10:14:49","slug":"a-computer-based-instrument-that-identifies-common-science-misconceptions","status":"publish","type":"post","link":"https:\/\/citejournal.org\/volume-6\/issue-3-06\/science\/a-computer-based-instrument-that-identifies-common-science-misconceptions","title":{"rendered":"A Computer-Based Instrument That Identifies Common Science Misconceptions"},"content":{"rendered":"
The editors of Contemporary Issues in Technology and Teacher Education hereby retract this article, \u201cA Computer-Based Instrument That Identifies Common Science Misconceptions\u201d by Timothy Larrabee, Mary Stein, and Charles Barman. The article is being retracted because a substantively duplicate manuscript was subsequently published by the same authors in the Journal of Science Teacher Education, Volume 18, Issue 2, April 2007: \u201cWhat Are They Thinking? The Development and Use of an Instrument That Identifies Common Science Misconceptions\u201d by Mary Stein, Charles R. Barman, and Timothy Larrabee (pp. 233\u2013241).<\/p>\n
\n
<\/p>\n
Although many instruments have been developed that target individuals\u2019 misconceptions about specific science topics, an online instrument targeting a wide range of science beliefs has not yet been developed. As society becomes more at ease with the Internet, the development of instruments that effectively use technology for educational research is needed. This article describes the rationale for and development of an easily administered instrument, known as the Science Beliefs Test, which helps researchers, science educators, and science teachers understand more about commonly held scientific misconceptions. A description of the Science Beliefs Test and an explanation of how its validity and reliability were established are included in this discussion.<\/p>\n
Accessing Students\u2019 Thinking About Science<\/p>\n
Traditional Methods<\/p>\n
Research on students\u2019 beliefs and alternative conceptions they may hold has a long history and continues to be of great interest. A variety of methods have been used to elicit students\u2019 ideas, and these have been widely reported in the literature (e.g., Aron, Francek, Nelson, & Bisard, 1994; Haslam & Treagust, 1987; Osborne & Freyberg, 1985; Schoon, 1995; Trumper, 2001; Watts & Zylbersztajn, 1981). Many of these methods are not feasible in terms of the time and effort required for use in existing K-12 and preservice elementary education science classrooms. Moreover, many of the assessments focus on specific science topics rather than on a broad range of science conceptions. The existing assessments often test at greater depths than can be reached in general survey science courses. As a result, the authors became interested in developing an instrument that would make effective use of existing technology in eliciting respondents\u2019 beliefs about a wide range of science topics that could be accessed from any computer. The instrument would assist K-12 general science teachers, as well as preservice elementary education science educators, in identifying key misconceptions held by their students across the various science concepts presented in their curricula.<\/p>\n
Haslam and Treagust (1987) noted that individual student interviews are often a useful way for researchers to identify students\u2019 misconceptions in science; however, this methodology may not be as useful to teachers (Fensham, Garrard, & West, 1981; Peterson, Treagust, & Garnett, 1989). Typically, when students are interviewed, their responses are recorded, transcribed, and analyzed. As students become more adept at using the keyboard to express themselves in e-mails and instant messages, they will become more comfortable typing their responses to questions provided online, and their written responses will more closely approximate the verbal answers they might have given in a face-to-face interview.<\/p>\n
Not only are current methods for eliciting students\u2019 beliefs, such as interviewing and paper-and-pencil surveys, often cumbersome for teachers and scholars, but they may also fail to be useful to the students as a means of encouraging thinking about their own ideas, the reasons for those ideas, and how their ideas may change as a result of instruction. Rosenfeld, Booth-Kewley, and Edwards (1993) reported that \u201cresponding on the computer may lead to higher levels of self-awareness\u201d and that participants perceive online assessments as more useful and relevant (p. 498).<\/p>\n
Odom and Barrow (1995) advocated the development of paper-and-pencil tests to help classroom teachers diagnose misconceptions. Yet administering and analyzing these assessments can be costly in terms of lost instructional time and money for printing and reprinting copies. Furthermore, there may be difficulties associated with collecting and analyzing the complete data set. Given these concerns about personal interviews, as well as many other forms of data collection, we developed an electronic instrument, the Science Beliefs Test, that aids in revealing science misconceptions.<\/p>\n
Benefits of Online Surveys<\/p>\n
The benefits of administering online instruments are well documented; Natal (1998) listed many of them. Respondents may complete online surveys at a time and place that is convenient for them, without having to travel to a specific location at a particular time. Students receive immediate feedback on their results at the conclusion of the exam. Students with special needs can take all the time they need to complete the assessment without feeling rushed by the instructor or classmates. Students who have grown up with computers often feel more comfortable composing responses online, and typed responses are more legible than handwritten ones. Instructors do not have to give up instructional time and can instead use the time that in-class paper exams would have taken to clarify students\u2019 thinking about misconceptions identified by online assessments. Moreover, the collected data can be more easily analyzed.<\/p>\n
Scholars benefit from the cost savings associated with not having to hire interviewers, transcribe tapes, or print paper surveys for large population samples. The surveys can also be disseminated to a wide range of participants and are not limited by geographic proximity or institutional interference in delivering the survey (Handwerk, Carson, & Blackwell, 2000; Schmidt, 1997). In addition to saving time, automatic data collection eliminates errors resulting from data entry (Rosenfeld et al., 1993).<\/p>\n
A thorough review of the literature relating to the development of instruments used to determine misconceptions revealed that many researchers have emphasized the need for assessments that could be easily administered and used by classroom teachers. Although many of the existing instruments use a multiple-choice format, this format does not allow respondents to develop and express alternative responses that more fully reflect the range of their beliefs, including misconceptions, about a particular idea.<\/p>\n
The Science Beliefs Test<\/p>\n
Format<\/p>\n
Our objectives in creating the Science Beliefs Test were to uncover prevalent misconceptions, as well as potential reasons for them. Therefore, we decided to use a two-tiered instrument. The first tier consists of statements requiring a true or false response, and the second tier asks students to provide a written explanation to support the true\/false response given for each item. The online collection process keeps a record of these explanations and provides a rich collection of data related to beliefs regarding specific scientific phenomena. This format also has practical classroom implications. It not only helps the teacher determine the extent to which particular misconceptions are held by students, but it also provides a mechanism for determining students\u2019 underlying ideas. Moreover, it helps teachers recognize when students are selecting the \u201cright\u201d answer but for the wrong reason(s) or, alternatively, the \u201cwrong\u201d answer but with a justified explanation. When the test is used to assess students\u2019 prior knowledge, teachers are alerted to the most commonly held misconceptions and can adjust their instruction accordingly.<\/p>\n
Instrument Development<\/p>\n
Item selection and development was an iterative process. A thorough review of research on instruments developed to target alternative conceptions and misconceptions revealed that most were designed for in-depth study of a specific concept, such as diffusion or chemical bonding, rather than a variety of science concepts crossing a range of science disciplines. K-12 general science classroom teachers and science educators, particularly those who teach elementary preservice teachers, need a broader instrument for both teaching and research purposes. The Science Beliefs Test was constructed to target a wide range of science concepts across science disciplines for that audience. We sought to maintain balance in the number of questions associated with the topics of life science, physics, chemistry, earth science, and astronomy. Moreover, we focused item development on concepts that appeared to be fundamental to developing higher levels of understanding of a particular topic. The selection criteria were that (a) the item represented a basic understanding expected of scientifically literate adults, (b) the concept had previously been identified as problematic for learners in research on alternative conceptions and misconceptions, (c) the concept or topic was addressed in the <em>National Science Education Standards<\/em> (National Research Council [NRC], 1996), and (d) the items maintained a balance across science areas.<\/p>\n Initially, 23 items were selected from existing instruments. These questions were converted into a true\/false format before being administered to preservice teachers. This pilot test was designed to reveal (a) problems with the structure of the statements that might mislead participants, (b) the effectiveness of the two-tier design, and (c) science misconceptions commonly held by the participants.<\/p>\n