Bergman, D., & Novacek, G. (2021). Effects of an asynchronous online science methods course on elementary preservice teachers’ science self-efficacy. Contemporary Issues in Technology and Teacher Education, 21(3). https://citejournal.org/volume-21/issue-3-21/science/effects-of-an-asynchronous-online-science-methods-course-on-elementary-preservice-teachers-science-self-efficacy

Effects of an Asynchronous Online Science Methods Course on Elementary Preservice Teachers’ Science Self-Efficacy

by Daniel Bergman, Wichita State University; & Greg Novacek, Wichita State University

Abstract

To address a statewide demand for elementary teachers, a midsized Midwestern (U.S.A.) university created an undergraduate licensure program for para-educators, nontraditional students who are already working full-time in schools. Although fieldwork experiences and mentoring occur in the schools where they work, the para-educator preservice teachers (PSTs) completed all college coursework via online classes with course readings, writings, videos, discussion board, home activities, and videoconference class sessions. Their coursework included an inquiry-based science methods course, taught asynchronously over 8 weeks in the summer, emphasizing the 5E Learning Cycle Model (Bybee, 2002; Bybee et al., 2006; Contant et al., 2018) and the Next Generation Science Standards (NGSS Lead States, 2013). Pre- and posttest measures were collected from the participating PSTs (N = 57), including the STEBI-B (Enochs & Riggs, 1990) to analyze self-efficacy beliefs about teaching science. Findings between pre- and postassessments included statistically significant increases with large effect sizes in both STEBI-B subscales (Science Teaching Outcome Expectancy; Personal Science Teaching Efficacy Belief). Responding to open-ended follow-up questions, participants perceived writing lesson plans and doing at-home science activities as the most helpful course elements in their confidence about teaching science.

From Para-Educator to Full-Time Teacher

To address a statewide demand for elementary teachers, a midsized Midwestern university created a new licensure pathway for paraprofessionals, or para-educators, who were already working full-time in schools. Students in this Teacher Apprentice Program (TAP) were categorized as returning adult learners (age 21 or older) or transfer students. As with undergraduate students in the traditional on-campus program, these TAP participants were considered to be preservice teachers (PSTs) as they worked to complete a Bachelor of Arts in Early Childhood Unified/Elementary Education and state requirements for full teacher licensure (certification).

Since the TAP preservice teachers were full-time para-educators, their fieldwork requirements (practicum, internship) occurred in the school in which they worked. They were mentored by school teachers and faculty in their workplace, along with an assigned success coach from the university, who communicated with the TAP PSTs primarily through online interactions (email, discussion board, and videoconference meetings such as Zoom, Skype, and Google Hangout).

To accommodate the PSTs’ full-time work schedules and wide-ranging geographic distances, all college coursework in TAP was completed via online classes. This included teacher preparation courses such as foundations, philosophy, psychology, and management, as well as all methods courses featuring subject-specific pedagogy. Online courses were taught through the Blackboard learning management system with course readings, writings, discussion board, and videoconference class sessions.

Online Science Teaching Methods

In addition to online methods courses in literacy, language arts, mathematics, social studies, and the fine arts, the TAP featured a science methods course titled Inquiry-Based Learning. It was a two-credit-hour course, taught asynchronously over 8 weeks in the summer. Like all TAP courses, class size was capped at 30 students. This class was typically taken near the middle of the program, after TAP PSTs had spent at least one academic year completing field experience/internship requirements in their school placement.

Courses taken before the science methods class included Principles of Mentoring, Engaging and Motivating Learners, Early Childhood Assessment and Methods, Elementary Teaching Early Literacy, and Family Collaboration. Depending on the number of General Education courses needed, the TAP PSTs may also complete additional methods, management, or content courses in mathematics or science over the summer.

In the science methods course, the expectation was that PSTs would spend approximately 96 hours on course-related activities, or an average of 12 hours each week for 8 weeks. Primary learner outcomes for the science methods course included the following:

  • Design science activities using the inquiry method to teach developmentally appropriate science content.
  • Be familiar with current science curricular materials and understand the interconnectedness of science disciplines.
  • Facilitate student-planned and student-conducted investigations.
  • Provide the opportunity for students’ discovery and application of knowledge.
  • Prepare and implement lessons that include both content and language objectives and utilize strategies that support the learning of students from diverse populations, including English language learners.
  • Evaluate and select assessments to fit diverse learner strengths and needs including English language learners.

Learner outcomes aligned with those in other TAP courses (lesson planning, assessment, and differentiation for diverse learners and language learning). An overview of the course sequence, major topics, and tasks is located in Appendix A, with a detailed summary in Appendix B. Different instructors may teach one or more sections of the class; nevertheless, all sections aligned in terms of course outcomes, readings and resources, and activities and assignments.

The textbook used in all sections of the science methods course was Teaching Science Through Inquiry-Based Instruction (Contant et al., 2018). Additional primary resources featured the Next Generation Science Standards (NGSS Lead States, 2013) and the 5E Learning Cycle Model (Brown & Abell, 2007; Bybee, 2002; Bybee et al., 2006; Rodriguez et al., 2019). Besides text readings, the course provided several video and multimedia resources with overviews of inquiry-based learning, as well as example classroom footage of model activities with students (Australian Academy of Science, 2016; Biological Sciences Curriculum Study, 2012).

Since the course had no face-to-face classroom sessions, all science activities were designed as home experiments. PSTs completed inquiry activities through prompts and recommended materials. Any procedures provided were designed to initiate investigations and model inquiry-based thinking, as opposed to procedural or cookbook verification activities. In addition to the activities, PSTs completed written reflections on their experiences, with prompts focused on the science content as well as the learning process and elements of inquiry (5Es, modifications, questioning, etc.).

During the course, PSTs designed an inquiry-based science lesson framed around the 5E model and aligned with NGSS performance expectations and the three dimensions of Science and Engineering Practices, Crosscutting Concepts, and Disciplinary Core Ideas. Since the course occurred in the summer, PSTs did not teach the lesson to school children. However, they were encouraged to select standards and content that aligned with an age group or grade with whom they worked during the school year.

In addition to getting instructor feedback on their lesson plans, PSTs also applied “decookbook” strategies drawn from example activities and articles (Everett & Moyer, 2007; Shiland, 1997). Other topics addressed were safety, evaluation, the teacher’s instructional role, and interdisciplinary connections.

Online Teacher Education

The TAP is an example of the growing online presence in higher education and teacher preparation programs. According to the most recent report by the National Center for Education Statistics (NCES; Institute of Education Sciences, 2018), the percentage of undergraduate students enrolled in at least one online class during 2015-2016 was 43.1%, up from 32.0% in 2011-2012. Numbers were even higher for students specifically in the field of Education: 45.7% in 2015-2016, versus 33.8% in 2011-2012. Among the 13 identified fields of study in the NCES report, Education was third highest in the percentage of students taking at least one online class in 2015-2016, behind Business/management and Computer/information science.

Similar trends have occurred for students enrolled in programs that are entirely online. In 2015-2016, 10.8% of all undergraduate students in a degree program were enrolled in one offered entirely online, compared to 6.5% in 2011-2012. Slightly fewer Education students were enrolled in entirely online programs, but their numbers were also growing: from 6.4% in 2011-2012 to 9.7% in 2015-2016.

Despite the growing numbers of online courses and programs, only 9% of postsecondary faculty members prefer to teach classes that are entirely online (Pomerantz & Brooks, 2017). This survey of over 11,000 faculty members in U.S. postsecondary institutions also found that instructors have a love-hate relationship with online learning. While most faculty respondents believed an experience with online teaching would improve their instruction, a majority still believed that students did not learn as well as in face-to-face courses. Past research, however, has found no significant differences in learning gains between online and face-to-face courses, with a blended approach resulting in stronger learning outcomes than either format by itself (Means et al., 2009, 2014).

The issue of online instruction becomes even more complex in the contexts of science and teacher education. Science courses face the challenge of going beyond content and incorporating laboratory techniques and inquiry approaches.

Online lectures by video are fine for conveying facts, formulas and concepts, but by themselves they cannot help anyone learn how to put those ideas into practice. Nor can they give students experience in planning an experiment and analyzing data, participating in a team, operating a pipette or microscope, persevering in the face of setbacks or exercising any of the other practical and social skills essential for success in science. (Waldrop, 2013, p. 268)

The rise of MOOCs — massive open online courses — has pushed educators and universities to explore new ways to practice science investigations, including remote control of laboratory equipment, smartphone applications, and video games (Hollands & Tirthali, 2014; Waldrop, 2013).

In the same way that online science content classes are limited without an authentic laboratory, online teacher education courses lack a tangible classroom in which to model and practice pedagogy. Moreover, the particular focus on science teaching methods creates even greater complexity. Historically, teachers have often struggled in identifying their roles in inquiry-based science lessons, which emphasize student-centered instruction and more intangible strategies on the part of the teacher (Crawford, 2000; Riga et al., 2017; Walker & Shore, 2015).

A robust body of research into online professional development for in-service teachers and science education already exists (e.g., Annetta & Shymansky, 2006; Davis & Zhang, 2013; Goldenberg et al., 2014; Herbert et al., 2016; Ingber et al., 2014; Kokoc et al., 2011; McFadden, 2013; Randle, 2013; Roehrig et al., 2013; Vanides, 2007; Watkins et al., 2020; Wong et al., 2016). Although not as extensive, undergraduate online science methods courses for PSTs are growing, along with research into this endeavor (Colon, 2010; Fulton & Yoshioka, 2017; González-Espada, 2009; Kern, 2013; Miller, 2008; Pope, 2012).

Studies of these courses have examined various elements (technology, student views, and beliefs) and have provided general overviews of successes, challenges, misconceptions, and tips. Beyond the impact of an online science methods class, teacher candidates’ level of inquiry lesson planning also depends on additional factors, such as technology expertise, time demands, and school context (Colon, 2010). Nevertheless, the impact of online science methods courses can be examined with respect to multiple outcomes. The study reported in this article, for example, examined science self-efficacy beliefs of elementary PSTs.

Theoretical Framework for Science Teaching Self-Efficacy

Based on the work of Albert Bandura (1977, 1997), efficacy beliefs have long been strong predictors of teacher behavior (Pajares, 1996; Stripling et al., 2008; Tschannen-Moran et al., 1998; Wolters & Daugherty, 2007). Bandura (1994) himself defined self-efficacy as “people’s beliefs about their capabilities to produce designated levels of performance that exercise influence over events that affect their lives. Self-efficacy beliefs determine how people feel, think, motivate themselves and behave” (p. 71).

Historically, teachers’ efficacy in science has been found to correlate with their instructional behaviors and student performance. For example, a higher self-efficacy relates to improved pedagogy and achievement (Ashton & Webb, 1986; Enochs et al., 1995; Gibson & Dembo, 1984; Henson, 2001; Hoy & Woolfolk, 1990). Conversely, teachers with lower self-efficacy typically have negative views about science and may even avoid teaching it altogether (Koballa & Crawley, 1985). Moreover, teachers with low self-efficacy, in general, often have higher stress and are more likely to leave the profession (Durgunoglu & Hughes, 2010).

Given self-efficacy’s importance in teacher perceptions and behaviors, teacher educators have applied a multitude of approaches to improve elementary PSTs’ science self-efficacy. In his seminal work introducing self-efficacy, Bandura (1977) described four sources affecting one’s efficacy expectations:  performance accomplishments (or “personal mastery experiences”), vicarious experience, verbal persuasion, and emotional arousal (“physiological states”).

These sources are illustrated in Figure 1, including corresponding treatments provided by Bandura as well as specific elements in the online science methods course featured in this study. Some course elements are listed more than once since, as described by Bandura (1977), “Any given method, depending on how it is applied, may of course draw to a lesser extent on one or more other sources of efficacy information” (p. 195). The organization in Figure 1 is not intended to be an exhaustive catalog of course elements, but rather an overview to support instruction and study. More information about course structure, sequencing, and tasks is shared in Appendices A and B, with links to sample content.

Figure 1   Illustration of Theoretical Framework Outlining Self-Efficacy Sources and Treatments (Bandura, 1977) With Corresponding Course Elements

Research Focus and Methodology

The purpose of this study was to determine the effects of the online science methods course on preservice elementary teachers’ perceptions of teaching science. In particular, the following research questions guided the work: 

  • What impact does an online elementary science methods course have on participants’ self-efficacy of teaching science? 
  • What do participants perceive to be the most significant concepts or skills learned in the course?

This study featured a pre/posttest design using the Science Teaching Efficacy Belief Instrument for PSTs, or STEBI-B (Enochs & Riggs, 1990). Participants read a series of statements and responded to each one on a 5-point Likert scale (as recommended by Burns, 2000), from strongly agree to strongly disagree. The instrument consists of two sections (presented as 23 interspersed statements) that assess two aspects of teaching efficacy: Science Teaching Outcome Expectancy (STOE) and Personal Science Teaching Efficacy Beliefs (PSTEB). These two sections, or subscales, align with Bandura’s (1977) self-efficacy theory, pinpointing two factors to consider when predicting behavior. The survey used in this study was a modified version by Bleicher (2004), whose revisions more clearly defined items on the STOE subscale. A copy of the instrument is located in Appendix C.
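As a rough illustration of how such an instrument yields subscale scores, the minimal sketch below sums Likert-coded responses (1-5) into the two subscales. The item groupings and the reverse-scored item set shown here are illustrative placeholders only, not the published STEBI-B scoring key (see Enochs & Riggs, 1990; Bleicher, 2004, and Appendix C for the actual items).

```python
# Placeholder item assignments (0-indexed positions in a 23-item survey);
# these are NOT the published STEBI-B key, only an illustration of the structure.
PSTEB_ITEMS = [1, 2, 4, 5, 7, 11, 16, 17, 18, 19, 20, 21, 22]   # 13 hypothetical items
STOE_ITEMS = [0, 3, 6, 8, 9, 10, 12, 13, 14, 15]                # 10 hypothetical items
REVERSED_ITEMS = {2, 7, 16, 20}                                  # hypothetical negatively worded items

def score_stebi_b(responses):
    """Return (PSTEB, STOE) subscale sums for one participant's 23 Likert responses (coded 1-5)."""
    adjusted = [6 - r if i in REVERSED_ITEMS else r for i, r in enumerate(responses)]
    psteb = sum(adjusted[i] for i in PSTEB_ITEMS)
    stoe = sum(adjusted[i] for i in STOE_ITEMS)
    return psteb, stoe
```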

In addition to the STEBI-B instrument, participants responded to open-ended questions on the pre- and postsurveys (also in Appendix C). These answers were examined using content analysis (Esterberg, 2002) with a naturalistic inquiry approach (Denzin & Lincoln, 2005; Harry et al., 2005; Norris & Walker, 2005). Participants’ responses to open-ended questions typically consisted of listing topics or items with little or no description. Therefore, analysis required categorizing different topics as textual outcroppings, along with recording exemplars of dominant messages (McKeone, 1995).

The multimethods approach of quantitative and qualitative analyses was used to enhance the study through a complementary “binocular vision” of the survey data (as in Reichardt & Rallis, 1994). Participants completed the survey instrument twice — once in the 1st week of the session (pre), and again in the final week of the course (post).

Participants

In this sample of convenience, participants were PSTs in the TAP, enrolled in four sections of the online science course. The total number of participants was 57. A large majority of participants identified as female (89.5%) and white/Caucasian (87.7%). The range of ages among participants was 20 to 54, with the average age being 35 years old. Table 1 provides more demographic information about participants.

Table 1    Demographic Information Provided by Study Participants (N = 57)

Category | n (%)

Gender Identity
  Female | 51 (89.5%)
  Male | 6 (10.5%)

Ethnicity
  White | 50 (87.7%)
  Non-White | 7 (12.3%)

Age Range
  20-29 yrs | 20 (35.1%)
  30-39 yrs | 19 (33.3%)
  40-49 yrs | 11 (19.3%)
  50+ yrs | 7 (12.3%)

Highest Education Degree
  High School Diploma/GED | 13 (22.8%)
  Associates Degree | 30 (52.6%)
  Bachelor’s Degree | 12 (21.1%)
  Master’s Degree | 2 (3.5%)

Para-Educator Experience
  0 yrs | 4 (7.0%)
  1-3 yrs | 29 (50.9%)
  4-6 yrs | 11 (19.3%)
  7-9 yrs | 8 (14.0%)
  10+ yrs | 5 (8.8%)

Regarding educational background, over half (52.6%) had associate degrees. Since the TAP was designed for nontraditional or transfer college students, a diverse range was found in participants’ highest educational degrees attained, from high school diploma to graduate degrees. On average, participants had completed 3.1 high school science courses and 2.4 college science content courses prior to the online science methods courses in this study. At the start of the semester, participants’ years of experience working as a para-educator ranged from 0 to 18, with the mean being 4.5 years.

Results and Analysis

STOE and PSTEB

A paired-samples (repeated measures) t-test was conducted to compare pre- and postsurvey data of participants’ self-efficacy and outcome expectancy, as measured by the STEBI-B instrument. Table 2 provides a summary of these results.

Table 2    Results of Paired Samples (Repeated Measures) t-tests Comparing Pre- and Postsurvey Data of STEBI-B Sections

Subscale | N | Mean Pre (SD) | Mean Post (SD) | t-value | Sig. (p) | Effect Size (eta²)[a]
STOE | 57 | 35.84 (4.63) | 37.67 (3.72) | 3.22 | .002 | .16
PSTEB | 57 | 48.16 (5.69) | 52.54 (5.12) | 7.63 | <.0005 | .51

[a] Eta² values: .01 = small effect, .06 = moderate effect, .14 = large effect (Cohen, 1988)

A statistically significant increase was found in STOE scores between the presurvey, M = 35.84, SD = 4.63, and postsurvey, M = 37.67, SD = 3.72, t(57) = 3.22, p = .002, α = .05. For the PSTEB subscale, a statistically significant increase was likewise found between the presurvey, M = 48.16, SD = 5.69, and postsurvey, M = 52.54, SD = 5.12, t(57) = 7.63, p < .0005. The eta squared statistic (Cohen, 1988) indicated large effect sizes for both the STOE subscale (.16) and the PSTEB subscale (.51).
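For readers wishing to reproduce this kind of analysis, a minimal sketch in Python is shown below, assuming the pre- and postsurvey subscale totals are stored as paired arrays. The commonly used repeated measures effect size formula, eta² = t²/(t² + df), reproduces the values reported in Table 2.

```python
import numpy as np
from scipy import stats

def paired_t_with_eta_squared(pre, post):
    """Paired-samples t-test plus the repeated measures effect size eta^2 = t^2 / (t^2 + df)."""
    pre = np.asarray(pre, dtype=float)
    post = np.asarray(post, dtype=float)
    t, p = stats.ttest_rel(post, pre)    # positive t indicates an increase from pre to post
    df = len(pre) - 1
    eta_squared = t**2 / (t**2 + df)
    return t, p, eta_squared

# Check against the reported STOE result (t = 3.22, N = 57):
# 3.22**2 / (3.22**2 + 56) ≈ .16, matching the effect size in Table 2.
```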

Since participants had a variety of educational and science backgrounds, subsequent analyses were completed to examine these variables and potential relationships with either instrument subscale. Variables investigated were participants’ age, years of experience as a para-educator, number of high school science courses completed, and number of college science courses completed. Initial analysis used the Pearson product-moment correlation coefficient and found that the number of college science courses had a medium, positive correlation with the presurvey PSTEB subscale, r = .34, n = 52, p = .013, but no significant correlation with the presurvey STOE subscale. All other variables examined had no significant correlation with either subscale. Individual correlation coefficients are listed in Table 3.

Table 3   Pearson Product-Moment Correlations Between Presurvey STEBI-B Subscales and Participant Background Experiences

Background Variable | STOE (pre) | PSTEB (pre)

Number of High School Science Courses Completed (N = 54)
  Pearson Correlation (r) | .15 | .21
  Sig. (p, 2-tailed) | .287 | .131

Number of College Science Courses Completed (N = 52)
  Pearson Correlation (r) | .06 | .34[a]
  Sig. (p, 2-tailed) | .657 | .013

Years of Work Experience as Para-Educator (N = 57)
  Pearson Correlation (r) | .26 | -.01
  Sig. (p, 2-tailed) | .054 | .917

Age (N = 57)
  Pearson Correlation (r) | .10 | -.15
  Sig. (p, 2-tailed) | .484 | .271

[a] Correlation is significant at the 0.05 level (2-tailed).
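As a rough sketch (not the authors’ actual analysis script), the correlation check summarized in Table 3 could be run along the following lines, assuming the survey data sit in a pandas DataFrame with hypothetical column names.

```python
import pandas as pd
from scipy import stats

def correlate_background_with_subscales(df, background_col,
                                         subscale_cols=("stoe_pre", "psteb_pre")):
    """Pearson r and two-tailed p between one background variable and each presurvey
    subscale, dropping missing values pairwise (so N varies by variable, as in Table 3)."""
    results = {}
    for subscale in subscale_cols:
        pair = df[[background_col, subscale]].dropna()
        r, p = stats.pearsonr(pair[background_col], pair[subscale])
        results[subscale] = {"n": len(pair), "r": round(r, 2), "p": round(p, 3)}
    return results

# e.g., correlate_background_with_subscales(survey_df, "college_science_courses")
```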

Analysis of covariance (ANCOVA) tests were conducted to further examine the potential impact of these same background experiences (independent variables) on the postsurvey subscales (dependent variables). Presurvey scores prior to the online course were used as covariates to control for individual differences on the respective subscale (STOE or PSTEB). After adjusting for presurvey scores, no significant main effects or interaction effects were found for any of the independent variables — science coursework in high school or college, age, years of para-educator work — on either postsurvey subscale. Since participants came from four different sections of the online course with different instructors, additional ANCOVA tests were conducted, and no significant main effect of instructor was found on either postsurvey subscale.
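Similarly, a minimal ANCOVA sketch using the statsmodels library is shown below; the column names and data frame are hypothetical stand-ins for the study’s data file, not the authors’ code.

```python
import statsmodels.api as sm
import statsmodels.formula.api as smf

def ancova_on_post_scores(df, dv="psteb_post", covariate="psteb_pre", factor="instructor"):
    """ANCOVA sketch: postsurvey subscale as the dependent variable, the presurvey score
    as the covariate, and a background variable (or instructor) as the categorical factor.
    Column names are hypothetical placeholders; df is a pandas DataFrame of survey data."""
    model = smf.ols(f"{dv} ~ {covariate} + C({factor})", data=df).fit()
    return sm.stats.anova_lm(model, typ=2)   # Type II sums of squares for the adjusted effects
```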

Skills and Concepts Learned

In addition to the STEBI-B instrument, surveys asked participants to share responses to open-ended questions about what they learned and which features of the online methods course were most helpful (Research Question 2). The presurvey initially asked participants to share what they would like to learn most in the course. Although this presurvey prompt does not directly address Research Question 2, it does provide context and background information for examining postsurvey responses. Only 45 of the 57 participants responded to this presurvey question, and the most frequent topics are listed in Table 4. Since participants could provide multiple items in their response, the percentages total more than 100%.

Two topics appeared considerably more often than any others: student engagement (51%) and science pedagogy (49%) — both in nearly half of all responses. In several responses, these two items appeared together. Example comments are as follows:

  • “A fun and innovative way to teach elementary students science through interactive, hands-on learning.”
  • “Activities to keep science focused on the hands-on aspect for engagement”
  • “How to apply the necessary steps in effectively teaching students the subject of science. How to make better ideas for science lessons to help kids with a low-motivation become interested.”

Table 4   Most Frequent Topics Identified in Response to “What Would You Like to Learn Most in This Course?” Presurvey Question (N = 45)

Topic | Frequency of Response | Percent of Responses | Example Response
Engaging students | 23 | 51% | “How to effectively engage students into science.”
Science pedagogy, activities | 22 | 49% | “How to teach science.”
Serving specific student populations | 6 | 13% | “How to help students with disabilities . . .”
Science content | 5 | 11% | “More about science . . .”
General pedagogy | 4 | 9% | “Any new skills to apply to the classroom.”

The postsurvey asked two questions about what participants perceived they learned most and what course features were most helpful. For both questions, 55 of the 57 participants provided responses. Again, since these prompts were open-ended, responses could include a single topic or multiple items.

Table 5   Most Frequent Topics Identified in Response to “What Are Some of the Most Significant Skills or Concepts You Have Learned?” Postsurvey Question (N = 55)

Skill/Concept | Frequency of Response | Percent of Responses | Example Response
5E | 23 | 42% | “The introduction of the 5E Model.”
Inquiry | 17 | 31% | “Teaching science using inquiry based instruction . . .”
Integrating curriculum | 14 | 25% | “How to put science into any of your other subjects.”
Lesson planning | 11 | 20% | “The most significant skill I learned was the correct way to put together an NGSS lesson plan.”
Standards / NGSS | 8 | 15% | “How to use the science standards webpage.”
Questioning skills / engaging students | 5 | 9% | “The importance of being a guide when teaching and not hovering over the students. They need to be very involved in their learning.”
Making science accessible to students | 5 | 9% | “Providing accommodations for students in science lessons.”

Table 5 lists the most frequent topics appearing in participants’ responses to the question asking about the most significant skills or concepts learned in the course. The two most frequently mentioned topics were “5E” (42%) and “inquiry” (31%). Some responses simply listed one or both items (e.g., “5E Model,” “5E’s. Inquiry based instruction.”). Other participants elaborated on these topics, including additional concepts or skills:

  • “How to use the 5E’s model of science instruction, how to integrate science into other subjects, and how to make science accessible for all students.”
  • “Student assessment, the 5E’s of inquiry-based learning.”
  • “I learned how to make a lesson plan with the 5E’s and make it effective for teaching with inquiry included.”

These example statements include other commonly identified skills or concepts participants noted as significant in their learning from the course: curriculum integration (25%), lesson planning (20%), science for all learners (9%).

Table 6   Most Frequent Items Identified in Response to “What Specific Parts of the Course Were Most Helpful in Your Confidence about Teaching Science?” Postsurvey Question (N = 55)

Course Element | Frequency of Response | Percent of Responses | Example Responses
Writing lesson plan | 26 | 47% | “Creating an inquiry based science lesson was the biggest eye opener.”
Doing science activities | 16 | 29% | “The simulated labs and hands on activities.”
Resources | 12 | 22% | “The book itself is a great resource.” “Articles and videos.”
  Articles/text | 9 | 16% |
  Videos | 3 | 5% |
5E | 10 | 18% | “Learning what the 5E’s were.”
Reflection / Feedback | 6 | 11% | “How to use the science standards webpage.”
Discussion board | 4 | 6% | “The independent worksheets and classroom discussions helped me the most.”

Table 6 lists the items participants identified most often in their responses to the postsurvey question asking what class elements were most helpful in their confidence about teaching science. Nearly half (47%) of all responses included the inquiry-based science lesson plan assignment. Example comments identifying the lesson plan follow:

  • “I learned the most from the lesson plan assignment as it made me apply it to a real life lesson I would teach in my elementary room.”
  • “Personally, designing the IBL [inquiry-based learning] lesson plan was most helpful to me as it gave me the opportunity to see what IBL looks like in design with the classroom.”

Doing science activities (29%) was the second most frequent course element identified by participants in helping their confidence in teaching science. Participants identifying these experiences also addressed student perspectives and consideration of the learning process:

  • “The very first activity we had to do as it shows the frustrations some students could have.”
  • “I really liked our first assignment where we had to participate in a science lesson. It helped me get in the mind of being a student again. This whole program I haven’t really felt like a student and this was so refreshing.”
  • “It helped to ‘do science’ to understand the 5E’s of the lesson.”

Additional course elements mentioned by participants were the various resources (22%), which included both written material (articles and textbook) and — to a lesser extent — videos. The 5E Model by itself was identified by 18% of the responding participants as most helpful in their confidence about teaching science.

Discussion, Limitations, and Implications

Prior research has found that PSTs’ self-efficacy beliefs increase after completion of a science teaching methods course (Bleicher & Lindgren, 2005; Morrell & Carroll, 2003; Palmer, 2006a, 2006b). Findings in the present study indicated similar positive outcomes in PSTs as a result of completing a science methods course in an online setting, as opposed to a traditional face-to-face classroom. Participants showed a significant increase in both the PSTEB and STOE subscales of the STEBI-B instrument. Further analysis using the eta squared statistic (Cohen, 1988) found a large effect size for both the PSTEB (.51) and the STOE (.16). Results of this study align with past research using the STEBI-B, which has often found greater growth in the PSTEB subscale than in the STOE (Deehan, 2017).

One challenge is to identify particular aspects or experiences in the online course that may have impacted participants’ self-efficacy. Data from open-ended questions in this study’s postsurvey provide some insight into participants’ perceptions. In particular, many of the prominent items identified by participants — 5E, inquiry, integrating curriculum, lesson planning — align with what Palmer (2006b) describes as “cognitive pedagogical mastery.” These topics connect with both science-specific content and instructional application.

Of note is the power of branding complex inquiry-based instruction in a succinct manner (e.g., “5E”). Settlage (2000) also found links between PSTs’ self-efficacy and instruction about the learning cycle. Participants in the present study may have found comfort and structure through a simplified five-part framework for planning inquiry-based lessons. Subsequent research could assess the accuracy and depth of participants’ understanding and application of each “E” — engagement, exploration, explanation, elaboration, evaluation — in their learning and teaching.

The emphasis on inquiry-based pedagogy connects to participants’ responses regarding which course elements were most helpful in their confidence about teaching science. The most common element was the 5E-framed lesson plan assignment, identified by nearly half (47%) of all participants as helping their confidence. The second highest course element (29%) was the hands-on science activities that participants experienced. Even with a remote, at-home setting, participants found value in using prompts to guide their independent investigations and self-paced experiments with household objects (aluminum foil, pennies, mirrors, etc.). Given the significant change in participants’ self-efficacy from the start of the course to the end, the course design of writing inquiry-based lesson plans after doing inquiry activities appears to have had an influence. This interpretation is also supported by 5E and inquiry being the two items participants most frequently reported learning from the class.

Also of note is which course elements were given little or no attention by participants in their survey responses. Although 11% of participants indicated on the presurvey that they wanted to increase their science content understanding, science content by itself did not appear in responses to either postsurvey question about what they learned or what was most helpful for their confidence in teaching science. Even though learning science content knowledge is not a primary focus of this course, related items such as inquiry, safety, misconceptions, and the nature of science (NOS) were featured in the course. With the exception of the fundamental theme of inquiry, none of these other topics were mentioned in participants’ postsurveys.

Teacher questioning and accessibility were two other pedagogical items given little attention by participants with respect to significant learning from the course. While a two-credit-hour, 8-week online summer course is limited in its scope, more explicit attention may be needed to address and assess important items such as instructional interactions, accessibility, NOS, safety, and misconceptions. The latter of these — science misconceptions — may tie nicely to science content understanding, as participants reflect on the accuracy of their own comprehension as well as common ideas of their students.

With regard to course elements influencing participants’ confidence in teaching science (Table 6), participants made little mention of collaboration and interaction. Discussion board assignments, which included reflective and responsive writings, appeared in only 6% of participants’ responses. While instructor feedback and reflection appeared a few more times in participants’ surveys (11%), any aspect of direct communication and cooperation was relatively sparse compared to other course tasks such as lesson planning and doing science activities. This result may reflect the fact that these interactive elements align with the verbal persuasion and emotional arousal treatments, which typically have less impact on self-efficacy (Bandura, 1977).

Likewise, this could be the case with the use of videos — aligned with both emotional arousal and vicarious experience — which were mentioned in only 5% of responses about helping confidence. The original online course design featured multiple videos and related depictions or scenarios (see Appendix B), with the intent to compensate for limited interactive instruction and exposure to classroom science lessons. However, future courses may need to revise the use of video and multimedia — with purposeful prompts and tasks to increase interactions and applications — to enhance participants’ experience and reflection.

One caveat is the phrasing of the postsurvey question about participants’ “confidence,” which is not identical to self-efficacy. As Bandura (1997) noted,

Confidence is a nondescript term that refers to strength of belief but does not necessarily specify what the certainty is about. … Perceived self-efficacy refers to belief in one’s agentive capabilities, that one can produce given levels of attainment. (1997, p. 382)

Nevertheless, the postsurvey question used the term “confidence” since participants were more likely familiar with it than “self-efficacy.” Moreover, the question specified context by focusing on confidence about the capability of teaching science. 

Participants’ postsurvey responses provide some insight into which course features may be linked to self-efficacy changes. Namely, key elements are the inquiry-based science activities done at home, along with educational application through lesson plan writing. Aside from the remote “at home” setting, these elements are not exclusive to an online format. They still provide mastery experiences (Bandura, 1997) and cognitive pedagogical mastery (Palmer, 2006b), with a positive impact on teachers’ self-efficacy. That no significant effect was found across the four different instructors suggests the importance of intentional course design and experiential tasks.

Nevertheless, Watkins et al. (2020) spoke to the element of “responsive teaching” in online courses, and the instructor’s key role in facilitating and fostering student engagement. This responsibility for guiding and supporting learning could be further explored, especially with respect to asynchronous and synchronous online formats.

Even after accounting for participants’ previous science coursework and para-educator work experiences, other variables may have impacted survey responses. The actual quality of participants’ prior science and educational experiences is beyond the scope of this study but could assuredly influence PSTs’ perceptions and decisions. Most participants were taking one or more classes during the same semester as the featured online science methods course. This concurrent coursework covered a variety of subjects (i.e., general education, teacher education, and electives), as well as a range of instructional examples. Additional lurking variables arise from the participants’ school-based experiences. For example, the para-educators worked in schools and communities of different sizes, with different student populations and age groups. Participants may have received different levels of support from colleagues and mentors, particularly with respect to models of science instruction.

Another limitation of this sample of convenience is a participant group that was mostly homogeneous in ethnicity and gender. A larger and more diverse pool of participants would strengthen ongoing research and the generalization of its findings. Nevertheless, past research with the STEBI-B has found that gender has no significant effect on elementary preservice teachers’ PSTEB or STOE (Mulholland et al., 2004).

Future research could also expand data collection and analysis, going beyond the STEBI-B Likert scale and pre-/postsurveys. One example is Tosun’s (2000) modification to STEBI-B by adding interview questions. Another avenue is structured journaling activities, such as those used by Hodges et al. (2016) in their study of problem-based science self-efficacy. Analysis could include comparisons with face-to-face or hybrid settings for science methods courses. Interviews, focus groups, and case studies are all avenues for robust study. These avenues would assist examination of particular course elements from the online format — assignments and projects, hands-on activities, videos and other resources, discussion board interactions and written reflections, and more.

The actual design of online coursework is also of interest, including the length and number of sessions. The course in this study used an 8-week schedule, typically shorter than most traditional college semesters. As such, the accelerated timeframe is atypical of most face-to-face formats (Roddy et al., 2017, p. 2). Additional next steps in research and practice are adjustments in course scheduling, including both length and time of year. While most classes in the featured online program follow an 8-week format, there are some cases of shorter (4-week) and longer (16-week) sessions. Also, year-round cohorts may create opportunities to teach the inquiry-based science class during a fall or spring semester, during which participants would have more opportunities for direct application with school-aged students.

Any lasting impact of course elements and format could be examined through longitudinal studies of participants, including follow-up interviews and observations. Not only would this follow-up help measure the long-term effects of the online science methods format, but it would also align with recommendations to monitor teacher self-efficacy over time (Pruski et al., 2013) and, ultimately, evidence of student learning (Guskey, 2000; Moreno & Tharp, 2006; Shidler, 2009).

After studying both elementary and secondary PSTs, Woodcock (2011) reiterated that preparation programs must feature both on-campus coursework and school-based experiences with explicit attention to self-efficacy. The inclusion of online class instruction in this mix can provide additional opportunities to increase preservice self-efficacy. An online format can promote self-efficacy through mastery experiences for future teachers, namely doing at-home inquiry activities and writing inquiry-based lesson plans. Additional use of text and video materials, surveys and assessments, discussion board conversations, and other reflection exercises can further support participants’ learning and self-efficacy.

Teacher educators must give careful consideration to the prompts and tasks used to guide thought and reflection, which can enhance emotional and persuasive self-efficacy treatments toward a level of personal mastery. Through this guidance, teacher educators can also model explicit instruction with purposeful questioning and support, as opposed to lecture or overbearing control. Such examples of inquiry teaching and learning are beneficial for both onsite and online environments.     

References

Annetta, L. A., & Shymansky, J. A. (2006). Investigating science learning for rural elementary school teachers in a professional development project through three distance education strategies. Journal of Research in Science Teaching, 43(10), 1019-1039.

Ashton, P. T., & Webb, R. B. (1986). Making a difference: Teacher efficacy and student achievement (Monogram). Longman.

Australian Academy of Science. (2016). Primary connections: Linking science with literacy. https://www.youtube.com/channel/UCmx-UK7n-78qwCi4tnJYdDQ.   

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84, 191-215.

Bandura, A. (1994). Self-efficacy. In V.S. Ramachaudran (Ed.), Encyclopedia of human behavior (Vol. 4, pp. 71-81). Academic Press.

Bandura, A. (1997). Self-efficacy: The exercise of control. W.H. Freeman.

Biological Sciences Curriculum Study. (2012). BSCS 5E instructional model.  https://www.youtube.com/watch?v=Is7a3nkoe-g.  

Bleicher, R. E. (2004). Revisiting the STEBI-B: Measuring self-efficacy in preservice elementary teachers. School Science and Mathematics, 104, 383-391.

Bleicher, R. E., & Lindgren, J. (2005). Success in science learning and preservice science teaching self-efficacy. Journal of Science Teacher Education, 16, 205–225.

Brown, P., & Abell, S. (2007). Examining the learning cycle. Science & Children, 46, 58-59.

Burns, R. B. (2000). Introduction to research methods. Pearson Education.

Bybee, R. W. (Ed.) (2002). Learning science and the science of learning. NSTA Press.

Bybee, R. W., Taylor, J., Gardner, A., Van Scotter, P., Powell, J., Westbrook, A., & Landes, N. (2006). The BSCS 5E instructional model: origins, effectiveness, and applications. BSCS.

Cannon, J. R., & Scharmann, L. C. (1996). Influence of a cooperative early experience on preservice elementary teachers’ science self-efficacy. Science Education, 80, 419-436.

Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Erlbaum.

Colon, E. L. (2010). Teacher candidates in an online post-baccalaureate science methods course: Implications for teaching science inquiry with technology. Unpublished dissertation. University of Hawaii at Manoa.

Cone, N. (2009). Preservice elementary teachers’ self-efficacy beliefs about equitable science teaching: Does service learning make a difference? Journal of Elementary Science Education, 21(2), 25-34.

Contant, T. L., Bass, J. L., Tweed, A. A., & Carin, A. A. (2018). Teaching science through inquiry-based instruction (13th ed.). Pearson.

Crawford, B. A. (2000). Embracing the essence of inquiry: New roles for science teachers. Journal of Research in Science Teaching, 37(9), 916-937.

Davis, K. S., & Zhang, H. (2013, January 9-12). Using a virtual community of practice to improve the quality of high school teachers’ inquiry-based instruction [Paper presentation]. Annual meeting of the Association for Science Teacher Education. Charleston, SC.

Deehan, J. (2017). The Science Teaching Efficacy Belief Instruments (STEBI A and B): A comprehensive review of methods and findings from 25 years of science education research. (Springer Briefs in Education). Springer.

Denzin, N., & Lincoln, Y. S. (Eds.) (2005). Handbook of qualitative research (3rd ed.). Sage.

Durgunoglu, Y., & Hughes, T. (2010). How prepared are the U.S. pre-service teachers to teach English language learners? International Journal of Teaching and Learning in Higher Education, 22(1), 32-41.

Enochs, L. G., & Riggs, I. M. (1990). Further development of an elementary science teaching efficacy belief instrument: A preservice elementary scale. School Science and Mathematics, 90(8), 694-706

Enochs, L. G., Scharmann, L. C., & Riggs, I. M. (1995). The relationship of pupil control to preservice elementary science teaching self-efficacy and outcome expectancy. Science Teacher Education, 79, 3-75.

Esterberg, K. (2002). Qualitative methods in social research. McGraw-Hill.

Everett, S., & Moyer, R. (2007). “Inquirize” your teaching. Science and Children, 44(7), 54-57.

Fulton, L., & Yoshioka, J. (2017, March 30-April 2). An online science methods course: Successes and challenges [Paper presentation]. National Science Teachers Association Conference. Los Angeles, CA.

Gibson, S., & Dembo, M. (1984). Teacher efficacy: A construct validation. Journal of Educational Psychology, 76, 543-578.

Goldenberg, L. B., Culp, K. M., Clements, M., Pasquale, M., & Anderson, A. (2014). Online professional development for high school biology teachers: Effects on teachers’ and students’ knowledge. Journal of Technology and Teacher Education, 22(3), 287-309.

González-Espada, W. J. (2009). Preservice teacher education online: Student opinions from a science methods course. Journal of the Kentucky Academy of Science, 70(1), 84-93.

Guskey, T. R. (2000). Evaluating professional development. Corwin Press.

Harry, B., Sturgis, K. M., & Klinger, J. K. (2005). Mapping the process: An exemplar of process and challenges in grounded theory analysis. Educational Researcher, 34(2), 3-13.

Henson, R. K. (2001, January). Teaching self-efficacy: Substantive implications and measurement dilemmas [Paper presentation]. Annual meeting of the Educational Research Exchange, College Station, TX.

Herbert, S., Campbell, C., & Loong, E. (2016). Online professional learning for rural teachers of mathematics and science. Australasian Journal of Educational Technology, 32(2), 99-114.

Hodges, C.B., Gale, J., & Meng, A. (2016). Teacher self-efficacy during the implementation of a problem-based science curriculum. Contemporary Issues in Technology & Teacher Education, 16(4). https://citejournal.org/volume-16/issue-4-16/science/teacher-self-efficacy-during-the-implementation-of-a-problem-based-science-curriculum/.

Hollands, F. M., & Tirthali, D. (2014). MOOCs: Expectations and reality. Full report. Center for Benefit-Cost Studies of Education, Teachers College, Columbia University. http://cbcse.org/wordpress/wp-content/uploads/2014/05/MOOCs_Expectations_and_Reality.pdf

Hoy, W. K., & Woolfolk, A. E. (1990). Socialization of student teachers. American Educational Research Journal, 27, 279-300.

Ingber, J., Freking, F., & Maddox, A. (2014, January 15-18). How does participation in an online community of science teacher practice impact performance on embedded program assessments? [Paper presentation]. Annual meeting of the Association for Science Teacher Education, San Antonio, TX.

Institute of Education Sciences. (2018). Digest of education statistics. Table 311.22. Number and percentage of undergraduate students enrolled in distance education or online classes and degree programs. National Center for Education Statistics. https://nces.ed.gov/programs/digest/d18/tables/dt18_311.22.asp.

Kern, C. L. (2013, January 9-12). The impact of an online secondary science method course on preservice teachers’ efficacy, beliefs, and perceptions of teaching science [Paper presentation]. Annual meeting of the Association for Science Teacher Education, Charleston, SC.

Koballa, T. R., Jr., & Crawley, F. E. (1985). The influence of attitude on science teaching and learning. School Science and Mathematics, 85, 222-232.

Kokoc, M., Ozlu, A., Cimer, A., & Karal, H. (2011). Teachers’ views on the potential use of online in-service education and training activities. Turkish Online Journal of Distance Education, 12(4), 68-87.

McFadden, J. R. (2013, January 9-12). Why can’t teachers work in the cloud: An examination of science teacher online professional development using Ning [Paper presentation]. Annual meeting of the Association for Science Teacher Education, Charleston, SC.

McKeone, D. (1995). Measuring your media profile. Gower Publishing Company.

Means, B., Bakia, M., & Murphy, R. (2014). Learning online: What research tells us about whether, when and how. Routledge.

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online-learning studies. U.S. Department of Education.

Miller, K. W. (2008). Teaching science methods online: Myths about inquiry-based online learning. Science Educator, 17(2), 80-86.

Moreno, N. P., & Tharp, B. Z. (2006). How do students learn science? In J. Rhoton & P. Shane (Eds.), Teaching science in the 21st century (pp. 291-305). NSTA Press.

Morrell, P., & Carroll, J. (2003). An extended examination of preservice elementary teacher’s science teaching self-efficacy. School Science and Mathematics, 103, 246–251.

Mulholland, J., Dorman, J. P., & Odgers, B. M. (2004). Assessment of science teaching efficacy on preservice teachers in an Australian university. Journal of Science Teacher Education, 15(4), 313-331.

NGSS Lead States. (2013). Next generation science standards: For states, by states. National Academies Press.

Norris, N., & Walker, R. (2005). Naturalistic inquiry. In B. Somekh & C. Lewin (Eds.), Research methods in the social sciences (pp. 131–137). Sage Publications.

Palmer, D. H. (2006a). Durability of changes in self-efficacy in a science methods course for primary teacher education students. International Journal of Science Education, 28, 655–671.

Palmer, D. H. (2006b). Sources of self-efficacy in a science methods course for primary teacher education students. Research in Science Education, 36, 337-353.

Pajares, F. (1996). Self-efficacy beliefs in academic settings. Review of Educational Research, 66, 543-578.

Pfitzner-Eden, F. (2016). Why do I feel more confident? Bandura’s sources predict preservice teachers’ latent changes in teacher self-efficacy. Frontiers in Psychology, 7, 1486. https://doi.org/10.3389/fpsyg.2016.01486

Pomerantz, J., & Brooks, D. C. (2017). ECAR study of faculty and information technology (Research report). ECAR.

Pope, M. (2012, March 5). Developing an online science methods course for an elementary education teacher program: Getting it right the second time? Top ten tips [Paper presentation]. Society for Information Technology and Teacher Education Conference, Austin, TX.

Pruski, L. A., Blanco, S. L., Riggs, R. A., Grimes, K. K., Fordtran, C. W., Barbola, G. M., Cornell, J. E., & Lichtenstein, M. J. (2013). Construct validation of the Self-Efficacy Teaching and Knowledge Instrument for Science Teachers-Revised (SETAKIST-R): Lessons learned. Journal of Science Teacher Education, 24, 1133-1156.

Randle, D. E. (2013). An analysis of interactions and outcomes associated with an online professional development course for science teachers. Unpublished dissertation. Columbia University.

Reichardt, C. S., & Rallis, S. F. (1994). The relationship between the qualitative and quantitative research traditions. New Directions for Program Evaluation, 61, 5-11.

Rice, D. C. & Roychoudhury, A. (2003). Preparing more confident preservice elementary science teachers: One elementary science methods teacher’s self-study. Journal of Science Teacher Education, 14(2), 97-126.

Riga, R., Winterbottom, M., Harris, E., & Newby, L. (2017). Inquiry-based science education. In K.S. Taber & B. Akpan (Eds.), Science education: New directions in mathematics and science education (pp. 247-262). Rotterdam.

Roddy, C., Amiet, D. L., Chung, J., Holt, C., Shaw, L., McKenzie, S., Garivaldis, F., Lodge, J. M., & Mundy, M. E. (2017). Applying best practice online learning, teaching, and support to intensive online environments: An integrative review. Frontiers in Education, 2, 59. https://doi.org/10.3389/feduc.2017.00059

Rodriguez, S., Allen, K., Harron, J., & Qadri, S. A. (2019). Making and the 5E learning cycle. The Science Teacher, 86(5), 48-55.

Roehrig, G. H., Donna, J., Billington, B., & Hoelscher, M. (2013, January 9-12). The Teacher Induction Network: Providing continued induction support to teachers during their first years of teaching [Paper presentation]. Annual meeting of the Association for Science Teacher Education, Charleston, SC.

Settlage, J. (2000). Understanding the learning cycle: Influences on abilities to embrace the approach by preservice elementary school teachers. Science Education, 84, 43-50.

Seung, E., Park, S., & Lee, M. (2019). The impact of a summer camp-based science methods course on preservice teachers’ self-efficacy in teaching science as inquiry. Journal of Science Teacher Education, 30(8), 872-889.

Shidler, L. (2009). The impact of time spent coaching for teacher efficacy on student achievement. Early Childhood Education, 36, 453-460.

Shiland, T. W. (1997). Decookbook it! Science and Children, 35(3), 14-18.

Stripling, C., Rickets, J., Roberts, T., & Harlin, J. (2008). Pre-service agricultural education teachers’ sense of teaching self-efficacy. Journal of Agricultural Education, 49(4), 120-130.

Tosun, T. (2000). The impact of prior science course experience and achievement on the science teaching self-efficacy of preservice elementary teachers. Journal of Elementary Science Education, 12(2), 21-31.

Tschannen-Moran, M., Hoy, A. W., & Hoy, W. K. (1998). Teacher efficacy: Its meaning and measure. Review of Educational Research, 68, 202-248.

Vanides, J. (2007). Online professional development that works. Learning & Leading with Technology, 34(8), 10-14.

Wade, R. C. (1995). Developing active citizens: Community service learning in social studies teacher education. Social Studies, 86(3), 122-128.

Waldrop, M. M. (2013). The virtual lab. Nature, 499, 268-270.

Walker, C. L., & Shore, B. M. (2015, October-December). Understanding classroom roles in inquiry education: Linking role theory and social constructivism to the concept of role diversification. SAGE Open, 1-13. https://journals.sagepub.com/doi/pdf/10.1177/2158244015607584.

Waters, J. J., & Ginns, I. S. (2000). Developing motivation to teach elementary science: Effect of collaborative and authentic learning practices in preservice education. Journal of Science Teacher Education, 22(4), 301-321.

Watkins, J., Jaber, L. Z., & Dini, V. (2020). Facilitating scientific engagement online: Responsive teaching in a science professional development program. Journal of Science Teacher Education, 31(5), 515-536.

Wolters, C. A., & Daugherty, S. G. (2007). Goal structures and teachers’ sense of efficacy: Their relation and association to teaching experience and academic level. Journal of Educational Psychology, 99(1), 181-193.

Wong, S. S., Firestone, J. B., Ronduen, L. G., & Bang, E. J. (2016). Middle school science and mathematics teachers’ conceptions of the nature of science: A one-year study on the effects of explicit and reflective online instruction. International Journal of Research in Education and Science, 2(2), 469-482.

Woodcock, S. (2011). A cross sectional study of pre-service teacher efficacy throughout the training years. Australian Journal of Teacher Education, 36(10), 23-34.


Appendix A
Overview of Sample Course Sequence, Topics, Assignments

Each module lasted 2 weeks.

Module 1
Doing Science / Science Standards

  • “Can I See Myself?” activity
  • “Float Your Boat” activity
  • What is science / How scientists do science
  • What are: Technology, Engineering, and Math
  • Children and play
  • Observations vs inference
  • Next Generation Science Standards (NGSS)

Module 2
Inquiry-Based Learning

  • Teaching through inquiry (Teaching science the way scientists do science)
  • The 5E Model for Inquiry-Based Learning

Module 3
Developing an Inquiry-Based Learning Environment

  • Review examples of lesson plans
  • Creating a classroom environment that is conducive to science discovery and safety
  • Questioning strategies
  • Inquiry-Based “5E” Science Lesson Plan due

Module 4
Science Assessment, Integration, and Accessibility

  • Use of a science notebook
  • Student misconceptions of science
  • Integration of a science lesson into another subject area.
  • Assessing science learning
  • Making science accessible for all learners

Appendix B
Detailed Summary of Learner Outcomes, Tasks, and Resources


Appendix C
STEBI-B—Science Teaching Efficacy Belief Instrument for Preservice Teachers (Bleicher, 2004; Enochs & Riggs, 1990)
