Veal, W., Brantley, J., & Zulli, R. (2004). Developing an online geology course for preservice and inservice teachers: Enhancements for online learning. Contemporary Issues in Technology and Teacher Education [Online serial], 3(4).

Developing an Online Geology Course for Preservice and Inservice Teachers: Enhancements for Online Learning

by William Veal, University of North Carolina at Chapel Hill; John Brantley, University of North Carolina at Chapel Hill; & Rebecca Zulli, North Carolina Education Research Council

As more and more courses are being offered online, the most critical issues facing educators include identifying how to increase the educational value of online courses and how to enhance instruction. While researchers have long investigated and developed best practices for classroom-based instruction, much less is known about how to best deliver instruction via the Internet (Kozma, 1999). Simply importing existing classroom-based models of instruction to an online format is not appropriate; likewise, the incorporation of current conceptions regarding what makes effective printed materials or video is not sufficient.

The purpose of this study is to explore previously identified instructional quality indicators or enhancements for classroom-based courses and evaluate improvements in effectiveness, usefulness, competency, assessment, communication, and achievement resulting from the incorporation of instructional quality indicators in an online course. For the purpose of this study, enhancements are defined as instructional processes related to general pedagogy and assessment and include such items as graphical and advance organizers, communication tools, and assessment formats.

This paper incorporates a systematic program of interlocking research that used a “design study approach” (Linn, 2000), in which the course was offered and studied in two consecutive summers as Phases 1 and 2, respectively. The results of inquiry from Phase 1 informed and shaped the design of Phase 2. This study proposes a methodology for educators and instructors who are planning and developing online instruction or are already teaching courses online. Online teaching and design benchmarks were developed from the literature to guide the design and evaluation of the online science course.

Literature Review

Distance education has been defined as “any form of learning that does not involve the traditional classroom setting in which student and instructor are in the same location at the same time” (Ko & Rossen, 2001, p. 313). Online courses are one example of distance education. In any form of learning, there are often agreed-upon elements for how instruction should occur. For example, The California Distance Learning Project in 1997 (as cited in Palloff & Pratt, 1999) defined key elements for distance education, which can also be applied to online courses: the separation of the teacher and learner in space and time, the availability of two-way communication, and the volitional control of learning by the students.

Other organizations (i.e., Concord Consortium) and people have developed guides for effective online teaching. For example, Elbaum, McIntyre, and Smith (2002) have published a list of 17 “essential elements” for preparing, designing, and teaching an online course.

Online Benchmarks

Studies of the benefits of online course elements or enhancements suggest that a variety of techniques and strategies can be effective with teachers in training to enhance their own learning (Phipps & Merisotis, 1999; Sonner, 1999; Tucker, 2001). Among online enhancements that are judged effective, students (hereafter referred to as “teachers” for the purpose of this study) may have preferences for features they judge to be useful for their own teaching (Pringle, 2002; Rivera & Rice, 2002). Thus, for teachers in training, effectiveness and usefulness of instructional enhancements are two related judgments with potential significance for further research on learning, teaching, and design. An additional consideration is the extent to which teachers claim to actually use enhancements that are effective, useful, or both. This line of research might be considered an extension of the perception literature (e.g., O’Malley, 1999), but does not include the transfer of knowledge by a population to a different setting (e.g., teachers learning and implementing technology or content knowledge in their classrooms).

Chickering and Gamson (1991) provided a template of benchmarks for designing traditional educational learning environments across the full spectrum of schooling, including online instruction: encourage student-faculty contact, encourage cooperation among students, encourage active learning, give prompt feedback, emphasize time on task, communicate high expectations, and respect diverse talents and ways of learning. Table 1 displays a comparison among different benchmarks for quality instruction from published research articles on online teaching and learning (Hannum, 1999; Phipps & Merisotis, 2000; Ragan, 1999).

Table 1.  Comparison of Online Instructional Benchmarks

Hannum (1999) | Phipps & Merisotis (2000) | Ragan (1999)
Encourages student-instructor contact | Institutional support | Learning goals and content presentation
Encourages cooperation among students | Course development | Interactions
Encourages active learning | Teaching/learning | Assessment and measurement
Gives prompt feedback | Course structure | Instructional media and tools
Emphasizes time on task | Student support | Learner support and services
Communicates high expectations | Faculty support |
Respects diverse talents and ways of learning | Evaluation and assessment |


The columns represent the different benchmarks. Overlap among the different benchmarks is indicative of online teaching, learning, course design, communication, and assessment. Assistance for faculty members to design and teach an online course, guidelines for technical help, and advice and information for students to ease their learning situations are institutional responsibilities that make online teaching and learning possible. Teaching and learning contains the types of active situations and the instructional media and tools needed for online delivery. Course design includes the development and structure of an online course. Communication comprises an interaction between and among course personnel. Assessment consists of the evaluation and types of measurements used in an online course.

Additional benchmarks can be found at instructional technology centers of different universities and colleges around the nation (e.g., Southern Regional Education Board Evalutech, 2001; University of Houston-Clear Lake, 2002) and in books (e.g., Elbaum et al., 2002; Ko & Rossen, 2001; Weiss, Knowlton, & Speck, 2000).

Some believe that basic quality assurance principles from traditional teaching apply equally to online learning. For example, Jonassen and Hannum (1987) derived from traditional models of quality teaching four basic dimensions of learning from the research literature for guidance and understanding in “web-based” teaching, which can also be applied to online courses: design of the stimulus, learner responses, feedback, and lesson control. There are many practical guides and lists of suggestions for implementing online instruction (e.g., Phipps & Merisotis, 2000; Southern Regional Education Board Evalutech, 2001) that focus on various aspects of teaching and learning.

Other principles have emerged specifically around online learning. Hannum (1999) outlined seven broad models or delivery systems of “web-based” learning for online courses; WBI, Library, Textbook, Interactive Instruction, Computer Mediated Communications, Hybrid WBI, and Virtual Classroom. Instructors and course designers alike have drawn understanding from the strengths and weaknesses of these different models.

In two reports, Maki, Maki, Patterson, and Whittaker (2000) and Maki and Maki (2002) studied consecutive cohorts of students over two years taking introductory general psychology in two instructional formats: traditional classroom lecture versus online. They sought to identify characteristics that would enable predictions of success, satisfaction, and students’ “multimedia comprehension.” Students were also asked to rate themselves on their skill in doing these tasks. Overall, students preferred the traditional lecture format, credited tentatively to instructor enthusiasm and coaching for exams. Despite the lower preference for online instruction, the web-based components were rated highly, and Maki and Maki (2002) declared that “further research [needs] to determine the characteristics of the highly satisfied student.”

Effectiveness of Online Learning

The effectiveness of online teaching and learning has been reviewed in the literature with similar conclusions (e.g., Russell, 1999; Swan, 2002). For example, Russell (1999) reviewed 355 research reports, articles, and summaries from the past 25 years and found no significant difference between distance education and traditional instruction on a variety of criteria. Some of these criteria are reflected in the quality design and teaching benchmarks mentioned earlier.

Other articles, books, and series usually suggest effective teaching elements, enhancements, or principles for implementing effective online teaching and learning. For example, Graham, Cagiltay, Lim, Craner, and Duffy (2001) discussed the use of Chickering and Gamson’s (1991) seven principles of effective teaching in evaluating an online university course, though these principles had not previously been applied in a published evaluation of a course.

Several publishers have released series of edited books that highlight principles or elements of online teaching. For example, Weiss et al. (2000) published an edited book that discussed the teaching and design of online courses. In another example, the Sloan Foundation has published a series of books entitled Elements of Quality Online Education. This series focuses on learning effectiveness, faculty satisfaction, cost effectiveness, and student satisfaction.

The most recent book in the series, Practice and Direction, Volume 4 (Bourne & Moore, 2002), continues along the same themes but presents a review of research to support claims for online education. For example, Swan (2002) reviewed the research and determined that outcomes from face-to-face and online learning are similar.

In the current study, a graduate online geology course for preservice and in-service science teachers served as the context for exploring the interactions and learning within a community of learners. Although there have been articles espousing guidelines for developing online science courses (e.g., Murphy, 2001; Patterson, 2000), few studies have examined the effectiveness and content of online science courses (e.g., Burke & Greenbowe, 1999; Shih, Howard, & Thompson, 2000).

Even fewer studies were found that involved science instruction with in-service and/or preservice teachers (e.g., Murphy, 2001). The quality indicators in the online geology course were translated into 10 enhancements that were tested and then implemented. Table 2 contains the dimensions of instruction used in the online course development. In the right-hand column, active links show how these dimensions were incorporated into the online design of the geology course.


Table 2. Quality Considerations in Distance Learning

Dimensions of the Course Varied Through Instruction | Enhancement Variables Considered in the Present Study
Course content | (1) Advance organizers; (2) Conceptual questions; (3) Class objectives; (4) External websites as content sources
Instructional delivery, the learning environment, cooperative and collaborative learning | (5) Visual imagery reminders of linked websites; (6) Graphical organizers; (7) In-text links to websites; (8) Flow-chart elements distributed through text
Feedback, assessment, and evaluation | (9) Section reviews of conceptual questions, objectives, and concept maps; (10) Daily quizzes

Theoretical Framework

Socioconstructivist and postmodern theories of learning reflect the social nature of knowledge and the notion that humans construct their own interpretations of phenomena that are developed through interaction (Piaget, 1973; Vygotsky, 1978). With the increased emphasis on communally constructed knowledge, the development and implementation of online courses have necessitated the building of a community of learners (McDonald, 2002) and social presence (Swan, 2002).

Social constructivists believe that knowledge is not constructed within a person; rather, knowledge is constructed outside of the person in social situations. Vygotsky (1987) believed that people negotiated and renegotiated thoughts, ideas, and language with one another so that meanings could be understood by those involved. Social factors influence learning, rather than the individual’s prior experiences (Vygotsky, 1978). Social constructivists believe that interpersonal relations are located outside of the individual before they are internalized. People within a community construct meaning so that dialogue and shared meaning can exist. The interactions and understandings are the basis for a shared culture (Shotter, 1992).

Social constructivism “posits that learners actively create knowledge and meaning through experimentation, exploration, and the manipulation and testing of ideas in reality” (Palloff & Pratt, 1999, p.16). Interaction and feedback from others assist in determining the accuracy and application of ideas.

In addition, collaboration, group activities, simulations, and open-ended questions facilitate an atmosphere of shared knowledge. Jonassen (1996) noted that the social construction of knowledge and meaning making through interactions within a community of learners is the preferred form of instruction, compared to one that involves teacher intervention and control. This understanding of social constructivism influences how online courses are designed and conducted, eventually resulting in design benchmarks for online teaching.

Connecting Communities of Learners (CCL; Tobin, 1997) embraces social constructivism and gives consideration to the social context and environment of learners and their individual needs. CCL is an approach to teaching that uses an online management system (i.e., WebCT or Blackboard) to organize a set of online activities. The CCL approach has been used to study preservice teachers during their development (e.g., Goh & Tobin, 1999). This approach has also been found to facilitate students’ engagement with the online curriculum (Tobin, 1998). In essence, the online management system allows instructors to develop and manage a course while incorporating multiple communication and multimedia applications to ultimately enhance learning.

The online environment is used to promote coparticipation and maximize learning within a community. Certain functions or attributes of the online course may enable students to share knowledge and communicate with one another. This interaction is different from that of the traditional face-to-face classroom with its controlled interactions and boundaries within time and space (Jonassen, 1996). This framework was couched in learner theory and tied directly to instructional practice and engaged curriculum development to enhance different functions of online communities.

The development and implementation of the teaching and learning enhancements in the online geology course reflected the principles of CCL. All participants were able to draw on cultural understandings and use them for the learning of content by communicating with one another about content, pedagogical, and technological issues.

The benefit of using socioconstructivist theory and CCL as theoretical frameworks for this study is highlighted by the integration of theory and application. The dimensions of instruction in Table 2 are the platform upon which online courses should be developed. Along similar lines, the theory used to construct the platform is based upon principles emphasizing communication, students’ social construction of knowledge, and collaborative interaction between students and an instructor. The application entails the implementation of design, teaching, and assessment to an online course.

Students who enroll in distance learning classes are likely to vary in their prior exposure to online learning. Some have only known traditional lectures; others may have taken numerous online courses. It is now commonplace for both traditional and online courses to present information in a variety of ways, such as lectures, assigned readings, or media enrichment (such as slide shows).

In addition to grades as indicators of learning success, student ratings of satisfaction may be equally important. There is not yet sufficient or systematic knowledge regarding best practices for delivering online content, and undertaking carefully designed, systematic research is essential to establishing what works best in online learning for science (e.g., Linn, 2000; Linn, diSessa, Pea, & Songer, 1994).

There are many studies describing online learning and course development, but few studies explore the “original research dedicated to explaining and predicting phenomena related to distance learning” (Phipps & Merisotis, 1999, p. 2). These few studies conclude that distance education is comparable to classroom-based or face-to-face education (see Russell, 1999). What has not been studied to a significant degree is the impact of the design considerations based upon teaching and learning principles.

Swan (2002) stated, “We know online learning is effective. What we need to know is what makes it good, and how can we make it better?” The research questions for this study were generated from these considerations and from the literature. The first question was answered in Phase 1; the remaining questions were answered in both Phases 1 and 2.

  1. Did teachers learn science content more easily from the information found in the book or on websites?
  2. Did teachers increase their science content knowledge as a result of taking this online course?
  3. To what degree are online features added to course content, pedagogy, and feedback/assessment seen by teachers as effective enhancements?
  4. To what degree are online features added to course content, pedagogy, and feedback/assessment seen by teachers as relatively useful in their own instructional practice?
  5. How did teachers perceive that the online enhancements improved their own instruction from the beginning to the end of the course?
  6. What is the nature of the online communication, and what role does it play in facilitating learning, if any?



In order to answer the research questions, it was determined that a design study approach, as proposed by Linn (2000), was well suited to investigate the impact of curricular decisions from one phase of a study (or course) to the next. The research design examined the same online geology course over two consecutive summers using two different groups of participants. Phase 1 was a pilot for the inclusion of enhancements in an online geology course following a specific 2×4 research design involving two groups and four treatment conditions.

Phase 2 focused on the implementation of the enhancements based upon the results from Phase 1. The “qualitative-quantitative philosophy of educational research” was used to emphasize that theory development was a continuous process throughout the research (Newman & Benz, 1998). The “qualitative-quantitative continuum” is a series of self-correcting feedback loops that operates to enhance the types of data being collected and used. Thus, each new item learned from Phase 1 was implemented or altered for Phase 2.

Self-checks were incorporated throughout the overall study to delete, add, and amend research questions. Along the lines of the No Child Left Behind legislation, this mode of “scientific inquiry” is validated by the verification methods derived from qualitative (inductive) and quantitative (deductive) perspectives within the continuum’s feedback loops. The findings of Phase 1 informed the changes made to the course and curriculum, which were subsequently researched in Phase 2.


The overall study design incorporated two phases conducted over two consecutive summers. Phase 1 initially included 30 participants, but due to technology problems, relocation, and lack of course completion, only 16 ultimately completed the course during the first summer and were included in the final count of participants. Phase 2 consisted of research with 18 participants who completed the course the subsequent summer. The participants were a mix of preservice (12) and in-service (22) secondary (6-12) science teachers. The preservice teachers had a median age of 25, while the in-service teachers’ median age was 35.

The preservice teachers took the course toward partial fulfillment of an initial teaching license and completed it on campus or at their homes via the Internet without meeting face to face. The in-service teachers completed the course via the Internet from their homes, located anywhere from 10 to 150 miles from campus, and took the course to increase their content and technology knowledge while also earning continuing education and technology units for recertification.

Context of the Online Course

The online geology course was designed to follow the seven themes of Earth and environmental sciences found in the North Carolina Standard Course of Study (NC-SCOS): lithosphere, tectonic processes, origin and evolution of the Earth, hydrosphere, atmosphere, solar system, and environmental stewardship. The course had 20 “classes” divided into four units. Each class contained an overview that included the NC-SCOS alignment table and a list of objectives, an introductory summary of concepts, a concept map, and three to five web pages containing content and links to external websites.

Due to the online nature of the course, it was decided in the development stage that external websites would be used for the majority of the content, since the power and content of the Internet were to be exploited. The creators also did not want teachers merely to read a book and comment online.

Teachers were required to complete different assignments for their grade. In both phases, teachers were paired with a partner to develop and submit a lesson plan using technology. In addition, discussion groups comprising four to five teachers were formed for online discussions about topics ranging from technology to any teacher-initiated topic on geology. Teachers also completed a web evaluation in which two websites (one from the online course and one of the teacher’s choice) were evaluated for their effectiveness in helping teachers understand, learn, and teach content. Unit quizzes were given at the end of each unit. Daily quizzes were given only in Phase 1 during select treatments.

Over 300 external websites were used to supplement and extend the content of the accompanying textbook (The Blue Planet by Skinner, Porter, & Botkin, 1999). Within each class, students read a short description of the content and then referred to external websites to explore the content in more depth. The external websites were chosen for their comprehensive or specific content, interactive use of simulations and videos, the quality of their lesson plans, pedagogical aspects (readability, organization, and degree of content difficulty), the ability to add knowledge to content not found in the book, or the quality of their diagrams and pictures.

A panel consisting of a science educator, a geologist, and five graduate students chose and evaluated the external websites. All communication and discussion occurred asynchronously using the course management software (Blackboard). For a more detailed discussion of the course, see Veal, Kubasko, and Fullagar (2002).

While any or all of the benchmarks in Table 1 may be critical to the quality of learning and instruction provided in distance learning, features in three categories were applied in the course design in the current study (Table 2): (a) course content, development, and structure; (b) pedagogical aspects of instruction; and (c) feedback, assessment, and evaluation.

Enhancements of course structure were provided through the presence or absence of overall graphical organizers, conceptual questions, class objectives stated at the beginning of each lesson, and external websites as content sources. Enhancements of instructional delivery were provided by visual imagery reminders of linked websites, graphical organizers (concept maps), links to websites embedded in text, and flow-chart elements distributed throughout text. Feedback, assessment and evaluation enhancements included section reviews of conceptual questions, objectives and concept maps, and daily quizzes.

Research Design

The research design was based upon the “qualitative-quantitative continuum” and the “design study approach” described in detail earlier. The study described in this paper took place in two phases, whereby research was conducted with participants taking the online geology course in the first summer it was offered, followed by a second phase in which results and recommendations learned from the first phase were implemented. Phase 1 was designed to evaluate the implementation of quality online enhancements. Phase 2 evaluated the effectiveness, usefulness, and competency of the implemented enhancements.

Phase 1. This group of 16 teachers, 3 preservice and 13 in-service, was presented with an original text version and the enhanced version of the course. The original version of the course was a textual presentation of content and descriptions of linked websites. The enhanced version contained the quality indicators mentioned previously. The research design is shown in Table 3.

Both groups started with “normal instruction,” which was the current online text material contained in the course and based upon the initial designers’ perceptions of best practices. Once a baseline was established, half the sample received the “enhanced instruction” (Group A), while the other half continued to receive the “normal instruction” (Group B) for the next unit. The order switched for the third unit, and for the final unit all participants received “enhanced instruction.” For example, daily quizzes were given only in the “enhanced” version, since the literature identified them as an educational enhancement for learning.

Table 3.  Research Design for Phase 1

Unit | Classes | NC-SCOS Topic | Group A | Group B
1 | 4 | Lithosphere | Normal | Normal
2 | 6 | Tectonic processes; Origin and evolution of the Earth | Enhanced | Normal
3 | 6 | Hydrosphere; Atmosphere | Normal | Enhanced
4 | 4 | Solar System; Environmental Stewardship | Enhanced | Enhanced

Phase 2. The second phase of the study included 18 teachers, 9 in-service and 9 preservice teachers, who completed the online geology course that integrated the enhancements. The enhancements were previously developed and incorporated into the design of the course for Phase 2. The entire course was enhanced with concept maps, conceptual questions, hyperlinks, visual reminders of links, references to the NC-SCOS, and graphic organizers. There was no separation of teachers into groups for this phase.

Data Sources

Multiple data sources were used to triangulate interpretations. No one source answered all the research questions. Together, the data sources illustrated how learning occurred and to what degree the enhancements helped participants learn science content. Table 4 shows the research questions, data sources, and analysis techniques.


Three surveys were used for this study. The first survey, entitled the Effectiveness and Usefulness Survey (Appendix A), asked participants how effective the enhancements were in their learning of content. A Likert scale ranging from 1 to 4 (feature was a distracter to feature was very helpful) was developed for the effectiveness component. A second component of the survey asked how the participants’ knowledge of enhancements might be useful in their teaching or in their classrooms. A Likert scale ranging from 1 to 4 (never use this feature to use this in most of my lessons) was developed. There was a 73% response rate.

The second survey entitled Competency Skills (Appendix B) sought to measure teachers’ perceived competency in content, pedagogy, technology, and communication. A Likert type scale ranging from 1 to 5 (low competency level to high competency level) was used to measure the change in teachers’ perceptions on these four areas since the beginning of the course. In addition, four open-ended questions were asked about science knowledge, online learning, and course improvement. There was a 100% response rate to this survey.
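Likert responses like these are typically summarized with descriptive statistics per group. The sketch below shows one plausible way to do so; the enhancement names, groups, and ratings are hypothetical illustrations, not data from the study:

```python
from statistics import mean

# Hypothetical 1-4 effectiveness ratings keyed by enhancement and
# teacher type (all names and values are illustrative only).
ratings = {
    "concept maps":  {"preservice": [4, 3, 4], "inservice": [3, 3, 2]},
    "daily quizzes": {"preservice": [3, 4, 4], "inservice": [4, 4, 3]},
}

def summarize(ratings):
    """Mean rating per enhancement for each teacher group."""
    return {
        feature: {group: round(mean(vals), 2) for group, vals in groups.items()}
        for feature, groups in ratings.items()
    }

summary = summarize(ratings)
# e.g., summary["concept maps"] compares preservice vs. in-service means
```

Comparing the two group means per enhancement mirrors the “preservice vs. inservice teacher” breakdown listed in the analysis column of Table 4.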

Table 4.  Data Sources and Instruments to Answer Research Questions

Research Question | Data Source/Instrument | Analysis
1. Did teachers learn science content more easily from the information found in the book or on websites? | Daily quizzes | t-test
2. Did teachers increase their science content knowledge as a result of taking this online course? | Pretest, posttest | t-test
3. To what degree are online features added to course content, pedagogy, and feedback/assessment seen by teachers as effective enhancements? | Teacher post survey of Enhancement Effectiveness | Qualitative with descriptive statistics (preservice vs. in-service teachers)
4. To what degree are online features added to course content, pedagogy, and feedback/assessment seen by teachers as relatively useful in their own instructional practice? | Teachers’ reflective pre and post survey of Enhancement Usefulness | Qualitative with descriptive statistics (preservice vs. in-service teachers)
5. How did teachers perceive that the online enhancements improved their own instruction from the beginning to the end of the course? | Teacher post survey of Perceived Competency (MSEN survey) | Qualitative with descriptive statistics
6. What is the nature of the online communication, and what role does it play in facilitating learning, if any? | Discussion board, emails, teacher presentations | Qualitative content analysis

The third survey was developed by the North Carolina Math and Science Education Network (MSEN) for the evaluation of Eisenhower grants. The survey contained seven sections and was used to determine the satisfaction and quality aspects of the course. There was a 100% response rate to this survey. All surveys were administered after the summer course during the fall semester. In October, participants were mailed the surveys with stamped return envelopes. Those who had not returned the surveys by mail were asked to complete them in person at the fall meeting in November. For those teachers who did not attend the fall meeting or did not return the surveys initially, the surveys were mailed a second time.


Daily Quizzes

Sterling (2001) found that “multiple short assessments targeted at student explanations of specific concepts were most helpful in following emerging student understanding” (p. 7). Thus, we developed a series of short, six-question quizzes for the 12 enhanced classes in Phase 1 (see Units 2 and 3 in Table 3). Appendix C contains a sample of quiz questions (organized by topic) developed to match content found in the book and on Internet websites.

The developers chose this structure for several reasons. First, short content assessments would help students understand topics and prepare them for tests. Second, the six questions were based upon content from the textbook and the Internet to determine whether students learned content more readily from one source or the other. Three of the questions covered content found only in the accompanying textbook, and the other three covered content found on the linked websites provided in the online material. All questions were multiple choice.
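Because each quiz yields a book subscore and a web subscore (0-3 correct) for the same student, the natural comparison is a paired one. The sketch below shows the form such a paired t-test takes; the scores are hypothetical, and the study's actual statistical procedure may have differed:

```python
import math
from statistics import mean, stdev

def paired_t(book_scores, web_scores):
    """Paired t statistic for per-student book vs. web subscores.

    Returns (t, degrees of freedom). Assumes at least two students
    and non-identical difference scores (stdev > 0).
    """
    diffs = [b - w for b, w in zip(book_scores, web_scores)]
    n = len(diffs)
    se = stdev(diffs) / math.sqrt(n)  # standard error of the mean difference
    return mean(diffs) / se, n - 1

# Hypothetical subscores (0-3 items correct) for six students
book = [3, 2, 3, 1, 2, 3]
web = [2, 2, 3, 1, 1, 2]
t, df = paired_t(book, web)
# A positive t suggests higher book subscores; compare t against the
# critical value for df to judge significance.
```

Pairing by student removes between-student ability differences from the comparison, which is why a paired rather than independent-samples test fits this quiz design.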

Content Exams

A 40-item, multiple-choice, content exam was developed that followed the topics found in the seven themes of the NC-SCOS. Students took the exam on the first day they entered the online course. The same 40 items were part of the final exam, in which the items were interspersed with other items and the responses were rearranged. All content exams were completed using the online assessment tools found in Blackboard.

Communication Tools

Data from students’ online synchronous and asynchronous discussions and emails were collected. The asynchronous discussions were part of the course requirements, and the synchronous discussions were conducted in the fall as follow-up to the summer course. These fall “virtual chats” were used to determine how the teachers were applying the content learned in the course in their own teaching situations. The emails were from the students to the instructor of the course. Additional qualitative communication data were garnered from a face-to-face meeting in late fall, in which some of the participants shared their successes in using content and ideas from the online course in their teaching. In addition, focus group meetings were held at the fall meeting to discuss issues related to the course design and implementation.

Qualitative Data Analysis

Textual analysis focused on generating etic themes from the instruments (effectiveness, usefulness, learning, and instruction), as well as categories derived from the literature on online learning (institutional and instructor support, design considerations, learning environment, communication, and evaluation). These categories matched the survey themes and concepts from the online distance education literature. The text data were read line by line and coded for units of meaning and understanding. These units were then defined using analytic memos (Strauss, 1987). Using Bogdan and Biklen’s (1992) model of analytic induction, assertions were generated and then compared with the etic themes and literature categories, resulting in hypotheses. These hypotheses were continually formed and tested against subsequent data, as well as across Phases 1 and 2. Patterns in the quantitative data analyses permitted the triangulation of themes and categories.
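The coding workflow described above can be sketched as a simple tally of coded text units against the etic themes, which is one way such counts can then be triangulated with the quantitative patterns. The theme names come from the text; the coded units and any unmatched codes below are illustrative assumptions, not data from the study.

```python
from collections import Counter

# Etic themes from the instruments and the online-learning literature,
# as named in the text above.
ETIC_THEMES = {"effectiveness", "usefulness", "learning", "instruction",
               "support", "design", "environment", "communication", "evaluation"}

def tally_codes(coded_units):
    """Count how often each etic theme was assigned to a unit of meaning.

    coded_units: list of (text_unit, theme) pairs produced by line-by-line
    coding. Units coded outside the etic set are returned separately as
    candidates for new analytic memos.
    """
    counts = Counter()
    uncategorized = []
    for unit, theme in coded_units:
        if theme in ETIC_THEMES:
            counts[theme] += 1
        else:
            uncategorized.append(unit)  # flag for memo writing
    return counts, uncategorized

# Hypothetical coded units, for illustration only.
units = [("I loved the discussion board", "communication"),
         ("I'm a hard-copy person", "learning"),
         ("The links were great resources", "usefulness"),
         ("We met in person in the fall", "logistics")]

counts, leftover = tally_codes(units)
```

The per-theme counts are what would be compared against the survey results when forming and testing hypotheses across phases.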


This section presents data for each phase. Results from Phase 1 were used to modify the course content and format for Phase 2. Some of the survey data were combined from all participants. The presentation of data is tabular, descriptive, and narrative. Within each phase, each individual research question is answered. Looking within the process of online instruction, the data presented offer a variety of achievement, preference, and satisfaction measures that may shed further light upon Maki and Maki’s (2002) dilemma. That is, if the most successful students are not also the most satisfied, are there elements of the distance learning process that can account for this? Maki and Maki, for example, mentioned structure and deadlines as potential deterrents. In the current study, structure is considered a positive attribute, so that additions such as advance organizers, teaching tips, and lesson plans are expected to be “enhancements.” The current study also provides measures of the perceived usefulness of knowledge and skills acquired for the practice of teaching. If predictive of success, the variable of usefulness could also be a significant consideration in the relationship between success and satisfaction.

Phase 1

Research Question 1. The scores for the daily quizzes were averaged over all of the answers. The means and standard deviations of the correct scores for the book and Internet questions are in Table 5. There was a significant difference in the scores for the two types of questions: questions formed from content in the accompanying textbook were answered correctly more often than those based on Internet content (t = 4.078, p < .0001). These results indicate that these teachers learned more effectively from textual information, a conclusion substantiated by some of their comments at the fall focus group meeting. As one teacher stated, “I’m a hard-copy person.”

Other teachers printed out the online material for several reasons. Some used the printed notes as an outline and then filled in more information from the linked websites. “I read most of it on the computer, but I still would print it because I would jot down notes,” another teacher stated. Others printed the material so they could read the content in short segments. (“I could go back and print the days, and I could read them during my free time when I didn’t have a computer.”) Most of the teachers felt that printing the text was more convenient and that it was not related to how they learned. For example, most of them did not feel that learning material online or from a computer screen was difficult; the printed text was just easier.


Table 5:  Comparison of Scores from Daily Quiz Questions based upon Websites vs. Textbook (Research Question 1)

Source of Questions | M | SD | t

*p < .001
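The comparison behind Table 5 can be reproduced with a paired t-statistic, since each teacher answered both textbook-based and Internet-based questions. The formula is the standard one over per-teacher differences, t = mean(d) / (sd(d) / sqrt(n)); the quiz scores below are invented for illustration and do not reproduce the reported t = 4.078.

```python
import math
from statistics import mean, stdev

def paired_t(book_scores, web_scores):
    """Paired t-statistic for per-teacher differences (book minus Internet)."""
    diffs = [b - w for b, w in zip(book_scores, web_scores)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical per-teacher mean scores (0-3 correct per question type).
book = [2.8, 2.5, 3.0, 2.6, 2.9, 2.7]
web = [2.1, 2.0, 2.4, 2.2, 2.3, 2.0]

t = paired_t(book, web)  # positive t: textbook questions answered better
```

A positive t under this convention corresponds to the finding above that textbook-based questions were answered correctly more often than Internet-based ones.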

Research Questions 3 and 4. For Phase 1, an analysis of responses to the Effectiveness and Usefulness Survey indicated that these teachers liked some of the enhancements and showed no preference for others. For example, the teachers loved the external links as resources they eventually used in their own classrooms. (“This course has given me oodles of information about where I can find those technology components [content sites].”)

The content of the course was thorough, followed the NC-SCOS, and was understandable. One teacher integrated 90% of the online content into her teaching, offering that “we have used probably 90% or more of the curriculum from the geology 130.” Another teacher stated, “So I found the content very good and well organized…because it is so hard when you get that earth science book in a high school classroom. I’ve never really liked the organization of how they [textbook] set up the different topics.” Other teachers enjoyed the flow chart: “I also would like to say that I liked the first half of the flow map.”

Research Question 6. The teachers in this course communicated well through online discussions, a virtual discussion, and email exchanges with “buddies” to complete projects and assignments in pairs. Responses included, “I loved the discussion board. I thought it was the most wonderful thing since homemade bread.” The virtual discussion was used as a sounding board to share ideas and contexts for teaching Earth and environmental sciences. Many of the teachers shared ideas about specific lab activities related to the content of the course. For example, one teacher responded that “one Internet research project I give them is where they are to pretend they are travel agents for an intergalactic travel agency and they have to design trips to the planets.”

Other sharing involved the pedagogy of using computers in their teaching: “I would sometimes do ‘center’ work like in kindergarten where there were certain centers that students went through —one of them being my computer. Because of limited time, I gave them specific sites.” Another area of communication was the assignment of buddies. Teachers worked with one other person (assigned by the instructor based upon teaching level) to complete assignments and share ideas. As one explained, “Everything that we did we communicated back and forth and it was just wonderful. I made a new friend that I have never seen.”

Phase 1 was originally set up to test the effectiveness of the online enhancements following the 2×4 research design. Because 8 teachers dropped the course and 6 did not complete it, it was difficult to ascertain specific data and statistical results from this phase. The data collected did inform the research team that the enhancements were effective and useful and that the teachers learned the content best from the material presented in the accompanying textbook. Communication in the form of discussion boards, emails, buddies, and shared assignments was also a vital component of the course.

Results Across Phases 1 and 2

Research Question 2. Paired-sample t-tests were conducted on the pre- and posttests to determine the level of achievement for all 34 participants. As a group, the GEOL 130 teachers made a significant gain in content achievement over the duration of the course (t = -14.300, p < .001), and almost all of them (33 out of 34) improved their scores. Table 6 contains the means, standard deviations, and t-test results for the teachers.


Table 6: Comparison of Pretest and Posttest Content (Research Question 2)

Source of Questions | M | SD | t

*p < .001

Research Question 3. The ratings of effectiveness showed that all of the enhancements were rated “sometimes or very helpful” by all of the 25 respondents (Table 7). The “most effective” enhancements were class objectives, links to websites, and use of external websites. These enhancements were rated 3.6, 3.6, and 3.5 on a 4.0 scale, respectively. The lowest rated enhancements were daily use of assessments and parts of a flow chart presented throughout text (3.0 and 3.1, respectively).

Table 7.  Perceived Effectiveness and Usefulness of Enhancement Features

Enhancement Feature | Effectiveness | Usefulness

Overall graphic organizer at start  3.
Conceptual questions at beginning  3.
Class objectives at beginning  3.
Visual imagery reminders from linked websites  3.
Graphical organizers to enhance content learning  3.
Links to websites embedded in text  3.
Parts of flow chart presented throughout text  3.
Review section: restatement of conceptual questions, objectives, and concept map  3.
Daily brief assessment  3.
Use of external websites for content  3.

Research Question 4. The ratings of usefulness showed that all of the enhancements were rated “sometimes or very helpful” by all of the 25 respondents (Table 7). The “most useful” enhancements were class objectives at the beginning and use of external websites. These enhancements were rated 3.5 and 3.3 on a 4.0 scale, respectively. The lowest rated enhancements were daily use of assessments and parts of the flowchart presented throughout the text, rated 3.0 and 3.1, respectively.


In addition, one teacher mentioned, “Knowledge is easily imparted online. What to do with the knowledge gained is not.” This quote applies to the usefulness of the course for teachers when they implemented their knowledge and the course’s websites into their own classes.

Looking across the effectiveness and usefulness data, five areas of distinction are apparent. These five areas represent the means that were either high or low for both the effectiveness and the usefulness of the enhancements. In other words, the way teachers learned the content may have been similar to the way they implemented it. The use of objectives at the beginning of a unit or class was deemed helpful for learning and teaching, although in subsequent virtual chats, online discussions, and group meetings the teachers did not mention the use of objectives.

Likewise, the use of external websites for teachers to learn new content, followed by having students use and learn from the same sites, was considered valuable. The teachers raved about the amount, substance, and quality of the external websites used in the course. As one teacher stated, “I will definitely use many of the links.”

Other teachers immediately started using the links in their classrooms: “I’ve been using more Internet info this fall than ever before (Thanks to all the terrific sites I discovered through GEOL 130).” On the other end of the spectrum, the teachers felt that graphical organizers, parts of flowcharts within the text, and daily assessments were not effective in learning the content or useful for their own teaching.

Research Question 5. The perceived competency level of these teachers increased in all aspects considered: content, pedagogy, technology, and communication. Table 8 shows the results of the teachers’ perceived development within these four aspects. The highest gains were found in the development of content knowledge, which is not surprising since the main intent of the course was to develop content knowledge. A second aspect showing high gains, familiarity with the NC-SCOS, was probably not apparent to the teachers until the end of the course. The lowest gains were found in the understanding of science process skills, the ability to use the Internet, and the ability to communicate knowledge.

Table 8.  Perceived Competency Levels at the Beginning vs. End of Course

(Research Question 5)





Competency | Beginning | End

Understanding of content  2.
Understanding of NC-SCOS competencies  2.
Understanding of science process skills  4.
Ability to use conceptual questions, objectives, and key terms  3.
Ability to use graphic organizers  2.
Ability to use visual image reminders  3.
Ability to use websites  3.
Ability to use daily brief assessments  3.
Ability to use the Internet  4.
Ability to communicate electronically with instructor  3.
Ability to communicate electronically with peers  3.
Ability to communicate knowledge  4.


These low ratings are not surprising when considering the original purpose for developing the course. First, few hands-on process skills or activities were included, because the developers were unsure how to integrate “quality” hands-on experiences into the online environment. Second, most of the teachers who took the course were self-selected and probably had a high degree of computer knowledge and skill; as shown in Table 8, the beginning level was above 4 on a 5-point scale, within a point of the highest level. Third, since these were mostly experienced teachers, they probably felt they already knew how to communicate knowledge to others.


Two areas that showed modest gains in competency were the use of graphic organizers and websites. As stated previously, the websites were valuable for learning content and displaying it in a visual manner. Teachers also felt that they now understood how to use and implement the Internet in their instruction. Most of the teachers used the websites in projects and lectures with their students. The graphic organizers were probably more beneficial for helping the teachers learn the concepts than as a teaching tool. The teachers made little mention of the assessment devices.

Online Dimensions of Teaching and Learning

Appendix D shows the qualitative themes derived from the data and literature. The qualitative themes were based upon five quality indicators derived from the literature on Internet learning and curriculum development. The data for this discussion came from the students’ emails, discussion boards, virtual chats, and a focus group meeting. The general trend in these themes came from questions on the funding agency’s survey. Appendix D contains representative quotes for the themes while the subsequent text explains and elaborates on the context. Through data analysis these broad themes were distilled from more specific analytic memos. Some examples of the analytic memos are technology use and problems, course structure and assignments, and communication.

First, the institutional, faculty, learner, and technical support varied among the stakeholders. In this particular study, the institutional support was not content specific and focused on technology use and access to the online course. Along similar lines, the instructor was the main conduit and troubleshooter for technical problems of the students. Learner support focused on technical issues of implementation and learning. Often, buddies or classmates were used as troubleshooters of technology. This conclusion was garnered from the focus groups. Content was not an issue or barrier to learning. All teachers learned content (with the help of technical and instructor support). Technical support was deemed appropriate to learners and faculty.

Second, the faculty designers and students deemed the course development, content, and structure procedures exemplary. Many students had never taken an online course and stated that they could learn easily in this type of enhanced environment. The use of websites from which to learn was a positive experience. The content of the websites and the interactive nature of many of them made learning more enjoyable and easier. This interactive type of content presentation resonated well with certain types of learners who would have had a difficult time if the content had been only textual. All of the teachers had experience using technology, computers, and the Internet, so the focus was on knowledge acquisition rather than on technology skill development.

Third, the instructional delivery and learning environment were efficient; a negative atmosphere arose only from technical problems and the time needed for asynchronous discussion. The negative aspects of the course were mainly found in the first unit, or first few days of being online, so a grace period for the teachers to get acclimated to the online format was needed. Once the teachers were comfortable with the course requirements, navigation system, and discussion tools, only minor technological troubles occurred throughout the remainder of the classes.

Most of the teachers agreed that scientific knowledge could be learned online. Sometimes this meant enjoying the ability to print out the website content to read in hard copy. The absence of a hands-on component, however, was a frustration for some types of learners who expected one. Even so, the teachers enjoyed the use of online websites for knowledge acquisition.

Fourth, the communicative, instructional, and interpersonal interactions were perceived as influential to learning. Communicating with one another and sharing ideas was beneficial and helped the learning process. The collaborative nature of the online discussions, the immediate feedback provided by the instructors, and the partnering of students were all seen as pedagogical enhancements to the course. Compared to just reading online material, the interpersonal interaction through communication amplified the affective component of learning. Continuous support from the instructor was deemed vital and essential to online learning in this format.

Fifth, the feedback, assessment, and evaluation aspects of the course mirrored the curricular support items mentioned previously in that they fluctuated among the types of learners. For example, the assessment devices were considered standard (multiple-choice and short-answer quizzes and tests), while the daily quizzes were thought of as excessive. One teacher, however, perceived the daily quizzes as beneficial preparation for the unit exams.

Another example relates to the assignments in which the teachers had to develop lesson plans. The in-service teachers had no problem with this task, while the preservice teachers struggled with the format and content of the lessons. This result could be expected since the preservice teachers had little or no background in developing lesson plans. The teachers did appreciate the feedback on assignments, which was deemed helpful and quick. Although online assessment was successful in this course, the implementation of hands-on activities or assignments would have introduced a new problem for assessment. All teachers responded well to the individual feedback from classmates and the instructors, which increased the sense of community among the teachers.

The results of this two-pronged study indicated that online learning of the content was effective and that the online content was perceived as transferable to teachers’ classrooms. Content knowledge did increase by taking the course. Communication aspects of the online course were helpful and did not detract from learning the content. For the most part, the communication during the first phase focused on technical problems and content questions, as reflected in the volume of emails and discussion board postings on those topics. These problems were resolved for the second phase through the use of different authoring software.


Online learning has been perceived by some to be the next wave in education. This study sheds some light on the effectiveness of distance education using online instruction. Online learning of science can be done, and it may be done as effectively as face-to-face learning. Even though this study did not directly compare an online course to a face-to-face course, the data presented indicate that learning science online with certain enhancements, based upon quality teaching and learning indicators, can be successful. This study also showed that traditional quality indicators for teaching and learning can be translated to an online format. The communities of teacher practitioners that took the course were similar in their achievement and perceptions of their online experience.

Traditional Techniques and Strategies

The usefulness and effectiveness of traditional strategies for teaching and learning translated relatively well into the online format. Several of the traditional techniques transferred more readily than others. For example, student and instructor contact, active learning, and cooperation among students (Chickering & Gamson, 1991; Hannum, 1999) were well received by the teachers and perceived as beneficial to learning. Course development and structure, evaluation and assessment, and student support (Phipps & Merisotis, 2000) were deemed necessary for this online course.

Most of the teachers in Phase 1 scored significantly better on questions developed from content found in the textbook than on questions developed from Internet content. This result indicates that these teachers were more traditional learners and relied on the textbook rather than the linked websites for content information. Interactions, learning goals, and content presentation (Ragan, 1999) were other techniques that enhanced the teachers’ ability to learn science content online.

This conclusion may indicate that the learning styles of these teachers did not match the multimedia presentation of the online content, a conclusion substantiated in subsequent surveys and interviews. Some of the teachers stated in interviews that they would often print out the content from the online class; these same teachers found it easier to read and take notes from the book.

Online learning may be a pervasive and economical way to teach the masses, but the learning styles of the students need to be understood and accounted for in course design and content presentation. The transfer of traditional teaching and learning techniques and strategies to an online format alone does not ensure learning or promote effective teaching. One solution is to incorporate all of the suggestions from experienced instructors (e.g., Elbaum et al., 2002; Ko & Rossen, 2001) into course development, but this may tax the abilities and resources of many institutions. A limitation of our study was the lack of a direct comparison of the online course with a face-to-face course using the same traditional techniques and strategies; yet this was not the intent of the study, and it may be a consideration for further investigation.

Validated Benchmarks (Enhancements)

The benchmarks, or enhancements, developed for online teaching and learning were validated by the incorporation of the design elements in Table 2. All 10 of the enhancements were perceived as positive to some degree. The only item not carried over to Phase 2 was the use of daily quizzes, which were perceived as more burdensome than helpful. It was not determined whether the daily quizzes actually helped the teachers learn the content or use the content in their assignments; rather, the initial intent was to determine the effect of providing content in a traditional versus an online format. As with almost any course, it is assumed that students would learn content.

This result alone does not answer the overall research questions about the impact of teaching science online, but it is one of a constellation of results that point to the success of online learning. Science knowledge can be learned online, although the results do not answer what type of knowledge (declarative or procedural) may be best suited for the online environment.

More research studies need to be completed examining the ability of the online format to teach both declarative and procedural knowledge, both of which are vital to science learning. In order to understand the impact of online science teaching, more studies must be completed that include the “essence of teaching science” by incorporating experiences that can best imitate process skills and the use of manipulatives. Additionally, this is just the first of many studies that should evaluate and examine the impact of design considerations on online teaching and learning (Swan, 2002).

Social Constructivist Theory Online

The development of a CCL for online learning was effective in that teachers from diverse locations were able to share, discuss, and develop knowledge through online interaction with the instructor and classmates. The online format, coupled with the asynchronous and synchronous environments for discussion, provided few boundaries of time and space for knowledge sharing and learning. The use of external websites and the context of an online course facilitated the teachers’ engagement with the online curriculum. Learning was active and communal and represented the design enhancements outlined in this paper.

It can be concluded that various enhancements to the online content can help establish a community of learners who will share knowledge and exchange ideas. The CCL that was established online permitted equal representation of ideas and work from all participants. Two questions remain to be explored: How do online communities of learners form, with and without the instructor’s influence, given the dynamics of asynchronous learning? And how should online discussions and content presentations be adjusted for differing types of learners? These were not limitations of the current study; rather, these are ideas that were discovered to be advantages of online learning for some types of learners.

This study proposes that the development of an online course take into consideration modified traditional techniques and strategies for teaching and learning. The methodology used and advocated in this study was one that incorporated a “design study approach” in which the instructors implemented ideas from the research literature, learned from the implementation, and then revised the online content and structure.

Of course, this methodology should repeat itself each time the course is offered. All instructors modify their courses from semester to semester, but often with little guidance or purposeful understanding. We propose that instructors purposefully evaluate their online teaching, keeping in mind that teaching and learning in this medium is still a new phenomenon for most people. These same instructors also need to consider the impact of including a forum for communication that can serve as the link among participants in the course and as a platform for sharing and learning knowledge.



Bogdan, R. C., & Biklen, S. K. (1992). Qualitative research for education: An introduction to theory and methods. Boston: Allyn and Bacon.

Bourne, J., & Moore, J. C. (2002). Elements of quality online education: Practice and direction. Needham, MA: Sloan Consortium.

Burke, K. A., & Greenbowe, T. J. (1999). The challenge of interactive chemistry at a distance: The Iowa chemistry education alliance. TechTrends, 43(5), 29-31.

Chickering, A. W., & Gamson, Z. F. (1991). New directions for teaching and learning: Applying the seven principles for good practice in undergraduate education. San Francisco: Jossey-Bass.

Elbaum, B., McIntyre, C., & Smith, A. (2002). Essential elements: Prepare, design, and teach your online course. Madison, WI: Atwood Publishing.

Goh, S. C., & Tobin, K. (1999). Student and teacher perspectives in computer-mediated learning environments in teacher education. Learning Environments Research, 2(2), 169-190.

Graham, C., Cagiltay, K., Lim, B., Craner, J., & Duffy, T. M. (2001, March-April). Seven principles of effective teaching: A practical lens for evaluating online courses. The Technology Source [Online serial]. Retrieved January 3, 2004, from .

Hannum, W. (1999, March). Research based guidelines for web based instruction. Paper presented at the annual meeting of the International Conference on Mathematics/Science Education and Technology, San Antonio, TX.

Jonassen, D. H. (1996). Computer-mediated communication: Connecting communities of learners. Computers in the classroom: Mindtools for critical thinking. Englewood Cliffs: Prentice Hall.

Jonassen, D. H., & Hannum, W. H. (1987). Designing effective learner-courseware interactions. Educational Technology, 27(12), 7-14.

Ko, S., & Rossen, S. (2001). Teaching online: A practical guide. Boston: Houghton Mifflin.

Kozma, R. (1999, April). Discussant for current applications of instructional theory and design in technology. Symposium presented at the annual meeting of the American Educational Research Association, Montreal, Canada.

Linn, M.C. (2000). Designing the knowledge integration environment: The partnership inquiry process. International Journal of Science Education, 22(8), 781-796.

Linn, M. C., diSessa, A., Pea, R. D., & Songer, N. (1994). Can research on science learning and instruction inform standards for science education? Journal of Science Education and Technology, 3(1), 7-15.

Maki, W., & Maki, R. (2002). Multimedia comprehension skill predicts differential outcomes of web-based and lecture courses. Journal of Experimental Psychology: Applied, 8(2), 85-98.

Maki, R. H., Maki, W. S., Patterson, M., & Whittaker, P. D. (2000). Evaluation of a Web-based introductory psychology course: Learning and satisfaction in on-line vs. lecture courses. Behavior Research Methods, Instruments and Computers, 32(2), 230-239.

McDonald, J. (2002). Is “as good as face-to-face” as good as it gets? Journal of Asynchronous Learning Networks, 6(2), 10-23.

Murphy, T. P. (2001). Helping your local amphibians (HYLA): An Internet-based amphibian course for educators. Journal of Science Education and Technology, 10(4), 287-292.

Newman, I., & Benz, C. R. (1998). Qualitative-quantitative research methodology: Exploring the interactive continuum. Carbondale, IL: Southern Illinois University Press.

O’Malley, J. (1999). Students’ perceptions of distance learning, online learning and the traditional classroom. Online Journal of Distance Learning Administration, 2(4). Retrieved January 4, 2003, from

Palloff, R. M., & Pratt, K. (1999). Building learning communities in cyberspace. San Francisco: Jossey-Bass.

Patterson, M.J. (2000). Developing an Internet-based chemistry class. Journal of Chemical Education, 77(5), 554-555.

Phipps, R., & Merisotis, J. (1999). What’s the difference? A review of contemporary research on the effectiveness of distance learning in higher education. Washington, DC: The Institute for Higher Education Policy.

Phipps, R., & Merisotis, J. (2000). Quality on the line: Benchmarks for success in internet-based distance education. Washington, DC: The Institute for Higher Education Policy. Retrieved January 4, 2004, from

Piaget, J. (1973). To understand is to invent. New York: Grossman.

Pringle, R. M. (2002). Developing a community of learners: Potentials and possibilities in web mediated discourse. Contemporary Issues in Technology and Teacher Education [Online serial], 2(2). Retrieved January 2, 2004, from

Ragan, L. C. (1999). Good teaching is good teaching: An emerging set of guiding principles and practices for the design and development of distance education. Cause/Effect Journal, 22(1). Retrieved January 4, 2004, from

Rivera, J. C., & Rice, M. L. (2002). A comparison of student outcomes and satisfaction between traditional and web based course offerings. Online Journal of Distance Learning Administration, 5(3). Retrieved January 4, 2004, from

Russell, T. L. (1999). The no significant difference phenomenon: A comparative research annotated bibliography on technology for distance education. Raleigh, NC: Office of Instructional Telecommunications, North Carolina State University.

Shih, C., Howard, M., & Thompson, A. D. (2000, February). Formative evaluations of a Web-based masters program: Insights for Web-based course developers. Proceedings of the International Conference of the Society for Information Technology & Teacher Education. San Diego, CA.

Shotter, J. (1992, February). In dialogue: Social constructivism and radical constructivism. Paper presented at the Conference of Alternative Epistemologies in Education, Athens, Georgia.

Skinner, B. J., Porter, S.C., & Botkin, D.B. (1999). The blue planet (2nd ed.). New York: John Wiley & Sons, Inc.

Sonner, B. (1999). Success in the capstone business course: Assessing the effectiveness of distance learning. Journal of Education for Business, 74(4), 243-248.

Southern Regional Education Board EvaluTech. (2001). Criteria for evaluating online courses. Retrieved July 30, 2001, from

Sterling, D. R. (2001, March). Strategies enabling collaborative teacher teams to assess student understanding of science. Paper presented at the Annual Conference of the National Association for Research in Science Teaching, St. Louis, MO.

Strauss, A. L. (1987). Qualitative analysis for social scientists. Cambridge, England: Cambridge University Press.

Swan, K. (2002). Learning effectiveness: What the research tells us. In J. Bourne & J. C. Moore (Eds.), Elements of quality online education: Practice and direction (Vol. 4, pp. 13-46). Needham, MA: The Sloan Consortium.

Tobin, K. (1997). Use of technology to connect communities of learners. Philadelphia: University of Pennsylvania.

Tobin, K. (1998). Qualitative perceptions of learning environments on the World Wide Web. Learning Environments Research, 1(2), 139-162.

Tucker, S. (2001). Distance education: Better, worse, or as good as traditional education? Online Journal of Distance Learning Administration, 4(4). Retrieved January 4, 2004, from

University of Houston – Clear Lake. (2002). A checklist of standards for on-line course development (using WebCT). Houston: University of Houston – Clear Lake, Instructional Technology Center.

Veal, W. R., Kubasko, D. S., & Fullagar, P. (2002). Web-based course on Earth and environmental science for preservice and inservice teachers. Journal of Science Teacher Education, 13(2), 131-146.

Vygotsky, L. (1978). Mind and society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.

Vygotsky, L. (1987). Thinking and speech. In R. Rieber & A. Carton (Eds.), The collected works of L. S. Vygotsky (Vol. 1, N. Minick, Trans.). New York: Plenum Press.

Weiss, R. E., Knowlton, D. S., & Speck, B. W. (Eds.). (2000, Winter). Principles of effective teaching in the online classroom (New Directions for Teaching and Learning, No. 84). San Francisco: Jossey-Bass.



Contact Information: William Veal
University of North Carolina at Chapel Hill
email: [email protected]