Bober, M. J. (2003). Using technology to spark reform in preservice education. Contemporary Issues in Technology and Teacher Education [Online serial], 3(2).

Using Technology to Spark Reform in Preservice Education

by Marcie J. Bober, San Diego State University

Preservice education is in the midst of a long-awaited and much-needed renewal, spurred by reform at the national, state, and local levels and the availability of funds to jump-start curricular restructuring. Since 1999, more than 400 demonstration projects have been awarded under the federal government’s Preparing Tomorrow’s Teachers to Use Technology Initiative. San Diego State University’s Learning Through CyberApprenticeship (LTCA), part of the first cohort of grantees, is built around a simple but challenging premise: to establish and grow a “best practices” structure that calls for preservice teachers enrolled in our “stand-alone” technology course (EdTec 470) to work directly with technologically skilled teachers in the field—creating instructional units or lessons for immediate classroom use. This article illustrates how LTCA and EdTec 470 reflect the new look of preservice education in California.

Preservice education is in the midst of a long-awaited and much-needed renewal, spurred by a remarkable set of converging conditions: at the national level, a move among accrediting agencies to endorse programs that are learner centered and performance based (e.g., see National Council for Accreditation of Teacher Education [NCATE], 2002); at the state level, legislative reform that increases the rigor of teacher education and, by extension, the criteria or standards by which credentials are awarded (e.g., see Education Week, 2003); and at the local level, curricular revisions (funded with both private and public moneys) that promote mentoring/modeling, experiential learning, use of technology, and community building (e.g., see the Bridging Cultures Project, led by the WestEd Regional Educational Laboratory, serving Arizona, California, Nevada, and Utah, or Focus on Algebra, led by the San Diego County Office of Education). (Editor’s note: See the Resources section at the end of this article for web sites.)

That technology is central to curricular innovation is confirmed by the sheer number of demonstration projects (400+) awarded since 1999 under the federal government’s Preparing Tomorrow’s Teachers to Use Technology (PT3) initiative. Currently, PT3 is authorized under Title II of the No Child Left Behind Act (NCLB; U.S. Congress, 2001), which attends (in part) to the preparation, training, and recruiting of high quality teachers and administrators. Despite the awkward (and slightly recursive) wording, the purpose of the PT3 program is succinctly described in Part B/Section 221 of that legislation:

… to carry out programs that prepare prospective teachers to use advanced technology to prepare all students to meet challenging state and local academic content and student academic achievement standards; and to improve the ability of institutions of higher education to carry out such programs.

The overarching goal of PT3 is to change dynamically the very face of teacher preparation, in particular, the ways in which preservice teachers are exposed to technologies that can transform teaching and learning (See, for example, the broad range of PT3 issues, which include sustainability, digital equity, best practices, faculty development, and mentoring, discussed at the Eighth Annual Teaching and Learning Symposium sponsored by the Texas Center for Educational Technology at the University of North Texas). Many projects feature strategies and activities focused on faculty development, course restructuring, state-level certification requirements, mentoring relationships, venues for building communities of practice, and assessment tools and techniques.

San Diego State University’s (SDSU) Learning Through CyberApprenticeship (LTCA), awarded under the PT3 Initiative in 1999, has a simple but challenging goal: to establish and grow a structure through which preservice teachers enrolled in EdTec 4701 work directly with technologically skilled teachers (TMTs) in the field to create projects (activities, lessons, units) for classroom implementation that depict best practices in technology use.

The passage of time, however, suggests that the real legacy of LTCA is far different—and far broader—than the program team ever imagined. The search for enduring techniques to build bonds with veteran teachers spurred substantive rethinking about and revamping of the course itself. The project staff now realizes that LTCA’s several objectives have always centered around EdTec 470, affecting its content, its facilitation, its expectations of what students will accomplish and how their performance is showcased, its relationships within SDSU’s School of Teacher Education and among its faculty, and its interactions with the school districts where students complete their supervised fieldwork.

Contextualizing LTCA and EdTec 470

Background of Reform

LTCA “uses” EdTec 470 as a vehicle for targeting several gaps in teacher preparation that earlier technology-oriented interventions have failed to remedy, such as the following:

  • Preservice candidates are generally not placed with technologically adept master teachers. As a result, candidates may get the message—however unintentional—that technology is a nonessential add-on.
  • Integration of content-specific curricular thinking with technology is uneven at best. Because preservice teachers are differentially exposed to technology (in both their foundations and methods courses), they may enter the profession without a coherent vision of the ways in which technology can enhance both disciplinary-specific and higher order learning (decision-making, problem-solving).
  • There are technologically sophisticated teachers in local school districts who have stellar ideas about curricular enhancement but limited time to design, develop, and implement them.
  • Preservice teachers tend to be exposed to one or two school sites prior to graduation. While in-depth immersion has many advantages, candidates may not be adequately prepared to work in the diverse communities served by today’s public schools.
  • Limited exposure to actual classrooms and veteran instructors can create artificiality about the assignments preservice teachers complete to demonstrate their awareness of and skill with specific technologies and instructional strategies.

In essence, the LTCA proposal openly acknowledged that EdTec 470 did not fully reflect the very instructional practices that teacher candidates most need to emulate. If the overarching goal was to launch a generation of new teachers comfortable with active, student-centered learning environments grounded in constructivism, then it was incumbent upon program staff to expose them to such settings as learners.

But education courses are not born in isolation; teaching standards (whether or not technology focused) are determined at the state level, and teacher educators are well advised to keep abreast of (and contribute to) the ongoing dialogue about best practices. Thus, program staff planned project activities with an eye to California’s long-range efforts to dynamically restructure preservice education; reform, in fact, is systemic, attending to a complex of issues associated with the credentialing process, among them, admissions policies, assessment practices, oversight, and field placements.

In December 1998 (just prior to the official PT3 announcement), California’s Commission on Teacher Credentialing (CCTC) drafted new technology standards that responded both to recommendations of the Commission-appointed Computer Education Advisory Panel and to specific legislation (AB 1023) passed a year earlier by the State Assembly. Standard 20.5 affected both multiple and single-subject candidates, classifying technology savvy into two types: general skills and knowledge and specific skills and knowledge. Just as important, it required a demonstration of competence at two strategic points in time: prior to issuance of the preliminary credential (Level 1; 5-year limit; nonrenewable) and prior to issuance of the professional credential (Level 2; permanent, renewable). Table 1 presents two examples of each type that represent the intent of competence at Level 1.

Table 1

Examples of CCTC Technology Standards’ Level 1 Competence

General Knowledge and Skills

  • Each candidate demonstrates knowledge of current basic computer hardware and software terminology
  • Each candidate demonstrates knowledge and understanding of the legal and ethical issues concerned with the use of computer-based technology

Specific Knowledge and Skills

  • Each candidate uses computer applications to manage records (e.g., gradebook, attendance, and assessment records)
  • Each candidate identifies student learning styles and determines appropriate technological resources to improve learning

But that was just the beginning. In 2001, the CCTC adopted a set of standards (mandated by passage of SB 2042) that portended a comprehensive overhaul of teacher preparation in its entirety. To some degree, this phase of reform attempts to keep pace with changes enacted by the major agencies or associations charged with accrediting teacher preparation programs, NCATE most prominent among them. SB 2042 attended to program design and governance; disciplinary rigor; readiness for meeting the needs of special populations (including English language learners and students with disabilities); curricular integration of emerging and established practices; and supervised fieldwork, including selection criteria, candidate qualifications, and assessment. Standard 9 of Category B ensures that newly credentialed teachers are technology competent. However, the skills, knowledge, and values promoted at the core of Standard 9 are far less rigorous than those first adopted in 1998. While the two-tier system remains intact (introductory or basic skills at Level 1/initial credential; more complex skills at Level 2/credential renewal), the distinction between general and specific skills is blurred. Table 2 provides two examples.

Table 2

Examples of Distinctions Between General and Specific Skills in CCTC Standard 9

  • Each candidate uses computer applications to manage records (e.g., gradebook, attendance, and assessment) and to communicate through printed media (e.g., newsletters incorporating graphics and charts, course descriptions, and student reports).
  • Each candidate considers the content to be taught and selects the best technological resources to support, manage, and enhance student learning in relation to prior experiences and level of academic accomplishment.

Each of the state’s credentialing institutions/programs is required to meet all SB 2042 guidelines within a 2-year time frame (generally, late 2003); however, some requirements—technology competence among them—have an accelerated time line.

Teacher educators are generally aiming at the more exacting technology competencies adopted by the CCTC in 1998. Thus, as of July 1, 2002, most preservice candidates will earn a Level 1 or 2 credential only if their teacher preparation program verifies attainment of the general and specific skills/knowledge highlighted in Standard 20.5. Preservice teachers graduating from SDSU’s School of Teacher Education and seeking a Level 1 credential must have either earned a passing grade in EdTec 470 or received an EdTec 470 waiver via well-defined and publicized procedures.2

Defining Competence

The state’s revamped technology standards provided project staff with an obvious starting point for course reorganization, but the process of conceptualizing and reordering priorities ultimately led to a fundamental design question that had to be addressed head on: To what overall or culminating outcome were staff aiming—technical skill or pedagogical excellence—and if the latter, what are the pedagogical beliefs and behaviors that characterize the technically savvy teacher candidate?

Few would argue that hardware and software savvy alone portends competence—defined simply as a teacher’s ability or readiness to incorporate technology into daily instruction. True proficiency goes well beyond isolated skills to the ways in which knowledge, skills, and values converge to affect the teaching and learning dynamic and contribute to pedagogical excellence. Roblyer, Edwards, and Havriluk (1997) spoke of the behavioral and attitudinal shifts that result from consistent exposure to technology and promising techniques for successful classroom integration. They suggested that technology infusion allows teachers to become more learner centered and less reliant on whole-class instruction. The activities and projects such teachers assign tend to be interdisciplinary and open ended, with students encouraged to pursue creative and appropriate solutions rather than merely right ones. They stress cooperation and healthy competition.

In that light, then, program staff explored the many “standards” systems that began to emerge in the late 1990s. Many are comprehensive, featuring sets of metrics or scales by which growth or progress may be progressively measured. Common among them is a focus on self-assessed skill with different software and hardware (novice to mastery); frequency of use (rare to always); awareness of techniques (unfamiliar to highly familiar); and perceptions of support or access to technical assistance (no support to 24/7 support). Ultimately, LTCA staff drew on elements of each of the following in considering the pedagogical elements of technology competence to which preservice candidates should be aiming.

NETS. Leading the field are the highly touted National Educational Technology Standards (NETS) for Teachers (International Society for Technology in Education [ISTE], 2002) developed by the ISTE with PT3 funding. The NETS system is, in fact, the benchmark by which a growing number of funded technology-infusion efforts are assessed; since 1999, several state agencies and professional associations have adopted them wholesale. The allure of NETS is its breadth of vision. First, performance standards are couched in terms of essential conditions that must be in place institutionally and programmatically for preservice candidates to reach their potential. The conditions center around a shared vision for technology; equitable access to hardware, software, and telecommunications; financing and tenure/promotion policies that suggest technology is valued administratively; skilled faculty who are aware of and can model both general and disciplinary-specific technology use; availability of technical assistance and professional development; rigorous and relevant assessment practices; a commitment to student-centered approaches; and a commitment to disciplinary expertise.

Second, teacher preparation is organized around the four phases that characterize most preservice programs: general preparation; professional preparation; student teaching/internship; and first-year teaching.

The NETS competencies are organized into six clusters, three with a distinct pedagogical focus: attending to the design of effective learning environments (supported by technology); the implementation of curriculum that includes methods/strategies for applying technology to maximize student learning; and the use of technology to facilitate a variety of assessment and evaluation strategies.

Familiarity with NETS spurred innovative thinking among the LTCA staff about EdTec 470 content, scope, and organization.

Seven Dimensions. The Milken Family Foundation for Educational Technology promotes its Seven Dimensions for Gauging Progress (The Milken Exchange on Education Technology, 1998), helping policymakers, educators, and technology directors determine “the conditions that should be in place for technology to be used to its greatest educational advantage in any classroom.” Central to the companion document, entitled Technology in American Schools: Seven Dimensions of Progress — An Educator’s Guide (Lemke & Coughlin, 1998), is a continuum of progress indicators for each dimension organized around three stages of progress: entry, adaptation, and transformation. Transition steps guide an educator from one stage to the next.

Each of the framework’s dimensions is relatively independent, featuring fundamental questions that stakeholders (most often, within a K-12 environment) should consider as technology and telecommunications are deployed. LTCA staff were most intrigued with Dimension 3: Professional Competency, which features four areas, or strands, with a clear pedagogical emphasis: core technology fluency; curriculum, learning, and assessment; professional practice and collegiality; and classroom and instructional management. Each profile features multifaceted outcomes that embrace skill, knowledge, and attitudes. The Professional Competency Continuum Online Assessment Tool helps educators (teachers and administrators) assess their status within the skill and knowledge areas showcased in the Professional Competency Dimension. The General Assessment provides an overview, while Detailed Assessments in the four major areas or strands generate customized advice and resources.

Familiarity with the Seven Dimensions (specifically, Professional Competency) provided LTCA staff with a useful complement to the NETS. Especially appealing was that the assessment scales signify receptivity to change and innovation (not merely personal skill, comfort, or frequency of use/application) and encourage group-level (not merely individual) participation (helping to ensure that program transformation is about us—not merely about me).

CTAP. The state-funded California Technology Assistance Program (CTAP) provides assistance to schools and districts integrating technology into teaching and learning. Effective use of technology is promoted through regional coordination of support services based on local needs organized around five core areas: staff development, technical assistance, information and learning resources, telecommunications infrastructure, and funding. Technology proficiency is organized into discrete sets or areas that roughly align to both CCTC standards and NETS; the sets themselves have been classified into larger dimensions or facets of the teaching experience: communication and collaboration; preparation for planning, designing, and implementing learning experiences; and evaluation and assessment.

Once registered in the CTAP2 network, users complete one or more of the surveys associated with each proficiency set/area, view their results, and then select professional development opportunities “customized” to identified weaknesses and strengths. Ongoing skill assessment is encouraged; higher scores on the surveys imply the effectiveness of the activities instructors have opted to attend/complete.

The CTAP structure was important for LTCA staff to consider, given its high profile within the state as a tool for planning/implementing professional development. Nonetheless, the project staff was well aware of its limitations; while several items within a proficiency set/area attend to instructional pedagogy, rigorous assessment of the indicators is stymied by the behaviorist orientation of the survey items and a simplistic measurement scale (introductory, intermediate, proficient).

STaR Chart. The CEO Forum on Education and Technology (2002) offers its Teacher Preparation (STaR) Chart, a tool for benchmarking and monitoring progress toward technological readiness and proficiency. The framework is organized around eight dimensions, some specific to a university, others particular to an institution’s teacher education program. The STaR Chart produces four school profiles that range from “Early Tech” (an institution, college/school, or teacher preparation program with little or no technology) to “Target Tech” (an institution, college/school, or teacher preparation program that serves as an innovative model for others to emulate).3

Institutions reportedly use the ratings data in several ways: to set benchmarks and goals (and then monitor progress toward their attainment); to identify technology needs for which grant/award applications may be written; to determine how best to allocate technology funds already available; and to serve as the basis of statewide technology assessments. The LTCA staff was quick to notice the system’s decidedly limited emphasis on pedagogy; nonetheless, they could understand its popularity. It certainly is simple to deploy and offers immediate results and next-steps prescriptions that, though generic, are easy to interpret.

Although the STaR Chart is the least comprehensive of the four systems, the LTCA staff felt it was feasible to use it as a basic needs assessment tool to spur discussions about professional development and curricular revision.

The Look and Feel of EdTec 470

Although EdTec 470 intentionally remains a work in progress, it is certainly oriented to the learner, hands-on/experiential, and reflective of the actual school settings newly credentialed teachers face. It has a cognitivist bent, focused on the mental changes that occur during instruction, as well as learner autonomy and initiative (Simonson & Thompson, 1997). Pedagogically, the program staff has considered the student’s predisposition to learning; his or her developmental stage; the structure and form of knowledge itself; the sequencing of instructional materials and events; the form and pacing of practice, assessment, and reinforcement; and the level/nature of learner control (Bober, 2002). It is also constructivist in nature. Program staff argued for a design that recognized that a student constructs meaning based on personal or individual experience. If the course was to be relevant and authentic, it had to be content- and stimuli-rich, embrace visual formats, promote teaming and collaboration, and be flexibly organized (Jonassen, 2000).

In terms of configuration, the course reflects a format advanced by Wetzel (1993), wherein a core technology course is “combined” with the integration of technology across the curriculum. The idea is to create a structure in which ideas are constantly revisited and reexamined as candidates move from course to course and then to their field experiences. This blended strategy provides an authentic context for preservice teachers to examine instructional practices and reflect on their learning as new knowledge (skills, content) is acquired (Niederhauser, Salem, & Fields, 1999).

Each section of EdTec 470 is now linked to a teaching block and, thus, to faculty committed to technology and practical application of student work. The relationship benefits students in many ways, the most profound being clear connections between what they produce and why. Students are continually focused on planning and, by extension, the instructional processes and the strategies that promote learning. Primary readings are drawn from practitioner and peer-reviewed journals published by ISTE to ensure a solid orientation to experimental and quasi-experimental research designs. Genre-specific writing assignments (reflection, critique, persuasive argument) call for critical thinking about the art and science of teaching and students’ future professional responsibilities. Students write and rewrite objectives, aligning them with different forms of assessment—both embedded and more traditionally deployed. Ways to present and reinforce content are continually devised, as are multiple opportunities for extended practice and feedback (Gagné, Briggs, & Wager, 1992; Reiser & Dick, 1995). These are the essential theoretical/pedagogical connections that may have been overlooked in the past.

EdTec 470 is composed of self-contained modules that may be presented in different ways, in no preordained order. Each instructor determines the actual rollout based on his or her assessment of students’ entry-level technology competence; their credentialing goals (single or multiple subject; special education); and their disciplinary focus (social sciences, literacy, math/science, arts, etc.). Even individual modules may be tailored, with more or less emphasis placed on web-based instruction, particular software packages, or specific instructional strategies. Different communication vehicles (both off- and online) ensure that the several instructors who teach the course are working as a team—sharing ideas, successes and disappointments, and challenges. Collaboration among the staff is exceedingly important, since students (upon graduation) will face a growing array of local, state, and national accountability mandates for which they must be prepared—many of them far easier to manage/track when technology is deployed.

Among the most noteworthy course innovations are the following:

  • The focus in EdTec 470 is on student-centered inquiry; preservice teachers complete complex, team-based projects in which the learning is meaningful and situated and the tasks open-ended and generative (Howard, 2002). While end-products may lack pedagogical sophistication, they reflect an instructional philosophy (the “spiral path of inquiry”) with an established history that reflects today’s emphasis on content and performance standards (Molebash, 2002a; Molebash, 2002b).4
  • In-class activities are grade-level appropriate to ensure that preservice teachers attend to state- and local-level content/performance standards for which they will be held accountable as teachers.
  • EdTec 470 is designed to mix brief lectures with modeling/demonstration, hands-on practice, and out-of-class online discussions—an interactive structure that builds camaraderie between and among preservice candidates, makes using technology more enjoyable, and alters the types of questions (conceptual, logistical, procedural) classmates pose to one another (Bitner & Bitner, 2002).
  • Single subject candidates who enroll in EdTec 470 now find themselves with classmates who share their disciplinary/curricular interests (math/science; English, social studies). Content stratification—in which students are struggling with similar questions about attainment of state- or district-level standards and performance- and knowledge-based assessment—leads to more robust/viable activities and discussions.
  • Preservice teachers enrolled in EdTec 470 have multiple ways to develop authentic connections with veterans in the field. They can establish relationships with teachers (recruited via invitation), who submit criteria-specific ideas for lessons that call for innovative uses of technology and then mentor their junior colleagues (during development) via constructive feedback. They can also affiliate with the Teaching to the Big Ideas project, a new facet of the City Heights Educational Pilot. Big Ideas is modeled after a similarly named professional development program offered by the nonprofit Educational Development Center; that effort improved elementary teachers’ mathematical understandings by organizing instruction around students’ own ideas. Central to the City Heights version is a thematic (rather than linear) view of social studies, as well as the use of technology resources to demonstrate the relevance of historical characters, events, and ideas to today’s world.

Impact of the Course Redesign

Data Collection: Systematic and Theory-Driven

A variety of data are regularly and systematically collected, both to comply with PT3 requirements and to ensure that the course is on track pedagogically, responsive to student needs, and compliant with state mandates. Evaluation is, in fact, guided by a framework that couples a management focus (Stufflebeam, 2000) with core elements of the Concerns Based Adoption Model or CBAM (Hord, Rutherford, Huling-Austin, & Hall, 1987). Moersch’s (2001) Levels of Technology Implementation Framework, an adaptation of CBAM that characterizes technology infusion along a growth or progress spectrum (where Level 0 represents nonuse and Level 6, refinement), has been particularly useful. Evaluatively, the tool promotes insight into student and instructor receptivity to change and innovation; it allows for extensive exploration of relationships (levels of use and student performance, classroom organization, and/or teacher/student interactions) and impact (how technology implementation is affected by purchasing practices/policies, classroom connectivity, or administrator commitment).

Data Collection: A Sampling of Evaluation Findings

Among the many data sources are pre- and post-course surveys directed to preservice candidates; surveys directed to course instructors; class observations; reviews of extant data (for example, the course website, email exchanges between TMTs and the preservice teachers they mentor, the instructor listserv, and projects/assignments students complete); and informal and formal discussions with the program staff.

The appendix contains a sample of pertinent survey findings and the implications drawn from them that have contributed to next-steps decision-making. The student survey results clearly suggest that the course attends well to critical content and in a way that leads to long-term retention. Still, there is an obvious need to substantiate the findings and explore why skill savvy differentially affects a preservice teacher’s emerging instructional philosophy. When triangulated with classroom observations, surveys have allowed program staff to confirm a baseline level of consistency between and among the course sections. It appears that section customization/tailoring (where students are organized by discipline and grade level) and well-defined linkages to methods courses increase pedagogical soundness (dynamic activities and lively discussions; distinct connections between what students produce and why) without negatively affecting attainment of state-mandated competencies.

Technology Competence in the Real World

This article demonstrates one institution’s efforts to contribute significantly to the reform of preservice education. LTCA has funded ongoing activities that model innovative uses of technology to improve the productivity and pedagogy of its teacher candidates. Course renovation has been systemic and thoughtfully implemented; connections with methods and foundations faculty are well-established and built on shared interests in techniques and strategies that contribute to enlivened instructional settings—both for preservice candidates and their future students.

The changes are sustainable, in part, because curricular reform has always been focused on responsiveness to legislative or regulatory mandates and the value of lifelong learning and continuous improvement. The project staff has worked diligently to reduce, if not fully eliminate, reliance on high-cost reform strategies, including incentives, workshops (which are lengthy and difficult to schedule and coordinate), and vendors with their own agendas.

Still, these results are not easily generalized to those achieved by sister initiatives. The demonstration nature of the PT3 program, in fact, breeds unique (project-specific) idiosyncrasies that thwart comparative analyses. Nonetheless, our experience allows for broad inferences to be legitimately made.

  • There is no one method or strategy for orienting preservice students to technology—and thus meeting Level 1 (and Level 2) competencies. Our results suggest, in fact, that teacher educators must be flexible and adaptable, receptive to new ideas, eager to change focus if and when circumstances warrant, and committed to currency and relevance.
  • Preservice students must do more than learn about technology. Experiential learning premised on authentic situations tends to help students see themselves as innovative curriculum designers and student-centered instructional leaders— responsive to unique student needs.
  • The preservice focus on technology (as articulated by the CCTC) may not adequately prepare graduates to tackle the logistical issues they will face as education professionals in their own classrooms—primarily time, hardware/software access, and opportunities to network with colleagues.
  • A focus on fairly low-level outcomes (publishing student work on the web, using drill-and-practice software, creating school or classroom web pages) fails to help preservice students see the ways in which technology can help them meet performance/achievement mandates for which they will be held accountable.
  • The preservice technology experience must attend both to common issues all candidates face and to unique dimensions that characterize different disciplines (math, science, English, business, etc.), grade levels (multiple subjects, single subject), and school settings (urban, suburban, rural; low and high socioeconomics, etc.).
  • The long-term impact of the technology-focused preservice experience warrants closer scrutiny. California’s state-level induction program, Beginning Teacher Support and Assessment, provides a ready-made mechanism for reconnecting with former students. For the preservice experience to be faithful to the realities of teaching and to improve retention rates, we must continuously and systematically seek out and heed the voices of our newest practitioners.

Finally, there is little empirical evidence to support the rapidity with which emerging standards systems have been embraced—by teacher educators, professional associations, state accrediting agencies, and the preservice candidates themselves. This author cannot, in fact, locate any recent studies in which basic validity and reliability have been systematically tested. There is little to suggest that these systems are on target conceptually and/or theoretically, flexible enough to accommodate the high rate of change and innovation endemic to technology, or politically neutral (clearly important if they are to “outlive” the tenure of elected public officials charged with “enforcing” them). Researchers with a specific interest in the reform of preservice education have an obligation to investigate the viability of the structures on which our children’s future increasingly depends.


References

Bitner, N., & Bitner, J. (2002). Integrating technology into the classroom: Eight keys to success. Journal of Technology and Teacher Education, 10(1), 95-100.

Bober, M. J. (2002). Measuring teacher outcomes: Changed pedagogy. In J. Johnston & L. Toms Barker (Eds.), Assessing the impact of technology on teaching and learning: A sourcebook for evaluators (pp. 87-117). Ann Arbor: University of Michigan, Institute for Social Research.

CEO Forum on Education & Technology. (2003). School technology and readiness (STaR) charts. Retrieved May 5, 2003, from

Gagné, R. M., Briggs, L. J., & Wager, W. W. (1992). Principles of instructional design (4th ed.). New York: Harcourt Brace.

Hord, S. M., Rutherford, W. L., Huling-Austin, L., & Hall, G. E. (1987). Taking charge of change. Alexandria, VA: Association for Supervision and Curriculum Development.

Howard, J. (2002). Technology-enhanced project-based learning in teacher education: Addressing the goals of transfer. Journal of Technology and Teacher Education, 10(3), 343-364.

International Society for Technology in Education. (2002). National educational technology standards for teachers. Retrieved May 2, 2003, from

Jonassen, D. H. (2000). Computers as mindtools for schools: Engaging critical thinking (2nd ed.). Upper Saddle River, NJ: Merrill.

Lemke, C., & Coughlin, E. (1998). Technology in American schools: Seven dimensions for gauging progress. Milken Family Foundation. Retrieved May 2, 2003, from

The Milken Exchange on Education Technology. (1998). Seven dimensions for gauging progress of technology in the schools. Retrieved May 5, 2003, from

Moersch, C. (2001). Next steps: Using LoTi as a research tool. Learning and Leading With Technology, 29(3), 22-27.

Molebash, P. (2002a). The role of inquiry in methods courses and how technology can help. Clemson University, South Carolina Center for Excellence for Instructional Technology Training, Project Circuit. Retrieved January 4, 2003, from

Molebash, P. (2002b). The role of inquiry in the classroom and how technology can help. Clemson University, South Carolina Center for Excellence for Instructional Technology Training, Project Circuit. Retrieved January 4, 2003, from

National Council for Accreditation of Teacher Education. (2002). NCATE Unit Standards (2002 ed.). Retrieved May 2, 2003, from http://

Niederhauser, D. S., Salem, D. J., & Fields, M. (1999). Exploring teaching, learning, and instructional reform in an introductory technology course. Journal of Technology and Teacher Education, 7(2), 153-172.

Reiser, R. A., & Dick, W. (1995). Instructional planning: A guide for teachers (2nd ed.). New York: Allyn & Bacon.

Roblyer, M. D., Edwards, J., & Havriluk, M. A. (1997). Integrating educational technology into teaching. Upper Saddle River, NJ: Merrill.

Simonson, M. R., & Thompson, A. (1997). Educational computing foundations (3rd ed.). Upper Saddle River, NJ: Merrill.

Stufflebeam, D. L. (2000). The CIPP model for evaluation. In D. L. Stufflebeam, G. F. Madaus, & T. Kellaghan (Eds.), Evaluation models: Viewpoints on educational and human services evaluation (pp. 279-318). Boston: Kluwer Academic Publishers.

U.S. Congress. (2001). No Child Left Behind Act of 2001 (PL 107-110), Title II, Teacher Quality Enhancement. Retrieved May 2, 2003, from

Wetzel, K. (1993). Teacher educators’ uses of computers in teaching. Journal of Technology and Teacher Education, 1(4), 335-352.

Education Week. (2003, January 9). Quality counts 2003. Retrieved May 2, 2003, from



Resources

Beginning Teacher Support and Assessment –

Bridging Cultures Project –

California Commission on Teacher Credentialing Technology Standard 20.5 –

California Technology Assistance Program –

City Heights Educational Pilot –

EdTec 470 –

Eighth Annual Teaching and Learning Symposium –

Focus on Algebra –


Appendix: A Sample of Pertinent Survey Findings

Instructor survey. The lead evaluator regularly administers a perceptual survey to course instructors that calls for them to assess (a) the ways in which specific revisions have affected the course and their facilitation of it; (b) the frequency with which they contribute to the instructor listserv; (c) the listserv’s collegial or instructional benefits; and (d) their own contributions to course revisions. (The Fall 2002 end-of-course survey is available for review at: .)

Pertinent Results
•   According to instructors, the course modifications that most affected students enrolled during the 2001-2002 academic year were: personalizing/tailoring the course syllabus, individualizing the module order, customizing the point system and making it more flexible, and providing open sessions for students on Saturdays.

•   Instructors are far more engaged than in the past; only a few continue to use the forum passively rather than actively. These stragglers expect the course coordinators (or their colleagues) to post comments on upcoming events, syllabus changes, or insights about specific activities—but do not see themselves as active contributors. While some fail to find the listserv particularly useful for addressing individual concerns or sharing professional interests, others have no qualms about airing their opinions publicly. The instructors who teach off-site claim that the forum makes them feel more connected and provides a heads-up about potential problems (with equipment or implementation) of which they should be aware. Nearly half the instructor pool also indicated that the forum makes them more responsible; the messages remind them that they have a common mission … despite unique teaching styles and, perhaps, areas of emphasis.

•   Instructors are eager to improve their interactions with methods faculty (both in terms of frequency and quality). They also strive to use one another as expert resources, especially when an upcoming in-class activity is complex or equipment-dependent. They seem to recognize that each of them brings skill sets to the table of which they can and should take advantage.

Actions Based on Results

•   Results from the instructor survey have led the EdTec 470 coordinator to schedule face-to-face instructor meetings prior to and at the close of each semester. The planning and debriefing sessions extend listserv discussion; participants share best practices and lessons learned; discuss equipment needs; and calendar upcoming events important to their students or their own professional growth.

•   Survey results have led to prompter updating of the website and more regular (weekly or biweekly) collection of “How did the module go?” data (critically important when a course is taught by multiple instructors).

•   Evaluation data have affected how instructor services are utilized outside class time.

–    During the 2001-02 academic year, each instructor committed to facilitating one Saturday session (per semester) where interested students could seek additional help, get a jump-start on an assignment, or learn a skill, technology, or tool in greater depth.

–    During the 2002-03 academic year, each instructor committed to shadow a colleague on specific days—either to provide general assistance with labor-intensive technology (for example when students created/edited digital movies) or disciplinary expertise that the colleague lacked (e.g., working with science probes to collect field data or creating spreadsheets to model complex algebraic concepts).

Student survey. The lead evaluator has collected perceptual data from students enrolled in EdTec 470 since the project launch in 1999. The survey she uses has been well tested and features several different measurement scales. (The Fall 2002 end-of-course survey is available for review at .) Reliability is high (with no scale earning a reliability coefficient below .85).
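Multi-item survey scales of this kind are conventionally checked with an internal-consistency coefficient such as Cronbach’s alpha; the article does not name the coefficient used or publish raw data, so the sketch below simply assumes that convention and uses wholly invented ratings to show how such a coefficient is computed from a respondents-by-items matrix.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-x-items matrix of scale ratings."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the cluster
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Invented ratings: 6 respondents x 4 items on a 1-5 scale.
ratings = np.array([
    [4, 4, 5, 4],
    [2, 3, 2, 2],
    [5, 5, 5, 4],
    [3, 3, 4, 3],
    [1, 2, 2, 1],
    [4, 3, 4, 4],
])
print(round(cronbach_alpha(ratings), 2))
```

A scale whose alpha falls at or above the .85 threshold reported here would be considered highly reliable by most survey researchers.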

Cluster Description. A 12-item cluster is focused on skills (application or functional type); the scale was selected because it addresses the competencies required by the state and has been featured in surveys administered in several other California school districts. Students place themselves at one of four levels, each one representing an increasingly complex understanding or use.

Results/Interpretation. Although end-of-semester mean ratings for file management, databases, spreadsheets, web authoring, ethics, and classroom integration tend to hover between 2.4 and 2.8, paired t-tests (data collected at Weeks 1 and 16, respectively) have revealed statistically significant growth in all skill areas.

Based on perceptual data, it appears that the course prepares students well to meet state proficiency requirements for beginning teachers.
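The paired (dependent-samples) t-tests reported throughout this appendix can be reproduced with a few lines of code. The sketch below uses invented Week 1/Week 16 ratings (the raw survey data are not published) and computes the t statistic by hand; in practice an analyst would more likely reach for a statistics package such as scipy.stats.ttest_rel.

```python
import math

def paired_t(pre, post):
    """Dependent-samples t statistic for Week 1 vs. Week 16 ratings."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    var_d = sum((d - mean_d) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)

# Invented pre/post self-ratings, one pair per student, on a four-level scale.
pre  = [1, 2, 1, 2, 2, 1, 3, 2, 1, 2]
post = [2, 3, 2, 3, 2, 2, 4, 3, 2, 3]

print(round(paired_t(pre, post), 2))  # compare against the critical t for df = n - 1
```

Because the same students answer at Week 1 and Week 16, the paired form is the right test: it analyzes each student’s change score rather than treating the two administrations as independent samples.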

Cluster Description. Another 12-item cluster calls for respondents to rate their proficiency on ideas associated with instructional planning or instructional strategies. The five-point scale is anchored by Not proficient at all (1) and Highly proficient (5).

Results/Interpretation. For each of the past five semesters, paired t-tests (data collected at Weeks 1 and 16, respectively) have revealed statistically significant growth on all issues. For the Fall 2002 semester, the mean change pre to post ranged from .58 (Using technology to communicate with parents about the school day) to .92 (Creating assessment tools—e.g., rubrics, checklists, matrixes—for evaluating student work).

In addition, only three items earned post-mean ratings less than 3.5:

Helping colleagues learn to use technology for instructional purposes: m=3.43;

Using technology to help students with special needs: m=3.09;

Incorporating technology into the physical environment of the classroom to support a variety of learning activities: m=3.47.

Based on perceptual data, it appears that the course exceeds state expectations for instructional use of technology by beginning teachers.

Cluster Description. A 9-item cluster focuses on the value or potential of technology, with all but one item worded in the negative (as a quality check on the survey itself). The five-point scale is a traditional Likert (1 = Strongly disagree; 4 = Strongly agree) with I don’t know (9) replacing Neutral or Undecided. For this series, then, a lower mean rating suggests a more positive attitude.

Results/Interpretation. For four of the past five semesters, changes pre to post (Week 1 to Week 16) have been small—almost imperceptible.

Discussions with course instructors have revealed a number of reasons for these results, e.g.:

•   that one semester does not offer ample time for reflection on the ideas targeted in this section of items;

•   that students generally begin the course fairly upbeat and positive—idealistic, one might argue—about many of the issues, making growth/change unlikely;

•   that students lack real-world experience to make valid judgments in these areas; and

•   that relative to items where the pre-post change is to the negative, students become more realistic and worldly as the semester unfolds—perhaps even cynical and apprehensive about their future in the classroom.

However, some interesting patterns emerged via analysis of the Fall 2002 survey data.

•   The items with minimal change tended to be philosophical in nature: Modeling the use of technology isn’t my job (pre: m=1.72; post: m=1.67); I’m earning a credential in a subject area that doesn’t lend itself to technology use, including the Internet (pre: m=1.64; post: m=1.69); and My students’ many personal and educational needs make focusing on technology impractical (pre: m=1.78; post: m=1.81).

•   The items with positive significant change tended to focus on self-confidence or personal beliefs: The majority of my students are likely to know more about technology, including the Internet, than I do (pre: m=2.17; post: m=1.86); I feel awkward when confronted with using technology in my classroom (pre=1.98; post=1.70); and Technology could interfere with the personal relationships I develop with my students (pre: m=1.69; post: m=1.56).

•   The items with negative significant change tended to focus on logistics: There simply isn’t enough time to incorporate technology into classroom instruction (pre: m=1.82; post: m=2.02); and I expect to have so little access to technology in my future teaching that it won’t make much difference in the way I teach (pre: m=1.56; post: m=1.75).

Also consistent over the past five semesters is a decrease (often dramatic) in the percentage of students selecting I don’t know. Still, preservice teachers about to hit the job market appear to retain some ambivalence or hesitation about technology’s “promise,” a pattern that warrants further investigation.
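One analytic wrinkle in this cluster is worth making explicit: because I don’t know is coded 9 rather than sitting at the scale’s midpoint, those responses must be excluded before means are computed and their frequency tracked separately. A minimal sketch with invented responses (the variable names are ours, not the survey’s):

```python
# Invented responses on the 1-4 agreement scale, where 9 codes "I don't know".
responses = [1, 2, 9, 1, 2, 1, 9, 2]

def scale_mean(responses):
    """Mean agreement rating, with 'I don't know' (9) treated as missing."""
    valid = [r for r in responses if r != 9]
    return sum(valid) / len(valid)

def dont_know_rate(responses):
    """Share of respondents selecting 'I don't know'."""
    return sum(r == 9 for r in responses) / len(responses)

print(scale_mean(responses), dont_know_rate(responses))
```

Treating the 9s as ordinary ratings would inflate the means and mask exactly the decline in I don’t know selections that the evaluators report.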

Cluster Description. Yet another 12-item cluster focuses broadly on classroom integration. High ratings suggest active teaching, resourcefulness, and a potential commitment to technology. The five-point scale is anchored by Not important at all (1) and Extremely important (5).

Results/Interpretation. For each of the past five semesters, there has been little change in student perceptions pre to post. The evaluation team believes these findings reflect the youthfulness of the respondent pool, and the growing number who (anecdotally, at least) enter the preservice program rather familiar with several software applications (or software types) and likely to have used computers instructionally (perhaps frequently) while enrolled in middle and/or high school.

Analysis of the Fall 2002 survey data revealed two noteworthy trends. The first was not surprising, given that the course intentionally downplays “rote” technology use: the three items that earned pre/post ratings at or near 3.5 are all associated with fairly low-level instructional strategies.

•   Using subject-specific (math, science) drill and practice software programs—pre: m=3.56, post: m=3.47

•   Publishing student work electronically—pre: m=3.47, post: m=3.52

•   Creating school or classroom web pages—pre: m=3.61, post: m=3.41

The second was surprising, and even a bit disturbing: the items whose post-mean ratings fell below their pre-mean ratings are all associated with higher-order tasks or students’ own future roles as educational leaders.

•   Having students plan, compose, write, and/or edit stories, essays, or reports—pre: m=4.57, post: m=4.45

•   Having students conduct web-based research—pre: m=4.14, post: m=4.03

•   Having students communicate with others in their community or around the world—pre: m=4.07, post: m=4.02

•   Your own participation in professional development, whether or not focused on technology—pre: m=4.59; post: m=4.47


1 EdTec 470 is the one-semester course that teacher candidates earning an initial (or Level 1) credential complete to demonstrate state-mandated technology proficiency.

2 Students can request a waiver if they have completed (or are close to finishing) an advanced degree in instructional/educational technology from an accredited institution. Students may also opt to test out of the course. The current test, which features both traditional (multiple choice) items and performance tasks, is offered several times a year on a schedule established by the university.

3 It’s worth noting that the original tool (from which this version has been adapted) was designed to jump-start technology planning in the K-12 arena. Three key or guiding questions serve as the advance organizer: Is technology (at the school or district level) being used in ways that ensure the best possible teaching and learning? What is a school or district’s “technology profile?” What areas (at the site/district level) should be targeted to ensure effective integration?

4 In brief, students reflect on previous or new material; ask questions related to the topic; define procedures for investigation; find and investigate data/information that will help answer questions; manipulate the data; discuss and defend results; and reflect on results and restart the process if necessary.