Edwards, R. (2014). Assessment and collaboration in the context of the systematic design of blended PBL: A commentary on An (2013). Contemporary Issues in Technology and Teacher Education, 14(3). https://citejournal.org/volume-14/issue-3-14/general/assessment-and-collaboration-in-the-context-of-the-systematic-design-of-blended-pbl-a-commentary-on-an-2013

Assessment and Collaboration in the Context of the Systematic Design of Blended PBL: A Commentary on An (2013)

by Richard Edwards, University of Waikato

Abstract

In highlighting the importance of problem-based learning in the development of 21st century skills, An (2013) identified the challenges faced by novice teachers in its implementation and suggested strategies to support them. This commentary explores two aspects mentioned in the article, assessment and the role of collaboration, and argues that they need greater critical consideration if the implementation of problem-based learning is to be effective.  The role digital technologies can play is discussed and some implications for teacher education are considered.

In her 2013 article, Systematic Design of Blended PBL, Yun-Jo An made a strong case for problem-based learning (PBL) in the context of the development of 21st-century skills (Binkley et al., 2012; Partnership for 21st Century Skills, 2011). She explored the experiences of a group of PBL novices as they worked through their first experience of blended PBL in an online environment. From this experience she identified a number of challenges the novices faced and developed a set of useful suggestions for supporting them. Two specific areas of the research presented seemed to merit further exploration, and they are the focus of this commentary: the role of collaboration in PBL and approaches to the assessment of learning in PBL.

Collaboration

Educators and policy makers have increasingly recognized the importance of students being able to work together collaboratively. This skill has been identified in research as key for 21st-century living and employment (Binkley et al., 2012) and is, therefore, a key element in An’s (2013) justification for PBL, even though it is not explored in her research. Collaborative problem-solving has been included in the 2015 Programme for International Student Assessment (PISA; OECD, 2013) evaluation of education systems worldwide. Given that the problems in PBL are complex, ill-defined (at least initially), and open ended, the drawing together of diverse expertise and experience through collaboration is well suited to resolving them; hence, its recognition in descriptions of 21st-century skills.

Dillenbourg (1999) suggested that in its broadest sense, collaborative learning refers to a situation in which two or more people learn or attempt to learn something together. A more refined view distinguishes collaboration as a “coordinated, synchronous activity that is the result of a continued attempt to construct and maintain a shared conception of a problem” (McWhaw, Schnackenberg, Sclater, & Abrami, 2003; Roschelle & Teasley, 1995).  It is characterized by a sense of mutual engagement and agency rather than simply a division of labor, as is seen in cooperative activity. The outcome of collaboration is often greater than the sum of the individual contributions.

The PISA framework for collaborative problem-solving (OECD, 2013) took this definition further: “The capacity of an individual to effectively engage in a process whereby two or more agents attempt to solve a problem by sharing the understanding and effort required to come to a solution and pooling their knowledge, skills, and efforts to reach that solution.”

Although the focus is on the individual, there is an acknowledgement that collaboration skills can be assessed at the individual, group, or organizational level.  Any consideration of collaboration, therefore, needs to explore the nature of the group, the nature of the activity in which the group is engaged, and the nature of the interactions that contribute to completion of the activity (Dillenbourg, 1999). The concept of collaboration is process oriented even though the purpose is to achieve the agreed outcome.

The ATC21S (Assessment and Teaching of 21st Century Skills) project has conceptualized collaborative problem-solving in five broad strands within the two main areas of social and cognitive knowledge and skills (Griffin, Care, & McGaw, 2012). Within the social skills area, they identified the capacity of an individual to (a) recognize the perspective of other persons in a group (perspective taking); (b) participate as a member of a group by contributing knowledge, experience, and expertise in a constructive way (participation); and (c) recognize the need for contributions and how to manage them (social regulation).

In the cognitive skills area they identified the capacity to (a) identify the structure and procedures involved in resolving a problem (task regulation) and (b) build and develop knowledge and understanding as a member of the group (knowledge building). They also recognized that where the environment for collaboration involves digital technologies, collaborative problem-solving skills need to be supported by appropriate skills in the use of those technologies.

Based on this and on other theoretical frameworks, the PISA model identified three collaborative problem-solving competencies:

  • Establishing and maintaining shared understanding,
  • Taking appropriate action to solve the problem, and
  • Establishing and maintaining team organization.

The development of these competencies is influenced by student background, the development of core skills, and the context in which they are being applied.  Clearly, any consideration of individual performance in a collaborative setting needs to explore both social and cognitive elements and outcomes, as they are inextricably linked in collaborative activity. Also important are the context and the media in which collaboration occurs.

Much of the research into collaboration has looked primarily at the process of collaboration, particularly with respect to collaborative learning. In recent years, interest has grown in the ways in which information technologies can support collaboration, and several themes have emerged. The first is a recognition of the diversity of skills and backgrounds of participants and the need for people to develop collaborative skills, as well as the skills needed to achieve intended cognitive outcomes (Dawes & Sams, 2004; Fransen, Weinberger, & Kirschner, 2013; Montequín, Fernández, Balsera, & Nieto, 2013; Napier & Johnson, 2007).

Some of these data have come from analyses of the failure of collaborative projects in education (Baker, Bernard, & Dumez-Féroc, 2012; Kapur & Kinzer, 2009; Pathak, Kim, Jacobson, & Zhang, 2011). Attention has also focused on the role of design in regulating both the process and the outcomes of collaboration (Fischer, Kollar, Stegmann, & Wecker, 2013; Strijbos, Martens, & Jochems, 2004). Design relates both to the environment, for example through scripting or modeling, and to the choice of tools and media supporting collaboration. Another theme is the challenge of resolving, both practically and theoretically, whether the outcomes of collaborative learning should be treated as individual or as group outcomes (Akkerman et al., 2007; Dillenbourg, 1999; Salomon, 1993).

Thus, any consideration of PBL that involves only individuals misses much of the richness embedded in collaborative PBL and does not adequately address the collaborative knowledge and skills commonly identified as those needed in the 21st century. However, a number of challenges are associated with collaborative PBL, one of which is assessment, the focus of the next section.

Assessment

The design and implementation of PBL are important, yet they are a means to an end; there must be a way to comment meaningfully on, or at least to demonstrate, what has been learned. In the context of 21st-century skills, new approaches to assessment will likely be needed, particularly given the possibilities offered by the current rapid development of digital technologies. Twentieth-century approaches to assessment are unlikely to meet these needs adequately.

Assessment is the process by which performance of an individual or group is appraised and resulting judgements are made based on the consideration of evidence (Brown, 2008). The decisions from this process are used in a range of ways to inform future teaching and learning and to indicate levels of achievement or competence.

Assessment has been the focus of considerable research over many years, leading to greater recognition of the importance of formative assessment and an emphasis on the nature of the evidence being assessed. As a result, a wide range of definitions, principles, and practices has been published (e.g., Absolum, 2006; Brown, 2008; Dochy, 2009).

More recently, assessment has been theorized from the perspective of sociocultural theory (Crossouard, 2009; Moss, Pullin, Gee, Haertel, & Young, 2008). Such a perspective highlights social interaction and participation and sees learning in terms of distributed cognition and embodied cognition rather than simply individual cognitive change. The emphasis is placed on the problem or questions that evidence is needed to address. Multiple forms of evidence are used, and interpretation of the evidence takes into account the particular nature of the context. The role of classroom discourse is also considered (Hickey & Anderson, 2007). A sociocultural view shifts responsibility for assessment decision-making toward learners, with the result that they are more involved in the assessment processes (Bain, 2012).

Recent advances in digital technologies have provided opportunities for a much wider range of ways of collecting and collating performance evidence. This, together with the shift toward greater involvement of the learner, challenges traditional views of assessment. In a discussion of assessment discourse in higher education, Boud (2007) promoted a shift from the current focus on measurement and certification to a focus on assessment as informing judgement. In this view the learner becomes a more active participant in the learning process, consistent with research into the importance of what is called consequential validity (a measure of the consequences of assessment on desired learning; Admiraal, Hoeksma, van de Kamp, & van Duin, 2011; Bain, 2012; Cizek, Rosenberg, & Koons, 2008).  As learners grow in autonomy, they take on a greater role in the learning process and have a greater voice in assessment decision-making (Bain, 2012).

This process is particularly relevant in contexts where multiple outcomes and multiple ways of developing outcomes are possible, as is the case in PBL. Recent technological changes are making new approaches to assessing learning increasingly possible (Finger & Jamieson-Proctor, 2009; Luchoomun, McLuckie, & van Wesel, 2010; Newhouse, 2013), although they place new challenges and expectations on teachers and students, particularly related to the use of the technologies.

Digital technologies offer a broad range of forms of representation, such as images, video, audio, graphics, and text. They also provide tools for producing these representations, including mobile devices, digital cameras and video recorders, and Internet and networking technologies, which are easier and quicker to use than more traditional nondigital methods.

Procedural learning is an important part of PBL, and evidence of this kind of learning needs to be collected over time. Such evidence is often collated in the form of a portfolio (Newhouse, 2013), which enables the learner to take a more active role in the selection and presentation of appropriate evidence and supports a greater focus on the individual. Recent research into the use of digital portfolios (e.g., Kimbell, 2012; Williams, 2012) has highlighted the potential to broaden the forms of evidence that can be used to demonstrate developing capability.

Research into the assessment of collaborative learning commonly focuses on both the social and the cognitive aspects of collaboration (Montequín et al., 2013; OECD, 2013; Pazos, Micari, & Light, 2010; Persico, Pozzi, & Sarti, 2010; Strijbos, 2011), although attention is also paid to motivation (Kagan, 1995; Slavin, 1996). The research highlights the tensions produced by social loafing and free riders and the problems associated with assigning a common grade to a group (Kagan, 1995). These challenges have resulted in assessment being largely focused on the ability to collaborate rather than on the learning the collaboration is intended to facilitate.

While collaboration is recognized as a common element of PBL, it poses some major challenges for assessment, including whether to focus on the individual level or the group level, whether to focus on the extent to which group members gain or have the same (convergent) knowledge or on divergent knowledge, and whether the assessment should focus primarily on cognitive outcomes (Strijbos, 2011). Recent developments in digital technologies, including web-based networking and mobile devices with audio and image recording capability, have provided a range of more cost-effective and user-friendly ways to collaborate and to provide evidence for assessment.

In his introduction to a substantial research project in this area, Williams (2013) identified a critical need for research into the use of digital forms of representation for summative assessment. However, the project looked only at performance in tasks undertaken by individuals. Very little research has examined the use of digital representations of performance in group tasks, even though tools such as e-portfolios can now facilitate this type of assessment.

Role of Teacher Education

An (2013) identified a number of strategies that her research suggested support students in developing effective PBL design. However, she also pointed out that “many teachers are unfamiliar and uncomfortable with the new roles and responsibilities required by open-ended, learner-centered strategies” (para. 3) and that teachers’ actual practice often differs from the practice they espouse. The implication is that teacher education needs attention if the current shift toward PBL, in particular, and 21st-century learning, more generally, is to bear fruit. When the role of collaboration and a more learner-centered view of assessment are also considered, the importance of appropriate teacher education becomes even more evident.

Limited research exists, however, into teacher education models that support 21st-century learning (Griffin et al., 2012). Both the complexity of teaching and the need for new approaches are acknowledged (Chan & van Aalst, 2006; Darling-Hammond, 2006), but clearly, further work in this area is needed. Approaches suggested to date all reflect a commitment to 21st-century learning strategies and principles in teacher education, modeling what the authors hope teachers will subsequently adopt in their own classrooms.

Griffin et al. (2012) noted that extensive professional education, both preservice and in-service, will be required for teachers and teacher educators alike. In her discussion of 21st-century teacher education, Darling-Hammond (2006) advocated a coherent and well-integrated approach that links theoretical perspectives with clinical practice in schools, although this approach would clearly depend on the schools themselves modeling effective practice. As might be expected with something new, such schools are not likely to be common, at least initially. Chan and van Aalst (2006) explored Knowledge Building (Scardamalia & Bereiter, 2010) as a useful theoretical framework in conjunction with the collaborative use of digital technologies, a good example of what might be possible.

Summary

PBL clearly has an important place in the pedagogies of the 21st century. However, implementing PBL without critical consideration of its underlying concepts and philosophical positions is unlikely to provide the necessary support for 21st-century learning. Two concepts have been identified here with a view to promoting discussion. The first is collaboration, identified as both an important 21st-century skill and an integral part of effective PBL. It does, however, bring challenges, such as determining the focus of learning, deciding whether assessment should target the individual or the group, and establishing the level and nature of support students need to participate effectively.

The second concept is assessment. A different view is advocated here, one that places increased emphasis on the role of the learner and on the nature of the evidence on which assessment judgments are made.  The point is also made that for any of this discussion to result in significant educational benefits for students, it needs to take into account the context of teacher education.

References

Absolum, M. (2006). Clarity in the classroom. Auckland, NZ: Hachette Livre NZ Ltd.

Admiraal, W., Hoeksma, M., van de Kamp, M.-T., & van Duin, G. (2011). Assessment of teacher competence using video portfolios: Reliability, construct validity, and consequential validity. Teaching and Teacher Education, 27(6), 1019–1028. doi:10.1016/j.tate.2011.04.002

Akkerman, S., Van den Bossche, P., Admiraal, W., Gijselaers, W., Segers, M., Simons, R.-J., & Kirschner, P. (2007). Reconsidering group cognition: From conceptual confusion to a boundary area between cognitive and socio-cultural perspectives? Educational Research Review, 2(1), 39–63. doi:10.1016/j.edurev.2007.02.001

An, Y.-J. (2013). Systematic design of blended PBL: Exploring the design experiences and support needs of PBL novices in an online environment. Contemporary Issues in Technology and Teacher Education, 13(1), 61–79. Retrieved from https://citejournal.org/vol13/iss1/general/article1.cfm

Bain, J. (2012). Negotiating the vacuum: Constructing and applying assessment criteria to focus design learning. In Explorations of best practice in technology, design, and engineering education (Vol. 1, pp. 13–24). Gold Coast, Australia: Griffith Institute for Educational Research.

Baker, M., Bernard, F.-X., & Dumez-Féroc, I. (2012). Integrating computer-supported collaborative learning into the classroom: The anatomy of a failure. Journal of Computer Assisted Learning, 28(2), 161–176. doi:10.1111/j.1365-2729.2011.00435.x

Binkley, M., Erstad, O., Herman, J., Raizen, S., Ripley, M., Miller-Ricci, M., & Rumble, M. (2012). Defining twenty-first century skills. In P. Griffin, B. McGaw, & E. Care (Eds.), Assessment and teaching of 21st century skills (pp. 17–66). Dordrecht, Netherlands: Springer.

Boud, D. (2007). Reframing assessment as if learning were important. In D. Boud & N. Falchikov (Eds.), Rethinking assessment in higher education: Learning for the longer term (pp. 15–25). Oxon, UK: Routledge.

Brown, G. (2008). Conceptions of assessment: Understanding what assessment means to teachers and students. New York, NY: Nova Science Publishers.

Chan, C. K. K., & van Aalst, J. (2006). Teacher development through computer-supported knowledge building: Experience from Hong Kong and Canadian teachers. Teaching Education, 17(1), 7–26. doi:10.1080/10476210500527907

Cizek, G. J., Rosenberg, S. L., & Koons, H. H. (2008). Sources of validity evidence for educational and psychological tests. Educational and Psychological Measurement, 68(3), 397–412. doi:10.1177/0013164407310130

Crossouard, B. (2009). A sociocultural reflection on formative assessment and collaborative challenges in the states of Jersey. Research Papers in Education, 24(1), 77–93. doi:10.1080/13669870801945909

Darling-Hammond, L. (2006). Constructing 21st-century teacher education. Journal of Teacher Education, 57(3), 300–314. doi:10.1177/0022487105285962

Dawes, L., & Sams, C. (2004). Developing the capacity to collaborate. In K. F. Littleton, D. Miell, & D. Faulkner (Eds.), Learning to collaborate, collaborating to learn (pp. 95–109). New York, NY: Nova Science Publishers.

Dillenbourg, P. (1999). Introduction: What do you mean by “collaborative learning”? In P. Dillenbourg (Ed.), Collaborative learning: Cognitive and computational approaches (pp. 1–19). Oxford, UK: Elsevier Science Ltd.

Dochy, F. (2009). The edumetric quality of new modes of assessment: Some issues and prospects. In G. Joughin (Ed.), Assessment, learning and judgement in higher education. Wollongong, NSW: Springer.

Finger, G., & Jamieson-Proctor, R. (2009). Assessment issues and new technologies: ePortfolio possibilities. In C. Wyatt-Smith & J. J. Cummings (Eds.), Educational assessment in the 21st century: Connecting theory and practice (p. 309). Dordrecht, The Netherlands: Springer International.

Fischer, F., Kollar, I., Stegmann, K., & Wecker, C. (2013). Toward a script theory of guidance in computer-supported collaborative learning. Educational Psychologist, 48(1), 56–66. doi:10.1080/00461520.2012.748005

Fransen, J., Weinberger, A., & Kirschner, P. A. (2013). Team effectiveness and team development in CSCL. Educational Psychologist, 48(1), 9–24. doi:10.1080/00461520.2012.747947

Griffin, P., Care, E., & McGaw, B. (2012). The changing role of education and schools. In P. Griffin, B. McGaw, & E. Care (Eds.), Assessment and teaching of 21st century skills (pp. 1–15). Dordrecht, The Netherlands: Springer.

Hickey, D. T., & Anderson, K. T. (2007). Situative approaches to student assessment: Contextualizing evidence to transform practice. Yearbook of the National Society for the Study of Education, 106(1), 264–287. doi:10.1111/j.1744-7984.2007.00105.x

Kagan, S. (1995). Group grades miss the mark. Educational Leadership, 52(8), 68.

Kapur, M., & Kinzer, C. K. (2009). Productive failure in CSCL groups. International Journal of Computer-Supported Collaborative Learning, 4(1), 21–46. doi:10.1007/s11412-008-9059-z

Kimbell, R. (2012). The origins and underpinning principles of e-scape. International Journal of Technology and Design Education. doi:10.1007/s10798-011-9197-x

Luchoomun, D., McLuckie, J., & van Wesel, M. (2010). Collaborative e-learning: e-Portfolios for assessment, teaching and learning. Electronic Journal of E-Learning, 8(1), 21–29.

McWhaw, K., Schnackenberg, H., Sclater, J., & Abrami, P. C. (2003). From cooperation to collaboration: Helping students become collaborative learners. In R. M. Gillies & A. F. Ashman (Eds.), Cooperative learning: The social and intellectual outcomes of learning in groups (pp. 69–86). London, UK: RoutledgeFalmer.

Montequín, V. R., Fernández, J. M. M., Balsera, J. V., & Nieto, A. G. (2013). Using MBTI for the success assessment of engineering teams in project-based learning. International Journal of Technology and Design Education, 23(4), 1127–1146. doi:10.1007/s10798-012-9229-1

Moss, P., Pullin, D., Gee, J., Haertel, E., & Young, L. (Eds.). (2008). Assessment, equity and opportunity to learn. Cambridge, UK: Cambridge University Press.

Napier, N. P., & Johnson, R. D. (2007). Technical projects: Understanding teamwork satisfaction in an introductory IS course. Journal of Information Systems Education, 18(1), 39–48.

Newhouse, C. P. (2013). Literature review and conceptual framework. In P. J. Williams & C. P. Newhouse (Eds.), Digital representations of student performance for assessment (pp. 9–28). Rotterdam, The Netherlands: Sense Publishers.

OECD. (2013). PISA 2015: Draft collaborative problem solving framework. Retrieved from http://www.oecd.org/pisa/pisaproducts/Draft%20PISA%202015%20Collaborative%20Problem%20Solving%20Framework%20.pdf

Partnership for 21st Century Skills. (2011). Framework for 21st century learning. Retrieved from http://www.p21.org/about-us/p21-framework

Pathak, S. A., Kim, B., Jacobson, M. J., & Zhang, B. (2011). Learning the physics of electricity: A qualitative analysis of collaborative processes involved in productive failure. International Journal of Computer-Supported Collaborative Learning, 6(1), 57–73. doi:10.1007/s11412-010-9099-z

Pazos, P., Micari, M., & Light, G. (2010). Developing an instrument to characterise peer-led groups in collaborative learning environments: Assessing problem-solving approach and group interaction. Assessment & Evaluation in Higher Education, 35(2), 191–208.

Persico, D., Pozzi, F., & Sarti, L. (2010). Monitoring collaborative activities in computer supported collaborative learning. Distance Education, 31(1), 5–22. doi:10.1080/01587911003724603

Roschelle, J., & Teasley, S. D. (1995). The construction of shared knowledge in collaborative problem solving. In C. O’Malley (Ed.), Computer supported collaborative learning (Vol. 128, pp. 69–97). Berlin, Germany: Springer-Verlag.

Salomon, G. (1993). No distribution without individuals’ cognition: A dynamic interactional view. In G. Salomon (Ed.), Distributed cognitions: Psychological and educational considerations (pp. 111–138). Cambridge, UK: Cambridge University Press.

Scardamalia, M., & Bereiter, C. (2010). A brief history of knowledge building. Canadian Journal of Learning and Technology, 36(1), 1–16.

Slavin, R. E. (1996). Research on cooperative learning and achievement: What we know, what we need to know. Contemporary Educational Psychology, 21(1), 43–69. doi:10.1006/ceps.1996.0004

Strijbos, J.-W. (2011). Assessment of (computer-supported) collaborative learning. IEEE Transactions on Learning Technologies, 4(1), 59–73. doi:10.1109/TLT.2010.37

Strijbos, J. W., Martens, R. L., & Jochems, W. M. G. (2004). Designing for interaction: Six steps to designing computer-supported group-based learning. Computers & Education, 42(4), 403–424. doi:10.1016/j.compedu.2003.10.004

Williams, P. (2012). Investigating the feasibility of using digital representations of work for performance assessment in engineering. International Journal of Technology and Design Education, 22(2), 187–203. doi:10.1007/s10798-011-9192-2

Williams, P. J. (2013). Introduction and background. In P. J. Williams & C. P. Newhouse (Eds.), Digital representations of student performance for assessment (pp. 1–8). Rotterdam, The Netherlands: Sense Publishers.

Author Note

Richard Edwards
University of Waikato
NEW ZEALAND
Email: [email protected]

 
