Zydney, J. M., Angelone, L., & Rumpke, E. (2021). Lessons learned from an elementary school citizen science project. Contemporary Issues in Technology and Teacher Education, 21(3). https://citejournal.org/volume-21/issue-3-21/science/lessons-learned-from-an-elementary-school-citizen-science-project

Lessons Learned from an Elementary School Citizen Science Project

by Janet Mannheimer Zydney, University of Cincinnati; Lauren Angelone, Xavier University; & Erin Rumpke, University of Cincinnati

Abstract

This article describes a pilot study on the use of a computer supported collaborative citizen science project with elementary school students. From public data available on the web, the researchers sought to understand how students engaged in science practices within a citizen science project. In addition, the researchers examined the different roles that emerged within the citizen science community. A social media feed, including posts and comments, was collected from one project within the citizen science site and analyzed qualitatively using a content analysis and a role analysis. The results were contextualized to determine what guidance is needed to help teachers set up this type of project in their classrooms. The recommendations include scaffolding science practices, providing expectations for students on how to post on social media sites, and establishing productive partnerships with scientists in the community. Incorporating these guidelines within teacher education and professional development programs may help teachers provide their students with authentic research experiences through citizen science projects.

Educational policy such as the National Research Council’s (2012) A Framework for K-12 Science Education and the Next Generation Science Standards (NGSS Lead States, 2013) recommends incorporating authentic science and engineering practices into the K-12 curriculum. Participation in citizen science projects, collaborative research efforts between professional scientists and nonexpert members of the community, has been shown to support key science practices, as defined by A Framework for K–12 Science Education and the Next Generation Science Standards.

Involving students in practices that mimic those used by professional scientists (Condon & Wichowsky, 2018; Hartman & Kahn, 2017; Koomen et al., 2018) can improve students' scientific literacy and engage them in authentic research. At the same time, professional scientists benefit from increased data collection and observation efforts (Bonney et al., 2016).

Although participation in citizen science projects offers a creative way to integrate authentic research experiences into educational settings, realizing the full potential of citizen science in the K-12 setting remains a challenge for educators. A recent review of the literature noted that more research is needed within K-12 settings, particularly at the elementary school level (Tsivitanidou & Ioannou, 2020).

Although research has documented positive knowledge gains and increased scientific literacy, students struggle to practice higher order thinking skills and to change their attitudes toward science as a result of participation (Bonney et al., 2016; Phillips et al., 2019). Effective mentorship of students during a project is imperative to see meaningful outcomes; however, mentoring can be taxing on the already overloaded schedules of both classroom teachers and partnering professional scientists (Koomen et al., 2018; Shah & Martinez, 2016).

The purpose of this study was to examine students' science practices within a computer supported collaborative learning (CSCL) citizen science project as a means of helping teachers implement these types of projects in their classrooms. This study examined the interactions on a social media feed from a public citizen science website of one class of elementary school students, with the specific intent of answering the following research questions:

  • What science practices are found within the student interactions in the citizen science community?
  • What different roles do participants play within the citizen science community?

Literature Review

This section explores the intersection of the literature concerning citizen science, science practices, and roles within CSCL.

Citizen Science

Citizen science projects are research collaborations between professional scientists and nonexpert members of the community. By getting involved in these projects, students gain practice with authentic research and help professional scientists in their data collection efforts (Bonney et al., 2016).

Advances in web-based technologies have supported the expansion of citizen science opportunities. Mobile technologies facilitate the collection of far more data than ever before because anyone with access to a mobile device can contribute observations, enabling large amounts of data to be crowdsourced (Malykhina, 2013). Social media provides an avenue for the public to engage directly with the data and interact with the scientific community (Huang et al., 2018; Tsivitanidou & Ioannou, 2020).

Parallel to citizen science research, existing research on the use of social media as an instructional tool has focused on adult populations and less often on the assessment of formal learning outcomes in K-12 student populations (Askari et al., 2018). The use of social media for learning has been shown to foster active learning, increase the connections between students and their community, and improve collaboration (Greenhow et al., 2019). These affordances of social media could strengthen the value of citizen science opportunities as learning tools. Given the inherent concerns that surround the privacy and safety of students online, however, additional research is necessary to define best practices for applying these tools to classroom learning.

Although a wide range of citizen science projects are available to K-12 educators (Shah & Martinez, 2016), such as Project Noah (https://www.projectnoah.org/), Bumble Bee Watch (https://www.bumblebeewatch.org/), and Journey North (https://journeynorth.org/), to name a few, more research is needed on the use of citizen science projects in K-12 settings (Tsivitanidou & Ioannou, 2020). Much of the research on citizen science projects has focused on adults learning in informal settings (Koomen et al., 2016), and evaluation of specific learning outcomes of participants involved in citizen science projects is still limited (e.g., Bela et al., 2016; Bonney et al., 2016).

In K-12 settings, citizen science initiatives involve a unique triad of stakeholders – professional scientists, teachers, and students – where both teachers and students can work concurrently as citizen scientists, with teachers often providing a mentoring role for students in the process (Condon & Wichowsky, 2018; Koomen et al., 2018). The role of each stakeholder can vary depending upon the nature of the citizen science project and the age of the student.

In expert-led citizen science projects, interaction between students and professional scientists is generally limited; classroom teachers often provide training, tools, and support as students collect, prepare, and submit data for the project. Open-ended projects geared toward the coproduction of knowledge are characterized by extensive interactions between students and professional scientists, with teachers empowering students to become actively involved (Ciasullo et al., 2019).

Although citizen science can be an effective method for incorporating authentic research into the classroom, integrating these projects into teaching remains a challenge for educators. Effective mentorship of students during a project is imperative for meaningful outcomes, yet how to incorporate such practices effectively into the classroom remains an open question. To implement citizen science projects successfully in the classroom setting, teachers must have the opportunity to learn best practices through professional development workshops and technology resources (Condon & Wichowsky, 2018; Shah & Martinez, 2016). Moreover, preservice teachers may benefit from the integration of citizen science projects into teacher education programs (McGinnis et al., 2020). Additional research is needed in the area of classroom teaching and the impact of citizen science in K-12 education.

Science Practices Within Citizen Science

Citizen science projects often focus on the solution of real-world issues using a problem-based approach that aligns with the basic elements of inquiry-based instruction (Condon & Wichowsky, 2018; Mitchell et al., 2017). This approach also answers the call from A Framework for K-12 Science Education to "engage [students] with fundamental questions about the world and with how scientists have investigated and found answers to those questions" (National Research Council, 2012, p. 9). Participation provides students a firm base to shift from knowing science to using what they know to make sense of the world through action, decision-making, interpretation, and knowledge construction (Koomen et al., 2018).

Projects geared toward student participation typically guide students through structured inquiry to develop hypotheses, collect and analyze data, and draw conclusions (Wiggins & Crowston, 2011). As learners lead themselves through the participatory processes of a project, they move through the actions of performing research tasks, collaborating with other members of the project, experiencing challenges and troubleshooting solutions – all behaviors that align with the tasks of a professional scientist. Through the process of “participatory learning” students begin to find their own identity as emerging scientists in the community of practice and, thereby, begin to develop a grasp of how to practice science (Koomen et al., 2018).

Although more research is needed on the use of citizen science projects in K-12 spaces, according to a review of research by Tsivitanidou and Ioannou (2020), the studies conducted so far show promising results: allowing students to generate explanations, analyze data, and develop arguments (Koomen et al., 2016); giving students hands-on experience in the scientific process (Saunders et al., 2018); and increasing interest, motivation, perceived mastery, and positive attitudes toward nature (Kelemen-Finan et al., 2018). Of importance for this particular project, Kelemen-Finan et al. also found that these benefits were highest for primary students.

Citizen Science Roles Within CSCL

CSCL can be defined as the use of computers and other technologies to support activities in which learners collaborate to achieve a shared learning goal (Dado & Bodemer, 2017). CSCL can be distinguished from technology-enhanced education by its focus on collaboration and mutual knowledge construction (Sun et al., 2008) as well as shared information processing during collaboration (Rummel et al., 2011). Citizen science aligns well with CSCL because, when engaging in citizen science, students are given the opportunity to collaborate with peers and members of the greater scientific community. Understanding the nature of the collaboration of the students within the K-12 classroom and the larger citizen science community is of pointed interest to understanding what students are learning within these environments.

Studying the roles of citizen science members is a key element in understanding the collaboration and group work within these environments. Roles can be considered as "a microcosm of the complexity of CSCL, and may in fact constitute a central defining construct for the field" (Hoadley, 2010, p. 552). The CSCL literature identifies a number of different definitions of roles, depending on the research perspective (Hoadley, 2010).

This study adopted Biddle’s (1986) characterization of role as “behav[ing] in ways that are different and predictable depending on their respective social identities and the situation” (p. 68). It aligns with the research perspective of cognitive role theory, which is concerned with how roles form based on a person’s expectations of others and how those expectations manifest into behaviors. This theory encompasses several interrelated areas, including (a) role playing or performing a role; (b) considering the influence of group norms on different types of roles; (c) anticipating roles based on beliefs; and (d) role taking or temporarily projecting oneself into the roles of others (Biddle, 1986; Coutu, 1951).

Many researchers who study roles in CSCL environments have examined the impact of assigning roles for specific tasks or duties (see Hoadley, 2010). Others have argued that this approach oversimplifies the complexity of collaboration and group dynamics and have recommended studying the ways roles emerge naturally to understand how to support students in assuming more productive roles within these environments (Heinimäki et al., 2020; Oliveira et al., 2014). Instead of thinking of roles as an intervention, roles can be examined as an outcome measure as a means to understand the collaboration that took place (Hoadley, 2010).

 Since learning within CSCL contexts takes place on numerous levels within the community, it must be examined purposefully using different data sources. These sources should then be integrated to provide a more complete picture (Dado & Bodemer, 2017). This study sought to better understand the science practices and roles within a CSCL environment by integrating the findings from two different analyses.

Method

This pilot study utilized a qualitative approach to evaluate public data of one elementary class's use of Project Noah (https://www.projectnoah.org), a nonprofit, social media site dedicated to connecting citizen scientists through sharing observations of wildlife. Project Noah comprises various missions that are focused on different environmental efforts, such as tracking how the biodiversity of a species changes. The teacher posted student work in a mission called Global Schoolyard Blitz, an effort to understand the biodiversity of wildlife found in schoolyards. This schoolyard was located in a small midwestern city. Each student group documented a different observation or spotting of wildlife, including different types of insects as well as a squirrel. Professional scientists interacted with the students asynchronously through a social media feed.

Student groups were each assigned unique numbers under which to comment on each other's work. Even though groups used a unique number, ostensibly for privacy purposes, some students included their names in the comments. As such, these comments were not directly quoted but were summarized. In reporting the data, only direct quotes that could not be traced back to student names through a Google search were used. All of these direct quotes are cited verbatim, retaining grammar and spelling issues, to allow for a more authentic voice of the children.

Participant posts and comments from one class were downloaded for analysis. Data were analyzed using two methods: deductive content analysis (Hsieh & Shannon, 2005), using an a priori coding scheme derived from the NGSS science practices, and an inductive role analysis to understand the group dynamics between citizen science members (Carspecken, 1996).
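As an illustration of what preparing this kind of dataset might look like in practice, the sketch below organizes downloaded posts and comments into simple records before coding. It is a minimal sketch, not the authors' actual procedure; the file name, column names, and fields are hypothetical.

```python
# Hypothetical sketch: organizing exported posts and comments for qualitative coding.
# The CSV file name and column names are assumptions, not the study's actual data format.
import csv
from dataclasses import dataclass, field
from typing import List


@dataclass
class Entry:
    group_id: str      # anonymized student-group identifier
    kind: str          # "post" (spotting) or "comment"
    text_field: str    # e.g., "description", "notes", "comment"
    text: str          # verbatim content to be coded
    codes: List[str] = field(default_factory=list)  # NGSS practice codes assigned by the coder


def load_entries(path: str) -> List[Entry]:
    """Read exported posts/comments into Entry records ready for deductive coding."""
    with open(path, newline="", encoding="utf-8") as f:
        return [
            Entry(row["group_id"], row["kind"], row["text_field"], row["text"])
            for row in csv.DictReader(f)
        ]
```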

Content Analysis

To analyze the ways that the science practices were supported, a deductive content analysis (Hsieh & Shannon, 2005) was conducted using an a priori coding scheme. The a priori scheme was based on the NGSS science and engineering practices in Appendix F of the NGSS (NGSS Lead States, 2013) and utilized by Brownstein and Horvath (2016) in a study of science and engineering practices in edTPA (“educative” Teacher Performance Assessment) portfolios.

In Appendix F, each science practice is operationalized by grade level. The focus on science practices without the inclusion of engineering practices was intentional, as this project did not involve engineering components. The operationalized definitions for grades K-2 and 3-5 were both used to code the data based on the grades of the students who participated in the citizen science project. For example, Practice 1 is “Asking Questions and Defining Problems.” This practice is then operationalized at the K-2 and 3-5 level using the characteristics in Table 1. These characteristics were used to code each post and comment within the Project Noah mission under study.

Table 1   Practice 1: Asking Questions and Defining Problems

Grades K-2

Asking questions and defining problems in K–2 builds on prior experiences and progresses to simple descriptive questions that can be tested.

- Ask questions based on observations to find more information about the natural and/or designed world(s).

- Ask and/or identify questions that can be answered by an investigation.

- Define a simple problem that can be solved through the development of a new or improved object or tool.

Grades 3-5

Asking questions and defining problems in 3–5 builds on K–2 experiences and progresses to specifying qualitative relationships.

- Ask questions about what would happen if a variable is changed. Identify scientific (testable) and non-scientific (non-testable) questions.

- Ask questions that can be investigated and predict reasonable outcomes based on patterns such as cause and effect relationships.

- Use prior knowledge to describe problems that can be solved.

- Define a simple design problem that can be solved through the development of an object, tool, process, or system and includes several criteria for success and constraints on materials, time, or cost.
Note. Adapted from the NGSS (NGSS Lead States, 2013, p. 385).

Using the characteristics of each science practice from Appendix F of the NGSS, the data were coded a first time, marking each post or comment with a practice and noting any instances needing further analysis due to overlap, vague comments/posts that required interpretation, or data that did not fit into the coding scheme. In the second round of coding, one practice, Practice 7, was used to conduct a more detailed analysis that applied what was learned in the first round of coding. Coded examples of Practice 7 were organized into examples and nonexamples. The remaining science practices were then coded in detail.
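To convey how the coded data could be summarized into the frequency counts reported in the appendix, the following is a minimal sketch under the assumption that each coded post or comment carries zero or more NGSS practice labels; it is illustrative only and is not the analysis software used in the study.

```python
# Hypothetical sketch: tallying NGSS practice codes assigned during the deductive content analysis.
from collections import Counter

# Assumed structure: each coded entry lists the practice codes it received (an entry may
# receive more than one code, e.g., an observation coded as both Practice 3 and Practice 4).
coded_entries = [
    {"text": "Where did you find this?", "codes": ["Practice 1"]},
    {"text": "The pill bug that we found is a little chubby ...", "codes": ["Practice 3", "Practice 4"]},
    {"text": "Nice job but I think it is a fly", "codes": ["Practice 7"]},
]

# Count how often each practice appears across all posts and comments.
frequencies = Counter(code for entry in coded_entries for code in entry["codes"])
for practice, count in sorted(frequencies.items()):
    print(f"{practice}: {count}")
```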

Role Analysis

To analyze the different roles participants played within the community, a role analysis was conducted to understand the interactions between citizen science members within social media feeds (as in Carspecken, 1996). According to Carspecken, "a role is a pragmatic unit of meaning, understood holistically but only in such a way as to perform congruently with it" (p. 136). Identifying a particular role makes it possible to ascertain expected behaviors that may predict future behaviors and how those behaviors play out within the group.

The first step of the role analysis involved inductively coding the possible meaning fields for each comment. Meaning fields are "meanings that other people in the setting might themselves infer, either overtly or tacitly" (Carspecken, 1996, p. 95). Although it is not possible to ascertain with certainty what a student intended, the goal is to specify all the possible meanings. As an example, consider the comment, "I could never have a better picture than that who saw it that would be very hard to spot a fly like that." The possible meaning fields could be an enthusiastic compliment or a sarcastic comment, since the photo of the fly was a closeup but a little out of focus.

The data set was then examined for patterns across all the possible meaning fields. The patterns indicated, both for this particular student and for the norms of the class as a whole, that this comment was most likely supportive. Although individual comments were not classified, related patterns of meaning fields emerged thematically from the data as roles. Roles were then described from a first-person perspective, following the recommendation by Carspecken (1996):

You should avoid role analysis… that takes a purely third person perspective on roles, defining them by their function in interactive settings as only a detached observer could perceive. The role analysis that I advocate prioritizes the first and second positions, rather than third, and defines roles according to their meaning, not their function. (p. 139)

By examining the underlying meaning of the text from a first-person perspective, the characteristics and behaviors of the actors emerged. The roles were then further examined through the perspective of cognitive role theory.

Results

This section describes results of the content analysis of the science practices and the role analysis of the different positions participants played in the citizen science community.

Students’ Science Practices Identified

Of the eight science practices found in the NGSS, seven were identified within the student experience in Project Noah. Practice 2, “Developing and Using Models,” was absent from the analyzed project within the website. See the appendix for a summary of codes, occurrences, and instances of each.

In Project Noah, the teacher posted each group’s spotting in a template provided by the site (see Figure 1). Student groups were then able to comment on the spotting and engage in interaction through a social media feed.

Figure 1   Project Noah Spotting Template

Note. Reprinted with permission from Project Noah (April 12, 2020).

Each text field in the template seemed to promote particular practices. In the text field for both the common name and the scientific name, Practice 6 was identified. Student groups provided an evidence-based explanation by making a claim in the identification of the species from evidence they had researched. In the text field for the description and habitat, Practices 3 and 4 were found. Practice 3 includes making observations, and Practice 4 includes recording observations, which were both taking place. In the text field for notes, Practice 8 was identified most commonly, as student groups would use that space to include information that they researched about the species they were attempting to identify.

After the teacher posted each group's spotting, other groups and professional scientists from outside the class were able to comment and engage in a discussion around that spotting. Practices 1 and 7 were found in the comments section. Practice 1, which is focused on testable questions or questions based on observations, was identified occasionally as students asked clarifying questions about other spottings. Practice 7 was found more consistently within the comments, since commenting allowed participants to interact with one another, which is needed for productive argumentation. Practice 8 was also identified throughout the use of Project Noah, as this practice is in part focused on communicating information in a variety of formats.

Practice 1

Practice 1 is mostly focused on asking testable questions, which were absent from the data. One facet of Practice 1 at the K-2 level, however, is to begin by asking questions based on observations to find out more information. In the comments on the spottings, several instances of this type of question appeared. All four instances of this type of question had to do with the location of the spotting. For example, one student group wrote, “Where did you find this? I can’t think of any place on the playground where it would be!” This statement demonstrates that students used the comments to ask questions for more information but did not go further to ask testable questions or questions about changes in variables, which is a goal at the Grades 3-5 level.

Practices 3 and 4

 Practice 3 is mostly focused on planning and carrying out controlled investigations. At the Grades 3-5 level, testing variables one at a time is introduced. One part of planning and carrying out investigations is making observations, which is the main way that Practice 3 was supported in this project. The data included 10 instances of Practice 3. Nine of the instances were in the description and habitat text fields of the Project Noah template. One student group made the following observations: “The pill bug that we found is a little chubby, brown and black, striped, hard exoskeleton and a soft squishy inner body and antennae.” One instance was found in the comments to the spottings as a response to a question: “Yes, it was alive. It did move and we had to take the pic really fast. Also, we had to be really quiet to not scare it away.”

Practice 4 is mainly focused on analyzing and interpreting data. One part of this practice is collecting, recording, and sharing observations. As the NGSS Appendix F (NGSS Lead States, 2013) makes clear, practices often intentionally overlap. Practice 4 overlaps with Practice 3 in this way, as illustrated by evidence from this study of students making observations that they recorded and shared on Project Noah. Therefore, all data coded as Practice 3 were also coded as Practice 4, since students did not go beyond the making and recording of observations in these data. At the Grades 3-5 level, making quantitative observations is introduced, and three instances of students quantifying data were found. For example, "Description: Size (adult length):10mm to 65mm (0.39in to 2.56in). Identifying colors: brown, red, tan, and orange Additional descriptors: legs, princers."

Practice 5

Practice 5 is focused on mathematics and computational thinking, but both Grades K-2 and 3-5 levels include the need to determine when to use qualitative or quantitative data. The project was set up to gather more qualitative types of data through photographs and written descriptions and, as such, only three instances of Practice 5 were found in the form of quantitative observations. An example of a description that included quantitative data is, “A lifespan of a fly is usually 28 days. A fly has 2 wings.”

Practice 6

Practice 6 is focused on students constructing evidence-based accounts of natural phenomena. At the Grades 3-5 level, it includes explaining variables and relationships. In this project, each of the 13 instances coded as Practice 6 was at the K-2 level. In nine instances, student groups used their observations of the animal as evidence to identify it. This response took place in the text field where student groups identified the common and scientific name of the animal. It served as an evidence-based explanation. An example of this sort of explanation was as follows: "Common Name: American Oil Beetle Scientific Name: Meloe americanus." One instance of Practice 6 was in the comments, as students responded to one another, "This larvae moves slower than worms so we could tell it was not a worm."

Practice 7

Practice 7 centers around argumentation based on evidence. In Grades K-2, Practice 7 means comparing ideas about the natural world; in Grades 3-5, it progresses to critiquing the scientific ideas of peers in various ways. All instances coded as Practice 7 were in the comments on the post of a spotting. Students engaged in argumentation when they agreed or disagreed with other groups' explanations or claims using evidence. This response occurred twice during the project, once by a student group and once by a professional scientist. The student group disagreed and offered evidence: "I don't know if this is a long legged sac spider because I looked them up and that's not what they look like. #Spiders ;)." The scientist modeled evidence for their disagreement as well.

This small fly is a nonbiting fly. It is called a Wood Gnat. They like to feed on decaying wood and vegetation and fermenting sap. I can see why you might be confused about what type of fly, since they do somewhat resemble mosquitoes. Very cool find! I have never seen one of these before.

This sort of argument using evidence was limited in the data. Most argumentation occurred without evidence. Four instances of agreeing or disagreeing without using evidence were found, such as, “Nice job but I think it is a fly.”

Practice 8

Practice 8 is focused on obtaining, evaluating, and communicating information, which includes reading scientific texts to gather information as well as communicating information scientifically. As a whole, through the use of the citizen science site, student groups were communicating scientific information. In that way, all 10 spottings written by student groups supported the communication of scientific information.

In addition, in the notes section (six instances) and in the comments to the post (six instances), students appeared to have conducted research to obtain information and used it as evidence or to support their argumentation. An example of Practice 8 in the notes was as follows: "It makes the sound 'kat-i-did.' They lay their egg in a single row, they overlap each other, they are often not the same color." An example of Practice 8 in the comments to posts is as follows: "#FUN FACT: Did you know that these bugs are also called Rolly Pollys."

Roles Within the Citizen Science Community

Four main roles emerged from the comments made within this citizen science community. The most common role that surfaced was participants acting as enthusiastic supporters for one another. Other less common, but also meaningful, roles were participants acting as experts or authority figures, imitators of adults, and game players uncovering secret identities.

Participants as Enthusiastic Supporters

Many of the participants – both the students and the professional scientists – acted as enthusiastic supporters for one another. This supportive role can be understood through cognitive role theory, which suggests that group norms influence the types of roles that people take on within the community (Biddle, 1986). A shared norm for both Project Noah community members and the elementary school that participated is to show respect for one another. This norm is explicitly mentioned on the Project Noah site: "All community members are expected to respect each other in the use and/or engagement with this website, subdomains, social media and/or apps" (Project Noah, 2020, "Terms & Conditions"). The school's website likewise lists "Be respectful" as one of three core behavioral expectations for its students.

One way to show respect is by acting in a supporting role, which was illustrated through the comments on the site from both the students and professional scientists. Approximately 60% of the comments made were supportive in nature. Compliments ranged from simple one-word responses, like "COOL!!!!!!!!!" or "NICE!" to more elaborate and descriptive compliments, such as "I could never have a better picture than that who saw it that would be very hard to spot a fly like that." Often these supportive comments used exclamation points or emoticons like smiley faces to emphasize their enthusiasm.

About a third of these complimentary comments were related to the appreciation of the photography. Examples of these types of compliments included “wow! great pic! i luv the angle you shot from!” and “Nice Close Up!” Others tried to ask questions that indicated an interest in knowing more, such as “Nice find. How did you get it to stay still?” or “Wow that’s really neat where did you find it???” Although not as common, some supportive comments related to the content of what was found. For example, one comment in relation to a centipede noted, “Awesome they HAVE SO MANY LEGS! How did you find such an interesting animal?”

Participants as Experts

Several students and outside community members took on an expert or authority role on the subject matter, ranging from science to photography. These roles can also be understood by examining both shared and divergent norms within these communities and their relation to the roles that members play. One of the shared norms in both a classroom and a citizen science community is the sharing of expert knowledge. For example, classrooms have a norm of sharing expertise to build understanding and knowledge. Although not explicitly stated on the school's website, teachers generally impart their expertise and encourage students to share their knowledge. These approaches were exemplified by all student groups in their initial posts, which shared their scientific knowledge on the insect or animal found. In addition, six student groups added their expert knowledge within the discussion comments.

Two of these groups added multiple comments that contributed to the scientific knowledge shared. For example, one group shared helpful information about the centipede: “careful poisonous fluid oozes out from under armor plates sometimes when it senses danger)-(when it feels threatened!!)” Another group shared their expertise on mosquitos: “Did you know only female mosquitos bite?” While the majority of knowledge/expertise shared was related to scientific concepts, some shared their knowledge in computers and photography. For example, one group explained how to take a close-up picture by zooming in: “I know how the person who took the pic got it like that! The person zoomed in on that pic.”

Similarly, we observed on the Project Noah site that community members were expected to share their expertise with one another to help identify wildlife. For example, one scientist helped a student group reidentify their insect from a mosquito to a wood gnat fly by sharing knowledge and providing the resource Bugguide.net. One distinction between the community norms on the Project Noah site and those within a classroom is that community members simply provided the answers, whereas teachers often coach students to find their own answers. The professional scientists' lack of understanding of the norms of the classroom when commenting on the students' posts may have prevented students' further discussion and inquiry on the topic.

Children as Imitators of Adults

Children often want to be more like adults and try to mimic the expressions or mannerisms of adults they admire. This notion is illuminated by role playing within cognitive role theory, which "is said to appear naturally in the behavior of children and can be practiced as an aid in both education and therapy" (Biddle, 1986, p. 74). For example, one professional scientist, who was perceived as an authority figure on the site, used an interesting phrase, "Neat find!" and in a later post, "Very cool find." The use of "find" as a noun is likely not common among young children, who would be more familiar with using the word as a verb, as confirmed by an early childhood teacher. Yet, in eight instances the word "find" was used as a noun by the students.

Most of these uses appeared in the same thread with the scientist, but two other instances of students using the word find as a noun appeared in other threads. In one of these cases the same group had also participated in the thread with the scientist and likely carried over the use of this word to another thread.

In addition to repeating the phrasing used by adults on the site, several students repeated the scientist's conclusion that the mosquito was mistakenly identified and was actually a fly. However, not all students adopted the scientist's conclusion on the identification; some instead asked whether the student had been bitten: "Did You Get Bitten??? is that on somebodies arm???" It is difficult to know whether these students read the scientist's comment and chose to disagree or simply did not read the earlier comments. Notably, none of the comments related to the spotting being a mosquito used the word find, suggesting that these students may have missed the scientist's comment.

The young children also seemed to mimic other expressions of older peers and adults who are regular social media users. Examples of emoticons, texting abbreviations, and hashtags were included throughout the students' comments. In a couple of cases, these social media expressions seem to have been used appropriately. For example, "I don't know if this is a long legged sac spider because I looked them up and that's not what they look like. #Spiders ;)" However, in other cases, the children might not have fully understood their meaning. For example, one student group used "LOL" after complimenting another group on the angle of the shot of the photograph, which did not seem particularly funny. Perhaps some side joke was happening in the actual classroom that was not detectable in the online comments.

Children as Game Players Uncovering Secret Identities

Another explicit norm posted on the school's website as a core expectation for students was "Be safe." Schools often have safety rules guiding what identifiable information can be posted about students on public sites. Similarly, Project Noah's terms state, "Think before you post any information or materials that identify you personally" (Project Noah, 2020, "Terms & Conditions"). As such, many users on the site posted under a pseudonym rather than their own name. In line with these norms, the teacher appeared to have given each student group a generic account name with a number. However, this precaution had some unintended consequences, resulting in a game-playing role in which the students tried to detect each other's secret identity.

This behavior took on a variety of forms, including directly revealing their identity, hinting at their identity, or asking others their identity. Four student groups directly revealed their names by signing their name to the post, which likely was against the school rules for posting on a public site. Whether this behavior was intentional rule breaking or whether they simply forgot is not clear. Others identified themselves using the first-person pronouns "me" or "us" without actually stating their names. Side conversations may have been occurring in the classroom where "me" was revealed. Other students hinted at their identity. For example, one comment stated, "I LOVEEEEEEEEEE Squirrels. I bet you know who this is though." Another student group came right out and asked others what group they were in: "That is awesome you guys! What group are you?"

Discussion

Citizen science projects provide a means to incorporate authentic research experiences into educational settings (Bonney et al., 2016). However, prior research has shown that providing effective mentorship of students can be challenging for educators (Koomen et al., 2018; Shah & Martinez, 2016). This study examined public data of one elementary class’s postings in a citizen science CSCL project to understand how best to support teachers who want to do this type of project in their classrooms.

By examining this data, this study contributes to best practices for classroom teaching when engaging in citizen science CSCL projects. These best practices include providing explicit instruction to support science practices, establishing clear guidelines for students on how to post on public social media sites, and finding outside community leaders who can model science practices effectively.

Providing Explicit Instruction to Support Science Practices

The findings around science practices align with previous studies, indicating that these sorts of practices are supported in citizen science projects (Crawford, 2012; Koomen et al., 2016) and that providing appropriate scaffolding is necessary to ensure a successful learning experience (Rienties et al., 2012). The findings in this study demonstrated that specific science practices were identified in particular text fields in the template for posting spottings on Project Noah. According to the operationalized definitions of the science practices found in Appendix F of the NGSS (NGSS Lead States, 2013), the majority of the practices were found to be at a K-2 level, and at times superficially so, with more nonexamples of Practice 7 than high-quality examples of argumentation.

Opportunities exist within the Project Noah template to support these practices more deeply through pedagogical supports and scaffolding (Zydney, 2012). For example, students could be provided prompts to support their arguments similar to the claim, evidence, and reasoning framework (McNeill & Krajcik, 2011). Writing prompts (McNeill & Martin, 2011) and sentence prompts (Allen & Park Rogers, 2015; Zydney, 2008) have been used in similar work around supporting science practices to help students guide and organize their thinking.

Further, when social media has been used in K-12 classrooms, teachers have found the need to model the difference between formal and informal language (Greenhow et al., 2020). In the Project Noah spotting, this scaffolding of more formal science discourse could take place in the text field for notes. For example, if students are to identify the organism they found in the common name text field, they could be provided sentence starters for the notes text field to better support their claims with evidence. The teacher could provide sentence starters for the notes field, such as, "My evidence and reasoning for identifying this organism is…" The comments section also provides room for higher quality argumentation from evidence. Students could use sentence starters such as, "I agree with your claim and my evidence for that is…" or "I disagree with your claim and my evidence for that is…" Teachers could have the sentence starters posted on the board in the room or in a digital document easily accessible to the class.

Establishing Clear Guidelines

One of the unintended consequences of giving student groups numbers under which to post their work was that it created a guessing game for students, as their identities were not only hidden from the outside community but also hidden from each other. Uncovering this game-playing role helps to determine what guidelines are needed to support students in assuming more productive roles in the future (Heinimäki et al., 2020; Oliveira et al., 2014). One recommendation is to provide students with a list of names associated with the student numbers, so that within the classroom, students know with whom they are communicating. This strategy will help students stay focused on the scientific task and not become distracted by trying to guess other identities.

In some cases, students revealed their names in the postings. Prior to engaging in this type of project, teachers should review the school’s acceptable use policy with students and remind them whether or not it is acceptable to post with their first names, initials, or assigned number. Any postings that violate the school’s acceptable use policy should be deleted by the teacher. Teacher education and professional development programs should prepare teachers to look closely at the school’s acceptable use policy for public sites so that they are familiar with what is allowable for their students.

Finding Meaningful Community Partnerships

Collaborative partnerships between professional scientists and teachers are a critical element for the success of implementing citizen science projects within schools (Hod et al., 2018). However, not all partnerships are effective in helping students achieve the learning outcomes. The findings from this study demonstrated that the outside scientists had a powerful impact on the discussion, with many students mimicking both their recommendations as well as their terminology. One problem noted was that the professional scientists simply provided students with the correct identification of their insect, as opposed to getting them to figure out the answer on their own.

The discussion around the mosquito was not one of the more active discussions within the community, likely because the answer provided by the two scientists effectively shut down the student discussion rather than prompting further discussion. Instead of telling the students that the mosquito was in fact a wood gnat fly, the scientist could have asked the students what their evidence was for identifying the insect as a mosquito. The scientist could also have asked whether the fact that the student was not bitten might alter their conclusion, and could have provided resources on mosquitoes and the frequency with which mosquitoes bite to help the students evaluate their own conclusion. Gaining an understanding of the classroom norms before commenting on the students' posts may have helped the scientists promote deeper discussion and inquiry.

Given the impact that authority figures can have, it is important to select community partners on these citizen science sites who can help schools meet their learning objectives (Hod et al., 2018). Scientists are not trained as teachers and may not know how to coach students to find answers on their own. Teacher education and professional development programs can help prepare teachers with resources for selecting successful citizen science partnerships as well as training materials they can provide to their partners to enable them to ask the types of questions that can generate a scientific discussion among the students.

Future Work

This activity took place prior to the COVID-19 pandemic but could easily be adapted for remote or socially distanced classrooms. Leveraging existing citizen science opportunities can be a practical and accessible method to engage students in authentic science practices from a distance, given that most citizen science projects are completed outside of the traditional classroom setting and can be completed using a tablet or smartphone.

Students learning remotely can explore the natural world surrounding their own homes and participate in Project Noah by sharing images from their backyard or a local park. Students can also actively participate in Project Noah by reading posts on social media feeds, researching new species, and commenting. For educators who have returned to teaching in person during the pandemic, designing lesson plans that incorporate observing nature through citizen science sites, such as Project Noah, takes advantage of the opportunity to be outside, which can be a safer instructional environment. A directory of citizen science opportunities, searchable by age group, is available for interested educators (https://scistarter.org/finder).

Conclusion

In conclusion, this exploratory study contributed to best practices when implementing citizen science projects in the classroom. First, when completing citizen science CSCL projects, teachers should provide explicit instruction to support science practices. One idea is for teachers to provide sentence starters to scaffold students’ scientific discussions.

Second, clear guidelines must be established for students on how to post on public social media sites, following the school's acceptable use policy for technology. Third, the success of the project depends on finding strong partnerships with outside community leaders. Teachers need resources and training materials that they can provide to partners so that partners can effectively model science practices for students. Teaching these best practices within teacher education programs and through professional development opportunities may enable teachers to provide the type of mentorship and structure needed to incorporate authentic research experiences through citizen science CSCL projects in their classrooms.

References

Allen, J., & Park Rogers, M. (2015). Putting ideas on paper: Formulating scientific explanations using the claim, evidence, and reasoning (CER) framework. Science and Children, 53(3), 32-37.

Askari, E., Brandon, D., Galvin, S., & Greenhow, C. (2018). Youth, learning and social media in K-12 education: The state of the field. Proceedings of International Conference of the Learning Sciences, ICLS, 1(2018-June), 344–351.

Bela, G., Peltola, T., Young, J. C., Balázs, B., Arpin, I., Pataki, G., Hauck, J., Kelemen, E., Kopperoinen, L., Van Herzele, A., Keune, H., Hecker, S., Suškevičs, M., Roy, H., Itkonen, P., Külvik, M., László, M., Basnou, C., Pino, J., & Bonn, A. (2016). Learning and the transformative potential of citizen science. Conservation Biology, 30(5), 990–999. https://doi.org/10.1111/cobi.12762

Biddle, B. J. (1986). Recent developments in role theory. Annual Review of Sociology, 12(1), 67-92. https://doi.org/10.1146/annurev.so.12.080186.000435

Bonney, R., Phillips, T. B., Ballard, H. L., & Enck, J. W. (2016). Can citizen science enhance public understanding of science? Public Understanding of Science, 25(1), 2–16. https://doi.org/10.1177/0963662515607406

Brownstein, E. M., & Horvath, L. (2016). Next generation science standards and edTPA: Evidence of science and engineering practices. Electronic Journal of Science Education, 20(4), 44-62. https://eric.ed.gov/?id=ED571279

Carspecken, P. F. (1996). Critical ethnography in educational research: A theoretical and practical guide. Routledge.

Ciasullo, M. V., Manna, R., & Palumbo, R. (2019). Developing a taxonomy of citizen science projects in primary school: Toward sustainable educational quality co-production. The TQM Journal, 31(6), 948-967. https://doi.org/10.1108/TQM-03-2019-0083

Condon, M., & Wichowsky, A. (2018). Developing citizen-scientists: Effects of an inquiry based science curriculum on STEM and civic engagement. The Elementary School Journal, 119(2), 196–222. https://doi.org/10.1086/700316

Coutu, W. (1951). Role-playing vs. role-taking: An appeal for clarification. American Sociological Review, 16(2), 180-187. https://doi.org/10.2307/2087691

Crawford, B. A. (2012). Moving the essence of inquiry into the classroom: Engaging teachers and students in authentic science. In K. Chwee, D. Tan, & M. Kim (Eds.), Issues and challenges in science education research (pp. 25–42). Springer.

Dado, M., & Bodemer, D. (2017). A review of methodological applications of social network analysis in computer-supported collaborative learning. Educational Research Review, 22, 159–180. https://doi.org/10.1016/j.edurev.2017.08.005

Greenhow, C., Galvin, S. M., Brandon, D. L., & Askari, E. (2020). A decade of research on K-12 teaching and teacher learning with social media: Insights on the state of the field. Teachers College Record, 122(6), 1-72. https://www.tcrecord.org (ID Number: 23303)

Greenhow, C., Galvin, S. M., & Staudt Willet, K. B. (2019). What should be the role of social media in education? Policy Insights from the Behavioral and Brain Sciences, 6(2), 178–185. https://doi.org/10.1177/2372732219865290

Hartman, S., & Kahn, S. (2017). Start local, go global: Community partnerships empower children as scientists and citizens. Social Studies and the Young Learner, 29, 3–7. https://www.socialstudies.org/system/files/publications/articles/yl_2904173.pdf

Heinimäki, O. P., Volet, S., & Vauras, M. (2020). Core and activity-specific functional participatory roles in collaborative science learning. Frontline Learning Research, 8(2), 65-89. https://files.eric.ed.gov/fulltext/EJ1252753.pdf

Hoadley, C. (2010). Roles, design, and the nature of CSCL. Computers in Human Behavior, 26(4), 551-555. https://doi.org/10.1016/j.chb.2009.08.012

Hod, Y., Sagy, O., & Kali, Y. (2018). The opportunities of networks of research-practice partnerships and why CSCL should not give up on large-scale educational change. International Journal of Computer-Supported Collaborative Learning, 13, 457–466. https://doi.org/10.1007/s11412-018-9287-9

Hsieh, H. F., & Shannon, S. E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277-1288. https://doi.org/10.1177/1049732305276687

Huang, J., Hmelo-Silver, C. E., Jordan, R., Gray, S., Frensley, T., Newman, G., & Stern, M. J. (2018). Scientific discourse of citizen scientists: Models as a boundary object for collaborative problem solving. Computers in Human Behavior, 87, 480–492. https://doi.org/10.1016/j.chb.2018.04.004

Kelemen-Finan, J., Scheuch, M., & Winter, S. (2018). Contributions from citizen science to science education: An examination of a biodiversity citizen science project with schools in Central Europe. International Journal of Science Education, 40(17), 2078-2098. https://doi.org/10.1080/09500693.2018.1520405

Koomen, M. H., Rodriguez, E., Hoffman, A., Petersen, C., & Oberhauser, K. (2018). Authentic science with citizen science and student-driven science fair projects. Science Education, 102(3), 593–644. https://doi.org/10.1002/sce.21335

Malykhina, E. (2013). 8 apps that turn citizens into scientists. Scientific American. www.scientificamerican.com/article/8-apps-that-turn-citizens-into-scientists

McGinnis, J. R., Hestness, E., Mills, K., Ketelhut, D. J., Cabrera, L., & Jeong, H. (2020). Preservice science teachers’ beliefs about computational thinking following a curricular module within an elementary science methods course. Contemporary Issues in Technology and Teacher Education, 20(1). https://citejournal.org/volume-20/issue-1-20/science/preservice-science-teachers-beliefs-about-computational-thinking-following-a-curricular-module-within-an-elementary-science-methods-course

McNeill, J., & Krajcik, J. (2011). Inquiry and scientific explanations: Helping students use evidence and reasoning. In J. Luft, R. Bell, & J. Gess-Newsome (Eds.), Science as inquiry in the secondary setting (pp. 121-134). NSTA Press.

McNeill, K. M., & Martin, D. M. (2011). Claim, evidence, and reasoning: Demystifying data during a unit on simple machines. Science and Children, 48(8), 52-56.

Mitchell, N., Triska, M., Liberatore, A., Ashcroft, L., Weatherill, R., & Longnecker, N. (2017). Benefits and challenges of incorporating citizen science into university education. PLoS One, 12(11), 1-15. https://doi.org/10.1371/journal.pone.0186285

National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. The National Academies Press. https://doi.org/10.17226/13165

NGSS Lead States. (2013). Next generation science standards: For states, by states. The National Academies Press.

Oliveira, A. W., Boz, U., Broadwell, G. A., & Sadler, T. D. (2014). Student leadership in small group science inquiry. Research in Science & Technological Education, 32(3), 281-297. https://doi.org/10.1080/02635143.2014.942621

Phillips, T. B., Ballard, H. L., Lewenstein, B. V., & Bonney, R. (2019). Engagement in science through citizen science: Moving beyond data collection. Science Education, 103(3), 665–690. https://doi.org/10.1002/sce.21501.

Project Noah (2020). https://www.projectnoah.org/

Rienties, B., Giesbers, B., Tempelaar, D., Lygo-Baker, S., Segers, M., & Gijselaers, W. (2012). The role of scaffolding and motivation in CSCL. Computers & Education, 59(3), 893–906. https://doi.org/10.1016/j.compedu.2012.04.010

Rummel, N., Deiglmayr, A., Spada, H., Kahrimanis, G., & Avouris, N. (2011). Analyzing collaborative interactions across domains and settings: An adaptable rating scheme. In S. Puntambekar, G. Erkens, & C. Hmelo-Silver (Eds.), Analyzing interactions in CSCL (Vol. 12; pp. 367-390). Springer.

Saunders, M. E., Roger, E., Geary, W. L., Meredith, F., Welbourne, D. J., Bako, A., Canavan, E., Herro, F., Herron, C., Hung, O., Kuntsler, M., Lin, J., Ludlow, N., Paton, M., Salt, S., Simpson, T., Wang, A., Zimmerman, N., Drews, K. B., … Moles, A. T. (2018). Citizen science in schools: Engaging students in research on urban habitat for pollinators. Austral Ecology, 43, 635-642. https://doi.org/10.1111/aec.12608

Shah, H. R., & Martinez, L. R. (2016). Current approaches in implementing citizen science in the classroom. Journal of Microbiology & Biology Education, 17, 17-22. https://doi.org/10.1128/jmbe.v17i1.1032

Sun, P. C., Tsai, R. J., Finger, G., Chen, Y. Y., & Yeh, D. (2008). What drives a successful e-Learning? An empirical investigation of the critical factors influencing learner satisfaction. Computers & Education, 50(4), 1183-1202. http://dx.doi.org/10.1016/j.compedu.2006.11.007.

Tsivitanidou, O., & Ioannou, A. (2020). Citizen science, K-12 science education and use of technology: A synthesis of empirical research. Journal of Science Communication, 19(4), 1-22. https://doi.org/10.22323/2.19040901

Wiggins, A., & Crowston, K. (2011). From conservation to crowdsourcing: A typology of citizen science. 2011 44th Hawaii International Conference on System Sciences, 1–10. https://doi.org/10.1109/HICSS.2011.207

Zydney, J. M. (2008). Cognitive tools for scaffolding students defining an ill-structured problem. Journal of Educational Computing Research, 38(4), 353-385. https://doi.org/10.2190/EC.38.4.a

Zydney, J. M. (2012). Scaffolding. In N. M. Seel (Ed.), Encyclopedia of the sciences of learning. Springer.


Appendix
Summary of NGSS Codes

Practice 1: Asking Questions and Defining Problems (5 occurrences)
Example: “where did you find this? i cant think of any place on the playground where it would be!”
Summary: Most posts asked where the original poster found their organism.

Practice 2: Developing and Using Models (0 occurrences)

Practice 3: Planning and Carrying Out Investigations (12 occurrences)
Examples: “The pill bug that we found is a little chubby, brown and black, striped, hard exoskeleton and a soft squishy inner body and anntenae.”
“Yes, it was alive. It did move and we had to take the pic really fast. Also we had to be really quiet to not scare it away.”
“Size ( adult length ):10mm to 65mm ( 0.39in to 2.56in) Identifying colors: brown, red, tan, and orange Additional descriptors: legs, princers”
Summary: These posts generally describe their organism using both qualitative and quantitative observations. Some also include a description of the habitat.

Practice 4: Analyzing and Interpreting Data (12 occurrences)
Examples: Same examples as Practice 3.
Summary: In the operationalized definition of Practice 3, making observations to produce data is an example of Practice 3. In Practice 4, recording observations is an example. Since each observation is recorded on Project Noah, each observation in Practice 3 was also counted as Practice 4.

Practice 5: Using Mathematics and Computational Thinking (3 occurrences)
Examples: “10mm to 65mm ( 0.39in to 2.56in)”
“A lifespan of a fly is usually 28 days. A fly has 2 wings.”
Summary: These posts include quantitative observations.

Practice 6: Constructing Explanations and Designing Solutions (13 occurrences)
Examples: “Common Name: American Oil Beetle Scientific Name: Meloe americanus”
“this larvae moves slower than worms so we could tell it was not a worm”
Summary: All but one post includes the common name and scientific name of the organism.

Practice 7: Engaging in Argument from Evidence (7 occurrences)
Examples: “I don't know if this is a long legged sac spider because I looked them up and that's not what they look like. #Spiders ;)”
“This small fly is a non-biting fly. It is called a Wood Gnat. They like to feed on decaying wood and vegetation and fermenting sap. I can see why you might be confused about what type of fly, since they do somewhat resemble mosquitoes. Very cool find! I have never seen one of these before.”
“Nice job but I think it is a fly”
Summary: Most of these posts include a simple agreement or disagreement of the identification of the organism. One post revises an initial identification.

Practice 8: Obtaining, Evaluating, and Communicating Information (19 occurrences)
Examples: “#FUN FACT: Did you know that these bugs are also called Rolly Pollys”
“It makes the sound "kat-i-did". They lay their egg in a single row, they overlap each other, they are often not the same color.”
Summary: In addition to the postings as a whole that counted as communicating information, most of the other posts included information that students likely gained through researching the organism. Some used a more formal tone or more complex vocabulary than other posts.
