Laster, B., Rogers, R., Gallagher, T., Scott, D. B., Vasinda, S., Orellana, P., Rhodes, J., Deeney, T., Waller, R., Hoch, M., Cavendish, L., Milby, T., Butler, M., Johnson, T., Msengi, S., Dozier, C., Huggins, S., & Gurvitz, D. (2024). Literacy clinics during COVID-19: Pivoting and imagining the future. Contemporary Issues in Technology and Teacher Education, 24(1).

Literacy Clinics During COVID-19: Pivoting and Imagining the Future

by Barbara Laster, Towson University; Rebecca Rogers, University of Missouri-St. Louis; Tiffany Gallagher, Brock University, CANADA; D. Beth Scott, Pennsylvania State University-Harrisburg; Sheri Vasinda, Oklahoma State University; Pelusa Orellana, Universidad de los Andes, CHILE; Joan Rhodes, Virginia Commonwealth University; Theresa Deeney, University of Rhode Island; Rachael Waller, Montana State University-Billings; Mary Hoch, National Louis University; Leslie Cavendish, High Point University; Tammy Milby, University of Richmond; Melinda Butler, University of Southern Maine; Tracy Johnson, University of Indianapolis; Shadrack Msengi, Southern Illinois University; Cheryl Dozier, University at Albany; Shelly Huggins, Towson University; & Debra Gurvitz, National Louis University


Literacy clinics have a long history of providing supplemental assessment and instruction to students with literacy needs, but they were tested during the COVID-19 pandemic, as many pivoted from a face-to-face format to three-way remote learning. This study provides a window into how literacy clinics at this moment of transformation in education embraced, and in some cases were challenged by, technology. A survey was administered in spring 2021 to a sample of 58 literacy clinic directors from the United States, Canada, Brazil, Bolivia, The Netherlands, and Australia. Data analysis included quantitative descriptive and inferential statistics reporting on the use of technological platforms and resources, clinic settings, and the format of clinics before, during, and (as anticipated) after the pandemic. Results suggest that clinicians retained some traditional instruction methods while moving some components to digital spaces. Qualitative analysis included (a) coding, (b) creating categories, and (c) developing profiles of respondents based on their prepandemic and postpandemic instructional delivery format. Survey responses conveying the challenges and opportunities of online instruction are discussed in accordance with technology, pedagogy, and content knowledge. This research captured the precipice of institutional change as literacy clinics responded to the pandemic and then recalibrated their intentions for the future.

The current hope is to return to face-to-face learning when we are able to safely. I am not sure what, if anything, from online tutoring will be retained. (Literacy Clinic Director)

The best part of this difficult situation was the freedom to make changes to the format of the clinic without having to convince colleagues. …Everyone had to change and think differently about the way we taught and conducted our overall program, including the clinic. I hope we will continue to embrace change when we move out of the pandemic. (Literacy Clinic Director)

The COVID-19 pandemic was a historical moment in which educational institutions, generally, and university literacy clinics, specifically, were forced to recalibrate to be responsive (Carrillo & Flores, 2020). During the 2020 lockdowns, many teacher educators, teachers, and students were not able to be together. Many K-12 schools and universities closed, as institutions learned how to navigate life and learning during a pandemic.

For most, this was a time of both innovation and pervasive inequities (Ravitch, 2020). Indeed, the National Academy of Education (2020) noted that the shift to online learning exacerbated the digital divide for the most marginalized populations. As more technological equipment and instruction utilizing technology was implemented during the pandemic, some learners were left with inconsistent access to instruction (Lohnes-Watulak & Laster, 2016). This lack of access led to educational decline in some cases, as more privileged students had technological access while those who were marginalized did not (Onyema, 2020). Likewise, institutions of higher education responded to the pandemic in myriad ways, depending on geographical location and other factors (Carrillo & Flores, 2020).

Institutions of higher education have advanced digital learning environments in recent years. Likewise, research has explored the use and impact of technology in literacy clinics (Laster, 2013; Vasinda et al., 2015). Literacy clinics are sites where aspiring literacy specialists and preservice teachers, both called “clinicians,” advance their professional knowledge about literacy education. Literacy clinics also offer supplemental (i.e., out-of-school) assessment and instruction to developing learners (Evensen, 1999). Specifically, projects related to literacy clinics have examined types and purposes of technology use (Laster et al., 2018), such as whether technology is used for drill and practice or for more generative learning (Tysseling & Laster, 2013).

Yet, the massive change in the face of the pandemic accelerated forward thinking about the nature of digital teaching and learning (Selwyn & Jandrić, 2020). Educators responded to this shift in modality in several ways, from rebuilding to redesigning (e.g., Moore et al., 2021). The values of an institution or program undergirded the shift in practices during the pandemic (Brooks et al., 2021). These values are signaled, in part, through the terms used to describe the changes made by literacy clinics during the early months of the pandemic. In this study, the following terms surfaced:

  • Face-to-face instruction (F2F) refers to students and instructors being in the same physical space (Evensen, 1999).
  • Emergency remote teaching (ERT) refers to teaching delivery modes that rapidly change as the result of a crisis such as the COVID-19 pandemic (Hodges et al., 2020).
  • Three-way remote learning in literacy clinics is a modality in which the teacher educator, clinicians, and K-12 students are in three separate locations.
  • Online education and online learning environments refer to instruction that occurs partially or completely over the internet (Means et al., 2009); such instruction is carefully designed over time with established infrastructure (whereas ERT occurs quickly and potentially with limited resources; Singh & Thurman, 2019).
  • Synchronous instruction means students and instructors interact in real time (Romero-Hall & Vicentini, 2017).
  • Asynchronous instruction takes place at a time of convenience for students and instructor (Singh & Thurman, 2019).
  • Hybrid instruction is a combination of face-to-face and online delivery modes (Amrein-Beardsley et al., 2007).

The proliferation of terms and the slippery nature of their usage during this time point to an aspect of what Ravitch (2020) referred to as flux pedagogy, which is an emergent, inquiry-based mindset that is adaptive, generative, and compassionate. These types of pedagogies are built during times of uncertainty, such as the context of the pandemic. The opening quotes at the beginning of this paper from literacy clinic directors reflect the divergent ways that clinics responded to the pandemic. This diversity of responses piqued interest not only for this moment in time but also for what it might suggest for the future of literacy clinics.

There has been scholarship elucidating the impact of the pandemic focused on higher education (Neuwirth et al., 2021) and K-12 education (Lewis & Kuhfeld, 2021); however, less has been written about literacy clinics that have been in existence for over 100 years. Research has described in detail the functioning of literacy clinics (Coffee et al., 2013); the perspectives of students and families in clinics (Deeney et al., 2019); the knowledge and skills that teachers transfer from the literacy clinic to the classroom (Deeney et al., 2011); use of technology (Ortlieb et al., 2014; Rhodes, 2013); and assessment and instruction acumen (Pletcher et al., 2019). Yet, the bulk of this scholarship has occurred within face-to-face (F2F) literacy clinics in university, school, or community settings. There have been some exceptions, such as the online clinic described by Vokatis (2017), in which teachers and students were in the same venue, but the teacher educator was in a separate place. 

While the design and implementation of literacy clinics across the nation (and world) vary, there are commonalities (McKenna & Walpole, 2007). First, advancing teachers’ up-to-date knowledge and skills in reading and writing assessment and instruction is paramount (Laster, 2013; Laster et al., 2022). Second, teaching is geared to accelerating the progress of students who experience challenges with reading and writing yet does not proceed from a deficit perspective (Dozier et al., 2006). Third, scholarship points to the parallel layers of learning that occur within literacy clinic settings: teachers, students, families, and teacher educators (Deeney et al., 2019; Laster, 1999). Fourth, literacy clinics aim to establish a responsive learning environment through strong teacher-student relationships (Dozier & Deeney, 2013). Fifth, the use of current pedagogy, including technology, advances literacy practitioners as leaders in the field (Bowers et al., 2017; Vasinda et al., 2020).

With increased use of technology as a modality for instruction and as an instructional tool, some literacy clinic directors have incorporated the Technological Pedagogical Content Knowledge (TPACK) model into their course design (Mishra & Koehler, 2006; Vasinda et al., 2015; Vasinda et al., 2020). Building on and updating Shulman’s (1986) Pedagogical Content Knowledge model, which focused on the intersection of content knowledge and pedagogical knowledge, the TPACK model outlines the specific types of knowledge educators need and which technology tools and pedagogies are best when planning for and providing instruction in the contexts of unique educational settings (Mishra & Koehler, 2006).

When integrated in clinic settings, the TPACK model advances the intersections of technological content knowledge (TCK) and technological pedagogical knowledge (TPK). This builds on the pedagogical content knowledge developed in reading and writing methods courses that clinicians take before enrolling in clinic. TPACK is useful for providing course instructors and clinicians with an interconnected approach for integrating technology in tutoring sessions and has been noted as having positive results for teacher training of digital skills (Alférez-Pastor et al., 2023). 

Literacy clinics were tested during the COVID-19 pandemic, as many pivoted from an F2F format to three-way remote learning (Laster et al., 2021; Msengi & Laster, 2022). Here, the term remote refers to instructors and students being physically separated during instruction (Lindner et al., 2020). In the literacy clinic circumstance, K-12 students were in their homes, clinicians were in their homes, and the clinic director was in their home/office.

This study provides a window into the life of literacy clinics at this historical moment of transformation in education. Most importantly, this research captures the precipice of institutional change as literacy clinics responded to the pandemic and then recalibrated their intentions for the future. Accordingly, the researchers asked an overarching research question: How did university literacy clinics respond to the pandemic? There were four subquestions:

  1. In what ways did the pandemic affect clinics’ adoption and use of technological platforms and resources?
  2. What shifts were made in clinic settings in response to the pandemic?
  3. In what ways was the format of clinics altered?
  4. What do clinic directors imagine for their clinics moving forward?


Method

This section includes a description of the research design, including the data collected, the participants, and three phases of data analyses.

Research Design

The researcher-designed survey instrument (Fowler, 1993) comprised 17 questions. It was administered in spring 2021 using the Google Forms platform. The first author created the first draft of the open-ended and multiple-choice questions and presented them for review to the other researchers, who, in turn, revised both the design and the content of the survey over multiple iterations. See the appendix for the final survey instrument.

The survey gathered information about the roles of respondents (e.g., teacher educators and directors) in their clinics. It allowed respondents to indicate which students they worked with (e.g., undergraduate or graduate), the format in which the clinics operated before, during, and after the pandemic (e.g., face-to-face or synchronous), and the spaces where clinicians taught (e.g., home, school, or university campus).

Respondents were asked to name the technologies they used to support tutoring instruction before, during, and after the pandemic. The remaining open-ended questions addressed how the pandemic affected key issues, such as recruitment of students (i.e., “How was recruitment of clients/students different?”), assessment (i.e., “How was assessment of reading affected?”), and instruction (i.e., “How was reading instruction affected?”). Questions also addressed communication with families and teachers, affordances and constraints of supervising clinicians (i.e., “What were changes in your communication with clinicians and/or changes in course expectations? How did you build community among clinicians?”), and changes made as a consequence of the pandemic that may be maintained (i.e., “Of the changes that you made during the pandemic, what will you stop and what will you keep going forward?”).

There was an opportunity for respondents to add information regarding the impact of the pandemic on their clinics. The link to the survey was available for 10 weeks, after which response data were downloaded on a spreadsheet for coding and analyses.


Participants

The researchers of this team, composed of literacy clinic educators associated with the Literacy Research Association clinic study group, systematically recruited participants in two stages. First, the researchers sent out personalized invitations (n = 73) to their respective professional networks explaining the purpose of the study. These personal emails were sent to clinic directors in 40 of the 50 US states plus five other countries (Canada, Brazil, Bolivia, The Netherlands, and Australia).

There were 43 confidential respondents (a response rate of 59%) to the online survey. The survey platform did not collect email addresses or respondents’ locales, and results were reported in the aggregate without identifying information, as per institutional review board (IRB) directions. The first author’s university IRB determined this study to be exempt, which meant there was minimal risk to participants and the research did not require continual oversight.

Leveraging multistage sampling (Fowler, 1993), the researchers activated a second stage of recruitment 2 months after the personalized emails. The researchers extended the call for participants from literacy research organizations (i.e., Literacy Research Association, International Literacy Association, and Association of Literacy Educators and Researchers). This second stage yielded an additional 15 completed surveys; thus, the total sample (n = 58 respondents) represented two distinct recruitment efforts.

The survey respondents self-identified as literacy clinic directors. The researchers use the labels teachers/tutors or clinicians to describe those who were enrolled in a practicum that manifested in literacy clinics and who were supervised or coached by the clinic director, supervisor, or teacher educator. Some respondents taught or supervised more than one demographic, such as sections of undergraduates plus sections of graduate students. Respondents specified that their clinicians were 62.7% graduate students/practicing teachers and 37.3% preservice teachers (22% preservice graduate students and 40.7% undergraduates).

Data Analysis

Data analysis included both quantitative (Preacher, 2001) and qualitative procedures (Charmaz, 2007). To answer the first three subquestions, quantitative analyses began with frequency counts for the responses to the three closed-ended survey questions (i.e., Q3, Q4, Q6), which were compiled into a matrix. Descriptive statistics (i.e., percentages of proportionate responses) and inferential statistics (i.e., chi-square goodness-of-fit test) were calculated. Given the categorical data, the chi-square goodness-of-fit test (Cochran, 1952) was used when appropriate for evaluating the interdependence of the variables. Data gathered for Q6 included numerous technological resources that were reviewed and meaningfully clustered by the researchers (e.g., all Google Apps were clustered into one category).
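To make the inferential step concrete, the chi-square statistic for a matrix of frequency counts can be sketched in a few lines of Python. This is a generic illustration, not the authors’ actual analysis script; it is shown here on the face-to-face and online rows (before/during/after counts) of Table 3.

```python
# Generic sketch (not the authors' script) of the chi-square statistic for an
# r x c matrix of frequency counts, used to test whether responses depend on
# pandemic timing (before, during, after).

def chi_square(table):
    """Return (statistic, degrees of freedom) for a contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    statistic = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of row and column variables.
            expected = row_totals[i] * col_totals[j] / grand_total
            statistic += (observed - expected) ** 2 / expected
    degrees_of_freedom = (len(table) - 1) * (len(table[0]) - 1)
    return statistic, degrees_of_freedom

# Face-to-face and Online rows (before/during/after) from Table 3.
stat, df = chi_square([[52, 2, 18],
                       [8, 43, 15]])
```

A p-value would then be read from the chi-square distribution with `df` degrees of freedom (e.g., via `scipy.stats.chi2.sf(stat, df)`).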

To answer the fourth subquestion (What do clinic directors imagine for their clinics moving forward?), the researchers focused on Q16 (“Of the changes that you made during the pandemic, what will you stop and what will you keep going forward?”) and Q17 (“Explain any details of this pandemic’s impact on your clinic/lab?”) in the survey. These survey items were open ended and elicited generative, confidential responses from respondents.

Each respondent was given a unique identifying number, which allowed the research team to track their responses across clusters of items. Following qualitative protocols (Creswell, 2018), the analysis of the responses included three phases: (a) coding, (b) creating categories and definitions, and (c) developing profiles of respondents. Qualitative analysis included interrater checks at multiple junctures to establish the trustworthiness of our codes, categories, and definitions.

Phase I: Coding

Initial coding (Saldaña, 2021) was done independently by researchers who read and reread the answers to Q16 and Q17. Responses were read and coded line by line, and exact excerpts from responses were included for illustration. Next, in vivo codes were used to assign a label to each section of the excerpt, using a word or short phrase taken from that section. As the researchers identified clusters of codes, which led to potential categories that would answer Q16 and Q17, they met to discuss the analytic memos and open coding results.

The next step was for two additional researchers to employ the same process, looking for missing codes, adding potential categories, and inserting additional insights and analytic notes. These insights were captured in different colors in the document so all researchers could see the accumulation of analysis across the team. This iterative process helped to establish a common analytic framework and consistent interpretation.

Phase II: Categories and Definitions

The second phase consisted of examining potential categories in a new document to identify clusters of overtly similar categories. Thus, researchers noted (often using color-coding) those categories where patterns of similarities were observed. For example, there were seven categories for Q17 (i.e., time, space, presence, technology connectivity, instruction, health, and flexibility). To get a sense of the prevalence of each category, researchers counted the number of times it appeared across respondents. Definitions were generated for each of the categories, coupled with illustrative examples that represented the range of evidence within each category. Researchers met again in small groups for each of Q16 and Q17 to discuss their analytic memos, open coding results, and categories.
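The prevalence count in this phase amounts to tallying, across respondents, how often each category was assigned. A minimal sketch follows; the respondent IDs and code assignments are hypothetical, invented purely for illustration.

```python
from collections import Counter

# Hypothetical coding results for Q17: respondent -> categories assigned.
coded_responses = {
    "R03": ["time", "technology connectivity", "instruction"],
    "R12": ["time", "instruction"],
    "R27": ["space", "flexibility", "instruction"],
}

# Prevalence: number of respondents for whom each category appeared.
prevalence = Counter(
    category
    for categories in coded_responses.values()
    for category in set(categories)  # count each category once per respondent
)
```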

Phase III: Developing Profiles of Respondents 

In response to the fourth subquestion, the researchers looked across the survey respondents and developed profile labels based on the clinics’ technology platforms and contexts. First, clusters of responses were identified across survey Q3 (“Compare the format of your clinic before, during, and after the pandemic”) that distinguished clinics into groups based on their prepandemic instructional delivery and postpandemic (intended) instructional delivery. This clustering yielded the following four clinic profile groups:

  • Profile 1 was face-to-face (F2F) and plans to go back to F2F.
  • Profile 2 was F2F and plans to remain online.
  • Profile 3 was F2F and plans to develop a hybrid option going forward.
  • Profile 4 was online or hybrid before, went to remote during the pandemic, and will be either online or hybrid going forward.

Clear composites for each profile were paired with direct quotations that illustrate each profile in the Findings section.


Findings

This section reports the results of the closed-ended survey items to provide a description of the clinics’ contexts and responses to the first three research subquestions. Specifically, there are reports on the use of technological platforms and resources, clinic settings, and the format of clinics before, during, and (as anticipated) after the pandemic. The profiles of the respondents in response to the fourth subquestion are also presented.

Technological Platforms and Resources   

The researchers gathered data from the survey Q6, “Compare the technology platforms and resources used in your clinic/lab before, during, and after the pandemic” (see Table 1 for the frequencies of responses).

Table 1
Technology Platforms and Resources Used in Clinics/Lab by Timing (Before, During, and After Pandemic)   

Technology Platforms and Resources | Before Pandemic | During Pandemic | After Pandemic | Totals
Learning Management Systems (Blackboard, Canvas) | 39 (4.5%) | 40 (4.6%) | 26 (3.1%) | 105 (12.2%)
Zoom | 11 (1.3%) | 52 (5.9%) | 33 (3.8%) | 96 (11%)
Padlet | 9 (1.0%) | 18 (2.0%) | 12 (1.3%) | 39 (4.3%)
Noninteractive Instructional Tools (PowerPoint, Prezi) | 56 (6.4%) | 66 (7.6%) | 44 (5.0%) | 166 (19.0%)
Google Classroom (including Jamboard, Meet, Slides) | 34 (3.9%) | 87 (9.9%) | 58 (6.6%) | 179 (20.4%)
Digital Texts (Epic books, Newsela) | 45 (5.1%) | 99 (11.2%) | 69 (7.9%) | 213 (24.2%)
Other teaching sites | 21 (2.3%) | 35 (3.9%) | 24 (2.7%) | 80 (8.9%)
Totals | 215 (24.5%) | 397 (45.1%) | 266 (30.4%) | 878 (100%)

A chi-square goodness-of-fit test (Preacher, 2001) determined that there was a statistically significant relationship between the technology platforms/resources used and the timing of the pandemic (before, during, and after), χ2(18) = 35.20, p = .009. This relationship is influenced by pronounced use of video platforms (e.g., Zoom, Google Classroom) and digital texts during the pandemic. Despite adopting these technologies, respondents reported setbacks during tutoring sessions; sharing screens and texts to read, for example, tended to slow the pace of instruction.

The learning curve to master new modalities consumed time, as did learning to harness the pedagogical affordances of literacy education in online environments. Overall, clinicians demonstrated resourcefulness in finding and deploying new digital resources. Respondents reported moderate intentions to retain these technologies after the pandemic (a subtotal of 30.4% of responses).

Clinic Settings

With respect to the literacy clinics’ tutoring settings relative to the timing of the pandemic (before, during, and after), Table 2 provides frequency counts. The data come from the survey Q4, “In what space did your clinicians teach?”

Table 2
Setting of Clinics/Lab by Timing (Before, During, and After Pandemic)  

Setting | Before Pandemic | During Pandemic | After Pandemic | Totals
In a university | 38 (15.4%) | 4 (1.6%) | 20 (8%) | 62 (25%)
In a school building where we/they usually have Clinic/Lab | 23 (9.3%) | 5 (2%) | 12 (4.8%) | 40 (16.1%)
In their school building (which is different than usual) | 11 (4.4%) | 11 (4.4%) | 13 (5.2%) | 35 (14%)
In their own home | 1 (0.4%) | 43 (17.3%) | 13 (5.2%) | 57 (22.9%)
In the student's home | 1 (0.4%) | 11 (4.4%) | 4 (1.6%) | 16 (6.4%)
In a public space (e.g., community center, library) | 11 (4.4%) | 8 (3.2%) | 8 (3.2%) | 27 (10.8%)
Other | 1 (0.4%) | 8 (3.2%) | 2 (0.8%) | 11 (4.4%)
Totals | 86 (34.7%) | 90 (36.3%) | 72 (29%) | 248 (100%)

Response categories were collapsed into traditional clinic/lab settings (university or typical school building), alternative settings (different school, own home, or student’s home), and nonspecific settings (public space or other). A chi-square goodness-of-fit test (Preacher, 2001) determined that there was a statistically significant relationship between the setting of clinics/labs and the timing of the pandemic (before, during, and after), χ2(4) = 73.29, p < .001.
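The collapsing step can be sketched as a simple mapping from raw survey settings to the three groups named in the text; the counts below are the before-pandemic column of Table 2, while the short setting keys and the grouping code itself are illustrative assumptions.

```python
# Map each raw survey setting onto one of the three collapsed groups
# (group labels follow the text; the short keys are illustrative).
GROUPS = {
    "university": "traditional",
    "usual school building": "traditional",
    "different school building": "alternative",
    "own home": "alternative",
    "student's home": "alternative",
    "public space": "nonspecific",
    "other": "nonspecific",
}

# Before-pandemic frequency counts from Table 2.
before_counts = {
    "university": 38,
    "usual school building": 23,
    "different school building": 11,
    "own home": 1,
    "student's home": 1,
    "public space": 11,
    "other": 1,
}

# Re-aggregate the raw counts into the collapsed groups.
collapsed = {}
for setting, count in before_counts.items():
    group = GROUPS[setting]
    collapsed[group] = collapsed.get(group, 0) + count
```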

This relationship was influenced by the dominant traditional clinic/lab settings in universities and schools (before pandemic) and the alternative settings during the pandemic in private homes or different schools. As a function of the shift in space where the clinicians tutored, families had altered access arrangements. Some families for whom geography or transportation were typical obstacles were now able to access the services of literacy clinics for their children. However, access was narrowed for other families with young children, as clinicians noted that these learners had challenges during online assessment and instruction. Access was restricted for other families because of a lack of technology or connectivity, thus contributing to the digital divide. Respondents also noted that scheduling time to meet with parents was often tricky after an online tutoring session; however, virtual communication opened access for some families.

Format of Clinics

Data regarding shifts in the format of clinics’ lesson delivery mode are presented in Table 3, which includes the frequency counts for responses to the survey Q3, “Compare the format of your clinic/lab before, during, and (if you know) after the pandemic.”

Table 3
Format of Clinics/Lab by Timing (Before, During, and After Pandemic)

Format | Before Pandemic | During Pandemic | After Pandemic | Totals
Face-to-face | 52 (16.7%) | 2 (0.6%) | 18 (5.7%) | 72 (23%)
Online | 8 (2.5%) | 43 (13.8%) | 15 (4.8%) | 66 (21.1%)
Hybrid of face-to-face and online | 7 (2.2%) | 9 (2.9%) | 24 (7.7%) | 40 (12.8%)
Fully synchronous (everyone meets at the same time) | 29 (9.3%) | 23 (7.4%) | 16 (5.1%) | 68 (21.8%)
Fully asynchronous (students meet at their convenience for coursework and for working with young students) | 6 (1.9%) | 13 (4.3%) | 11 (3.5%) | 30 (9.7%)
Hybrid of synchronous and asynchronous | 6 (1.9%) | 18 (5.9%) | 12 (3.8%) | 36 (11.6%)
Totals | 108 (34.5%) | 108 (34.9%) | 96 (30.6%) | 312 (100%)

A chi-square goodness-of-fit test determined that there was a statistically significant relationship between the format of the clinic/lab and the timing of the pandemic (before, during, and after), χ2(10) = 107.46, p < .001. This relationship was influenced by the number of face-to-face clinics before the pandemic and online/remote clinics during the pandemic. In essence, teacher educators, clinicians, and their students were flexibly responsive as they made technological changes to their literacy clinics. Respondents reported that, after the pandemic, they anticipated a hybrid of face-to-face and online/remote formats.

Profiles of Clinics

Based on the data analyses, four profiles of literacy clinics’ trajectories were identified, as listed in the Phase III: Developing Profiles of Respondents section. The four profiles are described next.

Profile 1: Plans to Continue F2F Postpandemic

In the first profile (F2F back to F2F), there were a few commonalities across respondents. First, they were not likely to see any aspects of online literacy clinics as salvageable and transferable to F2F clinics. This contributed to their overall negative evaluation of online literacy clinics. Second, respondents in this profile tended to note that students in their clinics made fewer gains during the pandemic. Third, it was typical of this profile to have used few digital tools prior to the pandemic (perhaps only a learning management system such as Blackboard or Canvas).

This cluster of respondents faced multiple challenges as they pivoted to remote or online clinics (e.g., recruitment, poor participation, technology, supervision, and lack of community for students and teachers). As illustration, Respondent 29 reported that “the side-by-side reading approach was impacted. The general framework of literacy instruction did not change. However, clinicians struggled with attention, and adjusted as needed based on students’ behaviors and affect.” In this clinic, tutoring was synchronous, but the supervision and feedback was asynchronous. Respondent 37 noted, “Supervision was greatly altered. Not being able to monitor students in person was a major loss” and added, “Teacher candidates were more ‘on their own’ and missed immediate feedback as we have at the in-person clinic.”

The tenor of the responses in this first profile wavered between an almost completely deficit-focused view of online as a modality and an experimental tone (e.g., Respondent 11, “We tried virtual family visits”). At one clinic (Respondent 33), F2F weekly parent workshops were missed during the pandemic, not only because parents had benefitted from them, but because teachers missed the opportunity to provide such educational opportunities.

The coda of Profile 1 was represented in Q16 responses, which suggested that moving to online literacy clinics was a “patch” and that these respondents viewed their F2F clinics as more effective. Said Respondent 29, “Stop: Online tutoring! The online tutoring was not nearly as effective for both the clinicians and the children we serve. In particular, the children did not get to experience being a part of the clinic community.” Here, it is important to note that “the clinic community” refers to an in-person literacy clinic. Respondent 35 wrote, “The current hope is to return to face-to-face learning when we are able to safely. I am not sure what, if anything, from online tutoring will be retained.” The suggestion is that there were no tools worth transferring, no processes that were beneficial, and no areas of online literacy clinic life that might benefit literacy teaching and learning.

Respondent 7 wrote,

I hope to get back to pre-pandemic clinic ASAP. I wonder if teacher candidates will ‘demand’ online options from administrators now as they have seen it is possible (even if it is not effective). …We are already planning to be back in full strength this coming fall and return to clinic as normal.

This respondent suggested that there were tools such as inviting a guest speaker and using more online resources that they will continue to use but they can “hardly wait for face-to-face interactions with children again.”

Profile 2: Plans to Continue With Online Literacy Clinic

The cluster of participants in the second profile began as F2F, pivoted to remote or online delivery, and envisioned a positive horizon for online clinics moving forward. The majority of the respondents in this profile had a solid technological foundation before the pandemic. Some of the respondents noted they layered in additional platforms, such as Zoom, to support synchronous online teaching and additional digital resources (e.g., Padlet, Jamboard, EPIC, or Newsela).

This cluster of respondents noted a transformational moment when they realized that an aspect of their clinic had been improved because of the pivot to online. Respondent 18 said that they tried new virtual seminars for families because of the online platform. “We held family literacy nights where all of the parents/students came together to hear a children’s author share her bilingual book.” Respondent 26 also reported benefits for family engagement, “It increased the communication with families, as most of the families had to assist the children with technology or take pictures of students’ work and send them to teachers.” 

This group of respondents recognized issues of access and inclusion for families who otherwise could not attend on-campus clinics. Respondent 9 wrote, “We will keep virtual tutoring as part of our course requirements. It serves our students well, and we are able to serve more rural communities we weren’t able to before. It also provides our candidates with virtual teaching experience.” Respondent 18 noted that their clinic’s recruitment of families became more intentional to include “schools and districts that have been hardest impacted by the ongoing pandemic/s and are historically underfunded schools that serve primarily Black and Latinx students.” 

Furthermore, the respondents in this cluster noted that the K-12 students made “typical student gains” or that they were still assessing student learning. A few noted that they were exploring the different kinds of strengths and gains that teachers might expect to see with digital literacies and in online settings.

These respondents identified important theories and practices from their F2F clinics and worked to transfer them to the online setting. For example, Respondent 39 stated, “We worked to move all of our lessons online in Google Slides…” Respondent 18 said, “I gave the educators permission to not just ‘transfer’ effective literacy pedagogy to an online platform but to really explore how digital tools can create new literacy experiences; allowing students to showcase their strengths with literacies in new ways.”

This cluster of respondents used a great deal of hopeful, future-oriented language as they described the future of their online or hybrid literacy clinics.  As Respondent 18 said, “Teaching and learning literacy online will continue into the future, and we need to get better at designing, assessing, and planning for literacy acceleration in ways that are culturally responsive and sustaining in online environments.”

Profile 3: Plans to Develop a Hybrid Option Moving Forward

A third profile emerged among respondents who originally supervised in-person literacy clinics, transitioned to hybrid, online, or remote delivery during the pandemic, and were considering the implications of using a hybrid model in future clinics. Several different perspectives were evident within this profile group.

The first perspective was that, although moving to online or hybrid literacy clinics brought many frustrations, respondents were forward-thinking with respect to new prospects for clinicians, students, supervisors, and parents. This was specifically evident in two areas: supervision of the clinicians and instruction using digital tools. With respect to supervision, a few respondents said that they were unable to provide feedback to clinicians in a timely manner, had issues with technology that impacted their supervision, or found video supervision time consuming (Respondents 23, 25, and 28).

The change to online supervision did, however, prompt innovations, as evidenced by one clinic director who found the private chat feature in Zoom an effective tool for communicating with clinicians (Respondent 19), another who paired clinicians for peer coaching (Respondent 28), and another who noted the impact of video coaching on improving clinician pedagogy.

Instruction using digital tools was a second area of frustration and growth for the respondents in this profile. Respondents commented that they had to learn how to teach differently and work harder using digital tools, particularly as they were no longer using computers to extend student learning but to provide direct instruction (Respondents 19, 25, and 28). But for others, these challenges were recognized as learning opportunities, such as for Respondent 13, who intended to adopt a hybrid option moving forward, as their clinicians embraced digital tools and then, in turn, the supervisors “learned so much [from the clinicians].”

Other respondents within this profile acknowledged with resignation that the presence of online and hybrid literacy clinics may be a reality moving forward. Respondent 30 said, “If I am required to continue to lead clinic online, I will need to continue to develop online instructional strategies.” This statement, along with one about the ineffectiveness of online tutoring, suggests that this respondent, while favoring F2F learning, appeared to recognize that the pivot to online learning in literacy clinics during the pandemic revealed potential options and opportunities not previously considered or employed. Other respondents recognized and commented on advantages such as convenience, flexibility, and options to better accommodate students in online venues (Respondents 25, 47, and 55).

The final perspective on moving from F2F to online or hybrid literacy clinics involved clinicians who were working with students living in poverty. One director (Respondent 38) related how the clinicians used US mail and telephone calls to connect with students who had little or unreliable access to technology and the internet. In this clinic, the clinicians “found a way to maintain connections. …Families were extremely grateful for university support and partnerships, isolated kids had an adult mentor during a challenging time.” Other respondents noted that they physically delivered supplies, materials, and books to families’ homes.

Profile 4: Continue Offering Online or Hybrid Clinics as Before the Pandemic

The fourth profile included respondents whose literacy clinics were online or hybrid before the pandemic, went to three-way remote during the pandemic, and will be either online or hybrid going forward. Some of these respondents commented that relationships with parents were stronger; others thought interaction with parents diminished. Yet Respondent 17, who used all modalities (F2F, hybrid, and online) before the pandemic and will offer all modalities after (including fully synchronous), noted that they will use more virtual communications, which will allow more access than before the pandemic.

Several affordances were spotlighted in the context of challenges. Respondents commented on issues of equity, resources, the impact on teachers, and students’ resilience. For example, Respondent 34 said, “The equity issue of who has/doesn’t have devices or internet excluded some children who typically participate. On the other hand, we were able to serve students [who were located] hours and days away from our site. That was a huge bonus.” Respondent 31 said, “On a positive note, teachers did a wonderful job of being flexible and resourceful in identifying tools for engagement and instruction.”

There were several comments about the advantages teachers accrued by having combined technology and literacy in previous iterations of literacy clinics. It was remarkable that those who typically ran online literacy clinics had significant advantages during the pandemic. Respondent 32 said that, having run an online literacy clinic for years,

our pivot was not drastic, though some clinicians had to work with students remotely, not in the school building as is usual. Same communication as usual (SeeSaw, Classroom Dojo, email, phone).  Students made typical gains and the literacy clinic remained steady. Many of our clinicians came to us and said they were so grateful for all the online experiences and tools used throughout the program because they were much more comfortable with the technology than many they were working with in their schools.

In sum, literacy clinics that conformed to Profiles 1 and 4 planned to make few changes from their prepandemic operations; that is, they will go back to either fully F2F or fully online clinics. On the other hand, those aligned with Profiles 2 and 3 had a vision forward that likely will dramatically change the future functioning of their literacy clinics.


Discussion

We posed the overarching research question, How did university literacy clinics respond to the pandemic? The findings from this research captured a transformational moment of institutional change as literacy clinics responded to the pandemic and then looked ahead to recalibrate their operations and outlook for the future. This study highlights that, for some clinic directors, the goal of the pandemic pivot was to discover how best to use technology to transform instruction and innovate, while for others the goal was to replicate existing literacy clinic practices in a virtual setting. However negatively some respondents viewed the pivot, the only option a decade earlier would have been to shut down literacy clinics when the pandemic hit. Newly established technologies allowed literacy clinics to keep functioning during the pandemic while K-12 schools and universities were closed. In this section, the researchers integrate the research findings with the literature on the intersection of technology and teaching, underscore the central themes of flux pedagogy (Ravitch, 2020), and capture forward visions for literacy clinics, including the potential for culturally sustaining pedagogies in a technological landscape.

Clinic directors and clinicians who had experience with digital spaces and resources — such as those in Profile 4, who had online clinics or substantial technology integration before the pandemic — pivoted easily during the pandemic. In contrast, those who had less experience using technology in powerful ways — such as those in Profile 1 — felt overwhelmed and were anxious to return to F2F operations. This underscores a point confirmed in the literature: the relationship between teachers’ experience, agency, and integration of technologies into teaching (e.g., Ertmer & Ottenbreit-Leftwich, 2010; Koçak-Usluel et al., 2015). The spectrum of survey responses conveying the challenges and opportunities of online instruction parallels educators’ range of TPACK (Mishra & Koehler, 2006).

The pandemic created an opportunity for both the clinic directors and the clinicians to integrate familiar instructional methods into sometimes unfamiliar technology. Teaching in literacy clinic practica has long provided a strong, supportive space (Evensen, 1999; Pletcher et al., 2019) to develop PCK (Shulman, 1986). With faculty guidance, teachers deepen their knowledge of literacy processes (CK) and their understanding of best instructional practices for a variety of learners (PK), putting the two together as they practice their PCK.

During the pandemic pivot, clinicians and directors had no choice but to adapt, integrate, and leverage their pedagogical strategies with technology tools and formats. Furthermore, the educators’ adeptness and familiarity with using technology influenced their efficacy in tackling problems that arose. In the context of online teaching, those clinicians with well-developed and practiced TPACK were able to more easily navigate the pandemic pivot with positive attitudes and outcomes.

For example, to successfully use digital texts, clinicians were required to choose appropriate texts (TCK), model effective comprehension strategies using those digital texts (TPK), screen share and use available digital teaching tools (e.g., text highlighting) (TK), and then scaffold the students’ interactions with the digital texts (TPACK). Additionally, the TPACK model acknowledges the influence and importance of the contexts of teaching (including cultural contexts) and the needs of the individual student or small group (MacKinnon, 2017; Mishra & Koehler, 2006). The data from this research illuminate both specific community needs and commonalities across multiple sites related to pandemic pivots and the long-term use of technology for literacy teaching.

The open-ended questions of the survey revealed several other important topics. First, the link between engagement and relationship building continues to be a central research interest in literacy clinics (Deeney et al., 2019), regardless of modality. The Profiles showcased in the findings illustrate a great variation in observations about relationship-building and online presence. 

At some sites, there was a palpable loss in the establishment of interpersonal relationships. Some respondents described hindrances to making authentic connections with students when the clinicians and students accessed the learning via digital devices. Ravitch (2020) illuminated these findings with her notion of flux pedagogy, which is inquiry based and involves an educational mindset that is adaptive, generative, and compassionate (p. 3). At other sites, respondents lauded the interpersonal relationships built between clinicians and students/families in the digital space. Educators also spoke about opportunities that arose during the pandemic, such as working collaboratively with colleagues to innovate new approaches to assessment and teaching that are engaging, creative, and effective.

Research suggests that student engagement, especially in online learning environments, is associated with positive student outcomes (Leslie, 2021; Wankel & Blessinger, 2012). Thus, the range of findings from this research gives impetus to follow up this study with in-depth interviews at multiple sites to better understand why some found presence and relationship building to be a frustration while others found it to be an inclusive option. These interviews could also further examine the impact of online engagement on student outcomes.

Some clinic directors spoke positively about future visions. They reflected that online or remote clinics did, and will continue to, open avenues for supporting underserved families and communities. Expanding the literacy clinics to new populations may make tutoring more diverse, interesting, and supportive in useful ways. Perhaps literacy clinics can help bridge the achievement gap between economically advantaged and economically disadvantaged students. Earlier research showed that this gap is significantly greater in online reading comprehension (i.e., locating information, reading critically, synthesizing, and communicating) than in offline reading and writing activities (Leu et al., 2015).

Furthermore, political or cultural contexts may help explain some of the variation among the respondents, as the survey responses represent many regions of North America and a few other countries. The data from this study contained a range of opinions about whether there was increased interaction between clinics and families, a range that may be explained by cultural, political, and geographic differences.

Literacy clinics, as well as educational technology, need to address the local needs of communities and families (Selwyn & Jandrić, 2020). Related to relationship building and online presence is culturally sustaining instruction (Paris & Alim, 2017). Survey respondents did not explicitly say whether increased access to online texts gave students greater exposure to multicultural texts, but they did report being able to gather substantial digital resources. Access to a greater range of children’s literature, including digital texts, may in the future be a thrust toward more culturally sustaining literacy pedagogies (Ladson-Billings, 2014).

Survey respondents described clinicians as resilient, creative, and resourceful in finding and using new digital tools. This research is a conduit for imagining new pathways for literacy clinics for their explicit purposes of teacher education and interaction with students and families. This is important to the fields of literacy, teacher development, and educational equity. These findings build on the work of Secoy and Sigler (2019), who found that teachers can retain the integrity of their traditional literacy instruction while moving some instructional components to digital spaces.


Limitations

There are some methodological limitations to this study. Sampling error is a common concern when multistage sampling (Fowler, 1993) is employed. Although the researchers had the advantage of a sizable, geographically diverse research team, literacy clinic personnel who were not known to the researchers or who were not members of any of the leading literacy research organizations may have been overlooked during recruitment. Second, many potential respondents from literacy clinics were extremely busy; some may have received the invitation to participate in the survey but declined because of time constraints. This circumstance may have limited the response sample.

Additionally, the survey was not pilot tested among other literacy researchers with experience in literacy clinics; a pilot test might have yielded a stronger instrument. Validating the findings with other academics in this domain would also have enhanced the credibility of the findings.

An overarching limitation in interpreting the findings is that the research captured a snapshot in time during spring 2021. This narrowed the respondents’ ability to make evidence-based projections about their intentions, given that the pandemic was a unique point in history. Moreover, the dynamic nature of the pandemic and its impacts on literacy clinics continue to evolve. Consequently, these findings provide a window into decision-making at an important moment in time but may not translate to other time periods.


Conclusion

The COVID-19 pandemic imposed shifts in educational settings that may drive the researchers and others to consider changes to traditional literacy assessment and instruction. This shift was aptly described by Respondent 18:

There is an enormous amount of potential to create joyful, liberating accelerative literacy instruction in online settings. There is a great deal of potential in connecting students with educators working on their literacy specialist certification and also with elders in their community and with peers from around the globe. For too many students who experience reading and writing difficulties, literacy instruction is reductionist. These children and youth need new opportunities to see themselves as part of a “literacy community,” and their proficiency with digital tools can provide them with this bridge. Likewise, children and youth who do not see themselves in schooled literacies nor are engaged in schooled literacies may be able to leverage digital tools that are part of online literacy teaching/learning in new ways. Children who struggle with traditional schooled literacies may have extraordinary strengths with digital literacies. In this way, we have an opportunity as a literacy clinic community to not just transfer print literacies to online environments (although, of course, part of that is necessary) but to rethink what counts as educational literacies and for whom. There are opportunities of connecting teachers and students from around the country and world — helping students to see both the word and the world that is possible with a liberatory literacy education.

Transformations will likely occur with all the stakeholders involved in literacy clinics: directors/instructors, clinicians, and students/families. The changes involve place, time, types of texts, and new ways of using technology to improve literacy assessment and instruction.  For all K-12 educators and students, the COVID-19 pandemic has demanded transformative practices to ameliorate exhaustion, isolation, technology challenges, and in some cases a lack of motivation (Niemi & Kousa, 2020).

Like many other institutions, education is likely to be permanently impacted by the global pandemic; educators need to continue to be responsive to the emerging learning requirements of students (Reimers & Schleicher, 2020), as this research brought into clear focus. The adaptations made both to the ways educators teach and the ways students learn will be a springboard to widespread, long-term changes in clinics and in schools. We hope that the literacy clinic directors’ experiences described in this study will provide insights for postpandemic educational practice.


References

Alférez-Pastor, M., Collado-Soler, R., Lérida-Ayala, V., Manzano-León, A., Aguilar-Parra, J. M., & Trigueros, R. (2023). Training digital competencies in future primary school teachers: A systematic review. Education Sciences, 13(5), 461.

Amrein-Beardsley, A., Foulger, T. S., & Toth, M. (2007). Examining the development of a hybrid degree program: Using student and instructor data to inform decision-making. Journal of Research on Technology in Education, 39(4), 331-357.

Bowers, E., Laster, B., Ryan, T., Gurvitz, D., Cobb, J., & Vazzano, J. (2017). Video for teacher reflection: Reading clinics in action. In R. Brandenburg, K. Glasswell, M. Jones, & J. Ryan (Eds.), Reflective theories in teacher education practice – process, impact and enactment (pp. 141-160). Springer. doi: 10.1007/978-981-10-3431-2_8

Brooks, C., McIntyre, J., & Mutton, T. (2021). Teacher education policy making during the pandemic: Shifting values underpinning change in England? Teachers and Teaching.

Carrillo, C., & Assuncao Flores, M. (2020). COVID-19 and teacher education: A literature review of online teaching and learning practices. European Journal of Teacher Education, 43(4), 466-487.

Charmaz, K. (2007). Constructing grounded theory: A practical guide through qualitative analysis. Sage Publications.

Coffee, D., Hubbard, D., Holbein, M., & Delacruz, S. (2013). Creating a university-based literacy center. In E. Ortlieb & E. H. Cheek (Eds.), Literacy research, practice and evaluation: Volume 2. Advanced literacy practices: From the clinic to the classroom (pp. 21–42). Emerald Group Publishing Limited.

Cochran, W. G. (1952). The chi-square test of goodness of fit. The Annals of Mathematical Statistics, 23(3), 315–345. doi: 10.1214/aoms/1177729380

Creswell, J. W. (2018). Qualitative inquiry and research design: Choosing among five approaches (3rd ed.). Sage.

Deeney, T., Dozier, C., Cavendish, L., Ferrara, P., Gallagher, T., Gurvitz, D., Hoch, M., Huggins, S., Laster, B., McAndrews, S., McCarty, R., Milby, T., Msengi, S., Rhodes, J., & Waller, R. (2019, December 4-7). Student and family perceptions of the literacy lab/reading clinic experience [Paper presentation]. Literacy Research Association Annual Meeting, Tampa, FL, USA.

Deeney, T., Dozier, C., Smit, J., Davies, S., Laster, B., Applegate, M., Cobb, J., Gaunty-Porter, D., Gurvitz, D., McAndrews, S., Ryan, T., Eeg-Moreland, M., Sargent, S., Swanson, M., Dubert, L., Morewood, A., & Milby, T. (2011). Clinic experiences that promote transfer to school contexts: What matters in clinical teacher preparation. In P. J. Dunston & L. Gambrell (Eds.), 60th yearbook of the Literacy Research Association (pp. 127-143). Literacy Research Association.

Dozier, C., & Deeney, T. (2013). Keeping learners at the center of teaching. In E. T. Ortlieb & E. H. Cheeks (Eds.), Literacy, research, practice, & evaluation: From clinic to classroom (pp. 367-386). Emerald Group Publishing Limited.

Dozier, C., Johnston, P., & Rogers, R. (2006). Critical literacy/critical teaching: Tools for preparing responsive teachers. Teachers College Press.

Ertmer, P. A., & Ottenbreit-Leftwich, A. (2010). Teacher technology change: How knowledge, confidence, beliefs, and culture intersect. Journal of Research on Technology in Education, 42(3), 255-284. doi: 10.1080/15391523.2010.10782551

Evensen, D. (1999). Introduction. In P. Mosenthal & D. Evensen (Eds.), Reconsidering the role of the reading clinic in a new age of literacy (pp. ix-xii). JAI Press.

Fowler, F. J. (1993). Survey research methods (2nd ed.). Sage.

Hodges, C., Moore, S., Lockee, B., Trust, T., & Bond, A. (2020). The difference between emergency remote teaching and online learning. EDUCAUSE.

Koçak-Usluel, Y., Özmen, B., & Çelen, F. K. (2015). Integration of ICT in learning and teaching process and a critical overview of TPACK model. Educational Technology Theory and Practice, 5, 34–35. doi: 10.17943/etku.14356

Ladson-Billings, G. (2014). Culturally relevant pedagogy 2.0: A. K. A. the remix. Harvard Educational Review, 84(1), 74–84.

Laster, B. (2013). A historical view of student learning and teacher development in reading clinics. In E. T. Ortlieb & E. H. Cheeks (Eds.), Literacy, research, practice, and evaluation: From clinic to classroom (pp. 3-20). Emerald Group Publishing Limited.

Laster, B. (1999). Welcoming family literacy at the front door. In P. Mosenthal & D. Evensen (Eds.), Reconsidering the role of the reading clinic in a new age of literacy (pp. 325-346). JAI Press.

Laster, B., Butler, M., Hoch, M., Waller, R., Vasinda, S., Orellana, P., Rhodes, J., Deeney, T., Scott, D. B., Gallagher, T., Cavendish, L., Milby, T., Rogers, R., Johnson, T., Msengi, S., Dozier, C., Huggins, S., & Gurvitz, D. (2022). Literacy clinics during COVID-19: Voices that envision the future. Literacy Research and Instruction, 62(2), 155–179.

Laster, B., Rhodes, J., & Wilson, J. (2018, April). Literacy teachers using iPads in clinical settings. In E. Langran & J. Borup (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference (pp. 544-550). Association for the Advancement of Computing in Education.

Laster, B., Rogers, R., Gallagher, T., Scott, D. B., Vasinda, S. Orellana, P., Rhodes, J., Waller, R., Deeney, T., Hoch, M., Cavendish, L., Milby, T., Butler, M., Johnson, T., Msengi, S., Dozier, C., Huggins, S., & Gurvitz, D. (2021, December). Contrapuntal voices from literacy clinics during COVID-19: What do we harvest for the future? Literacy Research Association Conference.

Laster, B., Tysseling, L., Stinnett, M., Wilson, J., Cherner, T., Curwen, M., Ryan, T., & Huggins, S. (2016). Effective use of tablets (iPads) for multimodal literacy learning: What we learn from reading clinics/literacy labs. The App Teacher.

Leslie, H.  (2021). Trifecta of student engagement: A framework for engaging students in online courses. In Information Resources Management Association (Ed.), Research anthology on developing effective online learning courses. doi: 10.4018/978-1-7998-8047-9.ch007

Leu, D. J., Forzani, E., Rhoads, C., Maykel, C., Kennedy, C., & Timbrell, N. (2015). The new literacies of online research and comprehension: Rethinking the reading achievement gap. Reading Research Quarterly, 50(1), 37-59.

Lewis, K., & Kuhfeld, M. (2021). Learning during COVID-19: An update on student achievement and growth at the start of the 2021-22 school year. NWEA Brief. (ED627552). ERIC.

Lindner, J., Clemons, C., Thoron, A., & Lindner, N. (2020). Remote instruction and distance education: A response to COVID-19. Advancements in Agricultural Development, 1(2), 53-64.

Lohnes Watulak, S., & Laster, B. P.  (2016, December). Take 2: A second look at technology stalled: Exploring the new digital divide in one urban school. Journal of Language and Literacy Education, 12(2), 1-2. 

MacKinnon, G. R. (2017). Highlighting the importance of context in the TPACK model: Three cases of non-traditional settings. Issues and Trends in Learning Technologies, 5(1).

Masters K. (2021). They shoot horses, don’t they? A warning to medical schools about medical teacher burnout during COVID-19. MedEdPublish, 10(1), 1-19.

McKenna, M. C., & Walpole, S. (2007). Assistive technology in the reading clinic: Its emerging potential. Reading Research Quarterly, 42(1), 140.

Means, B., Toyama, Y., Murphy, R., Bakia, M., & Jones, K. (2009). Evaluation of evidence-based practices in online learning: A meta-analysis and review of online learning studies.

Mishra, P., & Koehler, M. J. (2006). Technological pedagogical content knowledge: A new framework for teacher knowledge. Teachers College Record, 108(6), 1017-1054.

Moore, S., Trust, T., Lockee, B., Bond, A., & Hodges, C. (2021, November 10). One year later . . . and counting: Reflections on emergency remote teaching and online learning. EDUCAUSE Review.

Msengi, S., & Laster, B. (2022). Pivots during COVID-19:  Teachers, students, parents, and supervisors in the circle of literacy clinics. In V. Shinas (Ed.), Cases on practical applications for remote, hybrid, and hyflex teaching (pp. 244-265). IGI Global.

National Academy of Education. (2020). COVID-19 educational inequities roundtable series, summary report.

Neuwirth, L. S., Jović, S., & Mukherji, B. R. (2021). Reimagining higher education during and post-COVID-19: Challenges and opportunities. Journal of Adult and Continuing Education, 27(2), 141-156.

Niemi, H. M., & Kousa, P. (2020). A case study of students’ and teachers’ perceptions in a Finnish high school during the COVID pandemic. International Journal of Technology in Education and Science.

Onyema, E. M., et al. (2020). Impact of Coronavirus pandemic on education. Journal of Education and Practice, 11(13), 108-121.

Ortlieb, E., Sargent, S., & Moreland, M. (2014). Evaluating the efficacy of using a digital reading environment to improve reading comprehension within a reading clinic. Reading Psychology, 35(5), 397-421.

Paris, D., & Alim, H. S. (Eds.). (2017). Culturally sustaining pedagogies: Teaching and learning for justice in a changing world. Teachers College Press.

Pletcher, B., Robertson, P.,  & Sullivan, M. (2019). A current overview of 10 university-based reading clinics.  Reading Horizons, 58(3), 1-22.

Preacher, K. J. (2001, April). Calculation for the chi-square test: An interactive calculation tool for chi-square tests of goodness of fit and independence [Computer software].

Ravitch, S. M. (2020). Flux pedagogy: Transforming teaching and leading during coronavirus. Penn GSE Perspectives on Urban Education, 17(1), 1-15.

Reimers, F. M., & Schleicher, A. (2020). A framework to guide an education response to the COVID-19 pandemic of 2020.

Rhodes, J. (2013). Innovative practices in the reading clinic: Helping “Digital Natives” incorporate 21st century technologies. In E. T. Ortlieb & E. H. Cheeks (Eds.), Literacy, research, practice, and evaluation: From clinic to classroom (pp. 283-302). Emerald Group Publishing Limited.

Romero-Hall, E., & Vicentini, C. R. (2017). Examining distance learners in hybrid synchronous instruction: Successes and challenges. Online Learning Journal, 21(4).

Saldaña, J. (2021). The coding manual for qualitative researchers. Sage.

Secoy, M., & Sigler, H. W. (2019). Discovering digital differentiation: A teacher reimagines writing workshop in the digital age. Voices from the Middle, 26(4), 21–27.

Selwyn, N., & Jandrić, P. (2020). Postdigital living in the age of Covid-19: Unsettling what we see as possible. Postdigital Science and Education, 2, 989-1005.

Shulman, L. (1986). Those who understand: Knowledge growth in teaching. Educational Researcher, 15(2), 4-14.

Singh, V., & Thurman, T. (2019). How many ways can we define online learning? A systematic literature review of definitions of online learning (1988-2018). American Journal of Distance Education, 33(4), 289-306.

Tysseling, L., & Laster, B. (2013). Taking technology from clinic to classroom. In E. T. Ortlieb & E. H. Cheek (Eds.), Literacy, research, practice, and evaluation: From clinic to classroom (pp. 245-264). Emerald Group Publishing Limited.

Vasinda, S., Adams, H., James, K., Henry, A., Henson, T., McKinney, B., Mueller, E., Randolph, M., & Taylor, J. (2020). Preservice teachers use design-based research: Learning to tutor online during COVID19. In R. Ferdig, E. Baumgartner, R. Hartshorne, R. Kaplan-Rakowski, & C. Mouza, (Eds.), Teaching, technology, and teacher education during the COVID-19 pandemic: Stories from the field (pp. 367-372). Association for the Advancement of Computing in Education.

Vasinda, S., Kander, F., & Redmond-Sanogo, A. (2015). University reading and mathematics clinics in the digital age: Opportunities and challenges of iPad integration. In M. Niess & G. Gillow-Wiles, (Eds.), The handbook of research on teacher education in the digital age (pp. 135-163). IGI Global.

Vokatis, B. (2017). How to establish an online reading clinic with quality supervision: A perspective on course design, technologies, and mentorship. Literacy Practice & Research, 42(2), 34-38.

Wankel, C., & Blessinger, P. (Eds.). (2012). Increasing student engagement and retention using online learning activities: Wikis, blogs and webquests. Emerald Group Publishing.

Appendix

Literacy Clinics Survey 2021

  1.  During 2020-21 you taught the practicum to… (e.g., undergraduates; graduate students who are preservice; graduate students who are inservice teachers; other___; I did not teach practicum during the pandemic (no need to proceed with the survey))
  2. What was/is your role(s) in clinic/lab? (e.g., instructor; director/supervisor; administrator/logistics coordinator; other __________)
  3. Compare the format of your clinic/lab before, during, and (if you know) after the pandemic. 
  4. In what space did your clinicians teach?
  5. With whom did you consult as you planned for the change?
  6. Compare the technology platforms and resources used in your clinic/lab before, during, and (if you know) after the pandemic.
  7. How was recruitment of clients/students different?
  8. How was assessment of writing affected?
  9. How was assessment of reading affected?
  10. How was reading instruction affected?
  11. How was writing instruction affected?
  12. In what ways did your changes during the pandemic afford or constrain parents/caregivers’ participation and clinic/lab personnel’s communication with families?
  13. How did the progress of students differ during the pandemic? Under “other” please explain your response.
  14. How was your supervision of clinicians altered? Tell about what were affordances and what were constraints to your supervision/support of clinicians. (Please do not include information in your response that could be used to identify you.)
  15. What were changes in your communication with clinicians and/or changes in course expectations? How did you build community among clinicians?
  16. Of the changes that you made during the pandemic, what will you stop & what will you keep going forward?
  17. Explain any details of this pandemic’s impact on your clinic/lab (e.g., most frustrating aspects? what was hopeful?)