

“How Do We Do This at a Distance?!” A Descriptive Study of Remote Undergraduate Research Programs during COVID-19

    Published Online: https://doi.org/10.1187/cbe.21-05-0125

    Abstract

    The COVID-19 pandemic shut down undergraduate research programs across the United States. A group of 23 colleges, universities, and research institutes hosted remote undergraduate research programs in the life sciences during Summer 2020. Given the unprecedented offering of remote programs, we carried out a study to describe and evaluate them. Using structured templates, we documented how programs were designed and implemented, including who participated. Through focus groups and surveys, we identified programmatic strengths and shortcomings as well as recommendations for improvements from students’ perspectives. Strengths included the quality of mentorship, opportunities for learning and professional development, and a feeling of connection with a larger community. Weaknesses included limited cohort building, challenges with insufficient structure, and issues with technology. Although all programs had one or more activities related to diversity, equity, inclusion, and justice, these topics were largely absent from student reports even though programs coincided with a peak in national consciousness about racial inequities and structural racism. Our results provide evidence for designing remote Research Experiences for Undergraduates (REUs) that are experienced favorably by students. Our results also indicate that remote REUs are sufficiently positive to further investigate their affordances and constraints, including the potential to scale up offerings, with minimal concern about disenfranchising students.

    INTRODUCTION

    The global COVID-19 pandemic caused major disruptions to research and teaching across postsecondary education in 2020. Educators and the organizations that support them, ranging from education companies to professional societies to centers for teaching and learning, all scrambled to shift to online experiences for undergraduate programs. A body of knowledge about online instruction, including principles for designing and strategies for teaching online courses synchronously and asynchronously, was available to inform these changes (e.g., Collison et al., 2000; Palloff and Pratt, 2007; Means et al., 2014). Yet, as science, technology, engineering, and mathematics (STEM) undergraduate education has shifted to maximize student involvement in research, a major gap in knowledge has been identified: how to engage undergraduates in research at a distance.

    Alternatives have been offered to afford students opportunities to think and work like scientists at a distance, such as by analyzing literature or carrying out virtual lab or at-home demonstration laboratory activities (Qiang et al., 2020). Although these approaches are demonstrated to promote student learning and development (e.g., Clark et al., 2009), it is questionable whether they can fully replace the educational value afforded by in-person undergraduate research experiences in STEM. Of particular value is the role that in-person research experiences play in facilitating undergraduate student integration into the scientific community and enabling students to clarify their educational and career interests (Laursen et al., 2010; Lopatto and Tobias, 2010; Estrada et al., 2011; Gentile et al., 2017). It was therefore especially concerning that these in-person experiences were relegated to remote formats in 2020.

    Many programs are in place nationwide to offer undergraduate research experiences in the form of internships every summer. One of the most long-standing and widely recognized sources of support for these programs is the National Science Foundation (NSF). This support started in the form of the NSF Undergraduate Research Participation (URP) program, which was launched in 1958 (Neckers, 1982). The NSF URP–funded projects, known as REU Sites, recruited, selected, and hosted undergraduates as research interns working with faculty mentors and other scientists, including graduate students and postdoctoral associates. Resumed in 1987 as the Research Experiences for Undergraduates (REU) program, REU continues to be one of the largest supporters of undergraduate research experiences in the United States (McDevitt et al., 2017). Currently, NSF typically supports undergraduate research experiences through two funding mechanisms: REU Sites, which host cohorts of approximately 10 students each year, and REU Supplements, which typically support one or two undergraduate researchers associated with an individual faculty member’s NSF-funded research project. The REU Sites are based on independent proposals to recruit, select, and engage cohorts of undergraduates in research and complementary professional development and social activities. The REU Sites can be based in a single discipline or can offer interdisciplinary research opportunities.

    In 2019 alone, NSF’s Biological Sciences Directorate (BIO) funded 125 REU Sites, engaging ∼1270 undergraduates in research, 68% of whom identified as women and 61% of whom identified as members of underrepresented minority groups (S. O’Connor, NSF REU program officer, personal communication). The BIO REU Sites are connected through a Leadership Council that functions as a communication and resource-sharing hub. In Spring 2020, communication through the Leadership Council revealed that about 80% of BIO REU Sites opted to cancel their Summer 2020 REU programs due to the COVID-19 pandemic, while about 20% (25 programs) opted to proceed. These programs are the main focus of our study; students supported by REU Supplements were not included. One additional undergraduate program, funded by the USDA National Institute of Food and Agriculture, was also included because it had previously received support from NSF BIO and thus was still connected through the Leadership Council. Furthermore, some of the REU Sites in this study also included undergraduate researchers supported by other funding sources. These students were included in our study because the only difference in their experiences compared with those of REU Site–funded students was the source of their funding. Thus, from here forward, we use the broader term “program” rather than the NSF-specific term “Site.” We define “program” as a coherent, time-bounded, organized experience for a cohort of undergraduates during which they engage in mentored research accompanied by professional development and social activities.

    To document how typically in-person programs operated remotely, 23 programs collaborated to generate descriptive accounts of how their programs were designed and implemented. These programs also collaborated with an external evaluation team (authors O.A.E., R.B.C., and E.L.D.) to collect and analyze evaluation data on how undergraduates experienced REU programming, including their perceptions of programmatic strengths and weaknesses and recommendations for improvements. Here, we report the descriptive accounts and their alignment with the evaluation results. Given the unprecedented nature of the situation—specifically, the national shutdown and transition to online instruction by research institutions that host Summer REU programs—we aimed to address two research questions:

    • In what ways were Summer REU programs implemented remotely?

    • What were the strengths of these programs as well as suggestions for improvement from the perspectives of undergraduate researchers?

    Our results yield preliminary insights into the features of remote undergraduate research programs that might make them effective for students and inform improvements to such programs in the future.

    DESIGN AND METHODS

    We designed this study to include observational descriptive and evaluative components. Through the observational description, we sought to characterize the range of ways programs were implemented during the COVID-19 pandemic. We used a “case series” approach that allowed for the systematic documentation of 23 life science undergraduate research programs offered in Summer 2020, each serving as a distinct case or implementation of a remote program (Grimes and Schulz, 2002). We collected data to document who participated in the 23 remote REU programs; what activities occurred in each program; and when, where, and how each program was implemented. Then, we conducted an evaluation study of the different REU programs from a utilization-focused perspective (Patton, 2008), meaning that we aimed to collect, analyze, and report data that would be useful to program principal investigators (PIs). Specifically, we sought feedback from undergraduate researchers on the strengths of the novel, remote experiences as well as suggestions for improving programs both immediately and in future offerings. The results reported here are part of a larger study of remote REUs that was reviewed and determined to be exempt by the University of Georgia Institutional Review Board (STUDY00005841, MOD00008085).

    Programs and Participants

    We invited 25 programs that involved students in remote undergraduate research in 2020 to participate in this study; 23 programs chose to participate. The programs were hosted by 24 organizations (e.g., universities, research institutes) in 18 states and one U.S. territory and involved 3 to 39 students and 2 to 20 mentors per program, with funding from NSF, USDA, and other sources. One program that was invited to participate in the evaluation did not have the capacity to offer research at a distance, so it joined with another program to offer a combined program. Five programs across four institutions also involved in-person research experiences for a small number of students, while 21 programs were entirely remote. In this study, we focus primarily on the remote programming and the experiences of students who engaged with their programs and carried out research entirely online. Table 1 describes the students who participated in this study (n = 275), including their racial and ethnic identities, prior research experience, gender, and first-generation college status.

    TABLE 1. Characteristics of students participating in this study^a

                                                      Prior research experience
    Race/ethnicity                        None   1 term   2 terms   3 terms   >3 terms   Not reporting   Total
    African American or Black               9       10         9         6         11                      45
    Central and East Asian                   6        5         8         8          4                      31
    Latinx                                  11       14        18        14         12               1      70
    Middle Eastern                           1        1                                                      2
    Native American or Native Hawaiian       4        4         3         1          2                      14
    South Asian                              1        3         1         5                                 10
    White                                   21       36        39        17         25                     138
    Not reporting                            1        1         1                                            3
    Total                                   46       64        68        39         56               2     275

    ^a In total, 275 students participated in this study, including 184 women, 82 men, seven individuals identifying as nonbinary, and two not reporting a gender. There were 55 students who identified as transfer students, and 78 who indicated they were first-generation college students (i.e., no parent or guardian completed a bachelor’s degree). Students’ racial and ethnic identities are reported, disaggregated by the number of terms (i.e., Summer, quarter, or semester) they indicated participating in research before Summer 2020. Students who identified with multiple races or ethnicities are included in all relevant counts (e.g., a student who reported as Black and Latinx is included in counts for both African-American or Black students and Latinx students). Thus, counts may not sum to the totals.

    Data Collection and Analysis

    We collected three types of data: written program descriptions from program PIs, focus groups with students, and surveys of students. Each is described in detail in the following sections.

    Written Descriptions.

    We collected written descriptions of each program using a structured template (see Supplemental Material) to document when, where, and how each program was implemented from the perspective of its PI(s). Shortly after their programs were completed, we asked PIs to describe the design and implementation of their programs, including expectations, introductory and culminating events, and weekly activities. We chose this timing to ensure PIs could describe the implementation of their programs in their entirety (i.e., after all activities were completed) and with accuracy (i.e., soon enough to be able to recollect program activities). We then edited the descriptions to create streamlined, uniformly structured “program profiles” to allow for quick comprehension and easy comparison of the features of each program. We met briefly with PIs to clarify any ambiguities and fill in any gaps in the profiles before asking them to review and revise the profiles and confirm that they accurately represented the design and implementation of their programs. Once the profiles were completed and compiled (included in Supplemental Material), we reviewed the collection to generate a summary description of the programs. Program names are included to allow readers to follow up directly with PIs for details.

    Focus Groups.

    We conducted focus groups with students in each program at the midpoint and end of the program. On average, 81% and 67% of students participated in the midpoint and end-of-program focus groups, respectively, with participation by program ranging from 33% to 100% at the midpoint and from 17% to 100% at the end of the program. We sought feedback about positive aspects of programs as well as suggestions for improvements. For larger programs or instances when not all students were available at the same time, we held multiple focus groups, and students chose the one that best suited their schedules. If a student was unable to participate in a focus group, we solicited responses to the focus group questions by email. All focus groups were recorded to ensure feedback was captured accurately and in its entirety.

    The student focus group data were the primary focus of analysis. The evaluation team (authors O.A.E., R.B.C., and E.L.D.) identified strengths and suggestions for improvement for each program by reviewing student responses and creating brief, descriptive, and actionable summaries along with illustrative quotes as supporting data, which were provided in midpoint and end-of-program reports to each program. The evaluation team then carried out an inductive, qualitative content analysis of the reports (Miles et al., 2014; Saldana, 2015). The team independently read each strength and suggestion and ascribed a meaning to it (i.e., To what aspect of the program does this strength or suggestion relate?). The team then met as a group to discuss and refine the meanings, group them into larger themes, and develop definitions of each theme. The evaluation team then carried out a deductive check to ensure that the themes provided a coherent and cohesive representation of the meanings identified across all of the focus groups (Saldana, 2015). Specifically, the team compiled all of the feedback initially identified as fitting a particular theme and reviewed the feedback to determine whether and how it related to the theme. The team revised and refined the themes as needed to ensure they represented a parsimonious interpretation of the data while reflecting the range of feedback identified in the focus groups.

    Finally, the evaluation team reviewed all of the reports to identify crosscutting themes related to the strengths and suggestions and to determine whether each theme was reported as a strength, a suggestion for improvement, or a mixture of the two for each program. In keeping with a descriptive study, our results include detailed descriptions of each program (see Supplemental Material) as well as descriptions of the strengths and suggestions identified through this cross-program analysis.

    Surveys.

    To complement the focus group data, we surveyed students at the end of their programs regarding:

    • the extent to which they experienced their programs synchronously versus asynchronously;

    • the quality of their relationships with their research mentors (Ragins and Cotton, 1999); and

    • the level of connectedness they felt in their programs (Rovai, 2002).

    Survey items are included in the Supplemental Material. Given the research questions and the descriptive nature of the work, we calculated means and standard deviations for each of these variables across the entire data set, and we depict program-level data using violin plots.
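    To make this descriptive analysis concrete, the sketch below shows one way such summary statistics and program-level violin plots could be generated. It is a minimal illustration, not the evaluation team's actual code; the file name (survey.csv) and column names (program, sync_async, mentor_quality, connectedness) are hypothetical stand-ins for the survey variables described above.

```python
# Minimal sketch (hypothetical file and column names) of the descriptive survey analysis:
# overall means/SDs per scale, plus program-level distributions shown as violin plots.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# One row per student; columns hold the program identifier and each scale score.
df = pd.read_csv("survey.csv")  # hypothetical columns: program, sync_async, mentor_quality, connectedness

scales = ["sync_async", "mentor_quality", "connectedness"]

# Means and standard deviations for the entire data set.
summary = df[scales].agg(["mean", "std"]).round(2)
print(summary)

# Program-level data depicted as violin plots, one panel per scale.
fig, axes = plt.subplots(len(scales), 1, figsize=(10, 9), sharex=True)
for ax, scale in zip(axes, scales):
    sns.violinplot(data=df, x="program", y=scale, cut=0, ax=ax)
    ax.set_ylabel(scale)
axes[-1].set_xlabel("Program (numbered from most to fewest strengths)")
fig.tight_layout()
plt.show()
```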

    Program names have been removed in the reports of the focus group and survey data to protect program confidentiality. Programs are numbered in order from most to fewest strengths for reference in the figures.

    RESULTS AND DISCUSSION

    Here we present the descriptions of remote program design and implementation. For succinctness, we have integrated the presentation and discussion of the themes that emerged as strengths and areas for improvement during student focus groups. When relevant, we include survey results to support focus group findings.

    Remote Undergraduate Research Program Design and Implementation

    The programs in this study varied in the extent to which the overall design and scientific focus changed to accommodate remote offerings. Some programs shifted to allow students to work in teams with a single mentor or for mentors to work collaboratively with one or more students. For some programs, these changes enabled the involvement of more students. For others, partnering bench- or field-focused faculty with colleagues doing computational work enabled the formulation of suitable projects. Some programs that previously had students work in teams dropped the teamwork component to ease logistics. Some programs were already computational in focus, and one program, the Rosetta Commons REU: A Cyberlinked Program in Computational Biomolecular Structure & Design, had been implemented with distributed cohorts in previous years (Alford et al., 2017). For these programs, more modest changes were made to accommodate remote participation. Student survey responses indicated that the programs included a mix of synchronous and asynchronous programming (Figure 1).

    FIGURE 1.

    FIGURE 1. Synchronous vs. asynchronous programming. Students reported that their programs, numbered from 1 to 23 in order of most to fewest strengths, were structured more synchronously than asynchronously (mean = 1.44; SD = 0.71 on a scale from 1 = entirely synchronous to 5 = entirely asynchronous). The lack of consensus in student ratings may indicate variation in how students experienced their programs, with some engaging in more asynchronous activities than others (e.g., watching video recordings of speakers rather than attending live sessions). Alternatively, students may have perceived the rating scale differently. Details about the level of synchronous vs. asynchronous programming are provided in the Supplemental Material.

    All programs hosted some form of kickoff or orientation for students and/or mentors in the first day or two of the program, although the goals, structure, and content ranged widely. Some programs prioritized social interactions by facilitating get-acquainted sessions and community-building exercises. Some programs focused on getting students acquainted with the research, the program, and the expectations for the summer. Two programs organized events or activities that preceded the program start date, such as discussions among mentors about plans for the summer and how to address issues that might arise, and workshops for students to get acquainted with research options and begin building computational skills.

    All programs implemented knowledge- or skill-building sessions, either early on or distributed throughout the summer. These sessions aimed to develop a range of skills, from coding in R to using particular types of software or platforms (e.g., ImageJ, Rosetta Commons, Software Carpentry). Other topics included how to carry out literature searches, navigate databases, use reference managers, apply for fellowships, prepare for the Graduate Record Examination (GRE), conduct particular statistical tests, make posters, and communicate scientifically (writing manuscript-style papers, presenting posters, etc.). All programs included sessions dedicated to the ethical and responsible conduct of research, with some programs addressing particular bioethical considerations such as human subjects research and issues related to the use of sex and race categories in research (e.g., the Fungal Genomics and Computational Biology Summer Research program). The Exploring 21st Century Careers in the Biological Sciences: A Comparative Regenerative Biology Approach program facilitated sessions on innovation, intellectual property, and technology transfer. The Genes & the Environment REU from Rural & Tribal Colleges program facilitated sessions on psychosocial skill building, such as managing stress, practicing mindfulness, and engaging in difficult conversations.

    All programs also hosted panel discussions, scientific seminars, or talks by guest speakers to facilitate students’ professional development beyond research and skill building. Panel discussions addressed a range of topics, from applying to graduate school to offering advice on careers, graduate school, and navigating science as a person of color. Most programs included students in scientific seminars or journal clubs, with some programs expecting students to present relevant literature or their own research in progress. All programs included at least some discussion about social justice, diversity, equity, inclusion, and/or anti-racism. These discussions were facilitated in a variety of ways, from hosting events on anti-racism and pride to facilitating movie nights with discussions about the Black Lives Matter and ShutDownSTEM movements.

    Some programs included more informal, less structured elements, such as hosting lunch hours, coffee breaks, teatimes, and game nights using Zoom Video Communications (Zoom). In some programs, these events were organized by students. Some programs also included Zoom drop-in hours for advice about graduate school, careers, research, technical issues, and troubleshooting. At least two programs collected evaluation data beyond those described here to make improvements during the summer and identify ways to support students after they completed the program. For instance, the Bruins-in-Genomics Summer Undergraduate Research Program administered regular check-in surveys with students and mentors to identify and address any issues that arose.

    All programs ended with students presenting their research progress in the form of short talks or posters. Two programs also held award sessions. Talk formats ranged widely, from 10- to 15-minute individual or team presentations followed by a few minutes of questions and answers, to 3-minute thesis-style presentations or other very short talks. All programs required students to produce one or more products, such as posters, talks, papers, proposals, or videos. The Cary Institute of Ecosystem Studies REU program required students to generate “data nuggets” (http://datanuggets.org), which are mini-research projects or tasks that can be used in K–16 instruction to develop students’ science research skills. Some programs made a point of encouraging students to invite family and friends. The Morton Arboretum: Integrative Tree Science in the Anthropocene program included keynote speakers of color. The Rosetta Commons REU program held its culminating event as part of a larger conference held by the Rosetta Commons community (www.rosettacommons.org). The Training and Experimentation in Computational Biology program held its closing poster session in virtual reality.

    Strengths and Areas for Improvement of Remote Undergraduate Research Programs

    Students in this study described program strengths and areas for improvement in terms of 10 overarching themes (Figure 2). Three themes that emerged as strengths across programs were 1) quality of mentorship, 2) opportunities for learning, and 3) feeling connected with research groups and programs. Two themes that emerged as areas for improvement were 4) the cohort experience and 5) the unstructured nature of research and remote work. Two themes emerged as having both beneficial and problematic elements: 6) program logistics and 7) opportunities for professional socialization. Finally, three themes were identified less frequently across programs and were experienced as either strengths or areas for improvement depending on the program: 8) networking; 9) technical issues; and 10) diversity, equity, inclusion, and justice (DEIJ). Each of these themes is defined and described in numerical order in the following sections. As a reminder, 23 programs were included in the study and analysis.

    FIGURE 2.

    FIGURE 2. Student-identified strengths and areas for improvement in remote REU Sites. This figure provides an overview of the strengths and areas for improvement for 21 programs in this study, which are numbered across the top. Programs 20 and 21 are not included here, because students in these programs did not participate in focus groups. Programs 22 and 23 are separated, because they included substantive in-person components. Blue indicates the areas of strength (three most common in the top three rows); red indicates areas in need of improvement (next two rows); purple indicates a mixture within a program, with some students emphasizing this as a strength and others as an area in need of improvement (next two rows); white indicates that no evidence related to that theme was observed during the focus groups for that program. The bottom three rows feature themes that were mentioned by students in fewer programs. The four columns on the right are sums of how many programs had students reporting the theme as a strength, a concern, or a mix, with the total indicating how many programs had students commenting on the theme regardless of whether it was a strength or concern.

    Theme 1. Mentorship: Students Described How the Mentorship They Received from Their Research Mentors Helped Them Learn, Make Progress in Their Research, and Be Successful in Their Programs.

    The main strength across most of the programs in this study was students’ perceptions of the mentorship they received. Students in 15 programs spoke favorably about this mentorship, as described by one student:

    The mentor that I had personally, they went out of their way to make sure I was in a good area or ask how I was doing. My mentor in particular was [having a personal situation]. So he had to leave for a while. I had a technician of his take over and she was amazing as well. Even while his family was going through that he would message me to see, “How are you doing? How’s your research going? Is there anything that I can do?” It was going above and beyond to make sure that I was understanding what I was doing and getting the most out of this experience.

    This quote captures a sentiment expressed by other students—that mentors provided both direct support and indirect support by connecting them with someone who could help when the mentor was unable to do so. Students across programs noted how their mentors forged connections between them and the rest of the research group so they could reach out and ask questions. One student noted that “it is helpful knowing if I get stuck on something, [my mentor] is available.”

    Students’ descriptions of the mentorship they experienced fit scholarly definitions of mentorship, including positive feelings about the relationship and support received from the mentor (Eby et al., 2013; Byars-Winston and Dahlberg, 2019). Students reported receiving technical support (e.g., how to accomplish a particular research task), career support (e.g., guidance on applying to graduate school), and psychosocial support (e.g., encouragement when encountering research difficulties). Most students who commented on mentorship felt that their mentors cared about them not just as scientists, but as people. For instance, one student appreciated that their mentor “was really invested in [them] and invested in [their] research.” Another student noted that their relationship with their mentor is “something [they] cherish a lot.” Students also appreciated mentors’ responsiveness to how the pandemic could be affecting students’ work and mentors’ willingness to be flexible about complications that arose from working from home. One student observed: “there are so many assumptions that can be made about students.” Students repeatedly mentioned how mentors quelled their anxieties about asking for help and “never made [them] feel dumb for needing help.”

    Students appreciated that their mentors provided dynamic, responsive support, rather than being “one-size-fits-all.” For instance, they commented on their mentors’ ability to balance providing support with allowing students to answer their own questions. One student noted that their mentor “[made] sure [they were] on track. It wasn’t too overbearing, but they were also always making sure I was going along on the project.” Another student described how their faculty mentor was open to feedback such that, when the student expressed concerns about how their experience was going, “it actually improved once I talked to my PI about what was going on and what I needed from her, which helped. That made a big difference.” The mostly positive experience students had with their research mentors is evident in their overall positive ratings of the quality of their relationships with their mentors (Figure 3).

    FIGURE 3.

    FIGURE 3. Mentorship relationship quality. For the most part, students reported a high level of agreement that they had positive relationships with their research mentors (mean = 5.31 out of 6; SD = 1.16). This figure shows student ratings by REU Site, with a rating of 6 indicating strong agreement and 1 indicating strong disagreement (see Supplemental Material for items and rating scale). Some negative ratings were observed, reflecting the mixed or negative experiences of some students.

    Mentorship was not a strength for all programs and students. Students in one program indicated that the mentorship they received was inadequate, and students in three programs had mixed ratings of their mentoring relationships (Figure 3). In these instances, students expressed concern that the time they were able to spend with mentors was inadequate and that the ways they were able to communicate (or not) with their mentors were insufficient. For instance, some students who were struggling with their research felt they could not just “drop in” to ask a question or get help. They perceived that their mentors would have been receptive to providing drop-in help if the program had been in person, but they did not see a way to accomplish this remotely. One student indicated having a set weekly meeting with their mentor and otherwise was not “allowed” to contact the mentor with questions except in emergency situations. This often meant that they would reach an impasse in their research and be unable to make progress during the week until the next weekly meeting. These results are consistent with research showing that not all undergraduate research mentorship experiences are positive (Limeri et al., 2019) and that informal interactions are critical components of effective mentorship (Ragins and Cotton, 1999).

    Recommendations from the National Academies on effective and inclusive research mentorship offer guidance on how to avoid or mitigate the impact of insufficient or problematic mentorship (Byars-Winston and Dahlberg, 2019). First, programs can establish an expectation that all mentors participate in professional development to improve their mentoring skills. Second, programs can set clear expectations for how frequently mentors should communicate with students and how flexible that communication should be. Third, programs can collect data on mentorship support and quality and determine whether certain individuals are not well suited to mentor students at a distance or in general. Finally, programs can conduct midpoint checks with students about the mentorship they are receiving, including what is working well and what needs to be improved. This feedback can then be used to help mentors and students improve the mentoring relationship or remove students from situations that are deemed sufficiently problematic.

    Theme 2. Learning: Students Described Gains in Knowledge, Skills, or Abilities as a Result of Participating in Remote Research.

    Students in 15 programs emphasized how much they learned from their research experience. Students reported gaining knowledge in the content area of their research and vastly improving their coding skills, with one student describing their coding abilities as “phenomenally improved.” Even for programs in which computational biology was not a major emphasis, the remote nature of the research meant that students carried out projects that involved coding to query data sets and conduct analyses. Students perceived that their research experiences provided a “real-life” context for learning to code, which was superior to learning coding through course work, as one student noted: “Be[ing] able to actually use it in a project was so much better for learning how to program than anything I could have learned in a class at my university.” In addition, students perceived that their new skills would be “so beneficial for future research and future labs.”

    Beyond gaining content knowledge and technical skills, students reported learning more about the research process and gaining confidence in their own abilities to be successful in research. One student noted that “when [they] first started, [they] thought it would be super hard to conduct research, and it was difficult, but it’s not as unattainable as [they] once thought it was.” Students also reported developing other professional and scientific skills such as troubleshooting, expressing that “figuring out things for yourself has become satisfying” and that they now felt “equipped with the skills to be able to troubleshoot problems when I have them.” Students expressed surprise that they were able to grow in their knowledge, skills, and confidence in such a short time while working remotely, with one student explaining that “[at first, I was] really nervous putting things together … but toward the end I was really communicating with my colleagues.”

    Theme 3. Connectedness: Students Described the Sense of Being Connected to and Comfortable with Their Research Groups, Their Programs, or Broader Scientific Communities.

    Note: Students described their sense of being connected with their research groups, the program, or the broader scientific community as distinct from feeling like a cohort of undergraduate researchers within their specific program. Thus, the cohort experience is described separately.

    Students in 12 programs emphasized how their programs and their research groups helped them feel like they were connected to a research community that would not have been available to them if they had not participated in remote research. This finding adds to a previous report that students in a mostly remote REU program were able to develop a sense of community (Alford et al., 2017). Students’ sense of connectedness with a larger community manifested in a variety of ways. Some students described how their programs created a culture where students felt they could “go to anyone for help” and that this environment allowed them to “see how collaborative research really is.” Some programs and research groups ensured that students had ample opportunities to interact with graduate students other than those who served as their research mentors, and this had a “profound impact on [their] overall experience” and “play[ed] a big role in feeling welcome to [their] lab group.” Students emphasized the importance of making these connections early in the summer so that it was easier to seek out that guidance later in the program. Yet another student noted that the level of engagement by everyone involved in the program helped them feel connected. The student described that, during presentations, “Everyone is really supportive and engaged and they give you really valuable feedback, not just for the sake of giving feedback, but because they’re actually engaged with what you’re saying.”

    For the most part, students positively rated the connectedness they felt with their programs (Figure 4), although student ratings in certain programs were less favorable. Students in one program indicated they felt disconnected because there was no transparency about whether they could seek help from others outside their research group or what resources were available to the entire group. They explained that there was a “resource sitting there for everybody and only a select few knew about it.” It appeared that one or a few research groups made their students aware of the resource but that other research groups and the program administrators did not, which created inequity that undermined their sense of connectedness with the program. In addition, only some research groups in this program made an effort to connect their students with other faculty. These students appreciated the opportunity to develop relationships with faculty members other than their mentors and to become part of a “community of different scientists.” Students who did not have this experience were eager for it, indicating they wanted to learn from a broader and more diverse group of faculty members about topics beyond “research and what they look for in graduate students,” such as “how they became a scientist and what they see as lab culture.”

    FIGURE 4.

    FIGURE 4. Connectedness. Students were generally positive about the sense of connectedness they felt in their programs (mean = 4.51 out of 6; SD = 0.90), but their ratings were lower (i.e., lower means and medians) and more consistent (i.e., smaller SD) within each REU Site than ratings of their relationships with their mentors. This figure shows student ratings by REU Site, with 6 indicating strong agreement and 1 indicating strong disagreement (see Supplemental Material for items and rating scale).

    Theme 4. Cohort Experience: Students Described the Sense of Feeling Close to and Engaged with Other Undergraduate Researchers in Their Cohort or Feeling Isolated or Disconnected from the Group.

    Students in 12 programs indicated that they missed interacting with other undergraduate researchers and expressed concern about missing out on a cohort experience. In one program, students had mixed feelings, with some finding it easier and some finding it more difficult to get to know one another. One student described feeling connected with the other undergraduate researchers in their program, noting that “it was sweet to see the other interns and to want to go to their [Zoom breakout] rooms and just check in on everyone. I still feel like, even though [the program] wasn’t in person, it built camaraderie and a cohort.” Other students lamented the loss of informal interactions because they were not “able to ask a neighbor, ‘Hey, can you help me out with this?’” One student explained how not getting to know people on a personal level meant that they were not able to alleviate some of the nervous feelings associated with asking questions.

    Students reported several factors that prevented or undermined the development of a cohort feeling. First, some programs involved only a few students. Students thought that the small number was insufficient to provide a cohort experience. Second, at least one program held fewer whole-group events as the summer progressed to allow students to focus their attention on their research. Students in this program indicated that they would have preferred to continue meeting weekly as a whole group to continue to get to know one another. Finally, students found it difficult to have more casual interactions that normally occurred when working alongside others. They felt that this limited their abilities to network and build relationships with other students.

    Some programs arranged social time on Zoom for cohort building, but students had mixed feelings about this. Some appreciated having game nights or other social activities (e.g., Pictionary on virtual whiteboards, bingo, escape room, trivia night, Jackbox, virtual meditation or yoga), while others felt “Zoom fatigue” after many hours of program and research activities on Zoom. Students in several programs suggested integrating cohort building into regular work-week activities rather than as an additional activity. For instance, students in several programs expressed the desire for synchronous, online work time on Zoom to simulate an in-person collaborative work environment. Students could join the call and ask impromptu questions or talk through ideas as they worked. Similarly, students wanted to use GroupMe or Slack among themselves to communicate about non–research-related things and get to know each other.

    Students in three programs noted that cohort building was a strength of their programs, emphasizing that they still felt connected with other undergraduate researchers in the program despite the remote circumstances. They reported that doing activities as a group and being encouraged by program leadership to socialize among themselves helped to achieve this. Other factors that promoted their sense of camaraderie included talking about things “outside the scope of our respective projects,” such as students’ roles in the broader scientific community and in the world given the country’s raised awareness of systemic racism and racial injustice. For instance, one group of students commended their program for making time and creating a safe space for discussion about BlackLivesMatter and ongoing racial injustice in honor of the #ShutDownSTEM initiative. This group reported that these activities helped to both “build a dialogue about the issues and build a community” among the cohort. Students in another program appreciated the intentionality displayed by the program’s leadership to support cohort building. This program established a committee structure, which gave every student a way to be involved and promoted a sense of inclusion. This is consistent with research on community building, which indicates that community can be fostered through shared tasks (Lave and Wenger, 1991; Wenger, 1999; Kim, 2006). Students also noted that having a student-only GroupMe group or Slack channel as well as the use of smaller breakout groups on Zoom all facilitated getting to know one another and promoted a cohort feeling.

    Theme 5. Structure: Students Described Program Design Elements, Such as Schedules, Workflows, Expectations, Milestones, or Deadlines, That Helped Them Organize Work and Manage Time.

    Students in 14 programs indicated that they were struggling with the lack of structure inherent to remote work and to research. At least some students struggled to organize their workdays, because they did not have the structure of physically leaving home at a regular time to go to a research environment. Furthermore, science research itself is an unstructured or “ill-structured” endeavor, meaning that there are multiple ways to make progress and no single “right” answer (Simon, 1977; Dolan and Weaver, 2021). Thus, remote research appeared to function as a “double whammy”—requiring students to navigate an ill-structured task in an unstructured environment. Students noted that having scheduling flexibility was helpful, because their circumstances were so unpredictable, but that the extent of the flexibility was “daunting” and made time management difficult. They expressed concern that they did not know how much progress they were expected to make each day, and they struggled to define when the workday should start and end. The lack of clarity regarding how much to work and what was expected of them left some feeling like they had “to work on their project at all times” and prompted some to work longer hours. Others felt as though they had extra time that could have been used more productively. If they had been on-site, they would have sought additional things to do, but they were not sure how to do this at a distance. Having mentors with more of a “hands-off” approach exacerbated these issues. While students clearly needed some flexibility, leaving structures entirely to individual research groups (e.g., whether and how frequently mentors meet with students) was problematic.

    Students in four programs indicated that their programs provided important structure to help them stay on track throughout the summer. Indeed, a growing body of research indicates how structure in the form of policies and procedures helps to ensure equitable engagement and success of all students regardless of their backgrounds or prior preparation (Hurtado et al., 2008; Balster et al., 2010; Tanner, 2013; Eddy and Hogan, 2014). One program required students to prepare a research proposal and complete other mandatory assignments, which helped them “refocus” and “make sure [they] knew what [they] were talking about.” They explained that “the more mandatory assignments [they] had, the more on track [they were] because they had to force [themselves] to reevaluate [their] understanding and application [of their knowledge and skills].” Other programs had regular meetings with program leadership, such as start-of-week check-ins, that ensured they set goals and gauged progress on a regular basis and got feedback and help before too much time had passed if they were off track.

    Students across programs made several suggestions for adding structure that would have allowed them to better gauge whether they were on track in their research and programs, including:

    • defining a daily or weekly schedule or offering suggested schedules, including the expected number of hours per day (even “clocking in”) and whether and how often they should take breaks to prevent burnout;

    • defining “checkpoints,” “check-ins,” “assignments,” or “intermediate goals” throughout the program to help with gauging progress and avoid tasks “hitting [them] all at once” at the end;

    • ensuring mentors set aside time every day or two or schedule standing meetings to provide guidance and instruction;

    • requiring students to write brief weekly updates or reports for their mentors to check to ensure they are making sufficient progress;

    • scheduling midpoint progress meetings to get feedback from mentors about the progress they have made, the quality of the work they have completed, and goals and potential improvements for the remainder of the summer;

    • providing a list of optional tasks or recommendations for what students could be doing if they had extra time, such as additional reading, writing, or analysis tasks, working on other projects when they have downtime on their main project, and additional skill building; and

    • hosting one or two sessions with mentors or program leadership to share how they manage their workdays and brainstorm strategies for time management (e.g., what to do, in what order, and when to get things done by) and structure that helps them to “organize their day, set priorities, and meet goals.”

    Some of the students who made these suggestions thought that increased structure would not only help them better gauge their progress, but would also help them avoid distractions and “set firmer boundaries with family members during times they have set aside for working.” Some students shifted to creating their own structure to mitigate the lack of structure inherent to working from home, including “making a daily checklist … that motivated me to get things done in the day” and “mak[ing] a [physical] workplace that’s separate from where you rest, just so you can separate working life better.”

    Theme 6. Program Logistics: Students Described Operational Aspects of Programs, Including Onboarding, Meetings, Communication, and Pacing, That Improved or Undermined Their Experience.

    Students in 15 programs indicated that several aspects of how their programs operated made it possible to navigate the program smoothly at a distance. These aspects included frequent meetings with their mentors, their cohorts, and/or the program leadership; clear and open communication between students, mentors, and program leadership; and proper program pacing. Students reported that the inclusion of frequent meetings, such as daily meetings with their mentors and weekly meetings in their programs, helped them to stay focused and motivated and to feel connected with others in the community despite being physically distant from them. They also noted that these meetings made communication easy to maintain and were important for their success in the program, helping them “feel a little bit more connected and less on my own.” Students also noted that regular communication in advance, such as weekly announcements of upcoming events and other key information, made it easier to ensure they were in the right places at the right times and had sufficient time to plan their research around program activities. Students appreciated having access to this information in a single location or platform so they could find it when they needed it. Students in several programs commented that their programs started more slowly, helping them acclimate to working online at a distance and to get up to speed on their research. They also appreciated that pacing changed over time, allowing more time as the summer progressed to focus more on research and less on program activities.

    Students in 17 programs commented that some logistical elements were missing, which compromised their overall experience. Examples included poor or sporadic communication, uneven program pacing, and difficulties with getting started in their programs. Regarding communication, students reported wanting more open and consistent communication among participants, their mentors, and program leadership. For instance, some students reported getting announcements on multiple platforms, which led to confusion about where and when to find needed information. In some instances, announcements came with such short notice that students missed activities. Other students expressed concern that their mentors seemed unaware of program activities, which resulted in these activities feeling separated from or in conflict with their research activities. In these instances, students felt like they had to choose between their program responsibilities and furthering their research. Students suggested that Summer program calendars be shared with mentors to alleviate confusion. They also suggested scheduling events at a particular time and communicating these times with mentors and students sufficiently far in advance to allow for planning. Students indicated that mentors needed to seek mentee input when scheduling meetings, as everyone had different schedules, often in different time zones.

    Students in multiple programs struggled with program pacing. They expressed concerns about pacing both within a day and across the summer. Day-to-day, students emphasized the importance of limiting the number of online meetings and sticking to schedules rather than letting meetings run over time. Students indicated that program activities should be evenly spread throughout the summer, rather than front-loaded at the beginning. This change would allow for more time to start research and enable just-in-time guidance and support, such as writing workshops when students would be writing instead of early in the summer. Finally, given the remote nature of the programs, students needed functional computers, software, and network access as well as institutional credentials to access institutional resources and functions.

    Theme 7. Professional Socialization: Students Described How Programs Helped Them Gain Insight into Graduate Education and Research Careers and to Envision Themselves Pursuing Further Education and Careers in Science.

    Students in 15 programs indicated that their programs facilitated their professional socialization despite the remote circumstances. One approach that programs used to accomplish this was to host online sessions related to graduate education, including webinars about fellowships and funding opportunities, panels with current graduate students, and workshops for GRE preparation.1 Students found it inspiring to hear from current doctoral students and learn about the many different paths they could take to graduate school. One student highlighted how an NSF grant workshop was so “motivating” that it “inspired [them] to get [their] academics in order [so that they could] get research opportunities in the future, and eventually get to graduate school.” Several students noted that these sessions served as a “mental health break” from the challenging work they were doing in their research.

    In addition to engaging students in research, programs supported students’ professional socialization by hosting sessions highlighting the diversity of research careers. Typically, these sessions involved panels of scientists from a wide range of fields, careers, and backgrounds, providing students with insights into “what it’s really like to be a researcher, the good and the bad” and helping them to discern whether they would like to pursue a career in research. Students noted that a major advantage of online panels was that they met scientists from a wide variety of fields from all over the country, which they thought might not have happened if the program had been in person. Some students felt their programs could have done more to integrate them into the research community. Typically, these programs did not offer workshops related to graduate school preparation or had limited, if any, interactions with speakers, panelists, and other students.

    Through attending workshops about graduate school, hearing from current doctoral students and scientists during panels, and doing research, students reported feeling that they had “found their purpose.” For instance, one student indicated that “I live close to [a Native American] reservation, and I’m a [member of this tribe], too. It was hard to not be able to do anything for my people [during the pandemic] … I didn’t know how to help out. When I heard about this research experience, it was like, ‘Hey, this is how I can actually help in some way.’” More generally, students also commented on developing “confidence in [themselves] … and what kind of research [they] want to do” and “reassurance that [they] can do this and that this is something that [they] can see [themselves] pursuing.”

    Theme 8. Networking: Students Described Opportunities to Meet and Build Relationships with Others Who May Be Helpful to Their Learning and Career Development.

    Students in six programs explained how their programs provided opportunities to meet and build relationships with faculty, other professionals, graduate students, and peers who could help them learn or otherwise advance toward achieving their education or career goals. Several students felt that they had plenty of opportunities to “expand their network.” For some, networking mitigated the feeling of being isolated, with students explaining that “if we didn’t get to meet as many people from [the institution] as we did, the [remote] experience would have been significantly more isolating.” In fact, some students commented that “the most impactful” thing they got out of their research experience was the connections they made throughout the summer; as one student described it: “The community was something that was really helpful for me, especially looking at the network of resources and the networks of labs to join for possible next steps in my future as well as the future of my research.” Several students expressed how grateful they were to finish their programs feeling like they had met people who could help them as they progress in their careers. One student commented that, before their experience, they did not realize how collaborative the scientific community was and thought that it was “really awesome to see that, from this one opportunity, [they] now have connections to [so many] different places.”

    Students indicated that programs supported networking in multiple ways. Some programs encouraged students to talk and work with lab groups and mentors other than their own. Other programs took advantage of the remote circumstances to organize cross-program activities and invite individuals from all around the country and even around the world to meet with students as speakers, panelists, and collaborators, thereby expanding students’ connections far beyond what might have occurred in person. Students who participated in these opportunities appreciated connecting to researchers both within and beyond their programs and were grateful that this enabled them to be able to work with mentors with expertise in their research interests. Students in some programs had the opportunity to help choose speakers and organize seminars. One student explained that this was an advantage of a remote program, because they had “a wider range of speakers because we can reach people all over the world right now,” and how “hearing from a researcher in [another country] was especially exciting.” Having informal settings for interaction was another tactic that supported networking. For instance, one program had weekly check-ins with the directors, which one student indicated was their favorite part of their program.

    Even in programs in which students noted networking as a strength, this varied by lab group, with some groups fostering more connections than others. Several students heard from their peers about interacting with graduate students and wished they had more opportunities to do so. Students also expressed a desire to develop relationships with faculty other than their own mentors; they felt they had learned so much from their own mentors that their experiences could only be enhanced by learning from others. Some specifically wanted to hear from faculty members about topics “beyond research,” such as “how they became a scientist and [how they view] lab culture,” and suggested that meet-and-greet hours with faculty would be an impactful way to facilitate these connections. Others suggested that having their work reviewed by more than one mentor would afford opportunities to get more feedback and build rapport with additional mentors. Students acknowledged a personal “responsibility to network and make those connections,” while also noting that programs share responsibility for facilitating networking, especially given how challenging this was to do remotely.

    Theme 9. Technological Issues: Students Described Issues with Technology That Undermined or Limited Their Experience.

    Students in five programs reported several issues with technology that compromised their research progress and their overall experience. First, some students had difficulty accessing communication platforms (e.g., an institutional learning management system), either because they did not have the appropriate credentials or because the platform itself was “confusing to navigate” or “hard to use.” Second, some students described how their programs used multiple communication platforms, which made it “easy to miss things” when certain events or activities were announced on one platform but other key information was posted on a different one. Third, some students did not have suitable Internet connections, access to a computer with sufficient computing capacity, or the credentials needed to access required software. Programs identified these issues and PIs were responsive to student needs, yet resolving the issues took time, which limited the progress students felt they could make in their research. Finally, some students indicated that they did not have enough support with coding or learning to code. Several of these students explained that, by the second half of their programs, they had found someone they could ask for coding help when needed. Yet they wished these connections had been established for everyone early in the summer so that all students had equal access to support and could have made better progress.

    Interestingly, no students identified technology as an area of strength for their programs, possibly because students expected technology to work and thus noticed it only when their expectations were not met. Students who reported technology issues made three suggestions for preventing these issues or mitigating their impacts in the future. First, they recommended selecting a common, easy-to-use platform for communication, such as group messaging (e.g., GroupMe, Slack) or email lists. Second, they recommended setting up institutional credentials and conducting technology audits in advance of program start dates by determining the technological needs of each research project and the computing and Internet capacity available to each student. If the needs exceed the capacity, there should be sufficient time to ship suitable computers (as was done by the Summer Integrative Neuroscience Experience in Jupiter at Florida Atlantic University), set up improved Internet access, and ensure students have the needed credentials in place. Finally, they recommended making transparent to all students which individuals could provide coding support; this support could come from the research group, the program, and/or the institution, depending on needs and resources.

    Theme 10. Diversity, Equity, Inclusion, Justice, and Representation: Students Described How Programs Created Time and Space to Discuss Social Justice Topics.

    A review of the REU program profiles (see Supplemental Material) shows that all programs facilitated at least one formal or informal discussion or event regarding diversity, inclusivity, social justice, or anti-racism. However, students in only three programs mentioned this as a strength. Students in two programs spoke about how their programs scheduled time to discuss issues around DEIJ. These students noted that engaging with the larger national conversation about social justice made them feel as though they were “people and not just scientists.” They also appreciated the opportunity to bring their whole selves to the research experience and to be encouraged to “talk how they like to talk.” One student explained that offering remote REU programs allowed for participation in research by people with disabilities or other circumstances that prevented traveling to a distant REU Site. Another student indicated that they had not previously imagined applying to graduate school but found it “inspiring” to hear from graduate students who took nontraditional paths to graduate school.

    The relative absence of student comments about diversity, equity, inclusion, justice, and representation is especially noteworthy given that the programs took place in Summer 2020, just months after the killings of Ahmaud Arbery, Breonna Taylor, and George Floyd and at the height of national consciousness raised by the Black Lives Matter movement. The #ShutDownAcademia/#ShutDownSTEM strike occurred on June 10, 2020, when all of the programs in this study were in session. It is also noteworthy that the NSF REU program prioritizes engagement of persons excluded because of ethnicity or race (Asai, 2020). It is possible that these discussions occurred and were simply not reported during focus groups. It is also possible that DEIJ activities or events were too limited in scope or too disconnected from other aspects of programming to be perceived as a strength. For instance, in one program that held multiple events related to diversity and inclusion in STEM, students explicitly highlighted representation and DEIJ as an area of weakness because of the absence of people of color in workshops and seminars. They also mentioned that they would have appreciated receiving advice from individuals from more economically diverse backgrounds and career paths “other than ‘went to undergrad, went to grad school, got a job, paid off my loans.’” This finding again underscores the need to restructure higher education such that DEIJ is an integral element rather than an add-on activity. Fortunately, there is a growing body of research on how to engage in difficult dialogues that can be used to ensure that REU programs dedicate time and create safe spaces for discussing the value of diversity, ways to ensure equity and promote inclusion, and the importance of justice (Page, 2008; Sue et al., 2009; Tienda, 2013; Asai and Bauerle, 2016; Asai, 2020). At least some of this research has been translated into practical actions that could be applied to REU Sites (Tanner and Allen, 2007; Tanner, 2013; Seidel et al., 2015; Braun et al., 2018; Harrison and Tanner, 2018; Gin et al., 2020; Pfeifer et al., 2020). Future programming should ensure that time and space are dedicated to these important discussions and that the voices and experiences of people of color are integrated throughout programming, tapping local experts in diversity offices and centers for teaching and learning for guidance.

    CONCLUSIONS

    When considered collectively, these results indicate that remotely implemented REU programs can, at least under certain circumstances, afford many of the same opportunities that in-person programs offer. These results should provide some reassurance that remote REUs are worth offering and may provide advantages beyond, or in addition to, in-person programming. For example, remote programs could engage undergraduates in research who would otherwise be unable to participate in an on-site program because of their personal situations. In-person programs could also adopt some of the strategies used during remote programming, such as networking across programs and holding sessions by video conference so that students can interact with speakers, panelists, and collaborators beyond those who are available on-site. Our results also indicate that several elements of REUs were more challenging to implement at a distance. To assist the community in planning future offerings of REU programs and overcoming these challenges, we curated students’ feedback along with relevant guidance from the literature into a single set of recommendations, organized according to the themes reported here (Figure 5).


    FIGURE 5. Recommendations for remote REU Sites. During the focus groups, students offered a number of recommendations for maximizing the quality of their experiences in remote REUs, compiled here and complemented by guidance drawn from the relevant literature cited in the Results and Discussion.

    Our results raise several questions that should be addressed in future research. For example, what professional development and support structures are needed to ensure the quality and effectiveness of remote mentorship relationships? To what extent do remote REU Sites allow engagement of undergraduates in research who would otherwise not have such opportunities? Do students in remote REU Sites pursue graduate education and research-related careers at the same level as students who complete in-person programs? Could REU Sites involve some students in person and others at a distance without creating inequitable experiences among members of the cohort or their mentors? What are the experiences of faculty and others who mentor undergraduates in research, and how do these experiences compare with in-person programs? Although these questions should be pursued with caution to avoid disadvantaging those who participate in research remotely, our results provide evidence that remote REUs are sufficiently positive to allow for further investigation of their affordances and constraints.

    It is important to note that the study reported here is descriptive and evaluative in nature rather than a comparison of outcomes of remote versus in-person REU programs or a causal test of whether certain variables influence the effectiveness or inclusiveness of remote REUs. We have strived to keep our reporting of the results descriptive and, when possible, to highlight other research that is useful for understanding the observations and for improving remote REU programs in the future.

    FOOTNOTES

    1 Although this was not a focus of any of the discussions, it is important to note that the GRE is increasingly being dropped as a requirement for graduate applications in the life sciences, and some programs no longer allow applicants to report GRE scores. These decisions are driven by the growing number of studies showing the lack of predictive validity of the GRE for success in life science doctoral programs (e.g., Hall et al., 2017; Moneta-Koehler et al., 2017; for a comprehensive list, see https://beyondthegre.org/grexit).

    ACKNOWLEDGMENTS

    We thank all of the students, faculty, and other research mentors for their willingness to proceed with remote REU programming and for sharing their experiences so that others could learn. We also thank Riley Hess for her feedback on drafts of this article. This material is based upon work supported by the NSF under grant no. DBI-2030530. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of any of the funding organizations. The authors dedicate this work to all of the undergraduates seeking to do research and the individuals who provide these opportunities despite challenging circumstances.

    REFERENCES

  • Alford, R. F., Leaver-Fay, A., Gonzales, L., Dolan, E. L., & Gray, J. J. (2017). A cyber-linked undergraduate research experience in computational biomolecular structure prediction and design. PLoS Computational Biology, 13(12), e1005837. https://doi.org/10.1371/journal.pcbi.1005837
  • Asai, D. J. (2020). Race matters. Cell, 181(4), 754–757.
  • Asai, D. J., & Bauerle, C. (2016). From HHMI: Doubling down on diversity. CBE—Life Sciences Education, 15(3), fe6. https://doi.org/10.1187/cbe.16-01-0018
  • Balster, N., Pfund, C., Rediske, R., & Branchaw, J. (2010). Entering Research: A course that creates community and structure for beginning undergraduate researchers in the STEM disciplines. CBE—Life Sciences Education, 9(2), 108–118. https://doi.org/10.1187/cbe.09-10-0073
  • Braun, D. C., Clark, M. D., Marchut, A. E., Solomon, C. M., Majocha, M., Davenport, Z., ... & Gormally, C. (2018). Welcoming Deaf students into STEM: Recommendations for university science education. CBE—Life Sciences Education, 17(3), es10. https://doi.org/10.1187/cbe.17-05-0081
  • Byars-Winston, A., & Dahlberg, M. (2019). The science of effective mentorship in STEMM. Washington, DC: National Academies Press. https://doi.org/10.17226/25568
  • Clark, I. E., Romero-Calderón, R., Olson, J. M., Jaworski, L., Lopatto, D., & Banerjee, U. (2009). “Deconstructing” scientific research: A practical and scalable pedagogical tool to provide evidence-based science instruction. PLoS Biology, 7(12), e1000264. https://doi.org/10.1371/journal.pbio.1000264
  • Collison, G., Elbaum, B., Haavind, S., & Tinker, R. (2000). Facilitating online learning: Effective strategies for moderators. Madison, WI: Atwood Publishing.
  • Dolan, E. L., & Weaver, G. C. (2021). A guide to course-based undergraduate research (1st ed.). New York, NY: Macmillan Higher Education.
  • Eby, L. T., Allen, T. D., Hoffman, B. J., Baranik, L. E., Sauer, J. B., Baldwin, S., ... & Evans, S. C. (2013). An interdisciplinary meta-analysis of the potential antecedents, correlates, and consequences of protégé perceptions of mentoring. Psychological Bulletin, 139(2), 441–476. https://doi.org/10.1037/a0029279
  • Eddy, S. L., & Hogan, K. A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE—Life Sciences Education, 13(3), 453–468. https://doi.org/10.1187/cbe.14-03-0050
  • Estrada, M., Woodcock, A., Hernandez, P. R., & Schultz, W. P. (2011). Toward a model of social influence that explains minority student integration into the scientific community. Journal of Educational Psychology, 103(1), 206–222. https://doi.org/10.1037/a0020743
  • Gentile, J., Brenner, K., & Stephens, A. (2017). Undergraduate research experiences for STEM students: Successes, challenges, and opportunities. Washington, DC: National Academies Press.
  • Gin, L. E., Guerrero, F. A., Cooper, K. M., & Brownell, S. E. (2020). Is active learning accessible? Exploring the process of providing accommodations to students with disabilities. CBE—Life Sciences Education, 19(4), es12. https://doi.org/10.1187/cbe.20-03-0049
  • Grimes, D. A., & Schulz, K. F. (2002). Descriptive studies: What they can and cannot do. The Lancet, 359(9301), 145–149.
  • Hall, J. D., O’Connell, A. B., & Cook, J. G. (2017). Predictors of student productivity in biomedical graduate school applications. PLoS ONE, 12(1), e0169121. https://doi.org/10.1371/journal.pone.0169121
  • Harrison, C., & Tanner, K. D. (2018). Language matters: Considering microaggressions in science. CBE—Life Sciences Education, 17(1), fe4. https://doi.org/10.1187/cbe.18-01-0011
  • Hurtado, S., Cabrera, N. L., Lin, M. H., Arellano, L., & Espinosa, L. L. (2008). Diversifying science: Underrepresented student experiences in structured research programs. Research in Higher Education, 50(2), 189–214. https://doi.org/10.1007/s11162-008-9114-7
  • Kim, A. J. (2006). Community building on the web: Secret strategies for successful online communities. Berkeley, CA: Peachpit Press.
  • Laursen, S., Hunter, A.-B., Seymour, E., Thiry, H., & Melton, G. (2010). Undergraduate research in the sciences: Engaging students in real science. Hoboken, NJ: Wiley.
  • Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. New York, NY: Cambridge University Press.
  • Limeri, L. B., Asif, M. Z., Bridges, B. H. T., Esparza, D., Tuma, T. T., Sanders, D., ... & Dolan, E. L. (2019). “Where’s my mentor?!” Characterizing negative mentoring experiences in undergraduate life science research. CBE—Life Sciences Education, 18(4), ar61. https://doi.org/10.1187/cbe.19-02-0036
  • Lopatto, D., & Tobias, S. (2010). Science in solution: The impact of undergraduate research on student learning. Washington, DC: Council on Undergraduate Research.
  • McDevitt, A. L., Patel, M. V., & Ellison, A. M. (2017). Three decades as an NSF REU Site: Lessons and recommendations. Retrieved from bioRxiv: 162289. https://doi.org/10.1101/162289
  • Means, B., Bakia, M., & Murphy, R. (2014). Learning online: What research tells us about whether, when and how. New York, NY: Routledge.
  • Miles, M. B., Huberman, A. M., & Saldana, J. (2014). Qualitative data analysis: A methods sourcebook (3rd ed.). Thousand Oaks, CA: Sage.
  • Moneta-Koehler, L., Brown, A. M., Petrie, K. A., Evans, B. J., & Chalkley, R. (2017). The limitations of the GRE in predicting success in biomedical graduate school. PLoS ONE, 12(1), e0166742. https://doi.org/10.1371/journal.pone.0166742
  • Neckers, D. C. (1982). The threat to undergraduate research. Journal of Chemical Education, 59(4), 329. https://doi.org/10.1021/ed059p329
  • Page, S. E. (2008). The difference: How the power of diversity creates better groups, firms, schools, and societies. Princeton, NJ: Princeton University Press.
  • Palloff, R. M., & Pratt, K. (2007). Building online learning communities: Effective strategies for the virtual classroom. Hoboken, NJ: Wiley.
  • Patton, M. Q. (2008). Utilization-focused evaluation. Thousand Oaks, CA: Sage.
  • Pfeifer, M. A., Reiter, E. M., Hendrickson, M., & Stanton, J. D. (2020). Speaking up: A model of self-advocacy for STEM undergraduates with ADHD and/or specific learning disabilities. International Journal of STEM Education, 7(1), 1–21.
  • Qiang, Z., Obando, A. G., Chen, Y., & Ye, C. (2020). Revisiting distance learning resources for undergraduate research and lab activities during COVID-19 pandemic. Journal of Chemical Education, 97(9), 3446–3449. https://doi.org/10.1021/acs.jchemed.0c00609
  • Ragins, B. R., & Cotton, J. L. (1999). Mentor functions and outcomes: A comparison of men and women in formal and informal mentoring relationships. Journal of Applied Psychology, 84(4), 529.
  • Rovai, A. P. (2002). Development of an instrument to measure classroom community. The Internet and Higher Education, 5(3), 197–211. https://doi.org/10.1016/S1096-7516(02)00102-1
  • Saldana, J. (2015). The coding manual for qualitative researchers. Thousand Oaks, CA: Sage.
  • Seidel, S. B., Reggi, A. L., Schinske, J. N., Burrus, L. W., & Tanner, K. D. (2015). Beyond the biology: A systematic investigation of noncontent instructor talk in an introductory biology course. CBE—Life Sciences Education, 14(4), ar43. https://doi.org/10.1187/cbe.15-03-0049
  • Simon, H. A. (1977). The structure of ill-structured problems. In Models of discovery (Boston Studies in the Philosophy of Science, Vol. 54, pp. 304–325). Dordrecht, Netherlands: Springer.
  • Sue, D. W., Lin, A. I., Torino, G. C., Capodilupo, C. M., & Rivera, D. P. (2009). Racial microaggressions and difficult dialogues on race in the classroom. Cultural Diversity and Ethnic Minority Psychology, 15(2), 183.
  • Tanner, K., & Allen, D. (2007). Cultural competence in the college biology classroom. CBE—Life Sciences Education, 6(4), 251–258. https://doi.org/10.1187/cbe.07-09-0086
  • Tanner, K. D. (2013). Structure matters: Twenty-one teaching strategies to promote student engagement and cultivate classroom equity. CBE—Life Sciences Education, 12(3), 322–331. https://doi.org/10.1187/cbe.13-06-0115
  • Tienda, M. (2013). Diversity ≠ Inclusion: Promoting integration in higher education. Educational Researcher, 42(9), 467–475. https://doi.org/10.3102/0013189X13516164
  • Wenger, E. (1999). Communities of practice: Learning, meaning, and identity. New York, NY: Cambridge University Press.