Article

The Development of Sustainable Assessment during the COVID-19 Pandemic: The Case of the English Language Program in South Korea

1 Department of English Language and Literature, Gachon University, 1342 Seongnamdaero, Sujeong-gu, Seongnam-si, Gyeonggi-do 13120, Korea
2 Department of English Education, Hongik University, Seoul 04066, Korea
* Author to whom correspondence should be addressed.
Sustainability 2021, 13(8), 4499; https://doi.org/10.3390/su13084499
Submission received: 30 March 2021 / Revised: 15 April 2021 / Accepted: 16 April 2021 / Published: 18 April 2021
(This article belongs to the Special Issue Sustainable Assessment in the Education System in the Age of COVID-19)

Abstract

The COVID-19 pandemic has posed challenges to educational systems around the world. In particular, the pandemic's impact on language learning environments has forced a shift from traditional in-person to online language teaching. This paper examines the case of an English language program in South Korea to investigate how the sudden transition to online language teaching has influenced language instructors' teaching and assessment practices. The study also examines instructors' and students' levels of satisfaction with the changing form of English language teaching and assessment practices. Results showed that instructors formed a professional learning community and engaged in regular communication to develop new assessment practices that were process-oriented and formative. Instructors also assigned multimodal projects to promote sustainable assessment, in which students could actively use target language forms and structures. Students were highly satisfied with the new forms of language assessment, whereas instructors' satisfaction with their language assessment practices was somewhat low. The findings offer language instructors and educators suggestions for delivering more creative and sustainable language assessment strategies that promote self-regulated learning and sustainable development.

1. Introduction

The COVID-19 pandemic has posed unprecedented challenges to educational systems around the world, from unplanned school closures and home confinement to the abrupt transition to online learning. In particular, the pandemic's impact on language learning environments has rapidly reshaped the landscape of language learning, resulting in a shift from traditional in-person to online language teaching. With little to no preparation for online language teaching, many language programs in higher educational institutions quickly transitioned to online teaching and learning following the outbreak of the current pandemic. Recent studies on emergency online language teaching have provided various testimonials on how higher education language programs have swiftly moved from traditional to online teaching (e.g., Gacs et al. [1] and Ross and DiSalvo [2]), investigated how the form of language instruction has changed during the COVID-19 pandemic (e.g., Moorhouse [3] and Moser et al. [4]), and studied stakeholder perceptions of emergency online learning (e.g., MacIntyre et al. [5] and Park and Chung [6]). For the most part, these studies report promising stakeholder experiences, along with meaningful teaching and learning results within the programs studied.
Despite the successful implementation of online language teaching at a moment's notice to cope with the pandemic, this shift in teaching delivery format has posed various challenges to language instructors and administrators in both English-as-a-second-language (ESL) and English-as-a-foreign-language (EFL) contexts. Focusing on the case of an English language program offered by a private South Korean university, this paper investigates how this sudden transition to online language teaching has influenced language instructors' teaching and assessment practices. In addition, it examines both EFL instructors' and students' levels of satisfaction with the changing form of English language teaching and assessment practices with regard to students' sustainable English language development and use. Drawing on a one-year ethnographic fieldwork study and questionnaires collected from 979 students and 14 EFL instructors, this paper addresses the following three research questions:
  • What are the newly developed forms of language assessment that would ensure students’ sustainable language development and use?
  • How satisfied are EFL students with the new forms of sustainable language assessment practices?
  • How satisfied are EFL instructors with these new forms of sustainable language assessment practices?
The discussion presented here urges educators, school administrators and policymakers to recognize the importance of developing and implementing sustainable language assessment frameworks that ensure continuity from online teaching to online assessment and enhance students' self-regulated and sustainable language development.

2. Literature Review

2.1. Sustainable Assessment

Sustainability in education emphasizes the importance of innovation and sustainability of educational practices. It promotes teaching, learning and assessment practices that can enable students to equip themselves with skills and confidence to become effective lifelong learners (Blewitt [7]; Tarrant and Thiele [8]; and Whitehouse [9]). The United Nations Educational, Scientific and Cultural Organization (UNESCO), for example, states that sustainability in education promotes “learning beyond the boundaries of educational institutions” and enables students to develop “the knowledge, skills and values to address [the] social, environmental and economic challenges of the 21st century” (UNESCO [10]).
In recent years, the importance of developing and implementing sustainable assessment has been emphasized (Boud and Soler [11]; Nguyen and Walker [12]; and Witts [13]). The notion of sustainable assessment was developed to engage students in the development of transferrable skills and to help students achieve sustainable, long-term learning outcomes that could meet their future learning needs (Boud and Soler [11]; and Rodríguez–Gómez and Ibarra–Sáiz [14]). Boud [15], for example, defines sustainable assessment as an instructional tool that enables students to become effective lifelong learners with conscious self-regulation of learning and motivation processes and self-awareness of learning needs and developing strategies. Similarly, Beck, Skinner and Schwabrow [16] state that sustainable and alternative assessment can “develop in students the ability to be sustainable assessors of their own long-term learning skills and to develop assessment devices for student self-monitoring” (p. 328). The advancement of educational technologies has made sustainable assessment more accessible and easier to apply, as technological tools and techniques involve students and teachers in authentic and interactive learning contexts (Beck, Skinner and Schwabrow [16]; Nicol [17]; and Williams [18]).

2.2. Sustainable Assessment in Practice

Sustainable assessment aims to move away from the assessment of learning in which assessment merely includes a process for measuring students’ abilities and educational outcomes. Instead, it aims to create an assessment for learning whereby assessment is embedded into learning processes and facilitates meaningful learning and development for students (Boud and Molloy [19]; Rodríguez–Gómez and Ibarra–Sáiz [14]). Sustainable assessment thus emphasizes “a strong commitment to equity, including shared criteria for long-term learning outcomes and faculty and student monitoring of student progress toward outcomes through periodic [use of] rubrics and reflective sessions” (Beck, Skinner and Schwabrow [16], p. 326). Recent studies on sustainable assessment suggest the following as the key elements of effective sustainable assessment practices:
  • self-awareness and self-assessment: sustainable assessment emphasizes the development of self-assessment and self-directed learning skills through carefully planned and structured self-assessment activities. In the development and implementation of sustainable assessment, it is important to provide students with a learning environment in which they can constantly monitor their performances and progress while receiving sufficient instructional support and feedback to continue their learning (e.g., Cassidy [20]; Fastre et al. [21]; and McDonald [22]).
  • peer-assessment: sustainable assessment commonly combines peer-assessment with self-directed learning to enhance students’ development of autonomous learning abilities and improve their ability to critically analyze and reflect upon their learning performance (e.g., McMahon [23]; and Topping [24]).
  • assessment for learning: sustainable assessment views assessment as part of learning processes rather than serving as an instructional tool for measuring students’ performance or abilities. Beck, Skinner and Schwabrow [16], for example, describe sustainable assessment as a part of a constructive alignment between teaching and assessment practices, which enable students to increase “the ability to be sustainable assessors of their own long-term learning skills and to develop assessment devices for student self-monitoring” (p. 3).
The sustainable assessment approach has received prominent attention during the current pandemic because the sudden outbreak of COVID-19 has caused educational disruption around the world and raised concerns regarding students’ self-regulated learning and sustainable development (Giovannella [25]; and Wargadinata et al. [26]). The importance of developing and implementing sustainable teaching and assessment practices has been strongly highlighted in the field of language education, where the abrupt transition to online language learning caused by the current pandemic has generally limited students’ interaction with one another and with teachers, as well as their authentic and sustainable language use that is closely related to real-time communication (González–Lloret [27]; and Lomicka [28]).

3. Methods

This case study was conducted at the Gachon International Language Center (GILC) at Gachon University, a private South Korean university located in the Seoul metropolitan area. The study was carried out over the course of an academic year (Spring 2020-Fall 2020), using a mixed-methods approach to compile data in two phases. Phase one included ethnographic work and qualitative interviews that allowed the researchers to fully understand and conceptualize the changing nature of teaching practices and assessment procedures carried out during the pandemic. Phase two included questionnaire surveys of GILC instructors and students to identify their respective levels of overall satisfaction with the changing form of English language teaching and assessment practices.

3.1. Research Setting

The research was conducted in GILC, one of the largest EFL programs in the South Korean higher education system. GILC has 32 full-time native-speaking English instructors; two faculty administrators, including a program director and an assistant director; and two full-time staff members. GILC mainly offers a two-semester College English sequence that is required for all newly admitted students at Gachon University. These courses are speaking-intensive, designed to help students practice English communication skills and to enable them to engage in sustainable English language development and use. The courses thus involve a large amount of group work and require students’ active participation and engagement with English language learning and use. Each semester, GILC offers the required College English course in approximately 216 sections and six elective courses in approximately 87 sections, amounting to approximately 6000 seats for undergraduate students at Gachon University.

3.2. Data Collection

Figure 1 illustrates the data collection procedure that was implemented to understand the language assessment practices used by instructors, as well as their perceptions of online language teaching and assessment strategies. Further, the procedure was designed to capture students’ level of satisfaction with their overall online language learning and assessment experiences. The data analyzed in this study are twofold: qualitative data, taken from ethnographic work, and quantitative data, derived from instructor and student questionnaires.

3.2.1. Qualitative Data

We conducted ethnographic fieldwork and qualitative interviews between spring and fall 2020. Data collected included (a) a 50-page document containing faculty meeting notes based on 10 faculty meetings and 15 professional learning community (PLC) meetings, (b) eight hours of informal classroom observations and a 40-page compilation of field notes, (c) a 220-page compendium of teaching materials and activity descriptions and (d) 12 hours of in-depth qualitative interviews with seven focal participants. Qualitative interviews were conducted with each focal participant to gain an in-depth understanding of their experiences and perceptions of newly developed forms of language teaching and assessment. Each interview lasted between one and three hours.

3.2.2. Quantitative Data

We conducted questionnaire surveys with two separate groups: GILC instructors and students who took a course offered by GILC in Fall 2020. The online questionnaires were designed based on ethnographic fieldwork and informal interviews with a group of GILC faculty members and students. The final version of the questionnaires was examined by a program director and a panel of five experts to assess content validity, topic relevance and coverage, and overall comprehensibility. The experts recommended minor revisions, and word-choice changes were made to enhance clarity. The questionnaires contained multiple-choice questions and open-ended questions with space for comments in order to understand GILC instructors’ and students’ overall satisfaction with the newly developed form of language teaching and assessment practices.

Student Questionnaire

A student questionnaire survey was conducted among students who were enrolled in the second semester of a required two-semester College English sequence. The student questionnaire was sent to the students on the final day of classes and was available for one week, before the final grades for the semester were announced. This was to ensure that students had fully experienced the new assessment practices prior to participating in the survey and that their course grades did not influence their perceptions. The university offers a required, two-semester EFL course sequence (Spring-Fall semesters) to all newly admitted freshmen regardless of department or major. The sequence focuses on improving practical communication skills, with a particular emphasis on enhancing oral communication abilities. The sections in the required course are grouped based on college.
The total number of students enrolled in the second semester of the College English sequence was 3800. According to Cohen et al. [29], the conventional sampling strategy is to use a 95 percent confidence level and a 3 percent confidence interval. Hence, a sample of 834 students would be needed to generalize the results to a population of 3800.
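The 834-student figure cited from Cohen et al. follows from the standard sample-size formula (Cochran's formula) with a finite-population correction, using z = 1.96 for a 95 percent confidence level, a 3 percent margin, and the conservative proportion p = 0.5. A minimal sketch of the calculation (the function name is our own, not from the study):

```python
import math

def required_sample_size(population, z=1.96, margin=0.03, p=0.5):
    """Cochran's sample-size formula with a finite-population correction.

    z=1.96 corresponds to a 95% confidence level; margin=0.03 is the
    3% confidence interval; p=0.5 is the most conservative proportion.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population estimate (~1067)
    n = n0 / (1 + (n0 - 1) / population)        # correct for the finite population
    return math.ceil(n)

print(required_sample_size(3800))  # 834, matching the sample size cited in the text
```

Note that the 979 responses actually collected comfortably exceed this threshold.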
The questionnaire was sent to all students enrolled in College English during the Fall 2020 semester. Data collection was conducted using Google Forms, and the survey link was sent to participants via text message. Students were informed about the purpose of the study and that their participation was entirely voluntary. They were informed that non-participation would not affect their grades. They were also assured of the anonymity and confidentiality of the data. In turn, participants gave their consent to participate. A total of 979 students took part in the survey. The response rate was 25.8%, which is fairly consistent with recent research (e.g., Lee [30] and Sax et al. [31]). Specifically, according to Sax et al. [31], response rates for online surveys range from 17.1% to 19.8%. In the current study, the proportions of respondents from the 12 colleges at the university were comparable to their percentage enrollment at the university, suggesting that an unbiased sample was obtained (see Table 1).
The sample included 42% (N = 415) male and 58% (N = 564) female students, with a mean age of 20.15 (SD = 1.88). Of the 979 participants, 14% (N = 136) had previous experience taking an online English course prior to 2020, but 86% (N = 843) indicated that they did not have prior online English learning experience.
In the current study, an online questionnaire survey was used to investigate students’ perceptions of their online EFL experiences, as well as their overall satisfaction with sustainable language assessment. The survey consisted of two sections: (1) participants’ demographic information and (2) level of satisfaction with, and perceptions of, their online language learning experiences. The first section of the questionnaire collected age, gender, major and prior experience with online English instruction. The second section asked students to report their level of satisfaction with their online EFL learning experience in 2020 by responding to three questions:
  • How satisfied were you with the College English courses that were conducted online in 2020?
  • How satisfied were you with your English language use during online College English in 2020?
  • How satisfied were you with the language assessment practices used during online College English in 2020?
The questions used a 5-point Likert scale ranging from 1 to 5, with 1 meaning very dissatisfied and 5 meaning very satisfied. The results from the questionnaire are summarized using means and standard deviations (SD). The internal reliability of the questionnaire was 0.90 (Cronbach’s alpha), which indicates high internal consistency.
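For readers unfamiliar with the reliability statistic: Cronbach's alpha compares the variance of individual items with the variance of respondents' total scores, with values near 1 indicating high internal consistency. An illustrative computation (the response data below are invented for demonstration, not drawn from the study's dataset):

```python
def cronbach_alpha(items):
    """Cronbach's alpha. `items` is a list of equal-length lists,
    one list of respondent scores per questionnaire item."""
    k = len(items)                              # number of items

    def var(xs):                                # sample variance (n - 1 denominator)
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Total score per respondent, summed across all items
    totals = [sum(item[j] for item in items) for j in range(len(items[0]))]
    return k / (k - 1) * (1 - sum(var(item) for item in items) / var(totals))

# Three hypothetical 5-point Likert items answered by five respondents;
# perfectly consistent responses across items yield alpha = 1.0.
scores = [[1, 2, 3, 4, 5], [1, 2, 3, 4, 5], [1, 2, 3, 4, 5]]
print(round(cronbach_alpha(scores), 2))  # 1.0
```

In practice, an alpha of 0.90, as reported here, is conventionally read as high internal consistency for a short satisfaction scale.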

Instructor Questionnaire

Of the 32 instructors contacted via email to participate in the preliminary online questionnaire, 13 participated. Table 2 outlines the background information of the participating instructors.
Among the 13 participants, six responded that they had previous experience teaching English entirely online prior to Spring 2020, and eight responded that they had experience teaching English that involved a combination of online and in-person instruction (e.g., blended learning, flipped classroom). The participants’ mean self-rated ability to use digital media was 4.2 out of 5 (SD = 0.6), which indicates that the participants were confident in their digital literacy.
A questionnaire was developed to understand EFL instructors’ online teaching experiences around two main themes: their perceptions of (1) online teaching support and experiences and (2) online assessment practices. Instructors were asked to indicate their level of agreement or disagreement (1 = Strongly Disagree, 5 = Strongly Agree) with the following six statements:
  • I am satisfied with my experience of teaching online in 2020.
  • I am satisfied with the online teaching training I received in 2020.
  • My online students were actively involved in their learning.
  • Student achievements in my online teaching class were higher than those of my traditional class.
  • The preparation time for online teaching was longer compared to the preparation time for traditional teaching.
  • I am satisfied with my online assessment practices that were used in 2020.

3.3. Data Analysis

The different sets of qualitative data, including observation notes, faculty-meeting minutes and field notes, were analyzed thematically through the constant comparison method derived from grounded theory (Glaser and Strauss [32]). All the qualitative interviews were transcribed verbatim, thematically coded using qualitative content analysis (Mayring [33]), and triangulated with the quantitative findings. The two questionnaires were analyzed in SPSS using frequency, reliability and descriptive statistics.

4. Findings

4.1. The Development of Sustainable Assessment

As the COVID-19 pandemic has continued to evolve, many schools and universities worldwide have been forced to move their courses online and have made extensive adjustments to their teaching practices. In order to ensure the quality of prolonged emergency remote learning, they have actively restructured their teaching programs and curricula, and have provided various instructional materials and resources for their instructors and students (e.g., Gacs et al. [1]; and Johnson and Veletsianos [34], among many others). Like many other English language programs affected by the COVID-19 pandemic, GILC has offered its EFL courses online since Spring 2020. The sudden transition to online learning posed various challenges and difficulties for language instructors and administrators at GILC, who needed not only to quickly switch to online course delivery but also to create interactive and sustainable online language learning environments, with little to no time to prepare and with limited professional preparation for an online approach to teaching. Although the university provided digital platforms for teaching and learning, as well as initial training in programs such as Webex and Moodle, it was mostly left to individual instructors to design their classes to meet their course objectives. In order to successfully adapt their lesson plans to online platforms and ensure the quality of education, all language instructors and administrators at GILC formed a professional learning community (PLC) at the beginning of Spring 2020 and actively participated in it. They held weekly meetings and engaged in regular communication and information sharing via Google Groups and email in an attempt to establish a shared goal and to develop new forms of teaching and assessment practices. This adjusted approach represents a paradigm shift in times of crisis (c.f., DuFour and Eaker [35]; and DuFour, Eaker and Many [36]).
In the PLC, one of the main issues all the instructors actively and collaboratively addressed was restructuring and redesigning courses and their evaluation procedures. Before the outbreak of the COVID-19 pandemic, College English courses offered by GILC generally consisted of oral presentations involving different types of pair- or group-speaking tasks, with the goal of providing students extensive opportunities to use English. They also included two sets of speaking and written examinations administered during midterm and finals weeks. The swift transition to online language learning during the pandemic, however, required GILC instructors to rethink and redesign the ways in which students would interact with one another and with instructors. In addition, as instructors began to carry out online language classes in Spring 2020, they gradually realized that students’ self-regulated language learning and self-efficacy became important variables that affected their degree of successful acquisition and use of the target language in online language classes (c.f., González–Lloret [27]; and Lomicka [28]). During PLC meetings held in Spring 2020, a majority of GILC instructors expressed the immediate need to change language assessment procedures to foster students’ self-regulated and sustainable language learning. This would enable students to actively participate in online language learning and, at the same time, help them develop English language skills that would be transferable to out-of-classroom communicative situations.
Beginning in the middle of Spring 2020, GILC and its PLC started to develop language assessment practices that aligned with online language teaching practices and met students’ unique needs in online language learning classes. The instructors held weekly meetings throughout Spring 2020, collaboratively analyzing course content, restructuring activities and assessment procedures, and developing a new mode of language assessment, as illustrated in Table 3.
As shown in Table 3, the newly developed form of language assessment includes a series of ongoing and alternative language assessments in which students are invited to participate in both self- and peer-evaluation processes while receiving adequate instructional support and feedback from the instructor. With a shared vision of how to assess students’ language learning, GILC instructors implemented the newly developed form of sustainable language assessment during Fall 2020. While GILC instructors had the freedom to choose from a wide range of sustainable assessment types, including portfolios and language learning journals, a majority of the instructors used multimodal projects to assess their students’ learning progress. For example, instructors assigned multimodal projects in which students could actively utilize target language forms and structures using online digital tools. The following is a multimodal project used in one of the College English courses: (a) students create and share a video online in which they express their opinions about a topic covered in a previous class, guided by a rubric that helps them accomplish the task; (b) students provide comments and feedback on one another’s video clips based on a peer-evaluation guide provided by the instructor; and (c) students revise their video clips based on self- and peer-evaluation and write a short self-reflective note.
During the interview, many instructors stated that these multimodal projects allowed students to engage in authentic language interaction and express their thoughts and ideas in creative ways through multiple modes (e.g., auditory, visual, linguistic, spatial and gestural). This was the case even when students may have been limited by their linguistic abilities or had limited face-to-face interaction with their instructors or classmates. Similarly, many participants stated that the use of multimodality in language teaching and learning enhanced their willingness to communicate and lowered their language anxiety—both of which positively contributed to their reportedly increased sense of self-efficacy (see Kim and Belcher [37] for more discussion).

4.2. Students’ Overall Satisfaction with Sustainable Language Assessment

Based on the questionnaire data collected from the 979 students who took a course offered by GILC in Fall 2020, we analyzed students’ responses to questions regarding their satisfaction with their online language learning experiences, their perceived use of language during language instruction and their satisfaction with their online language assessment experiences. Table 4 shows the results obtained from the student questionnaire.
In regard to students’ overall satisfaction with their online language learning experiences, the results showed that students’ level of satisfaction was high (M = 4.0, SD = 0.9). A majority of the students (74.1%) indicated that they were either satisfied (41.7%) or very satisfied (32.4%) with their online learning experiences. Similarly, students’ perceived level of satisfaction regarding how the newly developed form of language learning and assessment enhanced their use of English was also positive (M = 4.0, SD = 0.9). A majority of students (72.8%) reported that they were satisfied (39.2%) or very satisfied (33.6%) with the amount of language used during their online language learning time. Students’ experiences with online language assessment were also positive (M = 3.8, SD = 1.0): 64.1% of the students were satisfied or very satisfied with the assessment practices implemented during online instruction. Overall, students’ satisfaction with online language learning and sustainable language learning assessment was highly positive.

4.3. Instructors’ Satisfaction with Sustainable Language Assessment

Table 5 shows the results from the instructor questionnaire. With regard to instructor perspectives on their online teaching experiences, the level of their overall satisfaction was moderately high (M = 3.9, SD = 0.6), with most instructors agreeing or strongly agreeing (76.9%) that their online teaching experiences were satisfactory. A moderately positive level of satisfaction was indicated by their level of satisfaction with online teaching training they received (M = 3.6, SD = 0.9) and the level of student involvement during online instruction (M = 3.5, SD = 1.3). Instructors’ perceptions of their online students’ achievements were somewhat lower compared to traditional teaching (M = 2.8, SD = 0.8), and the amount of class preparation was higher compared to traditional teaching (M = 4.3, SD = 1.3). On the other hand, instructors’ level of satisfaction in relation to their sustainable language assessment practices was somewhat low (M = 2.8, SD = 1.3). Only four instructors indicated that their experiences with utilizing sustainable language assessment practices were positive.
The qualitative interviews conducted with seven focal participants illustrate some of the variables that influenced their overall level of satisfaction with the newly developed form of language teaching and assessment. During the interviews, all the participants stated that restructuring and redesigning their teaching and assessment practices created significant extra work and responsibilities for them, while little support or training was provided. In addition, while all the participants found that the newly developed method for assessing students’ language development was effective and clearly enhanced students’ sustainable language development and use, the instructors pointed out that implementing sustainable language assessment required constant monitoring of, and feedback on, students’ performance in order to facilitate students’ sustainable language learning. The following interview excerpts illustrate how the development and implementation of sustainable language assessment enhanced job satisfaction but simultaneously increased burnout over time:
“It was really enjoyable to implement sustainable language assessment rather than having two big exams as we used to use. It was great to see my student’s gradual and self-directed language learning progress throughout the semester, and my students said the same thing. But I need to admit the fact this type of language assessment takes a lot of time and efforts—very exhausting.”
(Lois, a female instructor with 21 years of EFL teaching experience)
"I am happy with my semester, but I often felt like I wasn't getting the support I need. It took me forever to prepare and use this type of language assessment in classes… At the end of the semester, one of my students said, 'At first you forced me to talk [through a series of multimodal activities] and I hated you…but thank you. I feel more confident online and I feel more confident talking.' I can see students are truly developing autonomous learning abilities this semester."
(Caleb, with 23 years of EFL teaching experience)
"There should be more support for both instructors and students. We definitely need more professional trainings and administrative support."
(Jake, with 14.5 years of EFL teaching experience)
The above excerpts all indicate that while developing sustainable language assessment can increase students' self-directed and sustainable language development, changing assessment practices can be challenging for instructors, who are accustomed to, and may prefer, summative assessment (cf. Black and Wiliam [38]; Kim [39]). The instructors' statements also highlight that even when instructors are aware of the need for new or improved assessment methods, not all of them have the time or motivation to make the extensive effort required to change their existing assessment practices.

5. Discussion

Effective online language learning requires "the reconstruction of student and instructor roles, relations and practices" (Vonderwell et al. [40]). While coping with the transition from face-to-face to online learning, instructors at GILC developed and implemented sustainable language assessment to align their teaching practices with assessment. The findings of this study illustrate that a majority of students and more than half of the instructors were satisfied with the newly developed form of sustainable language assessment used during Fall 2020. However, it is noteworthy that instructors' perceptions of their online teaching and assessment experiences were mixed: they rated their online teaching experiences positively, but also reported that online teaching demanded a large amount of preparation. Instructors' perceptions of student achievement and of their own online assessment practices were lower than the other variables, indicating that instructors may need more assistance or training to improve their skills in these areas of online teaching.
As shown above, in both the questionnaire and interview data, instructors pointed to the need for more professional development and training to build their capacity to develop and implement sustainable language assessment effectively, and for adequate administrative support such as sufficient budgets, supportive information technologies, and improved course management. Despite the successful adaptation to the new sustainable language assessment examined herein, additional research on equipping language instructors with tools and guidelines for alternative and sustainable language assessment could enhance their collective ability to align learning with assessment. In other words, it is important to provide language instructors with adequate training and appropriate professional development so that they can acquire the skills needed to develop sustainable language assessment (cf. Kim [41]; Lee et al. [42]). As previous research on online learning has pointed out, high dropout rates and students' inconsistent engagement in learning have been a concern for practitioners and administrators (Friedman and Friedman [43]; Lee, Choi and Kim [44]; Park and Choi [45]). These arguments remind us of the importance of consistently monitoring students' satisfaction with teaching and assessment practices while providing practitioners with adequate instructional support.

6. Conclusions

In terms of aligning instruction to assessment, "continuity seems to be the strongest predictor of final grades…and it lends support to most instructors' intuition that among all types of online learning behaviors, engaging with a course on a reliable and regular basis, especially in blended language courses, is what accounts for successful learning in the course" (Rubio et al. [46], p. 245). Online language instruction allows for greater use of online instructional tools, which demands that instructors align their language assessment practices with their instructional practices. Therefore, developing adequate training and instrumental support for implementing sustainable assessment in online language classrooms is necessary to improve the quality of online language teaching and assessment. Moreover, because this study investigates the changes in language assessment practices made in one particular institution, the applicability of the findings may be limited. Future research should identify specific assessment development procedures in relation to course objectives and in-class activities. Furthermore, research on online language assessment constructs may provide language instructors with information to deliver more creative and innovative online language assessment strategies, ones that are better aligned with learning objectives while providing more optimized feedback for learners. Studies are also needed on students' assessment outcomes and on how sustainable assessment practices can be used to develop lifelong learners (e.g., Boud [15]; Boud and Molloy [19]) and equip them with tools that encourage self-regulated learning and assessment strategies. In addition, future research on multimodal pedagogy should address training language instructors to use and develop rubrics for digital composition, which may be used more prevalently in the aftermath of the ongoing COVID-19 pandemic.

Author Contributions

Conceptualization, S.-J.C. and L.-J.C.; formal analysis, S.-J.C. and L.-J.C.; investigation, S.-J.C. and L.-J.C.; methodology, S.-J.C. and L.-J.C.; writing—original draft, S.-J.C. and L.-J.C.; writing—review & editing, S.-J.C. and L.-J.C. Both authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Gacs, A.; Goertler, S.; Spasova, S. Planned online language education versus crisis-prompted online language teaching: Lessons for the future. Foreign Lang. Ann. 2020, 53, 380–392. [Google Scholar] [CrossRef]
  2. Ross, A.F.; DiSalvo, M.L. Negotiating displacement, regaining community: The Harvard Language Center’s response to the COVID-19 crisis. Foreign Lang. Ann. 2020, 53, 317–379. [Google Scholar] [CrossRef] [PubMed]
  3. Moorhouse, B.L. Adaptations to a face-to-face initial teacher education course ‘forced’ online due to the COVID-19 pandemic. J. Educ. Teach. 2020, 46, 609–611. [Google Scholar] [CrossRef] [Green Version]
  4. Moser, K.M.; Wei, T.; Brenner, D. Remote teaching during COVID-19: Implications from a national survey of language educators. System 2021, 97, 1–15. [Google Scholar] [CrossRef]
  5. Macintyre, P.D.; Gregersen, T.; Mercer, S. Language teachers’ coping strategies during the Covid-19 conversion to online teaching: Correlations with stress, wellbeing and negative emotions. System. 2020, 94, 1–13. [Google Scholar] [CrossRef]
  6. Park, H.; Chung, S.J. An exploration of elementary students’ level of satisfaction with online EFL instruction. Multimed. Assist. Lang. Learn. 2020, 23, 339–358. [Google Scholar]
  7. Blewitt, J. Sustainability and lifelong learning. In The Sustainability Curriculum: The Challenge for Higher Education; Earthscan: London, UK, 2004; pp. 24–42. [Google Scholar]
  8. Tarrant, S.P.; Thiele, L.P. Practice makes pedagogy—John Dewey and skills-based sustainability education. Int. J. Sustain. High. Educ. 2016, 17, 54–67. [Google Scholar] [CrossRef]
  9. Whitehouse, H. Cross-sectorial relationships for education for sustainability. In Education for Sustainability in Tourism; Springer: Berlin, Germany, 2015; pp. 117–132. [Google Scholar]
  10. UNESCO World Conference on ESD 2014. Five Reasons to Support ESD-Education for Sustainable Development. Available online: https://en.unesco.org/themes/education-sustainable-development (accessed on 20 March 2021).
  11. Boud, D.; Soler, R. Sustainable assessment revisited. Assess. Eval. High. Educ. 2016, 41, 400–413. [Google Scholar] [CrossRef] [Green Version]
  12. Nguyen, T.; Walker, M. Sustainable assessment for lifelong learning. Assess. Eval. High. Educ. 2016, 41, 97–111. [Google Scholar] [CrossRef]
  13. Witts, J. Sustainable assessment: Developing lifelong learners. In Education for Sustainable Development in Further Education: Embedding Sustainability into Teaching, Learning and the Curriculum; Springer: Berlin, Germany, 2016; pp. 77–92. [Google Scholar]
  14. Rodríguez-Gómez, G.; Ibarra-Sáiz, M.S. Assessment as learning and empowerment: Towards sustainable learning in higher education. In Sustainable Learning in Higher Education: Innovation, Technology, and Knowledge Management; Springer: Cham, Switzerland, 2015. [Google Scholar] [CrossRef]
  15. Boud, D. Sustainable assessment: Rethinking assessment for the learning society. Stud. Contin. Educ. 2000, 22, 151–167. [Google Scholar] [CrossRef]
  16. Beck, R.J.; Skinner, W.F.; Schwabrow, L.A. A study of sustainable assessment theory in higher education tutorials. Assess. Eval. High. Educ. 2013, 38, 326–348. [Google Scholar] [CrossRef]
  17. Nicol, D. Laying a foundation for lifelong learning: Case studies of e-assessment in large 1st-year classes. Br. J. Educ. Technol. 2007, 38, 668–678. [Google Scholar] [CrossRef]
  18. Williams, P. Assessing context-based learning: Not only rigorous but also relevant. Assess. Eval. High. Educ. 2008, 33, 395–408. [Google Scholar] [CrossRef]
  19. Boud, D.; Molloy, E. Feedback in Higher and Professional Education: Understanding It and Doing It Well; Routledge: London, UK, 2013. [Google Scholar]
  20. Cassidy, S. Assessing ‘inexperienced’ students’ ability to self-assess: Exploring links with learning style and academic personal control. Assess. Eval. High. Educ. 2007, 32, 313–330. [Google Scholar] [CrossRef]
  21. Fastré, G.M.J.; Van der Klink, M.R.; Sluijsmans, D.; Van Merriënboer, J.J.G. Towards an integrated model for developing sustainable assessment skills. Assess. Eval. High. Educ. 2013, 38, 611–630. [Google Scholar] [CrossRef] [Green Version]
  22. McDonald, B. Self-assessment for understanding. J. Educ. 2007, 188, 25–40. [Google Scholar] [CrossRef]
  23. McMahon, T. Combining peer-assessment with negotiated learning activities on day-release undergraduate-level certificate course (ECTS Level 3). Assess. Eval. High. Educ. 2010, 35, 223–239. [Google Scholar] [CrossRef]
  24. Topping, K. Self and peer assessment in school and university: Reliability, validity and utility. In Optimising New Modes of Assessment: In Search of Qualities and Standards: Innovation and Change in Professional Education; Springer: Dordrecht, The Netherlands, 2003. [Google Scholar] [CrossRef]
  25. Giovannella, C. Languaging: Effect induced by the Covid-19 pandemic on students’ perception about technologies and distance learning. In Ludic, Co-Design and Tools Supporting Smart Learning Ecosystems and Smart Education: Smart Innovation, Systems and Technologies; Springer: Singapore, 2021; pp. 105–116. [Google Scholar]
  26. Wargadinata, W.; Iffat, M.; Dewi, E.; Zainu, R. Student’s responses on learning in the early COVID-19 pandemic. Tadris: J. Educ. Teach. Train. 2020, 5, 141–153. [Google Scholar] [CrossRef]
  27. González–Lloret, M. Collaborative tasks for online language teaching. Foreign Lang. Ann. 2020, 53, 260–269. [Google Scholar] [CrossRef]
  28. Lomicka, L. Creating and sustaining virtual language communities. Foreign Lang. Ann. 2020, 53, 306–313. [Google Scholar] [CrossRef]
  29. Cohen, L.; Manion, L.; Morrison, K.R.B. Research Methods in Education; Routledge: London, UK, 2000. [Google Scholar]
  30. Lee, J.J. International students’ experiences and attitudes at a US host institution: Self-reports and future recommendations. J. Res. Int. Educ. 2010, 9, 66–84. [Google Scholar] [CrossRef]
  31. Sax, L.J.; Gilmartin, S.K.; Bryant, A.N. Assessing response rates and nonresponse bias in web and paper surveys. Res. High. Educ. 2003, 44, 409–432. [Google Scholar] [CrossRef]
  32. Glaser, B.G.; Strauss, A.L. The Discovery of Grounded Theory: Strategies for Qualitative Research; Aldine Publishing: Chicago, IL, USA, 1967. [Google Scholar]
  33. Mayring, P. Qualitative Content Analysis: Theoretical Foundation, Basic Procedures and Software Solution; Social Science Open Access Repository (SSOAR): Klagenfurt, Austria, 2014. Available online: https://www.ssoar.info/ssoar/handle/document/39517 (accessed on 20 March 2021).
  34. Johnson, N.; Veletsianos, G.; Seaman, J. Faculty and administrators’ experiences and approaches in the early weeks of the COVID-19 pandemic. Online Learn. 2020. Available online: https://olj.onlinelearningconsortium.org/index.php/olj/article/view/2285 (accessed on 20 March 2021).
  35. DuFour, R.; Eaker, R. Learning by Doing: A Handbook of Professional Learning Communities at Work; Solution Tree: Bloomington, IN, USA, 2006. [Google Scholar]
  36. DuFour, R.; Eaker, R.; Many, T. Professional Learning Communities at Work: Best Practices for Enhancing Student Achievement; Association for Supervision and Curriculum Development: Alexandria, VA, USA, 1998. [Google Scholar]
  37. Kim, Y.; Belcher, D. Multimodal compositing and traditional essays: Linguistic performance and learner perceptions. RELC J. 2020, 51, 86–100. [Google Scholar] [CrossRef]
  38. Black, P.; Wiliam, D. Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan 1998, 80, 139–148. [Google Scholar] [CrossRef] [Green Version]
  39. Kim, H. Teacher Learning Opportunities Provided by Implementing Formative Assessment Lessons: Becoming Responsive to Student Mathematical Thinking. Int. J. Sci. Math. Educ. 2017, 17, 341–363. [Google Scholar] [CrossRef]
  40. Vonderwell, S.; Liang, X.; Alderman, K. Asynchronous discussions and assessment in online learning. J. Res. Technol. Educ. 2007, 39, 309–328. [Google Scholar] [CrossRef] [Green Version]
  41. Kim, H. Concreteness fading strategy: A promising and sustainable instructional model in mathematics classrooms. Sustainability 2020, 12, 2211. [Google Scholar] [CrossRef] [Green Version]
  42. Lee, K.S.; Kim, H.J.; Kang, J. From uniformity to sustainable diversity: Exploring the design attributes of renovating standardized classrooms in Korea. Sustainability 2019, 11, 5669. [Google Scholar] [CrossRef] [Green Version]
  43. Friedman, H.H.; Friedman, L.W. Crises in education: Online learning as a solution. Creat. Educ. 2011, 2, 156–163. [Google Scholar] [CrossRef] [Green Version]
  44. Lee, Y.; Choi, J.; Kim, J. Discriminating factors between completers of and dropouts from online learning courses. Br. J. Educ. Technol. 2013, 44, 328–337. [Google Scholar] [CrossRef]
  45. Park, J.H.; Choi, H.J. Factors influencing adult learners’ decision to drop out or persist in online learning. J. Educ. Technol. Soc. 2009, 12, 207–217. [Google Scholar]
  46. Rubio, F.; Thomas, J.; Li, Q. The role of teaching presence and student participation in Spanish blended courses. Comput. Assist. Lang. Learn. 2018, 31, 226–250. [Google Scholar] [CrossRef]
Figure 1. Data collection procedure.
Table 1. Proportion comparison of participants in the study by degree field.

College | Study Sample (%) | Student Enrollment at the University (%)
Business | 8 | 9
Nursing | 9 | 7
Engineering & IT | 30 | 30
Humanities | 7 | 7
Social Sciences | 15 | 13
Liberal Arts | 2 | 2
Health Sciences | 8 | 6
Arts & Physical Education | 7 | 11
Bionano Technology | 6 | 8
Medicine | 2 | 1
Korean Medicine | 1 | 1
Law | 5 | 5
Table 2. Background information of the instructor questionnaire participants.

Age (Years): M = 42.0, SD = 5.9
Total EFL Teaching Experience (Years): M = 12.8, SD = 5.8
Total EFL Experience in Korean Higher Education (Years): M = 9.1, SD = 4.2
Highest Degree: BA = 1, MA = 9, PhD = 3
Table 3. Newly developed form of sustainable language assessment at GILC and its characteristics.

Dimension | Previously Used Language Assessment at GILC | Newly Developed Form of Sustainable Language Assessment at GILC
Content (primary purposes) | Summative: the primary purposes lie in gauging and measuring students' language abilities | Formative: the primary purposes lie in improving students' self-regulated and sustainable language learning and use
Orientation (focus of measurement) | Product-oriented: what students have learned as an outcome of the course | Process-oriented: how students' language learning is going
Approach | Judgmental: teachers provide an overall grade or score based on assessment | Diagnostic: teachers identify areas for improvement and instructional supports using assessment
Assessment format and procedure | Formal assessment: systematic, planned sampling techniques | Alternative assessment: (1) embedded in classroom tasks and activities designed to elicit performance; (2) combines a variety of assessment methods
Evaluators | Teachers | Teachers, students, peers
Table 4. Descriptive analysis of student questionnaire.

Item | Very Dissatisfied | Dissatisfied | Neutral | Satisfied | Very Satisfied | M | SD
Overall Satisfaction | 14 (1.4%) | 39 (4.0%) | 201 (20.5%) | 408 (41.7%) | 317 (32.4%) | 4.0 | 0.9
Use of Language | 22 (2.2%) | 37 (3.8%) | 207 (21.1%) | 384 (39.2%) | 329 (33.6%) | 4.0 | 0.9
Satisfaction of Assessment | 30 (3.1%) | 61 (6.2%) | 260 (26.6%) | 373 (38.1%) | 255 (26.0%) | 3.8 | 1.0
Table 5. Descriptive analysis of instructor questionnaire.

Item Number | Very Dissatisfied | Dissatisfied | Neutral | Satisfied | Very Satisfied | M | SD
1 | 0 (0%) | 0 (0%) | 3 (23.1%) | 8 (61.5%) | 2 (15.4%) | 3.9 | 0.6
2 | 0 (0%) | 1 (7.7%) | 5 (38.5%) | 5 (38.5%) | 2 (15.4%) | 3.6 | 0.9
3 | 1 (7.7%) | 3 (23.1%) | 1 (7.7%) | 5 (38.5%) | 3 (23.1%) | 3.5 | 1.3
4 | 0 (0%) | 4 (30.7%) | 8 (61.5%) | 0 (0%) | 1 (7.7%) | 2.8 | 0.8
5 | 1 (7.7%) | 1 (7.7%) | 0 (0%) | 2 (15.4%) | 9 (69.2%) | 4.3 | 1.3
6 | 3 (23.1%) | 2 (15.4%) | 4 (30.7%) | 3 (23.1%) | 1 (7.7%) | 2.8 | 1.3

Note. Questionnaire items:
1. I am satisfied with my experience of teaching online in 2020.
2. I am satisfied with the online teaching training I received in 2020.
3. My online students were actively involved in their learning.
4. Student achievement in my online teaching class was higher than that of my traditional class.
5. The preparation time for online teaching was longer compared to the preparation time for traditional teaching.
6. I am satisfied with my online assessment practices that were used in 2020.
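As a sanity check, the means and standard deviations reported in Table 5 can be reproduced directly from the frequency counts. The sketch below is illustrative only: it assumes a 1–5 Likert coding and a population (n-denominator) standard deviation, which is not stated in the paper but matches the reported values for item 1 (n = 13 instructors).

```python
import math

# Frequency counts for Table 5, item 1 ("I am satisfied with my
# experience of teaching online in 2020"), scored 1-5 from
# "very dissatisfied" to "very satisfied".
counts = {1: 0, 2: 0, 3: 3, 4: 8, 5: 2}

n = sum(counts.values())
mean = sum(score * k for score, k in counts.items()) / n
# Population SD (divide by n, not n - 1); an assumption that
# reproduces the reported SD of 0.6.
sd = math.sqrt(sum(k * (score - mean) ** 2
                   for score, k in counts.items()) / n)
# Share agreeing or strongly agreeing, reported as 76.9% in the text.
agree_pct = 100 * (counts[4] + counts[5]) / n

print(round(mean, 1), round(sd, 1), round(agree_pct, 1))  # 3.9 0.6 76.9
```

The same computation applied to the other rows recovers the remaining M and SD values, which is a useful consistency check when rebuilding flattened tables.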
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
