Article

Digital University Teaching and Learning in Management—The Gini from the COVID-19 Bottle and Its Empirical Representations in Germany

1 Chair of Production and Logistics, Georg-August-Universität Göttingen, 37073 Göttingen, Germany
2 Fraunhofer Institute for Material Flow and Logistics (IML) Dortmund, 44227 Dortmund, Germany
3 FOM University of Applied Sciences, 45141 Essen, Germany
* Author to whom correspondence should be addressed.
Educ. Sci. 2021, 11(11), 728; https://doi.org/10.3390/educsci11110728
Submission received: 29 September 2021 / Revised: 8 November 2021 / Accepted: 10 November 2021 / Published: 12 November 2021

Abstract: Digitalization of teaching, learning, and assessment in higher education has gained increasing attention in research in recent years. While previous research investigated issues of effectiveness, course attendance, and course evaluation from a long-term perspective, the current COVID-19 pandemic forced higher education institutions to digitalize teaching, learning, and assessment in a very short time. In this context, we investigate the effects of the digitalization of three courses from operations research and management science in the summer term 2020, namely two large lectures with tutorials for undergraduate students and a seminar for graduate students. To that end, student performance, course and exam attendance rates, and course evaluations are compared to the same courses in the previous year, 2019, in a traditional, non-digitalized setting. Next to the quantitative data, qualitative statements from the course evaluations and students’ expectations expressed during the term are investigated. Findings indicate that the lecturers’ understanding of learning behavior has to develop further, as interaction is required in any format, on-site or digital. Absenteeism and procrastination are important risk areas, especially in digital management education. Instruments have to be adapted to digital settings, but with care and with regard to course specifics (including digital evaluation). Digital education does not make learning per se easier or harder, but we observed that the students’ understanding and performance gap increased in digital teaching times. As an outlook, we propose longitudinal investigation of the ongoing digitalization during the COVID-19 pandemic and, beyond that, of the opportunities the current crisis offers for implementing a long-term transition to digital education in higher education institutions.

1. Introduction

Digital teaching and learning in management and business administration have been a standing question for a long time [1,2,3]—and have received a further push in the recent COVID-19 situation, as most teaching activities were transferred to digital systems. As early as 1964, for example, Hall provided an extended discussion of the relevant questions and frameworks in management education [4]. Further insights are related to the formative impact of management education on society and the environment [5,6]. The recent publication by Hwang et al. lists digital learning as one of the 15 most important research and development topics in an extensive literature review on business management education in general [7]. Similarly, for the specific case of operations management, the edited volume by Beliën et al. identifies digital developments as a major trend [8].
At the same time, the COVID-19 pandemic in 2020 and 2021 provides a sort of natural experiment in a quite unparalleled fashion [9,10]. This can help to analyze and understand specific questions in management and management education: Ahlstrom and Wang outline this for the question of firm behavior [11], Bansal et al. for the research area of firm sustainability management [12]. Adding to this discussion, our paper applies an empirical comparison regarding three specific operations management courses taught within the management education curriculum at the Georg-August-University of Göttingen in 2019 and 2020. By this comparison, the natural experiment setting of COVID-19 in 2020 is used to answer the question of which impacts can be observed regarding digitalization in general and operations management education at universities.
The specific contribution of this paper is threefold: First, we outline the existing knowledge and derive hypotheses regarding the digitalization development in general and in operations management education. Second, we describe empirical results from a 2020 (COVID-19) setting compared to a 2019 (non-COVID-19) setting for two distinct operations management courses, including teaching evaluation and grading results in general terms. Third, we discuss derived implications and avenues for further research regarding digital general and operations management education at universities. The remainder of this paper is structured as follows: Section 2 outlines the theoretical framework regarding existing research results for digitalization changeovers in higher education as well as in management education. Section 3 describes the methodological approach pursued. In Section 4, the core empirical results from the comparison of the operations management courses in 2019 and 2020 are presented. Within Section 5, we discuss implications and derived suggestions for improvement regarding digital operations management education as well as general management university education.

2. Theoretical Background

2.1. Higher Education

Higher education has undergone comprehensive transitions in the last decades in terms of formal framing (degrees), resources (with diverse developments by country), quality (auditing culture), competition, and impact. This is outlined, for example, by Wu and Liu and others for the general impacts of technological change on higher education [13,14,15]. Klumpp et al. investigate the increasing global competitive development in higher education with the example of international university rankings [16]. In addition, competition for resources and, in particular, for excellent students and researchers is a dominant development force in higher education, often connected to expectations towards digitalization [17,18]. Furthermore, expectations towards universities as institutions of research, teaching, and transfer regarding societal and economic impacts are traditionally high. This can be exemplified by a series of topics where universities are seen as important and central vehicles for the advancement of objectives such as sustainability and sustainability education [19], innovation and economic growth [20], artificial intelligence, or global health resilience [21].
Interestingly, digital education elements have been a long-standing issue in higher education research and management, with labels such as distance education, e-learning or blended learning [22,23]. This is outlined in detail in the following section.

2.2. Digitalization in Higher Education

In general, digitalization can be defined as the use of digital technologies to renew, simplify, and improve processes, tasks, and products [24]. The effects of ubiquitous digitalization and implications of digital transformation are investigated in many research fields and industry sectors, including organization science [25], the automotive sector, and other service sectors. In higher education, Henderson et al. [26] find many different reasons that make digital technologies particularly useful for students. According to their analysis, digital technologies can help students to (1) organize and manage the “logistics” of studying (e.g., via learning management systems); (2) obtain flexibility of place and location; (3) save time; (4) review, replay, and revise content; (5) research information; (6) support basic tasks; (7) communicate and collaborate; (8) augment university learning materials; (9) see information in different ways; and (10) save costs. Castañeda and Selwyn [22] (p. 2), however, emphasize that “framing digital technologies [solely] in terms of learning […] obscures the socialization, subjectification and qualification purposes of education”.
In the literature and across disciplines, there is no consensus on how effective online education in terms of students’ performance is: Papers finding significantly better performance in the classroom include [27] (case study: microeconomics course), [28] (case study: cognition, learning, and assessment course) and [29] (case study: statistics course). Studies that found significantly better student performance with online education include [30] (meta-analysis of 96 studies in psychology, engineering, computer science, business, and technical writing), [31] (meta-analysis of 201 studies related to health professions), and [32] (case study: programming language course).
In [24], digitalization in higher education is conceptualized as an external process, e.g., driven by government, or an internal process, e.g., driven by academic staff; tensions may arise if internal and external processes are not well coordinated. In the more extreme setting of the COVID-19 pandemic in early 2020, where the transition to online learning, teaching, and assessment needed to be made very rapidly, Watermeyer et al. [33] surveyed 1148 academics working in universities in the UK regarding, amongst other things, their preparedness and confidence for the digital disruption. They found that the digital disruption was mostly perceived by the academic staff as having far more drawbacks than benefits. However, this emergency case cannot be compared to other research settings on digital education: Firstly, external and internal processes could not be coordinated in the short amount of time, leading to severe dysfunctions for both academic staff and students in providing online learning, teaching, and assessment. Secondly, for many higher education institutions, the objective was not to create a robust long-term digital education ecosystem, but rather to provide short-term, temporary access to learning, teaching, and assessment in a manner that is quick to set up and reliably available during an emergency [34]. In some disciplines, and with the new experience (forcibly) gathered, however, higher education institutions may consider following up on this short-term response with a more long-term strategy for making a sustainable transition to online learning.

2.3. Operations Management Education

Teaching and learning in operations management areas faces the specific challenge of accommodating a large variety of sub-disciplines and scientific cultures (such as mathematics, engineering, management science, and others). Likewise, decision problems in the real world are usually interdisciplinary in nature and thus often unstructured [35]. Therefore, curricula should be designed in such a way that students are prepared to deal with such messes [36,37]. Many papers on OR/MS education agree that practical case studies and experiences should be represented in the curricula [37,38,39,40]. With a more process-oriented view of teaching, Cochran describes a strategy for teaching OR and MS as a three-step procedure encompassing (1) active learning (to promote students’ interest and engage them with the topic), (2) case-based learning (to develop comprehension and understanding), and (3) project-based learning (to enhance appreciation and proficiency) [38]. In a similar fashion, Reuter-Oppermann et al. describe their curriculum, which provides students with different skills and knowledge, encompassing (1) domain knowledge, (2) mathematical and (3) software tools, (4) use cases, and (5) practical experiences, while courses are designed in such a way that they complement each other [40]. In [41], digitalization is identified as a major trend affecting OR/MS education, enabling innovative teaching concepts such as blended learning, flipped classrooms, and massive open online courses.

2.4. Specifics of Digitalization in Management Science Higher Education

In the field of operations research and management science (OR/MS) education, Miltenburg [42] describes an undergraduate MS course (with about 500 participants), which students can choose to attend live on campus, online via video tutorials, or mixed (with some elements on campus and some online). In addition, text-based online discussion is offered via e-mail and a discussion board. The students taking the live lecture achieve statistically significantly better grades. However, in comparison to the previous iteration of the course, which was taught live on campus only, the class average on the final examination improved significantly. Miltenburg also reports that about 15% of the students are hard to reach, i.e., they neither attend the live lecture nor use the provided online material. Sharkey and Nurre [43] describe an undergraduate OR course (with about 50 participants) with optional, supplementary online video tutorials providing additional examples and applications for the taught OR methods. In that sense, the authors interpret the online video tutorials as a replacement for a course textbook. Regarding a particular exam question, they suppose that the supplementary online material helped students to achieve better grades in the final exam.

2.5. Hypothesis Development

The evaluation is especially connected to the existing body of knowledge regarding the success and impact factors for management education, e.g., practical relevance and experience [44]. Furthermore, connecting principles, collaboration, and interdisciplinary learning are highlighted as success principles [45,46,47,48,49]. In several references, international cooperation is also mentioned, for example by de Miranda and Teixeira for management science education specifically [50]. Regarding the two large courses analyzed, the following hypotheses are developed and tested subsequently:
Hypothesis 1 (H1). Higher levels of digitalization in teaching are connected to higher levels of student performance.
Hypothesis 2 (H2). Higher levels of digitalization in teaching are connected to higher levels of student satisfaction.
Hypothesis 3 (H3). Higher levels of digital student-lecturer interaction are connected to higher levels of student performance.
Hypothesis 4 (H4). Larger numbers of different digital teaching instruments lead to higher registration numbers in elective courses.

3. Materials and Methods

Due to the rapid spreading of COVID-19 in Europe in February and March 2020, higher education institutions, in line with other public and private institutions, needed to react quickly and adhere to governmental regulations intended to minimize all citizens’ contacts via social distancing. In Germany, many universities transitioned hastily to online learning, teaching, and assessment, as the described period of the outbreak preceded the beginning of the summer semester by only a few weeks. With massive uncertainties regarding (1) the future development of the pandemic and corresponding contact restrictions, (2) the permissibility of on-site learning, teaching, and assessment for the summer semester, (3) the stability of the existing IT infrastructure, such as live conferencing systems, in the face of a significant rise in demand, and (4) limited experience regarding the suitability and availability of online tools for teaching, learning, and assessment, reliable temporary solutions were needed. To evaluate the online transition of our courses, we compare the digital implementations with their previous offline iterations in terms of students’ performance and course evaluation.
In the summer semester 2020, we offered two large undergraduate OR courses: (1) Production and Logistics (P&L), which is a required course in the faculty’s Business Administration degree program, with about 700 students enrolled in the online learning management platform; (2) Manufacturing Management (MM), which is an elective, specializing course, with about 260 enrolled students. Table 1 shows an overview of the undergraduate courses in terms of the number of participants.
Furthermore, we offered a graduate OR course (3) Simulation in Supply Chain Management. It is an elective, specializing seminar with a maximum of 13 students enrolled. The students work together in teams developing simulation models for specific problems concerning the logistics in a supply chain. The seminar includes introductory lectures, counselling sessions and a final presentation and discussion of the seminar papers. In all three courses, the covered topics are comparable to the iterations of the courses in the previous year 2019.

3.1. Course Implementation: Didactic and Technical Concepts

In the following, we describe the didactic concepts of our undergraduate courses. Figure 1 shows a taxonomy for different formats of online teaching at the University of Göttingen. As hybrid teaching was not allowed in the university during this phase of the pandemic, our courses were offered fully digitally, with the exception of the final exams in July/August 2020. Both courses included a combination of (1) asynchronous elements, i.e., lecture script, exercises, and their solutions as PDF files as well as recordings of lectures and tutorials as video files, and (2) synchronous elements, i.e., voluntary weekly digital sessions, where students could get live feedback on their questions regarding the content and organization of the course.
Students were expected to watch the online videos in advance of the weekly sessions, so that the sessions’ main objective was to answer the students’ questions. While voice-based questions were restricted to the weekly sessions, text-based questions could be posted to a chat throughout the week and were often answered during the week or, at the latest, in the corresponding weekly session. The question times were separated into lectures and tutorials, and for P&L, the tutorial sessions were split into nine groups, each supervised by a student assistant. In the lecture sessions, the professor and a research assistant answered questions for the group of all students (m:n), and in the tutorial sessions, a research or student assistant provided feedback to the respective group (1:n). As a concession during the COVID-19 pandemic and as an exception in the summer term 2020, students of our Faculty of Business and Economics were allowed to opt out of exams up to 24 h after taking them without any drawbacks (usually, they can only opt out of an exam up to 24 h before taking it).
Regarding the technical implementation of both undergraduate courses, we used two digital tools to realize the didactic concept. As with the previous on-site iterations of the courses, we used the open-source digital learning management platform StudIP [51], which provides different functionalities for course management and has long been used at the University of Göttingen. Of the functionalities provided in StudIP, we used the announcements (for organizational issues), discussion board, overview of participants, text and video file repositories, time schedule, and course evaluation, which can broadly be summed up as asynchronous course elements. For the interactive synchronous elements, we used the online communication tool Discord (https://discord.com/, accessed on 9 November 2021), where separate servers were set up for the two courses, including a rights management system. For example, the professor and the research and student assistants were allowed to share their screens with up to 50 participants or mute other participants during the weekly sessions, while students were not. Discord includes text and voice channels, which were set up for the lecture and tutorial sessions. In total, 330 students were registered on the P&L Discord server and 77 on the MM server, respectively. However, participation in the live sessions was much lower, averaging roughly 50 participants per session in P&L and 20 in MM.
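To make the setup described above more tangible, the following is a minimal, purely illustrative sketch (in Python, with hypothetical names and structure) of how such a course-specific server layout with role-based permissions and per-group text/voice channels could be represented; it is not the actual server configuration used in the courses.

```python
# Illustrative sketch only (hypothetical names and values), not the actual
# server configuration: one Discord server per course, role-based permissions,
# and separate text/voice channels for the lecture and the tutorial groups.

def tutorial_channels(n_groups: int) -> list:
    """Text channel (questions during the week) and voice channel (weekly live
    session) for each tutorial group."""
    channels = []
    for i in range(1, n_groups + 1):
        channels += [f"tutorial-{i}-text", f"tutorial-{i}-voice"]
    return channels

course_servers = {
    "P&L": {
        "roles": {
            # Staff could share their screen and mute participants in sessions.
            "professor": {"share_screen": True, "mute_others": True},
            "research_assistant": {"share_screen": True, "mute_others": True},
            "student_assistant": {"share_screen": True, "mute_others": True},
            # Students could join, write, and speak, but not moderate.
            "student": {"share_screen": False, "mute_others": False},
        },
        # One lecture channel pair plus nine tutorial groups.
        "channels": ["lecture-text", "lecture-voice"] + tutorial_channels(9),
    },
    "MM": {
        "roles": {
            "professor": {"share_screen": True, "mute_others": True},
            "student": {"share_screen": False, "mute_others": False},
        },
        "channels": ["lecture-text", "lecture-voice", "tutorial-text", "tutorial-voice"],
    },
}

print(len(course_servers["P&L"]["channels"]))  # 20 channels in this sketch
```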
An overview of the didactic concepts in 2019 and 2020 for our graduate course, the master seminar, is shown in Table 2. In 2019, we introduced the simulation methods and software in class, with students bringing their laptops with the software preinstalled. The software and its coding were shown step by step by the lecturer, and the students could follow each step and program simultaneously. Problems with the software were solved live in class. For the online course in 2020, we decided to record the lecture instead, so that the students could view it at home and pause the recording if needed. The recording consisted of an introductory lecture on simulation in general and on two simulation methods. The videos were uploaded before the first online live meeting. This meeting included a round of introductions by students and lecturers, the seminar topics were explained, and groups of two to four students were assigned to each topic. Students used voting sheets to choose their topic. The groups had some time to get to know each other, exchange contact details, plan group meetings, and discuss the seminar topic and initial questions in separate breakout rooms. The lecturer visited each breakout room to answer follow-up questions. The meeting concluded with a Q&A session.
Regarding supervision, in 2019, students usually came in by appointment and met each other and the lecturer in person at the university. In 2020, all communication with and between the groups was digital. In both years, thirteen students were enrolled, which is the maximum number allowed for the seminar, with one student dropping out over the course of the seminar in each year. Thus, we identified no changes in either the number of students or their commitment.

3.2. Sample Description

To avoid bias in comparing the results of 2019 and 2020, we analyzed the overall performance (measured as average grades in completed modules) for the five largest student cohorts (representing a combination of degree program and semester) enrolled in our courses with data from the statistics portal of the faculty’s examination office, see Table 3. There are slightly decreasing trends in average grades. However, we think that the differences between the overall average grades of course participants in 2019 and 2020 are marginal, so that we assume that any changes in the students’ performance are mostly related to the transition to online learning and teaching.

3.3. Evaluation of the Transition toward Digital Learning

To evaluate the online transition of our courses, we compare the digital implementations with their previous offline iterations in terms of students’ performance, course evaluation (which was collected before the exam, at the end of the period of lectures), and the statistics on course participants and video viewership given in Table 1 and Table 4. We measure the performance according to grades and achieved points in the final exam. In all exams with a duration of 90 min, 90 points could be achieved at maximum. The course evaluation consists of a quantitative part, where students are asked to rate several items on a 7-point Likert scale, and a qualitative part which allows students to give additional feedback on anything related to the course.
Regarding the P&L and MM modules, to test hypotheses H1 through H3, we perform independent two-sample t-tests (with unequal sample sizes and similar variances); an illustrative sketch of this procedure follows the list below:
To test H1 (higher levels of digitalization in teaching are connected to higher levels of student performance), the two samples are the 2019 and 2020 exam participants. We compare these groups regarding the achieved points in the exam.
To test H2 (higher levels of digitalization in teaching are connected to higher levels of student satisfaction), the two samples are the 2019 and 2020 evaluation participants. We compare these groups regarding all evaluation questions.
To test H3 (higher levels of digital student-lecturer interaction are connected to higher levels of student performance), the two samples are generated by splitting the 2020 exam participants into two groups: those who participated in the Q&A sessions and those who did not. We compare these groups regarding the achieved points in the exam.
To test H4 (larger numbers of different digital teaching instruments lead to higher registration numbers in elective courses), we compare the numbers of students who participated in the exams of the elective course MM in 2019 and 2020.
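As announced above, the following minimal sketch (in Python, using entirely hypothetical exam data) illustrates the independent two-sample Student's t-tests just described; the sample sizes, point distributions, and the split into Q&A participants are placeholders, not the actual course data.

```python
# Minimal sketch with hypothetical data: independent two-sample Student's
# t-tests as used for H1 and H3 (H2 applies the same test to evaluation items).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical exam points (0-90) for the 2019 and 2020 cohorts
# (unequal sample sizes, similar variances).
points_2019 = rng.normal(loc=55, scale=12, size=310).clip(0, 90)
points_2020 = rng.normal(loc=52, scale=13, size=293).clip(0, 90)

# H1: compare exam performance between the 2019 and 2020 exam participants.
# equal_var=True corresponds to the similar-variances assumption in the text.
t_stat, p_value = stats.ttest_ind(points_2019, points_2020, equal_var=True)
print(f"H1: t = {t_stat:.2f}, p = {p_value:.4f}")

# H3: split the 2020 participants into Q&A participants and non-participants
# (here simply simulated) and compare exam points between the two groups.
participated = rng.random(points_2020.size) < 0.15
t_stat, p_value = stats.ttest_ind(points_2020[participated],
                                  points_2020[~participated], equal_var=True)
print(f"H3: t = {t_stat:.2f}, p = {p_value:.4f}")
```

The equal_var=True argument reflects the similar-variances assumption stated above; with clearly unequal variances, Welch's variant (equal_var=False) would be the usual alternative.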
Regarding the graduate seminar, unfortunately, no evaluation results were accessible for the year 2019, as the number of students who filled out the evaluation forms did not reach the minimum of six required to access the results. Thus, we excluded the evaluation results of the graduate course from our quantitative analysis.

4. Empirical Results

Table 4 provides statistics on the number, duration, and viewership of the recordings for the P&L and MM courses, separated into lectures and tutorials. It should be noted that the number of tutorial videos in P&L was roughly three times that in MM, while their average individual length was roughly a third, because of how topics and exercises were clustered into videos: in P&L, multiple tutorials (and corresponding videos) covered different aspects of the same topic; for example, five videos covered linear programming and the simplex algorithm. In MM, multiple exercises covering similar topics were condensed into a single video.
Regarding the course format, it can be noted that the consumption of the added digital content (videos and Q&A sessions) was rather low. In P&L, only 185.45 students consumed the videos on average, while in MM, only 76.7 students did. This equals 63.29% of exam participants in P&L and 55.99% in MM, respectively. However, we do not know the relationship between video consumption and exam participation, so that students who did not take an exam could also have consumed the videos. Similarly, only very minor shares of students participated in the Q&A sessions (participation is measured in terms of active contributions, either by writing anything in the chat or by using Discord’s reaction feature, which allows users to react to already written messages with emojis): In P&L, 51 students participated in a Q&A session at least once, while in MM, 22 students did. However, if only the chat contributions are counted, without the reactions, participation numbers decrease to 19 students in P&L and 15 students in MM.
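As a back-of-the-envelope check on the figures above, the following short sketch (Python) reproduces the share calculation; the exam participant counts of 293 (P&L) and 137 (MM) used here are inferred from the stated percentages and should correspond to Table 1, which is not reproduced in this text.

```python
# Back-of-the-envelope check of the reported viewership shares. The exam
# participant counts below are inferred from the stated percentages
# (hypothetical reconstruction), not quoted from Table 1.
avg_viewers = {"P&L": 185.45, "MM": 76.7}
exam_participants = {"P&L": 293, "MM": 137}

for course in avg_viewers:
    share = 100 * avg_viewers[course] / exam_participants[course]
    print(f"{course}: {share:.2f}% of exam participants")  # ~63.29% and ~55.99%
```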
Table 5 shows the number of accesses for all PDF documents (lecture slides, tutorial exercises, tutorial solutions) in the P&L and MM courses. In general, it can be observed that the students access the course materials multiple times. In both modules, regarding lecture slides, the relative number of accesses decreased from 2019 to 2020, while regarding tutorial exercises, the relative number of accesses increased. The number of accesses regarding tutorial solutions is not comparable, because these solutions were not provided in 2019.

4.1. Compulsory Undergraduate Course: Production and Logistics

In production and logistics, regarding hypothesis H1, there is a statistically significant difference between the exam performances of the 2019 and 2020 groups (p < 0.001); see Table 6.
Regarding hypothesis H2, statistically significant improvements of the 2020 P&L course, compared to the 2019 course, can be seen in the overall course evaluation and the teaching aids used (see Figure 2). A statistically significant degradation can be seen regarding the perceived fairness of the lecturer. The students’ effort for preparation and follow-up also increased significantly.
Regarding hypothesis H3, the comparison of the participating and non-participating students indicates that there are indeed significant statistical differences regarding exam performance, where students who participated actively during the semester performed better, see Table 7.

4.2. Elective Undergraduate Course: Manufacturing Management

In manufacturing management, regarding hypothesis H1, there is no statistically significant difference between the exam performances of the 2019 and 2020 groups, see Table 8.
Regarding Hypothesis H2, there is no statistically significant change in the overall course evaluation. The only statistically significant changes occur in the preparatory and follow-up work and the communication of performance requirements (see Figure 3).
Regarding hypothesis H3, the comparison of the two groups indicates that there are significant statistical differences regarding exam performance, where students who participated actively during the semester performed better (see Table 9).
Regarding hypothesis H4, the number of exam participants more than doubled between 2019 and 2020 (see Table 1). However, because this was only a single iteration of this course, the data is not conclusive and further iterations of the digital course would need to be compared with the pre-COVID-19 iterations.

5. Discussion

5.1. Comparative Analysis

The following points can be raised for a comparative analysis and discussion regarding the three analyzed courses at the University of Göttingen:
It is interesting, and might be below expectations, that less than half of the registered students actively used the video files for learning. This is in contrast to the results reported in [52], where most students in a mathematics course used video lectures as their primary learning material. This is even more interesting as, before the COVID-19 event, it was a standard request of students to ask for video files and recordings of lectures. There are two possible explanatory hypotheses for discussion: First, video files might actually—at least for a major part of the student body—be less attractive for learning than, for example, simple slides in PDF files, e.g., due to the fixed learning speed when watching a video. When learning with slides, students might prefer different speeds, lower for parts that are harder to grasp and higher for topics that are easier to comprehend. Based on their individual learning approaches [53], different students prefer different learning materials (including PDF files, video files, or referenced textbooks). Peimani and Kamalipour [54] argue that using multiple communication channels can also result in deeper learning through the representation of multiple viewpoints. Similar interactions between learning pace and (digital or non-digital) teaching channels were also found in [42]. Second, the low rate of video consumption might be connected to the (unusual, untrained) time schedule management required of students. The lecturing concept required students to prepare for the synchronous Q&A sessions by watching the relevant video; if students did not manage to do so beforehand, they might have been inclined to skip the video altogether, assuming that consuming the live Q&A session would partly replace their own video study session. Another explanation for student engagement is provided in [55], where the students’ levels of self-regulation and digital capabilities were identified as predictors of their engagement in online teaching. Additionally, due to the very short lead times, our videos did not have subtitles, hindering accessibility for deaf students [56].
In addition, the synchronous study elements featured very low student participation rates. Again, two possible explanations might account for that: First, the increased multimedia learning material might have been sufficient, leaving no further questions for most of the students. This may also be due to the fact that more written documents, i.e., solutions for tutorial exercises, were provided, so that detailed videos or the Q&A sessions may not have been perceived as necessary. Actually, the low participation rate corresponds to the similarly low live question rates in traditional face-to-face lectures and courses (with larger groups) at our faculty in Göttingen. We do believe that the Q&A sessions are an effective solution for students who would not dare to ask their questions in a face-to-face format, as the potential exposure is higher in a full lecture hall than in an online Q&A session, where students could choose to use a pseudonym. Second, there might have been other hurdles for participation. For example, scheduling conflicts might have come up as all lectures and courses went online during COVID-19 lockdown periods, as also found in [54]. For the (synchronous) online sessions, there was no administrative scheduling management to avoid collisions, such as is normally implemented for face-to-face sessions. Options to increase student participation during online teaching are also discussed in [57], pointing out that multiple instruments and channels need to be combined to foster student engagement.
A slightly reduced exam performance in the largest course (production and logistics) can be caused by a multitude of reasons. It is not necessarily due to the changed lecture format, but can, for example, also be traced back to a generally higher emotional and cognitive stress level of the general population, and of the student population in particular, during COVID-19 lockdowns. Similarly, in [58], significantly lower student performance was found during courses in the COVID-19 pandemic.
The interesting fact that the exam participation rates for face-to-face courses pre-COVID-19 and online courses during 2020 are on a similar level suggests that the “hard to reach” student groups are similar, face the same limits and problems, and are not affected by the media change in the teaching and learning setting. We believe that the didactic concept with weekly Q&A sessions is an effective way to counteract procrastination during the semester’s lecture period, because it encourages students to regularly and actively engage in learning. However, we could not analyze the exact times of students’ accesses to course materials, as was done for example in [58], to evaluate the impact on student procrastination. Additionally, as attendance of lectures at our university is not compulsory for students (whether digital or face-to-face), options to engage with the hard-to-reach students are limited, and, as Scherrer [29] notes, it is unclear whether this is the lecturer’s responsibility at all.
The student feedback and evaluation were on average more positive during COVID-19 than before. This is interesting, and a possible bias due to positive selection processes in the online evaluation has to be checked and reconsidered (mainly those who already participated strongly in the digital teaching offers may also have used the online feedback system). Traditionally, the evaluation at the University of Göttingen is conducted in presence rather than digitally, which raises this bias question further. Miltenburg [42] found no significant changes in the course evaluation. Our data allow for a more detailed investigation of improvements: Statistically significant improvements were found in the overall rating of the mandatory P&L course as well as in the usefulness of the provided media in both courses (P&L and MM). Additionally, the required preparation and follow-up of materials increased, as was expected due to the change of the didactic concepts. Interestingly, the behavior of the lecturer was perceived as fairer in the face-to-face format of P&L. This may be due to the fact that lecturers are more tangible for students in face-to-face formats, especially regarding their answers and actions towards students, for example with questions or contributions.
Most students mentioned in the teaching evaluation that the digital formats offered more options for interaction and feedback. This hints at the possibility of implementing specific digital elements in post-COVID-19 university teaching as well.
A shift was observed regarding the acceptance of online and digital communication systems for university teaching: In the first months, up to half a year, students accepted many different tools and software applications, mainly because they were happy to receive any teaching at all. However, after about half a year, students increasingly criticized the multitude and “chaos” of different digital teaching tools. This led to a standardization and reduction of digital teaching tools during the 12-month COVID-19 period.
It was further observed that, for different tutorials and courses, the digital setting allowed for quality checks and standardization, as, for example, identical and jointly produced videos were used for all sessions with different student groups. In the case of mistakes or feedback from students, it was easier to correct these issues in a standardized fashion for all tutorial groups than it would have been in face-to-face courses.
To a great extent, students preferred specialization courses during the digital teaching phase due to COVID-19. This can be linked to the harder scheduling task for students mentioned before: avoiding parallel courses was harder overall, but less of an issue for specialization courses than for basic ones. This is due to the partly uncoordinated timelines and schedules of digital courses—but it could be improved for further digital teaching periods, as the lack of coordination was mainly caused by the short-notice switch to digital teaching in 2020.
Regarding the graduate seminar, using the students’ own computers to watch a lecture and use the simulation software simultaneously was easier in a video conference session than in a classroom. However, our experience was that most of the students did not prepare: they did not try to use the software beforehand and did not program the short exercise explained in the recorded video. Regarding student supervision, the advantages of the digital format were higher flexibility, fast scheduling of appointments, and shorter meetings. From the lecturer’s perspective, students needed more support, or at least asked more often for a consultation meeting, which could also be due to the lower barriers for a digital meeting.

5.2. Limitations

General conditions of our students regarding, e.g., mental health, technical equipment, or the impact of the extended deadline for opting out of an exam may also play a role in determining the students’ performance, but could not be analyzed with the available data. Moreover, regarding H3, we could not match all students who participated in the Q&A sessions with the exam candidates, because we allowed students to use arbitrary aliases in Discord, and they could delete their Discord accounts after the exam, so that it is not possible to identify them. However, most students actually used their full names and retained their accounts, so that this bias should be minimal. Furthermore, the registration numbers of exam participants can differ, because in 2020 there was an exception to how exam registrations were handled: students were allowed to sign out of exams up to 24 h after taking them (usually, they have to sign out at least 24 h before taking them). Some of the provided video files were re-uploaded during the semester because of small errors, which resets the viewership counter for the respective file. However, because the number of faulty videos was low, errors were usually found quickly, and students were informed quickly, this should not have a great effect on the overall video viewership statistics. Finally, the course evaluation data could be skewed, because in Göttingen, most students are usually asked to fill out this evaluation in a synchronous course session. Because the Q&A sessions were the only synchronous elements and were also optional, the distribution of students participating in these course evaluations may differ from previous iterations of the course.

5.3. Implications

There is a multitude of implications that can be connected to the findings presented in this paper. The most important one is the question of individualization: Digitalization implies, in many forms and fashions, the differentiation and individualization of learning. This can be a positive tendency, for example with the chance to adjust to individual learning stages and capabilities better than in purely physical teaching settings. On the other hand, it is also accompanied by risks, such as students falling behind or being left behind if their personal learning characteristics are less suited to digital formats, which require specific competences such as stronger self-organization skills.
Altogether, university teaching in a digitalized context requires intense and complex preparation as well as strategic planning. In [59], course design, pedagogical strategies incorporating active learning and providing a sense of online community, infrastructure for delivery and training, and incorporating activities that support student wellbeing were identified as success factors for digital education. In [60], student–student and instructor–student dialogues are identified as success factors. In particular, depending on the digital platform and format used, supporting student–student dialogues can be challenging [54]. In [61], challenges regarding the diversity of student backgrounds and equitable participation are highlighted. A comprehensive view of all aspects relevant to learning is essential and requires motivation, skill, and endurance from teachers in order to reach learning curve effects regarding digital instruments on both sides, for lecturers and students alike. This in turn means that most decisions regarding specific formats, technologies, and didactics should be located at the decentralized level and not be centralized during digitalization efforts. This implies, for example, that no central decision for specific software or platform solutions should be made; instead, university services should provide a multitude of digital services for lecturers to select from individually.

6. Conclusions

There are many hopes connected to digital management teaching and learning. These include individualized learning independent of time and place, increased access for specific study groups and persons, and increased efficiency of learning via economies of scale. In many forms, these hopes sound like an open-ended wish list addressed to a “Gini from the Bottle”, sometimes even in connection with other objectives such as sustainability improvements. The empirical study results showed that there are severe limitations to these expectations for several reasons: First, digital teaching and learning imply a differentiation of media channels, as well as of the student learning types connected to them. In turn, this leads to the challenge that learning performance and evaluations are similar on average, but variety and deviation levels increase—leading to a new didactics challenge. Second, preparation and resource input were underestimated from the start of most digital teaching and learning endeavors—on the student as well as on the lecturer side. Future digital teaching projects will have the luxury, but also the need, to plan more efficiently and wisely regarding resource allocation in digital university teaching preparation and execution. A telling example is examination schemes: in 2020, many exam forms were changed, and many universities avoided on-site exams altogether. From 2021 onwards, there will be a diverse mix of examination strategies, with some universities keeping up digital examination forms and others focusing more on examination forms requiring physical presence. This is strongly connected to the specific resource balance (e.g., availability of rooms, preparation times for digital exams versus paper exams, and so forth) for each and every exam as well as to the overall strategy of a department or university—altogether, many differences are expected to arise and persist in the university teaching and exam sector. Third, the competence situation and dynamic development of students and lecturers have to be considered. Learnings from the pandemic experiences will persist and expectations will shift—this can already be recognized for individual student counseling. From students’ and lecturers’ sides, there will be more suggestions for digital meetings than before 2020—and this is also due to a specific skill acquisition that is now being applied.
Altogether, digitalization of university teaching in the operations management field, as in other disciplines, has experienced an external push by the 2020 pandemic experience. Many of these new developments will stay, although not all of them. Intelligent and efficient teaching strategies will, on the one hand, identify the elements with the most advantages for students and lecturers, such as a mix of asynchronous and synchronous teaching media elements. On the other hand, it will also be crucial to deselect the elements most unfavorable for students and teachers. This will be the core challenge for university teaching in the next decade, and this paper provided some empirical hints as well as in-depth thoughts regarding this issue from Germany. These have to be compared and complemented with experiences from other countries [54,55,62,63,64,65,66] as well as other education areas [67]. In total, university lecturers are challenged globally to make the most out of the harsh and limiting circumstances experienced in the 2020/2021 timeframe due to COVID-19.

Author Contributions

Conceptualization, T.W. and M.K.; Methodology, T.W., M.K. and B.B.; Validation, T.W., M.K. and B.B.; Formal Analysis, T.W.; Investigation, T.W., M.K. and B.B.; Data Curation, T.W.; Writing—Original Draft Preparation, T.W., M.K. and B.B.; Writing—Review & Editing, T.W., M.K. and B.B.; Visualization, T.W. and B.B.; Supervision, M.K.; Project Administration, M.K.; Funding Acquisition, B.B. and M.K. All authors have read and agreed to the published version of the manuscript.

Funding

The APC was funded by the Open Access Publication Funds of the University of Göttingen.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Due to data privacy, only aggregated data are included in the article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Allen, D.; Kern, T.; Mattison, D. Culture, power and politics in ICT outsourcing in higher education institutions. Eur. J. Inf. Syst. 2002, 11, 159–173. [Google Scholar] [CrossRef]
  2. Karmarkar, U.S.; Apte, U.M. Operations management in the information economy: Information products, processes, and chains. J. Oper. Manag. 2007, 25, 438–453. [Google Scholar] [CrossRef]
  3. Kwok, R.C.-W.; Lee, J.-N.; Huynh, M.Q.; Pi, S.-M. Role of GSS on collaborative problem-based learning: A study on knowledge externalisation. Eur. J. Inf. Syst. 2002, 11, 98–107. [Google Scholar] [CrossRef]
  4. Hall, N. Education and Management. J. Manag. Stud. 1964, 1, 105–115. [Google Scholar] [CrossRef]
  5. Gill, M.J. High Flying Business Schools: Working Together to Address the Impact of Management Education and Research on Climate Change. J. Manag. Stud. 2020, 58, 554–561. [Google Scholar] [CrossRef] [Green Version]
  6. Vaara, E.; Faÿ, E. Reproduction and Change on the Global Scale: A Bourdieusian Perspective on Management Education. J. Manag. Stud. 2012, 49, 1023–1051. [Google Scholar] [CrossRef]
  7. Hwang, A.; Fornaciari, C.J.; Asarta, C.J.; Arbaugh, J.B.; Ferrara, Z. An analysis of highly-cited scholarship in business and management education: Findings and future agendas. Int. J. Manag. Educ. 2021, 19, 100447. [Google Scholar] [CrossRef]
  8. Beliën, J.; Teixeira, A.P.; Ittmann, H.W.; de Miranda, J.L.; Laumanns, M.; Vaz Pato, M. (Eds.) Advances in Operations Research Education: European Studies; Springer: Cham, Switzerland, 2018; ISBN 978-3-319-74104-8. [Google Scholar]
  9. Schmidt, S.C.E.; Anedda, B.; Burchartz, A.; Eichsteller, A.; Kolb, S.; Nigg, C.; Niessner, C.; Oriwol, D.; Worth, A.; Woll, A. Physical activity and screen time of children and adolescents before and during the COVID-19 lockdown in Germany: A natural experiment. Sci. Rep. 2020, 10, 21780. [Google Scholar] [CrossRef] [PubMed]
  10. Tomasik, M.J.; Helbling, L.A.; Moser, U. Educational gains of in-person vs. distance learning in primary and secondary schools: A natural experiment during the COVID-19 pandemic school closures in Switzerland. Int. J. Psychol. 2020, 56, 566–576. [Google Scholar] [CrossRef]
  11. Ahlstrom, D.; Wang, L.C. Temporal Strategies and Firms’ Speedy Responses to COVID-19. J. Manag. Stud. 2020, 58, 592–596. [Google Scholar] [CrossRef]
  12. Bansal, P.; Grewatsch, S.; Sharma, G. How COVID-19 Informs Business Sustainability Research: It’s Time for a Systems Perspective. J. Manag. Stud. 2020, 58, 602–606. [Google Scholar] [CrossRef]
  13. Jackson, N.C. Managing for competency with innovation change in higher education: Examining the pitfalls and pivots of digital transformation. Bus. Horiz. 2019, 62, 761–772. [Google Scholar] [CrossRef]
  14. Lohr, A.; Stadler, M.; Schultz-Pernice, F.; Chernikova, O.; Sailer, M.; Fischer, F.; Sailer, M. On powerpointers, clickerers, and digital pros: Investigating the initiation of digital learning activities by teachers in higher education. Comput. Hum. Behav. 2021, 119, 106715. [Google Scholar] [CrossRef]
  15. Wu, N.; Liu, Z. Higher education development, technological innovation and industrial structure upgrade. Technol. Forecast. Soc. Chang. 2021, 162, 120400. [Google Scholar] [CrossRef]
  16. Klumpp, M.; de Boer, H.; Vossensteyn, H. Comparing national policies on institutional profiling in Germany and the Netherlands. Comp. Educ. 2014, 50, 156–176. [Google Scholar] [CrossRef]
  17. Klumpp, M. Sisyphus Revisited: Efficiency Developments in European Universities 2011–2016 According to Ranking and Budget Data. Rev. High. Educ. 2019, 43, 169–219. [Google Scholar] [CrossRef]
  18. Lacka, E.; Wong, T.C.; Haddoud, M.Y. Can digital technologies improve students’ efficiency? Exploring the role of Virtual Learning Environment and Social Media use in Higher Education. Comput. Educ. 2021, 163, 104099. [Google Scholar] [CrossRef]
  19. Bautista-Puig, N.; Casado, E.S. Sustainability Practices in Spanish Higher Education Institutions: An Overview of Status and Implementation. J. Clean. Prod. 2021, 126320. [Google Scholar] [CrossRef]
  20. Sharif, R. The relations between acculturation and creativity and innovation in higher education: A systematic literature review. Educ. Res. Rev. 2019, 28, 100287. [Google Scholar] [CrossRef]
  21. Galante, J.; Dufour, G.; Vainre, M.; Wagner, A.P.; Stochl, J.; Benton, A.; Lathia, N.; Howarth, E.; Jones, P.B. A mindfulness-based intervention to increase resilience to stress in university students (the Mindful Student Study): A pragmatic randomised controlled trial. Lancet Public Health 2018, 3, e72–e81. [Google Scholar] [CrossRef] [Green Version]
  22. Castañeda, L.; Selwyn, N. More than tools? Making sense of the ongoing digitizations of higher education. Int. J. Educ. Technol. High. Educ. 2018, 15, 1–10. [Google Scholar] [CrossRef] [Green Version]
  23. Shachar, M.; Neumann, Y. Differences Between Traditional and Distance Education Academic Performances: A Meta-Analytic Approach. Int. Rev. Res. Open Distrib. Learn. 2003, 4, 1–20. [Google Scholar] [CrossRef]
  24. Tømte, C.E.; Fossland, T.; Aamodt, P.O.; Degn, L. Digitalisation in higher education: Mapping institutional approaches for teaching and learning. Qual. High. Educ. 2019, 25, 98–114. [Google Scholar] [CrossRef] [Green Version]
  25. Yoo, Y.; Boland, R.J.; Lyytinen, K.; Majchrzak, A. Organizing for Innovation in the Digitized World. Organ. Sci. 2012, 23, 1398–1408. [Google Scholar] [CrossRef]
  26. Henderson, M.; Selwyn, N.; Aston, R. What works and why? Student perceptions of ‘useful’ digital technology in university teaching and learning. Stud. High. Educ. 2015, 42, 1567–1579. [Google Scholar] [CrossRef]
  27. Brown, B.W.; Liedholm, C.E. Can Web Courses Replace the Classroom in Principles of Microeconomics? Am. Econ. Rev. 2002, 92, 444–448. [Google Scholar] [CrossRef]
  28. Ferguson, J.; Tryjankowski, A.M. Online versus face-to-face learning: Looking at modes of instruction in Master’s-level courses. J. Furth. High. Educ. 2009, 33, 219–228. [Google Scholar] [CrossRef]
  29. Scherrer, C.R. Comparison of an Introductory Level Undergraduate Statistics Course Taught with Traditional, Hybrid, and Online Delivery Methods. INFORMS Trans. Educ. 2011, 11, 106–110. [Google Scholar] [CrossRef] [Green Version]
  30. Sitzmann, T.; Kraiger, K.; Stewart, D.; Wisher, R. The Comparative Effectiveness of Web-based and Classroom Instruction: A Meta-Analysis. Pers. Psychol. 2006, 59, 623–664. [Google Scholar] [CrossRef]
  31. Cook, D.A.; Levinson, A.J.; Garside, S.; Dupras, D.M.; Erwin, P.J.; Montori, V.M. Internet-based learning in the health professions: A meta-analysis. JAMA 2008, 300, 1181–1196. [Google Scholar] [CrossRef] [PubMed]
  32. Dutton, J.; Dutton, M.; Perry, J. Do Online Students Perform as Well as Lecture Students? J. Eng. Educ. 2001, 90, 131–136. [Google Scholar] [CrossRef] [Green Version]
  33. Watermeyer, R.; Crick, T.; Knight, C.; Goodall, J. COVID-19 and digital disruption in UK universities: Afflictions and affordances of emergency online migration. High. Educ. 2020, 81, 623–641. [Google Scholar] [CrossRef]
  34. Krishnamurthy, S. The future of business education: A commentary in the shadow of the Covid-19 pandemic. J. Bus. Res. 2020, 1–5. [Google Scholar] [CrossRef] [PubMed]
  35. Ackoff, R.L. The art and science of mess management. Interfaces 1981, 11, 20–26. [Google Scholar] [CrossRef]
  36. Borsting, J.R. OR Forum—Presidents’ Symposium: Reflections on OR/MS Education. Oper. Res. 1987, 35, 787–791. [Google Scholar] [CrossRef] [Green Version]
  37. Williams, T.; Dickson, K. Teaching real-life OR to MSc students. J. Oper Res. Soc. 2000, 51, 1440–1448. [Google Scholar] [CrossRef]
  38. Cochran, J.J. You want them to remember? Then make it memorable! Means for enhancing operations research education. Eur. J. Oper. Res. 2012, 219, 659–670. [Google Scholar] [CrossRef]
  39. Ittmann, H.W. A South African Perspective on OR/MS Education. In Advances in Operations Research Education: European Studies; Beliën, J., Teixeira, A.P., Ittmann, H.W., de Miranda, J.L., Laumanns, M., Vaz Pato, M., Eds.; Springer: Cham, Switzerland, 2018; pp. 59–77. ISBN 978-3-319-74104-8. [Google Scholar]
  40. Reuter-Oppermann, M.; Zander, A.; Nickel, S. An Innovative Concept for Teaching Operations Research Applied to Health Care. In Advances in Operations Research Education: European Studies; Beliën, J., Teixeira, A.P., Ittmann, H.W., de Miranda, J.L., Laumanns, M., Vaz Pato, M., Eds.; Springer: Cham, Switzerland, 2018; pp. 95–105. ISBN 978-3-319-74104-8. [Google Scholar]
  41. Ittmann, H.W.; Beliën, J.; de Miranda, J.L. OR/MS Education in a Changing Environment. In Advances in Operations Research Education: European Studies; Beliën, J., Teixeira, A.P., Ittmann, H.W., de Miranda, J.L., Laumanns, M., Vaz Pato, M., Eds.; Springer: Cham, Switzerland, 2018; pp. 13–27. ISBN 978-3-319-74104-8. [Google Scholar]
  42. Miltenburg, J. Online Teaching in a Large, Required, Undergraduate Management Science Course. INFORMS Trans. Educ. 2019, 19, 89–104. [Google Scholar] [CrossRef] [Green Version]
  43. Sharkey, T.C.; Nurre, S.G. Video Tutorials Within an Undergraduate Operations Research Course: Student Perception on Their Integration and Creating A Blended Learning Environment. INFORMS Trans. Educ. 2016, 17, 1–12. [Google Scholar] [CrossRef] [Green Version]
  44. Oswald-Egg, M.E.; Renold, U. No experience, no employment: The effect of vocational education and training work experience on labour market outcomes after higher education. Econ. Educ. Rev. 2021, 80, 102065. [Google Scholar] [CrossRef]
  45. Watanabe, C.; Naveed, K.; Neittaanmäki, P. Co-evolution between trust in teachers and higher education toward digitally-rich learning environments. Technol. Soc. 2017, 48, 70–96. [Google Scholar] [CrossRef]
  46. Scherer, R.; Howard, S.K.; Tondeur, J.; Siddiq, F. Profiling teachers’ readiness for online teaching and learning in higher education: Who’s ready? Comput. Hum. Behav. 2021, 118, 106675. [Google Scholar] [CrossRef]
  47. Milićević, V.; Denić, N.; Milićević, Z.; Arsić, L.; Spasić-Stojković, M.; Petković, D.; Stojanović, J.; Krkic, M.; Milovančević, N.S.; Jovanović, A. E-learning perspectives in higher education institutions. Technol. Forecast. Soc. Chang. 2021, 166, 120618. [Google Scholar] [CrossRef]
  48. Herrera-Pavo, M.Á. Collaborative learning for virtual higher education. Learn. Cult. Soc. Interact. 2021, 28, 100437. [Google Scholar] [CrossRef]
  49. Santiago, I.-P.; Ángel, H.-G.; Julián, C.-P.; Prieto, J.L. Emergency Remote Teaching and Students’ Academic Performance in Higher Education during the COVID-19 Pandemic: A Case Study. Comput. Hum. Behav. 2021, 106713. [Google Scholar] [CrossRef]
  50. De Miranda, J.L.; Teixeira, A.P. OR/MS Education: Good Practices and International Cooperation. In Advances in Operations Research Education: European Studies; Beliën, J., Teixeira, A.P., Ittmann, H.W., de Miranda, J.L., Laumanns, M., Vaz Pato, M., Eds.; Springer: Cham, Switzerland, 2018; pp. 79–93. ISBN 978-3-319-74104-8. [Google Scholar]
  51. StudIP. Overview. Available online: https://hilfe.studip.de/help/4.0/en/Basis/MenuBar (accessed on 15 September 2021).
  52. Pócsová, J.; Mojžišová, A.; Takáč, M.; Klein, D. The Impact of the COVID-19 Pandemic on Teaching Mathematics and Students’ Knowledge, Skills, and Grades. Educ. Sci. 2021, 11, 225. [Google Scholar] [CrossRef]
  53. Smith, S.N.; Miller, R.J. Learning approaches: Examination type, discipline of study, and gender. Educ. Psychol. 2005, 25, 43–53. [Google Scholar] [CrossRef]
  54. Peimani, N.; Kamalipour, H. Online Education and the COVID-19 Outbreak: A Case Study of Online Teaching during Lockdown. Educ. Sci. 2021, 11, 72. [Google Scholar] [CrossRef]
  55. Limniou, M.; Varga-Atkins, T.; Hands, C.; Elshamaa, M. Learning, Student Digital Capabilities and Academic Performance over the COVID-19 Pandemic. Educ. Sci. 2021, 11, 361. [Google Scholar] [CrossRef]
  56. Aljedaani, W.; Aljedaani, M.; AlOmar, E.A.; Mkaouer, M.W.; Ludi, S.; Khalaf, Y.B. I Cannot See You—The Perspectives of Deaf Students to Online Learning during COVID-19 Pandemic: Saudi Arabia Case Study. Educ. Sci. 2021, 11, 712. [Google Scholar] [CrossRef]
  57. Ahshan, R. A Framework of Implementing Strategies for Active Student Engagement in Remote/Online Teaching and Learning during the COVID-19 Pandemic. Educ. Sci. 2021, 11, 483. [Google Scholar] [CrossRef]
  58. Guzsvinecz, T.; Szűcs, J. Using Analytics to Identify When Course Materials Are Accessed Relative to Online Exams during Digital Education. Educ. Sci. 2021, 11, 576. [Google Scholar] [CrossRef]
  59. Srinivasan, S.; Ramos, J.A.L.; Muhammad, N. A Flexible Future Education Model—Strategies Drawn from Teaching during the COVID-19 Pandemic. Educ. Sci. 2021, 11, 557. [Google Scholar] [CrossRef]
  60. Tsang, J.; So, M.; Chong, A.; Lam, B.; Chu, A. Higher Education during the Pandemic: The Predictive Factors of Learning Effectiveness in COVID-19 Online Learning. Educ. Sci. 2021, 11, 446. [Google Scholar] [CrossRef]
  61. Calder, N.; Jafri, M.; Guo, L. Mathematics Education Students’ Experiences during Lockdown: Managing Collaboration in eLearning. Educ. Sci. 2021, 11, 191. [Google Scholar] [CrossRef]
  62. Bakhov, I.; Opolska, N.; Bogus, M.; Anishchenko, V.; Biryukova, Y. Emergency Distance Education in the Conditions of COVID-19 Pandemic: Experience of Ukrainian Universities. Educ. Sci. 2021, 11, 364. [Google Scholar] [CrossRef]
  63. Cranfield, D.J.; Tick, A.; Venter, I.M.; Blignaut, R.J.; Renaud, K. Higher Education Students’ Perceptions of Online Learning during COVID-19—A Comparative Study. Educ. Sci. 2021, 11, 403. [Google Scholar] [CrossRef]
  64. Vijayan, R. Teaching and Learning during the COVID-19 Pandemic: A Topic Modeling Study. Educ. Sci. 2021, 11, 347. [Google Scholar] [CrossRef]
  65. Martín-Cuadrado, A.M.; Lavandera-Ponce, S.; Mora-Jaureguialde, B.; Sánchez-Romero, C.; Pérez-Sánchez, L. Working Methodology with Public Universities in Peru during the Pandemic—Continuity of Virtual/Online Teaching and Learning. Educ. Sci. 2021, 11, 351. [Google Scholar] [CrossRef]
  66. Moundy, K.; Chafiq, N.; Talbi, M. Comparative Analysis of Student Engagement in Digital Textbook Use during Quarantine. Educ. Sci. 2021, 11, 352. [Google Scholar] [CrossRef]
  67. Eady, M.J.; Green, C.A.; Capocchiano, H. Shifting the Delivery but Keeping the Focus: A Reflection on Ensuring Quality Teacher Preparation during a Pandemic. Educ. Sci. 2021, 11, 401. [Google Scholar] [CrossRef]
Figure 1. Different formats of online courses at Georg-August-University Göttingen; the configuration of the P&L and MM courses is marked in italics.
Figure 2. Evaluation Results for P&L; Statistically significant deviations are marked as * p < 0.1; ** p < 0.05; *** p < 0.01.
Figure 3. Evaluation Results for MM; Statistically significant deviations are marked as * p < 0.1.
Table 1. Comparison of the number of participants in P&L and MM in 2019 and 2020.
Course | Production and Logistics (P&L) | Manufacturing Management (MM)
Semester (summer term) | 2019 | 2020 | 2019 | 2020
Number of students enrolled in the online learning management system | 685 | 695 | 127 | 268
Number of exam participants (absolute and in % of enrolled students) | 375 (55%) | 293 (42%) | 64 (50%) | 137 (51%)
Table 2. Course characteristics of the graduate course (seminar).
Semester (summer term) | 2019 | 2020
Kick-off session | Live, in-class lecture | Lecture video (21 min) and live online introduction and Q&A
2nd lecture | In-class lecture on programming: students were programming simultaneously | Video of programming steps (60 min) uploaded plus the final outcome of the exercise
Group assignment | Groups and topics assigned in class, time to exchange details and ask further questions | Groups and topics assigned online using breakout rooms
Support | Group meetings via appointment in office | Group meetings via appointment online using BBB
Paper | Final paper handed in in print and digitally | Digital paper via email
Final presentation | Presentation in classroom (feedback on presentation style) | Online presentation, only using voice and slides
Supervision | Via email and in person | Via email and video conference tools
Table 3. Comparison of largest student groups represented among the exam participants in P&L.
Degree Program | Semester | Number of Exam Participants | Differences in Average Grades (Compared to the 2019 Group)
Business Administration | 4 | 96 | −0.01
Business Administration | 3 | 32 | 0.08
Business Administration | 6 | 24 | −0.14
Business Information Systems | 4 | 18 | −0.14
Business Administration | 5 | 16 | −0.07
Table 4. Statistics on video material.
Lecture videos
Course | P&L | MM
Number of videos | 10 | 10
Duration (in minutes):
Mean | 32:10 | 36:23
Sd | 09:34 | 11:01
Shortest/longest | 18:28/49:09 | 21:32/61:36
Number of students accessing the videos:
Mean | 163.3 | 62.1
Sd | 46.65 | 12.14
Least/most viewed | 116/279 | 50/91
Tutorial videos
Number of videos | 29 | 10
Duration (in minutes):
Mean | 17:35 | 46:16
Sd | 13:42 | 23:31
Shortest/longest | 01:59/67:41 | 25:37/99:41
Number of students accessing the videos:
Mean | 185.41 | 76.7
Sd | 31.19 | 9.07
Least/most viewed | 137/280 | 63/93
Table 5. Access statistics of digital lecture notes and tutorial exercises.
Document Type | Module | Number of Accesses (Absolute/in % of Students Enrolled in Learning Platform)
Lecture slides | P&L 2019 | 3707 (541%)
Lecture slides | P&L 2020 | 2928 (421%)
Lecture slides | MM 2019 | 672 (529%)
Lecture slides | MM 2020 | 1060 (396%)
Tutorial exercises | P&L 2019 | 2575 (376%)
Tutorial exercises | P&L 2020 | 3132 (451%)
Tutorial exercises | MM 2019 | 150 (118%)
Tutorial exercises | MM 2020 | 528 (197%)
Tutorial solutions | P&L 2019 | -
Tutorial solutions | P&L 2020 | 815 (117%)
Tutorial solutions | MM 2019 | -
Tutorial solutions | MM 2020 | 460 (172%)
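The percentages in Table 5 relate the absolute number of document accesses to the number of students enrolled in the learning platform (Table 1). The following is a minimal Python sketch of this calculation for the lecture-slide rows; the dictionaries are illustrative containers assumed here for readability and are not part of the original analysis.

```python
# Illustrative check of the percentages reported in Table 5:
# document accesses divided by the number of students enrolled
# in the learning platform (enrollment figures from Table 1).
enrolled = {"P&L 2019": 685, "P&L 2020": 695, "MM 2019": 127, "MM 2020": 268}
lecture_slide_accesses = {"P&L 2019": 3707, "P&L 2020": 2928, "MM 2019": 672, "MM 2020": 1060}

for module, accesses in lecture_slide_accesses.items():
    share = accesses / enrolled[module]
    print(f"{module}: {accesses} accesses ({share:.0%})")
    # e.g. "P&L 2019: 3707 accesses (541%)", matching Table 5
```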
Table 6. t-Test results regarding student performance in P&L (2019 vs. 2020).
Student Performance (Exam Points) | n | Mean | Sd | t | p
2019 (face-to-face course) | 376 | 59.37 | 16.61 | −5.83 | <0.001
2020 (digital course) | 293 | 52.4 | 13.54
Table 7. t-Test results regarding student performance in 2020 P&L (participation in Q&A sessions vs. no participation).
Student Performance (Exam Points) | n | Mean | Sd | t | p
Active participation in any digital Q&A session | 43 | 58.41 | 15.53 | 3.21 | <0.001
No participation | 250 | 51.37 | 12.89
Table 8. t-Test results regarding student performance in MM (2019 vs. 2020).
Student Performance (Exam Points) | n | Mean | Sd | t | p
2019 (face-to-face course) | 64 | 56.36 | 16.92 | 0.758 | 0.225
2020 (digital course) | 136 | 54.33 | 17.99
Table 9. t-Test results regarding student performance in 2020 MM (participation in Q&A sessions vs. no participation).
Student Performance (Exam Points) | n | Mean | Sd | t | p
Active participation in any digital Q&A session | 21 | 63.18 | 15.21 | 2.585 | 0.005
No participation | 115 | 51.56 | 19.52
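Tables 6–9 report independent two-sample t-tests on exam points. The sketch below shows how such a comparison can be reproduced from the reported summary statistics alone, using SciPy and the Table 6 values; treating the test as a pooled-variance (Student's) t-test is an assumption, since the tables do not state whether equal variances were assumed, but this form reproduces the reported magnitude of t.

```python
# Minimal sketch: two-sample t-test on exam points, recomputed from the
# summary statistics reported in Table 6 (P&L 2019 face-to-face vs. 2020 digital).
# Assumption: pooled-variance (Student's) t-test; equal variances are not
# stated in the article, but this choice reproduces |t| close to 5.83.
from scipy import stats

result = stats.ttest_ind_from_stats(
    mean1=59.37, std1=16.61, nobs1=376,  # 2019 cohort (face-to-face course)
    mean2=52.40, std2=13.54, nobs2=293,  # 2020 cohort (digital course)
    equal_var=True,
)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.2e}")
# t magnitude of about 5.83 (the sign depends on group order), p < 0.001, as in Table 6.
```

Substituting the summary statistics from Tables 7–9 into the same call reproduces the corresponding t values analogously.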
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
