Article

The Contribution of Learner Characteristics and Perceived Learning to Students’ Satisfaction and Academic Performance during COVID-19

1 Institute of Quality & Technology Management, University of the Punjab, Lahore 54590, Pakistan
2 Department of Innovation and Technology Management, College of Graduate Studies, Arabian Gulf University, Manama 293, Bahrain
3 Károly Ihrig Doctoral School of Management & Business, University of Debrecen, 4032 Debrecen, Hungary
4 Department of Industrial Engineering and Management, University of the Punjab, Lahore 54590, Pakistan
5 Psychology Department, Social Sciences Institute, Faculty of Health Sciences, University of Debrecen, 4032 Debrecen, Hungary
* Authors to whom correspondence should be addressed.
Sustainability 2023, 15(2), 1348; https://doi.org/10.3390/su15021348
Submission received: 26 November 2022 / Revised: 26 December 2022 / Accepted: 27 December 2022 / Published: 11 January 2023

Abstract:
With the rapid spread of COVID-19 worldwide, governments declared the closure of educational institutions to control its transmission. As a result, institutions were under pressure to offer online education so that students could continue their studies without interruption. This unplanned, hasty shift, of unknown duration, posed challenges at all pedagogical levels, especially for students stressed by the abrupt transition, whose academic performance declined as a result. It is therefore necessary to understand the approaches that might improve students’ involvement and performance in online learning. In this context, the current study used four models to understand the phenomenon: the Task Technology Fit model (TTF), the DeLone and McLean Model of Information Systems Success (DMISM), the Technology-to-Performance Chain model (TPC) and the Technology Acceptance Model (TAM). The data for this study were obtained from 404 university students from the top ten universities of Pakistan. The results, analyzed using structural equation modeling (SEM), show that learner characteristics positively predict performance through the mediating functions of user satisfaction and task technology fit. Learner characteristics were also observed to have a significant positive influence on students’ academic performance through the mediating functions of user satisfaction and actual usage of the system. Likewise, perceived learning moderated the relationship between learner characteristics and user satisfaction. This research provides policymakers with a framework that emphasizes how employing online learning technologies can strengthen the academic potential of students.

1. Introduction

On 11 March 2020, the World Health Organization declared COVID-19 a worldwide pandemic. According to UNESCO, 186 countries had enforced nationwide shutdowns of educational institutions by the end of April 2020, affecting 73.8% of all enrolled students [1]. Although the only way to control the spread of COVID-19 was to break the transmission chain by implementing lockdowns and maintaining social distancing, closing institutions affected many students. In February 2020, Pakistan also announced a national emergency and closed down the whole country after the situation deteriorated with rising numbers of COVID-19 cases in several cities.
Consequently, COVID-19 has been a stimulant for educational institutions worldwide to search for innovative measures in a comparatively short time span. To maintain educational activities, the majority of institutes had to switch to online learning. However, concerns about e-learning suitability, development, and efficacy persist, particularly in developing nations such as Pakistan, where technological barriers such as system compatibility and internet bandwidth accessibility pose substantial obstacles. In this research, we examine students’ perceptions of, preferences for, and inclination toward web-based learning by conducting a survey in private and public universities in Punjab, Pakistan. During this period, the majority of educational institutions moved online using Blackboard, Microsoft Teams, Zoom, Moodle, Skype, and numerous other technologies.
When it comes to learner motivation, contentment, and engagement, the online learning atmosphere is entirely different from conventional classroom settings [2]. One study [3] contended that there was no noteworthy difference between internet learning and in-person classroom sessions in terms of student satisfaction, and concluded that online classes could be just as beneficial as traditional classes if properly organized. The literature above indicates that, when properly designed and managed, online learning may be a viable alternative to traditional face-to-face classroom-based education [4].
Online learning necessitates continuous access to digital technologies. In a study conducted just before the outbreak of COVID-19, Ref. [5] observed a digital divide between urban and rural regions. In addition, students in remote regions often lack adequate access to information and communication technology and find it challenging to attend online learning sessions from their own homes [6]. As a result of these various challenges, students’ academic performance dropped considerably. Furthermore, the rapid transition from traditional face-to-face classroom learning to online learning created a slew of challenges for institutions, students, and instructors. The most immediate concerns were providing high-quality education, integrating the quality processes required for online learning, and adapting to new technologies. Before establishing the components that could improve student satisfaction and performance in online learning, we must first comprehend the basic principles of e-learning and its various factors.
E-learning is a system that uses the internet to deliver instruction to students via laptops, cellphones, desktop computers, tablets, and other devices. Many governments are working to advance technology in education systems [7] because of its benefits, which include saving time, facilitating mutual communication, intensifying learning performance, providing the most up-to-date and accurate information, reducing costs, encouraging flexible space options, and reducing the spatial and time-specific issues associated with physical learning [8,9,10]. On the basis of these benefits, it is evident that online learning has been beneficial to students, educators, and other staff members during the COVID-19 pandemic. Many researchers have worked to define theoretical ideas and construct models in the field of information systems in order to predict and analyze individuals’ behavior toward various technologies. Some of the important models observed in the information systems literature are the Theory of Reasoned Action (TRA) [11], Theory of Planned Behavior (TPB) [12], Technology Acceptance Model (TAM) [13], DeLone and McLean Model of Information Systems Success (DMISM) [14,15], Task Technology Fit model (TTF) [16] and Unified Theory of Acceptance and Use of Technology (UTAUT) model [17].
The variables of these models have therefore been drawn from numerous online learning studies to describe online learning and its different frameworks. Although researchers have attempted to investigate the relationship between online learning and various variables, some gaps remain in the literature: only a few studies have examined the relationship between learner characteristics and online learning performance impacts [18,19,20]. In addition, researchers have not identified the broader processes that may influence learners’ performance in online learning, and other gaps associated with the variables used in online learning frameworks have also been discovered. Moreover, disparate findings have been reported on the association between user satisfaction, actual system usage, and performance impact. Some studies have found no significant associations between user satisfaction, actual usage and performance [21,22], while others support this relationship [23,24,25]. This inconsistency invites further analysis of whether user satisfaction meaningfully affects actual system usage and performance. Furthermore, the role of mediators and moderators in online education models has rarely been discussed; for instance, the significance of human characteristics such as perceived learning or perceived usefulness as mediators and moderators has hardly been addressed. Consequently, several research questions were formulated after evaluating the problems of the online learning system during COVID-19 and the gaps found in the prior literature: (a) In what way can task technology fit (i.e., academic activities of students that are compliant with the relevant online learning system) affect learner characteristics and user satisfaction, contributing to high student performance during COVID-19? (b) How does actual system usage (i.e., the time period and frequency with which students utilize online learning systems) influence learner characteristics and user satisfaction, contributing to high student performance during COVID-19? (c) Can the perceived learning of online education enhance students’ satisfaction concerning learner characteristics?
This study was designed to examine a moderated mediation mechanism and to evaluate the integration of the task technology fit theoretical model, the technology-to-performance chain model, the technology acceptance model and the information system success model. The framework analyzes the effect of learner characteristics on performance impact via the serial mediating roles of user satisfaction and task technology fit, as well as via the serial mediating roles of user satisfaction and actual usage of the system. Likewise, perceived learning is used as a moderating variable for the link between learner characteristics and user satisfaction. This research focuses on the following objectives: (1) to check whether perceived learning moderates the impact of learner characteristics on user satisfaction; (2) to identify whether learner characteristics positively predict user satisfaction; (3) to determine whether user satisfaction mediates between learner characteristics and task technology fit; (4) to investigate whether learner characteristics positively predict performance impact via the serial mediating effect of user satisfaction and task technology fit; (5) to explore whether learner characteristics positively predict actual usage through the mediating variable of user satisfaction; and (6) to examine whether learner characteristics positively predict performance impact through the serial mediating effect of user satisfaction and actual usage. As a result, a research framework was established by combining the DeLone and McLean Model of Information Systems Success (DMISM), the Task Technology Fit model (TTF), the Technology-to-Performance Chain model (TPC) and the Technology Acceptance Model (TAM). The TTF model focuses on the task technology fit construct and its link to performance impact but ignores its correlation with the learner characteristics, user satisfaction, and actual usage constructs.
The TPC model suggests that individual characteristics, task characteristics and technology characteristics influence task technology fit, which in turn affects performance. The TTF constructs are overlooked by DMISM, which stresses overall quality, user satisfaction, actual usage, and performance impact components. Therefore, considering the above literature, six variables were chosen to establish an online learning model. In this model, learner characteristics is the independent variable; user satisfaction, actual usage, task technology fit, and performance impact are dependent variables. Perceived learning was used as a moderator of the relationship between learner characteristics and user satisfaction. Hence, learner characteristics were adopted from the TPC model, the user satisfaction and actual usage variables were taken from the DMISM model, and the task technology fit construct was taken from the TTF model, while the performance impact variable is shared by both models.
Learner characteristics are defined as an individual’s systematic strategy and the means by which learners process information, regarded as a measuring tool for learning [26]. In the context of this research, learner characteristics refer to the accumulation of expertise and learning procedures used by the learner to manage online learning activities proficiently and productively so as to elevate their satisfaction with the online courses. They comprise self-efficacy, motivation and self-regulated learning. Motivation refers to the inner strength that compels an individual to perform an act or head toward a particular goal [27]. In the context of this research, student motivation refers to the capability, productivity, and willingness to be involved and to learn in an online learning environment. This setting has no physical location; tutors and pupils are located in different places. It is usually part of a Learning Management System (LMS) that houses various information repositories for student engagement along with submission and evaluation interfaces.
As per one study [28], students’ motivation level is an important element in maintaining elevated satisfaction levels in the online learning environment. Self-regulated learning refers to learners’ potential to control the components or circumstances that have an impact on their online learning [29]. In the context of this research, self-regulated learning refers to the extent to which learners are capable of planning, observing, and determining their aims and progression in the course, and of applying the correct timing in the distance learning environment to accomplish the assigned tasks. Self-efficacy is defined as learners’ confidence in their potential and competence to accomplish a particular task or assignment [30]. In the present research, it refers to students’ recognition of their potential to achieve the learning tasks and assignments that are part of their online courses.
According to some [25], user satisfaction is the degree to which users find the system beneficial and are motivated to reuse it. In the context of online learning, user satisfaction refers to the extent to which students in online learning discern satisfaction in their independent decisions to rely on such services and how adequately these services satisfy their demands [31,32]. Task technology fit is the degree to which a specific system is regarded as significant or fit for facilitating the user to accomplish their tasks, depending on work specifications [33]. Actual usage is defined as the prevalence of technology utilization and the extent of its periodic usage [34,35,36,37,38]. According to [39,40], performance impact refers to the extent to which utilizing a particular system raises work quality by helping users execute their specific work quickly, granting greater control over the work, elevating job performance, removing errors and enhancing job capacity. Perceived learning is described as an individual’s perception that their knowledge and comprehension have increased [41]. It is the learner’s beliefs and perspectives about the learning events that have taken place. Some authors [42] described perceived learning as “changes in the learner’s perceptions of skill and knowledge levels before and after the learning experience.”

2. Literature Review and Hypotheses Development

2.1. Literature Review

When it comes to online learning and teaching, certain significant variables impact learning in general, while others are unique to the online learning environment because they considerably impact online learners’ satisfaction and performance. Learner characteristics are among the significant variables that impact students’ learning accomplishments, whether in traditional or online education. As more learners encounter online learning environments in public and private institutions, the need grows to recognize pedagogical environments and procedures that support online learning engagement and the assimilation of learner characteristics. One study [43] observed that 75% of learners and 72% of teachers lacked the expertise required to use ICT-dependent learning elements because of inadequate skills and understanding of computer- and internet-based programs, which can cause the unsuccessful implementation of e-learning. According to some [44], learners’ computer expertise and time management skills are significant in an online learning environment, and it is presumed that these components are consequential in online classes. In e-learning contexts, self-regulatory time management capabilities elicit improved performance, and learners’ ability to construct a real learning environment leads to effectiveness. Learners must pursue considerate support from fellow learners and instructors via emails, live chats and online meetings to attain maximum benefits [45]. Research by [46] revealed that family life, work life, inadequate time and study load compelled learners to withdraw from online courses. Learners’ attitudes toward online learning can contribute to its success; these attitudes influence behavioral intentions, which generally sustain perseverance in the online learning environment. According to [47], learners’ attitudes toward e-learning are among the success elements for the learning environment.
Many scholars have contributed to web-based learning content in the past few years. Ref. [48] stated that advanced education foundations and governments strive to introduce web-based learning throughout the world. Ref. [49] referenced their model as being critical to making e-learning more powerful and comprehending its success; their outcomes clarified that the grit of e-learning positively affected individual performance and satisfaction. Apart from these studies, Ref. [50] posited that the technology characteristics and task characteristics of massive open online courses positively predict task technology fit, and that perceived relatedness, perceived competence and social reputation substantially predict learners’ behavioral intention. Ref. [51] proposed that implementing e-government may be a great challenge because of low utilization in China. Another significant component of this research is perceived learning, which is a moderator between the learner characteristics and user satisfaction variables. According to [52], the most dominant factors of online courses affecting students’ satisfaction and perceived learning in the social sciences were course design and learning content. A research study by [53] observed that elements such as student motivation, classroom interaction, facilitation, instructor knowledge and course structure have a significant positive impact on learners’ perceived learning outcomes and satisfaction. Furthermore, Ref. [54] found that learner–content interaction was the most important predictor of learner satisfaction, while online learning self-efficacy was the most important predictor of perceived learning.

2.2. Hypotheses Development

2.2.1. Learner Characteristics, User Satisfaction, Task Technology Fit and Actual Usage

Various researchers have examined the role of learner characteristics in online learning [55,56,57,58]. In the context of this research, learner characteristics comprise self-efficacy, motivation and self-regulated learning. Moreover, user satisfaction is considered an important element in acquiring new technology and a significant component in IS practices [59,60]. Researchers have observed that learner characteristics notably affect user satisfaction in online learning [52,58,61]. Ref. [62] found that learner characteristics positively affect user satisfaction. The current study highlights that the stronger the learner characteristics of students in the online learning system, the higher their satisfaction in using online learning technologies, and the more these technologies will assist them in completing their online learning assignments [63,64]. Therefore, it is deduced that:
Hypothesis 1 (H1). 
Learner characteristics positively predict user satisfaction.
When examining technology utilization in organizations, task technology fit is regarded as a particularly important factor [65]. Several studies have investigated the correlation between task technology fit and user satisfaction, observing that the two constructs have a substantial direct relationship [20,40,65,66,67,68,69,70,71]. Thus, it is inferred in this study that user satisfaction positively affects task technology fit because the more satisfied students are with the quality of the technology used in online education, the more likely they are to judge the technology highly suitable for completing their online academic activities.
Hence it is proposed that:
Hypothesis 2 (H2). 
User satisfaction positively predicts task technology fit.
The extent of technology utilization by users is another important factor in technology-oriented studies. Much research has investigated the link between user satisfaction and actual system usage, determining that user satisfaction has a strong positive relationship with actual system usage [71]. Moreover, the average time spent using a technology rises with a higher level of user satisfaction [70]. The essential component of this proposition is user satisfaction, as actual use of the system will rise if the user is satisfied with it. The following hypothesis is therefore proposed:
Hypothesis 3 (H3). 
User satisfaction is a strong determinant of actual system usage.

2.2.2. Task Technology Fit, Actual Usage and Performance Impact

Numerous studies have analyzed the positive association between task technology fit and factors such as performance and user satisfaction [20,40,65,66,67,68,69,70]. According to [72], task technology fit plays the role of a mediator amid performance impact and technological factors. With the continuous progress of technology and the inclusion of numerous new systems, the primary focus is on the technological system’s use outcome in terms of user performance enhancement to evaluate the system’s efficiency and productivity [60,73,74].
The performance impact in this study relates to the extent to which online learning affects student performance in terms of resource preservation, proficiency, competence, and knowledge growth [75]. Many researchers have statistically examined the relationship between task technology fit and performance impact, revealing that task technology fit predicts performance impact favorably [20,24,40,65,67,68,69,70,76]. Indeed, task technology fit has been shown to improve students’ performance in terms of efficiency and productivity [77].
As a result, the following hypothesis is proposed:
Hypothesis 4 (H4). 
Task technology fit predicts performance impact favorably.
Another important component in the context of technology utilization is the relationship between actual system usage and performance impact [17,78,79,80], and several studies have attempted to close the gap by focusing on this relationship [40,81]. In a quantitative study, [71] determined that actual use of a system substantially influences performance. Similarly, information systems research has found that actual usage of the system improves performance [32,65,81,82,83,84,85]. This correlation indicates that the more frequently students use the online system to complete their academic work, the higher their academic achievement will be.
So, based on these facts, it is postulated that:
Hypothesis 5 (H5). 
Actual system use significantly predicts performance impact.

2.2.3. Mediating Role of User Satisfaction

According to the research mentioned above, learner characteristics predict user satisfaction [58,86], and user satisfaction has a strong positive influence on task technology fit [19]. As a consequence, it is proposed that learner characteristics impact task technology fit through user satisfaction. Furthermore, as evidenced by the literature cited above, user satisfaction increases the length of technological system use [87] and user satisfaction is significantly affected by learner characteristics [63,64]. Therefore, we consider the following hypotheses:
Hypothesis 6 (H6). 
Learner characteristics positively predict task technology fit through user satisfaction.
Hypothesis 7 (H7). 
Learner characteristics predict actual usage of the system through user satisfaction.

2.2.4. Sequential Mediations User Satisfaction, Task Technology Fit and Actual Usage

As noted above, as the level of learner characteristics increases, the satisfaction level of students also increases, which in turn significantly affects task technology fit in sequence. User satisfaction and task technology fit are the two most important factors in this hypothesis. There is also empirical evidence of a correlation between task technology fit and performance impact, revealing that task technology fit predicts performance impact considerably [24]. In fact, task technology fit improves students’ productivity and performance [88], and the technology meets users’ requirements when they are highly satisfied [89] as a consequence of a higher level of learner characteristics.
Therefore, it is hypothesized that:
Hypothesis 8 (H8). 
Learner characteristics predict performance impact positively through user satisfaction and task technology fit in the sequence.
It has been identified that a greater level of learner characteristics leads to increased student satisfaction, which in turn improves actual system utilization. The keys to this hypothesis are user satisfaction and actual usage of the system [75]. The literature also suggests that performance can be influenced by actual usage [65] and that actual utilization of the system is determined by user satisfaction [75], which in turn is considerably influenced by the learner characteristics of online learners [62]. Therefore, learner characteristics evidently have a significant impact on performance via user satisfaction and actual system usage.
So, it is hypothesized that,
Hypothesis 9 (H9). 
Learner characteristics significantly predict performance impact through user satisfaction and actual usage of the system in the sequence.
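For illustration, a serial indirect effect such as that in H8 and H9 is typically quantified as the product of the constituent path coefficients, with a percentile bootstrap confidence interval. The sketch below runs this on synthetic data, not the study’s data; simple bivariate OLS slopes stand in for the full SEM paths, and the assumed weights (0.5, 0.4, 0.3) are purely hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 404  # same sample size as the study, but fully synthetic data

# Hypothetical serial chain LC -> US -> AU -> PI with assumed path weights.
lc = rng.normal(size=n)                        # learner characteristics
us = 0.5 * lc + rng.normal(size=n)             # user satisfaction
au = 0.4 * us + rng.normal(size=n)             # actual usage
pi = 0.3 * au + 0.1 * lc + rng.normal(size=n)  # performance impact

def slope(x, y):
    """OLS slope of y on x (single predictor)."""
    x = x - x.mean()
    return float(x @ (y - y.mean()) / (x @ x))

# Serial indirect effect = product of the three path coefficients.
a, d, b = slope(lc, us), slope(us, au), slope(au, pi)
indirect = a * d * b
print(f"indirect effect a*d*b = {indirect:.3f}")

# Percentile bootstrap CI for the indirect effect.
boot = []
for _ in range(2000):
    i = rng.integers(0, n, n)  # resample respondents with replacement
    boot.append(slope(lc[i], us[i]) * slope(us[i], au[i]) * slope(au[i], pi[i]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")
```

A bootstrap interval that excludes zero would support the serial mediation; a full analysis would estimate the paths jointly in SEM rather than with bivariate slopes.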

2.2.5. The Moderating Role of Perceived Learning

Perceived learning is regarded as a significant measure of learning and is one of the fundamental components of course assessment [90]. Together, students’ perceived learning and satisfaction can depict a clearer picture of the success of online learning [91]. Ref. [92] reported a strong association between students’ overall perceived learning and their online learning satisfaction; a similar strong association was demonstrated by others [93,94]. Ref. [95] noted that an immediate outcome of a productive learning experience is a satisfied student, and observed that student-perceived learning is an acceptable predictor of student satisfaction in online learning. Ref. [58] observed that perceived learning added to student satisfaction and positively impacted it in the online environment. Therefore, it can be hypothesized that:
Hypothesis 10 (H10). 
Perceived learning moderates the relationship between learner characteristics and user satisfaction.
All these hypotheses have been depicted in Figure 1.
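A moderation hypothesis like H10 is commonly tested by adding a product term to the regression of user satisfaction on learner characteristics and perceived learning, then probing simple slopes at low and high moderator levels. The sketch below runs this on synthetic standardized data; all coefficients are illustrative assumptions, not the study’s estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 404  # synthetic standardized scores, not the study data

lc = rng.normal(size=n)  # learner characteristics
pl = rng.normal(size=n)  # perceived learning (the moderator)
# Assumed data-generating model with a positive LC x PL interaction:
us = 0.4 * lc + 0.3 * pl + 0.2 * lc * pl + rng.normal(size=n)

# Moderated regression: US ~ LC + PL + LC*PL
X = np.column_stack([np.ones(n), lc, pl, lc * pl])
beta, *_ = np.linalg.lstsq(X, us, rcond=None)
print(f"interaction coefficient = {beta[3]:.3f}")

# Simple-slopes probe: the LC -> US slope at 1 SD below/above the PL mean.
for level in (-1.0, 1.0):
    print(f"LC slope at PL = {level:+.0f} SD: {beta[1] + beta[3] * level:.3f}")
```

A significant positive interaction coefficient, with a steeper LC slope at high perceived learning, is the pattern H10 predicts.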

3. Methodology

3.1. Research Design

In the present study, the multivariate statistical technique of Structural Equation Modeling (SEM) was employed to determine the structural correlations and empirically examine the proposed hypotheses, using the Analysis of Moment Structures (AMOS®) 24 software. It comprises two elements: Confirmatory Factor Analysis (CFA), employed to estimate the measurement model between the observed and latent variables, and Path Analysis (PA), used to fit the structural model among the latent variables. This two-stage methodology ensures that the structural model employs only constructs with acceptable measurement properties. A goodness-of-fit index was estimated between the sample data and the theoretical model in SEM. To assess the fit of the measurement and structural models, three distinct measures were employed: the Goodness of Fit Index (GFI), the relative chi-square ratio over the degrees of freedom (χ2/df), and the Root Mean Square Error of Approximation (RMSEA).
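Of these measures, χ2/df and RMSEA can be computed directly from the model chi-square, its degrees of freedom, and the sample size (GFI additionally requires the fitted covariance matrices and is omitted here). A small sketch using the standard formulas, with illustrative inputs rather than the study’s actual statistics:

```python
import math

def fit_indices(chi2: float, df: int, n: int) -> dict:
    """Compute chi2/df and RMSEA from a chi-square test of model fit.

    chi2 : model chi-square statistic
    df   : model degrees of freedom
    n    : sample size
    """
    chi2_df = chi2 / df  # values below ~3 are commonly read as acceptable
    # RMSEA: excess chi-square per degree of freedom, scaled by N - 1
    rmsea = math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
    return {"chi2/df": round(chi2_df, 3), "RMSEA": round(rmsea, 3)}

# Illustrative values only (not this study's statistics):
print(fit_indices(chi2=820.0, df=400, n=404))
```

By convention, RMSEA below 0.08 together with χ2/df below 3 would indicate acceptable fit.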

3.2. Sample and Procedure

The study’s population consisted of Pakistani students from the top 10 public and private institutions in Punjab, according to HEC rankings. At the time of the study, the students were between the ages of 20 and 40. They were enrolled in bachelor’s, master’s, and doctoral programs, as well as additional diploma courses. The data were obtained via quota sampling because the target population lacked a sampling frame. Data were collected by handing out a self-administered questionnaire in physical copy, as well as by emailing surveys to students across universities and posting them on the official Facebook pages of the participating universities. The study was conducted from October 2021 to January 2022, with students taking online classes through platforms such as Microsoft Teams and Zoom. A total of 1000 questionnaires were distributed, 416 of which received responses; 404 were retained for further analysis, with 12 responses rejected owing to insufficient or incorrect data. As a result, the data collection yielded a usable response rate of 40.40%. The survey’s first part dealt with demographics, as shown in Table 1.
In addition, a check for Common Method Variance (CMV) was performed, as suggested by [96]. CMV can be caused by the complexity of the scale items, the respondents’ inability to consider the research subject, double-barreled items, the respondents’ inexperience in evaluating the research issue, the respondents’ low involvement in the topic, the placement of the scale items, the respondents’ disposition to provide extreme responses, and so on.
CMV can be addressed in two ways: through procedural remedies and through statistical techniques. Researchers apply procedural remedies in the early stages of questionnaire design to prevent CMV, while statistical techniques detect it after data collection. Harman’s single-factor test, also known as Harman’s one-factor test, is the most extensively used of the statistical techniques [97]. Using this method, all 40 items in the sample were subjected to a single exploratory factor analysis; the unrotated first factor accounted for only 23.5% of the total variance, well below the commonly cited 50% threshold. The results therefore indicate that the sample data show no evidence of CMV.
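Harman’s single-factor test amounts to checking how much of the total item variance the first unrotated factor captures. The sketch below, run on hypothetical survey responses (not the study’s data), approximates that first factor with the leading eigenvalue of the item correlation matrix:

```python
import numpy as np

def harman_single_factor(data: np.ndarray) -> float:
    """Share of total variance explained by the first unrotated factor,
    approximated via the leading eigenvalue of the item correlation matrix."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals = np.linalg.eigvalsh(corr)      # eigenvalues in ascending order
    return eigvals[-1] / eigvals.sum()      # eigenvalues sum to the item count

rng = np.random.default_rng(0)
responses = rng.integers(1, 8, size=(404, 40)).astype(float)  # 404 students, 40 Likert items
share = harman_single_factor(responses)
print(f"First factor explains {share:.1%} of total variance")
# CMV is usually flagged only when a single factor exceeds ~50%
```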

3.3. Measurement Scale

Previously developed measurement scales were employed to collect data for the present research, as exhibited in Appendix A. Each scale item was measured on a seven-point Likert scale (1—Strongly Disagree to 7—Strongly Agree). The scale for Learner Characteristics was adopted from [63,98]. It has 18 items and a reported Cronbach’s alpha of 0.965. A sample item is “In an online class, I prefer assignments and questions that really challenge me so that I can learn new things”. User Satisfaction has 3 items taken from [48], with an alpha value of 0.915. A sample item is “My decision to use online learning was a wise one”. Task Technology Fit was adopted from [75]. It has 3 items with a reported alpha value of 0.911; a sample item is “Online learning fits with the way I like to learn and study”. Actual Usage was acquired from [19]; it has 2 items, and the alpha value for this scale is 0.818. A sample item is “On average, how much time do you spend per week using online learning?” Performance Impact was adopted from [48]. This scale has 10 items and an alpha value of 0.959; a sample item is “Online learning helps me accomplish my tasks more quickly”. Perceived Learning was taken from [92]. This scale has 4 items and an alpha value of 0.956; a sample item is “Overall, the online course met my learning expectations”. The overall summary of all the items is reported in Table 2.

4. Data Analysis and Results

Analysis of Moment Structures (AMOS), a widely used statistical package, was employed for data analysis in this research. AMOS implements Structural Equation Modeling (SEM) through an easy-to-use graphical interface, lets researchers specify the model visually, and produces publication-quality path diagrams alongside the numerical estimates.

4.1. Descriptive Analysis

The measurements obtained for descriptive analysis of the model variables are displayed in Table 3. LC (Learner Characteristics) had a mean of 4.83 and a standard deviation of 1.42; US (User Satisfaction) had a mean of 4.75 and a standard deviation of 1.41. PL (Perceived Learning) had a mean of 4.81 and a standard deviation of 1.28, while TTF (Task Technology Fit) had a mean of 4.75 and a standard deviation of 1.41. The mean computed for AU (Actual Usage) was 4.67 with a standard deviation of 1.48. Finally, PI (Performance Impact) had a mean of 4.78 and a standard deviation of 1.78. The coefficient of variation (CV = Std Dev/Mean) is therefore not too high, and the data are not very scattered, which supports the consistency of the responses.
The estimated skewness values fell within the range of −1.0 to +1.0 and the kurtosis values were less than 10, indicating that the data were approximately normally distributed.
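The dispersion claim can be checked directly from the reported descriptives by computing each construct’s coefficient of variation (standard deviation divided by the mean):

```python
means = {"LC": 4.83, "US": 4.75, "PL": 4.81, "TTF": 4.75, "AU": 4.67, "PI": 4.78}
sds   = {"LC": 1.42, "US": 1.41, "PL": 1.28, "TTF": 1.41, "AU": 1.48, "PI": 1.78}

# CV = SD / mean; values well below 1 indicate modest relative dispersion
cv = {name: sds[name] / means[name] for name in means}
for name, value in cv.items():
    print(f"{name}: CV = {value:.3f}")
```

All six constructs come out between roughly 0.27 and 0.37, consistent with the claim that the responses are not widely scattered.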

4.2. Measurement Model

A measurement model illustrates the hypothesized links between latent variables and their respective indicators. The technique utilized to validate a conceptual measurement model is Confirmatory Factor Analysis (CFA). It denotes the interconnection between the observed variables, or indicators, and the unobserved (latent) variables. The measurement model was assessed in terms of construct reliability and validity. Therefore, to estimate the validity of the six measures shown in Figure 2, CFA was conducted by examining the factor composition of the variables.
SEM was used in this study because, for multivariate normal data, it yields efficient and precise parameter estimates. The maximum likelihood (ML) estimation method was employed; when every item loads on its associated factor, uni-dimensionality prevails for the constructs, which supports validity. The measurement model is exhibited in Figure 2; Table 4 shows the composite reliability estimates for scale reliability along with the CFA results.

4.2.1. Model Fit

Model fit in CFA describes how closely the observed data reproduce the associations specified in the hypothesized model. A model with a good fit is satisfactorily consistent with the data. Accordingly, the goodness of fit of the model to the data was assessed using several tests, and the goodness-of-fit indices obtained indicate whether the model can be accepted.
A Chi-squared (χ2) test was performed to assess the correspondence of the theoretical model with the empirical data. However, the raw Chi-square is not used to evaluate model fit on its own, as it is sensitive to sample size. Instead, the CMIN/DF ratio was computed: a ratio of less than two denotes a well-fitted model, a ratio of three to five is an acceptable fit, and a ratio greater than five represents an unacceptable value. The model observed in this study has a normed Chi-square (χ2/DF) value of 1179.531/542 = 2.176 (<3.00), which indicates a satisfactory fit. The Goodness-of-Fit Index (GFI ≤ 1) measures the proportion of variance accounted for by the estimated population covariance [3]. It is not connected with a null hypothesis, but can be generalized as 1 − νresidual/νtotal. A value of one indicates a perfect fit; however, GFI increases considerably as the sample size increases. GFI > 0.95 is regarded as a good fit, while values above 0.65 are considered acceptable. The GFI determined for the current research model is 0.860, which indicates an acceptable fit. RMSEA is a fundamental index in covariance structure modeling. An RMSEA value below 0.05 is assessed as a good fit, a value between 0.08 and 0.10 indicates a mediocre fit, and a value above 0.10 indicates a poor fit. In this study, RMSEA = 0.054 and standardized RMR = 0.0312, which supports the constructs’ uni-dimensionality.
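The normed Chi-square and RMSEA reported above can be reproduced from the raw χ², the degrees of freedom, and the sample size, using RMSEA in its common form, sqrt(max(χ² − df, 0) / (df · (N − 1))):

```python
import math

chi_square, df, n = 1179.531, 542, 404  # values reported for the measurement model

normed_chi_square = chi_square / df
rmsea = math.sqrt(max(chi_square - df, 0.0) / (df * (n - 1)))

print(f"chi2/df = {normed_chi_square:.3f}")  # 2.176, below the 3.0 cutoff
print(f"RMSEA   = {rmsea:.3f}")              # 0.054, close fit
```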

4.2.2. Reliability of the Variables

In the research context, reliability represents the degree to which the research methods produce stable and consistent results. Cronbach’s alpha ranges between zero and one; the greater its value, the greater the internal consistency. The Cronbach’s alpha values for the measures are listed in Appendix C and are all higher than the threshold of 0.70, which indicates adequate reliability for the measures employed in this study [99].
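Cronbach’s alpha can be computed directly from an item-score matrix; a minimal sketch on hypothetical data (the three items share a common latent score, so alpha comes out high):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_respondents, k_items) matrix of Likert scores."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_variances / total_variance)

# Hypothetical 3-item scale: each item tracks a common latent score plus noise
rng = np.random.default_rng(42)
latent = rng.normal(4.75, 1.4, size=404)
scale = np.column_stack([latent + rng.normal(0, 0.5, 404) for _ in range(3)])
print(f"alpha = {cronbach_alpha(scale):.3f}")  # high, since items share the latent score
```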

4.2.3. Construct Validity

Validity is specified as “the durability or suitability of a test or apparatus in evaluating what it is intended to measure” [100]. In this research, construct validity was established through face validity, convergent validity, and discriminant validity. The assessment items were procured from earlier studies, which supports face validity. Convergent validity is the magnitude to which a measure is positively associated with other measures of the same construct. The Average Variance Extracted (AVE) and indicator reliability were used to evaluate convergent validity.
Factor loadings were used to assess indicator reliability. A construct with high loadings indicates that the associated indicators have a great deal in common, as captured by the construct [101]. Factor loadings above 0.50 are considered highly significant [102]. All of the items were significant (p < 0.001), and all observed loadings were higher than the recommended value of 0.5, as exhibited in Appendix B, which indicates that the items used in the model fulfill the requirements. Moreover, all AVE values were above the proposed value of 0.50 [102]. Therefore, convergent validity was attained for all constructs, as presented in Table 4.
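AVE and composite reliability follow directly from the standardized loadings; a sketch with hypothetical loadings for a three-item construct:

```python
def average_variance_extracted(loadings):
    """AVE: mean squared standardized loading; >= 0.50 supports convergence."""
    return sum(l ** 2 for l in loadings) / len(loadings)

def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)."""
    squared_sum = sum(loadings) ** 2
    error = sum(1.0 - l ** 2 for l in loadings)
    return squared_sum / (squared_sum + error)

loadings = [0.82, 0.86, 0.79]  # hypothetical standardized loadings
print(f"AVE = {average_variance_extracted(loadings):.3f}")
print(f"CR  = {composite_reliability(loadings):.3f}")
```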
Furthermore, discriminant validity refers to the degree to which items distinguish across constructs, i.e., measure distinct concepts, and it was assessed with the help of three measures: cross-loadings, the Fornell-Larcker criterion, and the heterotrait–monotrait ratio (HTMT). Under the Fornell-Larcker method, the outputs displayed in Table 4 show the square roots of the AVEs on the diagonal, and these values are greater than the correlations between the constructs (the corresponding row and column values). This signifies that the constructs are more strongly related to their respective indicators than to the other constructs in the model [100,103], which shows adequate discriminant validity [104]. Moreover, the correlations between exogenous constructs are all below 0.85 [105]. Hence, discriminant validity is achieved for all the constructs in the model.
The Fornell-Larcker criterion, however, has been criticized by researchers. According to Henseler et al. (2015) [88], Fornell-Larcker often fails to detect the absence of discriminant validity in ordinary research contexts. A different technique, the heterotrait–monotrait ratio (HTMT) of correlations, was therefore recommended, based on the multitrait–multimethod matrix. In the current study, HTMT was used to estimate discriminant validity. If the HTMT value exceeds 0.90 (HTMT0.90) or, more conservatively, 0.85 (HTMT0.85), discriminant validity is problematic. As exhibited in Table 5, all the estimated values were smaller than the conservative threshold of 0.85, which signifies that discriminant validity is achieved.
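The HTMT ratio compares the average between-construct (heterotrait) item correlations to the geometric mean of the average within-construct (monotrait) correlations. A numpy sketch on simulated item blocks, not the study’s data:

```python
import numpy as np

def htmt(block_a: np.ndarray, block_b: np.ndarray) -> float:
    """HTMT ratio for two constructs, each given as an (n, k) item matrix."""
    ka, kb = block_a.shape[1], block_b.shape[1]
    corr = np.corrcoef(np.hstack([block_a, block_b]), rowvar=False)
    hetero = corr[:ka, ka:].mean()                           # between-construct items
    mono_a = corr[:ka, :ka][np.triu_indices(ka, 1)].mean()   # within construct A
    mono_b = corr[ka:, ka:][np.triu_indices(kb, 1)].mean()   # within construct B
    return hetero / np.sqrt(mono_a * mono_b)

# Simulated data: two modestly correlated latent factors, three items each
rng = np.random.default_rng(7)
f = rng.normal(size=(404, 1))
g = 0.3 * f + rng.normal(size=(404, 1))
block_a = f + 0.5 * rng.normal(size=(404, 3))
block_b = g + 0.5 * rng.normal(size=(404, 3))
ratio = htmt(block_a, block_b)
print(f"HTMT = {ratio:.3f}")  # below the 0.85 threshold -> discriminant validity
```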

4.3. Structural Model Assessment

In SEM analysis, the structural model is the second essential step. It may be examined once the measurement model has been validated. The structural model depicts the relationships between the variables, illustrating the connectivity between constructs and detailing the correlations between the exogenous variables and the key endogenous variables. The structural model’s results allow us to assess how effectively the theory is supported by empirical data and help us determine whether the theory is empirically validated [102]. The goodness-of-fit values calculated for the structural model were compared to those of the CFA measurement model. In the structural model, χ2/df = 2.741, CFI = 0.945, and RMSEA = 0.066. These fit indices verify an adequate fit between the conceptual model and the observed data.

4.4. Path Analysis and Hypothesis Testing

Path analysis was executed to determine both the direct and the indirect influence of the exogenous variable. The path diagram in Figure 3 depicts the conceptual associations among the constructs, formulated on the basis of prior studies. LC represents the exogenous variable, while US, AU, TTF and PI represent endogenous variables.
The bootstrapping methodology was employed to evaluate the indirect effects in the structural model by computing the beta (β) values, R2, and the relevant t-values. The p-value was used to test whether each effect exists [106].
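Bootstrapping an indirect effect a × b can be sketched with plain numpy on simulated data (all coefficients below are illustrative, not the study’s estimates): regress the mediator on the predictor for the a path, the outcome on the mediator and predictor for the b path, then resample with replacement:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 404
x = rng.normal(size=n)                          # predictor (e.g., LC)
m = 0.4 * x + rng.normal(size=n)                # mediator (a path)
y = 0.5 * m + 0.25 * x + rng.normal(size=n)     # outcome (b and c' paths)

def indirect_effect(idx: np.ndarray) -> float:
    a = np.linalg.lstsq(np.column_stack([np.ones(idx.size), x[idx]]),
                        m[idx], rcond=None)[0][1]
    b = np.linalg.lstsq(np.column_stack([np.ones(idx.size), m[idx], x[idx]]),
                        y[idx], rcond=None)[0][1]
    return a * b

draws = np.array([indirect_effect(rng.integers(0, n, n)) for _ in range(1000)])
low, high = np.percentile(draws, [2.5, 97.5])
print(f"a*b 95% CI: [{low:.3f}, {high:.3f}]")  # mediation supported if the CI excludes 0
```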
The estimation of the structural model yields the hypothesis tests exhibited in Figure 3 and Table 6. The test results supported the direct-effect hypotheses formulated for this research. Learner characteristics positively predict user satisfaction, so H1 is accepted (β = 0.239, p < 0.05). Likewise, user satisfaction positively predicts task technology fit; hence, H2 is accepted (β = 0.715, p < 0.05), as exhibited in Table 6. User satisfaction also positively predicts actual system usage (β = 0.373, p < 0.05), as exhibited in Table 6; therefore, H3 is supported. Since task technology fit was found to influence performance impact significantly, H4 was also supported (β = 0.262, p < 0.05). According to H5, actual usage of the system positively predicts performance impact, and the test outcome supports this, as can be observed in Table 6; so, H5 is also accepted (β = 0.201, p < 0.05).
The Variance Accounted For (VAF) value was utilized to determine the strength of the mediating effects. Mediation is considered complete if the VAF value is greater than 80%, partial if the value lies between 20% and 80%, and absent when the value is less than 20% [102]. The results presented in Table 7 depict partial mediation effects in the model. As stated in H6, learner characteristics positively predict task technology fit via the mediating effect of user satisfaction, with an indirect effect (a × b) of β = 0.171 and a direct effect (c) of β = 0.278, denoting partial mediation. The same mediation test was carried out for H7; the indirect effect (a × b) was β = 0.089 and the direct effect (c) was β = 0.250, again indicating partial mediation. So, H7 is supported, and it can be inferred that learner characteristics predict the actual usage of the system via user satisfaction as a partial mediator.
Similar tests were performed for H8. Learner characteristics were found to predict performance impact via the serial partial mediation of user satisfaction and task technology fit, with an indirect effect (a × b) of β = 0.045 and a direct effect (c) of β = 0.16. For H9, the same mediation tests yielded an indirect effect (a × b) of β = 0.018 and a direct effect (c) of β = 0.012. Thus, H9 is supported: learner characteristics positively predict performance impact via user satisfaction and the actual usage of the system, which act as serial partial mediators.
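The VAF classification applied above can be checked against the reported coefficients, since VAF = indirect / (indirect + direct); a short sketch:

```python
def vaf(indirect: float, direct: float) -> float:
    """Variance Accounted For: share of the total effect that is mediated."""
    return indirect / (indirect + direct)

def classify(v: float) -> str:
    if v > 0.80:
        return "full mediation"
    if v >= 0.20:
        return "partial mediation"
    return "no mediation"

# (indirect a*b, direct c) pairs as reported for H6-H9
reported = {"H6": (0.171, 0.278), "H7": (0.089, 0.250),
            "H8": (0.045, 0.160), "H9": (0.018, 0.012)}
for h, (ab, c) in reported.items():
    v = vaf(ab, c)
    print(f"{h}: VAF = {v:.2f} -> {classify(v)}")
```

Every reported pair lands between 20% and 80%, matching the paper’s conclusion of partial mediation throughout.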
To test H10, which states that perceived learning moderates the relationship between learner characteristics and user satisfaction, the Hayes PROCESS macro was utilized [107]. First, the direct effect of learner characteristics on user satisfaction was evaluated; the outcome depicted a significant influence of learner characteristics on user satisfaction (β = 0.0977; t = 2.1988; p < 0.05). Likewise, the direct influence of the moderator, perceived learning, on user satisfaction was evaluated, and the output denoted a significant influence of perceived learning on user satisfaction (β = 0.4943; t = 10.4005; p < 0.001). Finally, the interaction effect of learner characteristics and perceived learning on user satisfaction was estimated, and the results denoted a significant interaction (β = 0.1553; t = 4.5204; p < 0.001). Since the interaction term is significant, the moderation effect holds in our framework. Therefore, the moderation effect of perceived learning on the relationship between learner characteristics and user satisfaction, as presented in Table 8, was notable. Hence, H10 was statistically supported.
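This moderation test reduces to an OLS regression with an interaction term. A numpy sketch on simulated, mean-centered data, with coefficients loosely echoing the reported magnitudes (for illustration only):

```python
import numpy as np

rng = np.random.default_rng(11)
n = 404
lc = rng.normal(size=n)     # learner characteristics, mean-centered
pl = rng.normal(size=n)     # perceived learning, mean-centered
# simulate user satisfaction with a genuine interaction effect
us = 0.10 * lc + 0.49 * pl + 0.16 * lc * pl + rng.normal(scale=0.8, size=n)

# design matrix: intercept, main effects, and the LC x PL interaction
X = np.column_stack([np.ones(n), lc, pl, lc * pl])
coef, *_ = np.linalg.lstsq(X, us, rcond=None)
print(f"interaction coefficient = {coef[3]:.3f}")  # nonzero -> moderation present
```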

5. Discussion

A model was developed in this study to determine the associations between learner characteristics, user satisfaction, task technology fit, actual usage, perceived learning and performance impact in top public and private universities in Pakistan, based on the integration of the DeLone and McLean Model of Information Systems Success (DMISM), the Task Technology Fit model (TTF), the Technology-to-Performance Chain model (TPC) and the Technology Acceptance Model (TAM). According to this study, learner characteristics have a favorable impact on user satisfaction. This suggests that effective technology with the necessary characteristics, positive learner attitudes, and the possibility of obtaining online learning with self-direction all contributed to students’ greater satisfaction. As a result, students will feel more confident that they are making the right choice by relying on online education. This conclusion is also supported by prior work, such as [52,61,63].
Similarly, user satisfaction was found to have a substantial impact on task technology fit, suggesting that user satisfaction is a key component in determining whether a new technology succeeds or fails. A prior study supports this conclusion [9]. This study also shows that learner characteristics influence task technology fit through the mediating effect of user satisfaction. According to this study, students with a higher degree of learner characteristics who use online education technology are more satisfied with the services offered by the technology and find it ideal for meeting their needs [57,108]. The empirical test of the link between task technology fit and performance impact revealed that task technology fit positively predicts performance impact, which is consistent with the findings of other research [65,66,67,68,69,109]. This study also suggests that learner characteristics positively predict performance impact via the serial mediation of user satisfaction and TTF. According to this study, if the level of learner characteristics in the online education system is high, the students will be highly content with the online education system’s services in meeting their expectations. As a result, students will find the technology appropriate for completing their tasks, and coursework productivity and academic achievement will improve. Furthermore, because of task technology fit, students will perform more efficiently and effectively. In addition, the link between user satisfaction and actual usage was examined, and it was discovered that user satisfaction favorably predicts actual system usage. This conclusion is supported by prior research findings [81,110]. The research explains that learner characteristics, through user satisfaction, have an indirect influence on a student’s actual use of technology.
This implies that the better the degree of student learner characteristics in the online learning system, the more satisfied the students will be, and they will eventually increase the frequency and duration of their online learning usage.
This research also supports the hypothesis that actual system usage predicts students’ performance impact favorably. Few studies in the literature have highlighted the interrelationship between actual system usage and performance impact, such as [71]. They found that actual system usage significantly impacts individual performance because users use the system to complete tasks, which improves their performance. In addition, numerous studies on IS have found that actual system usage has a significant positive impact on user performance [21,71,81,111]. This study confirmed that learner characteristics predict performance impact through the mediating function of user satisfaction and actual system usage. It states that the greater the level of learner characteristics in an online learning system, the more satisfied the students will be; therefore, the use of an online learning system will rise. This implies they will spend more time using online learning systems, improving their academic performance and coursework productivity. As a result, this technique adapts to how students learn and is deemed vital in their academic activities.
Perceived learning was also examined as a moderator of the association between learner characteristics and user satisfaction in this study. According to [64], a satisfied student indicates a successful learning experience. Perceived student learning is a strong determinant of student satisfaction in online learning. This implies that if students believe that the online learning system meets their needs and is beneficial for completing their academic responsibilities, they will be extremely satisfied. As a result, the higher the perceived learning, the higher the level of learner satisfaction. Numerous studies have shown that user satisfaction is favorably influenced by perceived learning [58,91].

6. Theoretical and Practical Implications

This research study offers a wide range of theoretical implications. First and foremost, it adds to the body of knowledge by examining the moderating effect of perceived learning on the link between learner characteristics and user satisfaction. In addition, this study adds to the literature by examining serial mediation from learner characteristics to performance impact via user satisfaction and task technology fit, as well as by evaluating the influence of learner characteristics on performance impact via the mediating effects of user satisfaction and actual usage. Furthermore, from a practical standpoint, this subject occupies a significant place due to its various applications. To begin with, e-learning may fundamentally boost learning through productive time utilization; studying at one’s leisure increases educational attainment while using minimal resources and reducing spatial barriers. Because of the COVID-19 pandemic, most educational institutions worldwide now provide online education as a preventive strategy; thus, this research benefits both institutions and students.
Second, this research aimed to provide policymakers in educational organizations with a profound framework that emphasizes how employing online learning technologies can strengthen students’ academic potential. Governments worldwide are striving to use online education on a large scale to ensure that students are provided with productive learning in the critical situation of the pandemic. According to the findings of the proposed framework, students’ academic performance in online education can be optimized if the learner characteristics of students, user satisfaction, task technology fit, actual system usage, and perceived learning are properly organized and adapted. Third, the focus of this research was to help students acquire information, enhance educational performance, and create constructive and dynamic expertise, all of which will reduce their stress levels when pursuing online education during the COVID-19 pandemic.
Despite the fact that Pakistan is a developing country, it may fully leverage the benefits of online education so that, despite a lack of resources, it can provide high-quality education and learning throughout the country. Many countries worldwide have provided their students with modern technological equipment and reduced the costs of internet service providers to significantly enhance the availability of online education in their respective countries. Pakistan may also benefit from this action plan by implementing online learning across the country.

7. Conclusions

The current study examined students’ perspectives on online learning. It has highlighted elements that can assist students in enhancing their academic performance by adopting the most appropriate technology used in online learning. To deal with the problem, this study presented a consolidated model combining the DeLone and McLean Model of Information Systems Success (DMISM), the Task Technology Fit model (TTF), the Technology-to-Performance Chain model (TPC), and the Technology Acceptance Model (TAM). Learner characteristics, task technology fit, user satisfaction, perceived learning, actual system use, and performance impact were fundamental variables in the hypothesized framework.
According to the results of several assessments, the proposed methodology proved effective in revealing the impact of online learning on students’ academic progress. User satisfaction is equally important in assessing the task technology fit and the practical application of online learning. It also strengthens the relationship between learner characteristics, user satisfaction, and actual use. In addition, task technology fit is essential in evaluating academic performance and improving the link between user satisfaction and academic achievement. Perceived learning can also be used to gauge user satisfaction. The test results clearly supported the hypothesized relationships between the framework’s components, and the findings are consistent with previous research on the topic. Educational professionals and policymakers should highlight these traits in order to boost the probability of improved performance. Finally, the findings of this study will significantly aid the Pakistani government’s higher education policy. They will also help in creating arrangements compatible with student activities, social values, and lifestyles, allowing students to use online learning to improve their academic achievement and, as a result, their work reliability.
The findings of this research will aid university policymakers in improving faculty and student knowledge and comprehension of the online learning system by conducting training programs on its usage. The necessary technological expertise for maintaining the online learning system should always be accessible. The management must ensure that the established online learning system is user-friendly and simple to use. In addition, the university administration is responsible for providing the necessary software, hardware and internet connectivity. If the necessary technical resources are updated on a regular basis, instructors and students will be able to effectively use online learning [112,113].
In addition, the framework established in this study will make it easier for students, teachers, and other administrative personnel to employ new technology to solve their issues. Several governments across the globe have successfully promoted educational achievements by offering modern technology equipment to pupils [114]. Pakistan can benefit from this strategy as well. Although Pakistan is a developing country with limited resources, it may nevertheless use the benefits of online learning to deliver high-quality education across the country.

8. Limitations and Future Research Directions

Several limitations were identified in this study, which suggest directions for future research. Because the data were collected from students at Punjab-based institutions, researchers wishing to conduct similar research should collect data from universities in other provinces of Pakistan so that the findings may be generalized. A future study might be conducted on a larger scale by comparing the online education system used in Pakistani universities with those of universities in other nations. Furthermore, this study only collected cross-sectional data; future studies should include longitudinal data. To make the current framework more comprehensive, researchers should conduct further studies to investigate the consistency of the results. Researchers should also explore alternative moderators in future investigations, in addition to perceived learning, which was used as the moderator in this study. Perceived usefulness and perceived ease of use may be utilized as moderators in this context, and further explanations related to this moderation effect can be learned [23]. Furthermore, numerous studies have demonstrated that the human factor plays a vital role in persuading students to seek online education; in particular, a research study by [8] advocates that transformational leadership can be explored as a moderator in order to observe the associations in online learning frameworks.
Additionally, various paths within the scope of the current study might be extrapolated to produce new scenarios. Actual usage and user satisfaction may be interchanged in future studies. Moreover, while this study is constrained to the education sector, future research might integrate the framework into other industries to evaluate it further. Furthermore, learner characteristics in an online learning system were found to influence student satisfaction and performance impact in this study, but other factors may also have a significant impact on both. As a result, future research should concentrate on the effects of other elements, such as institutional factors, the role of the instructor, and course material design. Similarly, the online education system has been evaluated from the standpoint of students, but future studies should also examine the perspectives of institutions’ administrative and academic personnel [115].

Author Contributions

Conceptualization, S.B. and A.M.; Data curation, S.B.; Formal analysis, A.M.; Investigation, A.M. and S.A.M.; Methodology, S.H.; Project administration, S.S. and E.M.; Supervision, A.M. and S.S.; Validation, S.H., S.A.M. and E.M.; Writing—original draft, S.B.; Writing—review & editing, S.S., S.H., S.A.M. and E.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

The studies involving human participants were reviewed and approved by the Ethical Review Committee of the University of the Punjab, Lahore, Pakistan, Ref:PY-ERC/2021-007.

Informed Consent Statement

The participants gave their written informed consent to take part in this study.

Data Availability Statement

The data that support the findings of this study are available from the corresponding author, A.M., upon reasonable request.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A. Research Questionnaire

QUESTIONNAIRE
Dear Respondent!
This study is being conducted by the Institute of Quality & Technology Management, University of the Punjab, Lahore, and aims to investigate “THE CONTRIBUTION OF LEARNER CHARACTERISTICS AND PERCEIVED LEARNING TO STUDENTS’ SATISFACTION AND ACADEMIC PERFORMANCE DURING COVID-19”. Your participation will be greatly appreciated and all data will be kept strictly confidential. Should any question cause you discomfort, we sincerely apologize for the inconvenience.
Table A1. The Questionnaire.
Gender: Male / Female
Age: 20 or less / 21–30 / 31–40 / 41–50 / 51–60 / 61 & above
University Name:
Education Level: Intermediate / Bachelors / Masters / M.Phil / Ph.D. / Others
Record your responses on the scale given below:
1 = Strongly Disagree; 2 = Disagree; 3 = Somewhat Disagree; 4 = Neither Agree nor Disagree; 5 = Somewhat Agree; 6 = Agree; 7 = Strongly Agree
(All items are rated on the 7-point scale shown above.)
Learner Characteristics
1. The most satisfying thing for me in online courses is trying to understand the content as thoroughly as possible.
2. When I study for online courses, I go through the readings and my class notes and try to find the most important ideas.
3. Considering the difficulty of online courses, the teacher, and my skill, I think I will do well in online classes.
4. I think the course material in the class is useful for me to learn.
5. When reading for the course, I make up questions to help focus my reading.
6. I try to change the way I study in order to fit the course requirements and the instructor’s teaching style.
7. When I study for online courses, I set goals for myself in order to direct my activities in each study period.
8. In an online class, I prefer assignments and questions that really challenge me so that I can learn new things.
9. I want to do well in the online class because it is important to show my ability to my family and friends.
10. I like to be one of the most recognized students in online class.
11. I do not give up easily when confronted with technology-related obstacles (e.g., Internet connection issues, difficulty with downloads, difficulty locating information, being unable to contact the instructor immediately, etc.).
12. I am comfortable working in alternative learning environments.
13. I am good at completing tasks independently.
14. I organize my time to complete course requirements in a timely manner.
15. I achieve goals I set for myself.
16. I am able to express my opinion in writing so that others understand what I mean.
17. I regulate and adjust my behavior to complete course requirements.
18. I give constructive and proactive feedback to others even when I disagree.
Perceived Learning
19. Overall, the online course met my learning expectations.
20. I have learned as much from this online class as I might have from a face-to-face version of the course.
21. I learned new things and added new information to my knowledge.
22. The quality of the learning experience in online classes is on par with that of face-to-face classes.
User Satisfaction
23. My decision to use online learning was a wise one.
24. Online learning has met my expectations.
25. Overall, I am satisfied with online learning.
Actual Usage
26. On average, how frequently do you use online learning?
27. On average, how much time do you spend per week using online learning?
Task Technology Fit
28. Online learning fits with the way I like to learn and study.
29. Online learning is suitable for helping me complete my academic assignments.
30. Online learning is necessary for my academic tasks.
Performance Impact
31. Online learning helps me to accomplish my tasks more quickly.
32. Online learning makes it easier to complete my tasks.
33. Online learning saves me money.
34. Online learning improves my learning performance.
35. Online learning enhances my academic effectiveness.
36. Online learning helps me review and eliminate errors in my work tasks.
37. Online learning helps me to realize my future targets.
38. Online learning helps me acquire new knowledge.
39. Online learning helps me acquire new skills.
40. Online learning helps me to come up with innovative ideas.

Appendix B

Table A2. Scale validity and reliability.
Path | Estimate | S.E. | C.R. | p
PI1 <--- PI | 1.000 | — | — | —
PI2 <--- PI | 0.966 | 0.032 | 30.341 | ***
PI3 <--- PI | 1.002 | 0.029 | 34.091 | ***
PI4 <--- PI | 1.009 | 0.028 | 36.539 | ***
PI5 <--- PI | 0.958 | 0.028 | 34.728 | ***
PI7 <--- PI | 0.982 | 0.029 | 34.440 | ***
PI8 <--- PI | 0.952 | 0.029 | 32.972 | ***
PI9 <--- PI | 0.950 | 0.031 | 30.489 | ***
PI10 <--- PI | 0.952 | 0.031 | 30.254 | ***
LC1 <--- LC | 1.000 | — | — | —
LC2 <--- LC | 1.210 | 0.067 | 18.027 | ***
LC3 <--- LC | 1.149 | 0.065 | 17.448 | ***
LC5 <--- LC | 1.077 | 0.065 | 16.691 | ***
LC6 <--- LC | 0.986 | 0.059 | 16.806 | ***
LC7 <--- LC | 1.058 | 0.065 | 16.369 | ***
LC8 <--- LC | 0.994 | 0.061 | 16.207 | ***
LC10 <--- LC | 1.075 | 0.056 | 19.347 | ***
LC11 <--- LC | 0.988 | 0.061 | 16.072 | ***
LC18 <--- LC | 1.028 | 0.064 | 15.998 | ***
LC13 <--- LC | 1.007 | 0.063 | 16.046 | ***
LC14 <--- LC | 1.167 | 0.065 | 17.998 | ***
LC15 <--- LC | 1.110 | 0.065 | 17.110 | ***
LC17 <--- LC | 1.202 | 0.067 | 18.044 | ***
PL1 <--- PL | 1.000 | — | — | —
PL2 <--- PL | 1.158 | 0.042 | 27.516 | ***
PL3 <--- PL | 1.067 | 0.035 | 30.314 | ***
PL4 <--- PL | 1.205 | 0.042 | 28.359 | ***
TTF1 <--- TTF | 1.000 | — | — | —
TTF2 <--- TTF | 0.916 | 0.028 | 32.501 | ***
TTF3 <--- TTF | 0.965 | 0.025 | 37.912 | ***
US1 <--- US | 1.000 | — | — | —
US2 <--- US | 0.979 | 0.046 | 21.306 | ***
US3 <--- US | 1.072 | 0.047 | 22.841 | ***
AU1 <--- AU | 1.000 | — | — | —
AU2 <--- AU | 0.798 | 0.086 | 9.294 | ***
Note: *** p-value < 0.001.
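The *** flags in Table A2 follow from the critical ratio (C.R.), which is the unstandardized estimate divided by its standard error and is compared against a standard normal distribution. As an illustrative check only, the following minimal Python sketch recomputes the C.R. for the PI2 row from the rounded values printed in the table (so it differs slightly from the reported 30.341, which the software computes from unrounded estimates); indicators fixed at 1.000 (PI1, LC1, etc.) scale their latent variable, so no S.E., C.R., or p is estimated for them:

```python
from math import erf, sqrt

def critical_ratio(estimate, se):
    """C.R. in CFA output: unstandardized loading estimate / standard error."""
    return estimate / se

def two_tailed_p(z):
    """Two-tailed p-value for a standard-normal test statistic."""
    return 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(z) / sqrt(2.0))))

# PI2 <--- PI row of Table A2: estimate = 0.966, S.E. = 0.032
cr = critical_ratio(0.966, 0.032)   # about 30.19 from the rounded inputs
assert two_tailed_p(cr) < 0.001     # consistent with the *** (p < 0.001) flag
```

Any |C.R.| above roughly 3.29 already implies p < 0.001 under this normal approximation, so every freely estimated loading in Table A2 is significant by a wide margin.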

Appendix C

Item-Total Statistics
Item | Scale Mean if Item Deleted | Scale Variance if Item Deleted | Corrected Item-Total Correlation | Squared Multiple Correlation | Cronbach’s Alpha if Item Deleted
The most satisfying thing for me in online courses is trying to understand the content as thoroughly as possible | 201.33 | 927.569 | 0.310 | 0.383 | 0.917
When I study for online courses, I go through the readings and my class notes and try to find the most important ideas | 201.58 | 929.224 | 0.246 | 0.459 | 0.918
Considering the difficulty of online courses, the teacher, and my skill, I think I will do well in online classes | 201.20 | 939.990 | 0.185 | 0.283 | 0.918
I think the course material in the class is useful for me to learn | 201.42 | 930.270 | 0.317 | 0.353 | 0.917
When reading for the course, I make up questions to help focus my reading | 201.43 | 928.869 | 0.316 | 0.500 | 0.917
I try to change the way I study in order to fit the course requirements and the instructor’s teaching style | 201.45 | 925.977 | 0.332 | 0.391 | 0.917
When I study for online courses, I set goals for myself in order to direct my activities in each study period | 201.47 | 927.179 | 0.286 | 0.393 | 0.917
In an online class, I prefer assignments and questions that really challenge me so that I can learn new things | 201.39 | 925.479 | 0.319 | 0.368 | 0.917
I want to do well in the online class because it is important to show my ability to my family and friends | 201.47 | 924.506 | 0.331 | 0.473 | 0.917
I like to be one of the most recognized students in online class | 201.17 | 929.591 | 0.312 | 0.398 | 0.917
I do not give up easily when confronted with technology-related obstacles (e.g., Internet connection issues, difficulty with downloads, difficulty locating information, being unable to contact the instructor immediately, etc.) | 201.42 | 924.259 | 0.308 | 0.412 | 0.917
I am comfortable working in alternative learning environments | 201.46 | 927.214 | 0.279 | 0.449 | 0.917
I am good at completing tasks independently | 201.23 | 925.968 | 0.323 | 0.355 | 0.917
I organize my time to complete course requirements in a timely manner | 201.39 | 931.716 | 0.249 | 0.384 | 0.918
I achieve goals I set for myself | 201.26 | 927.677 | 0.311 | 0.474 | 0.917
I am able to express my opinion in writing so that others understand what I mean | 201.24 | 931.618 | 0.269 | 0.401 | 0.917
I regulate and adjust my behavior to complete course requirements | 201.26 | 926.264 | 0.333 | 0.524 | 0.917
I give constructive and proactive feedback to others even when I disagree | 201.22 | 930.014 | 0.309 | 0.465 | 0.917
My decision to use online learning was a wise one | 202.15 | 885.592 | 0.647 | 0.752 | 0.913
Online learning has met my expectations | 202.32 | 879.998 | 0.651 | 0.778 | 0.913
Overall, I am satisfied with online learning | 202.23 | 878.823 | 0.656 | 0.831 | 0.913
Online learning fits with the way I like to learn and study | 202.29 | 879.141 | 0.662 | 0.805 | 0.913
Online learning is suitable for helping me complete my academic assignments | 201.87 | 884.268 | 0.686 | 0.738 | 0.913
Online learning is necessary for my academic tasks | 202.04 | 884.144 | 0.669 | 0.734 | 0.913
Online learning helps me to accomplish my tasks more quickly | 201.94 | 888.052 | 0.649 | 0.788 | 0.913
Online learning makes it easier to complete my tasks | 201.77 | 889.967 | 0.659 | 0.784 | 0.913
Online learning saves me money | 201.54 | 905.973 | 0.488 | 0.460 | 0.915
Online learning improves my learning performance | 202.23 | 877.448 | 0.676 | 0.818 | 0.912
Online learning enhances my academic effectiveness | 202.23 | 883.144 | 0.655 | 0.810 | 0.913
Online learning helps me review and eliminate errors in my work tasks | 201.98 | 888.216 | 0.641 | 0.720 | 0.913
Online learning helps me to realize my future targets | 202.23 | 888.933 | 0.612 | 0.735 | 0.913
Online learning helps me acquire new knowledge | 201.83 | 885.078 | 0.661 | 0.779 | 0.913
Online learning helps me acquire new skills | 201.82 | 889.493 | 0.638 | 0.709 | 0.913
Online learning helps me to come up with innovative ideas | 201.96 | 887.998 | 0.640 | 0.736 | 0.913
Overall, the online course met my learning expectations | 201.29 | 936.265 | 0.197 | 0.448 | 0.918
I have learned as much from this online class as I might have from a face-to-face version of the course | 201.33 | 924.910 | 0.336 | 0.493 | 0.917
I learned new things and added new information to my knowledge | 201.59 | 930.891 | 0.225 | 0.535 | 0.918
The quality of the learning experience in online classes is on par with that of face-to-face classes | 201.57 | 913.939 | 0.338 | 0.613 | 0.917
On average, how frequently do you use online learning? | 201.73 | 907.809 | 0.410 | 0.432 | 0.916
On average, how much time do you spend per week using online learning? | 201.71 | 924.301 | 0.314 | 0.349 | 0.917

References

  1. UNESCO. Global Education Monitoring (GEM) Report 2020. Available online: https://www.unesco.org/en/articles/global-education-monitoring-gem-report-2020 (accessed on 31 December 2021).
  2. Bignoux, S.; Sund, K.J. Tutoring executives online: What drives perceived quality? Behav. Inf. Technol. 2018, 37, 703–713. [Google Scholar] [CrossRef]
  3. Adam, E.A. Self-Regulated Learning and Online Learning. Int. J. Acad. Res. Bus. Soc. Sci. 2017, 8, 1–5. [Google Scholar] [CrossRef]
  4. Zhu, X.; Liu, J. Education in and After Covid-19: Immediate Responses and Long-Term Visions. Postdigital Sci. Educ. 2020, 2, 695–699. [Google Scholar] [CrossRef] [Green Version]
  5. Lembani, R.; Gunter, A.; Breines, M.; Dalu, M.T.B. The same course, different access: The digital divide between urban and rural distance education students in South Africa. J. Geogr. High. Educ. 2020, 44, 70–84. [Google Scholar] [CrossRef]
  6. Aguilera-Hermida, A.P. College students’ use and acceptance of emergency online learning due to COVID-19. Int. J. Educ. Res. Open 2020, 1, 100011. [Google Scholar] [CrossRef] [PubMed]
  7. Tenório, T.; Bittencourt, I.I.; Isotani, S.; Silva, A.P. Does peer assessment in on-line learning environments work? A systematic review of the literature. Comput. Hum. Behav. 2016, 64, 94–107. [Google Scholar] [CrossRef]
  8. Aldholay, A.H.; Isaac, O.; Abdullah, Z.; Ramayah, T. The role of transformational leadership as a mediating variable in DeLone and McLean information system success model: The context of online learning usage in Yemen. Telemat. Inform. 2018, 35, 1421–1437. [Google Scholar] [CrossRef]
  9. Isaac, O.; Aldholay, A.; Abdullah, Z.; Ramayah, T. Online learning usage within Yemeni higher education: The role of compatibility and task-technology fit as mediating variables in the IS success model. Comput. Educ. 2019, 136, 113–129. [Google Scholar] [CrossRef]
  10. Panigrahi, R.; Srivastava, P.R.; Sharma, D. Online learning: Adoption, continuance, and learning outcome—A review of literature. Int. J. Inf. Manag. 2018, 43, 1–14. [Google Scholar] [CrossRef]
  11. Ajzen, I.; Fishbein, M. Understanding Attitudes and Predicting Social Behaviour; Prentice-Hall: Englewood Cliffs, NJ, USA, 1980. [Google Scholar]
  12. Ajzen, I. From intentions to actions: A theory of planned behavior. In Action Control; Springer: Berlin/Heidelberg, Germany, 1985; pp. 11–39. [Google Scholar]
  13. Davis, F.D. Perceived usefulness, perceived ease of use, and user acceptance of Information Technology. MIS Q. 1989, 13, 319. [Google Scholar] [CrossRef]
  14. DeLone, W.H.; McLean, E.R. Information systems success: The quest for the dependent variable. Inf. Syst. Res. 1992, 3, 60–95. [Google Scholar] [CrossRef] [Green Version]
  15. Delone, W.H.; McLean, E.R. The DeLone and McLean Model of Information Systems Success: A Ten-Year Update. J. Manag. Inf. Syst. 2003, 19, 9–30. [Google Scholar]
  16. Goodhue, D.L.; Thompson, R.L. Task-Technology Fit and Individual Performance. MIS Q. 1995, 19, 213–236. [Google Scholar]
  17. Venkatesh, V.; Morris, M.G.; Davis, G.B.; Davis, F.D. User Acceptance of Information Technology: Toward a Unified View. MIS Q. 2003, 27, 425–478. [Google Scholar] [CrossRef] [Green Version]
  18. Cheng, B.; Wang, M.; Moormann, J.; Olaniran, B.A.; Chen, N.S. The effects of organizational learning environment factors on e-learning acceptance. Comput. Educ. 2012, 58, 885–899. [Google Scholar] [CrossRef]
  19. Aldholay, A.; Abdullah, Z.; Isaac, O.; Mutahar, A.M. Perspective of Yemeni students on use of online learning: Extending the information systems success model with transformational leadership and compatibility. Inf. Technol. People 2019, 33, 106–128. [Google Scholar] [CrossRef]
  20. McGill, T.J.; Klobas, J.E. A task-technology fit view of learning management system impact. Comput. Educ. 2009, 52, 496–508. [Google Scholar] [CrossRef]
  21. Cho, K.W.; Bae, S.K.; Ryu, J.H.; Kim, K.N.; An, C.H.; Chae, Y.M. Performance evaluation of public hospital information systems by the information system success model. Healthc. Inform. Res. 2015, 21, 43–48. [Google Scholar] [CrossRef] [Green Version]
  22. Wu, L.Y.; Wang, C.J. Transforming resources to improve performance of technology-based firms: A Taiwanese Empirical Study. J. Eng. Technol. Manag. 2007, 24, 251–261. [Google Scholar] [CrossRef]
  23. Ashfaq, M.; Yun, J.; Yu, S.; Loureiro, S.M.C. I, Chatbot: Modeling the determinants of users’ satisfaction and continuance intention of AI-powered service agents. Telemat. Inform. 2020, 54, 101473. [Google Scholar] [CrossRef]
  24. Shim, M.; Jo, H.S. What quality factors matter in enhancing the perceived benefits of online health information sites? Application of the updated DeLone and McLean Information Systems Success Model. Int. J. Med. Inform. 2020, 137, 104093. [Google Scholar] [CrossRef]
  25. Xinli, H. Effectiveness of information technology in reducing corruption in China A validation of the DeLone and McLean information systems success model. Electron. Libr. 2015, 33, 52–64. [Google Scholar] [CrossRef]
  26. Cohen, A.; Baruth, O. Personality, Learning, and Satisfaction in Fully Online Academic Courses. Comput. Hum. Behav. 2017, 72, 1–12. [Google Scholar] [CrossRef]
  27. Harmon-Jones, E.; Harmon-Jones, C.; Price, T.F. What is Approach Motivation? Emot. Rev. 2013, 5, 291–295. [Google Scholar] [CrossRef]
  28. Bolliger, D.U.; Supanakorn, S.; Boggs, C. Impact of podcasting on student motivation in the online learning environment. Comput. Educ. 2010, 55, 714–722. [Google Scholar] [CrossRef]
  29. Sun, C.-Y.J. Motivational Influences in Distance Education: The Role of Interest, Self-Efficacy, and Self-Regulation. Ph.D. Thesis, University of Southern California, Los Angeles, CA, USA, 2009. [Google Scholar]
  30. Kee, N.S.; Omar, B.; Mohamed, R. Towards student-centred learning: Factors contributing to the adoption of E-Learn@USM. Malays. J. Distance Educ. 2012, 14, 1–24. [Google Scholar]
  31. Roca, J.C.; Chiu, C.M.; Martínez, F.J. Understanding e-learning continuance intention: An extension of the Technology Acceptance Model. Int. J. Hum. Comput. Studies 2006, 64, 683–696. [Google Scholar] [CrossRef] [Green Version]
  32. Wang, Y.S.; Liao, Y.W. Assessing eGovernment systems success: A validation of the DeLone and McLean model of information systems success. Gov. Inf. Q. 2008, 25, 717–733. [Google Scholar] [CrossRef]
  33. Lu, H.-P.; Yang, Y.-W. Toward an understanding of the behavioral intention to use a social networking site: An extension of task-technology fit to social-technology fit. Comput. Hum. Behav. 2014, 34, 323–332. [Google Scholar] [CrossRef]
  34. Mohammadi, H. Investigating users’ perspectives on e-learning: An integration of TAM and IS success model. Comput. Hum. Behav. 2015, 45, 359–374. [Google Scholar] [CrossRef]
  35. Sun, Y.; Mouakket, S. Assessing the impact of enterprise systems technological characteristics on user continuance behavior: An empirical study in China. Comput. Ind. 2015, 70, 153–167. [Google Scholar] [CrossRef]
  36. Kim, T.G.; Lee, J.H.; Law, R. An empirical examination of the acceptance behaviour of hotel front office systems: An extended technology acceptance model. Tour. Manag. 2008, 29, 500–513. [Google Scholar] [CrossRef]
  37. Chiu, C.-M.; Chang, H.-C.; Chiu, C.-S. Examining the integrated influence of fairness and quality on learners’ satisfaction and Web-based learning continuance intention. Inf. Syst. J. 2007, 17, 271–287. [Google Scholar] [CrossRef]
  38. Shih, Y.; Fang, K. The use of a decomposed theory of planned behavior to study Internet banking in Taiwan. Internet Res. 2004, 14, 213–223. [Google Scholar] [CrossRef] [Green Version]
  39. Isaac, O.; Abdullah, Z.; Ramayah, T.; Mutahar, A.M.; Alrajawy, I. Perceived Usefulness, Perceived Ease of Use, Perceived Compatibility, and Net Benefits: An empirical study of internet usage among employees in Yemen. In Proceedings of the 7th International Conference on Postgraduate Education, Universiti Teknologi MARA (UiTM), Shah Alam, Malaysia, 4 October 2016; pp. 899–919. [Google Scholar]
  40. Norzaidi, M.D.; Chong, S.C.; Murali, R.; Salwani, M.I. Intranet usage and managers’ performance in the port industry. Ind. Manag. Data Syst. 2007, 107, 1227–1250. [Google Scholar] [CrossRef]
  41. Rovai, A.P. Development of an instrument to measure classroom community. Internet High. Educ. 2002, 5, 197–211. [Google Scholar] [CrossRef]
  42. Alavi, M.; Marakas, G.M.; Yoo, Y. A Comparative Study of Distributed Learning Environments on Learning Outcomes. Inf. Syst. Res. 2002, 13, 404–415. [Google Scholar] [CrossRef]
  43. Shraim, K.; Khlaif, Z. An e-learning approach to secondary education in Palestine: Opportunities and challenges. Inf. Technol. Dev. 2010, 16, 159–173. [Google Scholar] [CrossRef]
  44. Rovai, A.P. A practical framework for evaluating online distance education programs. Internet High. Educ. 2003, 6, 109–124. [Google Scholar] [CrossRef]
  45. Lynch, R.; Dembo, M. The Relationship between Self-Regulation and Online Learning in a Blended Learning Context. Int. Rev. Res. Open Distrib. Learn. 2004, 5, 189. [Google Scholar] [CrossRef] [Green Version]
  46. Thompson, M.M. Evaluating Online Courses and Programs. J. Comput. High. Educ. 2004, 15, 63–84. [Google Scholar] [CrossRef]
  47. Selim, H.M. Critical success factors for e-learning acceptance: Confirmatory factor models. Comput. Educ. 2007, 49, 396–413. [Google Scholar] [CrossRef]
  48. Aldholay, A.H.; Abdullah, Z.; Ramayah, T.; Isaac, O.; Mutahar, A.M. Online learning usage and performance among students within public universities in Yemen. Int. J. Serv. Stand. 2018, 12, 163–179. [Google Scholar] [CrossRef]
  49. Aparicio, M.; Bacao, F.; Oliveira, T. Cultural impacts on e-learning systems’ success. Internet High. Educ. 2016, 31, 58–70. [Google Scholar] [CrossRef] [Green Version]
  50. Khan, I.U.; Hameed, Z.; Yu, Y.; Islam, T.; Sheikh, Z.; Khan, S.U. Predicting the acceptance of MOOCs in a developing country: Application of task-technology fit model, social motivation, and self-determination theory. Telemat. Inform. 2018, 35, 964–978. [Google Scholar] [CrossRef]
  51. Li, Y.; Shang, H. Service quality, perceived value, and citizens’ continuous-use intention regarding e-government: Empirical evidence from China. Inf. Manag. 2020, 57, 103197. [Google Scholar] [CrossRef]
  52. Barbera, E.; Linder-vanberschot, J.A. Factors Influencing Student Satisfaction and Perceived Learning in Online Courses. E-Learn. Digit. Media 2013, 10, 226–235. [Google Scholar] [CrossRef] [Green Version]
  53. Baber, H. Determinants of students’ perceived learning outcome and satisfaction in online learning during the pandemic of COVID19. J. Educ. e-Learn. Res. 2020, 7, 285–292. [Google Scholar] [CrossRef]
  54. Alqurashi, E. Predicting student satisfaction and perceived learning within online learning environments. Distance Educ. 2019, 40, 133–148. [Google Scholar] [CrossRef]
  55. Alshare, K.A.; Freeze, R.D.; Lane, P.L.; Wen, H.J. The Impacts of System and Human Factors on Online Learning Systems Use and Learner Satisfaction. Decis. Sci. J. Innov. Educ. 2011, 9, 437–461. [Google Scholar] [CrossRef]
  56. Sun, P.C.; Tsai, R.J.; Finger, G.; Chen, Y.Y.; Yeh, D. What drives a successful e-Learning? An empirical investigation of the critical factors influencing learner satisfaction. Comput. Educ. 2008, 50, 1183–1202. [Google Scholar] [CrossRef]
  57. Wang, W.-T.; Lai, Y.-J. Examining the adoption of KMS in organizations from an integrated perspective of technology, individual, and organization. Comput. Hum. Behav. 2014, 38, 55–67. [Google Scholar] [CrossRef]
  58. Ikhsan, J.; Akhyar, M.; Nais, M.K. The Effects of Science-On-Web Learning Media on Junior High School Students’ Learning Independency Levels and Learning Outcomes. Turk. Sci. Educ. 2019, 16, 231–239. [Google Scholar]
  59. DeLone, W.H.; McLean, E.R. Information Systems Success Measurement. Found. Trends® Inf. Syst. 2016, 2, 1–116. [Google Scholar] [CrossRef]
  60. Montesdioca, G.P.Z.; Maçada, A.C.G. Measuring user satisfaction with information security practices. Comput. Secur. 2014, 8, 15. [Google Scholar] [CrossRef]
  61. Rostaminezhad, M.A.; Mozayani, N.; Norozi, D.; Iziy, M. Factors Related to E-learner Dropout: Case Study of IUST Elearning Center. Procedia Soc. Behav. Sci. 2013, 83, 522–527. [Google Scholar] [CrossRef] [Green Version]
  62. Kintu, M.J.; Zhu, C.; Kagambe, E. Blended learning effectiveness: The relationship between student characteristics, design features and outcomes. Int. J. Educ. Technol. High. Educ. 2017, 14, 7. [Google Scholar] [CrossRef] [Green Version]
  63. Eom, S.B.; Ashill, N. The Determinants of Students’ Perceived Learning Outcomes and Satisfaction in University Online Education: An Update. Decis. Sci. J. Innov. Educ. 2016, 14, 185–215. [Google Scholar] [CrossRef]
  64. Arbaugh, J.B.; Duray, R. Technological and Structural Characteristics, Student Learning and Satisfaction with Web-Based Courses: An Exploratory Study of Two On-Line MBA Programs. Manag. Learn. 2002, 33, 331–347. [Google Scholar] [CrossRef]
  65. D’Ambra, J.; Wilson, C.S.; Akter, S. Application of the task-technology fit model to structure and evaluate the adoption of E-books by academics. J. Am. Soc. Inf. Sci. Technol. 2013, 64, 48–64. [Google Scholar] [CrossRef] [Green Version]
  66. Daud, N.M. Factors Determining Intranet Usage: An Empirical Study of Middle Managers in Malaysian Port Industry. Ph.D. Thesis, Multimedia University, Cyberjaya, Malaysia, 2008. [Google Scholar]
  67. Glowalla, P.; Sunyaev, A. ERP system fit—An explorative task and data quality perspective. J. Enterp. Inf. Manag. 2014, 27, 668–686. [Google Scholar] [CrossRef]
  68. Larsen, T.J.; Sørebø, A.M.; Sørebø, Ø. The role of task-technology fit as users’ motivation to continue information system use. Comput. Hum. Behav. 2009, 25, 778–784. [Google Scholar] [CrossRef]
  69. Lee, D.Y.; Lehto, M.R. User acceptance of YouTube for procedural learning: An extension of the Technology Acceptance Model. Comput. Educ. 2013, 61, 193–208. [Google Scholar] [CrossRef]
  70. Lee, K.C.; Lee, S.; Kim, J.S. Analysis of mobile commerce performance by using the task-technology fit. IFIP Adv. Inf. Commun. Technol. 2005, 158, 135–153. [Google Scholar] [CrossRef]
  71. Norzaidi, M.D.; Chong, S.C.; Murali, R.; Salwani, M.I. Towards a holistic model in investigating the effects of intranet usage on managerial performance: A study on Malaysian port industry. Marit. Policy Manag. 2009, 36, 269–289. [Google Scholar] [CrossRef]
  72. Gu, L.; Wang, J. A study of exploring the “big five” and task technology fit in web-based decision support systems. Issues Inf. Syst. 2009, 10, 210–217. [Google Scholar] [CrossRef]
  73. Isaac, O.; Abdullah, Z.; Ramayah, T.; Mutahar, A.M.; Isaac, O.; Abdullah, Z. Internet usage, user satisfaction, task-technology fit, and performance impact among public sector employees in Yemen. Int. J. Inf. Learn. Technol. 2017, 34, 210–241. [Google Scholar] [CrossRef]
  74. Shih, Y.-Y.; Chen, C.-Y. The study of behavioral intention for mobile commerce: Via integrated model of TAM and TTF. Qual Quant 2013, 47, 1009–1020. [Google Scholar] [CrossRef]
  75. Isaac, O.; Ramayah, T.; Mutahar, A.M. Internet Usage and Net Benefit among Employees within Government Institutions in Yemen: An Extension of DeLone. Int. J. Soft Comput. 2017, 12, 178–198. [Google Scholar]
  76. Awad, H.A.H. Investigating employee performance impact with integration of task technology fit and technology acceptance model: The moderating role of self-efficacy. Int. J. Bus. Excell. 2020, 21, 231–249. [Google Scholar] [CrossRef]
  77. Sinha, A.; Kumar, P.; Rana, N.P.; Islam, R.; Dwivedi, Y.K. Impact of internet of things (IoT) in disaster management: A task-technology fit perspective. Ann. Oper. Res. 2019, 283, 759–794. [Google Scholar] [CrossRef] [Green Version]
  78. Hamidi, F.; Ghorbandordinejad, F.; Rezaee, M.; Jafari, M. A comparison of the use of educational technology in the developed/developing countries. Procedia Comput. Sci. 2011, 3, 374–377. [Google Scholar] [CrossRef] [Green Version]
  79. Norzaidi, M.D.; Salwani, M.I. Evaluating technology resistance and technology satisfaction on students’ performance. Campus-Wide Inf. Syst. 2014, 9, 460–466. [Google Scholar]
  80. Petter, S.; DeLone, W.; McLean, E. Measuring information systems success: Models, dimensions, measures, and interrelationships. Eur. J. Inf. Syst. 2008, 17, 236–263. [Google Scholar] [CrossRef]
  81. Hou, C.K. Examining the effect of user satisfaction on system usage and individual performance with business intelligence systems: An empirical study of Taiwan’s electronics industry. Int. J. Inf. Manag. 2012, 32, 560–573. [Google Scholar] [CrossRef]
  82. Fan, J.C.; Fang, K. ERP implementation and information systems success: A test of DeLone and McLean’s model. Portland Int. Conf. Manag. Eng. Technol. 2006, 3, 1272–1278. [Google Scholar] [CrossRef]
  83. Alrajawy, I.; Mohd Daud, N.; Isaac, O.; Mutahar, A.M. Mobile Learning in Yemen Public Universities: Factors Influence student’s Intention to Use. In Proceedings of the 7th International Conference on Postgraduate Education, Universiti Teknologi MARA (UiTM), Shah Alam, Malaysia, 4 October 2016; pp. 1050–1064. [Google Scholar]
  84. Makokha, M.W.; Ochieng, D.O. Assessing the Success of ICT’s from a User Perspective: Case Study of Coffee Research Foundation, Kenya. J. Manag. Strategy 2014, 5, 12–58. [Google Scholar] [CrossRef] [Green Version]
  85. Wang, C.; Teo, T.S.H. Online service quality and perceived value in mobile government success: An empirical study of mobile police in China. Int. J. Inf. Manag. 2020, 52, 102076. [Google Scholar] [CrossRef]
  86. Eom, S.B.; Wen, H.J.; Ashill, N. The determinants of students’ perceived learning outcomes and satisfaction in University Online Education: An empirical investigation. Decis. Sci. J. -Innov. Educ. 2006, 4, 215–235. [Google Scholar] [CrossRef]
  87. Aldholay, A.; Isaac, O.; Abdullah, Z.; Abdulsalam, R.; Al-Shibami, A.H. An extension of Delone and McLean is success model with self-efficacy. Int. J. Inf. Learn. Technol. 2018, 35, 285–304. [Google Scholar] [CrossRef]
  88. Henseler, J.; Ringle, C.M.; Sarstedt, M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J. Acad. Mark. Sci. 2015, 43, 115–135. [Google Scholar] [CrossRef] [Green Version]
  89. Isaac, O.; Abdullah, Z.; Ramayah, T.; Mutahar, A.M. Factors determining user satisfaction of internet usage among public sector employees in Yemen. Int. J. Technol.Learn. Innov. Dev. 2018, 10, 37. [Google Scholar] [CrossRef]
  90. Vivian, H.; Wright, C.S.S.; Elizabeth, K.W. Research on Enhancing the Interactivity of Online Learning; Information Age Pub.: Greenwich, CT, USA, 2006. [Google Scholar]
  91. Gray, J.A.; DiLoreto, M. The Effects of Student Engagement, Student Satisfaction, and Perceived Learning in Online Learning Environments. NCPEA Int. J. Educ. Leadersh. Prep. 2016, 11, 98–119. [Google Scholar]
  92. Richardson, J.C.; Swan, K. Examining social presence in online courses in relation to students’ perceived learning and satisfaction. Online Learn. 2019, 7, 68–88. [Google Scholar] [CrossRef] [Green Version]
  93. Swan, K. Virtual interaction: Design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance Educ. 2006, 22, 306–331. [Google Scholar] [CrossRef]
  94. Duque, L.C. A framework for analysing higher education performance: Students’ satisfaction, perceived learning outcomes, and dropout intentions. Total Qual. Manag. Bus. Excell. 2014, 25, 1–21. [Google Scholar] [CrossRef]
  95. Marks, R.B.; Sibley, S.D.; Arbaugh, J.B. A structural equation model of predictors for effective online learning. J. Manag. Educ. 2005, 29, 531–563. [Google Scholar] [CrossRef]
  96. Podsakoff, P.M.; Mackenzie, S.B.; Lee, J.-Y.; Podsakoff, N.P. Common Method Biases in Behavioral Research: A Critical Review of the Literature and Recommended Remedies. J. Appl. Psychol. 2003, 88, 879–903. [Google Scholar] [CrossRef]
  97. Harman, H.H. Modern Factor Analysis; University of Chicago Press: Chicago, IL, USA, 1976. [Google Scholar]
  98. Amoozegar, A.; Mohd Daud, S.; Mahmud, R.; Ab Jalil, H. Exploring Learner to Institutional Factors and Learner Characteristics as a Success Factor in Distance Learning. Int. J. Innov. Res. Educ. Sci. 2017, 4, 2349–5219. [Google Scholar]
  99. Fornell, C.; Larcker, D.F. Evaluating Structural Equation Models with Unobservable Variables and Measurement Error. J. Mark. Res. 1981, 18, 39. [Google Scholar] [CrossRef]
  100. Thomas, J.R.; Nelson, J.K.; Silverman, S.J. Research Methods in Physical Activity; Human Kinetics: Champaign, IL, USA, 2015.
  101. Hair, J.F., Jr.; Hult, G.T.M.; Ringle, C.M.; Sarstedt, M.; Danks, N.P.; Ray, S. Partial Least Squares Structural Equation Modeling (PLS-SEM) Using R: A Workbook; Springer International Publishing AG: Cham, Switzerland, 2021.
  102. Hair, J.F.; Ringle, C.M.; Sarstedt, M. Corrigendum to “Editorial Partial Least Squares Structural Equation Modeling: Rigorous Applications, Better Results and Higher Acceptance”. Long Range Plan. 2014, 47, 392.
  103. Chin, W.W. The partial least squares approach for structural equation modeling. Mod. Methods Bus. Res. 1998, 295, 295–336.
  104. Sarstedt, M.; Ringle, C.M.; Hair, J.F. Partial Least Squares Structural Equation Modeling; Springer International Publishing: Berlin/Heidelberg, Germany, 2017.
  105. Hoyle, R.H. Handbook of Structural Equation Modeling; The Guilford Press: New York, NY, USA, 2015.
  106. Sullivan, G.M.; Feinn, R. Using Effect Size—Or Why the P Value Is Not Enough. J. Grad. Med. Educ. 2012, 4, 279–282.
  107. Bolin, J.H.; Hayes, A.F. Introduction to mediation, moderation, and conditional process analysis: A regression-based approach. J. Educ. Meas. 2014, 51, 335–337.
  108. Yu, T.-K.; Yu, T.-Y. Modelling the factors that affect individuals’ utilisation of online learning systems: An empirical study combining the task technology fit model with the theory of planned behaviour. Br. J. Educ. Technol. 2010, 41, 1003–1017.
  109. Gatara, M.; Cohen, J.F. Mobile-health tool use and community health worker performance in the Kenyan context: A task-technology fit perspective. In Proceedings of the ACM International Conference Proceeding Series 2014, Centurion, South Africa, 28 September 2014; pp. 229–240.
  110. Chen, T.; Peng, L.; Jing, B.; Wu, C.; Yang, J.; Cong, G. The impact of the COVID-19 pandemic on user experience with online education platforms in China. Sustainability 2020, 12, 7329.
  111. Islam, A.K.M.N. E-learning system use and its outcomes: Moderating role of perceived compatibility. Telemat. Inform. 2016, 33, 48–55.
  112. Crawford, J.; Cifuentes-Faura, J. Sustainability in higher education during the COVID-19 pandemic: A systematic review. Sustainability 2022, 14, 1879.
  113. Clemente-Suárez, V.J.; Rodriguez-Besteiro, S.; Cabello-Eras, J.J.; Bustamante-Sanchez, A.; Navarro-Jiménez, E.; Donoso-Gonzalez, M.; Beltrán-Velasco, A.I.; Tornero-Aguilera, J.F. Sustainable development goals in the COVID-19 pandemic: A narrative review. Sustainability 2022, 14, 7726.
  114. Faura-Martínez, U.; Lafuente-Lechuga, M.; Cifuentes-Faura, J. Sustainability of the Spanish university system during the pandemic caused by COVID-19. Educ. Rev. 2021, 74, 645–663.
  115. Mahmood, A.; Naveed, R.T.; Ahmad, N.; Scholz, M.; Khalique, M.; Adnan, M. Unleashing the barriers to CSR implementation in the SME sector of a developing economy: A thematic analysis approach. Sustainability 2021, 13, 12710.
Figure 1. The theoretical framework for the current study.
Figure 2. CFA for Measurement Model.
Figure 3. The Structural Model.
Table 1. The demographics of the respondents.

| | Frequency | Percent | Valid Percent | Cumulative Percent |

Gender
| Valid (1) Males | 241 | 59.7 | 60.0 | 60.0 |
| Valid (2) Females | 161 | 39.9 | 40.0 | 100.0 |
| Total | 402 | 99.5 | 100.0 | |
| Missing (System) | 2 | 0.5 | | |
| Total | 404 | 100.0 | | |

Age
| Valid (1) 20 or less | 49 | 12.1 | 12.1 | 12.1 |
| Valid (2) 21–30 | 319 | 79.0 | 79.0 | 91.1 |
| Valid (3) 31–40 | 36 | 8.9 | 8.9 | 100.0 |
| Total | 404 | 100.0 | 100.0 | |

University
| Valid | 1 | 0.2 | 0.2 | 0.2 |
| BZU Bahauddin Zakariya University | 30 | 7.4 | 7.4 | 7.4 |
| COMSATS Institute of Information Technology | 30 | 7.4 | 7.4 | 14.9 |
| FC Forman Christian College and University, Lahore | 31 | 7.7 | 7.7 | 22.5 |
| GCU Government College University | 52 | 12.9 | 12.9 | 35.4 |
| GIFT | 30 | 7.4 | 7.4 | 42.8 |
| IIUI International Islamic University Islamabad | 1 | 0.2 | 0.2 | 43.1 |
| IUB The Islamia University of Bahawalpur | 30 | 7.4 | 7.4 | 50.5 |
| LUMS Lahore University of Management Sciences | 20 | 5.0 | 5.0 | 55.4 |
| NUST | 30 | 7.4 | 7.4 | 62.9 |
| PMAS ARID University | 30 | 7.4 | 7.4 | 70.3 |
| PU University of the Punjab | 30 | 7.4 | 7.4 | 77.7 |
| RIPHAH | 30 | 7.4 | 7.4 | 85.1 |
| UOL University of Lahore | 30 | 7.4 | 7.4 | 92.6 |
| UOS University of Sargodha | 30 | 7.4 | 7.4 | 100.0 |
| Total | 403 | 100.0 | 100.0 | |

Education
| Valid (1) Intermediate | 3 | 0.7 | 0.7 | 0.7 |
| Valid (2) Bachelors | 215 | 43.1 | 43.2 | 45.6 |
| Valid (3) Masters | 133 | 53.2 | 53.3 | 54.1 |
| Valid (4) MPhil | 35 | 8.7 | 8.7 | 95.8 |
| Valid (5) Ph.D. | 11 | 2.7 | 2.7 | 98.5 |
| Valid (6) Others | 6 | 1.5 | 1.5 | 100.0 |
| Total | 403 | 99.8 | 100.0 | |
| Missing (System) | 1 | 0.2 | | |
| Total | 404 | 100.0 | | |
Table 2. Measurement scales and corresponding references for all the constructs.

| Construct | Measurement Scale | References |
| Learner Characteristics | 18 items | [63,98] |
| User Satisfaction | 3 items | [48] |
| Perceived Learning | 4 items | [92] |
| Task Technology Fit | 3 items | [75] |
| Actual Usage | 2 items | [19] |
| Performance Impact | 10 items | [48] |
Table 3. Descriptive statistics.

| Construct | N | Min. | Max. | Mean | SD | Skewness (Statistic) | Skewness (Std. Error) | Kurtosis (Statistic) | Kurtosis (Std. Error) |
| Perceived Learning | 404 | 1.25 | 7.00 | 4.8057 | 1.28082 | −0.003 | 0.121 | −0.824 | 0.242 |
| Learner Characteristics | 404 | 1.00 | 7.00 | 4.8308 | 1.42443 | −0.572 | 0.121 | −0.252 | 0.242 |
| User Satisfaction | 404 | 1.33 | 7.00 | 4.7533 | 1.41341 | −0.001 | 0.121 | −0.713 | 0.242 |
| Performance Impact | 404 | 1.00 | 6.89 | 4.7767 | 1.57719 | −0.632 | 0.121 | −0.663 | 0.242 |
| Task Technology Fit | 404 | 1.00 | 6.67 | 3.8837 | 1.78155 | 0.019 | 0.121 | −1.084 | 0.242 |
| Actual Usage | 404 | 1.00 | 7.00 | 4.6720 | 1.47586 | −0.739 | 0.121 | −0.171 | 0.242 |
Table 4. Discriminant validity.

| | CR | AVE | MSV | MaxR(H) | LC | PI | PL | TTF | US | AU |
| LC | 0.965 | 0.663 | 0.106 | 0.967 | 0.814 | | | | | |
| PI | 0.979 | 0.837 | 0.236 | 0.980 | 0.037 | 0.915 | | | | |
| PL | 0.951 | 0.831 | 0.244 | 0.972 | 0.176 *** | 0.326 *** | 0.911 | | | |
| TTF | 0.950 | 0.865 | 0.323 | 0.957 | 0.325 *** | 0.486 *** | 0.333 *** | 0.930 | | |
| US | 0.910 | 0.772 | 0.323 | 0.925 | 0.239 *** | 0.468 *** | 0.494 *** | 0.568 *** | 0.878 | |
| AU | 0.776 | 0.632 | 0.149 | 0.797 | 0.313 *** | 0.381 *** | 0.126 *** | 0.327 *** | 0.387 *** | 0.925 |

*** p < 0.001. Note: For all the constructs, square roots of AVE (Average Variance Extracted) are shown as diagonal elements, and inter-construct correlations are shown as off-diagonal elements.
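As an illustrative aid (not part of the original analysis), the Fornell–Larcker check behind Table 4 can be sketched in Python: the square root of each construct's AVE should exceed every correlation that construct has with the others. All numeric values below are taken directly from the table.

```python
from math import sqrt

# AVE values from Table 4.
ave = {"LC": 0.663, "PI": 0.837, "PL": 0.831, "TTF": 0.865, "US": 0.772, "AU": 0.632}

# Inter-construct correlations (lower-triangle, off-diagonal entries of Table 4).
corr = {
    ("PI", "LC"): 0.037,
    ("PL", "LC"): 0.176, ("PL", "PI"): 0.326,
    ("TTF", "LC"): 0.325, ("TTF", "PI"): 0.486, ("TTF", "PL"): 0.333,
    ("US", "LC"): 0.239, ("US", "PI"): 0.468, ("US", "PL"): 0.494, ("US", "TTF"): 0.568,
    ("AU", "LC"): 0.313, ("AU", "PI"): 0.381, ("AU", "PL"): 0.126,
    ("AU", "TTF"): 0.327, ("AU", "US"): 0.387,
}

def fornell_larcker_ok(construct: str) -> bool:
    """True if sqrt(AVE) of `construct` exceeds all its correlations with others."""
    root_ave = sqrt(ave[construct])
    related = [r for pair, r in corr.items() if construct in pair]
    return all(root_ave > abs(r) for r in related)

for c in ave:
    print(c, round(sqrt(ave[c]), 3), fornell_larcker_ok(c))
```

For example, sqrt(0.663) ≈ 0.814 for LC, matching the diagonal entry; every construct clears its largest off-diagonal correlation, consistent with the paper's conclusion of adequate discriminant validity.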
Table 5. HTMT Analysis.

| | LC | PI | PL | TTF | US | AU |
| LC | | | | | | |
| PI | 0.038 | | | | | |
| PL | 0.183 | 0.317 | | | | |
| TTF | 0.327 | 0.496 | 0.334 | | | |
| US | 0.249 | 0.461 | 0.505 | 0.581 | | |
| AU | 0.316 | 0.383 | 0.138 | 0.326 | 0.393 | |
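As a quick sanity check on Table 5 (again illustrative, not part of the original analysis): in the PLS-SEM literature, discriminant validity is commonly supported when every heterotrait-monotrait (HTMT) ratio stays below the conservative 0.85 cutoff; that threshold is a convention from the methods literature, not a value stated in the paper.

```python
# All HTMT ratios from Table 5 (lower triangle, row by row).
htmt = [0.038,
        0.183, 0.317,
        0.327, 0.496, 0.334,
        0.249, 0.461, 0.505, 0.581,
        0.316, 0.383, 0.138, 0.326, 0.393]

THRESHOLD = 0.85  # common conservative cutoff (0.90 is the liberal alternative)

print(max(htmt))                              # largest ratio in the matrix
print(all(h < THRESHOLD for h in htmt))       # discriminant validity supported?
```

The largest ratio (0.581, US vs. TTF) is well below 0.85, so the HTMT criterion agrees with the Fornell–Larcker results in Table 4.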
Table 6. Evaluation of the Structural Model.

| Hypothesis | Relation | Estimate | S.E. | C.R. | p-Value | Result |
| H1 | LC → US | 0.239 | 0.054 | 4.404 | *** | Accepted |
| H2 | US → TTF | 0.715 | 0.066 | 10.746 | *** | Accepted |
| H3 | US → AU | 0.373 | 0.063 | 5.932 | *** | Accepted |
| H4 | TTF → PI | 0.262 | 0.048 | 5.424 | *** | Accepted |
| H5 | AU → PI | 0.201 | 0.060 | 3.352 | *** | Accepted |

Note: *** p-value < 0.001.
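For readers unfamiliar with the C.R. column in Table 6: the critical ratio is simply the unstandardized estimate divided by its standard error, and values beyond roughly ±1.96 are significant at p < 0.05. The sketch below recomputes it from the reported figures; small discrepancies from the table arise because the published estimates and standard errors are themselves rounded.

```python
# (hypothesis, estimate, standard error) rows as reported in Table 6.
paths = [("H1", 0.239, 0.054),
         ("H2", 0.715, 0.066),
         ("H3", 0.373, 0.063),
         ("H4", 0.262, 0.048),
         ("H5", 0.201, 0.060)]

for name, est, se in paths:
    cr = est / se                      # critical ratio = estimate / S.E.
    print(name, round(cr, 2), cr > 1.96)
```

Every path comfortably exceeds 1.96, consistent with all five hypotheses being accepted at p < 0.001.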
Table 7. Results of mediating effects.

| Hypothesis | Path | Direct Path | Indirect Path | Total Effect | VAF | Mediation Type |
| H6 | LC → US → TTF | 0.278 | 0.171 | 0.449 | 38.08% | Partial |
| H7 | LC → US → AU | 0.250 | 0.089 | 0.34 | 26.176% | Partial |
| H8 | LC → US → TTF → PI | 0.16 | 0.045 | 0.205 | 21.95% | Partial |
| H9 | LC → US → AU → PI | 0.012 | 0.018 | 0.030 | 60% | Partial |
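The VAF column in Table 7 is the Variance Accounted For: the indirect effect as a share of the total effect, with the common rule of thumb that 20% < VAF < 80% indicates partial mediation (full mediation above 80%, no mediation below 20%). A minimal sketch using the table's figures:

```python
def vaf(indirect: float, total: float) -> float:
    """Variance Accounted For, in percent: indirect effect / total effect."""
    return indirect / total * 100

# Values from Table 7.
print(round(vaf(0.171, 0.449), 2))   # H6: LC -> US -> TTF
print(round(vaf(0.045, 0.205), 2))   # H8: LC -> US -> TTF -> PI
print(round(vaf(0.018, 0.030), 2))   # H9: LC -> US -> AU -> PI
```

All four VAF values fall inside the 20–80% band, which is why every mediation path in Table 7 is classified as partial.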
Table 8. Summarized results of the moderating variable.

| Variables | Coeff. | SE | t | p | LLCI | ULCI |
| Constant | 4.7036 | 0.0608 | 77.3099 | 0.0000 | 4.5840 | 4.8232 |
| LC → US | 0.0977 | 0.0444 | 2.1988 | 0.0285 | 0.0103 | 0.1850 |
| PL → US | 0.4943 | 0.0475 | 10.4005 | 0.0000 | 0.4009 | 0.5878 |

Interaction
| LC × PL → US | 0.1553 | 0.0343 | 4.5204 | 0.0000 | 0.0877 | 0.2228 |
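The coefficients in Table 8 define the estimated moderation equation US = b0 + b1·LC + b2·PL + b3·(LC × PL). The sketch below is illustrative only: the predictor values fed in are hypothetical, and whether LC and PL were mean-centered before forming the interaction is an assumption, not something stated in the table.

```python
# Coefficients from Table 8 (PROCESS-style moderation output).
b0, b_lc, b_pl, b_int = 4.7036, 0.0977, 0.4943, 0.1553

def predicted_us(lc: float, pl: float) -> float:
    """Model-implied user satisfaction at given LC and PL values."""
    return b0 + b_lc * lc + b_pl * pl + b_int * lc * pl

def slope_lc(pl: float) -> float:
    """Simple slope of LC on US at a given level of PL: b1 + b3*PL."""
    return b_lc + b_int * pl

# The positive interaction (b3 = 0.1553) means the LC-US link strengthens
# as perceived learning rises, matching the moderation finding in the text.
print(round(predicted_us(1.0, 1.0), 4))
print(round(slope_lc(1.0), 4))
```

For instance, at one (hypothetical) unit above the reference point on both predictors, the LC slope rises from 0.0977 to roughly 0.253, illustrating how perceived learning amplifies the effect of learner characteristics on satisfaction.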

Butt, S.; Mahmood, A.; Saleem, S.; Murtaza, S.A.; Hassan, S.; Molnár, E. The Contribution of Learner Characteristics and Perceived Learning to Students’ Satisfaction and Academic Performance during COVID-19. Sustainability 2023, 15, 1348. https://doi.org/10.3390/su15021348
