Abstract
Orientation: Student engagement is crucial for students’ success in higher education. Therefore, this study seeks to validate the psychometric properties of the Utrecht Work Engagement Scale-Student (UWES-S) version in a South African context.
Research purpose: The study evaluated the psychometric properties of the UWES-S, focusing on factorial validity, measurement invariance, item bias and internal consistency in a sample of university students across multiple campuses of a South African tertiary institution.
Motivation for the study: International best practices in psychometric assessment advocate for validated assessments. This study evaluates the psychometric properties of the UWES-S within the South African context.
Research approach/design and method: This quantitative cross-sectional study assessed the UWES-S’s psychometric properties, including factorial validity, measurement invariance, item bias and internal consistency in a sample of 2434 students from three campuses of a South African university.
Main findings: Results supported the three-factor structure of the UWES-S across language and campus groups. Factor loadings identified one problematic item with a negative loading, which was omitted. Measurement invariance was established, and while item bias was detected, the effect size was negligible. The UWES-S was also deemed reliable with high internal consistency.
Practical/managerial implications: The study provided preliminary evidence for the reliable, valid and unbiased application of the UWES-S on university students in South Africa.
Contribution/value-add: This study supports the valid, reliable and unbiased application of the UWES-S within South Africa’s higher education context.
Keywords: student engagement; UWES-S; vigour; dedication; absorption; factorial validity; factorial invariance; item bias; reliability; university students.
Introduction
Student engagement has garnered extensive attention from the research community because of its effect on learning outcomes (Gurcan et al., 2023). Studies have shown a significant relationship between student engagement and various indicators of student success, including graduation rates (Price & Tovar, 2014), academic performance (Carmona-Halty et al., 2021) and student involvement (Snijders et al., 2020). In addition, engaged students report better mental well-being (e.g., lower academic stress) and physical well-being (e.g., fewer symptoms such as headaches, sweating and self-harm) (Conner & Pope, 2014).
Conversely, disengaged students are more likely to discontinue their studies, contributing significantly to student attrition rates (Bumbacco & Scharfe, 2023; Saqr & López-Pernas, 2021). Moreover, disengaged students typically lack the necessary preparation and hold unrealistic expectations about the effort and academic responsibility required of them (Chipchase et al., 2017). Studies further report that disengaged students are more likely to experience physical symptoms such as headaches, sweating and difficulty sleeping, as well as substance abuse (Conner & Pope, 2014). Consequently, student engagement is considered a key concept in student learning (Gurcan et al., 2023).
Although different instruments exist to measure student engagement, the Utrecht Work Engagement Scale-Student (UWES-S) version, developed by Schaufeli et al. (2002a), is one of the most widely used instruments for measuring psychological student engagement. Schaufeli et al. (2002a) defined engagement as a positive study-related state of mind characterised by vigour, dedication and absorption. Over the years, scholars have found acceptable psychometric properties for the UWES-S in various countries, including Japan (Tayama et al., 2019), Italy (Loscalzo & Giannini, 2018), Portugal (Cadime et al., 2016), Korea (Jang & An, 2022; Römer, 2016), Spain (Serrano et al., 2019), Greece (Dimitriadou et al., 2020) and South Africa (Mostert et al., 2007; Pienaar & Sieberhagen, 2005).
Recent studies have increasingly explored ways to enhance student well-being, including student engagement, by advocating for an adapted version of the Job Demands-Resources (JD-R) model, known as the Study Demands-Resources (SD-R) model (Bakker & Mostert, 2024; Salmela-Aro et al., 2022). According to SD-R theory, student well-being and performance are influenced by two key processes: the health impairment process and the motivational process. The health impairment process explains how excessive study demands can contribute to fatigue, anxiety and psychological strain, while the motivational process demonstrates how study resources can foster engagement, creativity and academic performance by promoting vigour, dedication and absorption (Bakker et al., 2023).
Within this framework, the UWES-S has been used as a key measure to assess the impact of various interventions aimed at increasing student engagement. For example, it has been employed to evaluate the effectiveness of study crafting programmes (Körner et al., 2023), playful study design interventions (Liu et al., 2023) and self-directed study crafting approaches (Mülder et al., 2022). Consequently, the UWES-S serves as a valuable tool for designing, implementing and assessing evidence-based interventions that enhance student engagement.
However, studies have yielded conflicting results regarding the factor structure of the UWES-S, with support reported for a three-factor model (Cadime et al., 2016; Loscalzo & Giannini, 2018), two-factor models (Dimitriadou et al., 2020; Naudé & Rothmann, 2004) and a one-factor model (Römer, 2016; Serrano et al., 2019). Although South Africa has made significant progress in certifying psychometric assessments based on their reliability, validity, bias and norms, challenges persist in the use of international instruments with inappropriate psychometric properties (Laher, 2024). These results, coupled with the complex and diverse South African population, with variations in the quality of education, socioeconomic discrepancies and increased acculturation, highlight the need for validation studies of psychometric assessments (Laher & Cockcroft, 2013).
Examining the psychometric properties of assessments is aligned with international best practices and is especially important when using assessments in diverse populations to ensure they are used in a fair and equitable manner (Laher, 2024). Consequently, this study aimed to evaluate the psychometric properties of the UWES-S on a sample of South African university students. Specifically, this study investigated the instrument’s factorial validity, measurement invariance, item bias and internal consistency.
Literature review
Background and conceptualisation of student engagement
The concept of engagement became prominent in the early 21st century because of the rise of positive psychology and a shift from the conventional focus on flaws and dysfunction to human strengths and optimal performance (Seligman & Csikszentmihalyi, 2000). As a result, scholars perceived burnout as a gradual decline in engagement, characterised by the transformation of energy into exhaustion, involvement into cynicism and efficacy into ineffectiveness (Maslach & Leiter, 1997). Accordingly, it was conceived that engagement could be measured using the opposite poles of the Maslach Burnout Inventory (MBI): exhaustion, cynicism and a lack of professional efficacy (Maslach et al., 1996; Maslach & Leiter, 1997). However, this perspective had two significant drawbacks. Firstly, it is unreasonable to expect a perfect negative association between burnout and engagement. Secondly, the empirical investigation of the relationship between burnout and engagement would not be possible if both were assessed using the same questionnaire (Schaufeli & Bakker, 2004).
Consequently, the UWES was developed by Schaufeli and colleagues, who conceptualised engagement as consisting of three dimensions, namely vigour, dedication and absorption (Schaufeli et al., 2002b). Vigour is defined by elevated levels of energy and mental fortitude while working (Schaufeli et al., 2002b). Dedication is defined by feelings of importance, passion, motivation, satisfaction and challenge derived from work (Schaufeli et al., 2002b), whereas absorption is defined as a state of complete concentration and contentment in one's work, resembling a state of flow (Nakamura & Csikszentmihalyi, 2014). As students experience challenges similar to those of the working population, the UWES was adapted for applicability in the student context (UWES-S) through the revision of the original items (Schaufeli et al., 2002a).
Explanation of psychometric properties tested within this study
Establishing the psychometric properties of questionnaires is crucial to getting valid and reliable results, especially in psychometric assessments (De Souza et al., 2017). Psychometric assessments are deemed appropriate when they offer accurate, valid and interpretable scientifically robust data (De Souza et al., 2017). Therefore, analysing an instrument’s factor validity, measurement invariance, item bias and internal consistency can assist in determining its psychometric properties.
Factorial validity assesses the degree to which the purported structure of a scale can be discerned in a set of test scores (Piedmont, 2014). Measurement invariance is assessed on three levels to ensure that the factor structure (configural invariance), item loadings (metric invariance) and item threshold (strong invariance) remain similar across groups (Chen, 2008; Morin et al., 2020). Item bias quantifies the extent to which items within a scale discriminate against specific individuals or groups (Berry et al., 2002). Item bias is characterised as either uniform bias or non-uniform bias. Uniform bias occurs when items consistently discriminate across all individuals or groups irrespective of competency level (Kristjansson et al., 2005). Non-uniform bias occurs when items discriminate against specific individuals and groups based on their competency level (Kristjansson et al., 2005). Internal consistency is a measure of reliability that reveals the extent to which the items of an instrument accurately evaluate the concept it is designed to assess (Revicki, 2014).
Although multiple language groups participated in this study, the questionnaire was not translated into other languages for several reasons. Firstly, as the original UWES-S was developed in English, it was essential to analyse whether the items showed any bias against non-English-speaking groups. Instead of translating the instrument, we conducted measurement invariance and bias analyses to evaluate whether the factor structure (configural invariance), interpretation of items (metric invariance) and item difficulty (scalar invariance) remained consistent across language groups. Secondly, in South Africa, English is the primary language of communication, instruction and research at the university level (Rakgogo, 2024). As a result, students are expected to read and write in English throughout their academic studies. Thirdly, translating questionnaires into multiple languages presents several challenges, including translation errors and the distortion of linguistic nuances, colloquialisms and cultural subtleties, which may alter the intended meaning of the items (Van De Vijver & Tanzer, 2004). To avoid these complications, we decided to retain the English version of the UWES-S and assess its measurement invariance and bias across different language and campus groups. This approach ensures that the instrument is fair and applicable to diverse student populations without introducing issues related to translation.
Psychometric properties of the Utrecht Work Engagement Scale-Student
The UWES-S has been validated in various countries, including Japan, Korea, Spain, the Netherlands and South Africa, to name only a few (Jang & An, 2022; Mostert et al., 2007; Pienaar & Sieberhagen, 2005; Römer, 2016; Schaufeli et al., 2002a; Tayama et al., 2019). Although the UWES-S has been validated among South African university students, that study focused only on Afrikaans- and Setswana-speaking students and found one vigour item ('When I am studying, I feel mentally strong') to have a different meaning across the two language groups (Mostert et al., 2007). Similarly, another South African study used a small sample of student leaders who were primarily white and Afrikaans-speaking (Pienaar & Sieberhagen, 2005). Studies have used various forms of the UWES-S, including the UWES-S17 (17 items), UWES-S14 (14 items) and UWES-S9 (9 items) (Jang & An, 2022; Rastogi et al., 2018; Serrano et al., 2019), with this literature review including all forms of the UWES-S.
Studies testing the factorial validity of the UWES-S on student populations have found conflicting results, with international studies supporting the original three-factor structure (Cadime et al., 2016; Loscalzo & Giannini, 2018), a two-factor structure comprising vigour and a combined dedication-absorption factor (Dimitriadou et al., 2020), an alternative two-factor structure (Naudé & Rothmann, 2004) or a one-factor structure (Römer, 2016; Serrano et al., 2019). South African studies conducted on university students have also supported the three-factor (Pienaar & Sieberhagen, 2005), two-factor (vigour and dedication) (Mostert et al., 2007) and one-factor structures (Cilliers et al., 2018).
Furthermore, an initial analysis of the invariance of the UWES-S on the student population found that only the absorption factor was invariant across university students in Spain, Portugal and the Netherlands (Schaufeli et al., 2002a). However, international studies have found the UWES-S to be invariant across gender, class year, academic major and choice of department (Carmona-Halty et al., 2019; Chi et al., 2022; Dimitriadou et al., 2020). Within South Africa, Mostert et al. (2007) established measurement invariance of the two-factor UWES-S (vigour and dedication) across Afrikaans and Setswana language groups.
Although international studies investigating the UWES-S's item bias in student populations are scarce, a South African study found that one vigour item ('When I am studying, I feel mentally strong') had a different meaning for Afrikaans- and Setswana-speaking students (Mostert et al., 2007).
With regard to the reliability of the UWES-S, the three-factor structure showed acceptable Cronbach's alpha coefficients in studies conducted in Portugal (vigour 0.82, dedication 0.86 and absorption 0.84) and Italy (vigour 0.82, dedication 0.88 and absorption 0.76) (Cadime et al., 2016; Loscalzo & Giannini, 2018). A Greek study using a two-factor structure showed reliable Cronbach's alpha coefficients (vigour 0.73 and dedication with absorption 0.85) (Dimitriadou et al., 2020). The one-factor structure has also shown acceptable Cronbach's alpha coefficients in studies conducted in Korea (0.83) and Spain (0.91) (Römer, 2016; Serrano et al., 2019). Additionally, a study conducted in South Africa found Cronbach's alpha coefficients of 0.77 for vigour, 0.85 for dedication and 0.60 for absorption for the three-factor UWES-S (Pienaar & Sieberhagen, 2005).
Rationale and aim
The UWES-S has been used to highlight the impact of student engagement on student success and performance (Carmona-Halty et al., 2021). Additionally, recent studies have also identified how student engagement can affect life satisfaction (Rastogi et al., 2018), entrepreneurship behaviour (Liu et al., 2023) and mental health challenges (Ishimaru et al., 2023). However, according to the Employment Equity Act, No. 55 of 1998, before any psychometric assessment may be used, it must be proven scientifically valid, reliable and fair (Government Gazette, 1998). Moreover, the validation of psychometric assessments is a continuous process to ensure that the application of an instrument remains appropriate (Schaap & Kekana, 2016). Therefore, this study aimed to revalidate the psychometric properties of the UWES-S, specifically focusing on factorial validity, measurement invariance, item bias and internal consistency in a sample of university students across multiple campuses of a South African tertiary institution.
Research design
Research approach
This study used a quantitative research approach. Quantitative research uses a deductive approach to confirm, reject or add credibility to existing theories by measuring variables (Leavy, 2017). The study also used a cross-sectional design, allowing data to be collected at a specific point in time using an online questionnaire (Zangirolami-Raimundo et al., 2018).
Research participants and procedure
This study utilised a stratified convenience sampling technique, categorising university students from a South African tertiary institution into subgroups based on their campus affiliation and home language. Approval for data collection across the three campuses was obtained from the university's ethical, research and management structures. All students enrolled at any of the three campuses were invited to participate in this study. Data were collected through a secure quick response (QR) code distributed using electronic and physical pamphlets. The QR code directed participants to the QuestionPro platform. The link was distributed electronically and through fieldworkers across the three respective campuses. The trained fieldworkers conducted awareness sessions to explain the study's purpose and the participants' involvement before voluntary participation. Before completing the questionnaire, participants were provided with detailed information regarding the purpose and nature of the study, along with the intended outcomes. Moreover, participants were required to sign a consent form and complete a biographical questionnaire electronically. The biographical and engagement questionnaires took approximately 15–20 min to complete.
The sample consisted of 2434 participants, of whom 887 (36.4%) were between the ages of 17 and 20 years, 840 (34.2%) were between 21 and 24 years and 667 (27.4%) were older than 25 years. Furthermore, in terms of language, 685 (28.1%) indicated that they spoke Afrikaans, followed by 472 (19.4%) speaking Setswana, 255 (10.5%) speaking Sesotho, 190 (7.8%) speaking English and 808 (33.4%) speaking another language. Although other language groups participated in this study, bias and invariance analyses were only conducted on the four primary recognised language groups (Afrikaans, English, Setswana and Sesotho) at the specific university and the three campuses. Most participants were enrolled at campus two (67.4%), followed by campus one (19.3%), with the remaining participants studying at campus three (12.5%). Regarding gender, the sample was predominantly male, with 1779 men (73.1%) and 600 women (24.7%). Regarding ethnicity, 1571 (64.5%) of the sample were Black, followed by 691 (28.4%) White, 110 (4.5%) Mixed race, 23 (0.9%) Indian, 5 (0.2%) Asian and 14 (0.6%) who indicated another ethnicity. Because the biographical questionnaire was optional, percentages may not total 100% owing to missing values.
Measuring instruments
Participants completed a biographical questionnaire to report on the sample characteristics and adhere to the American Psychological Association reporting standards (Appelbaum et al., 2018). The biographical information collected included the participant's age, home language, campus, gender and ethnicity.
Student engagement was measured with the Utrecht Work Engagement Scale-Student (UWES-S) version developed by Schaufeli et al. (2002a). The UWES-S consists of 14 items, measuring three dimensions: five items measure vigour (e.g., ‘When studying I feel strong and vigorous’), five items measure dedication (e.g., ‘I am proud of my studies’) and four items measure absorption (e.g., ‘I can get carried away by my studies’). Responses were captured on a seven-point Likert scale ranging from 0 (never) to 6 (every day).
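To make the scoring concrete, the following R sketch illustrates how dimension scores are typically obtained by averaging item responses. The data frame uwes and the item names VI1–VI5, DE1–DE5 and AB1–AB4 are hypothetical, not taken from the study's materials.

```r
# Hypothetical illustration (not the authors' code): responses are scored from
# 0 (never) to 6 (every day) and averaged per dimension.
vigour     <- rowMeans(uwes[, paste0("VI", 1:5)], na.rm = TRUE)
dedication <- rowMeans(uwes[, paste0("DE", 1:5)], na.rm = TRUE)
absorption <- rowMeans(uwes[, paste0("AB", 1:4)], na.rm = TRUE)
```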
Statistical analysis
Statistical analysis was conducted using the Mplus 8.9 modelling programme (Muthén & Muthén, 2023). Confirmatory factor analysis was used to estimate the fit between different theorised models (Piedmont, 2014). The following fit statistics were used to establish an acceptable fit: the Chi-square statistic (χ2), standardised root mean square residual (SRMR), root mean square error of approximation (RMSEA), Tucker–Lewis index (TLI) and the comparative fit index (CFI) (Brown & Moore, 2012). Values of less than 0.05 for SRMR and below 0.08 for RMSEA indicate an acceptable fit (Van De Schoot et al., 2012). Concerning TLI and CFI values, 0.90 and above were considered a good fit, with values above 0.95 being considered excellent (Van De Schoot et al., 2012).
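The study itself fitted these models in Mplus 8.9; purely as an illustration, a comparable three-factor CFA can be sketched in R with the lavaan package. The data frame uwes and its item names are the same hypothetical ones as above, and the estimator is an assumption, as the article does not report it.

```r
# A minimal sketch of the three-factor CFA, assuming items VI1-VI5, DE1-DE5
# and AB1-AB4 in a data frame 'uwes'. Mplus was used in the study itself.
library(lavaan)

three_factor <- '
  vigour     =~ VI1 + VI2 + VI3 + VI4 + VI5
  dedication =~ DE1 + DE2 + DE3 + DE4 + DE5
  absorption =~ AB1 + AB2 + AB3 + AB4
'

fit <- cfa(three_factor, data = uwes, estimator = "MLR")  # estimator assumed

# The fit statistics reported in the text
fitMeasures(fit, c("chisq", "df", "cfi", "tli",
                   "rmsea", "rmsea.ci.lower", "rmsea.ci.upper", "srmr"))

# Standardised loadings (cf. Table 2)
standardizedSolution(fit)
```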
Measurement invariance was calculated for language and campus groups. Multi-group confirmatory factor analysis was used to test for configural invariance (i.e., the factor structure is similar across groups), metric invariance (i.e., the item loadings are similar, meaning items are interpreted similarly across groups) and strong invariance (i.e., the item thresholds are similar across groups) (Morin et al., 2020). Tucker–Lewis index and CFI values above 0.90 are considered acceptable, with values above 0.95 considered better, whereas RMSEA values of less than 0.08 are considered acceptable, with values below 0.05 considered better (Van De Schoot et al., 2012). Additionally, the changes in fit (∆RMSEA, ∆SRMR and ∆CFI) were considered, as they are less susceptible to changes in sample size and the number of observed variables (Shi et al., 2019). ∆RMSEA should be less than 0.015, and ∆CFI should not deteriorate the model by more than −0.01 (Chen, 2007).
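Continuing the illustrative lavaan sketch above (again, not the authors' Mplus script), the three invariance levels can be tested by adding equality constraints and comparing fit changes against the cut-offs just described; a grouping column named language in uwes is assumed.

```r
# Configural: same structure, all parameters free across groups
configural <- cfa(three_factor, data = uwes, group = "language")

# Metric: factor loadings constrained equal across groups
metric <- cfa(three_factor, data = uwes, group = "language",
              group.equal = "loadings")

# Strong (scalar): loadings and intercepts constrained equal; with items
# treated as ordered, "thresholds" would be constrained instead
strong <- cfa(three_factor, data = uwes, group = "language",
              group.equal = c("loadings", "intercepts"))

# Compare CFI/TLI/RMSEA/SRMR across the nested models (delta-CFI, delta-RMSEA)
sapply(list(configural = configural, metric = metric, strong = strong),
       fitMeasures, fit.measures = c("cfi", "tli", "rmsea", "srmr"))
```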
Item bias was detected using differential item functioning (DIF) between the four language groups (Afrikaans, English, Setswana and Sesotho) and the three campuses that participated in the study. Uniform and non-uniform bias were assessed using the lordif package (Choi et al., 2011) within RStudio (https://rstudio.com/). Bias was identified by using ordinal logistic regressions to generate three likelihood-ratio statistics (χ2) and comparing the following models (Equation 1, Equation 2, Equation 3 and Equation 4) (Choi et al., 2011):

logit P(u ≥ k) = ln[P(u ≥ k) / (1 − P(u ≥ k))] [Eqn 1]

Model 1: logit P(u ≥ k) = αₖ + β₁θ [Eqn 2]

Model 2: logit P(u ≥ k) = αₖ + β₁θ + β₂g [Eqn 3]

Model 3: logit P(u ≥ k) = αₖ + β₁θ + β₂g + β₃(θ × g) [Eqn 4]

where u denotes the item response, k the response category, θ the latent trait level (engagement), g group membership, αₖ the category threshold and β₁ to β₃ the regression coefficients.
Uniform bias was identified with statistical significance (p < 0.01) when comparing models one and two (χ²₁₂; degrees of freedom [df] = 1). Non-uniform bias was identified with statistical significance (p < 0.01) by comparing models two and three (χ²₂₃; df = 1). Total bias was identified with statistical significance by comparing models one and three (χ²₁₃; df = 2) (Choi et al., 2011). The pseudo-McFadden R2 statistics were used to indicate the magnitude or practical significance of the DIF. Values of less than 0.13 are considered negligible, values between 0.13 and 0.26 are considered moderate and values above 0.26 are considered large (Zumbo, 1999). Additionally, the change in the β1 coefficient (∆β1) between models one and two was used to quantify uniform bias, with a 10% change indicating practical significance (Crane et al., 2007). Cronbach's alpha coefficient and McDonald's omega coefficients were calculated to determine the scale's internal consistency, with values exceeding 0.70 regarded as acceptable (Hoyle, 2023).
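For illustration, the DIF and internal consistency analyses can be sketched in R as follows. The objects items and language are hypothetical, and the lordif arguments mirror the decision rules reported above rather than reproducing the authors' actual script.

```r
# A minimal sketch, assuming 'items' holds the retained UWES-S item responses
# (scored 0-6) and 'language' is the grouping vector (Afrikaans, English,
# Setswana, Sesotho).
library(lordif)
library(psych)

dif <- lordif(resp.data = items,
              group     = language,
              criterion = "Chisqr",   # likelihood-ratio tests of Eqn 2-4
              alpha     = 0.01,       # significance level used in this study
              pseudo.R2 = "McFadden") # DIF magnitude; negligible below 0.13
print(dif)  # flagged items with chi-square statistics and effect sizes

# Internal consistency per subscale (vigour shown; repeat for the others)
alpha(items[, paste0("VI", 1:5)])                # Cronbach's alpha
omega(items[, paste0("VI", 1:5)], nfactors = 1)  # McDonald's omega
```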
Ethical considerations
In addition to obtaining permission from the Vice-Chancellor of Teaching and Learning, the research proposal was also approved by the university's scientific and ethics committee for student data collection. The study was conducted as part of an ongoing research project. Ethical clearance to conduct this study was obtained from the North-West University Economic and Management Sciences Research Ethics Committee (EMS-REC) (No. NWU-HS-2014-0165). Additionally, the electronic consent form that participants had to sign before completing the questionnaire outlined critical ethical considerations, including confidentiality, anonymity, voluntary participation and the processing of personal information. To enhance confidentiality, the biographical questionnaire did not include identifiable information (e.g., name, surname and contact information). Participants could also withdraw their participation at any time during the questionnaire.
Results
Factorial validity
The factorial validity of the UWES-S factor structure was analysed using confirmatory factor analysis. Based on the literature, four models were examined: a one-factor model (all items loading onto one factor), a two-factor model (a combined vigour-dedication factor and a separate absorption factor), an alternative two-factor model (vigour and a combined dedication-absorption factor) and the original three-factor model. Table 1 presents the results of the confirmatory factor analysis for the UWES-S14.
The confirmatory factor analysis (CFA) results for the overall sample indicate that the three-factor model had a superior fit (χ2 = 1641.034; df = 74; CFI = 0.977; TLI = 0.972; RMSEA = 0.094 [0.090, 0.098]; SRMR = 0.036) compared to the other factor structures. Consequently, the three-factor model was selected for further analysis.
The standardised factor loadings (λ) of the UWES-S14 indicate the variance explained by individual items and their corresponding factor. Table 2 represents the items’ standardised loadings (λ).
TABLE 2: Standardised factor loadings (N = 2434).
With one exception, factor loadings for the UWES-S14 were statistically significant (p < 0.001) and positive, ranging between medium (≥ 0.500) and high (≥ 0.700). The exception was dedication item DE5, 'I find my studies challenging' (Schaufeli et al., 2002a, p. 478), which had a negative factor loading with a small effect of −0.187 (Shevlin & Miles, 1998). Consequently, this item (DE5) was omitted from further analysis.
The correlation coefficients (r) were calculated to explore the relationship between variables on a scale ranging from −1 to +1 (Schober et al., 2018). The effect sizes of these values were classified as small (r ≥ 0.10), medium (r ≥ 0.30) or large (r ≥ 0.50) (Cohen, 2013). Table 3 depicts the correlation between the three UWES-S14 factors.
TABLE 3: Estimated correlation matrix (N = 2434).
The results of the correlation matrix indicated that dedication had a strong positive correlation with vigour (0.799). Similarly, absorption correlated strongly with vigour (0.828) and dedication (0.734).
Measurement invariance
The UWES-S's measurement invariance was analysed across four language groups (Afrikaans, English, Sesotho and Setswana) and three campus groups. Table 4 shows the results of the configural, metric and strong invariance.
TABLE 4: Results of the measurement invariance models across the language and campus groups.
The fit indices (measurement invariance) for the three-factor UWES-S14 showed sufficient configural, metric and strong invariance across language and campus groups. The TLI and CFI values were above 0.90 (Van De Schoot et al., 2012), and ∆CFI and ∆TLI deteriorated by no more than 0.01 (Chen, 2007). Moreover, ∆RMSEA values did not increase by more than 0.015 (Chen, 2007).
Item bias
Differential item functioning was used to test possible item bias between the language (Afrikaans, English, Sesotho and Setswana) and three campus groups. Table 5 displays the DIF for the language groups participating in this study. Specifically, it presents the uniform, total and non-uniform biases and the effect size of the bias.
TABLE 5: Differential item functioning for language (N = 1602).
Uniform bias was identified for all vigour items except item 3 (p < 0.01 for χ²₁₂), total bias for all vigour items (p < 0.01 for χ²₁₃) and non-uniform bias for items 1 and 3 (p < 0.01 for χ²₂₃). Similarly, uniform bias was identified for all dedication items except item 3 (p < 0.01 for χ²₁₂), total bias for all dedication items except item 3 (p < 0.01 for χ²₁₃) and non-uniform bias for item 2 (p < 0.01 for χ²₂₃). Furthermore, three absorption items (items 1, 3 and 4) showed uniform bias (p < 0.01 for χ²₁₂), total bias (p < 0.01 for χ²₁₃) and non-uniform bias (p < 0.01 for χ²₂₃). Although multiple items were identified as biased, the effects were negligible, based on pseudo-McFadden R2 values smaller than 0.13 and ∆β1 coefficients smaller than 0.10 (10%).
Table 6 displays the DIF for the campus groups. Specifically, it presents the uniform, total and non-uniform biases and the effect size of the bias.
TABLE 6: Differential item functioning for campus (N = 2414).
Uniform bias (p < 0.01 for χ²₁₂) and total bias (p < 0.01 for χ²₁₃) were identified for three vigour items (items 1, 2 and 3). Additionally, total bias (p < 0.01 for χ²₁₃) and non-uniform bias (p < 0.01 for χ²₂₃) were present for two vigour items (items 3 and 5). Moreover, three dedication items (items 1, 2 and 3) indicated uniform bias (p < 0.01 for χ²₁₂) and total bias (p < 0.01 for χ²₁₃), while non-uniform bias (p < 0.01 for χ²₂₃) was present for item 2. Additionally, three absorption items (items 1, 3 and 4) were identified with uniform bias (p < 0.01 for χ²₁₂), total bias (p < 0.01 for χ²₁₃) and non-uniform bias (p < 0.01 for χ²₂₃). Again, these effects were negligible, based on pseudo-McFadden R2 values smaller than 0.13 and ∆β1 coefficients smaller than 0.10 (10%).
Reliability
Internal consistency was measured using Cronbach’s alpha (α) and McDonald’s omega (ω) coefficients. Internal consistency of the three-factor UWES-S (vigour, dedication and absorption) is depicted in Table 7.
TABLE 7: Reliability analysis (N = 2434).
Cronbach’s alpha coefficient (α) and McDonald’s omega coefficient (ω) exceeded 0.70, indicating acceptable internal consistency (Hoyle, 2023).
Discussion
Similar to the working population, students experience demands, such as workload, time constraints and cognitive challenges (Cilliers et al., 2018; Lesener et al., 2019), which are associated with physical and psychological consequences such as stress, disengagement and student attrition (Bumbacco & Scharfe, 2023; Chipchase et al., 2017; Martin et al., 2021). Conversely, study resources such as lecturer support, peer support and developmental opportunities assist students in attaining their academic goals and managing demands, motivate them to continue with their studies and lead to higher levels of study engagement (Cilliers et al., 2018; Lesener et al., 2019). Indeed, multiple studies have demonstrated the beneficial impact of student engagement on students' academic performance, graduation rates and involvement (Carmona-Halty et al., 2021; Snijders et al., 2020). Nevertheless, the psychometric properties of engagement questionnaires must be established before they can be used on the student population (Schaap & Kekana, 2016).
Therefore, this study assessed the psychometric properties of the UWES-S14 to determine whether the instrument is valid, reliable, invariant and unbiased for measuring university student engagement within the South African context. Specifically, the study examined factor validity, measurement invariance, item bias and internal consistency.
Factor validity was assessed using confirmatory factor analysis to measure the fit of various measurement models. The findings supported the original three-factor (vigour, dedication and absorption) structure, demonstrating superior fit. Notably, the factor loading results indicated one problematic dedication item (item 5: 'I find my studies challenging') (Schaufeli et al., 2002a, p. 478), which displayed a negative factor loading. In this context, the item may reflect the perceived difficulty of the study environment rather than dedication itself. Considering the diverse sample, this issue could stem from item formulation or content (Van De Vijver & Leung, 2021). Negative factor loadings often indicate that an item is worded inversely to the others, suggesting a potential need for reverse scoring (Chen, 2017). Nevertheless, the results align with previous validation studies that used the UWES-S in Portuguese (Cadime et al., 2016) and Italian (Loscalzo & Giannini, 2018) university student samples. Moreover, they support the original three-factor structure proposed by the authors of the scale (Schaufeli et al., 2002a).
Measurement invariance was examined by testing for configural, metric and strong invariance across language and campus groups. The invariance testing indicated full configural, metric and strong invariance across language and campus groups participating in this study. Initial analysis of the UWES-S only supported invariance for the absorption factor across university samples in the Netherlands, Spain and Portugal (Schaufeli et al., 2002a). However, our results are aligned with other studies that found full invariance (configural, metric and scalar invariance) for the three-factor UWES-S across gender groups in Chile (Carmona-Halty et al., 2019) and academic levels in Portugal (Cadime et al., 2016).
Item bias was assessed across language and campus groups using DIF. In terms of language groups, uniform bias was identified for four vigour items (items 1, 2, 4 and 5), three dedication items (items 1, 2 and 4) and three absorption items (items 1, 3 and 4). Additionally, total bias was observed for all the vigour items (items 1, 2, 3, 4 and 5), three dedication items (items 1, 2 and 4) and three absorption items (items 1, 3 and 4). Non-uniform bias was identified for two vigour items (items 1 and 3), one dedication item (item 2) and three absorption items (items 1, 3 and 4). Similar results were observed between the three campus groups. Specifically, uniform bias was identified for three vigour items (items 1, 2 and 4) and one absorption item (item 4). Total bias was identified for all the vigour items (items 1, 2, 3, 4 and 5) and two absorption items (items 1 and 4). Lastly, non-uniform bias was identified for two vigour items (items 3 and 5) and one absorption item (item 1). The item bias identified within the various language and campus groups can potentially be attributed to cultural differences, item interpretation and ambiguous wording of items (He & Van De Vijver, 2012). However, further analysis indicated that the effect of the item bias within the language and campus groups is negligible. According to Zumbo (1999), for an item to be considered practically biased, the pseudo-McFadden R2 value needs to exceed 0.13. Additionally, the ∆β1 coefficients can also be used, with values exceeding 0.10 (10%) indicating practical significance (Crane et al., 2007). Therefore, as the pseudo-McFadden R2 values do not exceed 0.13 and the ∆β1 coefficients do not exceed 0.10, we found sufficient evidence to support the unbiased application of the UWES-S within our sample. Studies investigating the item bias of the UWES-S are limited, and only one South African study that examined DIF could be identified. That study found that Setswana-speaking students encountered difficulty with one vigour item, 'When I am studying, I feel mentally strong' (Mostert et al., 2007).
Internal consistency was measured using Cronbach's alpha coefficient (α) and McDonald's omega coefficient (ω). The results indicate acceptable internal consistency for vigour (α = 0.873, ω = 0.878), dedication (α = 0.783, ω = 0.849) and absorption (α = 0.769, ω = 0.772). These findings are consistent with previous studies involving Portuguese, Italian and South African students, which reported acceptable Cronbach's alpha coefficients for the three-factor UWES-S (Cadime et al., 2016; Loscalzo & Giannini, 2018; Pienaar & Sieberhagen, 2005).
Limitations and recommendations
Although this study provides preliminary evidence for the valid, reliable and unbiased use of the UWES-S within the university context, several limitations should be highlighted. Firstly, the study was conducted on three campuses of a specific South African university, and the sample was predominantly male (73.1%), limiting the generalisation and application of the findings. Future replications should be conducted in other tertiary institutions to enhance the application of the UWES-S within the university context.
Secondly, multiple items within the UWES-S contained bias, which could affect the application of the measure across diverse groups. Although the effect was negligible, further research is needed to identify whether the bias was sample specific. In addition to item bias, the results indicate one problematic dedication item (item 5), 'I find my studies challenging' (Schaufeli et al., 2002a, p. 478), with a negative factor loading, warranting further investigation regarding its formulation and scoring (Chen, 2017; Van De Vijver & Leung, 2021).
Thirdly, measurement invariance was analysed only across four language groups and three campuses. Future studies, especially in South Africa, could broaden the analysis to measure the invariance of the UWES-S across other language groups, genders, faculties and tertiary institutions. This would further assist in making cross-cultural and regional comparisons.
Fourthly, to advance research on academic engagement, it would be valuable to explore the application of the UWES-S14 within the field of artificial intelligence (AI)-driven analytics. Artificial intelligence technology simulates human intelligence in machines, allowing them to process data, recognise patterns and make autonomous decisions (Sheikh et al., 2023). Given the growing integration of AI analytics in various domains, particularly psychometrics (Wang et al., 2023), AI-driven analytics can enhance the study of academic engagement by facilitating the analysis of large datasets, the identification of complex patterns, the development of predictive models that provide early warning signs of disengagement and the implementation of personalised interventions.
Conclusion
This study aimed to examine the psychometric properties of the UWES-S14 for a sample of South African university students. Specifically, it evaluated factorial validity, measurement invariance, item bias and internal consistency. The results of this study provide adequate evidence to support the three-factor structure of the UWES-S and demonstrate measurement invariance (configural, metric and strong invariance) across the language and campus groups involved in this study. The analysis also found the UWES-S to be unbiased across language and campus groups. Moreover, the three-factor UWES-S was deemed reliable. Therefore, this study supports the valid, reliable and unbiased application of the UWES-S among university students in South Africa.
Acknowledgements
Competing interests
The authors declare that they have no financial or personal relationships that may have inappropriately influenced them in writing this article. The author, L.D.B., serves as an editorial board member of this journal. The peer review process for this submission was handled independently, and the author had no involvement in the editorial decision-making process for this manuscript. The authors have no other competing interests to declare.
Authors’ contributions
C.P.B. conducted the literature review, collected the data and wrote the initial draft of the manuscript. K.M. conceptualised and supervised the study, verified the analytical methods, assisted with the reviewing and editing of the manuscript, provided the necessary resources and secured funding for the project. C.D.T. assisted with the reviewing and editing of the manuscript. L.D.B. conducted the formal statistical analyses and assisted with interpreting the results. A.B.B. was a critical reader of the final draft. All authors (C.P.B., K.M., C.D.T., L.D.B. and A.B.B.) discussed the results and contributed to the final manuscript.
Funding information
The information in this article is based on research funded by North-West University’s Deputy Vice-Chancellor: Teaching and Learning office.
Data availability
The data that support the findings of this study are available from the corresponding author, K.M., upon reasonable request.
Disclaimer
The views and opinions expressed in this article are those of the authors and are the product of professional research. They do not necessarily reflect the official policy or position of any affiliated institution, funder, agency, or that of the publisher. The authors are responsible for this article’s results, findings, and content.
References
Appelbaum, M.I., Cooper, H., Kline, R.B., Mayo-Wilson, E., Nezu, A.M., & Rao, S.M. (2018). Journal article reporting standards for quantitative research in psychology: The APA publications and communications board task force report. American Psychologist, 73(1), 3–25. https://doi.org/10.1037/amp0000191
Bakker, A.B., & Mostert, K. (2024). Study demands–resources theory: Understanding student well-being in higher education. Educational Psychology Review, 36(3), 92. https://doi.org/10.1007/s10648-024-09940-8
Bakker, A.B., Demerouti, E., & Sanz-Vergel, A. (2023). Job demands–resources theory: Ten years later. Annual Review of Organizational Psychology and Organizational Behavior, 10, 25–53. https://doi.org/10.1146/annurev-orgpsych-120920-053933
Berry, J.W., Poortinga, Y.H., Segall, M.H., & Dasen, P.R. (2002). Cross-cultural psychology: Research and applications (2nd ed.). Cambridge University Press. Retrieved from https://psycnet.apa.org/record/2003-02471-000
Brown, T.A., & Moore, M.T. (2012). Confirmatory factor analysis. In R.H. Hoyle (Ed.), Handbook of structural equation modeling (pp. 361–380). The Guilford Press. Retrieved from https://www.researchgate.net/publication/361861039_Handbook_of_Structural_Equation_Modeling
Bumbacco, C., & Scharfe, E. (2023). Why attachment matters: First-year post-secondary students’ experience of burnout, disengagement, and drop-out. Journal of College Student Retention: Research, Theory & Practice, 24(4), 988–1001. https://doi.org/10.1177/1521025120961012
Cadime, I., Lima, S., Pinto, A.M., & Ribeiro, I. (2016). Measurement invariance of the Utrecht work engagement scale for students: A study across secondary school pupils and university students. European Journal of Developmental Psychology, 13(2), 254–263. https://doi.org/10.1080/17405629.2016.1148595
Carmona-Halty, M., Salanova, M., Llorens, S., & Schaufeli, W.B. (2021). Linking positive emotions and academic performance: The mediated role of academic psychological capital and academic engagement. Current Psychology, 40(6), 2938–2947. https://doi.org/10.1007/s12144-019-00227-8
Carmona-Halty, M.A., Schaufeli, W.B., & Salanova, M. (2019). The Utrecht Work Engagement Scale for Students (UWES–9S): Factorial validity, reliability, and measurement invariance in a Chilean sample of undergraduate university students. Frontiers in Psychology, 10, 01017. https://doi.org/10.3389/fpsyg.2019.01017
Chen, F.F. (2007). Sensitivity of goodness of fit indexes to lack of measurement invariance. Structural Equation Modeling: A Multidisciplinary Journal, 14(3), 464–504. https://doi.org/10.1080/10705510701301834
Chen, F.F. (2008). What happens if we compare chopsticks with forks? The impact of making inappropriate comparisons in cross-cultural research. Journal of Personality and Social Psychology, 95(5), 1005–1018. https://doi.org/10.1037/a0013193
Chen, Y. (2017). On the impact of negatively keyed items on the assessment of the unidimensionality of psychological tests and measures. PhD thesis, University of British Columbia.
Chi, L., Tang, T., & Tang, E. (2022). Psychometric properties of the Utrecht Work Engagement Scale for Students (UWES-S) in the Taiwanese context. Current Psychology, 42(31), 27428–27441. https://doi.org/10.1007/s12144-022-03737-0
Chipchase, L.S., Davidson, M., Blackstock, F., Bye, R., Colthier, P., Krupp, N., Dickson, W., Turner, D.E., & Williams, M.A. (2017). Conceptualising and measuring student disengagement in higher education: A synthesis of the literature. International Journal of Higher Education, 6(2), 31–42. https://doi.org/10.5430/ijhe.v6n2p31
Choi, S.W., Gibbons, L.E., & Crane, P.K. (2011). Lordif: An R package for detecting differential item functioning using iterative hybrid ordinal logistic regression/item response theory and Monte Carlo simulations. Journal of Statistical Software, 39(8), 1–30. https://doi.org/10.18637/jss.v039.i08
Cilliers, J., Mostert, K., & Nel, J.A. (2018). Study demands, study resources and the role of personality characteristics in predicting the engagement of first-year university students. South African Journal of Higher Education, 32(1), 49–70. https://doi.org/10.20853/32-1-1575
Cohen, J. (2013). Statistical power analysis for the behavioral sciences (2nd ed.). Routledge.
Conner, J., & Pope, D. (2014). Student engagement in high-performing schools: Relationships to mental and physical health. Teachers College Record, 116(13), 80–100. https://doi.org/10.1177/016146811411601314
Crane, P.K., Gibbons, L.E., Ocepek-Welikson, K., Cook, K., Cella, D., Narasimhalu, K., Hays, R.D., & Teresi, J.A. (2007). A comparison of three sets of criteria for determining the presence of differential item functioning using ordinal logistic regression. Quality of Life Research, 16(1), 69–84. https://doi.org/10.1007/s11136-007-9185-5
De Souza, A.C., Alexandre, N.M.C., & De Brito Guirardello, E. (2017). Propriedades psicométricas na avaliação de instrumentos: Avaliação da confiabilidade e da validade. Epidemiologia E Serviços De Saúde, 26(3), 649–659. https://doi.org/10.5123/S1679-49742017000300022
Dimitriadou, S., Lavidas, K., Karalis, T., & Ravanis, K. (2020). Study engagement in university students: A confirmatory factor analysis of the Utrecht Work Engagement Scale with Greek students. Journal of Well-being Assessment, 4(3), 291–307. https://doi.org/10.1007/s41543-021-00035-7
Government Gazette. (1998). Republic of South Africa, Vol. 400, no.19370. Retrieved from https://www.gov.za/sites/default/files/gcis_document/201409/a55-98ocr.pdf
Gurcan, F., Erdogdu, F., Cagiltay, N.E., & Cagiltay, K. (2023). Student engagement research trends of past 10 years: A machine learning-based analysis of 42,000 research articles. Education and Information Technologies, 28(11), 15067–15091. https://doi.org/10.1007/s10639-023-11803-8
He, J., & Van De Vijver, F. (2012). Bias and equivalence in cross-cultural research. Online Readings in Psychology and Culture, 2(2), 8. https://doi.org/10.9707/2307-0919.1111
Hoyle, R.H. (2023). Handbook of structural equation modeling (2nd ed.). Guilford Publications.
Ishimaru, D., Adachi, H., Mizumoto, T., Erdelyi, V., Nagahara, H., Shirai, S., Takemura, H., Takemura, N., Alizadeh, M., Higashino, T., Yagi, Y., & Ikeda, M. (2023). Criteria for detection of possible risk factors for mental health problems in undergraduate university students. Frontiers in Psychiatry, 14, 1184156. https://doi.org/10.3389/fpsyt.2023.1184156
Jang, A., & An, M. (2022). Korean version of the 17-Item Utrecht Work Engagement Scale for university students: A validity and reliability study. Healthcare, 10(4), 642–652. https://doi.org/10.3390/healthcare10040642
Körner, L.S., Mülder, L.M., Bruno, L., Janneck, M., Dettmers, J., & Rigotti, T. (2023). Fostering study crafting to increase engagement and reduce exhaustion among higher education students: A randomized controlled trial of the STUDYCoach online intervention. Applied Psychology: Health and Well-Being, 15(2), 776–802. https://doi.org/10.1111/aphw.12410
Kristjansson, E., Aylesworth, R., McDowell, I., & Zumbo, B.D. (2005). A comparison of four methods for detecting differential item functioning in ordered response items. Educational and Psychological Measurement, 65(6), 935–953. https://doi.org/10.1177/0013164405275668
Laher, S. (2024). Advancing an agenda for psychological assessment in South Africa. South African Journal of Psychology, 54(4), 515–530. https://doi.org/10.1177/00812463241268528
Laher, S., & Cockcroft, K. (2013). Contextualising psychological assessment in South Africa. In S. Laher & K. Cockcroft (Eds.), Psychological Assessment in South Africa: Research and applications (pp. 1–14). Wits University Press.
Leavy, P. (2017). Research design: Quantitative, qualitative, mixed methods, arts-based, and community-based participatory research approaches. Guilford Publications.
Lesener, T., Gusy, B., & Wolter, C. (2019). The job demands-resources model: A meta-analytic review of longitudinal studies. Work & Stress, 33(1), 76–103. https://doi.org/10.1080/02678373.2018.1529065
Liu, M., Gorgievski, M.J., Zwaga, J., & Paas, F. (2023). How entrepreneurship program characteristics foster students’ study engagement and entrepreneurial career intentions: A longitudinal study. Learning and Individual Differences, 101, 102249. https://doi.org/10.1016/j.lindif.2022.102249
Liu, W., Zhang, W., Van Der Linden, D., & Bakker, A.B. (2023). Flow and flourishing during the pandemic: The roles of strengths use and playful design. Journal of Happiness Studies, 24(7), 2153–2175. https://doi.org/10.1007/s10902-023-00670-2
Loscalzo, Y., & Giannini, M. (2018). Study engagement in Italian University students: A confirmatory factor analysis of the Utrecht Work Engagement Scale – Student version. Social Indicators Research, 142(2), 845–854. https://doi.org/10.1007/s11205-018-1943-y
Martin, A.J., Collie, R.J., & Nagy, R.P. (2021). Adaptability and high school students’ online learning during COVID-19: A job demands-resources perspective. Frontiers in Psychology, 12, 702163. https://doi.org/10.3389/fpsyg.2021.702163
Maslach, C., & Leiter, M.P. (1997). The truth about burnout: How organizations cause personal stress and what to do about it (1st ed.). Jossey-Bass.
Maslach, C., Jackson, S.E., & Leiter, M.P. (1996). Maslach burnout inventory (3rd ed.). Mountain View.
Morin, A.J.S., Myers, N.D., & Lee, S. (2020). Modern factor analytic techniques: Bifactor models, exploratory structural equation modeling (ESEM), and bifactor-ESEM. In G. Tenenbaum, & R.C. Eklund (Eds.), Handbook of sport psychology (1st ed., pp. 1044–1073). Wiley.
Mostert, K., Pienaar, J., Gauche, C., & Jackson, L. (2007). Burnout and engagement in university students: A psychometric analysis of the MBI-SS and UWES-S. South African Journal of Higher Education, 21(1), 147–162. https://doi.org/10.4314/sajhe.v21i1.25608
Mülder, L.M., Schimek, S., Werner, A.M., Reichel, J.L., Heller, S., Tibubos, A.N., Schäfer, M., Dietz, P., Letzel, S., Beutel, M. E., Stark, B., Simon, P., & Rigotti, T. (2022). Distinct patterns of university students study crafting and the relationships to exhaustion, well-being, and engagement. Frontiers in Psychology, 13, 895930. https://doi.org/10.3389/fpsyg.2022.895930
Muthén, L., & Muthén, B. (2023). Mplus user’s guide, version 8.9, 1998–2023 (8th ed.). Muthén & Muthén. Retrieved from https://www.statmodel.com/download/usersguide/MplusUserGuideVer_8.pdf
Nakamura, J., & Csikszentmihalyi, M. (2014). The concept of flow. In M. Csikszentmihalyi (Ed.), Flow and the foundations of positive psychology (pp. 239–263). Springer Netherlands.
Naudé, J., & Rothmann, S. (2004). The validation of the Utrecht work engagement scale for emergency medical technicians in Gauteng. South African Journal of Economic and Management Sciences, 7(3), 459–468. https://doi.org/10.4102/sajems.v7i3.1356
Piedmont, R.L. (2014). Factorial validity. In A.C. Michalos (Ed.), Encyclopedia of quality of life and well-being research (pp. 2148–2149). Springer Netherlands.
Pienaar, J., & Sieberhagen, C.F. (2005). Burnout and engagement of student leaders in a higher education institution. South African Journal of Higher Education, 19(1), 155–166. Retrieved from https://repository.nwu.ac.za:443/bitstream/10394/175/1/sieberhagen_c.pdf
Price, D.V., & Tovar, E. (2014). Student engagement and institutional graduation rates: Identifying high-impact educational practices for community colleges. Community College Journal of Research and Practice, 38(9), 766–782. https://doi.org/10.1080/10668926.2012.719481
Rakgogo, T.J. (2024). A linguistic evaluation of the South African higher education sector: A reflection on 30 years of democracy (1994–2024). Transformation in Higher Education, 9(0), a342. https://doi.org/10.4102/the.v9i0.342
Rastogi, A., Pati, S.P., Kumar, P., Dixit, J.K., & Pradhan, S. (2018). Student engagement in Indian context: UWES-S validation and relationship with burnout and life satisfaction. International Journal of Work Organisation and Emotion, 9(1), 89. https://doi.org/10.1504/IJWOE.2018.091340
Revicki, D. (2014). Internal consistency reliability. In A.C. Michalos (Ed.), Encyclopedia of quality of life and well-being research (pp. 3305–3306). Springer Netherlands.
Römer, J. (2016). The Korean Utrecht Work Engagement Scale-Student (UWES-S): A factor validation study. Testing, Psychometrics, Methodology in Applied Psychology, 23(1), 65–81. Retrieved from https://psycnet.apa.org/record/2016-12605-005
Salmela-Aro, K., Tang, X., & Upadyaya, K. (2022). Study demands-resources model of student engagement and burnout. In A.L. Reschly, & S.L. Christenson (Eds.), Handbook of research on student engagement (pp. 77–93). Springer International Publishing.
Saqr, M., & López-Pernas, S. (2021). The longitudinal trajectories of online engagement over a full program. Computers & Education, 175, 104325. https://doi.org/10.1016/j.compedu.2021.104325
Schaap, P., & Kekana, E. (2016). The structural validity of the experience of Work and Life Circumstances Questionnaire (WLQ). SA Journal of Industrial Psychology, 42(1), a1349. https://doi.org/10.4102/sajip.v42i1.1349
Schaufeli, W.B., & Bakker, A.B. (2004). UWES Utrecht work engagement scale: Preliminary manual version 1.1. Department of Psychology, Utrecht University.
Schaufeli, W.B., Martínez, I.M., Marques Pinto, A., Salanova, M., & Bakker, A.B. (2002a). Burnout and engagement in university students: A cross-national study. Journal of Cross-Cultural Psychology, 33(5), 464–481. https://doi.org/10.1177/0022022102033005003
Schaufeli, W.B., Salanova, M., González-Romá, V., & Bakker, A.B. (2002b). The measurement of engagement and burnout: A two sample confirmatory factor analytic approach. Journal of Happiness Studies, 3(1), 71–92. https://doi.org/10.1023/A:1015630930326
Schober, P., Boer, C., & Schwarte, L.A. (2018). Correlation coefficients: Appropriate use and interpretation. Anesthesia & Analgesia, 126(5), 1763–1768. https://doi.org/10.1213/ANE.0000000000002864
Seligman, M.E.P., & Csikszentmihalyi, M. (2000). Positive psychology: An introduction. American Psychologist, 55(1), 5–14. https://doi.org/10.1037/0003-066X.55.1.5
Serrano, C., Andreu, Y., Murgui, S., & Martínez, P. (2019). Psychometric properties of Spanish version Student Utrecht Work Engagement Scale (UWES–S–9) in high-school students. Spanish Journal of Psychology, 22(21), 1–9. https://doi.org/10.1017/sjp.2019.25
Sheikh, H., Prins, C., & Schrijvers, E. (2023). Artificial intelligence: definition and background. In J.E.J. Prince & F.W.A. Bron (Eds.), Mission AI: The new system technology (pp. 15–41). Springer.
Shevlin, M., & Miles, J.N.V. (1998). Effects of sample size, model specification and factor loadings on the GFI in confirmatory factor analysis. Personality and Individual Differences, 25(1), 85–90. https://doi.org/10.1016/S0191-8869(98)00055-5
Shi, D., Lee, T., & Maydeu-Olivares, A. (2019). Understanding the model size effect on SEM fit indices. Educational and Psychological Measurement, 79(2), 310–334. https://doi.org/10.1177/0013164418783530
Snijders, I., Wijnia, L., Rikers, R.M.J.P., & Loyens, S.M.M. (2020). Building bridges in higher education: Student-faculty relationship quality, student engagement, and student loyalty. International Journal of Educational Research, 100, 101538. https://doi.org/10.1016/j.ijer.2020.101538
Tayama, J., Schaufeli, W.B., Shimazu, A., Tanaka, M., & Takahama, A. (2019). Validation of a Japanese version of the Work Engagement Scale for students. Japanese Psychological Research, 61(4), 262–272. https://doi.org/10.1111/jpr.12229
Van De Schoot, R., Lugtig, P., & Hox, J. (2012). A checklist for testing measurement invariance. European Journal of Developmental Psychology, 9(4), 486–492. https://doi.org/10.1080/17405629.2012.686740
Van De Vijver, F., & Tanzer, N.K. (2004). Bias and equivalence in cross-cultural assessment: An overview. European Review of Applied Psychology, 54(2), 119–135. https://doi.org/10.1016/j.erap.2003.12.004
Van De Vijver, F.J.R., & Leung, K. (2021). Methods and data analysis for cross-cultural research (2nd ed.). Cambridge University Press.
Wang, W., Kofler, L., Lindgren, C., Lobel, M., Murphy, A., Tong, Q., & Pickering, K. (2023). AI for psychometrics: Validating machine learning models in measuring emotional intelligence with eye-tracking techniques. Journal of Intelligence, 11(9), 170. https://doi.org/10.3390/jintelligence11090170
Zangirolami-Raimundo, J., Echeimberg, J.D.O., & Leone, C. (2018). Research methodology topics: Cross-sectional studies. Journal of Human Growth and Development, 28(3), 356–360. https://doi.org/10.7322/jhgd.152198
Zumbo, B.D. (1999). A handbook on the theory and methods of differential item functioning (DIF). Directorate of Human Resources Research and Evaluation. Retrieved from https://faculty.educ.ubc.ca/zumbo/DIF/handbook.pdf