LEARNING AGILITY AND ITS RELATIONSHIP WITH ACADEMIC ACHIEVEMENT AMONG UNIVERSITY STUDENTS: A COMPARATIVE STUDY ACROSS TWO FACULTIES
Abstract
This study investigates learning agility among students from two faculties: the Faculty of Science and Industrial Technology and the Faculty of Innovation in Agriculture, Fisheries, and Food. The objectives were to 1) identify the level of students’ learning agility, 2) examine the relationship between learning agility and cumulative grade point average (GPA), and 3) compare differences in learning agility between students from the two faculties. A survey research design was employed, with data collected from 477 students at Prince of Songkla University, Surat Thani Campus. The research instrument was a learning agility scale developed from the concept proposed by Hallenbeck and validated for content validity by experts. Data were analyzed using descriptive statistics, Pearson correlation, and multivariate analysis of variance (MANOVA). The findings revealed that students demonstrated a high overall level of learning agility (M = 5.75). However, the relationship between learning agility and GPA was weak and not statistically significant: a slight negative correlation was found among students in the Faculty of Science and Industrial Technology (r = -.07, p > .05), whereas a slight positive correlation was observed among students in the Faculty of Innovation in Agriculture, Fisheries, and Food (r = .12, p > .05). The comparison between the two faculties showed a statistically significant multivariate difference (Pillai’s Trace = .042, F(4, 472) = 5.12, p < .01), with students from the Faculty of Innovation in Agriculture, Fisheries, and Food scoring higher across all dimensions. These findings suggest that learning agility may not be directly associated with academic achievement as measured by GPA, but rather functions as a context-dependent competency. Higher education assessment practices should therefore incorporate authentic and competency-based evaluation approaches to better capture students’ holistic capabilities.
References
Andrade, H. L. (2020). A critical review of research on student self-assessment. Frontiers in Education, 5, 87. https://doi.org/10.3389/feduc.2020.00087.
Boateng, G. O. et al. (2018). Best practices for developing and validating scales for health, social, and behavioral research: A primer. Frontiers in Public Health, 6, 149. https://doi.org/10.3389/fpubh.2018.00149.
Boud, D. & Falchikov, N. (2020). Aligning assessment with long-term learning. Assessment & Evaluation in Higher Education, 45(7), 1035-1048. https://doi.org/10.1080/02602938.2020.1764761.
Burke, W. W. (2014). Organization change: Theory and practice. (4th ed.). Thousand Oaks, California: Sage Publications.
DeRue, D. S. et al. (2012). Learning agility: In search of conceptual clarity and theoretical grounding. Industrial and Organizational Psychology, 5(3), 258-279. https://doi.org/10.1111/j.1754-9434.2012.01444.x.
DeVellis, R. F. (2017). Scale development: Theory and applications. (4th ed.). Thousand Oaks, California: Sage Publications.
Field, A. (2013). Discovering statistics using IBM SPSS statistics. (4th ed.). Thousand Oaks, California: Sage Publications.
Hallenbeck, G. S. (2016). Learning agility: Unlock the lessons of experience. Greensboro: Center for Creative Leadership.
Haynes, S. N. et al. (1995). Content validity in psychological assessment: A functional approach. Psychological Assessment, 7(3), 238-247. https://doi.org/10.1037/1040-3590.7.3.238.
Hoff, D. & Burke, W. W. (2017). Learning agility: The key to leader potential. New York: Columbia University Press.
Lombardo, M. M. & Eichinger, R. W. (2000). High potentials as high learners. Human Resource Management, 39(4), 321-329.
McKinsey & Company. (2020). The future of work after COVID-19. New York: McKinsey Global Institute.
Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons’ responses and performances. American Psychologist, 50(9), 741-749. https://doi.org/10.1037/0003-066X.50.9.741.
Organisation for Economic Co-operation and Development. (2021). 21st-century readers: Developing literacy skills in a digital world. Paris: OECD Publishing.
Pellegrino, J. W. (2020). Educational assessment in the 21st century: Challenges and opportunities. Educational Measurement: Issues and Practice, 39(3), 34-41. https://doi.org/10.1111/emip.12342.
Polit, D. F. & Beck, C. T. (2006). The content validity index: Are you sure you know what’s being reported? Critique and recommendations. Research in Nursing & Health, 29(5), 489-497. https://doi.org/10.1002/nur.20147.
Tabachnick, B. G. & Fidell, L. S. (2019). Using multivariate statistics. (7th ed.). New York: Pearson.
Tai, J. et al. (2022). Developing evaluative judgement: Enabling students to make decisions about the quality of work. Higher Education, 83(2), 353-370. https://doi.org/10.1007/s10734-021-00705-1.
World Economic Forum. (2016). The future of jobs: Employment, skills and workforce strategy for the Fourth Industrial Revolution. Geneva, Switzerland: World Economic Forum.
Yamane, T. (1967). Statistics: An introductory analysis. (2nd ed.). New York: Harper & Row.