Comparing Competencies Assessment Tools: Identifying the Most Effective and Accurate Approach Academic Article in Scopus

abstract

  • This paper presents a three-year study analyzing changes in student competency assessment and performance following modifications to the assessment tools across academic periods at a private Mexican university, with a focus on accuracy and reliability. In 2019, the university introduced a New Educational Model built on a competency-based curriculum; it uses active learning and adopts Challenge-Based Learning as the main learning technique for most courses. However, the assessment instrument selected at the time was a binary 'Observed or Not Observed' checklist, which led to an overly positive evaluation bias. After careful analysis, the university transitioned to a multi-level, competency-specific rubric designed by professors from various disciplines, allowing for a more granular evaluation. This study compares three full semesters of assessments using the original checklists (1730 students, 20 courses, 830 groups, and 306 educators) with three semesters using the new rubrics (2856 students, 20 courses, 899 groups, and 258 educators), using a normalized assessment average. The comparison includes only those competencies assessed throughout the semesters starting from Spring 2020. Analysis of the data (n=120,245 assessments) reveals that checklists generate a positive bias, resulting in a higher weighted average (92.9%) than rubrics (89.0%). This bias decreases with rubrics, bringing assessments closer to institutional goals. The data indicate a statistically significant difference (chi-square, p=1.35E-128) between the assessment results produced by the two instruments. The shift to rubrics therefore appears to enhance professors' and instructors' evaluation of competencies across an array of courses, giving students a fairer assessment of performance and competencies and educators a more precise feedback tool. 
Further work is needed to include more students in the sample and implement more specific and improved rubrics. © 2024 IEEE.
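As an illustration of the kind of test the abstract cites, the sketch below runs a Pearson chi-square test of independence on a 2x2 contingency table of assessment outcomes. All counts here are hypothetical, invented for the example; they are not the study's data, and the paper's actual tabulation of its 120,245 assessments is not given in the abstract. Pure-stdlib Python is used; for a 2x2 table df = 1, so the p-value reduces to the complementary error function.

```python
import math

# Hypothetical 2x2 contingency table (NOT the paper's raw data):
# rows = instrument (checklist, rubric),
# cols = outcome (competency "achieved", "not achieved") at some cut-off.
observed = [
    [930, 70],   # checklist: achieved, not achieved
    [890, 110],  # rubric:    achieved, not achieved
]

row_totals = [sum(row) for row in observed]
col_totals = [sum(col) for col in zip(*observed)]
grand_total = sum(row_totals)

# Pearson chi-square statistic: sum over cells of (O - E)^2 / E,
# where E is the expected count under independence.
chi2 = 0.0
for i, row in enumerate(observed):
    for j, o in enumerate(row):
        e = row_totals[i] * col_totals[j] / grand_total
        chi2 += (o - e) ** 2 / e

# With df = 1, the chi-square survival function equals erfc(sqrt(x/2)).
p_value = math.erfc(math.sqrt(chi2 / 2))

print(f"chi2 = {chi2:.3f}, p = {p_value:.4f}")
```

A small p-value here would indicate that the score distribution depends on which instrument was used, which is the form of conclusion the abstract reports (p=1.35E-128 on the real data). In practice one would use `scipy.stats.chi2_contingency`, which also handles tables larger than 2x2.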

publication date

  • January 1, 2024