Alternative Digital Credentials: UAE’s First Adopters’ Assessment and Evaluation Part (2)

El-Farra Samar

Abstract

Despite the wealth of research on full-credential assessment, standardized approaches remain scarce. This scarcity is even more threatening to the acceptance of higher education alternative digital credentials. Addressing it requires validated and transparent assessment and evaluation processes. This study is a continuation of our previous review of the pedagogical program's analysis, design, development, and implementation. This paper reviews, assesses, and evaluates the alternative digital credential offering case study. We review the development and administration of seven requirements and assessment tools used to evaluate students' performance, and we use Kirkpatrick's model to evaluate the effectiveness of the alternative credential offered. The predominantly clinical-based assessment tools and the assessment decision criteria are reviewed in detail, allowing educators to leverage the outcome of this work.

 

Results: The reviewed alternative digital credential case study, in medical imaging of the human thorax and extremities, achieved Kirkpatrick level three, as evidenced by the results, particularly those from the clinical assessments and the clinical-site viva voce. When introducing a new competency-based assessment, professional standards can serve as a reference point for developing Behavioral Marker System rubrics. The Ebel method of calculating the cut score, which reflects expert judgment, should be considered when developing competency-based rubrics. Standardization of at least the most common technical and non-technical skills (NTS) is possible if researchers pursue international collaboration by publishing comprehensive methodologies, frameworks, and results. This paper is unique in that we are unaware of any other publication on alternative digital credentials that combines medical imaging with technical and non-technical skills within entrustable professional task assessment, verification, and program evaluation.
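The Ebel method mentioned above sets a cut score from expert judgment: judges sort items into difficulty-by-relevance cells and estimate, for each cell, the percentage of borderline candidates expected to answer correctly; the cut score is the item-weighted mean of those estimates. A minimal sketch follows, using hypothetical cell counts and judge estimates (not the study's actual panel data):

```python
def ebel_cut_score(cells):
    """Compute an Ebel cut score (as a percentage).

    cells: list of (n_items, expected_pct_correct) tuples, one per
    difficulty x relevance cell, where expected_pct_correct is the judges'
    estimate of how many borderline candidates answer those items correctly.
    """
    total_items = sum(n for n, _ in cells)
    expected_correct = sum(n * pct / 100 for n, pct in cells)
    return 100 * expected_correct / total_items


# Hypothetical panel: 30 items spread across three cells.
cells = [
    (10, 90),  # easy / essential items
    (15, 70),  # medium / essential items
    (5, 50),   # hard / important items
]
print(round(ebel_cut_score(cells), 1))  # → 73.3
```

With these illustrative values, a candidate would need roughly 73% of the total score to pass; in practice the cell categories and percentages come from a calibrated judge panel, as the paper's reference to expert judgment implies.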

https://doi.org/10.26803/ijlter.21.11.11


Keywords


alternative digital credentials; clinical-based assessment; non-technical skills; medical imaging; entrustable professional tasks


References


Australian Society of Medical Imaging and Radiation Therapy (ASMIRT). (2018). Professional Practice Standards. https://www.asmirt.org/asmirt_core/wp-content/uploads/371.pdf

Cimatti, B. (2016). Definition, development, assessment of soft skills and their role for the quality of organizations and enterprises. International Journal for Quality Research, 10(1), 97–130. https://doi.org/10.18421/ijqr10.01-05

Curtis, D. (2004). The assessment of generic skills. In J. Gibb (Ed.), Generic Skills in Vocational Education and Training: Research Readings (pp. 136-156). Centre for Vocational Education Research Ltd. https://files.eric.ed.gov/fulltext/ED493988.pdf

Davis, M. H., & Karunathilake, I. (2005). The place of the oral examination in today's assessment systems. Medical Teacher, 27(4), 294–297. https://doi.org/10.1080/01421590500126437

De Champlain, A. (2019). Standard setting methods in medical education: High-stakes assessment. In T. Swanwick, K. Forrest, & B. C. O'Brien (Eds.), Understanding Medical Education: Evidence, Theory, and Practice (3rd ed., pp. 347–360). The Association for the Study of Medical Education (ASME). John Wiley & Sons Ltd. https://doi.org/10.1002/9781119373780.ch24

Driessen, E., & van Tartwijk, J. (2019). Portfolios in personal and professional development. In T. Swanwick, K. Forrest, & B. C. O'Brien (Eds.), Understanding Medical Education: Evidence, Theory, and Practice (3rd ed., pp. 225–261). The Association for the Study of Medical Education (ASME). John Wiley & Sons Ltd. https://doi.org/10.1002/9781119373780.ch18

Duffy, D., Gordon, H., Whelan, G., Cole-Kelly, K., & Frankel, R. (2004). Assessing competence in communication and interpersonal skills: The Kalamazoo II report. Academic Medicine, 79(6), 495–507. https://journals.lww.com/academicmedicine/fulltext/2004/06000/assessing_competence_in_communication_and.2.aspx

El-Farra, S. A. (2022). Alternative Digital Credentials: UAE's First Adopters' Design, Development, and Implementation Part (1). International Journal of Learning, Teaching and Educational Research, 21(10), 64–87. https://doi.org/10.26803/ijlter.21.10.4

El-Farra, S., Mohaidat, M., Aldajah, S., & Alshamsi, A. (2022). Alternative Digital Credentials—UAE's First Adopters' Quality Assurance Model and Case Study. In K. Cheng, B. Koul, T. Wang, & X. Yu (Eds.), Artificial Intelligence in Education: Emerging Technologies, Models and Applications. Lecture Notes on Data Engineering and Communications Technologies, 104 (pp. 339–359). Springer. https://doi.org/10.1007/978-981-16-7527-0_25

Englander, R., Frank, J. R., Carraccio, C., Sherbino, J., Ross, S., Snell, L., & ICBME Collaborators. (2017). Toward a shared language for competency-based medical education. Medical Teacher, 39(6), 582–587. https://doi.org/10.1080/0142159X.2017.1315066

Gallardo, K. (2020). Competency-based assessment and the use of performance-based evaluation rubrics in higher education: Challenges towards the next decade. Problems of Education in the 21st Century, 78(1), 61-79. https://doi.org/10.33225/pec/20.78.61

Ganji, K. K. (2017). Evaluation of reliability in structured viva voce as a formative assessment of dental students. Journal of Dental Education, 81(5), 590–596. https://doi.org/10.21815/JDE.016.017

Gibbs, G. (1988). Learning by Doing: A guide to teaching and learning methods. Further Education Unit. Oxford Polytechnic. https://thoughtsmostlyaboutlearning.files.wordpress.com/2015/12/learning-by-doing-graham-gibbs.pdf

Goldman, J., & Wong, B. M. (2020). Nothing soft about 'soft skills': Core competencies in quality improvement and patient safety education and practice. BMJ Quality & Safety, 29, 619–622. https://doi.org/10.1136/bmjqs-2019-010512

Gordon, M., Farnan, J., Grafton-Clarke, C., Ahmed, R., Gurbutt, D., McLachlan, J., & Daniel, M. (2019). Non-technical skills assessments in undergraduate medical education: A focused BEME systematic review: BEME Guide No. 54. Medical Teacher, 41(7), 732–745. https://doi.org/10.1080/0142159X.2018.1562166

Gruppen, L. D., Mangrulkar, R. S., & Kolars, J. C. (2012). The promise of competency-based education in the health professions for improving global health. Human Resources for Health, 10(1), 1–7. https://human-resources-health.biomedcentral.com/articles/10.1186/1478-4491-10-43

Higham, H., Greig, P. R., Rutherford, J., Vincent, L., Young, D., & Vincent, C. (2019). Observer-based tools for non-technical skills assessment in simulated and real clinical environments in healthcare: A systematic review. BMJ Quality & Safety, 28(8), 672–686. https://ora.ox.ac.uk/objects/uuid:b9e628c5-9501-4ad4-9304-dcfd75d773b2

Hojat, M. (2016). A definition and key features of empathy in patient care. In Empathy in Health Professions Education and Patient Care (pp. 71–81). Springer. https://doi.org/10.1007/978-3-319-27625-0_6

Hojat, M., DeSantis, J., Shannon, S. C., Mortensen, L. H., Speicher, M. R., Bragan, L., LaNoue, M., & Calabrese, L. H. (2018). The Jefferson Scale of Empathy: A nationwide study of measurement properties, underlying components, latent variable structure, and national norms in medical students. Advances in Health Sciences Education, 23(5), 899–920. https://doi.org/10.1007/s10459-018-9839-9

Hojat, M., & Gonnella, J. S. (2017). What matters more about the Interpersonal Reactivity Index and the Jefferson Scale of Empathy? Their underlying constructs or their relationships with pertinent measures of clinical competence and patient outcomes? Academic Medicine, 92(6), 743–745. https://doi.org/10.1097/ACM.0000000000001424

International Society of Radiographers and Radiological Technologists (ISRRT). (2022). Board of Management Motion International Academic Network (IAN) New membership Category. https://www.isrrt.org/pdf/Item_6_new_membership_category_educational_institutes.pdf

Jefferies, A., Simmons, B., Ng, E., & Skidmore, M. (2011). Assessment of multiple physician competencies in postgraduate training: Utility of the structured oral examination. Advances in Health Sciences Education, 16(5), 569–577. https://doi.org/10.1007/s10459-011-9275-6

Kirkpatrick, J. D., & Kirkpatrick, W. K. (2016). Kirkpatrick's four levels of training evaluation. Association for Talent Development.

Lampignano, J., & Kendrick, L. E. (2017). Bontrager's textbook of radiographic positioning and related anatomy-E-book. Elsevier Health Sciences.

Norcini, J., & Zaidi, Z. (2019). Workplace assessment. In T. Swanwick, K. Forrest, & B. C. O'Brien (Eds.), Understanding Medical Education: Evidence, Theory, and Practice (3rd ed., pp. 319–334). The Association for the Study of Medical Education (ASME). John Wiley & Sons Ltd. https://doi.org/10.1002/9781119373780.ch22

Schuwirth, L. W. T., & van der Vleuten, C. P. M. (2019). How to design a useful test: The principles of assessment. In T. Swanwick, K. Forrest, & B. C. O'Brien (Eds.), Understanding Medical Education: Evidence, Theory, and Practice (3rd ed., pp. 277–290). The Association for the Study of Medical Education (ASME). John Wiley & Sons Ltd. https://doi.org/10.1002/9781119373780.ch20

Shenton, A. K. (2004). Strategies for ensuring trustworthiness in qualitative research projects. Education for Information, 22(2), 63–75. https://www.pm.lth.se/fileadmin/_migrated/content_uploads/Shenton_Trustworthiness.pdf

Shenwai, M. R., & Patil, K. (2013). Introduction of structured oral examination as a novel assessment tool to first year medical students in physiology. Journal of Clinical and Diagnostic Research, 7(11), 2544–2547. https://doi.org/10.7860/JCDR/2013/7350.3606

Thomas, M. J. (2018). Training and assessing non-technical skills: A practical guide. CRC Press. https://doi.org/10.1201/9781315550336

Velasco-Martínez, L. C., & Hurtado, J. C. T. (2018). The use of rubrics in higher education and competences evaluation. Profesorado, 22(3), 183–208. https://doi.org/10.30827/profesorado.v22i3.7998

Yune, S. J., Lee, S. Y., Im, S. J., Kam, B. S., & Baek, S. Y. (2018). Holistic rubric vs. analytic rubric for measuring clinical performance levels in medical students. BMC Medical Education, 18(1), 1–6. https://doi.org/10.1186/s12909-018-1228-9




e-ISSN: 1694-2116

p-ISSN: 1694-2493