Abstract:
‘Purpose – The purpose of this paper is to design a rubric instrument for assessing oral presentation performance in higher education and to test its validity with an expert group.
Design/methodology/approach – This mixed-methods study focusses on: designing a rubric by identifying assessment instruments in previous presentation research and incorporating essential design characteristics into a preliminary rubric; and testing the validity of the constructed instrument with an expert group of higher education professionals (n=38).
Findings – The result of this study is a validated rubric instrument consisting of 11 presentation criteria, their related performance levels, and a five-point scoring scale. These adopted criteria correspond to the widely accepted main criteria for presentations, in both the literature and educational practice, regarding aspects such as the content of the presentation, the structure of the presentation, interaction with the audience, and presentation delivery.
Practical implications – Implications for the use of the rubric instrument in educational practice concern the extent to which the identified criteria should be adapted to the requirements of presenting in a particular domain, and whether the amount and complexity of the information in the rubric, presented as criteria, levels, and scales, can be used adequately within formative assessment processes.
Originality/value – This instrument offers the opportunity to formatively assess students’ oral presentation performance, since rubrics make criteria and expectations explicit. Furthermore, such an instrument also facilitates feedback and self-assessment processes. Finally, the rubric resulting from this study could be used in future quasi-experimental studies to measure students’ development in presentation performance in a pre- and post-test design.’
Full reference:
Van Ginkel, S., Laurentzen, R., Mulder, M., Mononen, A., Kyttä, J., & Kortelainen, M.J. (2017). Assessing oral presentation performance: Designing a rubric and testing its validity with an expert group. Journal of Applied Research in Higher Education, 9(3), 474-486. doi: 10.1108/JARHE-02-2016-0012.