Development of an Objective Structured Clinical Examination as a Summative Clinical Competency Examination for the Doctor of Optometry Program

[Figure: Miller's pyramid of assessment]

Grant recipients:

Patricia Hrynchak, Stan Woo, Jenna Bright, Andre Stanberry, School of Optometry and Vision Science

(Project timeline: January 2018-June 2020)

Description

This project will lead to the development and implementation of an Objective Structured Clinical Examination (OSCE) for the Doctor of Optometry program, an optimal summative assessment of end-of-program student abilities. The OSCE will measure exit-level competencies more effectively than our current assessment system, help prepare students to be successful on board examinations (by giving them exposure to the format used in the Optometry Examining Board of Canada's examinations), and use standardized methodology to increase the reliability and validity of the assessment system. The results of this assessment will be used in program evaluation to improve the quality of the Doctor of Optometry curriculum.

Intended Project Goals

The objective of this proposal is to develop and implement a summative clinical competency examination for the UW Doctor of Optometry program in the form of an OSCE. The OSCE is a well-established assessment tool used in undergraduate and high-stakes assessment systems in medicine, dentistry, pharmacy, and nursing, as well as other health professions. Implementing it will bring the Doctor of Optometry program up to current standards of best practice in assessment. The assessment was originally intended to serve as a summative assessment for the International Optometric Bridging Program (IOBP), but that program has been discontinued and replaced by an Advanced Standing Program in which foreign-trained practitioners enter the UW optometry curriculum at the beginning of third year. These students will graduate with the regular optometry cohort and take any assessments required for completing the program.

Findings

An organizational team was formed. The examination was blueprinted to the Optometry Examining Board of Canada competency profile. Development of the examination stations included determining the topic, the station type, and the clinical abilities to be assessed. Assessment tools, including global rating scales and checklists, were developed. The stations were reviewed by other members of the team and then piloted for feasibility by the team members. A pool of examiners was trained to grade consistently. Standardized patients were hired from a professional program and trained to represent a particular problem consistently. Equipment was purchased as needed, and the space and human resources were organized. The OSCE was first trialed on a group of students graduating from the International Optometric Bridging Program, and the results were used to refine the stations. The OSCE was then given to a volunteer group of graduating students from the professional optometry program. The borderline groups method was used to determine the cut score.
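As an illustration of this standard-setting step, the sketch below shows one common way a borderline group cut score can be computed: for each station, take the mean checklist score of candidates whose global rating was "borderline," then average the station cut scores for an overall examination cut score. The station names, ratings, and numbers are hypothetical placeholders, not data from this project.

```python
from statistics import mean

# Hypothetical per-station results: each record holds a candidate's checklist
# score (as a percentage) and the examiner's global rating for that station.
station_results = {
    "case_history": [
        {"score": 82, "global": "clear pass"},
        {"score": 64, "global": "borderline"},
        {"score": 58, "global": "borderline"},
        {"score": 45, "global": "clear fail"},
    ],
    "tonometry": [
        {"score": 90, "global": "clear pass"},
        {"score": 70, "global": "borderline"},
        {"score": 55, "global": "clear fail"},
    ],
}

def borderline_group_cut_score(records):
    """Cut score = mean checklist score of candidates rated 'borderline'."""
    borderline_scores = [r["score"] for r in records if r["global"] == "borderline"]
    return mean(borderline_scores) if borderline_scores else None

# Station-level cut scores
station_cuts = {name: borderline_group_cut_score(recs)
                for name, recs in station_results.items()}

# Overall examination cut score: average of the available station cut scores
exam_cut = mean(c for c in station_cuts.values() if c is not None)

print(station_cuts)        # e.g. {'case_history': 61, 'tonometry': 70}
print(round(exam_cut, 1))  # e.g. 65.5
```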

There were 90 students eligible to take the OSCE. Of those, 54 volunteered; 53 consented to have their examination results used in the analysis and one declined. There were 11 active stations and three rest stations. Six of the active stations involved standardized patients and four used simulators for demonstration of procedural skills. Of the 53 students, 46 (87%) passed the examination. Analysis of the data is ongoing.

As part of program evaluation, a satisfaction survey was given to the optometry students. All 54 students who volunteered attended the examination, and all completed the survey. The overall satisfaction level was very high. Students were very positive about the interactions with the standardized patients and the organization of the examination, and they felt the examination used realistic clinical scenarios. There was a mixed reaction to the use of simulators for skills assessment and to the length of time available in each station to perform those tasks. Students were uncertain about how to prepare for the examination.

The results of the student satisfaction survey have been accepted for publication in the Journal of Optometric Education as “Student Satisfaction with an Objective Structured Clinical Examination in Optometry.” Another paper, “Applicability of Entry to Practice Examinations for Optometry in Canada,” has been submitted for consideration by the Canadian Journal of Optometry.

