Developing an Objective Structured Clinical Examination to Assess Clinical Competence



Naumann F.L., Faculty of Medicine, School of Medical Sciences, Exercise Physiology, The University of New South Wales; Faculty of Medicine, Prince of Wales Clinical School, The University of New South Wales
Moore K.M., Learning and Teaching Unit, The University of New South Wales; School of Health and Human Sciences, Southern Cross University

Keywords: Exercise Physiology, Objective Structured Clinical Examination, work-simulated assessment of clinical skills.

Abstract
The distinguishing features of skill-based learning situations are that they are inherently variable, unpredictable, sometimes brief, high-risk learning events which are not replicable, and this presents challenges for the quality assurance requirements of the assessment process (Cooper, Orrell & Bowden, 2003; Hodges, 2011; Yorke, 2011). One potential method for assessing clinical competence in the university setting is the Objective Structured Clinical Examination (OSCE). This paper describes an action research project: a three-phase pilot of an OSCE for Exercise Physiology undergraduate students. Fifty-six students participated. As a means of assessing validity, after each of the three pilots the following elements of the design and implementation were reviewed: number of stations, time allocated to each station, content of each station, descriptors of tasks, preparation and training of simulated patients and examiners, and assessment criteria. Based on a comparative analysis of all station and domain items, the Cronbach alpha for the examination rose from 0.52 in the first OSCE to 0.86 in the third, indicating continued improvement. This suggests the results would be reliable for making a summative judgement of student performance of Exercise Physiology clinical skills.
Introduction
In order for Exercise Physiology students to be accredited to practise, they undertake university studies and are required to complete practicum experience in a range of settings and environments. It is generally expected that during the practicum students will be assessed on their application of theory to practice, and that these activities strengthen the work readiness of graduates. However, assessing students during a practicum can be highly variable in terms of the opportunities for assessment and the skills of the examiners. This means some students face more stringent examinations than others, which raises questions about the equity of in-situ exams. Universities are rethinking curriculum design and assessment practices related to work-integrated learning (WIL) (Cooper et al., 2010). Researching the effectiveness of skill-based, work-simulated assessment practices is therefore both timely and pertinent to the current climate. One potential method for assessing clinical competence in the work-simulated setting is the Objective Structured Clinical Examination (OSCE). The OSCE was first described by Harden et al. (1975) as a means to assess the clinical skills of final-year medical students. Since its development, the OSCE has gained acceptance as a benchmark for assessing clinical competence (Bartfay et al., 2004) and has been adopted by several health professions including radiography (Marshall & Harris, 2000), nursing (McKnight et al., 1987; Alinier et al., 2006),

physiotherapy (Nayer, 1999) and dentistry (Brown et al., 1999). As Exercise Physiology is a relatively new allied health profession, it is critical to demonstrate that exercise physiologists have proficient clinical skills, and the OSCE was regarded by the team as a viable option. The OSCE aims to enhance the validity of clinical assessment by simulating realistic clinical scenarios that reflect real-life professional tasks. Typically, an OSCE consists of multiple stations at which students are required to perform a range of tasks to demonstrate competency in relevant skills. During an OSCE, students rotate through the various stations, spending a predetermined time at each. At each station the students are instructed to complete specified tasks in the time allocated. Stations can involve actual patients or surrogate volunteers who are normally trained to present a standardised portrayal of the clinical problem. Each station is scored separately using either detailed checklists or global rating scales. Establishing reliable assessment is critical to the success of the Exercise Physiology program. The underlying premise is that standardised assessment procedures ensure objectivity and maximise reliability (Bartfay et al., 2004; Major, 2005). The aim of this study was to assess both the validity and reliability of an OSCE for assessing the clinical skills of final-year Exercise Physiology students. That is, we explored whether each station in the Exercise Physiology OSCE tested the construct it was intended to assess.

Methodology
This action research project comprised three pilots of an OSCE in the Exercise Physiology program during 2011. Fifty-six final-year students participated. In addition to analysis of data from test results, feedback from examiners, clinical education staff and students provided insight into how the OSCE was perceived, its learning value and what improvements might be made.
The OSCE
The content of the work-simulated examinations was based on the diversity of skills expected of an exercise physiologist. Careful design of stations was required to ensure systematic and sufficient sampling of the skill-based competencies, a key element of validity. A range of clinical presentations typically expected in daily practice was selected. These included the screening of healthy persons as well as assessment of patients with metabolic, cardiovascular, neuromuscular and musculoskeletal disorders. Each station assessed the student's competency in the following domains: communication skills; clinical and procedural skills; and technical skills. The assessment criteria and grading system used at each station evolved as the series of pilots progressed. Initially, checklists for assessing station-specific tasks were used. These were replaced by a global rating scheme employing criteria aligned to the three competency domains mentioned above. A numerical scale was used for statistical analyses of test results. Each station was therefore graded as follows: F for failed performance (score 0); P- for borderline performance (score 1); P for good performance (score 2); and P+ for excellent performance (score 3). To strengthen validity, after each pilot, elements of the design and implementation of the OSCE were scrutinised. The elements reviewed were the: number of stations; time allocated to each station; content of each station; descriptors of tasks; preparation and training of simulated patients; preparation and training of examiners; and assessment criteria.
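The four-point grading scheme above can be tallied mechanically. The sketch below is illustrative only: the function name and the normalisation of station scores to a 0-1 range are our assumptions (chosen because the station means reported in Table 2 fall between 0 and 1), not part of the study's published method.

```python
# Illustrative tally of the four-point grading scheme described above.
# The normalisation to a 0-1 station score is an assumption for illustration.
GRADE_SCORES = {"F": 0, "P-": 1, "P": 2, "P+": 3}

def station_score(domain_grades):
    """Convert the grades awarded across the competency domains at one
    station (communication; clinical and procedural; technical) into a
    single score normalised to the 0-1 range."""
    values = [GRADE_SCORES[g] for g in domain_grades]
    return sum(values) / (3 * len(values))  # 3 = maximum score per domain

# A student graded P, P+ and P- across the three domains:
print(station_score(["P", "P+", "P-"]))
```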

Statistical analysis
The reliability of the examinations was analysed using methods based on classical test theory, together with generalisability theory, which estimates the examination's ability to consistently rank students. For a station to be considered reliable, it should assess various tasks across the three competency domains and distinguish between the best and worst performing students. Internal consistency was measured using Cronbach's alpha. The results for all marking criteria, as well as the aggregate results from stations and competency domains, were analysed. We were looking for a value >0.7 for both the Cronbach alpha and the generalisability coefficient analyses. All statistical analyses were performed using SPSS v20 with significance set at p<0.05. The discrimination index was ascertained using Pearson's correlation. In addition, in order to determine the effect of varying the number of stations or competency domains, Decision (D) studies were performed. These showed the minimum number of stations and domains needed to produce a reliable exam result. Content validity was assessed using feedback from clinical education experts, examiners and academic staff. Assessing content validity ensured representation of the diversity of clinical conditions typically encountered in clinical practice and of the clinical competencies required to assess and manage patients.

Results
Based on analysis of all station and domain items, the Cronbach alpha for the examination increased from 0.52 in the first OSCE to 0.86 in the third OSCE, indicating improvement across time and suggesting that the OSCE is reliable for making a summative judgement of student performance (Table 1).
Table 1: Cronbach alpha across all items of the 3 pilot OSCEs

            Cronbach's Alpha   N of Items
  OSCE 1    0.520              22
  OSCE 2    0.740              20
  OSCE 3    0.863              20

Table 2 shows the results of the analysis of the discrimination index (measured by Pearson correlation) for each station. A significant discrimination index indicates that the ranking of students at that station correlates with the overall OSCE performance ranking. In OSCE 1 the discrimination index was significant for only two stations; in OSCE 2 it was significant for four stations; and in OSCE 3 it was significant for all stations, indicating that the ranking of students at each station correlated strongly with the overall ranking. The aim of a good exam set-up is to have good students perform well at all stations; an anomaly indicates that the station could be suspect.
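The two statistics behind these results (Cronbach's alpha for internal consistency, and an item-total Pearson correlation as the discrimination index) can be reproduced outside SPSS. This is a minimal sketch under stated assumptions: the score matrix is invented for illustration, and significance testing (done in SPSS in the study) is omitted.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_students, n_items) score matrix,
    where each item is one marking criterion."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of students' totals
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def discrimination_index(station_scores, total_scores):
    """Pearson correlation between one station's scores and the students'
    overall OSCE totals (the R reported in Table 2)."""
    return np.corrcoef(station_scores, total_scores)[0, 1]

# Invented scores for five students on four marking criteria:
scores = np.array([[0, 1, 1, 0],
                   [1, 1, 2, 1],
                   [2, 2, 2, 2],
                   [2, 3, 3, 2],
                   [3, 3, 3, 3]])
print(round(cronbach_alpha(scores), 3))
print(round(discrimination_index(scores[:, 0], scores.sum(axis=1)), 3))
```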

Table 2: The mean, SE and discrimination index (R) for all OSCE stations

  OSCE 1
  Station              Mean   SE     R
  CV Oncology          0.65   0.19   0.81*
  VO2 interpretation   0.50   0.60   0.62
  ECG interpretation   0.51   0.64   0.38
  ROM                  0.54   0.53   0.60
  Lung function        0.60   0      NA
  Falls                0.57   0.16   0.15
  Health & Fitness     0.60   0      NA
  Mus-Sk               0.48   0.55   0.86*

  OSCE 2
  Station              Mean   SE     R
  CV Oncology          0.66   0.04   0.96*
  Diabetes             0.71   0.06   0.12
  ECG interpretation   0.72   0.05   0.51
  Rest                 -      -      -
  Lung function        0.51   0.06   0.69*
  Falls                0.67   0.04   0.54
  Health & Fitness     0.62   0.03   0.76*
  Mus-Sk               0.70   0.06   0.82*

  OSCE 3
  Station              Mean   SE     R
  CV Oncology          0.66   1.93   0.76*
  Diabetes             0.66   1.44   0.58*
  ECG interpretation   0.66   1.87   0.69*
  Rest                 -      -      -
  Strength             0.68   1.64   0.65*
  Falls                0.57   1.97   0.64*
  Health & Fitness     0.64   1.41   0.60*
  Mus-Sk               0.73   1.49   0.82*

Key: CV Oncology, cardiovascular oncology; VO2 interpretation, maximum oxygen consumption interpretation; ECG interpretation, electrocardiogram interpretation; ROM, range of motion; Mus-Sk, musculoskeletal assessment. * Pearson correlation with total scores, significant at the 0.05 level (2-tailed).

The effect of varying the number of domains and stations was explored using Decision (D) studies. Figure 1 shows that increasing the number of domains beyond three had little effect: with three domains, a generalisability coefficient >0.7 was achieved with only seven stations. Similarly, increasing the number of stations beyond seven had little effect on improving generalisability.

Figure 1: Generalisability coefficient by domain and station number (curves for 1, 3 and 5 domains plotted against number of stations, 1 to 13).
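The D-study projection behind Figure 1 can be sketched with a simple person x station variance-components model. The variance components below are invented for illustration (the study's actual estimates are not reported here); they are chosen so that the projected coefficient first reaches the 0.7 benchmark at seven stations, mirroring the plateau shape Figure 1 reports.

```python
def g_coefficient(var_person, var_residual, n_stations):
    """Projected generalisability coefficient for a person x station design
    when the exam is lengthened or shortened to n_stations stations:
    Ep2 = Vp / (Vp + Vres / n)."""
    return var_person / (var_person + var_residual / n_stations)

# Invented variance components: true person variance 1.0, residual 3.0.
var_p, var_res = 1.0, 3.0
for n in (1, 3, 5, 7, 9, 11, 13):
    print(n, round(g_coefficient(var_p, var_res, n), 2))
```

With these components the coefficient equals exactly 0.7 at seven stations and gains little beyond that, which is the D-study rationale for fixing the exam at seven stations.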

As far as content validity is concerned, staff and Clinical Educator feedback enabled refinement of station tasks and assessment criteria throughout the project. Analysis of student feedback indicated that overall the OSCE was perceived as a good way to assess clinical competency. The student cohort remarked that it is essential to fully brief students on the format of the exam and the assessment criteria. Staff and Clinical Educator feedback fully supported the use of the OSCE to assess clinical competence.

Discussion and Conclusion
Statistical analysis allowed improvements to be made to the reliability of the OSCE, exceeding the accepted Cronbach alpha benchmark of 0.7. This is consistent with the reported reliabilities for OSCEs in medicine and other allied health fields, and suggests that the results from this examination would be reliable for making a summative judgement of Exercise Physiology students' performance. Improvements in exam reliability were achieved through better selection and training of volunteer patients and surrogates, better preparation of the students, improved assessment criteria for the examiners to work from, and refinement of the task requirements to be performed within the allocated time. It is critical to the successful implementation of an OSCE to determine the optimal number of stations and domains required to provide a reliable assessment of student performance, and Decision (D) studies aid this process. Based on generalisability theory, D studies can estimate the effect of varying conditions on the generalisability of an examination. Data from OSCE 2 and OSCE 3 show that the combination of seven stations (cardiovascular oncology, diabetes, ECG interpretation, strength, falls, health and fitness, and musculoskeletal assessment), each with three domains (communication, clinical and procedural skills, and technical skills), achieved satisfactory generalisability.
Increasing the number of stations or domains would not significantly improve the reliability of the examination. As far as administration is concerned, an OSCE is both labour and time intensive. Nevertheless, this study showed that, through an iterative process involving analyses of staff and student feedback along with statistical data from three pilot examinations, an OSCE can evolve into a reliable and valid means of assessing, via simulation, the clinical skills of Exercise Physiology students. An additional benefit was that the process of developing the OSCE facilitated a review of the curriculum: the project enabled a fresh look at all aspects of clinical competency requirements throughout the curriculum. In some institutions, an OSCE is conducted before students undertake WIL experiences; in this instance, however, we used the OSCE after the WIL experience. Given that assessing students' skills in situ, during real-time clinical consultations, can be highly variable, an OSCE facilitates greater efficacy and equity in the assessment of a range of clinical skills. An OSCE is a valuable tool in the overarching assessment strategy of the clinical component of the Exercise Physiology curriculum.

Limitations of the study
This study focused on a small student cohort at one university, and the cohort was inexperienced in taking this sort of exam. In future, an OSCE will be conducted in second year to prepare students for the fourth-year OSCE.

Directions for future research
This project has been followed by a second study in which we compared examiner reliability for assessment of the Exercise Physiology objective structured clinical examinations using observation of filmed student performance. It is anticipated that this project will add further academic rigour to the assessment of students' skills before and after engaging in WIL.

References
Alinier, G., Hunt, B., Gordon, R., & Harwood, C. (2006). Issues and innovations in nursing education: Effectiveness of intermediate-fidelity simulation training technology in undergraduate education. Journal of Advanced Nursing, 54: 539-569.
Bartfay, W., Rombough, R., Howse, E., & Leblanc, R. (2004). The OSCE approach in nursing education. Canadian Nurse, 100: 18-23.
Brown, G., Minogue, M., & Martin, M. (1999). The validity and reliability of an OSCE in dentistry. European Journal of Dental Education, 3: 117-125.
Cooper, L., Orrell, J., & Bowden, M. (2003). Workplace-learning management manual: A guide for establishing and managing university work-integrated learning courses, practica, field education and clinical education. Australia: Flinders University Staff Development and Training Unit.
Cooper, L., Orrell, J., & Bowden, M. (2010). Work integrated learning: A guide to effective practice. USA: Routledge.
Harden, R.M., Stevenson, M., Downie, W.W., & Wilson, G.M. (1975). Assessment of clinical competency using objective structured examination. British Medical Journal, 5: 447-451.
Hodges, D. (2011). The assessment of student learning in cooperative and work-integrated education. In International handbook for cooperative and work-integrated education (2nd ed.). World Association for Cooperative Education Inc.
Major, D. (2005). OSCEs: seven years on the bandwagon. The progress of an objective structured clinical evaluation programme. Nurse Education Today, 25: 442-454.
Marshall, G., & Harris, P. (2000). A study of the role of an objective structured clinical examination (OSCE) in assessing clinical competence in third year student radiographers. Radiography, 6: 117-122.
McKnight, J., Rideout, E., Brown, B., Ciliska, D., Patton, D., Rankin, J., & Woodward, C. (1987). The objective structured clinical examination: An alternative approach to assessing student performance. Journal of Nursing Education, 26: 39-41.
Nayer, M. (1999).
An overview of the objective structured clinical examination. Physiotherapy Canada, 45(3): 171-178.
Yorke, M. (2011). Work-engaged learning: Towards a paradigm shift in assessment. Quality in Higher Education, 17(1): 117-130.

Copyright 2012 Naumann F.L. & Moore K.M. The authors assign to the Australian Collaborative Education Network (ACEN Inc.), an educational non-profit institution, a non-exclusive licence to use this document for personal use and in courses of instruction, provided that the article is used in full and this copyright statement is reproduced. The authors also grant a non-exclusive licence to the Australian Collaborative Education Network to publish this document on the ACEN website and in other formats for the Proceedings of the ACEN National Conference, Melbourne/Geelong 2012. Any other use is prohibited without the express permission of the authors.