Workplace Based Assessment SLE-DOPS Pilot 2014




Executive Summary
Workplace Based Assessment Advisory Group
28 January 2015

Dr L Batty (Chair/NHS)
Dr M McKinnon (Deputy Chair/Industry)
Dr AK Skimore (NHS)
Dr K McKinnon (Armed Forces)
Dr K Targett (NHS Scotland)
Dr L Curran (Trainee/NHS)
Dr S Chavda (Trainee/Industry)
Mr B Szafranski (FOM)

Summary of the Workplace Based Assessment SLE-DOPS Pilot 2014

Background

The Faculty of Occupational Medicine Workplace-based Assessment Advisory Group (WBAAG) has revised the WBA Directly Observed Procedures (DOPs) form in line with recommendations from the GMC, changes to comparable tools made by other medical colleges and faculties, and international developments in this area. A comprehensive summary document was produced and presented to the Examiners Assessment Sub-Committee in February 2013. The outcome of the revision is a suite of new DOPs forms in a Supervised Learning Events (SLE) format. A proposal for piloting these new DOPs forms was presented to the Committee in May 2014, and a three-stage pilot of the six revised SLE DOPs forms was then undertaken in 2014 as agreed. This summary document has a comprehensive counterpart, which contains the detailed evidence and numeric data.

Revised SLE DOPs forms

Until 2014, the Faculty's Specialty Training Handbook and website had only one DOPs form, which was intended for workplace visits. Anecdotally, trainees had been modifying and adapting the tool to their individual requirements. The WBAAG re-designed the layout of the DOPs form and created six new DOPs forms. As part of the development of the forms, the WBAAG incorporated the latest trends in medical education, moving from a summative to a formative style and structure. In keeping with this, the new tools are now termed Supervised Learning Events (SLEs). In addition, the tools have been designed to map to specific curriculum competencies, to allow trainees to better demonstrate their progress in training. SLE DOPs forms for audiometry, spirometry, biological monitoring, workplace assessment, communication activity and a generic form have been developed and piloted. The key changes to the forms included:

- linkage to competencies
- re-design of the evaluation (marking) areas
- elimination of numeric scoring
- clarification and expansion of the feedback section
- addition of a reflection section
- addition of a rubric to give guidance to trainers and trainees on the criteria for scoring

At each stage of development, the WBAAG benchmarked the tools against those developed and later piloted by other Royal Colleges and the Academy of Medical Royal Colleges.

Pilot Methodology

The pilot population comprised all Faculty of Occupational Medicine (FOM) trainers and trainees in the UK registered with the Faculty in 2014. Using combined information secured from the National School of Occupational Medicine (previously the London and KSS Deanery) and the FOM, we determined the number of specialist trainees in occupational medicine in the UK to be 77. The pilot consisted of three stages.

Stage 1 - Testing (face and content) validity (January 2014)
This was performed by means of a comprehension trial at the Faculty's Examiners Training Day on 22 January 2014. Since the Faculty did not use an e-portfolio (or collaborative software through which trainees' and trainers' completed forms could be assessed), a comprehension trial was performed at a single setting, using a focus group of trainers and trainees. This trial assessed the validity of the forms: the relationship between what was actually being assessed and what the forms had been designed to measure.

Stage 2 - Testing reliability (April to December 2014)
Reliability is a measure of the stability and overall performance of the tools; it assesses the degree to which different raters agree in their decisions. In our study, we used video analysis to measure inter-rater reliability. Two short videos were developed: Video 1 of a trainee undertaking spirometry and interpreting the results with a trainer, and Video 2 of a trainee participating in a communication activity (a meeting with a manager). Trainers were asked to watch the videos and complete the corresponding SLE DOPs forms.

Stage 3 - Testing usability, utility and acceptability (April to November 2014)
The general usability, ease of use and practical acceptability of the forms were assessed by electronic survey. A request was sent to all trainers and trainees to complete a bespoke electronic survey delivered by Survey Monkey.

Recruitment to the study
Stage 1 of the pilot began at the Examiners Training Day on 22 January 2014, when trainers were invited to participate in the comprehension trial to assess face and content validity. Trainees also participated in a separate focus group session on the same day, having been invited by email beforehand. Stages 2 and 3 were widely advertised throughout 2014, including fortnightly advertisements in the FOM newsletter and the FOM trainee newsletter. Additionally, there was personal promotion by WBAAG members at the Regional Specialty Advisers (RSA) meeting on 20 May 2014, the Annual Scientific Meeting of the Society of Occupational Medicine in Nottingham in June 2014, the FOM Quality in OM conference in Birmingham in September 2014, the Assessment of Trainees in Occupational Medicine in Europe (ATOM) conference in Glasgow in August 2014, the September launch of the National School of Occupational Health (NSOH), the London Consortium of OH Professionals (LCOP) in November 2014 and the FOM HASAWA winter conference in London in December 2014. To boost the response rate, each WBAAG member also circulated information to their regional professional contacts; WBAAG members work in the NHS, independent, industry and defence sectors in England and Scotland. A request-to-participate email was sent to all occupational medicine administrators in all deaneries in the UK, and some deaneries then personally approached trainees and educational supervisors to participate in the pilot. The pilot was also actively promoted by the Head of the NSOH. To increase the response rate further, the data collection deadline was extended to early December 2014. The extension of the study was in keeping with GMC recommendations (from the meeting held in June 2014), and the response rates compare favourably with the Joint Royal Colleges of Physicians Training Board (JRCPTB) national pilot.

Results

Stage 1
No specific areas of concern or difficulty relating to comprehension of the revised tools were identified in either the trainee or the trainer comprehension trial (CT). The trainee CT identified further areas for development of the guidance notes and training handbook, relating to the expected use (numbers and activities) of SLE DOPs. The trainer CT identified other opportunities to enhance the utility of the tools and to potentially modify the areas of assessment and the rubric.

Stage 2
Overall, 25 educational and clinical supervisors participated in the video pilot of the SLE DOPs forms. The majority of assessors marked the individual categories as 'satisfactory', with greater disparity in the marks for the spirometry activity. Some assessors assessed the trainee against the end stage of training (in terms of expected competency), whereas others assessed against an early stage of training; this was likely to have contributed to the disparity in grading, particularly for spirometry, and was reflected in the comments. A 'below expected' outcome was given where the trainee was assumed to be at the end stage of training, rather than reflecting the performance actually observed in the video, which was consistent with early training. The disparity may also have been affected by assessors' assumptions about information not covered in the video (e.g. consent). The key factor in understanding any 'below expected' or 'above expected' scoring was the presence of the expanded comments and feedback boxes in the new SLE DOPs forms. This is reassuring, given the intended formative use of the new forms.
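
This summary does not state which agreement statistic, if any, was calculated from the completed forms; the rating categories mentioned above ('below expected', 'satisfactory', 'above expected') are taken from the results text. Purely as an illustrative sketch, agreement among several raters making such categorical judgements is often summarised with Fleiss' kappa, shown below; any future quantitative analysis might use this or a comparable measure.

% Illustrative only: the pilot summary does not specify the statistic used.
% N items (e.g. videos or assessment areas), n raters per item, k categories;
% n_{ij} = number of raters placing item i in category j,
% p_j = overall proportion of all ratings placed in category j.
\[
  P_i = \frac{1}{n(n-1)}\Bigl(\sum_{j=1}^{k} n_{ij}^{2} - n\Bigr), \qquad
  \bar{P} = \frac{1}{N}\sum_{i=1}^{N} P_i, \qquad
  \bar{P}_e = \sum_{j=1}^{k} p_j^{2}, \qquad
  \kappa = \frac{\bar{P} - \bar{P}_e}{1 - \bar{P}_e}
\]
% kappa = 1 indicates perfect agreement between raters;
% kappa <= 0 indicates agreement no better than chance.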

Stage 3

Trainers' responses
27 participants completed the survey. There was an overwhelmingly positive response (90-95%) to questions asking whether the re-designed tools represented an improvement over the currently available WBA DOPs tools. Participants used between four and five tools from the suite during the survey period. The communication activity SLE DOPs form was the most frequently used, followed by workplace visit, spirometry, the generic tool and audiometry, in that order. The biological monitoring form was not used during the survey period. On average, a form took 25-30 minutes to complete (range 20 to 35 minutes), with the longest times for the workplace visit and communication forms. Other positive responses included: easier linkage to the curriculum (95%), trainee initiation (90%), easier tool selection (90%), enhancement of trainees' learning and development (95%), perceived usefulness of the rubric (95%), feedback provision (89%), formative purpose (88%) and improvement over the previous tools (95%). Of note, only 42% of trainers reported always or often translating the SLE DOPs into an action plan, with a further 42% doing so sometimes. It was not possible from the data obtained to identify whether this reflected the tool being used or the training activity; further work would be required to clarify this issue.

Trainees' responses
35 participants completed the survey: 6 from ST3, 7 from ST4, 15 from ST5 and 6 from ST6. Trainees used between zero and five of the tools, with the communication activity, generic and workplace visit forms the most commonly used. Trainees reported generally positive responses (54-74%), although less markedly positive than the trainers, to whether the re-designed tools represented an improvement on the currently available WBA DOPs tools. Other responses included: easier linkage to the curriculum (73%), trainee initiation (93%), easier tool selection (94%), enhancement of trainees' learning and development (54% of 26 respondents), perceived usefulness of the rubric (71%), feedback provision (73%), formative purpose (83%), improvement over the previous tools (68%) and encouragement to reflect on performance (83%). Whilst a high proportion of trainees recognised the benefits of the new tools (74% believed that they gained insight into their performance, 83% felt encouraged to reflect on their performance and 84% believed the feedback was helpful), only 54% of respondents believed their training was enhanced. Further work would be required to understand this apparent disparity and, again, whether it reflects the tools themselves or the training activity environment. Please see the comprehensive version of the pilot analysis for detailed results and discussion.

Recommendations

The WBAAG recommends that:

- the FOM Assessment Subcommittee evaluates the findings of the pilot and considers whether they are sufficient and suitable to support submission of curriculum changes to the GMC;
- the WBAAG converts the remainder of the WBA tools (mini-CEX, CBD and SAIL) into the new SLE format in 2015, as agreed with the GMC (see the minutes of the meeting in June 2014);
- once the tools are converted, the WBAAG undertakes a further three-stage pilot using the methodology described above. The WBAAG has already considered improvement strategies for the subsequent pilot; an economy of effort can be achieved by undertaking all three stages of the pilot at an SLE training day, supported by the Faculty;
- the WBAAG collaborates closely with the NSOH in developing a bank of videos with example scenarios and instructions on their use;
- the WBAAG shares the pilot results via national publication or any other suitable means, as advised by the FOM.

References

Workplace Based Assessment: a guide for implementation (General Medical Council; April 2010)
Impact of workplace based assessment on doctors' education and performance: a systematic review (British Medical Journal; September 2010)
Learning and assessment in the clinical environment: the way forward (General Medical Council; November 2011)
Workplace based assessments are no more (BMJ Careers; September 2012)
Improving workplace based assessments: a major pilot (BMJ Careers; July 2012)
Joint Royal Colleges of Physicians Training Board - Results of the Workplace Based Assessment Pilot 2012-14 (JRCPTB; May 2014)
Crossley JGM et al. Sheffield Assessment Instrument for Letters (SAIL): performance assessment using outpatient letters. Medical Education 2001; 35: 1115-1124.
Fox AT et al. Improving the quality of outpatient clinic letters using the Sheffield Assessment Instrument for Letters (SAIL). Medical Education 2004; 38: 857-858.