
Themes from the Quality Assurance Review Workshops, June 2012

Context and aims of the workshops

1. The General Medical Council (GMC) has legal responsibility for setting standards for undergraduate and postgraduate medical education and training. We are also responsible for quality assuring the way those standards are met by the organisations which manage and deliver education and training.

2. In 2012 we began a review of our quality assurance (QA) arrangements. We want to make sure that those arrangements are fit for purpose in ensuring that the management and delivery of education and training equips doctors leaving training with the knowledge, skills and attributes needed for their specialty, and with professionalism consistent with the application of Good Medical Practice.

3. As part of the review we held a series of stakeholder workshops in Edinburgh, London and Manchester on 18, 20 and 26 June 2012. The aim was to gather ideas about how our QA arrangements should operate in the future. This included identifying what currently works well and what might need to change.

4. Participants in the workshops were asked to discuss a number of themes which are being considered during the review. The themes were:

- The use of evidence in the QA process
- The role of visits in the QA process
- The role of the medical Royal Colleges in the QA process
- Measuring quality outcomes
- Reporting QA outcomes

This report

5. This report sets out the comments made and ideas that emerged during the three workshops. The workshops were held under the Chatham House Rule and therefore the comments and ideas recorded here are not attributable to particular individuals or organisations.

6. The comments and ideas do not necessarily reflect current GMC views. They do, however, constitute invaluable feedback from individuals and organisations with knowledge and experience of our QA arrangements. Their perceptions will be used to inform the ongoing work of the review.

Theme one: Use of evidence in the QA process

The QA process relies on data drawn from a range of sources. Workshop participants were asked to consider the sort of data needed to enable the GMC to assure itself about the quality of training, and how to improve the quality of the data used without increasing the burden on those collecting it. The feedback from participants is summarised below.

Using evidence across the continuum of medical education and training

- A common core of evidence should be collected each year for all stages of the continuum: undergraduate, postgraduate and beyond.
- Evidence of the outcomes of training should be tracked over the long term and fed back to previous stages of training (especially from Foundation training back to medical schools). Medical schools want data on the progression of their students.
- A profile of Local Education Providers (LEPs) could be developed based on evidence from deaneries and Royal Colleges about such elements as the level of support for trainees and the clinical opportunities available within an LEP. It was also suggested that the GMC could consider shifting the focus of QA away from deaneries and more towards LEPs.

Outcomes of medical education and training

- The GMC should focus on collecting outcome data, though it was acknowledged that it is difficult to decide what the appropriate outcomes are, and the data may be difficult to collect.
- It is difficult to trace individual trainees who have not achieved a Certificate of Completion of Training (CCT) to determine why they are failing to complete training.

Data collection

- There are different approaches to collecting data for undergraduate and postgraduate education and training. This makes sharing information and tracking outcomes difficult.

- It would be useful to have a common repository or linked system where data from various institutions is collected, stored and made accessible through a searchable database. This would encourage data sharing, enable institutions to track the performance of their trainees over time, and reduce the burden on organisations. It was said that the GMC is in a position to develop an overarching system for collecting data and a common data set.
- The GMC could define standards for what information is shared and how it is shared. A common methodology for gathering evidence and a common language between deaneries and colleges would be useful. The GMC should consider developing a central data specification to avoid duplication between organisations.
- The GMC should provide a clear statement of the evidence that needs to be collected in undergraduate medical education to support the QA process.
- The GMC should be more prescriptive about defining what data is collected, by whom and to what standards. It was noted, however, that this would be a challenge with increasingly different health systems operating across the UK.
- The GMC asks for too much data to be collected at once. It is not clear how the GMC uses the data, nor is it clear that it is fed back to stakeholders.
- Data collection should focus on certain things each year because programmes tend not to change much from year to year. There was support for a thematic approach to collecting data. This would mean setting up a comprehensive, rolling programme so that people know in advance what will be required by the GMC and can put arrangements in place to provide it. The rolling programme could run over three years so as to manage the burden of collecting data. This would help to overcome the often anecdotal nature of much of the evidence currently provided by the colleges for the Annual Specialty Reports.
- The GMC should collect longitudinal data as this provides more robust evidence than that from one-off incidents. The GMC needs to make sure that it doesn't overreact to evidence from one-off incidents.
- The GMC could explore the use of e-portfolios to gather and track information over time.
- The GMC should collect evidence that helps students and trainees make informed decisions about their future education and training.
- The timing of data collection is important so that timely action can be taken.
- The GMC could explore how the Annual Review of Competence Progression (ARCP) process could be better used for quality assurance, although ARCP data needs to be more robust.
- The GMC should coordinate with the Care Quality Commission (CQC) and its national surveys, and explore opportunities to share data with regulators of other healthcare professions, to reduce the burden on those collecting data.
- The GMC could use information from the external examiners who visit medical schools.

- Local Education and Training Boards (LETBs) should use the same information flow so as to ensure a single approach to data collection.
- The pro forma for the Annual Deanery Reports (ADRs) needs to be revised. A deanery self-assessment could be required before the ADR is completed.
- Three streams of data collection were identified: programmes, on which the National Training Survey (NTS) collects data; placements (this data is missing, particularly the ability to identify how posts in specialties perform over a period of time); and curricula (data for which is picked up from the e-portfolio).

Data quality

- There should be a critical appraisal of the quality of the data used. Less, but more relevant, data should be collected.
- College exam data is an example of useful evidence as it directly assesses the knowledge of trainees, and regional variation may indicate local issues.
- It is difficult to measure the value added by medical education and training.
- The quality of data from the Annual Specialty Reports (ASRs) is a concern.

Use of surveys in quality assurance

- The NTS is about perceptions, not hard evidence. Some questioned how useful the information was (students and trainees may be biased towards protecting their medical school or deanery).
- A way is needed to address small specialties in the NTS (for example by gathering data over a longer term to enable longitudinal reporting).
- There is concern about the proliferation of local surveys and survey fatigue.
- There is positive use of surveys in Scotland, with a national system leading to engaged boards and greater impact.
- The GMC needs to demonstrate how it has acted upon feedback to improve the value of the NTS to students and trainees.
- The GMC should consider an undergraduate student survey to provide information on the quality of education, and also consider surveying the progression of students. The GMC should explore coordination with the National Student Survey (NSS).
- Narrative responses in surveys give context and added value to data.

The role of patient feedback

- Views on the role of patient feedback varied from the negative (patient feedback does not reflect accurately on the quality of a trainee) to the positive (it might inform about the culture of the placement).

Annual Specialty Reports (ASRs)

- The GMC should define more clearly what it wants reported from specialties (colleges don't have the authority to require its collection).
- There is concern about the quality of data used for the ASR (it is difficult to identify exactly where information comes from; it is subjective or anecdotal; it is repetitive; it is difficult to interpret). The utility of the qualitative data collected was questioned.

Annual Deanery Reports (ADRs)

- The formats vary between deaneries and standardisation would help.
- Annual reporting by deaneries is about right.
- It should be possible to triangulate the ASR and the ADR.

Annual Medical School Returns

- The quality of data from medical schools is variable.

Suggestions for data that should be collected for the QA process

- Time and funding for training.
- Good practice.
- Data to identify the quality of an educational environment.
- Soft evidence, such as respect for dignity.
- Indicators of excellence, for example: the proportion of Foundation Year 2 doctors gaining their first-choice specialty placement; the number of students who pass MRCP Parts 1 and 2 at the first attempt; publications within 10 years of graduation.
- Data on the quality of training in a particular region.
- Examination data.

Educational environment

- There is support for collecting more detail about the educational environment. A good educational environment will provide support for students and trainees, feedback and mentoring. However, it is difficult to define the training environment: is it a single training programme within a region, an institution, a team, or a combination of all of these?
- The educational environment is not only dependent on trainees and trainers, but also relies on allied health professionals and teams within an institution. Assessing the quality of an educational environment may therefore require joint QA visits with other organisations and regulators.
- The education and training culture in an institution is difficult to measure but could be informed by evidence such as how individuals keep themselves up to date on educational issues (for example, programme director meetings with staff to discuss educational issues) and whether trainees' portfolios are maintained on an ongoing basis rather than just at the end of a placement.

- The RCGP is a good example of assessing educational supervisors' reports and supporting the use of e-portfolios.

Theme two: The role of visits in the QA process

Part of the quality assurance process involves visits to the organisations responsible for the management and delivery of medical education and training. The workshops looked at the strengths and weaknesses of the GMC's approach to QA visiting and how it might be improved. The feedback from participants is summarised below.

Purpose and value of visits

- Visits bring about positive outcomes and are seen as valuable and as affirming what is good. They are also important as a means of engaging with senior leadership and managers about training. The GMC's status creates leverage and helps those responsible for education and training within institutions to secure change. The visits process has clout. However, visits are only part of a process; they are not an end in themselves.
- The recent joint visits covering both the undergraduate and postgraduate stages of training in a region were seen as positive. However, it was also noted that undergraduate education and postgraduate training are different. What is common across the two is the quality of the learning environment. Visits could therefore explore how Trusts provide for both undergraduate and postgraduate training.
- Part of the value of QA visiting was seen as the opportunity for face-to-face meetings, which are an important way of gathering softer evidence that can be difficult to measure.
- It is not possible to visit everyone; there needs to be a trigger.
- Visits are sometimes too remote from the delivery of training.
- Visits are currently good at looking at generic issues, but not at triangulating specialty-specific issues. They need to be more targeted to gain the right evidence.
- The GMC is quality assuring the process but not the outcomes of training, so weaknesses and failures can be hidden. Visits need to focus more on outcomes.
- Visits are an objective way of substantiating the information already collected and demonstrating what actually occurs in a department.
- If visits focus too much on identified risks, this may be at the expense of identifying good practice. Visits need to balance risk with getting a broad overview of the quality of education. Risk-based visits should take place alongside regular visits.
- Visit reports should refer to areas of good as well as poor practice. It was felt by some that this could be improved.
- Medical schools want feedback on how their graduates do in the Foundation Programme.
- There was support for targeted, interventional visits in urgent situations.

Strengths and weaknesses of the GMC's current approach

- A common standard of evidence should be used across the continuum.
- We shouldn't try to QA everything. We should concentrate on a common core of things and then a rolling programme of themes.
- Evidence must be triangulated.
- Interaction with students and trainees is important.
- It was suggested that when good practice is observed by an external specialist on a visit, this can have a positive impact on training programmes in the visitor's own region if the good practice is also implemented there.
- The loss of college visits was seen by some as having had a negative impact on patient care. It was said by some that the GMC should make greater use of college intelligence to help target visits. There was some support for targeted visits from colleges to be reintroduced.
- The visits process is expensive. Visiting new schools is particularly intensive.
- In future, visits will need to focus more on providers to obtain assurance that LETBs are delivering education and training of the required quality.
- The GMC needs to be clear what it is quality assuring: deaneries, LEPs or medical schools.
- It is not clear that going into Trusts adds value, other than meeting the senior leadership and trainees. There is some concern that visiting LEPs takes over the quality management role of the deanery.
- The value of walking around the physical environment is often underestimated.
- GMC staff could participate actively in the visiting process.
- Feedback given to the organisations visited was perceived to be very good.

Value of other approaches to visiting

- Views about unannounced visits were mixed, with some expressing concern that they would be unhelpful and would affect current good relations, while others favoured them because they might demonstrate what it is really like on the ground rather than being staged.
- Scheduling visits at evenings and weekends should be explored, as well as short, or short-notice, visits focused on specific areas.
- Visits could include 360-degree contributions from other sources, such as managers and administrators.
- Some supported the idea of the GMC participating in local deanery visits.
- Involving other regulators in visits would help provide a broader picture of what takes place.
- Visit teams should include specialists from the colleges who will know exactly what the curriculum needs are for their specialty.
- The composition and training of visit teams is important to ensure standards and consistency of approach, and to prevent visitors pursuing their own agendas. The team leadership is very important.

Language used in visit reports

- There were mixed views about the nomenclature of visiting and whether visits should be referred to as visits, inspections or reviews. There was agreement that it was the outcome of the process that was important.

Theme three: Role of the Medical Royal Colleges and Faculties in the QA process

College involvement in the current QA process occurs in a number of ways. The workshops considered what the role of the colleges and faculties should be in the future. The feedback from participants is summarised below.

Making best use of college specialty expertise

- The GMC and the colleges need to work more closely together, with colleges more clearly integrated within the regulatory framework. Colleges want a more formal role, but it is not immediately clear where colleges sit within the GMC's Quality Improvement Framework (QIF). Closer relations would be possible if everyone's role was clearer.
- Colleges can provide external expertise to identify both excellence and problems.
- The GMC could consider establishing standards for external advisors. There is a view that the quality and training of external advisors is variable. The value of external advisors is that they are experienced in curricula and the learning environment; they can focus on specialty content against their curricula; and they are useful to involve when there is an identified problem.
- The GMC and colleges could work together on developing common data and outcomes for the profession.

Impediments to obtaining effective specialty input in the current QA process

- There is tension between the role of deaneries and the role of colleges. Some deaneries are more receptive to college involvement than others, but this is an essential relationship in terms of providing externality and independent input. It is important for the triangulation of evidence.
- The Academy of Medical Royal Colleges (AoMRC) could play a greater role in coordinating the individual colleges and developing a more consistent approach to engagement with deaneries and LEPs, and in developing the ASRs.
- Colleges need to work through the deaneries, with deaneries sitting between LEPs and colleges. There is sometimes a lack of feedback and linkage between the colleges and deaneries, although some colleges consider the links between deaneries and heads of school are good.
- There was some concern that the loss of college visiting has resulted in a loss of focus on whether the training is delivering the requirements of the curricula.
- Funding external college advisors and other QA activity, such as employers releasing college personnel to participate in QA, is an issue.
- The ARCP process is not very robust.

College contribution to improvements to the QA process

- The ASRs need more work but have improved and are now more evidence based.

- The GMC should be clearer about what it wants from the ASR. There is insufficient feedback to the colleges and deaneries in the light of information provided for the ASRs. It is therefore not obvious to the colleges how the information they have provided is being used to drive improvement.
- GMC visits should include a college resource clearly appointed by, and accountable to, the GMC. They would be there as GMC representatives, not as college representatives. They would need to understand the specialty, although they do not necessarily need to come from the same specialty.
- The colleges need to be able to act independently, particularly where they have highlighted problems with training and the deanery is unwilling to act.
- The QA system would benefit from using more college exam data.

Theme four: Measuring quality outcomes

The workshops were asked to discuss what the GMC should be trying to measure in order to assure itself that doctors leave the different stages of education and training equipped with the necessary knowledge, skills and behaviours. The feedback from participants is summarised below.

Measuring outcomes and progression

- The GMC should align the outcomes for all stages of training. It was suggested that there should be a common portfolio that students and trainees carry with them from the time they enter medical school and throughout their subsequent training. This would allow progression to be monitored across all stages of training.
- The GMC could share information on the progression of graduates. Some participants thought this would be unfair on trainees.
- Measuring success can be very difficult. It is difficult to distinguish between what is acceptable and what is excellent. There was concern that the system was providing assurance that trainees are competent but not assessing excellence.
- Doubt was expressed about whether the system always provides assurance that a trainee has performed to the standards necessary to progress to the next stage of training. It was said that there is a reluctance to not sign off a trainee who is underperforming. This was put down to fear of legal challenge. Failure also reflects badly on the education provider.
- The GMC should give feedback to deaneries and medical schools about the outcomes of referrals into its fitness to practise procedures. This could be broken down by region or specialty and provide an indicator of quality.

Patient role in QA

- Patient feedback is not necessarily a useful or accurate indicator of the quality of training, but it is important. There is a question about how soft information should be used.
- The GMC should consider what the public is concerned about in their encounters with doctors. This should then be used to inform future education and training.

Training environment

- There was some support for the idea of approving a training environment. There should be recognition for training environments and Trusts which commit to education and training. GMC recognition of these environments could promote excellence in the service. However, some trainees are excellent despite the training environment rather than because of it.
- The GMC could take into account evidence from other regulators and professions about the training environment.

- One indicator of an organisation dedicated to high quality training could be the amount of time allocated to training.
- There also needs to be Board-level responsibility for education and training. Someone on the Board should be accountable for education and training, and there should be evidence that education issues are discussed at Board level.
- The regulator should be interested in both processes and the outcomes of processes.

Possible measures of quality

- Patient safety
- Educational governance and scholarship in an institution
- Learner experience
- Outcomes defined differently for different participants in the system: individual, institution, LEP and deanery
- The outcomes of the trainee and trainer surveys
- The quality of supervision
- Time for training in consultant job plans
- Study leave
- Relevant teaching faculty engagement
- Faculty development (senior doctor experience, institutional commitment to change and self-improvement, demonstration of action at Board level)
- The number of attempts at an examination
- QA of portfolios
- Demonstrating improvement

- The qualitative aspects of training are important. QA should not just measure what is easily quantifiable. Attitudes and behaviours are harder to measure. The profession is good at measuring knowledge and skills, but not behaviours. Measures of professionalism are needed. The core elements or domains of professionalism should be incorporated into clinical supervisors' reports, so that the domains can be more reliably reported on. The GMC's work to develop generic outcomes for training might help in this respect.
- There was support for the GMC's generic outcomes work. There is value in developing a shared vision for, and alignment of, outcomes across all stages of education and training. The education and training continuum would be supported by consistent outcomes across the different stages.
- Quality indicators should be different for each specialty.

Common assessments of competence

- There should be consistency between measurement in the undergraduate and postgraduate arenas.

- Examinations are the best way to quality assure knowledge. A standardised, national medical examination could be introduced, but this may have the effect of encouraging trainees to shape their learning in order to pass the exam rather than to meet the needs of the curriculum.
- There could be common assessments of competence at key points in training, including common assessments at the end of undergraduate education and at entry into specialty training.
- A national exam would make QA accessible to the public. Medical schools could still be free to develop along their own paths.
- More should be done to address the differences between medical schools. Medical schools must cover the large number of outcomes in Tomorrow's Doctors, but the outcomes are all assessed in different ways at different medical schools.
- The role of external examiners in medical schools should be looked at.

Theme five: Reporting outcomes

Participants in the workshops were asked to consider the way in which the results of our quality assurance activities should be reported. Their feedback is summarised below.

Purpose of our reports

- Reports should assure the public that there is a robust process in place to identify poor performance and that action has been taken against poor performance.
- Standards can be raised by feeding back the outcomes of quality assurance activity and disseminating examples of good practice.
- Reports should provide an evidence base for quality improvement. Deaneries and LEPs need objective data to deal with problems in posts and programmes.
- Reports provide assurance to medical schools and deaneries that training placements are fit for purpose, and can help alter the shape of placements.
- GMC reports can highlight issues that a school cannot.
- GMC reports could look at how LEPs add value to training given the resources available to them (and the cohorts of trainees in the environment). This might be part of approving the educational environment.

Audience for reports

- There are an increasing number of stakeholders interested in medical education and training. The audience includes patients, the public, students, potential students and trainees, but these are not considered the primary audience. The primary audiences are medical schools, deaneries, LEPs (Directors of Medical Education and Training Programme Directors) and commissioners, who want to know how they are performing.
- Reports should be accessible to patients, the public and trainers, but currently they are not, as they are very technical. To make them more publicly accessible the GMC would need to provide context by explaining the GMC's role in education and the link between good quality education and quality care. This needs broader communication than just writing reports in a different way.
- The public will use education as a proxy for the organisation as a whole.
- There was discussion about whether weaknesses should be reported to the institution concerned but not made public in the first instance. Only if problems persisted unaddressed should they be made public.

Describing or scoring the outcomes of QA activity

- The GMC could be more transparent in the way it reports, setting out more evidence about the standards of the training programme. The GMC could identify where an institution has been given warnings, where it has been commended and those areas in which it has excelled.

- There was some doubt about the utility of explicit descriptors in reports, such as the term "excellent", or public warnings, because they were invariably out of date by the time the GMC report was published.
- Reports should include longitudinal information to show progress over time.
- GMC reporting should be more timely.

Comparing the performance of different institutions

- There was debate about the value of ranking and league tables to enable institutions to be compared. This was not generally supported. If ranking was introduced, the rationale for an organisation's ranking would need to be explained.
- It was said that league tables miss the point of education and training and encourage gaming. However, there is already competitive ranking in the National Student Survey, and it is prominent in medical education.
- It was noted that ranking would change the relationship between the GMC and the other parts of the QA system.
- In order to compare the performance of different institutions, the metrics would need to be reliable and the data accurate.
- Comparisons won't necessarily take account of the fact that some Trusts will always be less attractive to trainees and trainers for reasons (such as geography) which are outside the control of deaneries. Unfavourable comparisons may therefore make it even harder for such Trusts to attract the resources which would help them to improve, and may only serve to exacerbate problems.
- Even without league tables, it should still be possible to say whether an organisation is failing to meet standards, meeting standards, good or excellent. This might argue for having core and developmental standards.

What should the GMC report on?

- There was some support for reporting against a combination of core and developmental standards, but these would need robust evidence. Consideration should be given to weighting the standards, as measures of quality are not equal.
- Reports should highlight good and bad practice, but it is hard to find good practice if QA processes adopt an exclusively risk-based approach.
- The GMC does not need to report in great detail. Reporting in great detail gives the sense of the GMC repeating what deaneries do.
- The outcomes of doctors completing training should be available, broken down by specialty and region. Introducing a survey focusing on newly qualified consultants would provide an overview of a particular training programme.
- LEPs are under resource constraints. If the GMC wishes to influence their behaviour, its reports must be based on robust and objective data, not just perceptions.
