Theme: Licensing exams, credentialing and revalidation


Assessment, Appraisal, Achievement - Exploring the value of medical educators' portfolios

*Dalton CL, Wilson A, Agius SJ

Today’s doctors are required to teach colleagues and other health professionals and are responsible for their own professional development as educators, evidence of which must be formally documented. Portfolios are a well-recognised tool used to support learning and assessment within medical education. Their main focus has been on assessment of doctors’ clinical performance, providing evidence of competencies and professional development for progression through training or appraisal. Portfolios with the specific aim of capturing educational activities remain underused by clinicians with a special interest in education, especially early in their careers. Additionally, portfolios seem a neglected source of evidence when assessing or appraising a doctor’s education activities.

Portfolios can be used for the storage of evidence, demonstration of activities, stimulation of reflection, gathering of feedback, setting of goals, and presentation of achievements. Their structure should reflect their purpose. Experiments with portfolios suggest that, for evidence to be meaningful to assessors, it should be organised to reflect the competencies learners want to demonstrate. When collating a medical educator’s portfolio (MEP) for assessment or appraisal, a focussed approach should be adopted to avoid assessors having to search for value amongst vast amounts of material. One approach would be to organise evidence according to the five domains that make up the core values of medical educators (AoME, 2014). These can be further subdivided depending on which competencies need to be assessed, and evidence for the MEP selected accordingly. Portfolios should provide insight into the learner’s development and progress, so entries may benefit from reflective notes explaining how evidence demonstrates particular competencies or required achievements.

MEPs are unique and dynamic reflections of a clinician’s educational activities and attainments. The data captured are highly qualitative but, if their structure and contents are chosen wisely, MEPs can be excellent tools for the assessment and appraisal of medical educators.

*Corresponding Author Mrs. C. Lucy Dalton, Arrowe Park Hospital,

DTH entry requirements and life science knowledge: Exploring the relationship

*Zahra, D., Belfield, L., McIlwaine, C.

The Dental Therapy and Hygiene (DTH) course at Plymouth University Peninsula Schools of Medicine and Dentistry is a new, innovative three-year course, currently preparing to accept its third cohort into Year 1. DTH students are accepted from a range of backgrounds and enter the programme with a much more diverse range of prior qualifications than the Bachelor of Dental Science (BDS) students. All modules in Year 1 of the DTH programme are taught in an integrated fashion with students in the first year of the BDS course. Assessments for these modules are standard set in an integrated fashion as well, so it is important to ensure that DTH students are not disadvantaged by the difference in entry requirements between the two programmes. The current work explores the relationships between prior qualifications and attainment in dental science MCQ examinations, and also considers the relationship between performance in dental science exams and applied dental knowledge exams. The results are discussed in light of equality and diversity and widening participation, and in relation to how they can inform selection and admission for future cohorts.

*Corresponding Author Dr Elizabeth Gabe-Thomas, Plymouth University Peninsula Schools of Medicine and Dentistry,
*Corresponding Author Dr Daniel Zahra, Plymouth University,

The Semantics of Remediation - primed to fail? The impact of negative word association upon student performance in remediation

*Dr James Read, Peninsula Schools of Medicine & Dentistry,

The long-term impacts of remediation remain uncertain; evidence suggests that exam-focussed interventions are often short-lived, with any benefit wearing off after a short period of time. The same is true of interventions addressing lapses in professionalism, with recent studies demonstrating that medical students who are remediated for professional lapses are much more likely to be involved in fitness to practise proceedings as registered practitioners.

To date, guidance in relation to remediation policy has been issued by only a small number of bodies. For post-graduate trainees this limits the sources of guidance to either the GMC or the Academy of Medical Royal Colleges (AoMRC). Other organisations, including those which coordinate the training of doctors, such as local education and training boards, cannot offer guidance but can only direct people to other existing sources of information. Consequently, our paper explores the existing guidance for remediation that is available to trainees.

Previous research conducted within an NIHR Academic Clinical Fellowship funded Masters in Clinical Education was used to abduce hypotheses about the experiences of students who were undergoing remediation whilst still in undergraduate education. This study indicated that students’ expectations of remediation were significantly more negative prior to taking part in remediation than after being involved in a programme. From this we hypothesised that the language used in remediation guidance primes students to have a negative experience of remediation, potentially leading to unneeded psychological morbidity.

A semantic analysis of the textual data from the AoMRC highlighted the paucity of the information provided by medical schools and royal colleges for medical students, and indicated the importance of organisations that offer information about remediation doing so appropriately, so that trainees engage with remediation from the most informed and positive position possible.

The making of a new Swedish Licensing test for Non-EU/EES physicians

*Hultin M, Själander A, Edin B

Presently there are three ways to become a licensed physician in Sweden: (1) 5.5 years of study at a Swedish university followed by a 1.5-year internship; (2) converting an EU/EES licence, e.g. one obtained through 6 years of study at a European university; or, for holders of a non-EU/EES licence, (3) taking the Swedish Licensing test followed by a clinical rotation.

Umeå University was assigned in April 2016 to create a new Swedish Licensing test for non-EU/EES physicians, to be available no later than October 2016. Those passing the theoretical and practical tests will be offered a short clinical rotation in which other aspects of a practitioner’s knowledge, skills and attitudes will be assessed.

A framework has been developed based on the Swedish Acts regulating the requirements for graduating from the Medical Programme and for becoming a licensed physician after internship. The framework will be refined and delineated by a National Reference group including the Swedish medical schools and selected stakeholders. The plan is that the test should comprise two main parts: one theoretical and one practical. The theoretical part includes Basic sciences (MCQ/SBA), Clinical sciences (MCQ/SBA), Patient cases (MEQ/SBA) and Scientific scholarship (MCQ/SBA). The practical test (OSCE), on the other hand, consists of multiple stations, each requiring 6-14 minutes. Categories and topics for assessment during the clinical rotation will likely be developed in autumn 2016.

In short, the aim is to design a high-quality test programme for assessing the knowledge and skills needed to be a licensed physician in Sweden. The presentation will discuss the design of the tests and the problems encountered in the process.

*Corresponding Author Assoc Prof Magnus Hultin, Umeå University,

MRCPsych Written Examinations - rationalising the content

Mrs Kiran Grewal, Royal College of Psychiatrists,

The Royal College of Psychiatrists’ MRCPsych has traditionally consisted of 3 written examinations, each with 200 questions. A feasibility study was undertaken into reducing a) the number of examination papers to 2, and b) the number of items in each paper, whilst maintaining current reliability (0.88-0.95 in 2014), considered good to excellent for such examinations (George and Mallery, 2003).

Stage a) consisted of reviewing and remapping the syllabus from across 3 papers to 2, creating mock papers, investigating their psychometric properties and conducting an equality analysis of candidates’ performance in them. Stage b) used the 200-item papers as references from which several random proportions of questions were selected and their reliabilities analysed. The anticipated reliability of such shorter tests was confirmed using the Spearman-Brown prophecy formula.
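As a sketch of that check (using illustrative values rather than the study’s data), the Spearman-Brown prophecy formula predicts the reliability of a test whose length is scaled by a factor k, given the full-length reliability r, as kr / (1 + (k - 1)r):

```python
def spearman_brown(full_reliability: float, length_factor: float) -> float:
    """Predict the reliability of a test scaled to length_factor times its
    original length (e.g. 0.6 for a paper cut to 60% of its items), given
    the reliability of the full-length test (Spearman-Brown prophecy formula)."""
    k = length_factor
    r = full_reliability
    return (k * r) / (1 + (k - 1) * r)

# Illustrative full-length reliability of 0.90 (not the College's figures):
print(round(spearman_brown(0.90, 0.6), 3))  # paper cut to 60% of items -> 0.844
print(round(spearman_brown(0.90, 0.5), 3))  # paper cut to 50% of items -> 0.818
```

Note that the formula works in both directions: a length factor above 1 predicts the reliability gained by lengthening a test, which is its more traditional use.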

Analysis for stage a) showed no significant difference in paper reliability, pass rates, test scores, standard deviation of results or SEMs between current and new papers. Equality analysis found performances by most candidate groups remained the same, whilst the performance gap between PMQ/non PMQ candidates reduced.

Analysis for stage b) found that retaining only 50% of the items would give Cronbach’s alpha values of 0.83-0.88, and 60% gave 0.85-0.91.

In conclusion, the creation of 2 papers with fewer questions overall will not compromise desirable paper statistics, and within these, 120 questions yields the same quality of information as 200 questions. Streamlining the papers may have the added benefit of reducing the performance gap between key groups.

Such a change would have financial, administrative and time resource benefits for the organisation, and time, cost and emotional investment benefits for examinees (Wainer and Feinberg, 2015).

Practical considerations include transitional arrangements, candidate feedback and quality assurances of items used.

Ultimately, smaller written examinations with good quality paper design and item selection yielding high quality information should be desired.