Adrian is a Professor of Medical Education at the University of Exeter Medical School and a practising General Practitioner (Family Medicine Specialist). He has been heavily involved in medical education at both undergraduate and postgraduate levels, and his particular interest is assessment in education.
He is an examiner for the Royal College of General Practitioners and led the development of the College's licensing examination of Clinical Skills. He is Chair of the College's MRCGP INT board, a programme that accredits overseas training programmes and assessments in Family Medicine and is active in nine countries around the world.
Adrian has personal experience of working on such projects in Oman, Dubai, Malta, Kosovo and China. He has previously worked on related projects in Egypt and Libya, and has delivered teaching courses in Indonesia and Canada.
Adrian is President of the European Board of Medical Assessors and Deputy Chair of the GMC Panel for Tests of Competence. He has been a member of the International Board of the American Board of Medical Examiners. He is currently on the Board of the UK Medical Schools Council Assessment Alliance and a Council member of the Academy of Medical Educators.
Qualifications
MMedSci
FRCGP
FAcadMed
Publications
Journal articles
Rendel S, Foreman P, Freeman A (2015). Licensing exams and judicial review: the closing of one door and opening of others? Br J Gen Pract, 65(630), 8-9.
Denney ML, Freeman A, Wakeford R (2013). MRCGP CSA: are the examiners biased, favouring their own by sex, ethnicity, and degree source? Br J Gen Pract, 63(616), e718-e725.
Bennett J, Freeman A, Coombes L, Kay L, Ricketts C (2010). Adaptation of medical progress testing to a dental setting. Med Teach, 32(6), 500-502.
Abstract: Although progress testing (PT) is well established in several medical schools, it is new to dentistry. Peninsula College of Medicine and Dentistry has recently established a Bachelor of Dental Surgery programme and has been one of the first schools to use PT in a dental setting. Issues associated with its development and its adaptation to the specific needs of the dental curriculum are considered.
Coombes L, Ricketts C, Freeman A, Stratford J (2010). Beyond assessment: feedback for individuals and institutions based on the progress test. Med Teach, 32(6), 486-490.
Abstract: BACKGROUND: Progress testing is used at Peninsula Medical School to test applied medical knowledge four times a year using a 125-item multiple choice test. Items within each test are classified and matched to the curriculum blueprint. AIM: To examine the use of item classifications as part of a quality assurance process and to examine the range of available feedback provided after each test or group of tests. METHODS: The questions were classified using a single best classification method. These were placed into a simplified version of the progress test assessment blueprint. Average item facilities for individuals and cohorts were used to provide feedback to individual students and curriculum designers. RESULTS: The analysis shows that feedback can be provided at a number of levels, and inferences about various groups can be made. It demonstrates that learning mostly occurs in the early years of the course but, when examined longitudinally, it shows how different patterns of learning exist in different curriculum areas. It also shows that the effect of changes in the curriculum may be monitored through these data. CONCLUSIONS: Used appropriately, progress testing can provide a wide range of feedback to every individual or group of individuals in a medical school.
Freeman A, Nicholls A, Ricketts C, Coombes L (2010). Can we share questions? Performance of questions from different question banks in a single medical school. Med Teach, 32(6), 464-466.
Abstract: BACKGROUND: To use progress testing, a large bank of questions is required, particularly when planning to deliver tests over a long period of time. The questions need not only to be of good quality but also balanced in subject coverage across the curriculum to allow appropriate sampling. Hence, as well as creating its own questions, an institution could share questions. Both methods allow ownership and structuring of the test appropriate to the educational requirements of the institution. METHOD: Peninsula Medical School (PMS) has developed a mechanism to validate questions written in house. That mechanism can be adapted to utilise questions from an international question bank, the International Digital Electronic Access Library (IDEAL), and a UK-based question bank, the Universities Medical Assessment Partnership (UMAP). These questions have been used in our progress tests and analysed for relative performance. RESULTS: Data are presented to show that questions from differing sources can have comparable performance in a progress testing format. CONCLUSION: There are difficulties in transferring questions from one institution to another, including problems of curricular and cultural differences. Whilst many of these difficulties exist, our experience suggests that it requires only a relatively small amount of work to adapt questions from external question banks for effective use. The longitudinal aspect of progress testing (albeit summative) may allow more flexibility in question usage than single high-stakes exams.
Ricketts C, Freeman A, Pagliuca G, Coombes L, Archer J (2010). Difficult decisions for progress testing: how much and how often? Med Teach, 32(6), 513-515.
Abstract: This article is primarily an opinion piece which aims to encourage debate and future research. There is little theoretical or practical research on how best to design progress tests. We propose that progress test designers should be clear about the primary purpose of their assessment. We provide some empirical evidence about reliability and cost based upon generalisability theory. We suggest that future research is needed in the areas of educational impact and acceptability.
Freeman A, Van Der Vleuten C, Nouns Z, Ricketts C (2010). Progress testing internationally. Med Teach, 32(6), 451-455.
Chamberlain S, Freeman A, Oldham J, Sanders D, Hudson N, Ricketts C (2006). Innovative learning: employing medical students to write formative assessments. Med Teach, 28(7), 656-659.
Abstract: Peninsula Medical School, UK, employed six students to write MCQ items for a formative applied medical knowledge item bank. The students successfully generated 260 quality MCQs in their six-week contracted period. Informal feedback from students and two staff mentors suggests that the exercise provided a very effective learning environment and that students felt they were 'being paid to learn'. Further research is under way to track the progress of the students involved in the exercise, and to formally evaluate the impact on learning.
Freeman AC, Sweeney K (2001). Why general practitioners do not implement evidence: a qualitative study. BMJ, 323, 1100.
Conferences
Siriwardena AN, Edwards AGK, Campion P, Freeman A, Elwyn G (2006). Involve the patient and pass the MRCGP: investigating shared decision making in a consulting skills examination using a validated instrument.
Awards
2017: RCGP President's Medal for International Work
External Examiner Positions
Examiner, MRCGP (previously led the development of the national Clinical Skills Assessment)
I have been External Examiner for undergraduate medical programmes at Birmingham, St George's London, Imperial College London, King's College London, Newcastle, Sheffield, Leeds, Galway and Singapore.
External positions
President, European Board of Medical Assessors
Deputy Chair, GMC Panel for Tests of Competence
Chair, Royal College of General Practitioners Panel of International Accreditation (MRCGP INT)
Board Member, Medical Schools Council Assessment Alliance
Council Member, Academy of Medical Educators
Educational Adviser to the MRCP examination
Educational Adviser to NCAS (National Clinical Assessment Service)
Academic Adviser to the Dean of the Worshipful Society of Apothecaries
Through postgraduate and undergraduate medical education I have supported developments internationally in Oman, Malta, Kosovo, Dubai, Egypt, Libya, China, Myanmar and Indonesia.
Assessments in undergraduate and postgraduate medical education