Introduction and background

Dental foundation training (DFT) is a period of postgraduate training and development in the United Kingdom (UK). Eligibility requires both successful completion of a Bachelor of Dental Surgery (BDS) degree, or equivalent, and success in the national recruitment selection process. DFT represents the first rung on the employment ladder after initial qualification and the gateway to life as a general dental practitioner. Successful completion of DFT is mandatory for UK graduates who wish to join the National Performers List and practise independently in NHS primary care in the UK.1

Nationally coordinated recruitment was introduced into DFT (formerly vocational training [VT]) in 2011 for dental graduates wishing to pursue DFT in England, Wales and Northern Ireland.2 A separate recruitment process is in place for VT in Scotland.3 The national recruitment model was designed by the Committee of Postgraduate Dental Deans and Directors (COPDEND) to safeguard 'an equitable and transparent recruitment process that minimises disruption to dental students and Educational Supervisors', moving away from the previous deanery-led process.2

The regular DFT recruitment pathway is outlined in Figure 1. Candidates attend one of six assessment centres (Newcastle, Manchester, Bristol, Birmingham, Belfast, London), with centre allocation based on the candidate's dental school. A candidate's overall ranking is based on performance across two face-to-face assessments (50% [25% each]) and a situational judgement test (SJT) (50%).3 The face-to-face assessment consists of two ten-minute stations, the format of which is outlined in Table 1.4 It allows standardised, probing questions and gives candidates the opportunity to explain the reasoning and rationale behind their responses against predetermined marking criteria.4,5
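For illustration only, the pre-2021 weighting described above can be expressed as a simple calculation. The sketch below assumes each component is reported on a common 0-100 scale; the function and variable names are illustrative and do not represent the official COPDEND scoring implementation.

```python
# Minimal sketch of the DFT ranking weighting described above (illustrative
# only): the SJT contributes 50% and each face-to-face station 25%.
# The 0-100 scales are an assumption, not the official scoring scale.

def overall_score(sjt: float, station_one: float, station_two: float) -> float:
    """Weighted combination of the three assessment components."""
    return 0.5 * sjt + 0.25 * station_one + 0.25 * station_two


# Example: SJT 80, communication station 70, professionalism station 60.
print(overall_score(80, 70, 60))  # 72.5
```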

Fig. 1 DFT recruitment pathway

Table 1 DFT recruitment assessment stations4

National recruitment has radically transformed the selection and allocation of dental graduates, amid fierce competition for training schemes.6 The ongoing COVID-19 pandemic has, however, mandated another radical change: recruitment for 2021 entry to DFT will be based exclusively on a candidate's score in the SJT, making the process even more competitive.3 This SJT-only method was also recently employed as a COVID-19 contingency measure for recruitment to dental core training in 2020. It has created a recruitment 'lottery' of sorts, whereby selection hinges solely on the SJT: a deviation from previously declared 'best practice', under which experts recommended that the SJT be used as part of a wider selection process to ensure efficient and effective selection.7 The current reliance on the SJT may be due to the limited timeframe available to implement alternative measures and the SJT's proven testing capacity.

The SJT

An SJT is a type of psychometric evaluation designed to test a candidate's response to hypothetical scenarios they may encounter in the workplace.3 The SJT was formally introduced into DFT recruitment in 2014 and assesses non-academic, professional domains such as: professional integrity; resilience and coping with pressure; empathy and communication; and working effectively as part of a team.3,7 It has evolved from a multiple-choice, paper-based assessment taken on the same day as the face-to-face component to the current computer-based format, undertaken separately at Pearson VUE test centres.1,3 The SJT lasts 105 minutes and comprises 56 questions: two-thirds are ranking-based questions, requiring responses to be ordered from most to least appropriate, and one-third are multiple best-answer questions, which require selection of the three most appropriate responses to the scenario.1 An example of a ranking-based question, previously published in the British Dental Journal, is shown in Box 1.7

The SJT was designed and developed by the Work Psychology Group, together with experienced psychometricians and subject matter experts such as dental trainers and educators, as a standardised selection tool that complements other selection methods; that is, it is not a standalone method of recruitment.7 Indeed, these very experts have emphasised the importance of using multiple methods of selection assessment rather than relying solely on one approach.5 SJTs have also been successfully implemented in recruitment across medicine and pharmacy, as part of a multi-component process.8

The evidence, supported by years of international research, increasingly suggests that the SJT is a well-accepted, reliable and valid assessment tool.9,10,11,12,13,14 Moreover, research in medicine has highlighted that the general practice (GP) SJT is not only the single best predictor of performance at the final stage of UK GP recruitment, but has also been shown to provide a reliable forecast of performance throughout training and licensing exams.15 Stakeholder concerns about the SJT have, however, previously been raised across medicine and dentistry, regarding aspects such as: the weighting of the SJT component; the lack of assessment of ethical and moral reasoning; disadvantage to those whose university curriculum involves greater study of medical ethics; the absence of any measure of clinical and academic ability; and the risk of 'coaching' and 'faking'.8,16,17,18,19

SJT questions are mapped and constructed against targeted non-academic, professional domains, although each question does not exclusively measure a single domain. Each question encompasses a complex yet realistic workplace scenario and, as a result, avoids the possibility of the candidate giving a 'template' answer, thus providing a reliable gauge of the candidate as a whole. SJTs are expensive to develop and each DFT SJT diet normally runs at least two versions of the paper each year; hence, items need to be reused. Reuse creates risks surrounding item exposure; however, reused items can serve as 'anchor items' for comparing cohort performance. It could be assumed that the longer the SJT has been running, alongside the creation of new items, the greater the proportion of items that could be reused.20

Preparation courses

Sole reliance on one method of ranking candidates has naturally increased the anxiety and pressure students are experiencing. Students have begun to voice their concerns, as they no longer have face-to-face components to complement the SJT, which previously allowed them to 'present their personalities and enthusiasm' to prospective trainers.21

It has been noted that, in theory, candidates can neither prepare nor revise for the SJT.3,20 Following the announcement by COPDEND in June 2020 regarding the need to adapt DFT recruitment for 2021, there has been a meteoric rise in the volume of new SJT mocks, question banks, workshops and courses promoting their services. The majority of these services are not free, with some charging upwards of £100 per person: a shameless monetisation of the collective angst of undergraduates, although others make their resources available free of charge. These courses highlight a very real risk of 'coaching' and 'faking', and there has been previous widespread concern regarding the possibility for candidates to be 'tutored' for the SJT.20 Recent studies, however, have indicated little effect of commercial coaching on either the predictive validity of SJTs or SJT scores.11,22,23

While practice does not make perfect, preparation may certainly make proficient. However carefully they are designed, low-fidelity assessments such as the SJT can never truly 'tap into' the actual behaviour of a candidate, only what the candidate chooses to reveal about themselves.20 It is here that the susceptibility to 'faking' the SJT presents itself.24 Preparation courses pose a real threat of faking, with limited evidence suggesting otherwise.18,24,25 Faking relates to the moral dilemma of what a candidate would do in a given scenario versus what a candidate should do. In essence, there is no 'correct' answer; the aim is simply to give a response matching, or similar to, that chosen by a panel of subject matter experts. It is, however, important to note that the subject matter experts and psychometricians involved in SJT item creation are not involved with any of the courses or workshops that are available. Instead, each course justifies its rationale in its own nuanced way, which makes it difficult for participants who have taken multiple workshops or courses to identify a blueprint or pathway to the 'correct' answer.26

Practice questions and simulated mocks enable candidates not only to familiarise themselves with the wording of questions but, more importantly, to acquaint themselves with the pace required when answering: a key consideration given the time-sensitive nature of the assessment, which allows less than two minutes per question. Affleck et al.16 suggest that candidates more familiar with the assessment format would perform better than their peers; further research into whether students can be trained to answer the DFT SJT is required.20 Every point counts in the SJT, and previous studies have highlighted that leaving even just three SJT questions unanswered, due to (for example) running out of time, could move a candidate out of the collective average and into the bottom 3% of applicants.19,27,28
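The pacing figure quoted above follows directly from the published test length; a minimal check, assuming time is distributed evenly across the 56 items:

```python
# Arithmetic behind the 'less than two minutes per question' figure quoted
# above, assuming time is spread evenly across all items.

TOTAL_MINUTES = 105
NUM_QUESTIONS = 56

seconds_per_question = TOTAL_MINUTES * 60 / NUM_QUESTIONS
print(f"{seconds_per_question:.1f} seconds per question")  # 112.5 seconds
```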

For those who choose not to partake in preparation courses or workshops, whether due to financial constraints or otherwise, there are ten official practice questions on the COPDEND website, available since 2016.29 Crucially, these questions are accompanied by the answers and the expert rationale for each response, which can give candidates an insight into where they went 'wrong'. Additional practice questions were promised by COPDEND from September 2020; however, at the time of writing, these have not materialised.30

Subject matter experts and critics alike have been calling since 2016 for the provision of a full mock DFT SJT paper, akin to those available for the UK Foundation Programme (UKFP) for aspiring foundation doctors.5,16,17,31 These downloadable UKFP practice papers are available free of charge and provide not only the questions, responses and full expert rationale for each item, but also an online practice test in the exact format of the live assessment on the Pearson VUE website.31,32

The future of national recruitment in DFT

The COVID-19 pandemic has mandated unprecedented changes across undergraduate and postgraduate education. Dental schools and deaneries have adapted their delivery of teaching and assessment towards remote methods, away from previously established face-to-face models. Recruitment has followed suit and COVID-19 contingency measures are currently in place across medical and dental recruitment and selection.3

Face-to-face assessment has never been part of national recruitment for foundation doctors, and perhaps the COVID-19 pandemic presents an opportunity for DFT to adopt elements of this process. Where medicine leads, dentistry nearly always follows, and although a process that mirrored the recruitment of foundation doctors was discussed in the early planning phases of national recruitment to DFT, it was not delivered.6 Despite the initial expense of SJT item creation, Ismail and Patel33 feel the SJT provides a cost- and time-effective mode of assessment. Although face-to-face interviews may provide a more dynamic assessment modality, the logistic and economic feasibility of these methods, in light of a COVID-19-burdened NHS, needs to be considered.33

Academic performance measure

The significant difference between medical and dental recruitment to foundation training is the distinct lack of any link to academic performance in the dental model. Medical and dental students are some of the most academically assessed students in the UK, yet suddenly, performance across five years of dental school simply does not matter. The medical model, by contrast, bases 50% of recruitment on SJT performance (50 points) and 50% on an 'educational performance measure' (EPM [50 points]).34 The EPM is a measure of clinical and non-clinical skills, knowledge and performance up to the point of application.34 It comprises two distinct elements: medical school performance, ranked in deciles and worth up to a maximum of 43 points, and additional educational achievements, worth up to a maximum of seven points. The decile scores are calculated by each medical school and divide a year group into ten equal groups (deciles), based on performance across an agreed number of assessments.34 Additional educational achievements comprise up to five points, depending on the highest level of additional degree the candidate holds, plus one point for each publication with a PubMed ID on which the candidate is a named author (maximum two points) (Table 2).19,34,35
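As a purely illustrative sketch of the EPM arithmetic described above (and in Table 2), the calculation below assumes the top decile scores the maximum 43 points and each lower decile scores one point fewer, down to 34; the exact tariff should be taken from the UKFP applicant handbook rather than this example.

```python
# Hedged sketch of the EPM point arithmetic described above. The decile
# tariff (43 for the top decile down to 34 for the lowest) is assumed from
# the stated 43-point maximum and ten deciles; degree points are capped at
# five and PubMed-indexed publications at two, giving a maximum of 50.

def epm_points(decile: int, degree_points: int, pubmed_publications: int) -> int:
    """Educational performance measure, out of a maximum of 50 points."""
    if not 1 <= decile <= 10:
        raise ValueError("decile must be between 1 (highest) and 10 (lowest)")
    medical_school_points = 43 - (decile - 1)            # 43 down to 34
    additional_degree = min(max(degree_points, 0), 5)    # capped at 5
    publications = min(max(pubmed_publications, 0), 2)   # capped at 2
    return medical_school_points + additional_degree + publications


# Example: a top-decile candidate with an additional degree worth (say) three
# points and one PubMed-indexed publication: 43 + 3 + 1 = 47 of 50.
print(epm_points(decile=1, degree_points=3, pubmed_publications=1))  # 47
```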

Table 2 Educational performance measure19,34,35

The EPM is, however, not without its critics. It ranks a candidate's educational performance only against peers within their own medical school and, as a result, is not standardised across the UK and is subject to substantial variability.36,37 While common sense would dictate that those who perform well academically at university also perform well in the SJT, the evidence suggests otherwise.38 Simon et al.22 found no correlation between academic prowess and SJT performance, giving credence to dental recruitment incorporating a measure of academic performance to complement the domains assessed by the SJT. Medical students have voiced concerns that this may allow less academically shrewd students to exploit the SJT to significantly boost their ranking; however, as the SJT is only one of a multitude of deciding factors, it could be assumed that those who perform well across the board reap the most reward.38 Anyone, even the most academic student, can experience an 'off day', and when the SJT is the sole method of ranking students for training schemes, five or more years of working hard towards a desired training scheme can be undone in the space of 105 minutes.3,38 The EPM serves to remove the 'snapshot' nature of the SJT and should be straightforward for dental schools to implement.

The entry requirements for dentistry can differ significantly between universities; hence, competition for entry is fiercer at some dental schools than at others. This might suggest that the standard of dental students varies between dental schools;36 however, after five years of extensive clinical and academic scrutiny, one would assume that all dental schools produce competent dentists, so why should academic performance not be included in DFT recruitment? There is currently no reward for dental students who achieve academic success.

Standardised assessment

Each dental school is responsible for designing its own curriculum; however, it must align with the learning outcomes outlined in the General Dental Council's Preparing for practice document.39 Differences in the teaching, training and assessment of dental students have led some educational supervisors to comment on the variation between individuals and between dental schools.40,41 Dental schools in the UK have yet to adopt any form of standardised assessment and set their own finals examinations independently, unlike the standardised National Board Dental Examinations used in the United States.42

The General Medical Council and Medical Schools Council have announced the introduction of a Medical Licensing Assessment (MLA) from academic year 2024/25.43 This will be the first time UK medical graduates are able to demonstrate they have met 'a common and consistent threshold for safe practice' before being licensed to practise in the UK.43

The MLA will comprise a two-part assessment: a computer-based applied knowledge test taken by every final-year medical student, and a clinical and professional skills assessment, led by each medical school and similar in format to an Objective Structured Clinical Examination.44 The MLA would be the ideal replacement for the medical school performance portion of the EPM and would ensure parity between medical schools, something that has long been anticipated.37

Recruitment and assessment in dentistry tend to follow the path of medicine, so it will be interesting to follow the evolution of the MLA. The debate around large-scale licensing examinations is driven by strong opinions but is bereft of validity evidence.45 Certainly, with doubt over the future of face-to-face methods of assessment and selection, the creation of a Dental Licensing Assessment would seem feasible, alongside the introduction of an academic performance measure into DFT national recruitment.

Additional COVID-19 contingency measures

It remains to be seen what additional methods of recruitment could have been employed as COVID-19 contingency adjuncts to the SJT, especially as new methods of recruitment must be subject to an equality impact assessment before implementation.46 The authors contend that no contingency measure should increase the burden of assessment on candidates, nor should any recruitment process utilise an assessment that a candidate was not already expecting to sit.46

Historically, candidates only discovered the detail of the face-to-face scenarios during the station reading time on the day of assessment. In 2017, however, COPDEND trialled the pre-release of multiple scenarios for both face-to-face stations a week before the interview period, of which candidates would be assessed on two chosen at random (one communication and one professionalism, management and leadership).47 This created a level playing field and aimed to make the process fairer, addressing previously reported candidate grievances that those interviewing later in the week might be at an advantage.14 Pre-release was modified in 2018, when only the communication station scenarios were released in advance, and this has been the adopted method ever since.47 With this in mind, it is our opinion that remote interviews could have been implemented, with all possible scenarios released on a specified date before a virtual interview period. We echo the sentiments conveyed by the British Medical Association that remote interviews, even with current service demands taken into consideration, would be possible with one or two assessors and a lay person present per station.46 The very nature of two ten-minute scenarios, even for applicants who require reasonable adjustments, circumvents the time limits of alternative, lengthy remote assessments and invigilation. The reported failure rate of remote assessment is also relatively low, at around 3%, usually due to lost internet connection or technical issues on the candidate's side.46

Evaluation of clinical performance has also been proposed; however, previous evidence shows wide disparity among UK dental schools with regard to finals requirements in restorative dentistry, hence it would be difficult to evaluate clinical performance equitably at the point of application.48

A virtual test of knowledge-type assessment, in a similar vein to that undertaken by Scottish foundation trainees, may have been considered, covering the 'legislative, financial and regulatory issues pertaining to dentistry'.49 Candidates already revise the bulk of these topics in preparation for recruitment assessment; however, this would create a complex new assessment, increasing the examination burden at a time when stress is already heightened by the nature of the pandemic.46

Conclusion

Recruitment to educational opportunities should be robust and transparent, with an evidence-based approach to the selection process. Reliance on the SJT as the sole method for ranking candidates is not recommended best practice and does not account for the domains and softer skills usually assessed at face-to-face interviews. Those responsible for selection to DFT should strongly consider the introduction of an academic performance measure. SJTs remain one of the most reliable, valid and cost-effective means of selection into healthcare roles but, the authors contend, only when used as part of a wider selection process.