Orthopedic & Muscular System: Current Research
Open Access

ISSN: 2161-0533

Review Article - (2014) Volume 3, Issue 4

A Review of Competency Based Orthopaedic Training in the UK: A Trainee’s Perspective

Graeme S. Carlile*
Royal Devon & Exeter NHS Foundation Trust, Barrack Road, Exeter, United Kingdom
*Corresponding Author: Graeme S. Carlile, Royal Devon & Exeter NHS Foundation Trust, Barrack Road, Exeter, EX2 5DW, United Kingdom, Tel: +61 0449116575

Abstract

This review explores the concepts and methodology in competency based education, with reference to orthopaedic training in the United Kingdom. In 2006, a new competency based curriculum for postgraduate training in Trauma and Orthopaedics was approved by the Postgraduate Medical Education Training Board. Though the curriculum is now widely accepted, few surgeons have a theoretical knowledge of the basis for competency-based medical education, beyond their own area of involvement. This paper explores the theory and concepts behind competency based education in UK orthopaedic training, which is also of relevance to comparable international orthopaedic training systems and the wider surgical specialities.

Keywords: Orthopaedic; Competency based orthopaedics; Medical education; OCAP; Surgical training

Introduction

In September 2006, a new competency based curriculum for postgraduate training in Trauma and Orthopaedics (T&O) was approved by the Postgraduate Medical Education Training Board (PMETB). The curriculum details and syllabus were comprehensively outlined shortly after PMETB approval in the 2007 publication Specialist Training in Trauma & Orthopaedics, A Competency Based Curricula [1]. The new curriculum, termed the Orthopaedic Competence Assessment Project (OCAP), would place the theory and practice of competency based education at the forefront of training the next generation of orthopaedic surgeons, and with it would come an array of new assessment tools to be used by trainees and trainers. Together with summative assessment in the form of the Intercollegiate Speciality Board Examination, the expected competencies outlined in the new curriculum would contribute towards the award of the Certificate of Completion of Training (CCT). Though the curriculum is now widely accepted and used throughout the speciality on a day-to-day basis, few surgeons have a theoretical knowledge of the basis for competency-based medical education beyond their own area of involvement. This review serves to highlight the concepts, issues and theoretical basis of competency based education (CBE), with specific reference to orthopaedic training.

Defining Outcome and Competency Based Education

The term “competency based education” was first used in the medical literature in 1973 [2]. The exact definition of competency-based education remains a topic of debate within the medical educationalist literature, which may have contributed to uncertainty amongst doctors in neighbouring branches of medicine. A systematic review by Frank et al. [3] in 2010 found 173 definitions within the published literature. Their paper proposed a 21st century definition of CBE, taking into account the key themes identified through qualitative analysis of those definitions:

“Competency-based education (CBE) is an approach to preparing physicians for practice that is fundamentally orientated to graduate outcome abilities and organised around competencies derived from an analysis of societal and patient needs. It de-emphasises time-based training & promises greater accountability, flexibility and learner-centeredness”.

Throughout the medical educationalist literature the terms ‘outcome’ and ‘competency’ appear to be used interchangeably by different authors. With reference to the definition proposed by Frank et al., an ‘outcome’ is the product obtained from a competency based approach, which is in turn defined by the target audience, in this case, patients and society.

Competency based education: Theory and Concepts

The move towards CBE in undergraduate training has largely been responsible for influencing the introduction of CBE into postgraduate training, which can be viewed as a logical progression. The reform movement in medical education began a hundred years ago with Abraham Flexner’s report to the Carnegie Foundation [4]. The ‘Flexnerian model’ was the traditional forward-planning model [5] of medical education that many of today’s consultants will be familiar with: fundamental knowledge is defined, taught and rigorously tested, usually using a summative examination as the assessment tool. The origins of CBE lie in the United States of America in the late eighties [6,7]. In 1990, Miller proposed a four-step model identifying levels of assessment for doctors, with an emphasis towards real life tasks [8]. With reference to a medical task, for example venepuncture, Miller’s pyramid outlines how a doctor may know the indication for venepuncture and the relevant anatomy (tier 1), know how to perform the task (tier 2), be able to show how to perform venepuncture, for example on a model (tier 3), and finally does the task in practice (tier 4): knows, knows how, shows how and does. Miller proposed that tiers 1 & 2 may be assessed using summative methods and tiers 3 & 4 using formative methods. Miller’s pyramid represents the clearest example of purely ‘performance’ based assessment, centred on the trainee acquiring the necessary knowledge and skills in order to perform the task, which is the final outcome.

In the United Kingdom (UK), the development and introduction of CBE into medical schools owes much to the work of Harden at the University of Dundee. Driven by the General Medical Council’s 1993 recommendations on undergraduate medical education [9], Harden et al. established the need for a core medical curriculum driven by clearly specified learning outcomes [10]. Building on the work of Spady [7] in the United States, Harden outlined two fundamental principles essential to CBE: learning outcomes should be identified and made explicit to all, and educational outcomes should dictate curricular content [11]. Harden argued that unlike the ‘Flexnerian model’, which demonstrated little consideration for the medical student’s capabilities as a future doctor, the ‘backward-planning’ model employed in CBE ensured that recognition of the eventual product defined the process. Key to the ‘process’ would be the identification of ‘learning outcomes’, conceptualised in a three-circle model [12].

The inner circle represents what the doctor or trainee is able to do and can be thought of as “doing the right thing”. It comprises seven learning outcomes related to technical intelligences, such as clinical and practical skills, patient investigation, management, communication, health promotion and documentation. The middle circle represents the approach to practice, or “doing the thing right”, and comprises three learning outcomes based on intellectual intelligences (basic science and knowledge), emotional intelligences (ethics, responsibility and probity) and analytical intelligences (decision-making and judgement). Finally, the outer circle represents the individual as a professional, with two learning outcomes based around personal intelligences: the individual’s role within the health service and personal development. Though Harden’s model was originally developed for CBE in the undergraduate setting, its fundamental principles can be applied to any sub-speciality trainee.

The model for competency-based education relies heavily on the behaviourist theory of learning [13,14]. Behaviourist theories assume that the environment influences and shapes behaviour [15], and behaviourism views the student as a blank slate or tabula rasa [16]. Behaviour is shaped through positive or negative reinforcement during or following an event; the best known example of early behaviourist work is that of Pavlov’s dogs. In the clinical setting, positive and negative reinforcement are given during feedback in reflective practice. Feedback tools provide a mechanism for reinforcement of a trainee’s performance and enable the individual to set and achieve goals [17]. Though feedback is now commonplace in the clinical setting, it can be seen as an example of educational learning theory in practice. Whilst the behaviourist model underpins much of CBE, it has also become one of its main criticisms. Critics of behaviourism do not accept that anyone can be trained to perform any given task [16], and argue that CBE focuses trainees on the minimum requirements to perform individual smaller tasks while ignoring the bigger picture and higher order thinking [13,18]. A trainee deemed “competent” does not perform a task in the same manner as an experienced clinician, nor do experienced clinicians all perform it in the same manner as one another [19,20].

OCAP curriculum and design

The Orthopaedic Competence Assessment Project (OCAP) was born from a wider appreciation of the need for change in surgical training. Following on from the recommendations of the Bristol Inquiry [1,21], the Joint Committee on Higher Surgical Training (JCHST) established a Competence Assessment Working Party and recommended generic and clinical competencies for all surgical trainees in 2002. Further political motivation towards reform of postgraduate training to a streamlined, competency-based approach [22] was provided by the publication of Unfinished Business – Proposals for Reform of the Senior House Officer Grade [23] in 2002, and the government’s response to this in the 2003 publication Modernising Medical Careers (MMC) [24]. OCAP was established in December 2002 with the mission statement to “improve the quality of Higher Surgical Training in orthopaedics through the introduction of a competence based portfolio of coaching & assessment tools”.

As with all competency-based curricula, the eventual outcome drives the process by which the curriculum is designed. The outcome of training, as set out in the OCAP syllabus, is to produce a Trauma & Orthopaedic (T&O) Consultant who is proficient in the management of trauma patients and has a routine elective surgery commitment. The syllabus is focused around three independent components: applied clinical knowledge, applied clinical skills, and professionalism and management. Whether intentional or not, these bear a similarity to Harden’s three-circle model of technical, intellectual and emotional intelligences. The modular syllabus is divided into three phases over eight training years. The learning outcomes throughout the Initial Phase, Speciality Trainee (ST) years 1 and 2, are generic to most surgical specialities and focus on basic principles; much of the content is shared with the pan-surgical Intercollegiate Surgical Curriculum Programme (ISCP) syllabus. In addition to demonstration and portfolio evidence of competencies, trainees are required to pass the intercollegiate membership examinations prior to progressing to the Intermediate Phase (ST3-6), during which the trainee is expected to acquire competencies equivalent to those of a consultant practising at a district general hospital. Having completed the Intercollegiate Speciality Board Fellowship Examination, the Final Phase (ST7-8) allows the trainee to acquire the remaining competencies and begin to develop a specialist interest that will be taken into consultant practice following completion of training.

The curriculum assessment tools have a strong emphasis on feedback and reflective practice. Trainers are encouraged to produce a mini curriculum vitae (CV), reflecting on elements of their practice, for trainees to view. Based on this, trainer and trainee together identify learning objectives for the attachment. Core competencies in applied clinical skills (Table 1) are formatively assessed using Procedure Based Assessments (PBAs). Fourteen ‘Core Competencies’ were initially selected to reflect the generality of orthopaedic training, and on the basis of what a “day one consultant” may be expected to do in practice.

1. Carpal tunnel decompression
2. Digital & palmar fasciectomy
3. Diagnostic arthroscopy & simple arthroscopic procedures
4. Total knee replacement
5. Application of limb external fixator
6. 1st ray surgery
7. Compression hip screw for intertrochanteric fracture neck of femur
8. Hemiarthroplasty for intracapsular fracture neck of femur
9. Total hip replacement
10. Lumbar discectomy
11. Operative fixation of Weber B fracture of ankle
12. Fixation of patella by tension band wiring
13. Intramedullary nailing of femur or tibia
14. Tendon repair

Table 1: Core Competencies.

Before and after each clinical attachment, trainees are encouraged to reflect on their knowledge of the syllabus and rank their level. The importance of reflective practice throughout OCAP cannot be overstated. The curriculum authors go to great lengths to emphasise reflection in the syllabus [1] and include an explanation of Kolb’s learning cycle [25]. Reflection allows trainees to develop a sense of perspective, explore the rationale behind decision making and learn from their experiences [26]. Feedback has been consistently shown to have a major impact on learning and professional development [27]. Additional workplace-based assessment tools adopted by OCAP include the mini clinical evaluation exercise (mini-CEX), direct observation of procedural skills (DOPS) and the mini peer assessment tool (mini-PAT) [28]. By way of a specific reference to Miller’s pyramid [8] in the curriculum [1], OCAP considers PBAs to target the highest level of assessment, with mini-CEX and DOPS targeting the middle levels [29]. Trainees are encouraged to develop a “portfolio of evidence of training” using the OCAP assessment tools. Together with supporting evidence from the electronic logbook [30], to which OCAP is linked, and demonstration of participation in research and clinical audit, the trainee’s progression is formally appraised at the Annual Review of Competence Progression (ARCP) against their learning objectives.

Competency based education: contemporary issues

The competency based approach as a paradigm has drawn considerable criticism, and this is not specific to OCAP in isolation. The use of the term “competent” has drawn anger from those at odds with its use in the context of a desired goal, to be competent rather than excellent [13,16]: critics argue that trainees focus on the minimum standards required to pass and become obsessively driven to achieve micro targets, rather than developing higher order thinking on a macro level. The desire at undergraduate level to reduce the factual burden [9] has been described as reductionism [31] in basic knowledge, and is reflected in the feedback from doctors trained under the new system [32]. At postgraduate level the validity of assessment has been called into question. In a systematic review of original articles on the reliability and validity of assessments in postgraduate certification, only 55 of the 7705 titles identified met the inclusion criteria for analysis [33]. The authors concluded there was insufficient evidence to support the validity and reliability of any single assessment process. However, how can any investigation realistically quantify the impact of CBE versus traditional educational methods in a standardised way? It is arguable that in reality many competency based postgraduate curricula are in fact a blended approach. The use of psychosocial language and models throughout the medical educationalist literature does little to convince analytical clinicians looking for ‘evidence’.

Perhaps it is the opinions of trainees immersed in competency based education that best reflect the contemporary issues. To many, rightly or wrongly, CBE is intrinsically linked with the MMC fiasco and the debate surrounding the European Working Time Directive (EWTD). The implementation of a maximum 48-hour working week for trainees has been publicly condemned by the Royal Colleges [34]. There is strong evidence demonstrating a reduction in training time by a third [35], a reduction in the number of core procedures performed [36], an associated increase in operative time [37], and an increase in trainee fatigue [38] further limiting ability. Why fatigue amongst trainees has increased whilst working hours have decreased is not clear, but it may be due to a change in working patterns with the introduction of shifts, a greater intensity of work during those hours, or a reflection of lower morale across the workforce. A competency based, time independent curriculum offers an attractive solution to the reduction in working hours; in reality, however, most curricula are time dependent [13] with clearly defined cut-off points. The introduction of surgical postgraduate competency based assessment in the form of ISCP unfortunately coincided with the MMC fiasco in 2007, which generated significant ill feeling across the profession [39]. The ISCP website and logbook, which were separate from OCAP, were poorly received by the remaining surgical specialities. Trainers and trainees lacked experience with competency based assessment tools and required additional training [40]. Of 539 users surveyed, forty percent felt that ISCP impacted negatively on their training [41], with a perceived lack of user friendliness and a mandatory fee cited as reasons. Without question, one of the biggest disadvantages of CBE is the increase in trainee and trainer workload, the technical infrastructure required and the associated costs [42].

OCAP was fully integrated into ISCP to form one online pan-speciality curriculum in 2011, which also marked the first major overhaul of the OCAP syllabus. The original core competency PBAs are retained, now referred to as primary PBAs, which are compulsory for all trainees to complete. Additionally, in answer to the criticism that the range of competencies should be expanded, it is now possible to complete secondary and tertiary PBAs, which aim to assess general and subspecialist areas respectively.

Competency based assessment has been introduced to all surgical specialities on the basis of recommendations by the GMC, JCHST and PMETB. The British Orthopaedic Association was heavily involved in the design of the postgraduate curriculum. ISCP/OCAP remains the only system for assessment of orthopaedic trainees in the United Kingdom.

OCAP: A trainee’s perspective

As a trainee using OCAP on a regular basis, I have generally found it easy to use and of benefit. The system allows trainees to build a ‘portfolio of evidence’ throughout their training to demonstrate their competencies. It is possible to upload external evidence in the form of published papers, presentations and reports. The user interface has undergone upgrades and is relatively easy to use. Trainers and programme directors have access to their trainees’ profiles, which has become an ever increasing part of the annual review process. The burden of keeping paper copies and constantly updating one’s portfolio is a thing of the past.

Anecdotally, I have come across few colleagues who feel negatively about OCAP. Unlike the backlash of resentment towards ISCP, there are few, if any, papers openly criticising OCAP. One of the main limitations, however, remains the heavy time investment on the part of both trainee and trainer. The majority of evidence is collected online, and the trainee has the ability to email a link to their trainer to complete a DOPS. This has significantly streamlined the process; however, a single trainer may be responsible for several trainees, which collectively generates a significant workload. For trainees who have not kept their profile up to date, uploading a career’s worth of evidence at the end of their training is a gargantuan task, although younger trainees who have used the system from day one will not have this problem.

The mission statement, in essence to produce an orthopaedic surgeon in the generalist sense, is not over-ambitious, which is reflected in the core competencies (Table 1). The OCAP syllabus does a good job of encompassing an expanding speciality and is comprehensively outlined in the curriculum [1]. Those responsible for OCAP have listened to trainees’ perspectives on the system and are constantly making efforts to improve its content and interface. As a trainee coming to the end of training, I feel I have benefited greatly from using the system and would encourage others to use OCAP to its full capabilities.

References

  1. Pitts D, Rowley DI, Marx C, Sher L, Banks T, et al. (2007) Specialist Training in Trauma & Orthopaedics, A Competency Based Curricula. London: British Orthopaedic Association.
  2. Brown TC, McCleary LE, Stenchever MA, Poulson AM Jr (1973) A competency-based educational approach to reproductive biology. Am J Obstet Gynecol 116: 1036-1042.
  3. Frank JR, Mungroo R, Ahmad Y, Wang M, De Rossi S, et al. (2010) Toward a definition of competency-based education in medicine: a systematic review of published definitions. Med Teach 32: 631-637.
  4. Flexner A (1910) Medical Education in the United States and Canada: A Report to the Carnegie Foundation for the Advancement of Teaching. Bulletin No. 4. Boston, Mass: Updyke.
  5. Dent JA, Harden RM (2009) A Practical Guide for Medical Teachers. 3rd ed. London: Churchill Livingstone.
  6. Spady WG (1988) Organising for results: the basis of authentic restructuring and reform. Educational Leadership Oct: 4-8.
  7. Spady WG (1993) Perspectives and imperatives: outcome-based education: reform and the curriculum process. Journal of Curriculum and Supervision 8: 354-363.
  8. Miller GE (1990) The assessment of clinical skills/competence/performance. Acad Med 65: S63-67.
  9. Tomorrow’s Doctors; Recommendations on Undergraduate Medical Education. London: General Medical Council; 1993.
  10. Harden RM, Davis MH (1995) The core curriculum with options or special study modules. Med Teach 18: 125-148.
  11. Harden RM, Crosby JR, Davis MH, Friedman M (1999) AMEE Guide No. 14: Outcome-based education: Part 5 - From competency to meta-competency: a model for the specification of learning outcomes. Med Teach 21: 546-552.
  12. Leung WC (2002) Competency based medical training: review. BMJ 325: 693-696.
  13. Mann KV (2004) The role of educational theory in continuing medical education: has it helped us? J Contin Educ Health Prof 24 Suppl 1: S22-30.
  14. Brooks MA (2009) Medical education and the tyranny of competency. Perspect Biol Med 52: 90-102.
  15. Violato C, Lockyer J, Fidler H (2003) Multisource feedback: a method of assessing surgical practice. BMJ 326: 546-548.
  16. Talbot M (2004) Monkey see, monkey do: a critique of the competency model in graduate medical education. Med Educ 38: 587-592.
  17. Grant J (1999) The Incapacitating Effects of Competence: A Critique. Adv Health Sci Educ Theory Pract 4: 271-277.
  18. Hodges B, Regehr G, McNaughton N, Tiberius R, Hanson M (1999) OSCE checklists do not capture increasing levels of expertise. Acad Med 74: 1129-1134.
  19. The Inquiry into the management of care of children receiving complex heart surgery at The Bristol Royal Infirmary. Bristol Royal Infirmary Inquiry, July 2001.
  20. Intercollegiate Surgical Curriculum Project (ISCP). https://www.iscp.ac.uk (accessed 14th March 2014).
  21. Donaldson L. Unfinished business - proposals for reform of the senior house officer grade. London: Department of Health; August 2002.
  22. Modernising medical careers: the response of the four UK Health Ministers to the consultation on "Unfinished business - proposals for reform of the senior house officer grade". London: Department of Health; February 2003.
  23. Kolb DA, Fry R (1975) Toward an applied theory of experiential learning. In: Cooper C (ed.) Theories of Group Process. London: John Wiley.
  24. Ginsburg S, Lingard L (2006) Using reflection and rhetoric to understand professional behaviours. In: Stern DT (ed.) Measuring medical professionalism. New York: Oxford University Press.
  25. Norcini J, Burch V (2007) Workplace-based assessment as an educational tool: AMEE Guide No. 31. Med Teach 29: 855-871.
  26. Beard J, Rowley D, Bussey M, Pitts D (2009) Workplace-based assessment: assessing technical skill throughout the continuum of surgical training. ANZ J Surg 79: 148-153.
  27. Pitts D, Rowley DI, Sher JL (2005) Assessment of performance in orthopaedic training. J Bone Joint Surg Br 87: 1187-1191.
  28. National Orthopaedic and Trauma Log Book. Faculty of Health Informatics, Royal College of Surgeons of Edinburgh.
  29. Frank JR, Snell LS, Cate OT, Holmboe ES, Carraccio C, et al. (2010) Competency-based medical education: theory to practice. Med Teach 32: 638-645.
  30. Watmough SD, O'Sullivan H, Taylor DC (2010) Graduates from a reformed undergraduate medical curriculum based on Tomorrow's Doctors evaluate the effectiveness of their curriculum 6 years after graduation through interviews. BMC Med Educ 29:10-65.
  31. Hutchinson L, Aitken P, Hayes T (2002) Are medical postgraduate certification processes valid? A systematic review of the published evidence. Med Educ 36: 73-91.
  32. The Royal College of Surgeons. Surgical training seriously compromised by European working time directive.
  33. Goddard AF, Hodgson H, Newbery N (2010) Impact of EWTD on patient:doctor ratios and working practices for junior doctors in England and Wales 2009. Clin Med 10: 330-335.
  34. Maxwell AJ, Crocker M, Jones TL, Bhagawati D, Papadopoulos MC, et al. (2010) Implementation of the European Working Time Directive in neurosurgery reduces continuity of care and training opportunities. Acta Neurochir (Wien) 152: 1207-1210.
  35. Wilson T, Sahu A, Johnson DS, Turner PG (2010) The effect of trainee involvement on procedure and list times: A statistical analysis with discussion of current issues affecting orthopaedic training in UK. Surgeon 8: 15-19.
  36. Tucker P, Brown M, Dahlgren A, Davies G, Ebden P, et al. (2010) The impact of junior doctors' worktime arrangements on their fatigue and well-being. Scand J Work Environ Health 36: 458-465.
  37. Chand M, Faruque M, Dabbas N, Nash GF (2010) Modernising medical careers and the British surgeons of the future. Br J Hosp Med (Lond) 71: 282-285.
  38. Beard J, Rowley D, Bussey M, Pitts D (2009) Workplace-based assessment: assessing technical skill throughout the continuum of surgical training. ANZ J Surg 79: 148-153.
  39. Pereira EA, Dean BJ (2009) British surgeons' experiences of mandatory online workplace-based assessment. J R Soc Med 102: 287-293.
  40. Taber S, Frank JR, Harris KA, Glasgow NJ, Lobst W, et al. (2010) Identifying the policy implications of competency-based education. Med Teach 32:687-691.
  41. Joint Committee on Surgical Training and British Orthopaedic Association communication to trainees, December 2010.
Citation: Carlile GS (2014) A Review of Competency Based Orthopaedic Training in the UK: A Trainee’s Perspective. Orthop Muscul Syst 3: 178.

Copyright: © 2014 Carlile GS. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.