By Prof. William D. Hendricson, Assistant Dean for Educational and Faculty
Development, University of Texas Health Science Center San Antonio
Dental School, and Senior Consultant, Academy for Academic Leadership
Competency-based education (CBE) has been designated as
the educational model for dental schools, so assessment strategies
in predoctoral dental education should be implemented in a manner
consistent with this educational philosophy. To do this, we must
first understand what defines competency-based education. Then we
can seek out best practices for assessing students' readiness to
provide dental care in the public domain, without supervision, and
under their own licenses.
Competency-based education came into prominence in the
United States during the 1950s. In the post-Sputnik era, concern
that we were falling behind the Soviet Union in the "space race"
and in science generally led the public to demand that our
universities produce better outcomes. CBE was conceived as a means
to achieve this goal.
Four characteristics distinguish CBE:
- Trainee outcomes are based on analysis of the job
responsibilities and tasks of practitioners.
- Curriculum is focused on what students need to learn to
perform these on-the-job responsibilities, not around the
traditional subject matter prerogatives of academic disciplines.
- Hierarchically sequenced modules allow students to
proceed through the curriculum at their own pace.
- Educators employ assessment techniques that measure
unassisted learner performance in settings approximating real-world
practice.
CBE was first mentioned by the Commission on Dental
Accreditation (CODA) as a philosophy for dental education in its
1995 predoctoral standards. It was first described as a desired
curriculum model by ADEA in 1997 when the initial set of
competencies that define the outcomes of predoctoral education was
adopted. The 2008 revisions of the CODA predoctoral standards,
now undergoing scrutiny by dental communities of interest, and the
Competencies for the New General Dentist, adopted by the
ADEA House of Delegates in April 2008, both endorse CBE as the
model for the predoctoral curriculum, and both organizations now
clearly identify a "general dental practitioner" as the desired
outcome of dental school.
The preamble to the 2008 ADEA competencies states that
"a competency is a complex behavior or ability essential for
general dentists to begin independent and unsupervised dental
practice. Competency includes knowledge, experience, critical
thinking, problem-solving, professionalism, ethical values and
procedural skills. These components of competency become an
integrated whole during the delivery of patient care." This last
statement is critical for understanding competency assessment
strategies in CBE.
Assessments Used in CBE
What methods are used in CBE for assessing students'
readiness to provide dental care in the public domain without
supervision and under their own licenses? In CBE, the highest
priority is determining students' readiness for practice, i.e.,
their capacity to "put it all together." Appraisal of practice
readiness is based on two concepts:
- assessing students' overall or general competence
rather than focusing on individual skills, known as component or
silo competencies, which are often taught and evaluated in
isolation;5, 9, 13, 15 and
- employing multiple data sources based on the principle
of triangulation.
Assessing Students' General Competence
One of the leaders in competency-based education, Paul
Pottinger, observed: "Competence cannot be meaningfully defined by
endless reductions of specific skills, tasks and actions which, in
the end, fall short of real world requirements for effective
performance. In fact, the more essential characteristics for
success often turn out to be broad or generalized abilities which
are sometimes more easily operationally defined and measured than
an array of subskills that do not add up to a general competence."
The ADEA Commission on Change and Innovation in Dental
Education (ADEA CCI) conducted a survey in 2008 with participation
by 53 of 56 U.S. dental schools, which asked course directors to
report how students' competency is assessed. Nearly one thousand
faculty (45% of course directors) responded, providing the most
extensive study of dental school assessment strategies to
date.17 A key survey question was, how does your school
make a comprehensive assessment of students' readiness for entry
into unsupervised practice; in other words, how does your school
assess Pottinger's "general competence?"
The response options included:
(1) a checklist system where students are certified for graduation
if they pass all courses, meet GPA standards, complete all clinical
requirements, meet all departmental expectations, complete all
rotations, submit all assignments, and pay their bills;
(2) an in-school internship where a small group of faculty work
with each student for several months to observe daily performance
across all competency domains;
(3) implementation of gatekeeper examinations to credential
accomplishment of core competencies; and
(4) departmental certification that students are competent in
areas of dentistry germane to their disciplines.
Seventy percent of schools used option 1, the checklist
approach. Only a handful of respondents (7% each) used options 2
and 3, yet option 2 followed by option 3 are the practice readiness
assessments most consistent with CBE.
Figure 1 illustrates the difference between assessing component
competencies and general competence. The individual skills in the
general dentists' toolkit (component or silo competencies) are
represented by ovals inside the circle and the new concept of
general competence (the capacity to "put it all together"
consistently) is represented by the outer circle in bold. Current
performance measurement theory indicates that primary emphasis
should be placed on assessing students' overall package of skills
(the outer ring) in working conditions that approximate "authentic
practice."8, 9, 10, 13
In my experience across the health professions, dental
education does the best job of assessing the silo components of
competency. However, as Michael Eraut articulated in
Professional Knowledge and Competence,18
"professional competence is more than demonstration of isolated
competencies. When we see the whole, we see its parts differently
than when we see them in isolation."
The prevailing recommendation for measuring general
competence is a pre-graduation internship of at least two months'
duration that resembles the work environment, tasks, and
responsibilities of entry-level practitioners. During the
internship, students work under the daily supervision of a small
group of the same faculty who observe and assess them on a range of
measures: reproducibility of component competencies, seamless
transition between silo competencies during patient care, depth of
knowledge, punctuality, decorum, appearance, stress management, and
the students' capacity for self-assessment and self-correction.
Many dental faculty are concerned about the "qualitative" or
"subjective" nature of this type of assessment, but George
Miller19 asserted that "the collective wisdom of faculty
who have consistent opportunities to observe and interact with the
student is the essential core of performance assessment," a
perspective endorsed by virtually every review of assessment best
practices in the health professions.
The Assessment Triangle
Readiness assessment based on multiple data sources is
more likely to be accurate than single-source measures or
disproportionate reliance on one measure over other potential
sources of information about students' practice
readiness.21 This best practice is commonly referred to
as triangulation, and it was recently described in the Journal
of Dental Education by Jahangiri et al.16
Figure 2 depicts a triangulation model for competency
assessment. The pinnacle of the model represents performance,
including the 3 Ps: process (human factors including communication,
diligence, organization, compassion, ethical behavior); product
(outcomes of patient care); and procedure (technical skills
necessary to provide patient care). The foundation legs are
appraisal and reflection (self-assessment and self-correction), and
knowledge. New assessment strategies, consistent with CBE
principles, have been developed for each leg of the triangle.
The Performance Leg (Internships and OSCEs)
For practice readiness relative to the performance leg,
the pre-graduation internship is considered to be an optimal
approach. Internships answer the questions: Can senior students put
it all together in an environment that approximates general
dentistry? Can they function in clinical environments where they
don't have several hours for one appointment, but are expected to
practice acceptably during four to five appointments per day?
In most health professions education programs, the
internship is supported by the Objective Structured Clinical
Examination (OSCE), a technique for readiness assessment that is
used for gateway examinations that students must pass in order to
advance to a higher academic level, graduate, or obtain a
license.22-23 For example, the National Dental Examining
Board of Canada now implements an OSCE as a core component of the
licensure process in that nation. Many of us were able to
experience an OSCE at the 2008 ADEA CCI Conference in Chicago.
In OSCEs, students rotate through 20-30 stations at
timed intervals that provide a representative sampling of patient
problems or clinical tasks. At action or task stations, students
perform specific procedures under observation by trained
evaluators. For instance, OSCEs typically contain several stations
where students interview and examine patients trained to portray
different types of oral health problems, often with co-morbidities
that may influence decision making. At subsequent assessment
stations, students report clinical findings to a faculty member,
propose and justify a diagnosis, and compare and contrast
treatment options. During OSCEs, students may be asked to demonstrate
comprehension of underlying basic science principles by linking the
patient's symptoms to pathophysiological mechanisms. This is
accomplished by verbal questioning from a station proctor, or
students may respond by writing short answer essays or answering
multiple-choice questions. At other OSCE stations, students may be
asked to study case scenarios and then select answers to
multiple-choice questions about diagnostic tests, assessment
techniques, and treatment planning. Most OSCEs contain radiographic
interpretation stations as well as stations where students assess
laboratory findings and measure vital signs. Students' overall
scores on the OSCE are derived from their understanding of
pathophysiology, use and comprehension of assessment techniques,
capacity to interpret clinical findings, ability to make
appropriate treatment planning decisions, performance of patient
examination skills, and interpersonal skills. Standardized
patients, trained patients who provide a standardized experience
for all students rotating through the station, are often used to
assess patient-examination and interpersonal skills.
The integrative nature of the OSCE, which samples a
broad spectrum of competencies, is consistent with CBE assessment
principles.
The Appraisal & Reflection Leg (Portfolios)
For the appraisal and reflection leg, where students
must reflect upon and assess their own performance, portfolios are
recommended.26, 27 Students use portfolios to collect
evidence that demonstrates their progress toward and accomplishment
of specified competencies including longitudinal documentation of
patient care, performance on competency exams, case presentations,
literature reviews, reports, formative evaluations, formal
performance reviews by supervising faculty, and most importantly,
the students' own appraisal of their performance and reflections on
needed improvements, lessons learned, and insights about dentistry
or the learning process. The reflection component allows faculty to
appraise the student's level of self-awareness and capacity for
reflection. Review of the portfolio content provides an opportunity
for student–teacher dialogue centered on the students' work
products and assessment of progress. Without self-assessment and
reflection, portfolios can digress to "scrapbooking."
U.S. schools of pharmacy use a portfolio system,
prescribed by their accreditation commission, as the principal
assessment technique to measure students' attainment of 18
competencies, and many doctoral programs, in and out of the
sciences, now employ portfolios instead of qualifying examinations
and other types of grading.
The Knowledge Leg (TJEs and CATS)
For the knowledge leg, the Triple Jump Exercise (TJE) is
considered state-of-the-art and is widely used in health
professions education to evaluate students' capacity to access,
analyze, and apply biomedical knowledge to healthcare
problems.28-30 There are several variations of TJEs.
Clinical TJEs consist of three phases (thus, the "jumps") completed
in one or two days. In the first jump, students interview and
examine patients while observed by faculty or while being
videotaped for retrospective review including student
self-assessment. In the second, they write an assessment of the
findings using the "SOAP" format (subjective data, objective data,
assessment, plans). The emphasis is on providing evidence from the
literature to support assessment and therapeutic decisions. In the
third jump, students participate in an oral examination during
which faculty members question them about the pathophysiology,
diagnosis, and treatment of the patient's problems and ask them to
review research evidence related to treatment options and outcomes.
Students receive an evaluation for each jump and a cumulative score
across all three jumps.
TJEs implemented in the preclinical curriculum focus on
students' research skills and capacity for self-directed learning.
Students are asked to find evidence in the literature that answers
research questions, which the students develop themselves,
pertinent to health problems. In TJEs for freshman or sophomore
students, the first jump involves reading a scenario depicting a
patient with an oral health problem, and then identifying key
issues in the case and writing a researchable question in the PICO
format (patient with problem, intervention, comparison, outcome).
For the second jump, students explore the literature to find
evidence pertinent to their question, and in the third jump,
students report their findings, answer the research question, and
critically appraise the quality of the research evidence. As with
clinical TJEs, students receive evaluations for each jump and a
cumulative score across all three jumps. Both types of TJEs
emphasize locating pertinent information, applying it to specific
health problems, and evaluating the quality of the information
accessed, in contrast to multiple-choice testing, which primarily
measures recall of factual knowledge.
The Critically Appraised Topic Summary (CATS) is a new
technique to assess students' capacity to use biomedical knowledge
to make reasoned decisions.31-33 CATS is a cousin to the
TJE in that students start by reviewing a case scenario or an
actual patient's clinical presentation, identify unknowns that need
to be explored, write a researchable question in the PICO format,
explore the literature to find and analyze evidence, and then write
a summary that indicates an answer to the question and
recommendations based on appraisal of the research. Like the TJE
and aspects of the OSCE, CATS evaluates how students access,
analyze, and apply biomedical knowledge. In so doing, it measures
students' capacity for self-directed learning.
In summary, the good news is that several techniques,
relatively new to dental education, can provide comprehensive
assessment of multiple competence domains, and
thus are consistent with CBE's emphasis on practice readiness.
Because internships, OSCEs, portfolios, TJEs, and CATS are new to
academic dentistry, there are few examples to provide
implementation heuristics. The 2008 ADEA CCI Survey of Competency
Assessment revealed that less than 2% of dental school course
directors use any of these techniques while traditional techniques
such as multiple choice testing, procedural requirements, practical
exams in the laboratory, clinical comps, and daily grades still
comprise 70% of all assessment done in the predoctoral
curriculum.17 These data might be considered "bad news,"
but they also reveal the opportunities that lie ahead for dental
educators to incorporate assessment strategies into the curriculum
that reinforce the philosophy of competency-based education.
1. Grant G (ed.) On Competence: a critical analysis of
competency-based reforms in higher education. Washington D.C.:
2. Grussing PG. Curricular design: competency
perspective. Am J Pharm Educ 1987; 51: 414-419.
3. Chambers DW, Glassman P. A primer on competency-based
evaluation. J Dent Educ 1997; 61: 651-666.
4. Hendricson WD, Kleffner JH. Curricular and
instructional implications of competency-based dental education. J
Dent Educ. 1998; 62(2): 183-196.
5. Smith SR, Dollase R. AMEE Guide No. 14: outcome-based
education. Part 2: planning, implementing and evaluating a
competency-based curriculum. Med Teacher 1999; 21:
6. Linn RL. Complex, performance-based assessment:
expectations and validation criteria. Educ Researcher 1991; 16(1):
7. Van der Vleuten CPM. The assessment of professional
competence: developments, research and practical implications. Adv
Health Sci Educ. 1996; 1: 41-67.
8. Wass V, Van der Vleuten CPM, Shatzer J, Jones R.
Assessment of clinical competence. The Lancet. 2001; 357: March 24:
9. Swing SR. Assessing the ACGME general competencies:
general considerations and assessment methods. Acad Emerg Med.
2002; 9(11): 1278-1287.
10. Epstein RM, Hundert EM. Defining and assessing
professional competence. JAMA. 2002; 287(2): 226-235.
11. Rethans J, Norcini J, Baron-Maldonado M.
Relationship between competence and performance: implications for
assessing practice performance. Med Educ. 2002; 36(10):
12. Smith SR, Dollase RH, Boss JA. Assessing students'
performance in a competency-based curriculum. Acad Med. 2003; 78:
13. Van der Vleuten CPM, Schuwirth L. Assessing
professional competence: from methods to programmes. Med Educ.
2005; 39: 309-317.
14. Epstein RM. Assessment in medical education. NEJM.
2007; 356(4): 387-396.
15. Pottinger PS. Comments and guidelines for research
in competency identification, definition and measurement. Syracuse,
NY: Educational Policy Research Center. 1975.
16. Jahangiri L, Mucciolo T, Choi M, Spielman A.
Assessment of teaching effectiveness in U.S. dental schools and the
value of triangulation. J Dent Educ. 2008; 72: 707-718.
17. Hendricson WD. Commission on Change and Innovation
in Dental Education. Selected Results of 2008 Survey of Dental
Student Competency Assessment Survey. Oral Presentation at the 2008
ADEA CCI Liaisons Conference; June 23, 2008. Chicago,
18. Eraut M. Professional Knowledge and Competence.
London: Falmer Press. 1994.
19. Miller GE. Assessment of clinical skills/competence/
performance. Acad Med. 1990; 65(9 Suppl): S63-S67.
20. McGaghie WC. Evaluating competence for professional
practice. in Curry L, Wergin JF (eds.) Educating Professionals. San
Francisco, CA: Jossey-Bass Publishers. 1993.
21. Lockyer J. Multisource feedback in assessment of
physician competencies. J Cont Educ Health Prof. 2003; 23: 2–
22. Carraccio C. The Objective Structured Clinical
Examination: a Step in the Direction of Competency-Based
Evaluation. Arch Pediatr Adolesc Med 2000; 154:
23. Zartman RR, McWhorter AG, Seale NS, Boone WJ. Using
OSCE-based evaluation: curricular impact over time. J Dental Educ.
24. Gerrow JD, Murphy HJ, Boyd MA, Scott DA. Concurrent
validity of written and OSCE components of the Canadian dental
certification examinations. J Dent Educ.
25. Johnson JA, Kopp KC, Williams RG. Standardized
patients for assessment of dental students' clinical skills. J Dent
Educ. 1990; 54: 331-333.
26. Chambers D. Portfolios for determining initial
licensure competency. JADA. 2004; 135: 173-184.
27. Friedman BDM, Davis MH, Harden RM, Howie PW, Ker J,
Pippard MJ. AMEE Medical Education Guide No. 24: portfolios as a
method of student assessment. Med Teacher. 2001;
28. Feletti G, Ryan G. Triple jump exercise in
inquiry-based learning: a case study. Assess & Eval in Higher
Educ. 1994; 19(3): 225-234.
29. Smith RM. The triple-jump examination as an
assessment tool in the problem-based medical curriculum at the
University of Hawaii. Acad Med. 1993; 13: 366-372.
30. Rangachari PK. The TRIPSE: A process-oriented
evaluation for problem-based learning courses in the basic
sciences. Biochem and Molecular Biol Educ. 2002; 30(1):
31. Wyer PC. The critically appraised topic: closing the
evidence transfer gap. Ann Emerg Med. 1997; 30(5): 639–
32. Sauve R. The critically appraised topic: practical
approach to learning critical appraisal. Ann R Coll Physicians
& Surg. 1995; 28: 396
33. Iacopino AM. The influence of "new science" on
dental education: current concepts, trends, and models for the
future. J Dent Educ. 2007; 71(4): 450-462.
Figure 1: Component/Silo Competencies and General Competence
A fundamental principle of competency-based assessment is
to measure students' practice readiness as represented by General
Competence (i.e., capacity to "put it all together" over an
extended period of time).