ADEA CCI Liaison Ledger

Putting It All Together: Thinking About Assessment of a Dental Student's Overall Competence

(Guest Perspective)

Marilyn S. Lantz, D.M.D., M.S.D., Ph.D., Professor of Periodontics and Oral Medicine, University of Michigan School of Dentistry, and Associate Director for Education, Career Development, and Mentoring, Michigan Institute for Clinical and Health Research

Beginning July 1, 2013, dental education programs will be required to meet the new American Dental Association (ADA) Commission on Dental Accreditation (CODA) Accreditation Standards for Predoctoral Dental Education, and as of January 2012 schools have the option to use the new Standards and the accompanying Self-Study Guide for Dental Education Programs to prepare their accreditation Self-Study reports. As predoctoral dental education programs prepare to meet the new CODA Standards, discussions have begun regarding how to effectively address some of these requirements. One that has generated some attention is the new Standard 2-23 ("At a minimum, graduates must be competent in providing oral health care within the scope of general dentistry, as defined by the school...") (formerly 2-25), and its "Intent" statement, particularly the last sentence, which states: "Programs should assess overall competency, not simply individual competencies in order to measure the graduate's readiness to enter the practice of general dentistry." In addition to assessing students' competence in each of the performance areas of former Standard 2-25 (a-n), plus one new area (communicating and managing dental laboratory procedures in support of patient care), so that the new Standard 2-23 has parts a-o, programs will be required to demonstrate that graduating students can "put it all together" and demonstrate "overall" competence.

It is always challenging for a program to consider whether it has mechanisms and assessments in place to document its compliance with a new standard or whether it needs to develop additional processes and documentation! Fortunately, the American Dental Education Association (ADEA) has been proactive in reaching out to schools through its ADEA CCI Liaisons and the ADEA CCI Liaison Ledger with high-quality information that can assist schools in thinking about how to assess graduating students' overall competence.

I strongly encourage anyone thinking about this issue to read the outstanding "Guest Perspective" feature written by Bill Hendricson for the November 2008 issue of the Ledger. You will find it extremely enlightening. Not only does it review "how we got to where we are" on the issue at hand, it will also help you put the changes in the recently approved CODA Standards for predoctoral dental education into broader perspective. You might discover that you are already assessing overall competence, or that you are assessing it but need to refine how you document your outcomes. If you come away concerned that you may not be assessing overall competence adequately, you will have plenty of ideas about how to improve in this area and move your program forward.

Since I can't think of a way to improve upon Bill's excellent review, I decided to take this opportunity to elaborate a bit on information he provides about a performance-based assessment strategy that can be used as one component of a multi-source assessment to document a student's readiness to practice, or overall competence: the Objective Structured Clinical Examination (OSCE). As Bill describes, competence is a complex construct that includes knowledge, experience, critical thinking, problem solving, professionalism, ethical values, and procedural skills. The OSCE is a performance-based assessment that requires learners to integrate knowledge across domains and disciplines and apply their knowledge and skills to a variety of situations and tasks encountered during patient care: in other words, their ability to "put it all together" when it counts. OSCEs are particularly helpful in assessing general abilities in patient assessment, diagnosis and treatment planning, clinical decision-making, communication skills (particularly if standardized patient interactions are included), and professionalism. They can provide information about students' clinical competence that complements information obtained using traditional methods of discipline-based technical/procedural skills assessment, and they can be used to provide formative feedback. Used formatively, OSCEs give students opportunities to practice and develop skills in a low-risk environment. OSCEs can be designed to probe a student's understanding of foundation knowledge and its application to patient care, and also, if desired, to provide multiple samples of student performance in one or more specific domains across stations.

In addition to the above, the OSCE has psychometric properties that have led to its widespread use in high-stakes summative assessments. It is objective, the performance rating can be standardized, and numerous studies have documented its reliability and validity when it is appropriately designed. (For a recent review related to dental education, see Schoonheim-Klein et al. 2008.) In Canada, the OSCE is used as part of the certification examination for licensure in dentistry and medicine. A 2003 study demonstrated positive correlations between performance on the National Dental Examining Board of Canada (NDEB) written and OSCE examinations and performance of students in the final year of dental school (concurrent validity). Moreover, written and OSCE examination scores showed a positive correlation with each other (Gerrow et al. 2003).

In his review, Bill provides evidence to support the conclusion that pre-graduation internships of at least two months' duration, during which dental students provide comprehensive patient care under the supervision of a small group of the same faculty, are an optimal approach for performance-based assessment of readiness for practice. This approach gives faculty the opportunity to observe and rate many samples of a student's performance in a setting that closely approximates independent practice. I would add that even in this setting, experiential "gaps" can occur for individual students, and that complementary OSCEs offer a mechanism to help fill them. For example, do you think it's important for every student to demonstrate the ability to manage an unconscious patient, treat a child presenting with an avulsed tooth, or break bad news to a patient? The good news is that schools can guarantee that all students have such experiences by including them in an OSCE.

Finally, the new CODA predoctoral standards suggest that OSCEs may be used as examples of evidence to demonstrate compliance with Standard 2-5 (former Standard 2-8), which states, "The dental education program must employ student evaluation methods that measure its defined competencies." So in addition to their value in documenting overall competence, OSCEs can also be used to demonstrate compliance with the new Standard 2-5: a bonus!

At the University of Michigan School of Dentistry, we have been administering a 10- to 12-station D4 OSCE (think "a day in practice") since 2001 as one of several measures to assess our students' overall competence. It is well accepted by our faculty and is well integrated into our assessment culture. We continue to evolve it and learn much from analyzing the outcomes. I urge you to try developing and using OSCEs at your school. You will be surprised by what you learn about your students and your curriculum from analyzing OSCE outcomes. For example, if a station has a particularly high failure rate, we ask whether the degree of difficulty was too high, the required tasks were not clearly defined, or the curriculum was inadequate. What about a station with a 100% pass rate? Too easy, or are our students extremely capable in the area assessed? What a fascinating discussion to have with faculty colleagues!

