This blog was co-authored by Gary L. Beck Dallaghan, Ph.D. and Michael Ashley, B.S.

In 2014, the Association of American Medical Colleges issued recommendations for essential activities every graduating medical student should be able to perform unsupervised. The guiding principles underpinning these recommendations included patient safety and enhancing stakeholders' confidence in new residents' abilities. These activities are meant to be a foundational core and should complement specialty-specific competencies.

Englander and colleagues modeled their conceptual framework for the core entrustable professional activities for medical students on the framework already used by residency training programs. This entailed systematically reviewing published graduation requirements, program director expectations for entering residents, and tasks residents perform without supervision. The review initially yielded 21 distinct entrustable professional activities (EPAs): observable, measurable units of work representing the range of competencies expected of medical professionals. More than 100 educators representing the continuum of medical education then narrowed the list to the current 13 EPAs.

Since these EPAs were published, medical schools have struggled with the theoretical and practical implications of implementing such a framework. In theory, it has been argued, no student would be expected to perform some of these EPAs unsupervised, such as writing orders or conducting patient handoffs. There is also concern about how these activities can be observed in a way that allows preceptors to determine entrustment. Current recommendations do not address the logistical challenge within medical schools of establishing a trained group of faculty to make summative entrustment decisions.

Another rationale for establishing the EPAs is that program directors cannot know what level of skill entering interns possess. Because the Medical Student Performance Evaluation is not completely standardized, it is often difficult to obtain an honest appraisal of students' abilities. Program directors have therefore resorted to relying primarily on a known objective measure, USMLE Step 1 scores. Yet there is limited evidence that USMLE Step 1 performance meaningfully predicts outcomes as a physician: the exam is intended to assess medical knowledge only and has not been shown to predict the type of physician one ultimately becomes.

A survey of internal medicine program directors indicated they would like to see a report of EPA performance as a handoff from medical school to residency. Few, if any, of the respondents felt it important to have that information before compiling their residency Match lists. A survey of emergency medicine program directors corroborated this notion, suggesting a post-Match revised Medical Student Performance Evaluation that includes how the EPAs are assessed and how the student performed.

Key in all of this is the purpose of assessing the student. If these EPAs are used to provide formative feedback over the course of medical school, they would allow students to work toward an established benchmark. However, the assessments need to be robust so that students do not graduate with a false sense of confidence in their abilities, which requires feedback from credible, trustworthy supervisors. Studies suggest that students' confidence in completing these EPAs was exaggerated, contradicting program directors' assessments of their abilities.

Given these challenges, the question remains: what problem are these EPAs attempting to solve? Hawkins and colleagues addressed the benefits and challenges of competency-based medical education, and they suggest that the over-simplification inherent in using EPAs does not necessarily equate to professional competence. By parsing competence into discrete activities, we may inadvertently undermine the professional judgment that safeguards patients. Klamen and colleagues likewise suggest that faculty should be given credit for their ability to judge overall performance rather than being limited to a series of items or checklists.

With all of the challenges presented by the entrustment framework, the real question is whether program directors find these items valuable. If they do, how should the results be reported? When should they be reported? Is it enough to assess entrustment from artificial activities (e.g., simulated exercises), or must the decision come from authentic patient encounters? If these EPAs are not going to be used to guide improved residency education resulting in enhanced patient care, are medical schools spending excessive time grappling with something that may not help medical students, faculty, or patients? Join the conversation by leaving a comment below!



Gary L. Beck Dallaghan, Ph.D. is Assistant Dean for Medical Education at the University of Nebraska College of Medicine. He serves as Executive Director of the Alliance for Clinical Education and focuses much of his professional energy on trying to identify the most appropriate, efficient methods of assessing medical student performance. Gary can be reached via email at or on Twitter at @glbdallaghan.

Michael Ashley, B.S. is from Phoenix, Arizona and is a Medical Student at Creighton University School of Medicine. He will graduate from medical school in 2021. Michael has an interest in both medical and biology education research. He can be reached via email at




Angus SV, Vu R, Willett LL, et al. Internal medicine residency program directors’ views of the core entrustable professional activities for entering residency: An opportunity to enhance communication of competency along the continuum. Acad Med 2017; 92(6):785-791.

Association of American Medical Colleges (AAMC). Core Entrustable Professional Activities for Entering Residency: Curriculum Developers’ Guide. Available at Retrieved on January 22, 2018.

Brown DR, Warren JB, Hyderi A, et al. Finding a path to entrustment in undergraduate medical education: A progress report from the AAMC Core Entrustable Professional Activities for Entering Residency entrustment concept group. Acad Med 2017; 92:774-779.

Duijn CCMA, Welink LS, Mandoki M, et al. Am I ready for it? Students’ perceptions of meaningful feedback on entrustable professional activities. Perspect Med Educ 2017; 6:256-264.

El-Haddad C, Damodaran A, McNeil HP, et al. The ABCs of entrustable professional activities: An overview of ‘entrustable professional activities’ in medical education. Int Med J 2016; 46(9):1006-1010.

Englander R, Flynn T, Call S, et al. Toward defining the foundation of the MD degree: Core entrustable professional activities for entering residency. Acad Med 2016; 91:1352-1358.

Frayha N, Bontempo LJ, Retener NF, et al. Core entrustable professional activities: A survey of the confidence of fourth-year medical students and residency program directors. Med Sci Educ 2016; 26:475-480.

Hawkins RE, Welcher CM, Holmboe ES, et al. Implementation of competency-based medical education: Are we addressing the concerns and challenges? Med Educ 2015; 49:1086-1102.

Klamen DL, Williams RG, Roberts N, et al. Competencies, milestones, and EPAs – Are those who ignore the past condemned to repeat it? Med Teach 2016; 38(9):904-910.

Lindeman BM, Sacks BC, Lipsett PA. Graduating students’ and surgery program directors’ views of the Association of American Medical Colleges core entrustable professional activities for entering residency: Where are the gaps? J Surg Educ 2015; 72(6):184-192.

Lomis K, Amiel JM, Ryan MS, et al. Implementing an entrustable professional activities framework in undergraduate medical education: Early lessons from the AAMC Core Entrustable Professional Activities for Entering Residency pilot. Acad Med 2017; 92(6):765-770.

Prober CG, Kolars JC, First LR, et al. A plea to reassess the role of United States Medical Licensing Examination Step 1 scores in residency selection. Acad Med 2016; 91(1):12-15.

Sozener CB, Lypson ML, House JB, et al. Reporting achievement of medical student milestones to residency program directors: An educational handover. Acad Med 2016; 91(5):676-684.

Sutton E, Richardson JD, Ziegler C, et al. Is USMLE Step 1 score a valid predictor of success in surgical residency? Am J Surg 2014; 208:1029-1034.


HMI Staff