Among the myriad disruptions caused by the COVID-19 pandemic, medical educators are considering how to ensure that students moving through adapted curricula are progressing appropriately and remain motivated as they learn in this new environment. We recently published a twelve-tips article, "Twelve tips for embedding assessment for and as learning practices in a programmatic assessment system." The paper offers practical advice to help schools design assessments that students can learn from and, ultimately, learn with, in pursuit of the goal of becoming excellent physicians. We encourage educational leaders and students to use evidence-based assessment practices to support these learning goals.
Developing a learning-focused program of assessment that fits a school’s culture, student population, resources, and mission is a substantial undertaking – especially in today’s rapidly evolving environment. As we discuss in the article, schools can focus on four main areas: (i) the culture and motivation of assessment, (ii) the curricular assessment schedule, (iii) exam and question structure, and (iv) assessment follow-up and review.
When considering the culture and motivation of assessment, we encourage schools and leaders to consider the following questions:
- How can programmatic assessment be used to promote student self-regulated learning?
- How can an honor code shape a school’s assessment system and student learning?
- How can un-proctored and remote exams promote student learning?
- How can a grading system be designed to promote learning?
The overall design of the assessment system, including the curricular assessment schedule, can have a significant impact on student learning. There are many ways to intentionally build a schedule that incorporates formative and cumulative assessments to promote learning and retention. Schools can draw on the broader educational research to determine how and when content should be assessed formatively to promote long-term learning, and how and when students should be assessed cumulatively. Further, the timing of exams (e.g., Should study periods be provided? When should high-stakes exams be scheduled?) is an important consideration for both learning and wellness.
As educators (and learners ourselves), we recognize the role that exam and question structure play in approaches to studying and self-directed learning. Educators and educational leaders are now exploring various question types as well as formats of exams. During a time of increased remote learning, schools can explore the use of group-based or collaborative exams as well as “open-book” exams.
Finally, as the paper highlights, the assessment follow-up and exam review processes raise important questions to consider, such as:
- What exam feedback data should be shared with students?
- When should students receive feedback about performance after an assessment, including reviewing exams and answers?
- What exam retake processes should exist to promote student learning?
- What academic support systems should accompany the assessment system?
- What remediation systems support student learning?
Readers will find that we have tried to share various learning-based uses of assessment, moving beyond sorting or promoting students toward helping them learn and progress. With these questions on the table, the tips proceed domain by domain, suggesting how educators can harness the full power of assessment to engage students in the work of learning. By keeping these principles in mind and implementing the article’s concrete tips, we hope that learners will come to see assessments as opportunities for growth rather than obstacles they must overcome.
Did you know that the Harvard Macy Institute Community Blog has had more than 230 posts? Previous blog posts have explored topics including the RIME framework, core EPAs for entering residency, and moving forward with systems of assessment in medical and health professions education.
Jennifer Meka, PhD, MSEd (2.0 ’13; Educators ’17) is Assistant Professor and Assistant Dean for Medical Education and Director of the Medical Education and Educational Research Institute (MEERI) at the Jacobs School of Medicine and Biomedical Sciences at The State University of New York (SUNY) at Buffalo. Jennifer’s areas of interest include implementation of evidence-based educational principles and assessment for student learning. Jennifer can be followed on Twitter and contacted via email.
Jonathan Amiel, MD (Educators ’12, Leaders ’19) is a medical educator and psychiatrist. Yoni is currently Associate Professor, Senior Associate Dean for Curricular Affairs, and Interim Co-Vice Dean for Education at Columbia University Vagelos College of Physicians & Surgeons. Yoni’s areas of professional interest include competency-based medical education, professional identity formation, and leadership. Yoni can be followed on Twitter.
Aubrie Swan Sein, PhD, EdM, is a Medical Educator and Educational Psychologist. Aubrie is currently Assistant Professor and Director of the Center for Education Research and Evaluation at Columbia University Vagelos College of Physicians and Surgeons. Aubrie’s areas of professional interest include assessment for learning and fostering long-term student learning. Aubrie can be contacted via email.