Abstract

Purpose: The ability to form clinical questions and retrieve evidence to advance patient care is a foundational Entrustable Professional Activity (EPA) integral to the practice of evidence-based medicine (EBM). In addition to improving patient care, the ability to formulate questions and locate evidence is an essential skill for meeting lifelong learning goals. Assessment of EPA 7 may fall short for several reasons, including poor role models, students' reluctance to admit uncertainty, lack of clinical context, and difficulty mastering EBM skills. While trainees are expected to demonstrate competence in EPA 7, medical schools are not yet able to provide comprehensive assessment of this EPA. Importantly, this problem extends beyond EPAs: performance of EBM appears as a category in all other competency frameworks for medical graduates, yet validated tools exist only to assess knowledge and skills, not competence. We have developed a tool and process to assess competence in, and provide feedback on, EBM behaviors via observation.

Approach: In 2016, the New York Simulation Center, in collaboration with New York University Grossman School of Medicine, began development of Night-onCall (NOC), a competency-based readiness-for-internship assessment program centered on the Association of American Medical Colleges' Core EPAs. To assess EBM, structured assessment forms were designed through analysis of student search behaviors using optimal foraging theory (OFT). OFT allowed the authors to describe and classify information-seeking patterns, which led to the development of a competency-based rubric anchored in observable on-screen behaviors. Following initial development, the assessment has been rolled out to 6 medical schools. This objective structured clinical examination station is added to the patient cases students are already seeing. After 1 patient encounter, students sit down at a computer workstation and are given 10 minutes to articulate what they believe is the most important clinical question for the case; they are then given a standardized question to answer based on a search of the evidence. The computer screen is recorded during this process, and remote librarian assessors score the encounter (either live or from the recording) and provide feedback on the student's process of seeking information, including which databases they search, how they construct search strategies, and what type of evidence they use in their response.

Outcomes: Since 2018, this station has been completed by 1,150 graduating medical students across 6 medical schools. Performance trends show that students' ability to form clinical questions has improved from 27% rated well done in 2018 to 60% in 2022. Similarly, ability to perform a search has improved from 6% well done in 2018 to 21% in 2022. Selecting appropriate databases to search and selecting appropriate evidence to apply to the patient's case have both remained relatively stable over the years. The largest increase in performance scores over the 5-year period is in question articulation. We attribute this to feeding these data back into the curriculum: knowing where students perform suboptimally allows instruction on these concepts to be better tailored. Similarly, there were small improvements in searching skills, though this area still lags far behind the others.

Significance: This innovation demonstrates that assessment of competence in EBM is both possible and useful. There are 3 direct benefits from this assessment. First, we are better able to provide students with guidance on concrete skills to improve. Second, we can guide librarians toward areas where their students are weak and instruction should be adjusted. Third, we can raise awareness among curricular leadership of the competence profiles of their graduating students.

Citation: Nicholson, Joey MLIS, MPH1; Ark, Tavinder K. PhD1; Wargo, Ellie1; Zabar, Sondra MD1; Kalet, Adina MD, MPH1. Observing and Assessing Competence in Evidence-Based Medicine in Graduating Medical Students: A 5-Year Multi-Institution Report. Academic Medicine 98(11S):p S186, November 2023. | DOI: 10.1097/ACM.0000000000005396

Link: https://journals.lww.com/academicmedicine/fulltext/2023/11001/observing_and_assessing_competence_in.47.aspx
