Absite Smackdown! · Warning: Faculty Evaluations Don't Predict ABSITE Scores--A Program Director's Take

 

 

Today, we're going to explore further whether there's an association between Absite scores and resident performance as measured by evaluations.

The last time we did this, our colleague Jessica went through some of the available evidence about whether evaluations by faculty correlate with Absite performance. And we're going to pick that up today.

 

Special Interest As A Surgeon Who Educates Residents

Part of the reason why is that I have a special interest as a program director--as a surgeon who's performed that role in my career. One of the most interesting things to me has been watching my colleagues on our surgical faculty evaluate residents. It always seemed to me that, often, there wasn't a clear association between those evaluations and resident performance. And so I'm fascinated by the available evidence about resident performance and Absite score.

First, part of the message I take from it in my career, and pass along to residents, is that evaluations are useful. We need them. And yet, at the same time, everything has limitations.

So we need to know what to take from them and what not to take. And so today we'll investigate further whether there's an association between American Board of Surgery In-Service Training Examination scores and resident performance as measured by faculty evaluation.

 

Another Useful Study

So this time, we'll briefly review an Absite evaluation study that was designed to determine whether favorable evaluations by faculty were associated with Absite performance. Here's that study, from JAMA Surgery in 2016: it's called Association Between American Board of Surgery In-Service Training Examination Scores and Resident Performance. The link is right there in the title.

In this case, the investigators performed a cross-sectional analysis of both preliminary and categorical residents, PGY one through five, training at a single university-based general surgery program from July 1st, 2011 through July 30th, 2014. All of them took the Absite.

Overall performance evaluations were performed, along with subset evaluations in the following categories: patient care, technical skills, problem-based learning, interpersonal and communication skills, professionalism, systems-based practice, and medical knowledge. So really the core competencies. The main outcomes and measures included passing the Absite (scoring above the 30th percentile) and ranking in the top 30% of scores at the institution.

So when they did this, the study population included 44 PGY one residents, 31 PGY two residents, 26 PGY three residents, 25 PGY four residents, and 24 PGY five residents during that study period, for a total of 150 residents studied. Evaluations seemed to demonstrate less variation than the Absite percentiles: the standard deviations were 5.06 versus 28.82, respectively.

But of course the second one is a percentile, and the first one, maybe a Likert scale, could have been one to five or one to ten for each item. So it's hard to know what a standard deviation of 5 versus 28 means in terms of variation, but they report there was less variation in the evaluations.
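To make that point concrete, here is a minimal sketch (not from the study) of why a raw standard deviation is only meaningful relative to its scale. The evaluation scale bounds below are hypothetical assumptions for illustration; only the two standard deviations, 5.06 and 28.82, come from the paper.

    def rescale_sd(sd: float, scale_min: float, scale_max: float) -> float:
        """Re-express a standard deviation per 100 points of the scale's range."""
        return sd * 100.0 / (scale_max - scale_min)

    absite_sd = 28.82   # Absite percentile: already on a 0-100 scale
    eval_sd = 5.06      # composite evaluation score: underlying scale not reported

    # Hypothetical evaluation scale bounds, for illustration only.
    for lo, hi in [(1, 5), (1, 10), (0, 100)]:
        print(f"Eval SD on a {lo}-{hi} scale is about {rescale_sd(eval_sd, lo, hi):.1f} "
              f"per 100 points (versus {absite_sd} for the Absite percentile)")

    # On a 1-5 scale an SD of 5.06 would be impossible; on a 1-10 scale it would
    # imply MORE relative spread than the Absite percentiles. That is why "less
    # variation in evaluations" is hard to interpret without knowing the scale.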

Again, that may simply be because the scales were very different. Hard to say. But the bottom line is that neither annual nor subset evaluation scores were significantly associated with passing the Absite: P equals 0.15 in this case, and their confidence interval included one. So the bottom line is, in their study, faculty evaluations did not correlate with Absite score.
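As a side note on what "their confidence interval included one" means: below is a minimal sketch, using made-up counts that are not the study's data, of computing an odds ratio and a 95% Wald confidence interval from a two-by-two table of evaluation score versus passing the Absite. An interval that spans 1 is consistent with no association.

    import math

    # Hypothetical 2x2 table (illustrative counts only, NOT from the study):
    #                      passed Absite   failed Absite
    # high evaluation          a = 40          b = 10
    # low evaluation           c = 35          d = 12
    a, b, c, d = 40, 10, 35, 12

    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of ln(OR)
    ci_low = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    ci_high = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

    print(f"OR = {odds_ratio:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")
    # With these counts the interval spans 1, so (as in the study) the data are
    # compatible with no association between evaluations and passing the Absite.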

 

Why's That Interesting?

And this is interesting for many reasons, including the fact that it's a different institution than the one we saw in the previous study. Also, there was no association between ranking in the top 30% of scores and evaluations.

There was no difference in mean evaluation score between those who passed versus failed the Absite.

Bottom line here: there didn't seem to be a correlation between annual evaluation score and Absite percentile. And I have to share that this really fits with my experience as a program director, where faculty evaluations didn't always fit with what we saw on the more objective, test-format measures of knowledge base.

I always wondered how well our feedback predicts or correlates with Absite score, and experientially it didn't seem to. And here we have the second of two studies that seems to substantiate that. I find it really interesting.

 

A Useful, Hopeful Message

I find it to be a really hopeful and useful message for those residents who have their knowledge base evaluated as poor.

It really gives them an opportunity not to despair but to study and work hard so they can do well on the Absite, realizing the limitations of the feedback they're getting from their faculty.

Lots of reasons for this are possible.  In my experience, some of it could be that faculty see you through the lens of their specialty when you're on specialty rotations and don't get the full sense of the broad scope of your knowledge base.  Also, how you communicate your knowledge base may make it look more or less strong.

In general, there are perhaps lots of reasons why, but here we have another piece of evidence: faculty evaluations, and faculty evaluations of knowledge base in particular, do not seem to predict Absite score. So favorable evaluations, in this study and in the last one, don't necessarily correlate with Absite scores, and they don't predict passing.

The evaluations in this case didn't show much discriminatory ability. It's really hard to say whether individual resident evaluations and Absite scores fully assess competency or allow comparisons to be made across programs. So here we have just one more piece of evidence. You may say the authors' conclusions in this case don't directly flow from their data, but we do have further evidence that faculty evaluations do not seem to predict Absite score. Again, a hopeful message for those residents out there who have previously been evaluated as not having a great knowledge base. And that's why I like to share it.

 

Feedback Is Good (!)

All of this does not mean that feedback is useless. Feedback is very useful, in part, to help us improve. So I want us to be really clear as we work through this podcast together: faculty evaluations, coaching, and mentoring are all key for success, especially when done well.

But the evaluation of knowledge base in particular seems to be more of a challenge, and it really seems to have limitations.

So thanks for listening to this episode of Absite Smackdown! I appreciate having you listen to the program. Remember to stop by AbsiteSmackdown.com for other podcasts and updates.

And remember, the Absite comes around once a year. This year may be more challenging owing to what we've gone through with the coronavirus. Yet, no matter what happens: work on it every day. Improve, and try to work through it to become the best surgeon you can be.

Have a great day and best of luck!

#AbsiteSmackdown

