Insights to Inspire: Follow-up and Evaluation: Engagement for Positive Impact


The Careers in Clinical and Translational Research Metric is designed to measure how well CTSA Program hubs train and support scientists to remain engaged in research, and to inform strategic management plans for improvement. To measure and report on the metric, hubs conduct follow-up surveys to determine whether scholars and trainees have remained in clinical research after completing the program. To that end, hubs must find ways to encourage graduates to respond to these surveys. Successful hubs have improved their KL2 and TL1 programs by being receptive to feedback and conducting ongoing evaluation.

A few of the CTSA Program hubs that achieved the most improvement in this metric between 2017 and 2018 were: Columbia University, Johns Hopkins University, Clinical and Translational Science Institute of Southeast Wisconsin (CTSI) at the Medical College of Wisconsin (MCW), South Carolina Clinical and Translational Research (SCTR) Institute at the Medical University of South Carolina (MUSC), Colorado Clinical and Translational Sciences Institute at University of Colorado | Anschutz Medical Campus, University of Kentucky Center for Clinical and Translational Science (CCTS), and University of Texas Health Science Center at San Antonio (UTHSCSA).

As a part of the Insights to Inspire 2020 series, these featured hubs were interviewed by CLIC staff to share their success at increasing survey response rates. These hubs achieved success through such efforts as emphasizing the need for program data, being open to and responding to scholar feedback, and staying in touch with scholars. CLIC’s goal in sharing the experience of the successful hubs is to turn their practices into actionable strategies for the entire consortium.

Stay in touch, early and often

Getting scholars to respond to program evaluations takes relationship-building and early, ongoing communication about why the evaluation surveys matter.

For MCW, that meant considerable one-to-one contact with scholars throughout the program. “During the scholars’ orientation to the program, we reiterate as many times as we can how important it is for scholars to participate in the relevant surveys. This is important to the success of the program and in service of current and future scholars,” said Ramez Rashid, Evaluation Director, CTSI of Southeast Wisconsin, MCW. “Throughout the program, we’ll try to repeat the message … we need the data so that we can develop and implement sustainable improvements.”

MCW also focused on building relationships with scholars, which can be challenging given the short timeframe of the program. “It's a one- or a two-year program, and you have to really be pretty active in interacting with the students to get to know them, to get them to trust you,” said Joe Barbieri, Associate Director of the TL1 Program. “We make the effort to interact with the students during the year to make sure that they are on track to find out what they are doing. And then as they graduate, we try to continue to have that relationship.”

Johns Hopkins University set clear expectations for its TL1 scholars, explaining that they must stay in touch so the university can track their career paths and achievements. Communicating that expectation upfront seems to have worked: a recent survey about current employment and research, sent to 69 alumni, generated 68 responses.

Mary Catherine Beach, Director of Johns Hopkins’ TL1 Predoctoral Clinical Research Training Program and Professor of Medicine, said, “We’ve been keeping in touch with people. We tell them all the time, this is what we are going to do for you. And, in turn, you always have to be in touch with us so that you can help us understand what you’re doing with your life.”

Another hub was more direct in how it communicates the importance of getting responses from individuals who finish its TL1 program. “It really is just letting them know when they start that we're going to be bugging them, that I'm going to be the biggest nag that they've probably ever had in their life,” said Lisa Cicutto, Director for Translational Workforce Development and of the TL1 program at University of Colorado CTSI.

Staying in touch with scholars can also pay dividends after they graduate. Alumni can serve as mentors to newly enrolled scholars, and they can help promote the value of the TL1 and KL2 programs to prospective scholars. The University of Kentucky CCTS program continues to hold in-person meetings with KL2 alumni, who remain actively engaged in program activities and in assisting faculty.

Be open and responsive to feedback

Featured hubs found success by being open to scholar feedback and making programmatic changes based on it.

At MUSC, program leaders actively supported their scholars, checking in regularly to see whether they had concerns or were facing any issues. The goal was to provide support without overburdening the scholars. Scholars participated in quarterly meetings with their mentors and program leaders to discuss their progress and provide feedback.

According to Rechelle Paranal, Evaluation Manager, leaders not only listened to feedback, they adjusted the program based on what they heard. “We used to do something at 5 p.m. and we moved it to 4 p.m. because people let us know that 5 p.m. is hard because you have to pick up kids then and there’s too much going on at that time. So really listening to them and just making sure you’re hearing what they say and what’s going to help them get supported … that’s something that’s really important.”

Use formal evaluation tools

As most scientists recognize, quality research depends on having reliable data behind it. For hubs to evaluate their TL1 and KL2 programs, they must have good data. In addition to the informal modes of collecting feedback from scholars, featured hubs relied on formal evaluation tools to collect data as well.

At Columbia, an evaluation form — a combination of an individual development plan and an internal progress report — solicits information about which goals have been met, which goals are still in progress, the number of papers published, along with general feedback regarding what scholars think of the program. For program alumni, a survey is conducted to gauge whether the graduate is currently engaging in team science. The survey collects such information as current and past employment and leadership responsibilities, titles, and the types of research conducted since graduation.

UTHSCSA and MCW worked to begin collecting program data from the outset. To gauge effectiveness and solicit unbiased feedback, UTHSCSA has evaluators rather than program staff conduct individual KL2 scholar interviews at both the midpoint and the end of the program. MCW has scholars self-assess their skills at entry using the Clinical Research Appraisal Inventory (CRAI) tool and again once they complete the program; program leaders then analyze the change statistically to determine whether skill levels improved significantly across various criteria. In addition to the CRAI, MCW conducts a graduate tracking survey that follows graduates over many years, gauging outcomes such as grant applications and promotions throughout their careers after the program.

The activities and experiences of these featured hubs demonstrate that soliciting and responding to feedback, together with early and ongoing relationship-building, can improve evaluation and follow-up efforts and programs overall. For some hubs, the “secret sauce” is an open-door policy, maintained from the beginning of the program through graduation, that makes the relationship between scholars or alumni and program evaluators a two-way street.

“I think it’s just our involvement with each of the scholars,” said Daichi Shimbo, Columbia’s KL2 Director and Associate Professor of Medicine. “We work with them frequently … and have an open-door policy if any of the scholars want to talk about career development issues. And I think by emulating that kind of environment people are more likely to respond because you help them.”