Table 4. Challenges to implementing the Common Metrics, by theme, with illustrative quotations.
| Challenge | Illustrative quotation(s) |
|---|---|
| **Hub size and resources** | *In addition to the size of the hub’s funding award, other resource-related factors contributed to an overall lack of available time and resources.* |
| Lack of institutional investment† | So a lot of the metrics, one would certainly hope could be facilitated by informatics systems, and our university, for example, has not invested in a citation index software, that would help a lot as we are trying to find investigator publications… Our…homegrown system works really well for the IRB, but any time anything needs to be added they have to contract with informatics people…, [who] are a scarce resource. So that’s a challenge. –Principal Investigator** |
| Interrupted funding | …[G]iven our no-cost extension status, …we do not know yet if we are going to…turn the curve because we are not awarding, for example, …any more pilot awards…or K awards right now. –Implementer |
| Lack of adequate staffing and expertise† | Well, I can tell you the problem: we only pay a fraction of [his] time for evaluation because he does other functions for us, and our staff person who works with him does not have the capability to do this herself independently. …Nobody really thought about what impact it was going to have on the time allocation for the leadership that was responsible for evaluation… –Principal Investigator<br><br>Well, what I would like to change is to have an expert on hand, someone who has been trained in evaluation and metric design. And not so much just adding it on to people’s job descriptions, but actually having someone who could truly represent us at the level of NCATS for Common Metrics. –Administrator |
| **Alignment with needs of Common Metrics implementation** | *Lack of alignment of local data systems or institutional priorities created difficulties for metric data collection or local investment in the initiative.* |
| Lack of a data system, or an existing system not aligned with the Common Metrics definitions, created more effort for effective tracking† | …our information systems were not automatically and easily aligned to collect information in the form that the initial set of metrics request demanded, and so we discovered…that there were various kinds of gaps and holes in the way various things are tracked. –Principal Investigator |
| Lack of alignment with institutional priorities† | We have tried to make sure that the deans and other leaders know about the Common Metrics. I don’t know that those three Common Metrics have been exactly their highest priority. They look at it and they are happy with it. [But] it’s not like they have said, “Oh yeah, we want to adopt that Common Metric for our university over time.” But it’s early in the process and they may. –Principal Investigator |
| **Hub authority** | *Lack of line authority over data, processes, or organizational components related to the metrics created challenges for implementation.* |
| Lack of line authority over key drivers | One issue with the CTSAs, particularly in a decentralized organization like ours, is we’re responsible for outcomes but do not have authority over them. It is an exercise I am trying to lead from the middle. –Principal Investigator<br><br>There’s thousands of IRB protocols submitted to the IRB every year. We only touch a small fraction of them, so how much control do we have over time to IRB approval? And so, the cynical answer is how can we affect the 90% of IRB submissions that we have nothing to do with? –Principal Investigator |
| **Hub engagement** | *Active hub engagement was important for completing implementation activities, but several factors undercut engagement.* |
| Annual reporting cycle induced bursts of effort | I think a limitation has been this idea that you can report [the metrics] once a year, which is good to report to NCATS, but it is not good as a management tool… –Principal Investigator |
| Interrupted funding | Given our no-cost extension status, we realized that we would not be able to implement all action plans that we proposed or we had outlined…. –Implementer |
| Reduced motivation due to lack of alignment with existing processes or unclear definitions | …[W]hen I ask anybody on my staff to do something, I want to make sure it’s not busy work and I want to make sure it’s something that we’re using. …And so when we did a change of operations to basically…[compute the metric] the other way [for the Common Metrics], …the report at the end wasn’t useful to us…. –Administrator |
| **Stakeholder engagement** | *Engaging needed stakeholders external to the CTSA hub was crucial for performance improvement, but securing consistent participation was challenging.* |
| Lack of a direct line of consistent communication with other units | Unlike some institutions, we do not manage the IRB, and we don’t manage contracting, so we are always the liaison working with those entities, to try and improve their performance. –Principal Investigator** |
| Securing initial buy-in or sustained cooperation from key stakeholders | Well, I think we have the same problems as everybody else. You give somebody a $50,000 pilot grant, and then they forget to cite you on papers. We preach, we give seminars, we hand out mouse pads and mugs and do all kinds of things, and put it in our emails. But people still forget… So it is a constant struggle… –Principal Investigator |
Note: Unless stated otherwise, themes manifest in more than one way; a quotation represents one manifestation.
** Participant is affiliated with a medical center that functions as a CTSA without current CTSA funding.
† Indicates that the challenge, under reverse conditions, becomes a facilitator.