Table 3. Illustrative Quotes for Key Themes and Unintended Consequences.
Theme | Illustrative Quotes |
---|---|
Key factors in CCO–clinic collaborations | |
Establishing relationships and building partnerships | [CCO Name] is cited as a real pioneer in this work. . . . They have had such incredible community investment from the very beginning. It’s not like they have to talk their partners into doing something or engaging in work around the metrics . . . because the partners were there from the beginning and were part of the founding governing board. (P12)<br>We were really close with some clinics, and they trusted us. And some clinics, we didn’t have as close a relationship. So we had to figure that out in the strategy. [Now our relationships are] pretty close . . . partially because there’s a lot of need and they realize that we want to help. We don’t have some crazy ulterior motive. Our motive is the same as theirs. We want access for patients and quality care. (P7) |
Producing and sharing performance data | We’ve gotten more sophisticated about [our process of sharing performance data]. . . . We identified somebody at the clinic that’s our contact. It may be administration or care management, it’s not necessarily going to be the primary care provider anymore. (P10)<br>This CCO puts [performance data] in front of all of the providers on a regular basis. This is how you’re doing, this is how the clinic next to you is doing, this is how the clinic down the street is doing. I would have thought that would have been very risky, but . . . [it] has generated competition and it’s generated transparency and it’s generated a spirit of collaboration because clinics can look at each other and say, “Boy, you’re doing great. Tell me what your secret is and let’s figure this out together, and will you help us? What did you do to get from here to here?” (P12) |
Developing a process and infrastructure to support quality improvement | [We consider] each clinic and say, “For this clinic, what is it for them?” They’ve already got strong leadership, so maybe for them it’s that their data system makes it really difficult for them to track this metric. . . . We try to personalize our knowledge of each clinic to ensure that when we take something that seems straightforward, like they just need to improve the numerator hits for this process and it seems straightforward because you should just send out kits and they should get sent back, but there’s always more beneath the surface. And typically what’s underneath it is some kind of system support that is not in place. (P4)<br>Our first step is usually to educate the providers and their staff on what the quality measures are, how they are tracked, what kind of data are OHA looking for and what documentation do they need [in] the clinical record to back up that information . . . and then looking at what kind of clinical workflows or other strategies we can suggest to them or help them with that would improve the actual frequency in which services are occurring. (P16)<br>It’s those kinds of hard stories that the clinics aren’t afraid to share [at the learning collaboratives] once we’ve developed trust . . . where they feel comfortable sharing their failures with each other, so you’re not [going] down the street reinventing the same crooked wheel. (P3) |
Unintended consequences | |
Engaging larger clinics, exclusion of smaller clinics | I feel for [these small clinics], because I think they’re at a disadvantage in that larger clinics have built-in infrastructure of IT people, of performance improvement people, 3-tier leadership. . . . In some clinics, the office manager is the billing manager, is the front desk manager, is everything. I worry about those clinics and I wonder how they are doing. I don’t know if that falls on the CCO to provide that sort of infrastructure. Maybe it does. I just worry that they’re being overlooked. (P22)<br>We have really good reporting. . . . We have gap lists that we can produce by clinic, by provider, by measure. We know who’s got the most members for that measure, who’s contributing the most to the numerator and to the denominator so that we know where to target. Usually you would just go, “Oh, let’s let everybody know that we don’t, or everybody has to have them.” Well, now we go, “Okay, if we approach this one clinic, we can get everything we need to make the measure.” . . . We’re just being very strategic about that. (P10) |
Metric focus and fatigue | For good or for bad, I think the metrics are really driving a lot of the effort now, and if there’s any bandwidth left over after you’ve hit the metrics, then they focus on those things that don’t necessarily impact the check at the end of the year. . . . Somebody said just a couple of weeks ago, “I thought this would get easier. I thought it would calm down. I thought it would become more routine functioning, and it isn’t.” It is intense work, and it has been from the beginning. (P12)<br>That’s probably the biggest thing that hit the clinics with new metrics, which is one more thing. “We just are barely getting this other thing working, and now you want us to do one more, you want us to do 2 more, and 3 more things,” and that’s the hard part. (P15)<br>There’s just too many [metrics], and the administrative burden of capturing the data for many of them . . . is too much. So it deters from true quality, and it deters from CCOs being able to focus on things that aren’t quality metrics that could improve quality even more because quality isn’t just about quality metrics. (P7) |
Abbreviations: CCO, coordinated care organization; IT, information technology; OHA, Oregon Health Authority; P, participant.