BMJ. 2003 Feb 8;326(7384):314. doi: 10.1136/bmj.326.7384.314

Practice based, longitudinal, qualitative interview study of computerised evidence based guidelines in primary care

Nikki Rousseau a, Elaine McColl a, John Newton b, Jeremy Grimshaw c, Martin Eccles a
PMCID: PMC143528  PMID: 12574046

Abstract

Objective

To understand the factors influencing the adoption of a computerised clinical decision support system for two chronic diseases in general practice.

Design

Practice based, longitudinal, qualitative interview study.

Setting

Five general practices in north east England.

Participants

13 respondents (two practice managers, three nurses, and eight general practitioners) gave a total of 19 semistructured interviews. Feedback was provided by 40 people: 37 from practices in the randomised controlled trial (34 doctors, three nurses) and three doctors from the interview study practices (including one previously interviewed).

Results

Negative comments about the decision support system substantially outweighed the positive or neutral comments. Three main areas of concern emerged among clinicians: the timing of the guideline trigger, the ease of use of the system, and the helpfulness of the content. Respondents did not feel that the system fitted well within the general practice context. Their comments about the system were informed by experience of “on-demand” information sources, which were generally viewed more positively. Some general practitioners suggested that nurses might find the guideline content more clinically useful and might be more prepared to use a computerised decision support system, but lack of feedback from nurses who had experienced the system limited our ability to assess this.

Conclusions

Significant barriers exist to the use of complex clinical decision support systems for chronic disease by general practitioners. Key issues include the relevance and accuracy of messages and the flexibility to respond to other factors influencing decision making in primary care.

What is already known on this topic

Randomised controlled trials of complex computerised decision support systems have found low rates of use and no effects on process and outcomes of care

What this study adds

Clinicians found a computerised decision support system for chronic disease in general practice to be difficult to use and unhelpful clinically

It did not fit well into a general practice consultation and compared unfavourably with “on-demand” information

“Active” decision support can make clinicians aware of gaps between their own practice and “best” practice, but computer prompts need to be relevant and accurate

Introduction

Systematic reviews have shown that computerised systems can be an effective means of implementing guidelines in clinical practice.1-3 However, they identified no studies of sophisticated computerised decision support systems in chronic disease management or integrated into routine computer systems. Although one of the most recent reviews identified 68 controlled trials,2 little use has been made of qualitative techniques in evaluating computerised decision support systems in health care,3,4 leaving unanswered questions about why systems are or are not effective. Models of implementation of guidelines and other innovations emphasise the importance of pre-existing attitudes and the context of the intervention, as well as the nature of the intervention itself, in the successful adoption of an intervention.5-7 We conducted a randomised controlled trial of a computerised decision support system for the primary care management of two common chronic diseases, which is reported in detail elsewhere and summarised in box 1.8-10 In this paper we report a qualitative interview study conducted in parallel to illuminate the trial findings.11,12

Box 1.

Details of associated randomised controlled trial

Design—Before and after pragmatic cluster randomised controlled trial with a two by two incomplete block design
Setting—Sixty general practices in the north of England. Practices were eligible to participate if at least 50% of the doctors reported that they used one of two computer systems to view clinical data and to issue prescriptions during consultations
Participants—General practitioners and practice nurses in the study practices and their patients aged 18 years or over with angina or asthma
Main outcome measures—Adherence to the guidelines, based on review of case notes; generic and condition specific outcome measures reported by patients
Results—Use of a computerised decision support system had no significant effects on consultation rates, process of care measures (including prescribing), or any patient reported outcomes for either condition. Levels of use of the system were low

Methods

Design

We designed a practice based, longitudinal, qualitative interview study to enable us to examine attitudinal and contextual influences on the use of the computerised decision support system.5-7 Interviews in clinicians' consultation rooms allowed a detailed discussion of their usual practice in relation to the index conditions and a demonstration of how the system interacted with these consultations. We considered observing clinicians interacting with the system but judged this to be impracticable because interactions were infrequent and unpredictable outside chronic disease clinics for the index conditions.

Participating general practices

As the conduct of an interview study in practices participating in a randomised controlled trial could both constitute a co-intervention and increase the burden of participating in the study, we recruited practices to only one or other aspect of the study. From those practices eligible and willing to take part in the trial (box 1), we recruited five (from north east England) to the interview study.11,12 We purposively selected practices on the following criteria: supplier of clinical computer system, vocational training status, number of general practitioners, reported use of guidelines for asthma and angina, and level of computerisation (table).

Interviews

We undertook initial interviews with the designated contact person in each practice. We undertook further interviews with a purposive sample of professionals to ensure representation of clinicians described by their colleagues as having a particular interest in asthma or angina, those who attended a training workshop on the use of the computerised decision support system, and those who had not shown any particular interest in the system. We conducted interviews before and at different times during the intervention period. Interviews lasted approximately one hour, and we conducted most of them in a surgery consulting room with a computer available, enabling the interviewee to refer to the computerised guidelines.

Topics covered in the interviews included use of the computer; use of guidelines, especially for asthma and angina; organisation of care for patients with asthma or angina; and experiences of using the computerised decision support system. We asked respondents to discuss both their own experiences and those of their colleagues in the practice. NR conducted all the interviews, and all were taped and transcribed verbatim. NR and one other researcher (EMcC or JN) reviewed each transcript and noted topics to be followed up at subsequent interviews. This approach enabled later interviews to build on and explore further what was already known about a practice and to feed in ideas from other practices as appropriate. Three researchers (EMcC, JN, and NR) identified emergent themes and then met to construct an agreed list and coding frame. All three researchers applied this frame to two transcripts; comparison of coding decisions enabled some codes to be clarified and others to be merged. We subsequently imported the transcripts into the NVivo qualitative data analysis package (version 1.3, QSR International, Melbourne) for detailed coding. The dominant themes presented in this paper emerged through an iterative process of coding, analysis of coded text, and discussion among the authors.

Other sources of data

The importance of using different types of data in qualitative research has been highlighted.13 Six months after installation of the computerised decision support system we sent forms to all clinicians in randomised controlled trial practices and interview study practices inviting feedback on the software, the content of the guidelines, the information they had received about the system, how the system fitted into their care for patients, and any other aspect of the system. We compared this feedback with themes from the interviews, looking in particular for conflicting views or new themes.

Intervention

The intervention, which was the same in interview practices and trial practices, is described elsewhere.9,10 In summary, two suppliers of general practice clinical computer systems integrated evidence based guidelines for the primary care management of asthma in adults and angina into their products.14,15 The computerised decision support system anticipated clinicians' requirements by using information contained in patients' computerised records to trigger the guideline and present patient scenarios (for example, for asthma: review of stable patient; acute exacerbation). On the basis of the scenario chosen, the system offered suggestions for management informed by the content of the patient's record and requested the entry of relevant information, which was subsequently stored in the patient's record. The system could be triggered in two ways—either automatically when the clinician entered the electronic record of a patient previously identified as eligible or when a relevant morbidity code was entered.
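To make the triggering behaviour concrete, the short sketch below expresses the two modes in Python. It is purely illustrative rather than the code of the system studied here; the condition names, morbidity code values, and record fields are hypothetical placeholders.

```python
# Illustrative sketch only; not the system evaluated in this study.
# Condition names, morbidity codes, and record fields are hypothetical.

ELIGIBLE_CONDITIONS = ("asthma", "angina")
MORBIDITY_CODES = {"ASTHMA_CODE": "asthma", "ANGINA_CODE": "angina"}  # placeholder codes

def guideline_to_trigger(patient_record, entered_code=None):
    """Return the guideline the system would present, or None.

    Mode 1: automatic trigger on opening the electronic record of a patient
            previously identified as eligible.
    Mode 2: trigger when a relevant morbidity code is entered during the
            consultation.
    """
    if entered_code is not None and entered_code in MORBIDITY_CODES:
        return MORBIDITY_CODES[entered_code]            # mode 2: morbidity code entered
    for condition in ELIGIBLE_CONDITIONS:               # mode 1: eligible patient's record opened
        if condition in patient_record.get("eligible_for", ()):
            return condition
    return None


# Opening the record of a patient flagged as eligible for angina triggers the
# angina guideline before any code is entered; entering an asthma code
# triggers the asthma guideline.
print(guideline_to_trigger({"eligible_for": ["angina"]}))         # -> angina
print(guideline_to_trigger({"eligible_for": []}, "ASTHMA_CODE"))  # -> asthma
```

As described under “Triggering of the system” below, the automatic mode was later dropped in response to feedback from practices, leaving only the morbidity code path.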

Immediately before the intervention period we invited each practice to send two members of the practice to a one day training workshop for demonstration of the system and supply of training materials (including an html version of the guidelines). In addition, every clinician (doctor or practice nurse) received a paper copy of the summary version of both guidelines and each practice received one paper copy of the full version of both guidelines.

Results

We carried out 19 semistructured interviews with a total of 13 respondents—two practice managers, three nurses, and eight general practitioners. We received feedback from 40 people in randomised controlled trial practices (34 doctors, three nurses) and qualitative interview study practices (three doctors, including one previously interviewed). We identified no new themes in the feedback; rather, the feedback further clarified and reinforced themes from interviews.

People interviewed were largely enthusiastic about the benefits of computing for general practice and were optimistic about the potential for computers to present guidelines in a manageable format. However, negative comments about the computerised decision support system substantially outweighed the positive or neutral comments. We identified three main areas of concern: the timing of the guideline trigger, the ease of use of the system, and the helpfulness of the content.

Triggering of the system

Automatic triggering of the computerised decision support system on entry into the record of a patient with asthma or angina was designed to facilitate opportunistic chronic disease management. It also made the system visible within practices, ensuring that all clinicians whose computers were able to operate the system (see comment on nurses below) and who used the computer in their clinical practice (most clinicians in the study practices) were aware that the system was available. However, clinicians generally disliked this feature and said they would be unlikely to carry out a chronic disease review if a patient was consulting for another reason (box 2). In addition, inconsistencies in morbidity coding meant that the guidelines were sometimes triggered for patients without the condition. Given the time it took the system to launch, clinicians operating from branch surgeries with a slower computer connection found it particularly intrusive, as did those authorising repeat prescriptions for multiple patients.

Box 2.

Triggering mechanism

“That's the one point it does get a little bit annoying when that comes up and you think ‘well I’m seeing them for their big toe' . . . It's actually come up a few times and I've thought ‘they haven’t got asthma' [laughing] . . . it's obviously been labelled wrongly . . . so that's actually quite helpful in some ways . . . No, no it's not acting as a prompt to review . . . if a patient had come in with an unrelated topic, it's very unlikely, I haven't done it yet, I think it's unlikely that I would go to the asthma guideline” (General practitioner, interview study)
“They're off [patient has left], I turn back and go back into this and then select the problem title, and then I say right well I've looked at ischaemic heart disease and then this comes up—and there [system has activated] and the patient's already gone by this stage” (General practitioner, interview study)
“The [guideline trigger] came up too soon to be useful—before you have even defined the problem” (General practitioner, trial practice, feedback)

The timing of the trigger in relation to the consultation was also problematic. Many clinicians liked to glance through the computer record while waiting for the patient to enter the consulting room. This was not a good time for the guideline to trigger, as the clinician did not yet know why the patient was consulting. Equally, the entry of a morbidity code at the end of a consultation (a common pattern) activated the system, but too late to be used. It therefore became an automatic reaction to “escape” out of the guidelines whenever they triggered, even on occasions when it might have been appropriate to use them. Part way through the intervention period, in response to feedback from practices, we altered triggering to present the system only in response to the entry of a morbidity code.

Ease of use

Most clinicians who tried out the system found it difficult to navigate (box 3). They acknowledged that navigation would become easier with greater familiarity with the system. However, gaining that familiarity meant taking time outside a consultation to explore the software; clinicians were generally reluctant to experiment with the system during a patient consultation because of the risk of “getting lost.”

Box 3.

Ease of use

Navigation
“There's times I've gone oh I don't want to go that way . . . when that happens, to be honest, I tend to exit out of it; you know it's a catastrophe reaction” (General practitioner, interview study)
“It's got a little bit to do with pride and how you perceive your role; you ought to be able to use your tools; you ought to be able to use your blood pressure machine properly to take a blood pressure . . . I don't mind making the odd mistake and you do sometimes with the prescribing, the stuff that we've been doing for years you still hit the wrong keys every now and again, but you can at least demonstrate you know that you've hit the wrong key . . . but if I hit the wrong key with [the system] I'm lost” (General practitioner, interview study)
“The next screen it asks me . . . ‘which represents the patient’s current state' right? Now suppose this is someone I'd never seen or very rarely seen; he is usually seen by one of my partners—how on earth am I going to know what state this patient is in? I can go directly to the prescribing screen, which is handy, but what would be neat would be to go to the consultation screen here, scroll back to his consultations, be able to work out which one of these he was, and then go back in because what tends to happen here is that you think, oh my god, I can't answer this question, so I will then exit out of the guideline” (General practitioner, interview study)
Training
“It was a fun day, a good day, but I came away slightly confused about actually using it; but before I go and teach my colleagues, I'll have a good play with it” (Nurse, attended training day, interview study)
“Running through the system, despite the training, I find exasperating” (General practitioner, trial practice, feedback)
“Most systems have pretend patients, certainly we do . . . so a few exercises to say you know, put this code in, we'll set up a patient who is already halfway through and then just try it out, just, it gives you that reassurance that . . . if you don't get the answer that is the next step in the tutorial then at least you go and try and work out why. [NR: Do you think people would find the time to do that?] It depends upon whether or not they're keen; you'd only find the time to do it if you wanted to learn” (General practitioner, attended training day, interview study)
“It was difficult to get there [training workshop] to be honest with it being that far away, so that was a shame, so we just sort of started experimenting really” (General practitioner, did not attend training day, interview study)

Attendance at the training workshops did not seem to help clinicians to use the system. A delay between the training day and the guideline becoming operational in practices (increased in some cases because of factors external to the study) reduced the benefit of the day. An html version of the guideline available in the interim period did not adequately prepare clinicians for the full version. Although many clinicians seemed resigned to having to “just get in and fiddle” with new computer software, several people made suggestions for additional support.

Additionally, clinicians had limited access to clinical information from within the system. In practice, this meant that clinicians had to exit the system to access the patient's medical record, and once they had exited it was unusual for them to re-enter.

Helpfulness

Among the clinicians who persisted in using the system, a strong theme emerged that it was not helpful (box 4). Three main factors contributed to this. (1) The guideline had limited ability to present options individualised to a specific patient. (2) Clinicians believed that they were already familiar with the content of the guideline (box 5). (3) The system did not (with some exceptions) aid adherence to those aspects of the guidelines that general practitioners were able and willing to follow, and it overemphasised areas to which they had given low priority or where other barriers existed.

Box 4.

Views about the computerised decision support system

Negative
“Not enough categories—patients don't fit. Find it is not particularly helpful with management and I tend to ignore it” (General practitioner, trial practice, feedback)
“If it's labelled as asthma [but isn't] and you start following guidelines for asthma you can come to the wrong management decisions” (General practitioner, interview study)
“I don't trust . . . practising medicine like that . . . I do not want to find myself in front of a defence meeting, in front of a service tribunal, a court, defending myself on the basis of a trial of computer guidelines . . . it doesn't make bad guesses but I ain't gonna rely on that when I know there's history back there which he may not be . . . all it would need is to have on this an out of hours . . . two out of hours attendances where the patient's got huge whacks of steroids and this thing would be way off beat” (General practitioner, interview study)
“You see the computer does flash up if somebody is overusing the prescription. But sometimes it doesn't flash even if you are overusing so it's not consistent” (Nurse, reference to prompt system external to this system, interview study)
“It's just a recording facility and a longwinded way of prescribing” (General practitioner, interview study)
“The reminder that data is missing is irritating, as it is generally available but simply not entered on to the computer” (General practitioner, trial practice, feedback)
“The last partnership meeting they asked if we could switch it off cause they were just finding it, it was irritating more than anything, it was—at this end [branch surgery] it was slow, at the other end it was not slow but not helpful at all, not helping decision making in the least and just irritating, getting in the way, flashing up when they didn't particularly want it to flash up, spending more time—increasing the number of keystrokes per consultation because as I say, if its not useful, you won't use it” (General practitioner, interview study)
“I'm sorry to say that this software is driving me mad . . . It's also a nuisance when it comes up when everything has been done or I am waiting months for an exercise ECG. It's so annoying that I always exit, but I would feel less antagonistic if I had some individual control” (General practitioner, trial practice, feedback)
“I am very [respondent's emphasis] disappointed with the format. My lack of use of them does not mean I would not use computer based guidelines, I simply find these obstructive to the consultation process” (General practitioner, trial practice, feedback)
Positive
“I like the way it flashes up straight away what missing information” (Nurse, interview study)
“The good things about the [system] are the prompts for you to do things . . . making us make a bit more effort to actually put in the peak flow rate and things which were perhaps scribbled down but we wouldn't have made a computer record, which I think is useful to have” (General practitioner, interview study)
“It does, however, act as a decent prompt to make sure the patient is on aspirin, β blocker, etc” (General practitioner, trial practice, feedback)
“Using the asthma guidelines it leads you through into prescribing and that actually cuts your time down a bit because you can, it stops you having to search for you know, different drugs, and actually rather than causing a time lag it actually gains some time” (General practitioner, interview study)

Box 5.

Views about the guideline

Content of the asthma and angina guidelines
“I'm very happy with the content of the guidelines, and that is as good as expected” (General practitioner, interview study)
“The information in it is sound” (General practitioner, trial practice, feedback)
“I have to say I don't think I was doing an awful lot outside the guidelines to begin with anyway, so I think it was confirming my, my initial thoughts anyway” (General practitioner, interview study)
“We sort of know that to be honest, so it's not necessarily that useful. I think it's not telling us an awful lot that we don't already know” (General practitioner, interview study)
Barriers
“If a patient sits before you and says, ‘I feel a lot better after I’ve had my nebuliser, I feel as though I've got the dose and I'm less wheezy,' that's a subjective thing which you can't, you can't say, ‘no you don’t,' because that person is the person who's experienced it” (General practitioner, interview study)
“If you actually read the cardiology referral indications it's just about everyone (laugh). The system just couldn't stand it you know” (General practitioner, interview study)
“One of the problems that I have with evidence based medicine is that sometimes you go down certain lines because the evidence is best for certain things, but it may be that the evidence is only best for certain things because they are older and they've been around longer and the evidence is more robust, but because they have been around longer they may well not be the best. Because that's what they said at the [training] day—the evidence is there for verapamil, it isn't there for the other stuff, but it may not be there for other stuff because people haven't done it yet” (General practitioner, interview study)
General views
“Guidelines are there to be helpful, but . . . I'd say all GPs not just my partners are cynical about guidelines because you get guidelines for everything, and you get them till they're coming out your ears, to the point where you stick them in file and you think I'm not going to read them because you'll spend, you'll spend hours and hours each week I guess updating yourself on guidelines for this, guidelines for that” (General practitioner, interview study)
“I'm even using them to show the patient to say ‘I’m sorry but these are the guidelines that I've got to follow' ” (Nurse, interview study)
“Guidelines are good when you face difficult management problems” (General practitioner, interview study)

To reduce the number of decisions for the clinician the computerised decision support system presented options customised to a particular patient, by using information in the medical record. However, clinicians found that the system often presented too many or inappropriate options. In addition, clinicians expressed concerns about trusting a computer to make management decisions and about the prompting mechanisms within the existing clinical computer system.

Many people interviewed did critically engage with the content of the guidelines at the training workshop, in paper format, or in the html version of the computerised guideline. Relatively few had read the guideline content within the computerised decision support system itself. Many believed that they were already practising in line with the recommendations in the guidelines or that the guidelines did not contain much new information (box 5). Areas in which clinicians acknowledged that they did not follow the guidelines, or disagreed with them, highlighted perceived shortcomings of evidence based medicine in relation to new treatments, issues of patient preferences, and perceived structural barriers in the healthcare system. Some areas of disagreement related to the computerised implementation of the guidelines, which went further than the paper guidelines in recommending particular brands and quantities of drugs.

All practices had recently been involved in initiatives tackling aspects of asthma or (more often) angina care that fell within the clinical area of the guidelines. Attempting change could be unrewarding or have negative effects on other areas of practice (box 6). Clinicians therefore seemed to weigh up the pros and cons of different activities and to prioritise those for which they felt stronger incentives existed; these included financial incentives, personal interest, and pressure from external bodies. However, with limited time available, general practitioners also prioritised the aspects that they felt were most likely to produce positive effects. There was some suggestion that the computerised decision support system encouraged clinicians to consider aspects of care that they regarded as more marginal, and a stronger impression that the inclusion of these aspects contributed to the unpopularity of the system.

Box 6.

Concurrent activities in asthma and angina

External drivers
“We are being heavily encouraged by the primary care groups to do more secondary prevention of coronary heart disease” (General practitioner, interview study)
Expectations of change
“What is frustrating is all that effort and really not . . . there's some change but not a lot you know. Personally I expected an awful lot more change” (General practitioner, interview study)
Prioritisation
“Do we go for something a bit simpler, which is simply let's say, let's get everyone on to aspirin, let's just look at blood pressure control, let's forget the cholesterol for the time being? Do we take it in bites or do we just say fine let's find everyone with diabetes and try some primary prevention? What do we do? You know there's so many things we could be doing” (General practitioner, interview study)
“So this is the other information that we collect, some of it as I have said before, you can't do anything about, for example, BMI, how fat a person is, in the real world and we all know that we are highly unlikely to alter things significantly there, as is smoking, 5% success rate with brief advice for smoking. However, that data is recorded but other things like cholesterol, and if its high are you doing something about it, are they on statins or are they not on statins, they have ischaemic heart disease, are you prescribing aspirin or warfarin or is it contraindicated and have this group been screened for diabetes, which is another risk factor for ischaemic heart disease” (General practitioner, interview study)
Financial incentives
“I think like peak flow and things like that it may be annoying cos there's no incentive at the end of it is there, whereas tetanus and smears there is isn't there?” (Nurse, interview study)
“You can have postgraduate education until the cows come home, it doesn't change attitudes. The only thing that I know that works is actually setting a target system, with financial carrots or financial sticks” (General practitioner, interview study)

“On-demand” information

Clinicians judged helpfulness by comparing the computerised decision support system with “on-demand” information (box 7). As well as guidelines and traditional sources of information, such as the advice of colleagues, clinicians used other sources of evidence in both paper and computerised formats. They seemed to enjoy using these tools and had found sources that they trusted and that gave them information in a style and volume that they found helpful. Some people suggested that the computerised decision support system could be used in this way, particularly in the html version.

Box 7.

Comparison with other sources of information

“I have occasionally looked at the guidelines in order to check that my clinical decision matched them (after I have seen the patient), but in everyday practice I do not find it useful” (Nurse, trial practice, feedback)
“I am afraid no one in our practice is still using [the system]. We find BNF and Mentor much more use, as an ‘on-demand’ information source” (General practitioner, trial practice, feedback)
“There's a new publication from the BMJ called Evidence Based Medicine in Clinical Practice . . . and that's an excellent book . . . it's reviewed every six months and it gives you the evidence based information about what to do” (General practitioner, interview study)
“It's perfectly possible to say, well I don't know the answer to that but I know how we can look it up so you can get Mentor up or . . . you've got the quick keys for the BNF now . . . we can put that up and we can look at the information they are saying what about the side effects” (General practitioner, interview study)
“I do a lot of Medline searching . . . the immediate thing that comes to mind is a lady with awful cluster headaches. She came and she'd some journal about oxygen therapy, something I'd never heard of in my life—Medline search. So rather than referring her . . . I can actually find it out myself and deal with it, which you know is really quite sort of satisfying” (General practitioner, interview study)

Positive comments

Clinicians made a handful of more positive comments about the computerised decision support system. Some people seemed to be interested in the potential of the computer to remind them to carry out activities or suggest a course of action; some liked the patient information leaflets available through the system. Although the general perception was that the system took a long time to use, one clinician did suggest that some activities could be done more quickly with the system than by using usual approaches (box 4).

Nurses

Nurses have an important role in chronic disease management, and general practitioners suggested that nurses might be able to make use of computerised decision support systems as part of their increasing responsibilities in this area (box 8). Existing chronic disease management arrangements in the study practices indicated that nurses were more likely than doctors to use systematic forms of data collection. The nurses who did try the system were more positive than general practitioners about features such as the missing information prompts and the data collection tools. In some practices lower levels of access to computers meant that nurses could not use the system. This, coupled with low levels of feedback from nurses, meant that we could not fully assess the relative value of the system for nurses compared with general practitioners.

Box 8.

Nurses and the computerised decision support system

Chronic disease management
“What happens, the patient gets asthma, the GP diagnoses . . . and does hopefully a few peak flows, the patient's sent to the nurse in the asthma clinic who then follows BTS guidelines, managing the asthmatic when they run into a problem they—and they would then come to us and say, look, this is how it is with the patient, I think they now need beclomethasone or whatever” (General practitioner, interview study)
“We've got that [shows NR a data collection tool]. I mean, but it's not very extensive. And I think she's [nurse] probably the only one using it at the moment for asthma” (General practitioner, interview study)
Access to computers
“We have not been able to load [the system] to practice nurses despite the fact that they came to the teaching day” (General practitioner, trial practice, feedback)
“You see the GPs have PCs, we just have the dumb terminal . . . there's a few things there when you go in they show us on their computers. ‘Oh that looks brilliant’ but that's no good on mine because we can't do it [laughs]. I mean . . . if they were overdue a smear or tetanus they would have a due date diary that would . . . it would flash. Well on their screen, because it's in colour it actually flashes in red . . . so it's something that stands out straight away, whereas in ours you know it's all black so it doesn't look . . . you know it doesn't stand out [voice drops]. I say we need one of them. It hasn't worked. We keep hinting. Maybe one day. I want a bit of colour [laughs]” (Nurse, interview study)
The computerised decision support system
“The plan for us is to delegate most of like heart disease management to our nurses, it's already the case and you can see a very good role for something like [the system]” (General practitioner, interview study)
“Well, I think my own personal view is that nurses are very good at working to protocols and pathways of care and all the rest of it and they're comfortable with going from A to B, whereas . . . our skill is perhaps in kind of thinking in a round about way and jumping through a few of those pathways through whatever it is, experience or whatever, so you don't ask people the 10 or 15 questions to get from question 1 to question 14 you know, you go straight from one to 14 by intuition almost” (General practitioner, interview study)
“I think with nursing you're into a lot of guidelines anyway you know . . . well we look at the clinical governance that they're bringing in—a lot of it's what nurses have to do anyway and have done . . . I think . . . I think they realise we do need guidelines” (Nurse, interview study)

Discussion

The results of the randomised controlled trial showed that a computerised decision support system was not effective in improving the process or outcome of care for patients with asthma or angina, and this was almost certainly owing to low levels of use of the system.8 The results of this interview study illuminate the reasons for this low use. Some of the issues highlighted by clinicians could be tackled with more timely training, in-practice support, and versions of the software that allow ready access to other parts of the clinical system. However, this would not tackle the more substantive challenges of providing a system that “fits” into the general practice context.4

Both the timing of the guideline trigger and the content of interjections were problematic. A primary care consultation is a complex interaction on both a professional and an interpersonal level, so intervening in this setting is difficult. Berg suggests that one problem with guidelines is the implication that patient management is a series of formal rational decisions and that there is a single optimal solution to every medical problem.16 Computerising guidelines within a decision support system can be seen as an extreme form of this. With a written guideline a clinician can still decide what is relevant to a particular patient and what to prioritise. With a computerised guideline it is the computer that compares what is known about the patient with formalised knowledge and presents solutions, but without the clinician's ability to judge the quality of the data and the relevance to a particular patient at a particular time.17 Instead of simplifying the process, this gives the clinician a new task—to evaluate the computer's choices and decisions.

General practitioners seem to value on-demand information (or “passive” decision support18), particularly when this is in an accessible form.19 However, to use such tools clinicians need to recognise that they have a need for information. Although clinicians considered themselves familiar with the content of the guidelines, process data from the trial indicate that clinicians did not always practise in line with the recommendations of the guidelines.8 Clinicians in this study mentioned many of the issues highlighted in previous work on implementation of guidelines.20,21 Clinicians seemed least happy when prompted in areas that they would not usually tackle or could not tackle because of external barriers. Any strategy for change in behaviour that prompts in such areas is likely to generate feelings of dissonance. Conversely, computerised decision support systems may be appreciated if they give clinicians tools, such as patient information leaflets, with which to overcome barriers to change.

Although on-demand information as a strategy for behavioural change requires that clinicians recognise a gap in their knowledge, our data suggest that more active decision support can be difficult to integrate into general practice. Unless the computer can be trusted to provide messages that are highly relevant and accurate, a strong tendency to ignore these interventions exists.22 Although systematic reviews have concluded that simple computer prompt systems can be effective,2,18 in routine practice prompts need to be carefully targeted. Prompting systems rely on consistent coding of medical record data and might best be reserved for occasions not only when strong evidence exists for a course of action but also when the potential benefit to the patient is greatest.
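As a rough illustration of this targeting principle, the sketch below shows one way a prompt could be gated on evidence strength, expected benefit, and coding reliability. It is a hypothetical example with assumed thresholds and field names, not a description of the system studied here or of any existing product.

```python
# Hypothetical illustration of targeted prompting: fire a prompt only where the
# evidence is strong, the expected patient benefit is large, and the triggering
# data are reliably coded. All thresholds and fields are assumptions.

from dataclasses import dataclass

@dataclass
class PromptCandidate:
    name: str
    evidence_strong: bool        # e.g. supported by randomised trial evidence
    expected_benefit: float      # e.g. estimated absolute risk reduction (0 to 1)
    coding_reliability: float    # proportion of records in which the item is coded consistently

def should_prompt(candidate: PromptCandidate,
                  min_benefit: float = 0.05,
                  min_reliability: float = 0.90) -> bool:
    """Suppress prompts that are weakly evidenced, marginal, or poorly coded."""
    return (candidate.evidence_strong
            and candidate.expected_benefit >= min_benefit
            and candidate.coding_reliability >= min_reliability)

# A well evidenced, reliably coded item fires; a marginal, inconsistently coded
# item is suppressed (illustrative values only).
print(should_prompt(PromptCandidate("aspirin in angina", True, 0.08, 0.95)))  # True
print(should_prompt(PromptCandidate("marginal item", False, 0.01, 0.60)))     # False
```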

Limitations of the study

Although our sample of practices reflected the practices participating in the randomised controlled trial, within practices we interviewed fewer general practitioners who were low users of computers. The interviews are therefore more representative of general practitioners who were more likely to trigger the computerised decision support system. The people in the feedback group were self selecting and likely to include disproportionately more of those with strong reactions to the system. The voice of the disappointed enthusiast comes across strongly, and we know less about the views of those people who chose not to try the system. However, although the level of criticism of the system varied between clinicians, the nature of the criticisms, in terms of where the problems lay with the system, was remarkably consistent.

Developing technologies pose particular challenges in evaluation—it is difficult to identify a “right time” to conduct a summative evaluation, and the technology has often moved on by the time the results are known. This does not mean that evaluations should not be done. Although both qualitative and quantitative methods can assist in the development of technologies, eventually the question “does it work?” needs to be answered.23 In questions of effectiveness the randomised controlled trial is the most appropriate research design. When evaluating complex interventions, such as a computerised decision support system, a parallel qualitative study serves to “open the black box” and elucidate why an intervention does or does not work. Here the qualitative interview study enabled us to follow the intervention over a period of time, from different perspectives, without needing to cover preliminary ground on each occasion, and to build on what we already knew about the practice. Thus a combination of qualitative and quantitative methods provided a more thorough evaluation of the intervention than either alone would have done.

Conclusion

Clinicians did not adopt the computerised decision support system because they found it difficult to use and did not perceive it to bring benefits for practice. Key issues included the relevance and accuracy of messages and the flexibility to respond to other factors influencing decision making in primary care. These are important even for simple prompting systems but are multiplied in the more complex systems needed for chronic disease management. Computers have brought benefits to primary care and clearly have an important role in promoting evidence based practice. However, complex decision support systems for chronic disease, integrated into clinical computer systems, are, in their current state of development, unlikely to be widely taken up by general practitioners.

Table. Characteristics of practices included in interview study

Practice identifier                        A   B   C   D   E
Supplier of clinical computing system*     1   1   1   2   2
Level of computerisation                   M   M   PF  M   PF
Number of general practitioner partners    8   3   5   6   5
Vocational training practice               No  No  Yes Yes No

*Clinical computing systems are referred to only by number to ensure respondents' anonymity.
M=mixed paper and computer record system; PF=paper-free record system.

Acknowledgments

We thank the general practitioners, practice nurses, and practice staff in the study practices, especially those who took part in interviews. We also thank David Stables (EMIS Computing); Jon Rogers (Torex Meditel); and Nick Booth, Neil Jones, and Bob Sugden (Sowerby Centre for Health Informatics). Monica Smith advised on the design of this study. Rachel Baker and Tim Rapley gave helpful comments on drafts of the paper. Sylvia Hudson provided secretarial support and transcribed interviews. David Parkin, Ian Purves, and Nick Steen were members of the research team for the wider study.

Footnotes

Funding: NHS R&D programme “Methods to promote the uptake of research findings”; additional funding from EMIS Computing and the Department of Health for England and Wales. The Health Services Research Unit, University of Aberdeen, is funded by the Chief Scientist Office of the Scottish Executive Health Department. EMcC and NR are funded by the UK NHS primary care development programme. The Centre for Health Services Research, University of Newcastle upon Tyne and the Health Services Research Unit, University of Aberdeen are part of the UK MRC Health Services Research Collaboration. The views expressed are those of the authors and not necessarily those of the funding bodies.

Competing interests: None declared.

References

1. Grimshaw J, Freemantle N, Wallace S, Russell I, Hurwitz B, Watt I, et al. Developing and implementing clinical practice guidelines. Qual Health Care. 1995;4:55–64. doi: 10.1136/qshc.4.1.55.
2. Hunt DL, Haynes RB, Hanna SE, Smith K. Effects of computer-based clinical decision support systems on physician performance and patient outcomes. JAMA. 1998;280:1339–1346. doi: 10.1001/jama.280.15.1339.
3. Kaplan B. Evaluating informatics applications—clinical decision support systems literature review. Int J Med Inf. 2001;64:15–37. doi: 10.1016/s1386-5056(01)00183-6.
4. Kaplan B. Evaluating informatics applications—some alternative approaches: theory, social interactionism and call for methodological pluralism. Int J Med Inf. 2001;64:39–56. doi: 10.1016/s1386-5056(01)00184-8.
5. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care. 1998;7:149–158. doi: 10.1136/qshc.7.3.149.
6. Lomas J. Teaching old (and not so old) docs new tricks: effective ways to implement research findings. In: Dunn EV, Norton PG, Stewart M, Tudiver F, Bass MJ, editors. Disseminating research/changing practice. London: Sage; 1994.
7. Rogers EM. Diffusion of innovations. 4th ed. New York: The Free Press; 1995.
8. Eccles M, McColl E, Steen N, Rousseau N, Grimshaw J, Parkin D, et al. Effect of computerised evidence based guidelines on management of asthma and angina in adults in primary care: cluster randomised controlled trial. BMJ. 2002;325:941–947. doi: 10.1136/bmj.325.7370.941.
9. Eccles M, Grimshaw J, Steen N, Parkin D, Purves I, McColl E, et al. The design and analysis of a randomised controlled trial to evaluate computerised decision support in primary care: the COGENT study. Fam Pract. 2000;17:180–186. doi: 10.1093/fampra/17.2.180.
10. Eccles M, McColl E, Steen N, Rousseau N, Grimshaw J, Parkin D. An evaluation of computerised guidelines for the management of two chronic conditions. Newcastle: Centre for Health Services Research; 2002.
11. Yin RK. Case study research: design and methods. 2nd ed. Thousand Oaks, CA: Sage; 1994.
12. Rousseau N, McColl E, Eccles M, Hall L. Qualitative methods in implementation research. In: Thorsen T, Makela M, editors. Changing professional practice: theory and practice of clinical guidelines implementation. Copenhagen: DSI; 1999. pp. 99–116.
13. Seale C. The quality of qualitative research. London: Sage; 1999.
14. Eccles M, Rousseau N, Adams P, Thomas L. Evidence-based guideline for the primary care management of stable angina. Fam Pract. 2001;18:217–222. doi: 10.1093/fampra/18.2.217.
15. Eccles M, Rousseau N, Higgins B, Thomas L. Evidence-based guideline on the primary care management of asthma. Fam Pract. 2001;18:223–229. doi: 10.1093/fampra/18.2.223.
16. Berg M. Problems and promises of the protocol. Soc Sci Med. 1997;44:1081–1088. doi: 10.1016/s0277-9536(96)00235-3.
17. Berg M, Goorman E. The contextual nature of medical information. Int J Med Inf. 1999;56:51–60. doi: 10.1016/s1386-5056(99)00041-6.
18. Elson RB, Connelly DP. Computerized decision support systems in primary care. Med Decis Making. 1995;22:365–384.
19. McColl A, Smith H, White P, Field J. General practitioners' perceptions of the route to evidence based medicine: a questionnaire survey. BMJ. 1998;316:361–365. doi: 10.1136/bmj.316.7128.361.
20. Cabana MD, Rand CS, Powe NR, Wu AW, Wilson MH, Abboud PA, et al. Why don't physicians follow clinical practice guidelines? A framework for improvement. JAMA. 1999;282:1458–1465. doi: 10.1001/jama.282.15.1458.
21. Freeman AC, Sweeney K. Why general practitioners do not implement evidence: qualitative study. BMJ. 2001;323:1100–1102. doi: 10.1136/bmj.323.7321.1100.
22. Litzelman DK, Tierney WM. Physicians' reasons for failing to comply with the computerized preventive care guidelines. J Gen Intern Med. 1996;11:497–499. doi: 10.1007/BF02599049.
23. Eccles M, McColl E, Steen N, Rousseau N, Grimshaw J, Parkin D. Authors' reply to electronic responses. Effect of computerised evidence based guidelines on management of asthma and angina in adults in primary care: cluster randomised controlled trial. BMJ 2002. bmj.com/cgi/eletters/325/7370/941#26964 (accessed 9 Jan 2003).
