Abstract
Context
Adequate symptom management is essential to ensure quality cancer care, but symptom management is not always evidence based. Adapting and automating national guidelines for use at the point of care may enhance use by clinicians.
Objectives
This article reports on a process of adapting research evidence for use in a clinical decision support system that provided individualized symptom management recommendations to clinicians at the point of care.
Methods
Using a modified ADAPTE process, panels of local experts adapted national guidelines and integrated research evidence to create computable algorithms with explicit recommendations for management of the most common symptoms (pain, fatigue, dyspnea, depression, and anxiety) associated with lung cancer.
Results
Small multidisciplinary groups and a consensus panel, using a nominal group technique, modified and subsequently approved computable algorithms for fatigue, dyspnea, moderate pain, severe pain, depression, and anxiety. The approved algorithms represented the consensus of multidisciplinary clinicians on pharmacological and behavioral interventions tailored to the patient’s age, comorbidities, laboratory values, current medications, and patient-reported symptom severity. Algorithms also were reconciled with one another to enable simultaneous management of several symptoms.
Conclusion
A modified ADAPTE process and nominal group technique enabled the development and approval of locally adapted computable algorithms for individualized symptom management in patients with lung cancer. The process was more complex and required more time and resources than initially anticipated, but it resulted in computable algorithms that represented the consensus of many experts.
Keywords: Lung cancer and symptom management algorithms, decision making, decision support systems, guideline implementation, consensus methods
Introduction
Adequate symptom management is essential to ensure quality cancer care.1 Effective management of distressing symptoms is of particular concern in patients with advanced stages of lung cancer. More than 80% of patients with advanced stage lung cancer have disease-related symptoms as well as a high degree of psychological distress at presentation.2,3 Thus, this group of patients represents an ideal group for which to develop and test novel interventions for management of multiple symptoms.
Evidence-based symptom management interventions are available, but these interventions often are not integrated into the practice setting.4,5 Clinical practice guidelines (defined as systematically developed statements of best practices based on the characteristics of the patient population, the illness, or clinical circumstances6) provide a tool for transferring evidence from research into the practice setting to optimize care.7
To address inconsistencies in care and improve the quality of cancer care, a growing number of national and international professional organizations have undertaken the task of examining the research literature to develop clinical practice guidelines intended to provide clinicians with up-to-date information8–11 (Table 1). The development of these guidelines often requires a significant investment of human and financial resources. Although the application of these guidelines is one of the most promising methods for improving patient outcomes, the dissemination and implementation of guidelines in practice settings is often lacking.12 One challenge in using paper-based guidelines is that they may have multiple pathways and comprise many pages, making it difficult to quickly locate the information needed in a busy clinical setting.13 Computerizing guidelines and making them more accessible during routine care has been suggested as 1 strategy to address integration of guidelines into practice settings. A major challenge to computerizing current guidelines is that they are text based and often lack definitive decision points, which are necessary to make them computable.14,15 To make guidelines computable, explicit details are needed for decision points.15 For example, laboratory test threshold levels in decision points need to be numerically defined (e.g., “creatinine clearance < 60 ml/min” as opposed to “low creatinine clearance” or “renal failure”).
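As an illustration, such a threshold can be expressed directly in code. The sketch below is a hypothetical fragment, not taken from the study's algorithms: it computes a Cockcroft-Gault creatinine clearance estimate and applies the explicit "< 60 ml/min" cutoff that a text guideline might state only vaguely as "low creatinine clearance."

```python
def creatinine_clearance(age_years, weight_kg, serum_creatinine_mg_dl, female):
    """Estimate creatinine clearance (ml/min) with the Cockcroft-Gault formula."""
    crcl = ((140 - age_years) * weight_kg) / (72 * serum_creatinine_mg_dl)
    return crcl * 0.85 if female else crcl

def renal_dose_adjustment_needed(crcl_ml_min, threshold=60.0):
    """Explicit, computable decision point: 'creatinine clearance < 60 ml/min'
    rather than the vague 'low creatinine clearance' or 'renal failure'."""
    return crcl_ml_min < threshold
```

A computer can evaluate this decision point automatically from laboratory and demographic data, whereas the text-based phrasing requires a human judgment.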
Table 1. Resources Available for Supportive Care Clinical Practice Guidelines for Cancer Care.
| Resource for CPG | Web site Information | Selected CPG or Resources Available |
|---|---|---|
| ADAPTE | http://www.adapte.org/www/upload/actualite/pdf/Manual%20&%20Toolkit.pdf | ADAPTE is a collaborative group available to promote the adaptation of existing guidelines for use in local contexts. A manual and toolkit are available to standardize the adaptation of clinical practice guidelines and promote quality control. This process is currently undergoing evaluation. Future efforts will be a joint collaborative with the Guidelines International Network |
| American Society for Clinical Oncology | http://www.asco.org/ASCOv2/Practice+%26+Guidelines/Guidelines/Clinical+Practice+Guidelines/Supportive+Care+and+Quality+of+Life | Integration of palliative care into standard oncology care; Antiemetics; Use of epoetin and darbepoetin; Use of white blood cell growth factors; Venous thrombosis prophylaxis and treatment; Platelet transfusion |
| American Thoracic Society | http://www.thoracic.org/statements/resources/respiratory-disease-adults/palliative-care.pdf | Dyspnea; Pain |
| Cochrane Library | http://www.thecochranelibrary.com/view/0/index.html# http://www.thecochranelibrary.com/view/0/browse.html | Anxiety; Delirium; Dyspnea; Fatigue |
| European Society for Medical Oncology | http://www.esmo.org/ http://www.esmo.org/education-research/esmo-clinical-practice-guidelines.html#c3347 | Antiemesis; Febrile neutropenia; Mucositis; Venous thrombosis management |
| Guidelines International Network | http://www.g-i-n.net/ http://www.g-i-n.net/activities/implementation/giranet/guideline-implementability-research-and-application-network | Lymphedema; Delirium; Depression |
| International Association for Hospice and Palliative Care | http://www.hospicecare.com/resources/treatment.htm | Palliative care; Pain |
| Institute of Medicine Report: Clinical Practice Guidelines We Can Trust | http://www.iom.edu/Reports/2011/Clinical-Practice-Guidelines-We-Can-Trust.aspx | Resource for clinical practice guidelines |
| Inventory of Cancer Guidelines | http://www.cancerguidelines.ca/guidelines/inventory/search.php | Pain; Antiemesis |
| Multinational Association of Supportive Care in Cancer | http://www.mascc.org/index.php?option=com_content&view=article&id=167 http://www.mascc.org/guidelines-and-tools | Mucositis; Antiemesis; EGFR inhibitor skin toxicity |
| National Comprehensive Cancer Network | http://www.nccn.org/professionals/physician_gls/f_guidelines.asp#supportive | Anemia; Antiemesis; Distress; Fatigue; Growth factors; Infection; Pain |
| National Guidelines Clearinghouse | http://www.guideline.gov/ | Psychosocial health care for patients with cancer and their families; Prevention and treatment of mucositis |
| Oncology Nursing Society | http://www.ons.org/Research/PEP | Anorexia; Anxiety; Caregiver burden; Constipation; Depression; Fatigue; Hot flashes; Peripheral neuropathy; Sleep-wake disturbances |
Adapting guidelines for use in a variety of local contexts, ranging from a single clinic to one or more institutions, a geographic region, or an entire country, provides the clinician with recommendations at the point of care tailored to available resources and practice norms (called local adaptation) and can be accomplished with currently available technology.16,17 Computer-based algorithms that are explicit, detailed, and automatically computed from the patient's data can standardize clinical decision making and enable individualization of recommendations for patient care.17 Evidence from primary care suggests that the addition of computerized guidelines during the clinical encounter resulted in a twofold increase in clinician adherence with care guidelines in diabetes.18 Moreover, further evidence suggests that more than 90% of clinician-directed clinical decision support interventions have improved patient care when the intervention was delivered as part of the work flow, provided at the time and location of decision making, and gave specific recommendations.19,20
Systematic approaches to transform guidelines into computable, locally individualized algorithms that provide information that can be used quickly at the point of care need to be developed and tested to see if they can improve patient outcomes in oncology.16,21,22 To date, only 3 studies have been conducted in patients with cancer to test the use of paper-based symptom management guidelines to improve patient outcomes: 2 studies focused on pain management and 1 focused on depression.23–25 Results from these studies seem promising, but further studies are needed. Bertsche et al.26 conducted a prospective controlled intervention cohort study testing computerized pain management guidelines in 100 patients with cancer admitted to a radiation oncology inpatient unit (50 patients in the intervention group) and found that deviation from guidelines decreased from 74% to 14% (P < 0.001). In addition, use of co-analgesics increased from 46% to 66% (P = 0.04), and 85% of recommendations were implemented by physicians when suggestions to improve adherence to the guidelines were delivered by e-mail from a pharmacist.
An important approach to increase the use of these computable algorithms is to actively engage expert clinicians, who are the intended future users, in the process of developing the algorithms for implementation in the clinical setting.27,28 Evidence suggests that actively involving the target user in the algorithm development process leads to significant changes in practice.16,29–31 These approaches have the potential to reduce duplication of effort and increase dissemination of evidence-based practice.16,32,33 This article is the first to describe a replicable process for creating computable algorithms for the management of multiple symptoms in an outpatient thoracic oncology setting.21
Methods
Two distinct clinical sites were involved in this project to enhance the socioeconomic and racial diversity of the sample: Dana-Farber Cancer Institute (DFCI), a comprehensive cancer center, and Boston Medical Center (BMC), the largest safety net hospital in the New England region. Institutional Review Board approval was obtained at both clinical sites before implementing the project. Clinicians from all relevant specialties were nominated as potential participants in expert panels by thoracic oncology and palliative care department leadership. All clinicians who agreed to participate as members of the expert panel did so on a voluntary basis. These participants were nominated either because they were experts from the clinical sites or would be responsible for implementing the clinical intervention in the 2 sites after the algorithms were programmed into a usable decision aid. Each nominee was invited by e-mail to participate on the expert panel. A follow-up phone call or in-person meeting was scheduled to answer any questions that the clinicians had about their role and the time commitment involved in participation in the panels. Participating clinicians received a $150 USD honorarium.
Pain, fatigue, dyspnea, and depression were the symptoms initially targeted for developing computable algorithms to enhance symptom management in patients with lung cancer.34 These symptoms were targeted based on preliminary work conducted by the first author identifying that these are the most common symptoms in adults with lung cancer and that national guidelines are available for their management.13,34–42
Four expert panels were convened; each panel was given a specific task to focus on algorithm development for 1 of the target symptoms of pain, dyspnea, depression/anxiety, or fatigue. Each panel comprised a team of multi-disciplinary experts (Appendix; available at jpsmjournal.com); they began with a prototype algorithm template that was developed by a palliative care expert (J.L.A.). Each panel was supported by research staff and had a leader to help facilitate group discussion and guide decisions for adapting the algorithms. One member of the core research team also participated on each expert panel.
To create explicit computable algorithms for symptom management, a 7-step process was used similar to the steps of the ADAPTE process, a framework for adapting clinical guidelines.32,43 Table 2 shows how the various phases of the ADAPTE process were used to guide creation of algorithms described in this article.32,43 The 7 processes coordinated by the core research team included: 1) identifying expert panel members for each target symptom group (i.e., pain, dyspnea, anxiety/depression, fatigue); 2) formulating the questions, and reviewing and synthesizing the literature within each expert symptom panel (the core research team collaborated with the leader of each expert panel before convening the expert panels to formulate the research questions that focused on identifying evidence-based symptom management strategies for each of the target symptoms; these core staff members also conducted a review of the literature focused on peer-reviewed published practice guidelines and/or a published synthesis of the evidence related to the target symptoms in preparation for the expert panel meeting);13,37–40,42,44–55 3) convening each expert symptom panel to translate clinical guidelines into computable algorithms; 4) conducting a peer review of draft algorithms by another symptom expert panel before the consensus panel meeting; 5) holding a multidisciplinary consensus panel meeting to discuss the evidence-based algorithms; 6) revising algorithms based on feedback generated from the review meeting; and 7) reviewing and approving revised computable algorithms. The core research team provided administrative support to the expert and multi-disciplinary panels throughout these processes (Fig. 1).
Table 2. Using the ADAPTE Framework as a Guide to Adapt Clinical Practice Guidelines16,32.
| Phase | Task |
|---|---|
| Setup | Establish expert panel; Determine topic; Check if guidelines are available to adapt; Develop plan and timeline for completion of tasks |
| Adaptation | Determine and clarify the question to be examined; Search for guidelines and related materials; Assess the quality of the guideline; Assess the content and acceptability of the guideline; Prepare a draft of the adapted guideline |
| Finalization | Solicit feedback on draft guidelines from those who will be using it; Acknowledge source guidelines; Produce a final document |
Fig. 1. Process for adapting algorithms.
The nominal group technique (NGT) was used to facilitate discussion during the multi-disciplinary consensus panel meeting.56,57 All members of the core research team and the leaders of each expert panel attended this meeting, along with key opinion leaders from the clinical practices in which the computable algorithms were ultimately going to be used. This technique is a consensus building process and was chosen because it helps to identify creative solutions when a group is making judgmental decisions, the kind that are needed in the development of explicit computable algorithms for symptom management from national guidelines.56 After the group leader for each expert panel presented an overview of a draft algorithm (e.g., moderate pain), panel members were asked to write down their concerns about the draft algorithm, after which the concerns were discussed and key problems recorded on a flip chart. Next, through discussion, the group clarified and prioritized the changes needed. Once discussion was completed for the draft algorithm of each target symptom, a preliminary vote for approval was taken within each panel to identify any remaining areas of concern about the interventions that were proposed by the group. Additional revisions were proposed, and consensus voting was used to recommend acceptance of revisions and/or achieve consensus and approve the algorithms as revised.
Results
Of the 19 clinicians who were nominated and invited to participate as members of the expert panels, 18 (95%) voluntarily accepted the invitation. A range of disciplines were represented including medical oncology, radiation oncology, palliative care, social work, psychiatry, pulmonary medicine, rehabilitation medicine, nursing, and pharmacy. Once the groups were formed, the group leader discussed the specific questions to be addressed and the time lines to complete the tasks. Subsequently, the research staff provided published guidelines and online literature for panel members to review before and during the expert panel meetings where this relevant literature was discussed under the guidance of the core research team and the leaders of each expert panel.37,38,44–55,58–63 Each panel met in person and by phone approximately 4 times to translate the literature into computable algorithms, which were displayed as flow charts.
As the expert panels began their work, it became clear that the prototype algorithms were too simple to provide the comprehensive and explicit recommendations needed in the computable algorithms if they were to be effective in treating each symptom of interest. The computable algorithms needed to be tailored to incorporate key clinical characteristics, symptom severity and other co-occurring symptoms64 reported by each individual patient who completed validated symptom assessment surveys on a tablet computer.
With input from the expert panel members, the computable algorithm for moderate pain grew in complexity to include patient age, co-morbidities such as peptic ulcer disease and impaired renal function, whether the patient was opioid naive or tolerant, the need for adjuvant medications for somatic and/or neuropathic pain and actions to prevent opioid-induced constipation (Fig. 2a [original] and 2b [final]). This same growth in complexity also occurred with the algorithms for managing the other target symptoms, so that they began to resemble more closely how clinicians actually think and practice. Table 3 summarizes the data requirements for each algorithm for computing customized recommendations for individual patients.64 Through this process, anxiety, the presence of constipation, and the intensity of insomnia, anorexia, and nausea were eventually included in the computable algorithms because the clinicians identified them as interrelated to the symptoms of interest;64 however, management recommendations were produced only for pain, fatigue, dyspnea, and depression/anxiety.
Fig. 2. a) Proposed moderate pain algorithm. Adapted from the American Pain Society guidelines.38 b) Actual moderate pain algorithm.
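To give a sense of how such clinical characteristics gate a tailored recommendation, the following is a minimal hypothetical sketch. The branch conditions and suggestion text are illustrative only; the actual published algorithm (Fig. 2b) is considerably more detailed.

```python
def moderate_pain_branch(age, opioid_tolerant, peptic_ulcer_disease,
                         crcl_ml_min, neuropathic_component):
    """Illustrative sketch of how patient characteristics tailor the
    recommendations of a moderate-pain algorithm. All conditions and
    suggestion strings are hypothetical, not the study's algorithm."""
    suggestions = []
    # Opioid-naive vs. opioid-tolerant patients start from different points
    if opioid_tolerant:
        suggestions.append("titrate current opioid")
    else:
        suggestions.append("start low-dose short-acting opioid")
    # Comorbidities constrain drug choices
    if peptic_ulcer_disease:
        suggestions.append("avoid NSAIDs")
    if crcl_ml_min < 60:
        suggestions.append("adjust opioid choice/dose for renal impairment")
    if age >= 65:
        suggestions.append("use lower starting doses")
    # Adjuvants for neuropathic pain
    if neuropathic_component:
        suggestions.append("consider adjuvant for neuropathic pain")
    # Every opioid recommendation is paired with constipation prevention
    suggestions.append("prophylactic bowel regimen")
    return suggestions
```

Each added clinical variable multiplies the number of distinct paths through the algorithm, which is why the flow charts grew well beyond the original prototypes.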
Table 3. Data Requirements for Final Algorithms.
| Algorithm | Data Requirements |
|---|---|
| Pain | Pain severity (none/mild, moderate, severe); Pain characteristics (neuropathic, somatic) and pattern (constant, intermittent); Current opioid use (opioid naive, or current opioid strength and dosage forms); History of peptic ulcer disease; Constipation severity and bowel medication use; Age, platelet count, renal function (creatinine clearance calculation) |
| Dyspnea | Dyspnea severity (none/mild, moderate, severe); Current opioid use (opioid naive, or opioid strengths and dosage forms); Pain severity and characteristics (neuropathic or somatic); Presence of anxiety; History of peptic ulcer; Constipation severity and bowel medication use; Hemoglobin level, renal function (creatinine clearance calculation), age, platelet count |
| Depression | Depression severity (none, mild, moderate, severe); Presence of anhedonia, suicidal ideation, previous report of depression; Medication dosing and start date of 3 medication classes (name the classes); Presence of pain, fatigue, insomnia, anorexia, and/or nausea frequency and severity |
| Anxiety | Anxiety severity (none, mild, moderate, severe); Previous report of anxiety; Presence of pain, dyspnea, insomnia, anorexia, and/or nausea frequency and severity; Medication dosing and start date of 3 medication classes (name the classes) |
| Fatigue | Severity of fatigue (none/mild, moderate, severe); Previous report of fatigue; Presence of anhedonia, pain, dyspnea, depression (none/mild, moderate, severe), and/or insomnia severity; Benzodiazepine use; Hemoglobin level |
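The data requirements for one of these algorithms could be captured as a structured record along the following lines. The field names and types are illustrative assumptions for the pain algorithm, not the study's actual schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PainAlgorithmInputs:
    """Hypothetical structured record mirroring the pain algorithm's
    data requirements in Table 3; names and types are illustrative."""
    pain_severity: str             # "none/mild" | "moderate" | "severe"
    pain_characteristics: str      # "neuropathic" | "somatic"
    pain_pattern: str              # "constant" | "intermittent"
    opioid_naive: bool
    current_opioid: Optional[str]  # strength and dosage form, if not naive
    peptic_ulcer_history: bool
    constipation_severity: str
    bowel_medication_use: bool
    age_years: int
    platelet_count: int
    creatinine_clearance_ml_min: float
```

A record like this makes explicit which patient-reported responses, history items, and laboratory values must be available before the algorithm can compute a recommendation.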
We relied heavily on the expert clinicians to create the algorithms and these paper-based algorithms were then translated into a format that could be interpreted by a computer. Each expert panel was provided with individualized assistance from the palliative care (J.L.A.) and informatics (D.F.L.) experts to ensure that the clinical algorithms that were developed were specific enough and had adequately explicit decision points so that they could be converted into computable logic. In addition, as the work on the algorithms progressed, representatives from the oncology practices from both of the clinical sites reviewed them to ensure that they were compatible with the local formularies and with the services typically covered by the insurance carriers for patients seen at each site.
Once this initial work was completed, each expert panel circulated the draft computable algorithm to another panel for external review before the all-day multidisciplinary consensus panel meeting. Any comments that were provided for improvement were shared with all conference participants before the meeting. This opportunity to review the draft algorithms ahead of time facilitated the peer-review process and enriched the discussions that occurred during the multidisciplinary consensus panel meeting. The consensus panel met from 8 am to 3 pm; lunch was provided on site to increase the efficient use of time of the panel members. Each expert panel presented its draft algorithm(s) to all participants and then NGT was used to obtain feedback.56,57 Other participants included the core research team, opinion leaders from the 2 oncology practices, and relevant experts from both organizations in the areas of palliative care, psychiatry, pulmonary medicine, nursing, social work, medical informatics, radiation oncology, behavioral science, physical therapy, and thoracic oncology.
The most significant concern raised by the conference participants, who were clinical experts in the care of patients with lung cancer, was how to reconcile the computable algorithms to address the needs of patients who had more than 1 of the target symptoms. For example, a patient might have both moderate pain and dyspnea. Although each algorithm recommended a specific dose of opioid, the group offered principles for calculating a joint recommendation to ensure that the doses were not additive and that they would be adequate to address management of both symptoms. Similarly, decisions were made to prioritize recommendations when a patient experienced a combination of certain symptoms.64–66 For example, if a patient had both severe pain and fatigue, the computable algorithms would offer specific pharmacological recommendations to treat the severe pain first, along with nonpharmacological therapies for fatigue, and a suggestion to reassess for fatigue after the pain was under control.65
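The prioritization principle in this example could be encoded as a simple reconciliation rule. The sketch below is hypothetical logic illustrating the idea, not the study's implementation, and the plan strings are invented for the example.

```python
def reconcile(symptoms):
    """Illustrative reconciliation rule for co-occurring symptoms:
    with severe pain plus fatigue, treat the severe pain
    pharmacologically first, offer nonpharmacological fatigue measures,
    and reassess fatigue once the pain is controlled."""
    plan = []
    if symptoms.get("pain") == "severe":
        plan.append("pharmacological treatment of severe pain")
        if "fatigue" in symptoms:
            plan.append("nonpharmacological therapies for fatigue")
            plan.append("reassess fatigue after pain is controlled")
    return plan
```

The joint opioid-dose principle worked the same way: rather than letting two algorithms each emit an independent dose, a shared rule computed a single recommendation adequate for both symptoms.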
During the workshop, computable algorithms for fatigue, dyspnea, severe pain, and anxiety were approved. However, the algorithms for moderate pain and depression were returned to their panels to resolve defined areas of concern, and were subsequently approved and finalized by the panels. The approved computable algorithms represented consensus of multidisciplinary clinicians on appropriate guideline-based recommendations for the management of these symptoms. Recommendations that the algorithms could generate included suggestions for pharmacotherapy, behavioral interventions, referrals (social work, psychiatry, palliative care, respiratory therapy, or physical therapy), and laboratory tests, and were tailored based on each patient's symptom severity, age, comorbidities, laboratory values, and medications taken.
This process was more time- and resource-intensive than originally planned. We anticipated that computable algorithms would be produced in approximately 9 months and programmed in an additional 6 months. However, because of the complexity of the algorithms, the expert panels worked on modifying them and finalizing the flow charts for 9 months; another 3 months were needed for the palliative care and informatics experts to refine them for programming; and testing the effect of each possible combination of input factors (e.g., patient age, renal function, previous opioid use) on the output (i.e., clinical recommendations) took another 12 months because entirely novel programming was required to accommodate the complexity of the tailored recommendations. It would have been ideal to have input from the expert panels to answer questions that arose during the programming process; however, the expert panels completed their work with the production of draft algorithms in the form of flow charts for programming. When clarification was required to program an algorithm, clinician co-investigators were available to answer questions and ensure that the translation was clinically appropriate. The adaptation of the algorithms alone entailed approximately 520 hours of person time, which cost approximately $24,000 USD. We estimate the effort of each panel member as 16 hours (288 hours total for 18 members), of research staff as 220 hours, and of investigators as 72 hours.
Discussion
The integration of symptom management with standard oncology care, beginning with diagnosis and continuing through treatment of advanced stage disease and end of life, is a priority for quality cancer care.1,67–69 A plethora of clinical practice guidelines have been developed to reflect best practices for symptom management and palliative care. However, the development of clinical practice guidelines alone is not sufficient.70 It is imperative to increase dissemination of these guidelines to potential users and, most importantly, increase the use of the suggestions provided by the guidelines to improve patient care.71,72 To date, major efforts have been made toward improving the quality and rigor of the development of guidelines, but less attention has been paid to increasing the effective use of guidelines in clinical practice.8,73 Providing information about the process of creating explicit computable algorithms to guide care of individual patients with cancer is important to build an evidence base in the science of knowledge translation.74
This study is the first to describe the process of creating computable symptom management algorithms from evidence-based guidelines for multiple symptoms. The algorithms incorporate patients' assessments of the severity of their symptoms, the medications they are already taking, and key variables that clinicians take into account as they make decisions on symptom management. These computable algorithms were available to be programmed into a clinical decision support system to test the feasibility of using such a system to deliver specific suggestions to potentially enhance the care of individual patients by thoracic oncology clinicians at 2 institutions. Patient acceptability of electronic self-reporting systems in the outpatient oncology setting has been found to be high; 99% of patients reported that completion of symptom questionnaires on tablet computers was easy, and 88% indicated that they would recommend the use of a self-reporting tool to others.75 Moreover, the use of electronic questionnaires has been found to improve communication about symptoms and decrease distress in patients with cancer.76,77 Bertsche et al.26 have reported that the use of computerized clinical decision support for pain management is promising, with increased adherence to guidelines and an increased use of co-analgesics. Although there is mounting evidence that clinical decision support can be effective, current use and adoption of these types of systems is limited.78 Barriers to adoption of these systems include technical challenges in developing a standardized approach for clinical decision support that can be shared across sites; the absence of a central repository where computerized guideline knowledge can be updated, stored, and shared across sites; and difficulty with integrating clinical decision support into the work flow of clinicians.
Further studies are needed to identify whether these systems enhance patient outcomes and to identify solutions to integrating their use during routine care.
National and international efforts are underway to standardize the development of clinical practice guidelines and adapt them for use within local communities.8–10 In this article, we used a process similar to that proposed by the ADAPTE group.32 The creation and publication of the ADAPTE materials sets the stage for future researchers to engage clinicians in creating computable algorithms from published guidelines using standardized approaches and then assessing the relevance of using this approach for specific contexts of care, such as cancer care (http://www.g-i-n.net/document-store/adapte-resource-toolkit-guideline-adaptation-version-2). The ADAPTE process consists of 3 phases: setup, adaptation, and finalization. Table 2 lists the steps recommended by the ADAPTE framework. Although our project began before the ADAPTE framework was published, we used a similar process and found that it was easy to use and that clinicians were readily engaged in the process of creating explicit, computable, locally adapted clinical algorithms from guidelines. The major difference between our approach and the ADAPTE approach is that we did not have clinicians grade the level of evidence.79 We used clinical practice guidelines that had been approved or published in peer-reviewed journals, so regrading the level of evidence was not necessary.
The core research team actively engaged a multidisciplinary group of expert practitioners to examine published research and evidence-based clinical practice guidelines for treatment of the most common symptoms experienced by patients with lung cancer and then we engaged in peer review at various levels throughout the process of developing the algorithms. We had a high rate of voluntary participation (95%) among the experts who were invited to be involved, which may be related to multiple factors including having strong leadership support for the study; embracing clinicians as expert partners; providing structured guidelines for the process of developing computable algorithms; providing administrative support to access references and other scholarly materials requested by the expert panels; providing an incentive for the work accomplished during the study; and fostering a sense of community. Browman et al.29 also had relatively high rates of participation (72%) among clinicians who responded to a survey about evidence-based recommendations and clinical practice guidelines in cancer care. The results were found to be very useful in shaping the next iteration of guidelines. These studies suggest that embracing clinicians as partners in research and quality improvement efforts is essential and feasible.
Our solution to the challenge of translation of knowledge into practice was to use a community of practice model, defined as a process of building support for members through the creation of teams that focus on generating and sharing knowledge for the purpose of improving organizational performance.80 The community of practice model originated in business but has recently been used in health care to improve performance.81 This approach offers a bottom up rather than a top down social structure for integrating knowledge across care sites, allowing those who are closest to the clinical issues the opportunity to identify clinical problems and then find solutions to improve the quality of care.80,82,83
The clinicians who participated in the expert panels were key to specifying the data requirements of the computable algorithms. The ability to process complex data is at the heart of medical decision making, and the ability to identify strategies for dealing with complex decisions is one of the hallmarks of being an expert clinician.84 The research team encouraged the expert clinicians to articulate the variables they would consider as they made their clinical decisions, especially the decision-making processes they used when managing multiple co-occurring symptoms. We learned from this process that adapting the computable algorithms was more complex and required more time and resources than initially anticipated; future efforts at adapting algorithms for local use may benefit from allowing for this longer time frame. Creating algorithm specifications that can be implemented in a clinical decision support system is much more complicated than creating paper-based guidelines, as the former requires a more explicit and more granular level of detail.85 As noted earlier, decision points must be defined precisely for the algorithms to be computable. The complexity of the computable algorithms increased accordingly and ultimately reflected how expert clinicians actually manage each of the key symptoms, alone or in combination. In turn, creating the algorithms in partnership with expert clinicians enabled the core research team to understand the complex human decision-making processes that occur during decisions for management of co-occurring symptoms. This understanding will provide the foundation for developing computable algorithms for symptom management as we move toward building information systems that can support complex decision making.86
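The contrast between prose guidelines and computable decision points can be illustrated with a minimal sketch. Every field name, threshold, and recommendation below is hypothetical and invented for illustration; none is drawn from the study's actual algorithms. The point is only that a computable rule requires exact cutoffs and exact data elements where a paper guideline might say "consider dose adjustment in frail patients":

```python
from dataclasses import dataclass

@dataclass
class Patient:
    """Minimal patient record; all fields are illustrative placeholders."""
    pain_severity: int           # 0-10 patient-reported score
    age: int
    creatinine_clearance: float  # mL/min; example of a lab value a rule can branch on
    on_opioids: bool

def pain_recommendation(p: Patient) -> str:
    """Return a tailored recommendation branch.

    The thresholds here (3, 6, 30 mL/min) are made up for the example;
    what matters is that each decision point is explicit and unambiguous,
    so a decision support system can evaluate it automatically.
    """
    if p.pain_severity <= 3:
        return "mild: continue current plan; reassess at next visit"
    if p.pain_severity <= 6:
        # Moderate-pain branch, further tailored to renal function.
        if p.creatinine_clearance < 30:
            return "moderate: avoid renally cleared analgesics; avoid NSAIDs"
        return "moderate: short-acting analgesic trial per local formulary"
    # Severe pain (7-10): branch on current medications.
    if p.on_opioids:
        return "severe: titrate existing opioid; urgent clinician review"
    return "severe: initiate opioid; urgent clinician review"
```

A paper guideline leaves such branch points to clinical judgment; a computable algorithm must resolve each one from recorded data, which is why the expert panels had to make every variable and cutoff explicit.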
Acknowledgments
This work was funded by National Cancer Institute grant R01 CA125256 (Mary E. Cooley, Principal Investigator). The R01 CA125256 grant was prepared as part of a Mentored Career Development Award (1 K07 CA92696 to Mary E. Cooley), with Karen M. Emmons, PhD, and Bruce E. Johnson, MD, as mentors.
Biographies
Pain Expert Panel Members
Janet L. Abrahm, MD, Group Co-coordinator Chief, Division of Adult Palliative Care, DFCI Professor, Harvard Medical School
Gail Wilkes, RN, MS, ANP-BC, AOCN Nurse Educator, Department of Nursing, BMC
Bridget Fowler, Pharm D Pharmacist, Psychosocial Oncology and Palliative Care, DFCI
Pamela Calarese, RN, MS, CS Nurse Practitioner, Thoracic Oncology Program, DFCI
Dyspnea Expert Panel Members
Christine Campbell Reardon, MD, Group Co-coordinator Pulmonary Physician, Thoracic Oncology Program, BMC Assistant Professor, Boston University Medical School
Bridget Fowler, Pharm D Pharmacist, Palliative Care, DFCI
Rachelle E. Bernacki, MD, MS Director of Quality Initiatives, Psychosocial and Palliative Care, DFCI Palliative Care Physician, Adult Palliative Care Instructor, Harvard Medical School
Jennifer Kales, NP Nurse Practitioner, Psychosocial and Palliative Care, DFCI
Depression/Anxiety Expert Panel Members
Isidore L. Berenbaum, MD, Group Co-coordinator Psychiatrist, Department of Psychiatry, BMC Assistant Professor, Boston University School of Medicine
Michael Miovic, MD Psychiatrist, Psychosocial Oncology Program, DFCI
Nancy Borstelman, MPH, LICSW Director of Patient and Family Support Services, Care Coordination, DFCI
Kristin Schaefer, MD Director of Residency Education Palliative Care Physician, Adult Palliative Care, DFCI Instructor, Harvard Medical School
Sue Cocchiarella, LICSW Social Work Supervisor, Department of Social Work, BMC
Victor Phantumvanit, Pharm D Pharmacist, Psychosocial Oncology and Palliative Care, DFCI
Margaret McMullin, NP Nurse Practitioner, Thoracic Oncology Program, DFCI
Fatigue Expert Panel Members
Andrea K. Ng, MD, MPH, Group Co-coordinator Radiation Oncologist, DFCI Associate Professor of Medicine, Harvard Medical School
Mary K. Buss, MD Medical Oncology Thoracic Oncologist, Beth Israel Deaconess Medical Center
Linda Arslanian, PT, DPT, MS Director of Rehabilitation Services, Brigham and Women’s Hospital
Victor Phantumvanit, Pharm D Pharmacist, Psychosocial Oncology and Palliative Care, DFCI
Kitty Hooper, RN Program Nurse, Thoracic Oncology Program, DFCI
Footnotes
Disclosures: The authors declare no conflicts of interest.
References
- 1.Ferrell B, Paice J, Koczywas M. New standards and implications for improving the quality of supportive oncology practice. J Clin Oncol. 2008;26:3824–3831. doi: 10.1200/JCO.2007.15.7552. [DOI] [PubMed] [Google Scholar]
- 2.Hopwood P, Stephens R. Symptoms at presentation for treatment in patients with lung cancer: implications for evaluation of palliative care. The Medical Research Council Lung Cancer Working Party. Br J Cancer. 1995;7:633–636. doi: 10.1038/bjc.1995.124. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 3.Earle CC, Weeks JC. The science of quality-of-life measurement in lung cancer. In: Lipscomb J, Gotay CC, Snyder C, editors. Outcomes in assessment in cancer: measures, methods, and applications. New York: Cambridge University Press; 2005. pp. 160–177. [Google Scholar]
- 4.Borneman T, Koczywas M, Cristea M, et al. An interdisciplinary care approach for integration of palliative care in lung cancer. Clin Lung Cancer. 2008;9:352–360. doi: 10.3816/CLC.2008.n.051. [DOI] [PubMed] [Google Scholar]
- 5.Berry DL. Patient-reported symptoms and quality of life integrated into clinical cancer care. Semin Oncol Nurs. 2011;27:203–210. doi: 10.1016/j.soncn.2011.04.005. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 6.Jacobsen PB, Wagner LI. A new quality standard: the integration of psychosocial care into routine cancer care. J Clin Oncol. 2012;30:1154–1159. doi: 10.1200/JCO.2011.39.5046. [DOI] [PubMed] [Google Scholar]
- 7.Pavlidis N. Evidence-based medicine: development and implementation of guidelines in oncology. Eur J Cancer. 2009;45(Suppl 1):468–470. doi: 10.1016/S0959-8049(09)70093-0. [DOI] [PubMed] [Google Scholar]
- 8.Somerfield MR, Einhaus K, Hagerty KL, et al. American Society of Clinical Oncology clinical practice guidelines: opportunities and challenges. J Clin Oncol. 2008;26:4022–4026. doi: 10.1200/JCO.2008.17.7139. [DOI] [PubMed] [Google Scholar]
- 9.Mallory GA. Professional nursing societies and evidence-based practice: strategies to cross the quality chasm. Nurs Outlook. 2010;58:279–286. doi: 10.1016/j.outlook.2010.06.005. [DOI] [PubMed] [Google Scholar]
- 10.Qaseem A, Forland F, Macbeth F, et al. Guidelines International Network: toward international standards for clinical practice guidelines. Ann Intern Med. 2012;156:525–531. doi: 10.7326/0003-4819-156-7-201204030-00009. [DOI] [PubMed] [Google Scholar]
- 11.Gagliardi AR, Brouwers MC, Bhattacharyya OK. The guideline implementability research and application network (GIRAnet): an international collaborative to support knowledge exchange: study protocol. Implement Sci. 2012;7:26. doi: 10.1186/1748-5908-7-26. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Miller M, Kearney N. Guidelines for clinical practice: development, dissemination and implementation. Int J Nurs Stud. 2004;41:813–821. doi: 10.1016/j.ijnurstu.2003.09.005. [DOI] [PubMed] [Google Scholar]
- 13.Benedetti C, Brock C, Cleeland C, et al. NCCN practice guidelines for cancer pain. Oncology (Williston Park) 2000;14:135–150. [PubMed] [Google Scholar]
- 14.Tierney WM, Overhage JM, Takesue BY, et al. Computerizing guidelines to improve care and patient outcomes: the example of heart failure. J Am Med Inform Assoc. 1995;2:316–322. doi: 10.1136/jamia.1995.96073834. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 15.Morris AH. Treatment algorithms and protocolized care. Curr Opin Crit Care. 2003;9:236–240. doi: 10.1097/00075198-200306000-00012. [DOI] [PubMed] [Google Scholar]
- 16.Harrison MB, Legare F, Graham ID, Fervers B. Adapting clinical practice guidelines to local context and assessing barriers to their use. CMAJ. 2010;182:E78–E84. doi: 10.1503/cmaj.081232. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 17.Latoszek-Berendsen A, Tange H, van den Herik HJ, Hasman A. From clinical practice guidelines to computer-interpretable guidelines. A literature overview. Methods Inf Med. 2010;49:550–570. doi: 10.3414/ME10-01-0056. [DOI] [PubMed] [Google Scholar]
- 18.Lobach DF, Hammond WE. Computerized decision support based on a clinical practice guideline improves compliance with care standards. Am J Med. 1997;102:89–98. doi: 10.1016/s0002-9343(96)00382-8. [DOI] [PubMed] [Google Scholar]
- 19.Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330:765. doi: 10.1136/bmj.38398.500764.8F. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 20.Jaspers MW, Smeulers M, Vermeulen H, Peute LW. Effects of clinical decision-support systems on practitioner performance and patient outcomes: a synthesis of high-quality systematic review findings. J Am Med Inform Assoc. 2011;18:327–334. doi: 10.1136/amiajnl-2011-000094. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 21.Kawamoto K, Lobach DF. Design, implementation, use, and preliminary evaluation of SEBASTIAN, a standards-based Web service for clinical decision support. AMIA Annu Symp Proc. 2005:380–384. [PMC free article] [PubMed] [Google Scholar]
- 22.Bright TJ, Wong A, Dhurjati R, et al. Effect of clinical decision-support systems: a systematic review. Ann Intern Med. 2012;157:29–43. doi: 10.7326/0003-4819-157-1-201207030-00450. [DOI] [PubMed] [Google Scholar]
- 23.Du Pen SL, Du Pen AR, Polissar N, et al. Implementing guidelines for cancer pain management: results of a randomized controlled clinical trial. J Clin Oncol. 1999;17:361–370. doi: 10.1200/JCO.1999.17.1.361. [DOI] [PubMed] [Google Scholar]
- 24.Cleeland CS, Portenoy RK, Rue M, et al. Does an oral analgesic protocol improve pain control for patients with cancer? An intergroup study coordinated by the Eastern Cooperative Oncology Group. Ann Oncol. 2005;16:972–980. doi: 10.1093/annonc/mdi191. [DOI] [PubMed] [Google Scholar]
- 25.Passik SD, Kirsh KL, Theobald D, et al. Use of a depression screening tool and a fluoxetine-based algorithm to improve the recognition and treatment of depression in cancer patients. A demonstration project. J Pain Symptom Manage. 2002;24:318–327. doi: 10.1016/s0885-3924(02)00493-1. [DOI] [PubMed] [Google Scholar]
- 26.Bertsche T, Askoxylakis V, Habl G, et al. Multi-disciplinary pain management based on a computerized clinical decision support system in cancer pain patients. Pain. 2009;147:20–28. doi: 10.1016/j.pain.2009.07.009. [DOI] [PubMed] [Google Scholar]
- 27.Smith TJ, Hillner BE. Ensuring quality cancer care by the use of clinical practice guidelines and critical pathways. J Clin Oncol. 2001;19:2886–2897. doi: 10.1200/JCO.2001.19.11.2886. [DOI] [PubMed] [Google Scholar]
- 28.Browman GP, Makarski J, Robinson P, Brouwers M. Practitioners as experts: the influence of practicing oncologists “in-the-field” on evidence-based guideline development. J Clin Oncol. 2005;23:113–119. doi: 10.1200/JCO.2005.06.179. [DOI] [PubMed] [Google Scholar]
- 29.Browman GP, Newman TE, Mohide EA, et al. Progress of clinical oncology guidelines development using the Practice Guidelines Development Cycle: the role of practitioner feedback. J Clin Oncol. 1998;16:1226–1231. doi: 10.1200/JCO.1998.16.3.1226. [DOI] [PubMed] [Google Scholar]
- 30.Fervers B, Carretier J, Bataillard A. Clinical practice guidelines. J Visc Surg. 2010;147:e341–e349. doi: 10.1016/j.jviscsurg.2010.10.010. [DOI] [PubMed] [Google Scholar]
- 31.Capdenat Saint-Martin E, Michel P, Raymond JM, et al. Description of local adaptation of national guidelines and of active feedback for rationalising preoperative screening in patients at low risk from anaesthetics in a French university hospital. Qual Health Care. 1998;7:5–11. doi: 10.1136/qshc.7.1.5. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 32.Fervers B, Burgers JS, Haugh MC, et al. Adaptation of clinical guidelines: literature review and proposition for a framework and procedure. Int J Qual Health Care. 2006;18:167–176. doi: 10.1093/intqhc/mzi108. [DOI] [PubMed] [Google Scholar]
- 33.Groot P, Hommersom A, Lucas P. Adaptation of clinical practice guidelines. Stud Health Technol Inform. 2008;139:121–139. [PubMed] [Google Scholar]
- 34.Cooley ME, Short TH, Moriarty HJ. Symptom prevalence, distress, and change over time in adults receiving treatment for lung cancer. Psychooncology. 2003;12:694–708. doi: 10.1002/pon.694. [DOI] [PubMed] [Google Scholar]
- 35.Cooley ME. Symptoms in adults with lung cancer. A systematic research review. J Pain Symptom Manage. 2000;19:137–153. doi: 10.1016/s0885-3924(99)00150-5. [DOI] [PubMed] [Google Scholar]
- 36.Cooley ME, Short TH, Moriarty HJ. Patterns of symptom distress in adults receiving treatment for lung cancer. J Palliat Care. 2002;18:150–159. [PubMed] [Google Scholar]
- 37.DiSalvo WM, Joyce MM, Tyson LB, Culkin AE, Mackay K. Putting evidence into practice: evidence-based interventions for cancer-related dyspnea. Clin J Oncol Nurs. 2008;12:341–352. doi: 10.1188/08.CJON.341-352. [DOI] [PubMed] [Google Scholar]
- 38.Gordon DB, Dahl JL, Miaskowski C, et al. American Pain Society recommendations for improving the quality of acute and cancer pain management. American Pain Society Quality of Care Task Force. Arch Intern Med. 2005;165:1574–1580. doi: 10.1001/archinte.165.14.1574. [DOI] [PubMed] [Google Scholar]
- 39.Mock V, Atkinson A, Barsevick A, et al. NCCN practice guidelines for cancer-related fatigue. Oncology (Williston Park) 2000;14:151–161. [PubMed] [Google Scholar]
- 40.Sheldon LK, Swanson S, Dolce A, Marsh K, Summers J. Putting evidence into practice: evidence-based interventions for anxiety. Clin J Oncol Nurs. 2008;12:789–797. doi: 10.1188/08.CJON.789-797. [DOI] [PubMed] [Google Scholar]
- 41.Alexopoulos GS, Katz IR, Reynolds CF, 3rd, et al. Pharmacotherapy of depression in older patients: a summary of the expert consensus guidelines. J Psychiatr Pract. 2001;7:361–376. doi: 10.1097/00131746-200111000-00003. [DOI] [PubMed] [Google Scholar]
- 42.Fulcher CD, Badger T, Gunter AK, Marrs JA, Reese JM. Putting evidence into practice: interventions for depression. Clin J Oncol Nurs. 2008;12:131–140. doi: 10.1188/08.CJON.131-140. [DOI] [PubMed] [Google Scholar]
- 43.Fervers B, Burgers JS, Voellinger R, et al. Guideline adaptation: an approach to enhance efficiency in guideline development and improve utilisation. BMJ Qual Saf. 2011;20:228–236. doi: 10.1136/bmjqs.2010.043257. [DOI] [PubMed] [Google Scholar]
- 44.Dworkin RH, O'Connor AB, Backonja M, et al. Pharmacologic management of neuropathic pain: evidence-based recommendations. Pain. 2007;132:237–251. doi: 10.1016/j.pain.2007.08.033. [DOI] [PubMed] [Google Scholar]
- 45.Dy SM, Asch SM, Naeim A, et al. Evidence-based standards for cancer pain management. J Clin Oncol. 2008;26:3879–3885. doi: 10.1200/JCO.2007.15.9517. [DOI] [PubMed] [Google Scholar]
- 46.Del Fabbro E, Dalal S, Bruera E. Symptom control in palliative care–Part III: dyspnea and delirium. J Palliat Med. 2006;9:422–436. doi: 10.1089/jpm.2006.9.422. [DOI] [PubMed] [Google Scholar]
- 47.Ben-Aharon I, Gafter-Gvili A, Paul M, Leibovici L, Stemmer SM. Interventions for alleviating cancer-related dyspnea: a systematic review. J Clin Oncol. 2008;26:2396–2404. doi: 10.1200/JCO.2007.15.5796. [DOI] [PubMed] [Google Scholar]
- 48.Dy SM, Lorenz KA, Naeim A, et al. Evidence-based recommendations for cancer fatigue, anorexia, depression, and dyspnea. J Clin Oncol. 2008;26:3886–3895. doi: 10.1200/JCO.2007.15.9525. [DOI] [PubMed] [Google Scholar]
- 49.Lorenz KA, Lynn J, Dy SM, et al. Evidence for improving palliative care at the end of life: a systematic review. Ann Intern Med. 2008;148:147–159. doi: 10.7326/0003-4819-148-2-200801150-00010. [DOI] [PubMed] [Google Scholar]
- 50.Viola R, Kiteley C, Lloyd NS, et al. The management of dyspnea in cancer patients: a systematic review. Support Care Cancer. 2008;16:329–337. doi: 10.1007/s00520-007-0389-6. [DOI] [PubMed] [Google Scholar]
- 51.Rizzo JD, Somerfield MR, Hagerty KL, et al. Use of epoetin and darbepoetin in patients with cancer: 2007 American Society of Clinical Oncology/American Society of Hematology clinical practice guideline update. J Clin Oncol. 2008;26:132–149. doi: 10.1200/JCO.2007.14.3396. [DOI] [PubMed] [Google Scholar]
- 52.Wright JR, Ung YC, Julian JA, et al. Randomized, double-blind, placebo-controlled trial of erythropoietin in non-small-cell lung cancer with disease-related anemia. J Clin Oncol. 2007;25:1027–1032. doi: 10.1200/JCO.2006.07.1514. [DOI] [PubMed] [Google Scholar]
- 53.Kangas M, Bovbjerg DH, Montgomery GH. Cancer-related fatigue: a systematic and meta-analytic review of non-pharmacological therapies for cancer patients. Psychol Bull. 2008;134:700–741. doi: 10.1037/a0012825. [DOI] [PubMed] [Google Scholar]
- 54.Page MS, Berger AM, Johnson LB. Putting evidence into practice: evidence-based interventions for sleep-wake disturbances. Clin J Oncol Nurs. 2006;10:753–767. doi: 10.1188/06.CJON.753-767. [DOI] [PubMed] [Google Scholar]
- 55.Mitchell SA, Beck SL, Hood LE, Moore K, Tanner ER. Putting evidence into practice: evidence-based interventions for fatigue during and following cancer and its treatment. Clin J Oncol Nurs. 2007;11:99–113. doi: 10.1188/07.CJON.99-113. [DOI] [PubMed] [Google Scholar]
- 56.Pena A, Estrada CA, Soniat D, Taylor B, Burton M. Nominal group technique: a brainstorming tool for identifying areas to improve pain management in hospitalized patients. J Hosp Med. 2012;7:416–420. doi: 10.1002/jhm.1900. [DOI] [PubMed] [Google Scholar]
- 57.Fink A, Kosecoff J, Chassin M, Brook RH. Consensus methods: characteristics and guidelines for use. Am J Public Health. 1984;74:979–983. doi: 10.2105/ajph.74.9.979. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 58.Borneman T, Piper BF, Sun VC, et al. Implementing the Fatigue Guidelines at one NCCN member institution: process and outcomes. J Natl Compr Canc Netw. 2007;5:1092–1101. doi: 10.6004/jnccn.2007.0090. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 59.Mock V, Atkinson A, Barsevick AM, et al. Cancer-related fatigue. Clinical Practice Guidelines in Oncology. J Natl Compr Canc Netw. 2007;5:1054–1078. doi: 10.6004/jnccn.2007.0088. [DOI] [PubMed] [Google Scholar]
- 60.Qaseem A, Snow V, Shekelle P, et al. Evidence-based interventions to improve the palliative care of pain, dyspnea, and depression at the end of life: a clinical practice guideline from the American College of Physicians. Ann Intern Med. 2008;148:141–146. doi: 10.7326/0003-4819-148-2-200801150-00009. [DOI] [PubMed] [Google Scholar]
- 61.Lloyd-Williams M, Reeve J, Kissane D. Distress in palliative care patients: developing patient-centred approaches to clinical management. Eur J Cancer. 2008;44:1133–1138. doi: 10.1016/j.ejca.2008.02.032. [DOI] [PubMed] [Google Scholar]
- 62.Bjelland I, Dahl AA, Haug TT, Neckelmann D. The validity of the Hospital Anxiety and Depression Scale. An updated literature review. J Psychosom Res. 2002;52:69–77. doi: 10.1016/s0022-3999(01)00296-3. [DOI] [PubMed] [Google Scholar]
- 63.Jacobsen PB, Jim HS. Psychosocial interventions for anxiety and depression in adult cancer patients: achievements and challenges. CA Cancer J Clin. 2008;58:214–230. doi: 10.3322/CA.2008.0003. [DOI] [PubMed] [Google Scholar]
- 64.Kirkova J, Walsh D, Aktas A, Davis MP. Cancer symptom clusters: old concept but new data. Am J Hosp Palliat Care. 2010;27:282–288. doi: 10.1177/1049909110364048. [DOI] [PubMed] [Google Scholar]
- 65.Fleishman SB. Treatment of symptom clusters: pain, depression, and fatigue. J Natl Cancer Inst Monogr. 2004;32:119–123. doi: 10.1093/jncimonographs/lgh028. [DOI] [PubMed] [Google Scholar]
- 66.Williams LA. Clinical management of symptom clusters. Semin Oncol Nurs. 2007;23:113–120. doi: 10.1016/j.soncn.2007.01.006. [DOI] [PubMed] [Google Scholar]
- 67.Smith TJ, Temin S, Alesi ER, et al. American Society of Clinical Oncology provisional clinical opinion: the integration of palliative care into standard oncology care. J Clin Oncol. 2012;30:880–887. doi: 10.1200/JCO.2011.38.5161. [DOI] [PubMed] [Google Scholar]
- 68.Von Roenn JH, Temel J. The integration of palliative care and oncology: the evidence. Oncology (Williston Park) 2011;25:1258–1265. [PubMed] [Google Scholar]
- 69.Abernethy AP, Wheeler JL, Currow DC. Utility and use of palliative care screening tools in routine oncology practice. Cancer J. 2010;16:444–460. doi: 10.1097/PPO.0b013e3181f45df0. [DOI] [PubMed] [Google Scholar]
- 70.Lugtenberg M, Burgers JS, Westert GP. Effects of evidence-based clinical practice guidelines on quality of care: a systematic review. Qual Saf Health Care. 2009;18:385–392. doi: 10.1136/qshc.2008.028043. [DOI] [PubMed] [Google Scholar]
- 71.Gagliardi AR, Brouwers MC, Palda VA, Lemieux-Charles L, Grimshaw JM. How can we improve guideline use? A conceptual framework of implementability. Implement Sci. 2011;6:26. doi: 10.1186/1748-5908-6-26. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 72.Grimshaw J, Eccles M, Thomas R, et al. Toward evidence-based quality improvement. Evidence (and its limitations) of the effectiveness of guideline dissemination and implementation strategies 1966-1998. J Gen Intern Med. 2006;21(Suppl 2):S14–S20. doi: 10.1111/j.1525-1497.2006.00357.x. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 73.Grimshaw JM, Thomas RE, MacLennan G, et al. Effectiveness and efficiency of guideline dissemination and implementation strategies. Health Technol Assess. 2004;8:iii–iv, 1–72. doi: 10.3310/hta8060. [DOI] [PubMed] [Google Scholar]
- 74.Browman GP. Challenges in knowledge translation: the early years of Cancer Care Ontario's Program in Evidence-Based Care. Curr Oncol. 2012;19:27–35. doi: 10.3747/co.19.985. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 75.Abernethy AP, Herndon JE, 2nd, Wheeler JL, et al. Feasibility and acceptability to patients of a longitudinal system for evaluating cancer-related symptoms and quality of life: pilot study of an e/Tablet data-collection system in academic oncology. J Pain Symptom Manage. 2009;37:1027–1038. doi: 10.1016/j.jpainsymman.2008.07.011. [DOI] [PubMed] [Google Scholar]
- 76.Berry DL, Blumenstein BA, Halpenny B, et al. Enhancing patient-provider communication with the electronic self-report assessment for cancer: a randomized trial. J Clin Oncol. 2011;29:1029–1035. doi: 10.1200/JCO.2010.30.3909. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 77.Ruland CM, Holte HH, Roislien J, et al. Effects of a computer-supported interactive tailored patient assessment tool on patient care, symptom distress, and patients' need for symptom management support: a randomized clinical trial. J Am Med Inform Assoc. 2010;17:403–410. doi: 10.1136/jamia.2010.005660. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 78.Sittig DF, Wright A, Ash JS, Middleton B. A set of preliminary standards recommended for achieving a national repository of clinical decision support interventions. AMIA Annu Symp Proc. 2009;2009:614–618. [PMC free article] [PubMed] [Google Scholar]
- 79.Burgers JS, Fervers B, Haugh M, et al. International assessment of the quality of clinical practice guidelines in oncology using the Appraisal of Guidelines and Research and Evaluation Instrument. J Clin Oncol. 2004;22:2000–2007. doi: 10.1200/JCO.2004.06.157. [DOI] [PubMed] [Google Scholar]
- 80.Li LC, Grimshaw JM, Nielsen C, et al. Use of communities of practice in business and health care sectors: a systematic review. Implement Sci. 2009;4:27. doi: 10.1186/1748-5908-4-27. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 81.Fischer GOJ. Knowledge management: problems, promises, realities, and challenges. IEEE Intell Syst. 2001;16:60–72. [Google Scholar]
- 82.Bentley C, Browman GP, Poole B. Conceptual and practical challenges for implementing the communities of practice model on a national scale–a Canadian cancer control initiative. BMC Health Serv Res. 2010;10:3. doi: 10.1186/1472-6963-10-3. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 83.Ranmuthugala G, Plumb JJ, Cunningham FC, et al. How and why are communities of practice established in the healthcare sector? A systematic review of the literature. BMC Health Serv Res. 2011;11:273. doi: 10.1186/1472-6963-11-273. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 84.Kushniruk AW. Analysis of complex decision-making processes in health care: cognitive approaches to health informatics. J Biomed Inform. 2001;34:365–376. doi: 10.1006/jbin.2001.1021. [DOI] [PubMed] [Google Scholar]
- 85.Morris AH. Developing and implementing computerized protocols for standardization of clinical decisions. Ann Intern Med. 2000;132:373–383. doi: 10.7326/0003-4819-132-5-200003070-00007. [DOI] [PubMed] [Google Scholar]
- 86.Mukabatsinda C, Nguyen J, Bisig B, et al. Is increasing complexity of algorithms the price for higher accuracy? Virtual comparison of three algorithms for tertiary level management of chronic cough in people living with HIV in a low-income country. BMC Med Inform Decis Mak. 2012;12:2. doi: 10.1186/1472-6947-12-2. [DOI] [PMC free article] [PubMed] [Google Scholar]
