ABSTRACT
Chronic pain is one of the most common presenting problems in primary care. Standards and guidelines have been developed for managing chronic pain, but it is unclear whether primary care providers routinely engage in guideline-concordant care. The purpose of this study was to develop a tool for extracting information about the quality of pain care in the primary care setting. Quality indicators were developed through review of the literature, input from an interdisciplinary panel of pain experts, and pilot testing. A comprehensive coding manual was developed, and inter-rater reliability was established. The final tool consists of 12 dichotomously scored indicators assessing quality and documentation of pain care in three domains: assessment, treatment, and reassessment. Presence of indicators varied widely. The tool is reliable and can be used to gather valuable information about pain management in the primary care setting.
KEYWORDS: Chronic pain, Primary care, Quality indicators, Chart extraction, Chart review
Chronic non-cancer pain is one of the most common presenting problems in outpatient settings [1–4]. The number of Americans with chronic pain exceeds those with cancer, diabetes, and heart disease combined [5], and treatment for chronic pain in the USA costs more than $600 billion annually [6]. Despite the high prevalence of chronic pain and the substantial cost of treating it, pain remains poorly managed in health care. Common practice errors in pain management have been identified, and standards and guidelines for pain management have been published. However, improvements in guideline-concordant care remain modest at best [7], despite research showing that effective guideline implementation positively impacts health-care outcomes, including patient functioning [8], health-care costs [9], and rates of unnecessary tests and procedures [10].
To our knowledge, only two metrics have been developed to assess the quality of pain management in the primary care setting. Krebs and colleagues [11], in a study examining concordance rates between documented care and patient report of care, developed a tool to assess pain care processes within an academic internal medicine clinic. Information extracted from medical records included two pain assessment-related indicators (i.e., any mention of the location or duration of pain and the ordering of any diagnostic test) and four treatment-related indicators (i.e., providing advice on pain management, initiating an analgesic medication, modifying the current medication regimen, or recommending non-pharmacologic treatment). Similarly, Corson and colleagues [12] developed the Pain Process Checklist (PPC), which included three pain-specific indicators (e.g., whether pain severity was assessed quantitatively), three pain-related psychosocial indicators (e.g., whether depression was assessed), and two treatment-related indicators (e.g., whether side effects of opioid medication were addressed). In both studies, results suggested that primary care providers (PCPs) tended to document basic pain assessments but failed to document a comprehensive pain care plan or to specify the actual treatments provided.
As part of a quality improvement project, we sought to develop a reliable tool to evaluate pain-relevant performance metrics and to expand on previous chart extraction tools in several important areas. Given the importance of a comprehensive pain assessment for improving diagnosis and treatment, we sought to capture more detailed information about the specific factors addressed during the pain assessment. For example, consistent with guidelines outlining key aspects of a quality pain assessment, we included indicators on pain description/sensation and on the impact of pain on patient functioning [13–15]. Similarly, the tool gathers specific information about treatments and interventions, including not only pain medications but also consults to pain specialty services and patient education. Finally, our tool differs from previous tools by including an indicator for pain reassessment, that is, assessment of the effects of an intervention, which plays a vital role in quality pain care. The reliability and validity of the tool were evaluated throughout the development process and during pilot testing to support its psychometric properties.
METHOD
Tool development
The tool was designed to extract information about the quality and documentation of pain assessment, pain treatment, and reassessment during primary care appointments. Tool development was informed by the methodology used in creating other chart extraction tools [16] and by proposed methodological guidelines for retrospective chart review research [17, 18]. After problem identification and definition of goals for the tool, the literature and current guidelines and recommendations regarding aspects of quality pain care were reviewed. Information gathered through this process was discussed at team meetings, where evidence for guidelines and recommendations was cross-checked against the literature on quality pain care.
Provisional quality indicators were developed and grouped into three domains: pain assessment, pain treatment, and reassessment. Pain assessment targets information gathered by the PCP to aid diagnosis, factors that contribute to or alleviate pain, and the impact of pain on patient functioning. Pain treatment includes actions taken to treat pain, such as entering a consult for specialty services, ordering a diagnostic test, prescribing a medication, and/or providing education/information. Pain reassessment addresses whether PCPs check in with patients about the effectiveness of current pain treatments and whether pain and/or functioning has changed since the previous visit. After reviewing the complexity of progress note text and considering various methods for scoring extracted data, we decided that each indicator would be scored dichotomously as present or absent in each progress note. Provisional indicators were presented to a multidisciplinary panel of providers and researchers with backgrounds and expertise in pain management and treatment from the disciplines of psychology, internal medicine, nursing, and pain medicine.
After careful revision of the indicators by the multidisciplinary panel, an initial draft of the coding manual was developed. The coding manual consisted of operational definitions for each domain and for the individual indicators, guidelines for how to code passages of text that occurred frequently in PCP notes, and examples of what would and would not meet criteria for each indicator. The coding manual was further refined during pilot testing (see below). While the final version of the manual was extensive, a modified, brief version can be seen in the Appendix (Table 2). The tool was pilot tested by two independent coders on a sample of 200 PCP progress notes from patients’ EHRs and demonstrated evidence of inter-rater reliability (κ ≥ 0.75). Additional information regarding the reliability of the tool is presented below.
Table 2.
Definitions and examples of indicators
| Indicator | Description | Example |
|---|---|---|
| Presence of pain | The word “pain” must be explicitly mentioned in the note | Examples: “Pain 8/10”, “c/o pain,” “cc: pain,” “returns to clinic for pain management,” “chronic pain,” “lower back pain,” “no pain today” Exception: “no chest pain” |
| Cause/source of pain | Etiology, location/site, and/or diagnosis contributing to pain | Examples: DJD, osteoarthritis, injury, headaches, reference to site/location of pain (e.g., knee pain, lower back pain, joint pain). |
| Review of pain-related assessment/consult | Reference to, or inclusion of, findings/results of pain-related consults, assessments, and/or imaging | Examples: findings/results from a consult/MRI/x-ray/other test for pain/pain-related condition included in the note; can be considered a review even if the notes appear to be copied/pasted from another note/chart as long as it is clearly pain-related. |
| Pain sensation | Patient description of the feeling/sensation of pain | Examples: “sharp,” “dull,” “burning,” “electric,” “throbbing,” etc. |
| Constant vs. intermittent pain | Any indication in the text of the note that pain sensation is constant or intermittent | Examples: “constant, relentless, always present” versus “intermittent, comes and goes, fluctuates”. |
| Alleviating/exacerbating factors | Any indication in the text of the note of factors (activities, positions, temperature, psychosocial factors, etc.), other than medications or other pharmacological interventions that make pain better or worse | Examples: “Walking more than 2 blocks,” “lying down,” “standing more than 5 minutes,” “rain,” “cold weather,” “feeling stressed.” |
| Functional assessment | Reference to ways in which pain impacts or interferes with one’s life. Can be physical or emotional; must be clear that the effect on functioning being described is due to pain; needs to be more than “disabled” | Examples: “Because of pain cannot work,” “pain wakes him up at night,” “uses wheelchair because of pain.” |
| Pain medication order | Any medication intervention – a new medication, change of dosage, or continuation of current medication regimen. Also, indicate yes if a pain medication is included in the medication list within the note. If a medication is simply included in the med list but not further discussed, this would still count as a “yes” for this item; must be a PRESCRIPTION medication order for pain | Examples: Morphine, Naproxen, Oxycodone, Gabapentin, etc. Exception: Medications that may help pain that are not prescribed by a medical provider (e.g., “OTC Advil”) |
| Pain-related consult | Reference to entering a new pain-related consult. Does not include statements that a prior consult has been completed or that patient is continuing to follow-up with another pain-related service | Examples: “Refer to Neurology for neuropathy,” “refer to rehab medicine for home traction,” “brace clinic for knee brace,” “podiatry/prosthetics for orthotics for foot pain.” |
| Provision of education/information | Provision of pain-related education, information, handouts, etc. | Examples: “discussed stretching for pain,” “counseled patient on importance of exercise for pain,” “reviewed use of ice/heat.” Exception: Education specifically regarding instructions for pain medication. |
| Diagnostic intervention | Any intervention conducted to clarify pain diagnosis/treatment | Examples: X-ray, MRI, EMG. |
| Reassessment | Evaluation of the effectiveness of the current treatment plan/pain care | Examples: “states medication is effective,” “meds help pain somewhat,” “patient states PT didn’t help,” “injections not working,” “pain stable with Percocet.” Exception: does not include reassessment of pain itself, without connection to effectiveness of the treatment plan (e.g., “still has pain,” “pain worse”). |
After pilot testing, the tool and coding manual were again reviewed by the panel to ensure that coding domains and indicators were operationally defined and mutually exclusive and to review any indicators for which frequency of endorsement was particularly low. The final tool consisted of an indicator extracting the patient’s pain rating at the visit and 12 dichotomously scored indicators assessing quality and documentation of pain care in three domains: assessment, treatment, and reassessment.
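To make the scoring scheme concrete, the sketch below shows one way the coded data could be structured: one record per progress note, holding the documented pain rating and a 0/1 value for each of the 12 indicators. This is an illustrative representation only (the field names are ours, not the authors’ actual data capture form), but it shows how indicator frequencies can be tallied across notes.

```python
# Illustrative only: one possible representation of a coded progress note, with the
# pain rating plus the 12 dichotomous indicators grouped by domain. Field names are
# assumptions made for this sketch, not the tool's actual data capture form.
INDICATORS = [
    # Assessment domain
    "presence_of_pain", "cause_source_of_pain", "review_of_pain_related_assessment",
    "pain_sensation", "constant_vs_intermittent", "alleviating_exacerbating_factors",
    "functional_assessment",
    # Treatment domain
    "pain_medication_order", "pain_related_consult",
    "provision_of_education", "diagnostic_intervention",
    # Reassessment domain
    "reassessment",
]

def indicator_frequencies(coded_notes):
    """Percentage of notes in which each indicator was coded as present (1)."""
    n = len(coded_notes)
    return {ind: 100.0 * sum(note[ind] for note in coded_notes) / n for ind in INDICATORS}

# Example: a single coded note (hypothetical values)
note = {"note_id": "0001", "pain_rating": 8, **{ind: 0 for ind in INDICATORS}}
note.update(presence_of_pain=1, cause_source_of_pain=1, pain_medication_order=1)
print(indicator_frequencies([note])["presence_of_pain"])  # 100.0
```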
Sample
All data for this study were gathered through extraction from EHRs; a HIPAA waiver of consent was granted by the local Institutional Review Board, which approved this study. A random sample of 200 patients enrolled for care at a single VA health-care facility who reported moderate to severe pain (pain intensity ≥4 on a 0 [no pain] to 10 [worst pain imaginable] numeric rating scale) was identified, and primary care progress notes for these patients were examined. Cases were excluded if the patient did not have a routine primary care visit with their PCP within the 1-year time period (July 2009 to June 2010). More specifically, cases were excluded if (1) the only primary care encounter was one in which the patient saw only the nurse and not the PCP, (2) the only primary care encounter was an unscheduled or urgent rather than routine visit, (3) the only primary care encounter was an initial rather than follow-up visit with the PCP, or (4) there were no primary care encounters within the time period. Cases were also excluded if opioids were being prescribed for cancer pain. After applying these exclusion criteria, 153 cases remained. For each case, the first PCP progress note within the time period was evaluated using the tool.
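As a rough illustration of the case-selection logic just described (not the actual query used in the study; the record fields below are hypothetical), the exclusion criteria could be expressed as a simple filter over per-patient encounter records:

```python
# Sketch of the exclusion logic under assumed field names. A case is retained only if
# the patient had at least one routine, scheduled, follow-up visit with the PCP during
# the 1-year window, and opioids were not being prescribed for cancer pain.
def eligible(case):
    if case.get("opioids_for_cancer_pain"):
        return False
    return any(
        enc["saw_pcp"]                         # excludes nurse-only encounters (criterion 1)
        and enc["visit_type"] == "routine"     # excludes unscheduled/urgent visits (criterion 2)
        and not enc["initial_visit"]           # excludes initial PCP visits (criterion 3)
        for enc in case.get("encounters", [])  # empty list: no encounters in the period (criterion 4)
    )

example_case = {
    "opioids_for_cancer_pain": False,
    "encounters": [{"saw_pcp": True, "visit_type": "routine", "initial_visit": False}],
}
print(eligible(example_case))  # True -> case retained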
Basic provider information was also collected during the chart review process. Of the providers, 81.5 % were attending physicians, 13.6 % were advanced practice registered nurses or physician assistants, and 4.8 % were medical residents.
Data analysis
Descriptive analyses were conducted to evaluate mean pain scores and frequencies of presence of each indicator. Inter-rater reliability was assessed using Cohen’s kappa.
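For readers unfamiliar with the statistic, the short sketch below illustrates how Cohen’s kappa is computed for one dichotomously scored indicator rated by two coders; it is a generic illustration, not the analysis code used in the study. The same value can also be obtained with standard statistical packages (e.g., sklearn.metrics.cohen_kappa_score in Python).

```python
# Cohen's kappa for two coders' dichotomous (0 = absent, 1 = present) codes of the same
# notes: kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e is
# the agreement expected by chance from each coder's marginal rates. Assumes p_e < 1.
def cohens_kappa(coder_a, coder_b):
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n   # observed agreement
    p_a = sum(coder_a) / n                                    # coder A's rate of "present"
    p_b = sum(coder_b) / n                                    # coder B's rate of "present"
    p_e = p_a * p_b + (1 - p_a) * (1 - p_b)                   # chance agreement
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes for one indicator across 10 notes (not study data)
coder_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
coder_2 = [1, 0, 0, 1, 0, 0, 1, 1, 0, 1]
print(round(cohens_kappa(coder_1, coder_2), 2))  # 0.8
```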
RESULTS
The sample consisted of 153 individuals receiving primary care services at a VA health-care facility from July 2009 to June 2010. Mean pain score using the numeric pain rating scale was 6.42 (SD = 1.80). A random sample of 50 cases was selected to assess inter-rater reliability. Two coders extracted information using the tool. Inter-rater reliability for each item is presented in Table 1. Reliability ranged from κ = 0.56 to κ = 1.00.
Table 1.
Inter-rater reliability
| Indicator | κ | p | CI |
|---|---|---|---|
| Presence of pain | 1.00 | <0.001 | |
| Cause/source of pain | 1.00 | <0.001 | |
| Review of pain-related assessment/consult | 0.80 | <0.001 | 0.63–0.97 |
| Pain sensation | 0.63 | <0.001 | 0.39–0.87 |
| Constant vs. intermittent pain | 0.62 | <0.001 | 0.39–0.86 |
| Alleviating/exacerbating factors | 0.65 | <0.001 | 0.42–0.88 |
| Functional assessment | 0.58 | <0.001 | 0.35–0.80 |
| Pain medication order | 0.79 | <0.001 | 0.39–1.19 |
| Pain-related consult | 0.77 | <0.001 | 0.55–0.98 |
| Provision of education/information | 0.59 | <0.001 | 0.33–0.84 |
| Diagnostic intervention | 0.69 | <0.001 | 0.37–1.02 |
| Reassessment | 0.56 | <0.001 | 0.31–0.82 |
κ Cohen’s kappa, CI confidence interval
Figure 1 shows the percentage of cases in which each indicator was present in the PCP progress notes. In the overwhelming majority of cases, both the presence of pain (95.4 %) and the cause or source of pain (94.8 %) were documented. All remaining indicators, however, were documented far less frequently, with percentages ranging from a high of 56.9 % (pain medication order) to a low of 8.5 % (diagnostic intervention).
Fig 1.
Frequencies of documentation in PCP progress notes (%)
DISCUSSION
While pain is often the primary concern of patients in primary care appointments, it frequently does not receive the time and attention that patients desire [19, 20]. Our tool was developed to assess the quality of pain care and documentation in the primary care setting, with specific focus on the quality of pain assessment, pain treatment, and reassessment. We have demonstrated that the tool is reliable and can be utilized to gather valuable information about pain assessment, treatment, and reassessment during primary care appointments.
The tool was pilot tested to assess reliability and to examine strengths and weaknesses in pain care within the primary care setting at a VA health-care facility. We found that among individuals reporting moderate to severe pain, some type of assessment was conducted in most cases (i.e., acknowledgement of the presence and source of pain), but the assessment was rarely comprehensive. Other critical aspects of pain assessment and treatment, such as the patient’s description of the quality of pain, evaluation of factors that alleviate or exacerbate pain, functional assessment, and provision of pain-related education, occurred relatively infrequently. These findings are consistent with past studies reporting that these factors are often absent from documentation of clinical encounters [11, 12, 21], despite the fact that guidelines for quality pain management and studies examining pain-related outcomes typically emphasize their importance [14, 15, 22–25]. Similarly, reassessment of treatment occurred in only about one third of encounters with patients reporting moderate to severe pain, further reinforcing that pain management is often not a high priority and is frequently overlooked in the primary care setting [4, 6], or perhaps indicating that, in the absence of a specific treatment plan, there is little to reassess at a subsequent visit.
Inter-rater reliability ranged from moderate to excellent (κ = 0.56 to κ = 1.00), and some coder drift was observed after training. Future studies and/or quality improvement programs could further improve inter-rater reliability by periodically assessing reliability throughout coding, providing feedback to raters, and thereby improving coding fidelity.
Extracted information can be useful in a number of ways at the provider, service, and organizational levels. Individual providers could benefit from feedback about how frequently they engage in and document key behaviors consistent with guideline-concordant pain management. The tool can also be used to evaluate educational interventions aimed at improving the quality of pain care delivered by PCPs, by assessing PCP behavior and documentation before and after such interventions. Individual services or clinics, or organizations as a whole, can also use such information to track the quality of pain care over time and to evaluate and monitor PCP performance. Because the tool demonstrates adequate inter-rater reliability and provides quantitative data about pain care extracted from medical records, it can also be used for research purposes. For example, investigators may be interested in examining how the quality of pain assessment relates to pain treatment and subsequent outcomes, and whether patients who receive information and education about pain in the primary care setting have improved outcomes.
There are some limitations of chart extraction techniques that should be noted. Chart extraction can be time consuming, though once raters have been adequately trained, coding can be performed more efficiently. In addition, extraction may be conducted more efficiently and reliably through machine learning and natural language processing, in which programs identify and code text-based information from medical records. Such a program would significantly reduce the time and resources needed to assess quality of care and would thus make it much more feasible for both researchers and health-care organizations to assess the provision of guideline-concordant pain care. The preliminary version of the tool presented here establishes its reliability and content validity; development of an automated version of the tool is already underway.
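As a rough illustration of what such automation might start from (this is not the authors’ automated tool, whose design is not described here), the sketch below applies simple pattern matching to flag the “presence of pain” indicator, mirroring the coding manual’s rule that “pain” must be explicitly mentioned and its exception for an isolated “no chest pain” (Table 2). A production system would require far more sophisticated negation and context handling.

```python
import re

# Illustrative rule-based matcher for the "presence of pain" indicator (see Table 2):
# "pain" must be explicitly mentioned, except that a bare "no chest pain" does not count.
CHEST_PAIN_EXCEPTION = re.compile(r"\bno chest pain\b", re.IGNORECASE)
PAIN_MENTION = re.compile(r"\bpain\b", re.IGNORECASE)

def presence_of_pain(note_text: str) -> bool:
    # Drop the excepted phrase, then check whether any mention of "pain" remains.
    remaining = CHEST_PAIN_EXCEPTION.sub("", note_text)
    return bool(PAIN_MENTION.search(remaining))

print(presence_of_pain("CC: chronic low back pain, rated 8/10"))  # True
print(presence_of_pain("Denies SOB. No chest pain."))             # False
```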
A second limitation of chart extraction is that there may be discrepancies between what is documented in a progress note and what actually occurs during a clinical encounter. Several studies have reported that data extracted from the medical record tend to underestimate provider behavior [11, 26, 27] and that extracted data are often inconsistent with patient report [11, 28]. However, evidence for the accuracy of other proxy measures of provider behavior, including patient and provider self-report, is mixed as well [26]; therefore, in the absence of direct observation, which is typically more difficult to carry out and has its own methodological limitations (e.g., the Hawthorne effect), chart extraction can be considered comparable to other methods of collecting data about provider behavior. Ideally, multiple methods of data collection would be used in order to obtain a comprehensive view of what occurs during clinical encounters.
A final limitation of this study is generalizability. We examined quality of pain care and documentation among a sample of patients reporting moderate to severe pain. Because patients in the sample were being seen for a scheduled primary care visit, rather than an urgent visit or emergency room visit, it is likely that patients were reporting pain related to a chronic, rather than acute, issue. However, it is possible that some patients were experiencing acute, rather than chronic, pain at the time of their regularly scheduled primary care visit. Future studies could explicitly compare assessment, treatment planning, and reassessment in patients presenting with acute vs. chronic pain problems.
Acknowledgments
Funding
This study was funded by a Program for Research Leadership Award from The Patrick and Catherine Weldon Donaghue Medical Research Foundation and the Mayday Fund.
Implications
Practice: Practitioners should strive to provide guideline-concordant pain care, such as by conducting a functional assessment and providing patient education, which are important to good pain care yet are frequently overlooked.
Policy: Guidelines and recommendations for pain management have been disseminated, yet a gap remains between such guidelines and routine practice in primary care; policymakers should consider how to address this gap between evidence-based guidelines and practice.
Research: Future research should examine provider reports of barriers to comprehensive pain assessment and guideline-concordant treatment, as well as interventions to address such barriers.
References
1. Kerns RD, Otis J, Rosenberg R, Reid MC. Veterans' reports of pain and associations with ratings of health, health-risk behaviors, affective distress, and use of the healthcare system. J Rehabil Res Dev. 2003;40:371–379. doi: 10.1682/JRRD.2003.09.0371.
2. Haskell SG, Heapy A, Reid MC, Papas RK, Kerns RD. The prevalence and age-related characteristics of pain in a sample of women veterans receiving primary care. J Womens Health (Larchmt). 2006;15:862–869. doi: 10.1089/jwh.2006.15.862.
3. Reid MC, Engles-Horton LL, Weber MB, Kerns RD, Rogers EL, O'Connor PG. Use of opioid medications for chronic noncancer pain syndromes in primary care. J Gen Intern Med. 2002;17:173–179. doi: 10.1046/j.1525-1497.2002.10435.x.
4. Upshur CC, Luckmann RS, Savageau JA. Primary care provider concerns about management of chronic pain in community clinic populations. J Gen Intern Med. 2006;21:652–655. doi: 10.1111/j.1525-1497.2006.00412.x.
5. Tsang AM, Von Korff S, Lee J, et al. Common chronic pain conditions in developed and developing countries: gender and age differences and comorbidity with depression-anxiety disorders. J Pain. 2008;9:883–891. doi: 10.1016/j.jpain.2008.05.005.
6. Relieving pain in America: a blueprint for transforming prevention, care, education, and research. Washington, DC: The National Academies; 2011.
7. Cleeland CS, Reyes-Gibby CC, Schall M, et al. Rapid improvement in pain management: the Veterans Health Administration and the Institute for Healthcare Improvement collaborative. Clin J Pain. 2003;19:298–305. doi: 10.1097/00002508-200309000-00003.
8. Kirshbaum M. The development, implementation and evaluation of guidelines for the management of breast cancer related to lymphoedema. Eur J Cancer Care. 1996;5:246–251. doi: 10.1111/j.1365-2354.1996.tb00243.x.
9. Aucott JN, Pelecanos E, Dombrowski R, Fuehrer SM, Laich J, Aron DC. Implementation of local guidelines for cost-effective management of hypertension. J Gen Intern Med. 1996;11:139–146. doi: 10.1007/BF02600265.
10. Nardella A, Pechet L, Snyder LM. Continuous improvement, quality control, and cost containment in clinical laboratory testing. Effects of establishing and implementing guidelines for preoperative tests. Arch Pathol Lab Med. 1995;119:518–522.
11. Krebs EE, Bair MJ, Carey TS, et al. Documentation of pain care processes does not accurately reflect pain management delivered in primary care. J Gen Intern Med. 2010;25:194–199. doi: 10.1007/s11606-009-1194-3.
12. Corson K, Doak MN, Denneson L, et al. Primary care clinician adherence to guidelines for the management of chronic musculoskeletal pain: results from the study of the effectiveness of a collaborative approach to pain. Pain Med. 2011;12:1490–1501. doi: 10.1111/j.1526-4637.2011.01231.x.
13. Sellinger J, Wallio S, Clark EA, Kerns RD. Comprehensive pain assessment: integration of biopsychosocial principles. In: Ebert MH, Kerns RD, editors. Behavioral and psychopharmacologic pain management. London: Cambridge University; 2011.
14. VA/DoD clinical practice guideline for management of opioid therapy for chronic pain. Washington, DC: Department of Veterans Affairs, Department of Defense; 2010.
15. Assessment and management of chronic pain. Bloomington: Institute for Clinical Systems Improvement; 2011.
16. Lorenz KA, Dy SM, Naeim A, et al. Quality measures for supportive cancer care: the Cancer Quality-ASSIST Project. J Pain Symptom Manag. 2009;37:943–964. doi: 10.1016/j.jpainsymman.2008.05.018.
17. Gearing RE, Mian IA, Barber J, et al. A methodology for conducting retrospective chart review research in child and adolescent psychiatry. Can Acad Child Adolesc Psychiatry. 2006;15:126–134.
18. Kergoat MJ, Leclerc BS, Leduc N, et al. Quality of care assessment in geriatric evaluation and management units: construction of a chart review tool for a tracer condition. BMC Geriatr. 2009;9:34. doi: 10.1186/1471-2318-9-34.
19. Matthias MS, Bair MJ, Nyland KA, et al. Self-management support and communication from nurse care managers compared with primary care physicians: a focus group study of patients with chronic musculoskeletal pain. Pain Manag Nurs. 2010;11:26–34. doi: 10.1016/j.pmn.2008.12.003.
20. Matthias MS, Parpart AL, Nyland KA, et al. The patient-provider relationship in chronic pain care: providers' perspectives. Pain Med. 2010;11:1688–1697. doi: 10.1111/j.1526-4637.2010.00980.x.
21. Chodosh J, Solomon DH, Roth CP, et al. The quality of medical care provided to vulnerable older patients with chronic pain. J Am Geriatr Soc. 2004;52:756–761. doi: 10.1111/j.1532-5415.2004.52214.x.
22. Lorig KR, Ritter PL, Laurent DD, et al. Long-term randomized controlled trials of tailored-print and small-group arthritis self-management interventions. Med Care. 2004;42:346–354. doi: 10.1097/01.mlr.0000118709.74348.65.
23. Buchbinder R. Self-management education en masse: effectiveness of the Back Pain: Don't Take It Lying Down mass media campaign. Med J Aust. 2008;189:S29–S32. doi: 10.5694/j.1326-5377.2008.tb02207.x.
24. de Jong JR, Vlaeyen JW, Onghena P, et al. Fear of movement/(re)injury in chronic low back pain: education or exposure in vivo as mediator to fear reduction? Clin J Pain. 2005;21:9–17. doi: 10.1097/00002508-200501000-00002.
25. Dorflinger L, Kerns RD, Auerbach SM. Providers' roles in enhancing patients' adherence to pain self management. Transl Behav Med. 2013;3:39–46. doi: 10.1007/s13142-012-0158-z.
26. Hrisos S, Eccles MP, Francis JJ, et al. Are there valid proxy measures of clinical behaviour? A systematic review. Implement Sci. 2009;4:37. doi: 10.1186/1748-5908-4-37.
27. Luck J, Peabody JW, Dresselhaus TR, et al. How well does chart abstraction measure quality? A prospective comparison of standardized patients with the medical record. Am J Med. 2000;108:642–649. doi: 10.1016/S0002-9343(00)00363-6.
28. Tisnado DM, Adams JL, Liu H, et al. What is the concordance between the medical record and patient self-report as data sources for ambulatory care? Med Care. 2006;44:132–140. doi: 10.1097/01.mlr.0000196952.15921.bf.

