Abstract
Purpose:
To develop and apply an outcomes assessment framework (OAF) for care management programs in health care delivery settings.
Background:
Care management (CM) refers to a regimen of organized activities designed to promote health in a population with particular chronic conditions or risk profiles, with a focus on the triple aim for populations: improving the quality of care, advancing health outcomes, and lowering health care costs. CM has become an integral part of the care continuum for population-based health care management. To sustain a CM program, it is essential to assure and improve CM effectiveness through rigorous outcomes assessment. To this end, we constructed the OAF as the foundation of a systematic approach to CM outcomes assessment.
Innovations:
To construct the OAF, we first systematically analyzed the operation process of a CM program; then, based on the operation analysis, we identified causal relationships between interventions and outcomes at various implementation stages of the program. This set of causal relationships established a roadmap for the rest of the outcomes assessment. Drawing on knowledge from multiple disciplines, we (1) formalized a systematic approach to CM outcomes assessment, and (2) integrated proven analytics methodologies and industrial best practices into operation-oriented CM outcomes assessment.
Conclusion:
This systematic, OAF-based approach to assessing the outcomes of CM programs offers an opportunity to advance evidence-based care management. In addition, formalized CM outcomes assessment methodologies will enable comparison of CM effectiveness across health delivery settings.
Keywords: Methods, Comparative Effectiveness, Quality Improvement, 2014 EDM Forum Symposium, Learning Health Systems
Introduction
Care management (CM) refers to a regimen of organized activities designed to promote health in a population with particular chronic conditions or risk profiles, with a focus on the triple aim for populations: improving the quality of care, advancing health outcomes, and lowering health care costs.1 Such activities cover a range of interventions including wellness promotion, disease management, and care coordination. CM has become an integral part of the care continuum for population-based health care management in health care delivery organizations.2,3,4 To sustain a CM program, it is essential to assure and improve CM effectiveness through rigorous outcomes assessment.5 However, CM programs are often designed and implemented before an outcomes assessment is considered. This reality poses a great challenge to carrying out a robust outcomes assessment with accuracy and precision.
In CM, population health is promoted through a variety of psychosocial and economic means such as health education, care coordination, utilization management, and incentives. To achieve maximum validity, the multidisciplinary nature of CM calls for a multidisciplinary approach to CM outcomes assessment. In an effort to find a comprehensive CM outcomes-assessment methodology that accommodates the reality of post hoc assessment, we developed a systematic approach that incorporates useful methods from multiple disciplines into existing quality improvement (QI) practices in managed care. We adopted the principles of systems engineering, including project management and software engineering,6,7,8 to guide the analysis of the complex CM operation process; we adopted the methodologies of systematic literature review and patient-centered outcomes research9,10 to construct the outcomes analysis model and to formulate analytic questions; and we followed statistical methodologies in clinical trials11,12 to develop the analysis plan. The systematic approach consists of six steps, carried out in the following order: (1) constructing an outcomes assessment framework (OAF), (2) formulating analytic questions, (3) developing an analysis plan, (4) collecting data, (5) conducting statistical analysis, and (6) disseminating findings.
The OAF is the foundation of the CM outcomes assessment. In constructing an OAF, we first systematically analyzed the program operation process, then—based on the operation analysis—we identified causal relationships between interventions and outcomes at various implementation stages. This set of causal relationships established a roadmap for the rest of the outcomes assessment, particularly formulating analytic questions, which is the key to the assessment. In the present paper, we disseminate the innovative aspects of the OAF, using a real-world example, the outcomes assessment of a maternity CM program (MCMP).13
The MCMP was designed to help pregnant women with complications self-manage their health during pregnancy and throughout the postpartum period in a managed care setting. Pregnant women with the following complications were eligible for the MCMP: diabetes, hypertension, substance abuse, preterm birth history, preterm labor history or signs and symptoms, obesity, asthma, adolescent pregnancy, HIV/AIDS, and other potential risks. The MCMP team included an obstetrical nurse case manager (OBCM), an obstetrical physician consultant, a social worker, a health coach, a behavioral health coordinator, and a maternity care outreach coordinator. The OBCM was the primary care manager, the physician consultant provided medical advice, the social worker helped address nonmedical needs, the health coach educated enrollees on healthy lifestyles, the behavioral health coordinator assisted enrollees with mental health care, and the maternity care outreach coordinator assisted OBCMs with identifying and screening potential enrollees.
Outcomes Assessment Framework
The OAF comprises two components: the operation analysis model and the outcomes analysis model. In constructing an OAF, operation analysis precedes outcomes analysis because the outcomes analysis uses information obtained from the operation analysis. Each component is described below in the context of the MCMP.
The Operation Analysis Model
A comprehensive assessment can only be built upon thorough knowledge of the CM program to be assessed. Our systematic approach thus starts with analyzing a CM program’s operation process. The goal of the operation analysis is to understand the chronological sequence of activities involving an individual patient during the intervention, from initiation to completion. The central task in CM operation analysis is to identify the key activities and connect them in a time sequence. Appropriately identifying those activities helps in accurately understanding the CM intervention. In addition, the activities are the checkpoints where process and outcome measurements are made.
In analyzing the MCMP operation process, we first documented the MCMP workflow by reviewing program documents and interviewing case managers and other program staff. Based on the workflow, we identified five key activities and their chronological relationships, as depicted in Figure 1. The MCMP started with enrollment. Once a patient had enrolled, a case manager worked with the enrollee to develop a self-management plan. The enrollee managed her own pregnancy complications under the guidance of the self-management plan, while the case manager followed up periodically. This cycle of self-management, follow-up, and assessment continued until the 56th day postpartum. Based on the nature of the activities, we designated three key activities for process measurement (enrollment, care management, and enrollee’s adherence to the intervention plan) and two for outcome measurement (enrollee’s adherence to the intervention plan, and birth).
Figure 1.
The MCMP Operation Analysis Model
Note: The oval labels and dotted lines denote measures associated with an individual activity.
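To make the output of the operation analysis concrete, the measured activities identified above can be encoded as an ordered sequence, each tagged with the type of measurement made at that checkpoint. The following is a minimal illustrative sketch, not part of the MCMP implementation; it encodes only the four measured activities that organize Table 1.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    """A key activity in the CM operation process (a box in Figure 1)."""
    name: str
    measurements: List[str] = field(default_factory=list)  # "process" and/or "outcome"

# The measured MCMP activities, in chronological order (the four analyses
# of Table 1); activity names are taken from the text, the encoding is ours.
MCMP_ACTIVITIES = [
    Activity("Enrollment", ["process"]),
    Activity("Care management", ["process"]),
    Activity("Enrollee's adherence to the intervention plan", ["process", "outcome"]),
    Activity("Birth", ["outcome"]),
]

for activity in MCMP_ACTIVITIES:
    print(f"{activity.name}: measured as {', '.join(activity.measurements)}")
```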
Process and outcome metrics were defined at the population level, though process and outcome data were collected from individual patients. For the process measurement, we developed the following metrics:
Eligibility Rate to measure high-risk pregnancy prevalence (enrollment analysis);
Referral Rate to measure surveillance (enrollment analysis);
Enrollment Rate to measure MCMP coverage (enrollment analysis);
First-trimester Enrollment Rate to measure intervention timeliness (enrollment analysis);
Mean Contact Attempts per 30 Days to measure care managers’ workload (care management analysis);
Mean Successful Contact Rate to measure intervention dosage (care management analysis);
Rate of First Obstetric (OB) Visit in the First Trimester or Within 42 Days of Health Plan Enrollment to measure prenatal care timeliness (analysis of enrollee’s adherence to the intervention plan);
Frequency of Ongoing Prenatal Care to measure prenatal care adherence (analysis of enrollee’s adherence to the intervention plan); and
OB Visit During 21–56 Days Postpartum to measure postpartum care adherence (analysis of enrollee’s adherence to the intervention plan).
For the outcome measurement, we developed clinical indicators for risk factors and birth outcomes. For example, the following are one indicator for diabetic care management and five for birth outcomes:
Normal A1C Rate for Diabetic Pregnancy (analysis of enrollee’s adherence to the intervention plan);
Perinatal Mortality Rate (birth analysis);
Full-Term Birth Rate (birth analysis);
Normal Birth Weight Rate (birth analysis);
Neonatal Intensive Care Unit (NICU) Admission Rate (birth analysis); and
Mean NICU Days (birth analysis).
Table 1 provides definitions of some of the metrics.
Table 1.
Sample Population-Level Process and Outcome Metrics Derived from MCMP Operation Analysis
| Metrics | Numerator | Denominator |
|---|---|---|
| Analysis of Enrollment | ||
| MCMP Eligibility Rate | Number of deliveries by those who were eligible for MCMP | Number of deliveries |
| MCMP Referral Rate | Number of deliveries by those who were eligible for and referred to MCMP | Number of deliveries by those who were eligible for MCMP |
| MCMP Enrollment Rate | Number of deliveries by MCMP enrollees | Number of deliveries by those who were eligible for and referred to MCMP |
| MCMP First-trimester Enrollment Rate | Number of deliveries by those who were enrolled in MCMP during their first trimester | Number of deliveries by MCMP enrollees |
| Analysis of Care Management | ||
| Mean Successful MCMP Contact Rate | ∑ [(Number of successful contacts for each delivery by a MCMP enrollee) / (Number of total contact attempts for the delivery)] | Number of deliveries by MCMP enrollees |
| Mean MCMP Contact Attempts per 30 Days | ∑ [(Number of contact attempts for each delivery by a MCMP enrollee) / (MCMP enrollment days of the delivery)]*30 | Number of deliveries by MCMP enrollees |
| Analysis of Enrollee’s Adherence to the Intervention Plan | ||
| First OB Visit in the First Trimester or Within 42 Days of Health Plan Enrollment | Number of live-birth deliveries by members with first OB visit in the first trimester or within 42 days of Health Plan enrollment | Number of live-birth deliveries |
| Frequency of Ongoing Prenatal Care | Number of live-birth deliveries by members with <21% of expected OB visits | Number of live-birth deliveries |
| | Number of live-birth deliveries by members with ≥21% & ≤40% of expected OB visits | Number of live-birth deliveries |
| | Number of live-birth deliveries by members with ≥41% & ≤60% of expected OB visits | Number of live-birth deliveries |
| | Number of live-birth deliveries by members with ≥61% & ≤80% of expected OB visits | Number of live-birth deliveries |
| | Number of live-birth deliveries by members with ≥81% of expected OB visits | Number of live-birth deliveries |
| OB Visit Within the 21–56 Days Postpartum Period | Number of live-birth deliveries by members with a postpartum OB visit on or between 21 and 56 days after delivery | Number of live-birth deliveries |
| Analysis of Birth | ||
| Perinatal Mortality Rate | Number of stillbirths or infant deaths at less than 28 days of age | Number of neonates (i.e., live newborns or stillbirths) |
| Full-Term Birth Rate | Number of live neonates with gestational age at birth ≥ 37 weeks | Number of live neonates |
| Normal Birth Weight Rate | Number of live neonates with birth weight ≥ 2500 grams | Number of live neonates |
| NICU Admission Rate | Number of live neonates with NICU admission | Number of live neonates |
| NICU Length of Stay | Number of live neonates with NICU LOS ≤ 5 days & > 0 days | Number of live neonates with NICU admission |
| | Number of live neonates with NICU LOS ≤ 10 days & > 5 days | Number of live neonates with NICU admission |
| | Number of live neonates with NICU LOS ≤ 20 days & > 10 days | Number of live neonates with NICU admission |
| | Number of live neonates with NICU LOS ≤ 40 days & > 20 days | Number of live neonates with NICU admission |
| | Number of live neonates with NICU LOS > 40 days | Number of live neonates with NICU admission |
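The population-level metrics in Table 1 are ratios over per-delivery records and are straightforward to compute once the data set is assembled. Below is a minimal Python sketch of two of them, the Enrollment Rate and the Mean Successful Contact Rate; the record fields are hypothetical names introduced for illustration, not the MCMP data dictionary.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class DeliveryRecord:
    """One delivery; field names are hypothetical, for illustration only."""
    eligible: bool                 # met MCMP eligibility criteria
    referred: bool                 # referred to MCMP
    enrolled: bool                 # enrolled in MCMP
    contact_attempts: int = 0      # total contact attempts while enrolled
    successful_contacts: int = 0   # attempts that reached the enrollee

def enrollment_rate(deliveries: List[DeliveryRecord]) -> Optional[float]:
    """Deliveries by enrollees / deliveries by those eligible and referred (Table 1)."""
    denom = [d for d in deliveries if d.eligible and d.referred]
    return sum(d.enrolled for d in denom) / len(denom) if denom else None

def mean_successful_contact_rate(deliveries: List[DeliveryRecord]) -> Optional[float]:
    """Mean of per-delivery (successful contacts / attempts) over enrollee deliveries.

    Deliveries with zero attempts are excluded to avoid division by zero;
    this is an assumption, as Table 1 does not specify how they are handled.
    """
    enrollee = [d for d in deliveries if d.enrolled and d.contact_attempts > 0]
    if not enrollee:
        return None
    return sum(d.successful_contacts / d.contact_attempts for d in enrollee) / len(enrollee)

# Toy data:
data = [
    DeliveryRecord(True, True, True, contact_attempts=10, successful_contacts=7),
    DeliveryRecord(True, True, False),
    DeliveryRecord(True, False, False),
]
print(enrollment_rate(data))               # 0.5
print(mean_successful_contact_rate(data))  # 0.7
```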
The Outcomes Analysis Model
Once we understand a program’s operation process, we are ready to develop an outcomes analysis model. The goal of the outcomes analysis model is to establish a chain of causal relationships among CM elements, from intervention to primary outcome. The model is best presented graphically, as shown in Figure 2, with each box representing a CM element that may be susceptible to a change in the preceding element and may cause a change in the following element, resulting in a chain reaction. The chain reaction starts with intervention design, followed by intervention implementation and, in turn, by manifestation of the intervention effect on intermediate, surrogate, and finally primary outcome measures.
Figure 2.
The MCMP Outcomes Analysis Model
Note: The oval labels and dotted lines denote the relationship between two boxes.
An outcome measure is labeled a primary measure if it is of main interest in a study, a surrogate measure if it closely correlates with the primary measure, or an intermediate measure if it is neither primary nor surrogate. For example, birth weight is the primary measure in the MCMP but is immeasurable until birth; gestational age serves as a surrogate measure for birth weight because a full-term birth likely results in normal weight; and the A1C measure for pregnancies with diabetes complications is an intermediate measure because A1C is a marker of pregnancy health but does not necessarily correlate with birth weight. Table 2 lists a sample of the measures used in the MCMP outcomes assessment.
Table 2.
Sample Outcome Measures Defined in MCMP Outcomes Assessment
| Outcome Measure | Outcome Type | Function Type |
|---|---|---|
| Birth weight | Primary | Clinical |
| Gestational age at birth | Surrogate | Clinical |
| A1C | Intermediate | Clinical |
| OB visit frequency by gestational age | Intermediate | Utilization |
| Inpatient days, per member per month | Intermediate | Utilization |
| Mean number of successful contacts between case manager and patients, per patient per month | Intermediate | Process |
Figure 2 shows the outcomes analysis model for the MCMP assessment. The MCMP outcomes analysis model established a chain of causal relationships: self-management leads to better adherence to maternity care, which leads to better condition-specific clinical outcomes, to full-term delivery and presumably normal-weight newborns, and to lower health care costs. Analytic questions can be formulated by linking the intervention and outcomes (curved arrows in Figure 2). The linkages serve dual purposes: identifying analytic questions and providing an “evidence pathway” for identifying opportunities for program improvement.
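The chain just described can be encoded as an ordered list of elements, with each (upstream, downstream) pair a candidate linkage, one of the curved arrows in Figure 2, from which an analytic question may be derived. A minimal sketch, with element names paraphrased from the text:

```python
from itertools import combinations

# Ordered elements of the MCMP outcomes analysis model (paraphrased from
# the causal chain described above; this encoding is illustrative only).
CAUSAL_CHAIN = [
    "MCMP intervention (self-management support)",
    "Adherence to maternity care plan",
    "Condition-specific clinical outcomes (e.g., A1C)",
    "Birth outcomes (full-term, normal weight)",
    "Health care utilization and cost",
]

# Each (upstream, downstream) pair along the chain is a candidate linkage
# from which an analytic question can be formulated.
for upstream, downstream in combinations(CAUSAL_CHAIN, 2):
    print(f"Does [{upstream}] improve [{downstream}]?")
```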
Deriving Analytic Questions from Outcomes Assessment Framework (OAF)
Based on the outcomes analysis model, one can readily formulate analytic questions that address aspects of CM program effectiveness. An analytic question concerning the comparative effectiveness of an intervention should define six elements, referred to as “PICOTS”: Population, Intervention, Comparator, Outcomes, Timing, and Setting.9,10 PICOTS specifies what population the intervention targets, what the intervention is, what the comparator intervention is, what the outcome is, when the outcome is to be measured, and the setting in which the intervention is to be delivered. Methodically formulating an analytic question in the PICOTS format is crucial for rigorous outcomes assessment because PICOTS maximizes precision and minimizes bias. Five of the six PICOTS elements, “P,” “I,” “O,” “T,” and “S,” are already explicitly specified in the OAF; “C” is implied as no intervention.
Table 3 lists three sample analytic questions formulated based on Figure 2, one in each of three categories: process, clinical outcome, and utilization. The process question is derived from the first two elements in Figure 2: the MCMP intervention results in better adherence to the self-management plan. The clinical question connects elements 1 through 4: the intervention enhances adherence to the care plan, and the adherence improves clinical indicators and, in turn, birth outcomes. The utilization question is derived from the relationships among all five elements: the MCMP intervention facilitates adherence to the self-management plan, which reduces avoidable high-cost services (e.g., inpatient care) and the likelihood of delivery complications (e.g., NICU admission), and consequently lowers utilization of health care services.
Table 3.
Sample Analytic Questions
| Category | Analytic Question | Population | Intervention | Comparator | Outcome | Timing | Setting |
|---|---|---|---|---|---|---|---|
| Process | Did a larger percentage of MCMP enrollees than of non-MCMP enrollees adhere to the prenatal OB visit schedule? | Pregnant women with complications | MCMP | No MCMP | Adherence to prenatal OB visit schedule | Scheduled OB visits | Health plan |
| Clinical | Did a larger percentage of MCMP enrollees with diabetes who adhered to the prenatal OB visit schedule have full-term births than those who did not adhere? | Pregnant women with diabetes | OB visit adherence | OB visit nonadherence | Birth term | At birth | Health plan |
| Utilization | Did a larger percentage of MCMP enrollees with diabetes who maintained A1C < 7 have no emergency department visits than those whose A1C was ≥ 7? | Pregnant women with diabetes | Maintaining A1C < 7 | Not maintaining A1C < 7 | Emergency department visits | At birth | Health plan |
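Because every analytic question specifies the same six elements, the PICOTS format lends itself to a simple structured representation. The sketch below is a minimal illustration (not an artifact of the MCMP assessment) that encodes the process question from the first row of Table 3.

```python
from dataclasses import dataclass

@dataclass
class PICOTSQuestion:
    """An analytic question specified by the six PICOTS elements."""
    population: str
    intervention: str
    comparator: str
    outcome: str
    timing: str
    setting: str

# The process question from Table 3, expressed in PICOTS form.
process_question = PICOTSQuestion(
    population="Pregnant women with complications",
    intervention="MCMP",
    comparator="No MCMP",
    outcome="Adherence to prenatal OB visit schedule",
    timing="Scheduled OB visits",
    setting="Health plan",
)
print(process_question)
```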
Discussion
Care management is one kind of complex system, containing a variety of factors (biological, clinical, behavioral, social, and economic) that interactively contribute to the physical and mental conditions under management. In addition, there is a great deal of heterogeneity among patients in demographics and health profiles, and considerable variation in CM interventions. The challenge we face is how to deal with this complexity and sort out the interweaving relationships among the contributing factors.
Systems engineering is an interdisciplinary approach to constructing complex systems; it recognizes each system as an integrated whole, even though it is composed of diverse, specialized structures and subfunctions.6,14 Systems engineering has been applied to developing and managing complex systems since its inception in the 1940s,15,16,17 and has expanded its reach to social systems and human behaviors.18,19,20 We applied the principles of systems engineering in a reverse engineering fashion, dissecting a complex CM program into interrelated components with unique subfunctions.
Project management is a related field in which systems engineering plays a significant role, and it provides practical techniques for operation analysis. For example, we applied the work breakdown structure (WBS) technique from project management to help identify the key activities in the MCMP operation.7,21
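As an illustration of how a WBS decomposes a program hierarchically into component activities, the sketch below shows a hypothetical, partial decomposition of the MCMP operation; the sub-activities are our illustration, not the program’s actual WBS.

```python
# A hypothetical, partial work breakdown structure (WBS) for the MCMP
# operation, as a nested dict; the sub-activities are illustrative only.
MCMP_WBS = {
    "1 Enrollment": [
        "1.1 Identify eligible pregnancies",
        "1.2 Refer to MCMP",
        "1.3 Enroll and assess risk",
    ],
    "2 Care management": [
        "2.1 Develop self-management plan",
        "2.2 Periodic follow-up contacts",
    ],
    "3 Postpartum closure": [
        "3.1 Check postpartum OB visit (21-56 days)",
        "3.2 Close case at day 56 postpartum",
    ],
}

for activity, tasks in MCMP_WBS.items():
    print(activity)
    for task in tasks:
        print(f"  {task}")
```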
Software engineering is another field where systems engineering finds a home. The formal methods of software engineering are mathematically rigorous techniques for specifying and analyzing complex systems.22 In this context, “mathematically rigorous” means that the specifications and analyses used in formal methods are logically well formed. One important aspect of formal methods is specification languages and their notations.23,24 We adopted the design principles of specification languages and developed our own notation and classification system, as manifested in Figures 1–2 and Tables 1–3. By bringing a range of disciplines and approaches to the OAF for the CM program, we were able to systematically dissect a complex and often irregular process and, consequently, establish a formal model of causal relationships among multiple factors.
One of the biggest challenges of studying complex systems is describing their behavior accurately and intuitively. Graphical representation, such as diagrams, is an effective tool for such analysis and has been widely used in quality control.25,26,27,28 In both the operation analysis model and the outcomes analysis model, we used flowcharts to describe interactions among CM activities and elements. There are international and national standards defining the vocabulary and syntax of flowcharts for various industries.29,30 However, standards are yet to be developed for flowcharts depicting CM processes.
We developed the outcomes analysis model to identify the elements in a CM program, to describe interactions among those elements, and ultimately to establish a chain of causal relationships among them. A well-developed OAF lays the foundation for formulating sound analytic questions. It is essential to establish the outcomes analysis model before formulating analytic questions, because the model guides us through the maze of intertwined relationships among the elements of a complex CM program and clarifies true causal relationships. Similar analytical approaches have been pursued in a variety of medical and social sciences research.31,32,33,34
In our systematic approach, we formulated analytic questions in the PICOTS format, which originated in the methodology of systematic literature reviews and has been endorsed by the Patient-Centered Outcomes Research Institute.35,10 We consider the PICOTS formula the best way to construct analytic questions on intervention effectiveness. PICOTS enables us to think through the subject matter in a systematic manner. By specifying the six components (population, intervention, comparator, outcome, timing of measuring the outcome, and setting for the intervention), one can develop a comprehensive analytic question and a corresponding analysis plan that addresses it.
Our version of an analysis plan (not presented in this paper) is influenced by the statistical analysis plans for clinical trials recommended by the U.S. Food and Drug Administration (FDA). The FDA requires that a statistical analysis plan be prepared for a trial and that the analysis be carried out according to the plan.11,12 This regulation assures procedural control of trial quality. While we recognize the importance of quality assurance, we are cognizant of the differences between CM and drug discovery, and thus promote the development of a standardized analysis-plan template suitable for CM outcomes assessment.
Data collection is the most labor-intensive and error-prone part of CM outcomes assessment. Following best practices in project management and software engineering,36 we adopted rigorous documentation as a crucial quality-control measure for data set construction. On the surface, documentation appears time-consuming and unproductive; in the final analysis, it is the most efficient and economical way to construct quality data sets. The time and other resources spent on documentation pay off by reducing avoidable bugs and debugging time.
Our systematic approach to CM outcomes assessment was developed in a managed care setting where CM is offered to plan members as a value-added service, free of charge. CM nonetheless has a cost, and it needs justification in a business model; valid and reliable data are needed to provide decision makers with evidence of the value of CM. Beyond demonstrating financial value, continuous and comprehensive data collection and program monitoring allow CM programs to be strengthened over time. This systematic approach offers an opportunity to advance evidence-based care management. In addition, formalized CM outcomes assessment methodologies will enable us to compare CM effectiveness across settings.
Though the OAF was initially developed to address challenges faced in CM assessment, the basic OAF concepts and methodologies are readily generalizable to other population health programs, because those programs share the same task and challenge: delivering effective self-care interventions to a population with heterogeneous health profiles in community settings. Methods such as analyzing the operation process by identifying key activities in time sequence, establishing a chain of causal relationships among the elements of an intervention regimen, and formulating PICOTS questions are generally applicable to other programs. In addition, the CM outcomes assessment was implemented as part of QI activities in an operational context and thus did not require Institutional Review Board (IRB) approval. However, such an operational study can naturally evolve into a clinical research project over time because it incorporates the basic methodological elements required for scientific research. Our MCMP project is an example of such QI-to-research evolution.
Acknowledgments
The authors thank the U.S. Census Bureau and Johns Hopkins HealthCare for their support of this project. This submission is based on work presented at the 2014 EDM Forum Symposium.
Disciplines
Health Services Research
References
- 1. Berwick DM, Nolan TW, Whittington J. The triple aim: Care, health, and cost. Health Affairs. 2008;27:759–769. doi: 10.1377/hlthaff.27.3.759.
- 2. Care Continuum Alliance/About Us. http://www.carecontinuumalliance.org. Accessed December 20, 2013.
- 3. Chronic Conditions: Making the Case for Ongoing Care. Partnership for Solutions; 2004. http://www.partnership-forsolutions.org/DMS/files/chronicbook2004.pdf. Accessed December 24, 2013.
- 4. Designing and Implementing Medicaid Disease and Care Management Programs: A User’s Guide. Rockville, MD: Agency for Healthcare Research and Quality; 2008. http://www.ahrq.gov/professionals/systems/long-term-care/resources/hcbs/medicaidmgmt/index.html. Accessed December 24, 2013.
- 5. Effectiveness of Outpatient Case Management for Adults with Medical Illness and Complex Care Needs. Rockville, MD: Agency for Healthcare Research and Quality; 2013. http://www.effectivehealthcare.ahrq.gov/search-for-guides-reviews-and-reports/?pageaction=displayproduct&productID=1677. Accessed December 27, 2013.
- 6. International Council on Systems Engineering. What is Systems Engineering? http://www.incose.org/practice/whatissystemseng.aspx. Accessed March 30, 2014.
- 7. A Guide to the Project Management Body of Knowledge. 5th ed. Newtown Square, PA: Project Management Institute; 2013.
- 8. Humphrey WS. Managing the Software Process. Boston, MA: Addison-Wesley; 1989.
- 9. Methods Guide for Effectiveness and Comparative Effectiveness Reviews. Rockville, MD: Agency for Healthcare Research and Quality; 2012. http://www.effectivehealthcare.ahrq.gov/ehc/products/60/318/MethodsGuide_Prepublication-Draft_20120523.pdf. Accessed December 27, 2013.
- 10. PCORI Methodology Committee. The PCORI Methodology Report. 2013. http://www.pcori.org/assets/2013/11/PCORI-Methodology-Report.pdf. Accessed December 27, 2013.
- 11. U.S. Food and Drug Administration. Guidance for Industry: E9 Statistical Principles for Clinical Trials. 1998. http://www.fda.gov/downloads/drugs/guidancecomplianceregulatoryinformation/guidances/ucm073137.pdf. Accessed March 30, 2014.
- 12. Meinert CL. Clinical Trials: Design, Conduct, and Analysis. New York, NY: Oxford University Press; 1986.
- 13. Wang L, Lu Y, Sherry M, Evans K, Richardson R, Hawkins M, Neale D. Evaluation of Partners-with-Mom maternity care management program for PP population (FY2009–10, Phase I & II). JHHC internal report. 2011.
- 14. Chestnut H. Systems Engineering Tools. Hoboken, NJ: Wiley; 1965.
- 15. Schlager KJ. Systems engineering: key to modern development. IRE Transactions on Engineering Management. 1956;EM-3:64–66.
- 16. von Bertalanffy L. General System Theory: Foundations, Development, Applications. New York, NY: George Braziller; 1968.
- 17. Ashby WR. An Introduction to Cybernetics. London: Chapman & Hall; 1956.
- 18. Trist E, Bamforth K. Some social and psychological consequences of the longwall method of coal-getting. Human Relations. 1951;4:3–38.
- 19. Cooper R, Foster M. Sociotechnical systems. American Psychologist. 1971;26:467–474.
- 20. Luhmann N. Systemtheorie, Evolutionstheorie und Kommunikationstheorie [Systems theory, evolution theory, and communication theory]. Sociologische Gids. 1975;22:154–168.
- 21. Kerzner H. Project Management: A Systems Approach to Planning, Scheduling, and Controlling. 10th ed. Hoboken, NJ: Wiley; 2009.
- 22. Hinchey GM, Bowen PJ, Vassev E. Formal methods. In: Laplante AP, editor. Encyclopedia of Software Engineering. Taylor & Francis; 2010. pp. 308–320.
- 23. Hanks KS, Knight JC. In search of best practices for the use of natural language in the development of high-consequence systems. In: Fast Abstracts, International Conference on Dependable Systems and Networks; Annapolis, MD; 2002.
- 24. Holt J. UML for Systems Engineering: Watching the Wheels. IET; 2004.
- 25. Gilbreth FB, Gilbreth LM. Process Charts. New York, NY: American Society of Mechanical Engineers; 1921.
- 26. ASME Standard: Operation and Flow Process Charts. New York, NY: American Society of Mechanical Engineers; 1947.
- 27. Goldstine H. The Computer from Pascal to von Neumann. Princeton, NJ: Princeton University Press; 1972. pp. 266–267.
- 28. Taub AH, editor. John von Neumann: Collected Works, Vol. 5. New York, NY: Macmillan; 1963. pp. 80–151.
- 29. Flow Diagrams for Process Plants: General Rules. ISO 10628.
- 30. Graphical Symbols for Process Flow Diagrams. ANSI Y32.11.
- 31. Woolf SH. An organized analytic framework for practice guideline development: using the analytic logic as a guide for reviewing evidence, developing recommendations, and explaining the rationale. In: McCormick KA, Moore SR, Siegel RA, editors. Methodology Perspectives. Rockville, MD: Agency for Health Care Policy and Research; 1994. pp. 105–113. AHCPR Pub. No. 95-0009.
- 32. Leipzig RM, Whitlock EP, Wolff TA, Barton MB, Michael YL, Harris R, Petitti D, Wilt T, Siu A; U.S. Preventive Services Task Force Geriatric Workgroup. Reconsidering the approach to prevention recommendations for older adults. Ann Intern Med. 2010;153:809–814. doi: 10.7326/0003-4819-153-12-201012210-00007.
- 33. Ragin CC. Constructing Social Research: The Unity and Diversity of Method. Pine Forge Press; 1994.
- 34. Coble CR. Developing the Analytic Framework: Assessing Innovation and Quality Design in Science and Mathematics Teacher Preparation. Association of Public and Land-grant Universities; 2012. http://www.aplu.org/document.doc?id=3652. Accessed March 30, 2014.
- 35. Russell R, Chung M, Balk EM, et al. Issues and Challenges in Conducting Systematic Reviews to Support Development of Nutrient Reference Values: Workshop Summary. Nutrition Research Series, Vol. 2. Rockville, MD: Agency for Healthcare Research and Quality; 2009. http://www.ncbi.nlm.nih.gov/books/NBK44088/. Accessed March 30, 2014.
- 36. Sommerville I. Software Engineering. 9th ed. Upper Saddle River, NJ: Pearson; 2011.


