Abstract
Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as for enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the upfront time required to decide which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining this complex meta-analysis and in enhancing communication and document sharing among research team members.
Keywords: informatics infrastructure, data management system, meta-analysis
The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we conducted. We selected electronic tools that enabled us to organize and track the meta-analytic process, as well as to enhance communication among research team members. Key aspects of the electronic process are discussed below and include (a) locating studies that report relevant data on relationships among variables of interest, (b) extracting correlational data, or data that could be converted into correlations, from primary studies and pooling data across studies, (c) importing extracted data into data management software for analyses, and (d) gleaning meaningful, reliable, and clinically useful information from the synthesis of these data.
The primary aim of the meta-analysis was to examine a set of models that included psychological (e.g., depression, stress, anxiety), motivational (e.g., self-efficacy, health beliefs), and diabetes-related knowledge factors in predicting diabetes outcomes, such as glycemic control and quality of life. Behavioral factors — adherence to diet, physical activity, medications, glucose self-monitoring, and appointment keeping — were examined as mediating variables because they are key targets for interventions. Health care providers need evidence-based guidelines on how behavioral change can be fostered in order to improve diabetes health outcomes and glycemia, because fewer than 30% of individuals with type 2 diabetes achieve nationally recommended goals for glucose control.1,2
Recent conversations with leading meta-analysts have indicated that even today, meta-analysis continues to be mainly a paper-and-pencil process, with a research team member assigned to organize paper copies of the literature and coding documents. The meta-analyst may develop paper code sheets on which to extract data from primary studies, guided by codebooks that contain theoretical and operational definitions of the study variables and detailed coding instructions for each variable on the code sheet. Managing hard copy study documents becomes unwieldy in meta-analyses, especially those of broad scope and complexity that require multiple coders and inclusion of large numbers of studies. For our complex four-year model-testing meta-analysis, we chose to design an electronic system for storage of study documents and files and to enhance the efficiency, productivity, communication, file sharing, and work flexibility of the research team members. In this meta-analytic study, we examined relationships among 30 variables; the code sheet used to extract data from each primary study contained 2,714 data fields; we screened more than 26,000 articles, dissertations, and other research reports; and we coded more than 600 studies. Faced with the challenges of storing, organizing, and sharing copies of individual studies, code sheets, and other study documents among the research team members, we were compelled to design an efficient electronic working process. The electronic process described here, although presented within the context of meta-analysis, is adaptable for any type of research that requires an efficient, smooth, well-organized working process.
Selection of Electronic Tools
The selection of electronic tools and software for the meta-analytic study involved several group discussions with members of the research team. Criteria for selection required the software or electronic tool to be (a) readily available from an established company to avoid any user contracts or patent issues, (b) commonly used software produced by longstanding, stable companies unlikely to go out of business during the course of the four-year project, (c) available from a company that provided technical support, (d) compatible with both Apple and Windows-based computers because team members worked on different computer platforms, (e) supported by the university’s information technology services, (f) flexible with interface capabilities, and (g) reasonably priced and relatively easy to use.
Based on these criteria, the research team selected: EndNote (Thomson Reuters [Scientific] Inc., Carlsbad, CA), Adobe Acrobat Pro (Adobe, San Jose, CA), Blackboard (Blackboard Inc., Washington, DC), Excel (Microsoft, Redmond, WA), and IBM SPSS Statistics ([SPSS], International Business Machines Corporation, Armonk, NY). Blackboard was already in use university-wide as a Web-based platform designed to enhance education and effective communication among users. This basic platform is easily adapted for any project that requires sharing of documents and communication among a working group of individuals, such as in the meta-analysis project. EndNote and Adobe Acrobat Pro were selected based on their design purposes and ability to interface with each other and also with Excel and IBM SPSS Statistics (SPSS). EndNote is a tool for publishing and managing bibliographies and thus, we used it for searching bibliographic databases on the Internet and organizing references and electronic files in portable document format (PDF). Adobe Acrobat Pro is a software program commonly used for creating and managing PDF documents. Particularly useful for our purposes, it enabled the creation of fillable forms with a specific interactive data field for each variable, as well as the highlighting of key information and insertion of comments directly on documents. Adobe Acrobat Pro enabled the interfacing of data from the fillable meta-analysis coding forms to our selected database spreadsheet tool (Excel) and statistical analysis (IBM SPSS Statistics [SPSS]) software. More detailed information on each software product is provided in Table 1, as is the website location of each program and a brief summary of how each program was integrated into the overall project to facilitate the steps of the research process.
Table 1.
Selected electronic tools
| SOFTWARE | DESCRIPTION | USE IN PROJECT |
|---|---|---|
| Adobe Acrobat Pro | Online document management system; enables online document reviews, creation of fillable forms, highlighting key information, and inserting of “call out” notes directly on documents (http://www.adobe.com/products/acrobatpro.html) | • Creating PDFs of all documents and inserting comments and justification for coding decisions made by coders • Formatting code sheets with fields that allowed exporting of data directly into Excel and/or SPSS databases |
| Blackboard | Web-based collaboration platform designed to enhance education and effective communication among users; can be adapted for any project that requires sharing of documents and communication among a working group of individuals (http://www.blackboard.com/) | • Posting and organizing of PDFs of all study documents, including studies and code sheets in file folders categorized by variable • Providing a site for notations during the coding process, which enhances communication among coders and other team members • Highlighting the status of each primary study in color-coded notes |
| EndNote | Tool for publishing and managing bibliographies; enables the search of bibliographic databases on the Internet, organizing of references and PDFs, and collaborative writing (http://www.endnote.com/eninfo.asp) | • Searching electronic databases, using database-specific search terms • Organizing and managing downloaded documents resulting from searches • Screening studies and noting code according to inclusion criteria • Searching sample of studies according to inclusion code or some other relevant variable of interest |
| IBM SPSS Statistics | Computer program for statistical analyses (http://www-01.ibm.com/software/analytics/spss/products/statistics/) | • Managing data • Conducting data analyses |
| Microsoft Excel | Database spreadsheet tool for designing tables, formatting data, and inserting formulas (http://office.microsoft.com/en-us/excel/) | • Creating database of coded data • Interfacing of database with various statistical software tools as needed |
The Meta-Analysis Electronic Process
The software programs selected to support the meta-analysis study needed to facilitate the project steps of searching the literature, screening and organizing studies, coding studies, tracking coding progress, data management and analyses, and communicating among project staff. In Table 2, we show a flow diagram of the project steps and the sequence of how the software programs were integrated. A detailed discussion of each step shown in Table 2 is provided below.
Table 2.
Steps in using electronic tools (EndNote, Adobe Acrobat Pro, Blackboard, Excel, SPSS)
LITERATURE SEARCHING

• Establish standardized search process for each database using thesaurus of terms
• Search electronic databases with guidance from research librarian
• Download search results directly into EndNote for storing, categorizing, & tracking literature

SCREENING AND ORGANIZING STUDIES

• Remove duplicates, download study PDFs from the Internet, store bibliographic information and PDFs
• Use column headings to categorize studies by author, title, publication, year published, etc.
• Use “Group” and “Group Set” functions in EndNote to group studies by database of origin or search variable
• Conduct 1st screen of studies in EndNote files, using the Quick Search and Show/Hide functions to find only studies meeting inclusion criteria
• Insert in EndNote the established screening code notations regarding status until final disposition
• Use EndNote for tracking studies for “ancestry search”

CODING

• Staff posts studies on Blackboard in Studies to be Coded folder, model variable subfolder, specific study subfolder
• 1st coder selects study to code & notes “in process”; enters data onto an electronic, Adobe Acrobat Pro formatted code sheet
• When finished, 1st coder posts code sheet in the specific study subfolder with a note indicating that it is ready for 2nd coding
• 2nd coder checks code sheet and identifies any coding discrepancies
• 2nd coder discusses any discrepancies found with 1st coder to resolve them, if possible

| If Coder Agreement | If Coder Discrepancies |
|---|---|
| Code sheet renamed and posted in the specific study subfolder | Code sheet renamed and posted in the specific study subfolder |
| Notation “READY” highlighted in yellow so staff knows code sheet is ready to be moved to Final Studies folder | Notation “EXPERT” in green indicating the code sheet needs to be checked by statistical or methods experts |
| | E-mail sent to Expert Coder that study is ready for checking, indicating author, date, & location |
| | Expert Coder reviews study code sheet, makes changes if necessary, and reposts in same location; notation “READY” highlighted in yellow so staff knows code sheet is ready to be moved to Final Studies folder |

• Staff assigns study ID, moves final code sheet to Final Studies folder, and notes date in the specific study subfolder
• After the code sheet is finalized, all comments and working files for that study are placed in an Archive folder on Blackboard

TRACKING PROGRESS AND FINAL CHECKING OF STUDIES

• A minimum of 10% of code sheets with coder agreement are randomly checked by an Expert Coder
• 100% of code sheets with coder discrepancies are checked by an Expert Coder
• All included studies, code sheets, coding communication notes, and process statuses of studies are available on Blackboard for all team members to view and access as needed

ANALYZING

• Data exported from electronic Adobe Acrobat Pro formatted code sheets into SPSS database via Excel

ONGOING COMMUNICATION

• Enhanced communication via online notes and e-mails on status, progress, and justification of decisions
• In-person meetings held for discussion and decisions, which are logged in an online decision log file
• All documents easily available wherever research staff are working
Note: Code sheets are titled according to an established naming convention, revised at each stage to reflect coder initials and coding status, e.g., 0170 Jones 2009-TG/LS/SAB-Final
Literature searching
An early step in a meta-analytic study is a search of online databases, similar to usual online searches but significantly more thorough and systematic. The meta-analyst is looking for ALL studies, published or unpublished (e.g., dissertations), in the area in order to decrease the likelihood of sampling error.3,4 We have searched 13 databases with specific search terms and search strategies designed for each database, based on the specific database taxonomy. To organize the located studies, we employed EndNote for managing bibliographies and references. This tool allowed us to store each citation in the EndNote library and attach the full text article to its citation. Citations were exported directly from the search database and categorized within EndNote for tracking purposes. An essential feature of EndNote is that it allowed us to identify duplicates of studies found through different, overlapping databases. Also, for future reference and reporting, we have stored the source database(s) used for each search strategy.
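EndNote's built-in duplicate detection handled the de-duplication step for us. For readers without EndNote, the underlying idea can be sketched in a few lines; the matching key below (normalized first author, year, and title) is our illustrative assumption, not EndNote's actual, configurable algorithm:

```python
import re

def citation_key(citation):
    """Build a normalized key from first author, year, and title.

    Lowercasing and stripping punctuation lets the same study match even when
    databases format the citation slightly differently.
    """
    title = re.sub(r"[^a-z0-9 ]", "", citation["title"].lower())
    title = " ".join(title.split())  # collapse runs of whitespace
    return (citation["first_author"].strip().lower(), citation["year"], title)

def deduplicate(citations):
    """Keep the first occurrence of each unique citation."""
    seen, unique = set(), []
    for c in citations:
        key = citation_key(c)
        if key not in seen:
            seen.add(key)
            unique.append(c)
    return unique
```

Keeping the first occurrence also preserves which source database located the study first, which supports the kind of per-database tracking described above.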
Screening and organizing studies
EndNote is useful for organizing full text articles located and saved in PDF format within Adobe Acrobat Pro, a document management software tool. The citations in EndNote were organized by each variable combination of the study (e.g., the variable combination of depression, type 2 diabetes, and adherence) under each database that was searched (e.g., PubMed). Initial screening of studies by title and abstract in the EndNote library was done by research staff using the screening protocol (see Table 3); screening codes were tracked in a custom field to indicate whether the study appeared on face value to involve relevant variables. This electronic approach enabled quick screening by using the search function of EndNote to look for key words, allowing research staff to isolate and remove studies that did not meet our meta-analysis inclusion criteria, for example, those involving children with type 1 diabetes. In many instances, the analytic results we sought were not identifiable in the title or abstract but were buried in the body of the report as secondary findings. For this complex model-testing meta-analysis project, full text articles and other research reports were therefore reviewed with the electronic document search feature in Adobe Acrobat Pro, which also enabled reviewers to highlight key information and insert comments indicating justification for decisions made. For scanned research documents that were unsearchable, the OCR (optical character recognition) function was applied so that highlighting and/or comments could be inserted and the document search function used. Studies that passed the first screening were uploaded to the meta-analysis Blackboard site, the web-based document management system useful for posting, organizing, and sharing documents, as well as enhancing communication among a group of individuals, in this case, the research team members.
For the project, the Blackboard website was organized by variables included in the meta-analysis. (See Figure 1 for screen shot of Blackboard website, which shows variable folders containing studies that need to be coded.) Other important citations were preserved in EndNote for potential further analysis and background information.
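The first-pass keyword screen can also be illustrated with a short sketch. The exclusion terms and record fields here are hypothetical examples; in the project, this screening was done with EndNote's Quick Search against the full protocol in Table 3:

```python
# Illustrative first-pass screen: flag records whose title or abstract mentions
# an exclusion indicator. Terms and field names are assumptions for this sketch;
# a flagged record would still get a human look before final disposition.
EXCLUDE_TERMS = ("type 1 diabetes", "gestational diabetes", "children")

def first_screen(record):
    """Return 'exclude' if an exclusion term appears, else 'review'."""
    text = (record.get("title", "") + " " + record.get("abstract", "")).lower()
    return "exclude" if any(term in text for term in EXCLUDE_TERMS) else "review"
```

Automating only the obvious exclusions keeps the human screeners focused on the records where relevant data may be buried in the body of the report.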
Table 3.
Screening Levels
| Code | Code Definition |
|---|---|
| Studies to be included | |
| 1 | Type 2 diabetes; contains 2 or more variables in the model; correlational data provided |
| 2 | Type 2 diabetes; contains 2 or more variables in the model; non-correlational data provided but in a format that allows computation of correlational effects |
| Studies to be excluded | |
| 3 | Mixed type 1 and type 2 diabetes; n and/or % of sample that is type 2 reported but no data provided to verify |
| 4 | No type 2 diabetes: sample involves participants with type 1, gestational, or drug-induced diabetes |
| 5 | Type of diabetes not defined |
| 6 | Sample does not involve individuals with diabetes |
| 7 | Data are either not correlational or not in a format that would enable computation of correlational effects, that is, no usable data |
| 8 | Other: specify reason |
| 9 | Other: specify reason |
| 10 | Data in the study are duplicative of another study |
Figure 1.
Snapshot of study site on Blackboard
Coding of primary studies
Development of the code sheet and codebook
A key step in conducting meta-analyses is development of a reliable and valid coding protocol for extracting data from primary studies, a difficult, time-consuming task for complex model-testing meta-analyses,5 and even for smaller meta-analytic studies. A formalized process for developing code sheets has been used previously by the research team in three separate studies.6,7,8,9,10 Our previous code sheets contained four basic variable categories: (a) methodological and substantive features, (b) study quality,11,12 (c) intervention descriptors, and (d) outcome measures. The development and refinement of our previous coding protocols, described in a 2003 publication by the first author,13 involved (a) selecting a random subset of primary studies that meet inclusion criteria, (b) reviewing and listing variables that are of interest in these studies, (c) adding or substituting variables on the code sheet and reorganizing the order of variables for a logical flow and ease of extracting the data, and (d) pilot testing the code sheet on a separate subset of studies.
The code sheet for this meta-analytic study was developed first in Microsoft Word and then saved as an Adobe Acrobat Pro document. Using the Form Wizard feature in Adobe Acrobat Pro, we converted the Microsoft Word document into a form with blank fillable fields, which were named and customized for format (font, field size and color, text wrapping, etc.). Not all fields transferred directly, and some required manual formatting and customization before the coding data could be entered. However, the main benefit of this process was that data later entered into the fillable fields were exportable directly into data management software, e.g., Excel and IBM SPSS Statistics (SPSS), thus eliminating the usual time-consuming data entry process.
The codebook was then revised to reflect the list of coding variables on the companion code sheet; it contained theoretical definitions and mutually exclusive, collectively exhaustive operational definitions of each variable to support acceptable levels of intercoder and intracoder agreement. In addition, the codebook contained decision criteria. For example, when duplicate reports of the same data were screened, the decision was to include only one publication per study, the one that provided the most complete set of variables targeted by the meta-analysis.14 The code sheet and codebook were posted on the Blackboard website as Adobe Acrobat Pro files for ease of access and search capability. A comprehensive and clear coding protocol, plus the use of the electronic tools, ensured that the coders understood the coding process and could communicate easily with other coders and senior investigators. Thus, we were more likely to achieve consistently high coder agreement levels during the coding process.
Coding processes
From each primary study meeting inclusion criteria for the meta-analysis, the project staff extracted correlations, or other data representing variable associations from which correlations could be calculated, between any two variables relevant to the meta-analysis.4 Extracted data were entered into the corresponding variable data element fields on the electronic code sheet formatted in Adobe Acrobat Pro. Descriptive characteristics of each primary study were also coded in order to describe the sample and to conduct moderator analyses.15,16
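Where primary studies reported statistics other than correlations, standard meta-analytic conversion formulas were applied. The sketch below illustrates two common conversions, from an independent-groups t statistic and from Cohen's d; it is a worked example of the standard formulas, not the project's own software, and the function names are ours:

```python
import math

def r_from_t(t, df):
    """Convert an independent-groups t statistic to r: r = t / sqrt(t^2 + df).

    The sign of t is preserved so the direction of the relationship is kept.
    """
    return math.copysign(math.sqrt(t * t / (t * t + df)), t)

def r_from_d(d, n1, n2):
    """Convert Cohen's d to r: r = d / sqrt(d^2 + a), with a = (n1 + n2)^2 / (n1 * n2).

    For equal group sizes, a reduces to 4.
    """
    a = (n1 + n2) ** 2 / (n1 * n2)
    return d / math.sqrt(d * d + a)
```

Converted values enter the code sheet alongside directly reported correlations, so all effect sizes are pooled on the same metric.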
All documents needed by the research team for coding primary studies were stored on the project’s Blackboard website, organized by variable folders: research reports in Adobe Acrobat Pro format, the project codebook to guide the coding process, a blank code sheet with coding fields designed to interface with the project database, the decision log containing coding decisions made by the research team, and all completed code sheets containing data extracted from primary research reports. The majority of these documents were saved in PDF format using Adobe Acrobat Pro. This completely electronic process enabled each coder to independently code studies, highlight and insert comments for clarification, and, in most instances, communicate and resolve disagreements without having to meet face-to-face. It also allowed senior investigators to double-check the coding of studies completed by the initial coders, with the added benefit of seeing their decision processes as documented in notes in the specific website folder for each primary study. The Blackboard website allowed us to easily determine initial and ongoing interrater agreements as well as intrarater reliabilities. To date, our initial interrater agreements have exceeded 0.87; final agreement was effectively 100% because no code sheet was finalized until at least two coders agreed on the coding. Additionally, the research team met at least biweekly to discuss any remaining coding disagreements until consensus was reached. Thus, the coding process was strengthened and coding discrepancies were corrected.
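The 2nd coder's field-by-field comparison amounts to a simple percent-agreement check. The sketch below is illustrative only; in the project this comparison was done by the coders themselves on the Adobe Acrobat Pro code sheets, and the field names shown are hypothetical:

```python
def coder_agreement(sheet1, sheet2):
    """Proportion of shared fields on which two coders entered the same value."""
    fields = set(sheet1) & set(sheet2)
    if not fields:
        return 0.0
    matches = sum(sheet1[f] == sheet2[f] for f in fields)
    return matches / len(fields)

def discrepant_fields(sheet1, sheet2):
    """Fields needing discussion between the 1st and 2nd coder."""
    return sorted(f for f in set(sheet1) & set(sheet2) if sheet1[f] != sheet2[f])
```

A code sheet would be finalized only when `discrepant_fields` is empty, mirroring the rule that no sheet was finalized until at least two coders agreed.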
Tracking progress and final checking of studies
Using these electronic tools for the meta-analysis enabled tracking of coding progress and final checking of coded data for accuracy because all the needed documents were easily available to members of the research team, wherever and whenever they were working. Code sheets were checked for omitted data, the direction of relationships between variables, and data entry errors. Justification of coding decisions made by the initial coders and senior investigators was noted on Blackboard in the relevant study subfolder beside the copies of the research report and the related code sheet. Senior investigators, including a member with statistical expertise, randomly checked a minimum of 10% of finalized code sheets posted on the project’s Blackboard website and reviewed any studies referred by initial coders because of coding disagreements.
Analyzing data
Using Adobe Acrobat Pro, code sheets were designed with fields that enabled export of coded data into a Microsoft Excel spreadsheet or directly into IBM SPSS Statistics (SPSS) statistical software for analyses. This was accomplished through the Tools menu options; data from several code sheets could be added, one sheet at a time. With this automated process, data entry was less time consuming and less vulnerable to data entry errors.
Ongoing communication
On the project’s Blackboard website, coders posted comments or questions in the study subfolder where each study and its code sheet were posted. Decisions made by initial coders during the coding process and areas of coder disagreement were communicated through notes, which facilitated the excellent levels of coding agreement described above. Status updates also noted whether the study had been finalized or whether there were issues to be taken to the next research team meeting. The notes were color coded to facilitate scanning through the folder to quickly identify study status, such as whether the study was completed or was being referred to a senior investigator for review or to the research team meeting for discussion.
Weekly or bi-weekly research team meetings have been held to discuss coding issues and review any specific studies that required a new coding decision or needed group input into the coding process. The project’s Blackboard website was projected onto a screen and all team members could visualize the specific study and code sheet under discussion. If any new coding decisions were made, the decision log was updated at that time. For training purposes and also for complicated studies, the meetings were used for group coding.
Strengths Versus Limitations of the Electronic Process
The electronic process described here has been useful from a number of perspectives. One of the primary considerations involved the desire to establish processes that would enable the research team to achieve, maintain, and track interrater reliabilities of many aspects of the coding process, as well as of decisions (e.g., whether the primary study met inclusion criteria) that were made during the meta-analytic process. As in any research study, reliability and validity were important concerns. But in meta-analytic studies, acceptable levels of reliability and validity may be more difficult to attain because multiple decisions are required during the steps of a meta-analysis and each decision can ultimately affect, and potentially confound, the findings.4,5 Any process that enables careful management and tracking of coding and other decisions would be useful.
The electronic tools described here have been extremely helpful in streamlining the meta-analysis process (storing and screening studies for inclusion, coding studies, establishing interrater agreements, and increasing mobility and access to data for review) and in enhancing communication and document sharing among research team members, particularly study coders. Further, documents on the Blackboard website were archived on a regular basis so that back-up copies were secured. (We also backed up documents on automated external drives for additional data security.) Using electronic tools also enabled members of the team to work on coding studies, conduct checks of coded data, and calculate correlations (when necessary) wherever they chose to work, such as on airline flights or at home, as long as they had computer access.
The major limitation in designing and implementing a fully electronic system for our model-testing meta-analysis was that it took considerable preparatory time to decide which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. We spent much of the first year designing, testing, and revising the system and training research team members. Training sessions frequently involved group coding of studies as documents were projected onto a screen from the project’s Blackboard website. Once these developmental steps were accomplished, coding proceeded smoothly and more quickly, although we found it necessary to “tweak” the processes several times for greater efficiency.
There were several issues that we had to address before the process was efficient and streamlined. We initially included steps that were later found to be superfluous and were ultimately removed. For example, coders posted notes in the Finalization Log (a Microsoft Word document) when they finished coding a study, to indicate where the code sheet was located and to inform project staff to migrate study documents to the Final Sample folder. We determined that it was more efficient to make these notes directly on the Blackboard website (through color coding) in the subfolder where the documents of each primary study were stored. Two other issues involved (a) the cumbersome process of having to download and repost the latest version of documents onto the Blackboard website and (b) determining which code sheet was the latest version. An ideal system would enable coders to make changes in real time without having to download and repost the updated version. To identify the latest versions of documents, we developed a labeling format that made identification clear, eliminated the need to delete old documents, and left a paperless audit trail. (See Table 2.) The ability for two research staff to share documents in real time would be a useful feature. Other shared network sites, such as Sakai or Google Docs, could perhaps facilitate sharing documents in real time, but we were limited in our choices to the sites that were available at and supported by the university.
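The version-labeling idea can be illustrated with a short sketch. The label grammar below is our reading of the convention in Table 2, and the intermediate status names (“Draft”, “Second”, “Expert”) are assumptions for illustration; only the final form (e.g., 0170 Jones 2009-TG/LS/SAB-Final) appears in the paper:

```python
import re

# Assumed label grammar: "<study id> <author> <year>-<coder initials>-<status>"
LABEL = re.compile(
    r"(?P<id>\d+) (?P<author>\S+) (?P<year>\d{4})-(?P<coders>[A-Z/]+)-(?P<status>\w+)"
)
# Hypothetical processing stages in order; "Final" matches the documented convention.
STATUS_ORDER = {"Draft": 0, "Second": 1, "Expert": 2, "Final": 3}

def latest_version(labels):
    """Return the label with the most coder sign-offs and most advanced status."""
    def rank(label):
        m = LABEL.match(label)
        if not m:
            return (-1, -1)  # malformed labels sort last
        return (len(m.group("coders").split("/")),
                STATUS_ORDER.get(m.group("status"), -1))
    return max(labels, key=rank)
```

Because every renamed copy is kept, ranking the labels recovers the current version while preserving the paperless audit trail.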
We encountered an issue with data transfer when data were saved to a PDF file in Adobe Acrobat Pro but the file was opened through the Preview program (Apple, Cupertino, CA), the default PDF viewer on an Apple computer. The saved data were there but could be seen only on that particular Apple computer; they appeared invisible on other machines. To fix the problem, we exported the “invisible” data and re-imported them into the same document, making sure the file was not opened in the Preview program. We then re-set all Apple computers so they did not default to the Preview program when opening PDF documents. Minor problems also can occur when updated versions of software are released, but any program differences are easily accommodated so that the research is not delayed unnecessarily. One final caveat that should be noted again is that any electronic tools selected for such research endeavors need to be supported by the investigators’ university or institution, and any anticipated changes in technical support need to be taken into consideration.
The largest meta-analysis we conducted in the past using a paper-and-pencil process involved fewer than 100 studies. We could not have conducted a meta-analysis with the complexity and scope of the model-testing study without electronic support. However, even for less complicated meta-analyses and other types of non-meta-analytic studies, the steps reported here are directly applicable, e.g., automated data entry from coding forms into Microsoft Excel or IBM SPSS Statistics (SPSS) spreadsheets. Thus, the approach described has wide generalizability and utility, although adaptations would be needed to reflect the software available in an individual researcher’s setting.
Blackboard is not the only application that could serve as a site for organizing research data and documents and fostering communication among project staff. When this meta-analytic project was started in 2009, many of the current “cloud applications” were not developed yet. Blackboard was already in place at the university and did not require us to use any grant funds to invest in new data management systems. Blackboard had been the university’s online learning tool for some time and thus, we knew that it was always reliable and accessible from both on and off-campus locations. It: (a) was, and continues to be, stable and reliable; (b) met our data organization purposes and security requirements; (c) provided us with local technical support by university employees who received training on data privacy and security; and (d) furnished us with back-up systems for project data. Blackboard is run on university-owned, centrally managed, secured, controlled-access servers; as investigators on the meta-analysis project, we set access permissions locally. Given that Blackboard served our purposes securely and effectively, there has been no motivation to seek other solutions. From a continuity perspective, if the university replaced Blackboard with another system, the entire campus would migrate to it, along with the many IT support systems already in place; and the transition would be relatively seamless.
When newer applications, such as cloud computing technology, became available, we did not consider changing systems because Blackboard and the other programs we had selected were working well. Cloud applications are less of a known entity: data are stored on remote servers, so one gives up local control, and servers can at times be overloaded or unavailable. The university, like other institutions involved in research, is required by funding agencies to maintain high standards of data security, and it is not clear that newer technologies would meet these criteria. The data in the meta-analysis we are conducting are aggregate, not individual-level, data; they are therefore considered low risk and are not subject to human subjects protection regulations. Even with lower-risk data, however, we take steps to ensure data security so that we can be confident in the accuracy and safe retention of the database we are building.
The main message of this paper is that researchers should consider using existing electronic resources in innovative ways for research purposes, that is, to employ tools that are familiar, accessible, and reliable. In academic settings, almost any courseware would meet these objectives because such systems typically enhance staff intercommunication and allow data sharing in a secure environment.
In summary, criticisms of meta-analysis, as well as of other types of research, include that such studies require substantial personnel effort and time.17 Meta-analysis is a research strategy for uncovering the meaning of a body of research so that findings can be implemented confidently in practice. It is unwise to change practice based on the findings of a single study unless it is a large, well-designed clinical trial involving a diverse sample. Few studies reach this level of complexity and scope, mainly because of the substantial resources, including funding, required to conduct such a trial. Synthesizing smaller studies via a well-constructed meta-analysis serves the same purpose, i.e., identifying evidence on which to base future clinical practice. Finding ways to streamline the research process, such as employing the electronic tools reported here, will make future discovery of improvements in practice more efficient and less time consuming.
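As a brief illustration of the synthesis step, the sketch below pools correlations across studies using the "bare-bones" sample-size-weighted approach associated with the Hunter and Schmidt framework the project cites; the study values shown are invented for illustration, not data from the actual meta-analysis.

```python
# Minimal sketch of a bare-bones pooled correlation (Hunter-Schmidt style).
# Input studies are (sample size, observed correlation) pairs; values below
# are hypothetical, not drawn from the meta-analysis described in the text.

def pool_correlations(studies):
    """Return the weighted mean r, observed variance, and residual variance."""
    total_n = sum(n for n, r in studies)
    # Sample-size-weighted mean correlation across studies
    r_bar = sum(n * r for n, r in studies) / total_n
    # Weighted observed variance of the correlations
    var_obs = sum(n * (r - r_bar) ** 2 for n, r in studies) / total_n
    # Expected sampling-error variance, weighted across studies
    var_err = sum(n * (1 - r_bar**2) ** 2 / (n - 1) for n, r in studies) / total_n
    # Residual (between-study) variance attributable to true differences
    var_true = max(var_obs - var_err, 0.0)
    return r_bar, var_obs, var_true

# Hypothetical primary studies: (sample size, observed correlation)
studies = [(120, 0.32), (85, 0.25), (240, 0.30), (60, 0.18)]
r_bar, var_obs, var_true = pool_correlations(studies)
print(f"pooled r = {r_bar:.3f}")
```

Comparing the observed variance with the expected sampling-error variance is what lets a meta-analyst judge whether the studies estimate one common effect or genuinely differ.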
Acknowledgments
Funding: This publication was made possible by grant #5R01NR011450 from the National Institute of Nursing Research (NINR) at the National Institutes of Health to Sharon A. Brown. Its contents are solely the responsibility of the authors and do not necessarily represent the official views of NINR.
Footnotes
Conflicts of Interest: The authors declare no conflicts of interest.
Contributor Information
Sharon A. Brown, School of Nursing, The University of Texas at Austin.
Ellen E. Martin, School of Nursing, The University of Texas at Austin.
Theresa J. Garcia, School of Nursing, The University of Texas at Austin.
Mary A. Winter, School of Nursing, The University of Texas at Austin.
Alexandra A. García, School of Nursing, The University of Texas at Austin.
Adama Brown, School of Nursing, The University of Texas at Austin.
Heather E. Cuevas, School of Nursing, The University of Texas at Austin.
Lisa L. Sumlin, School of Nursing, The University of Texas at Austin.
References
- 1. Del Prato S, Felton A. Glucose control: closing the gap between guidelines and practice. Diabetes Voice. 2006 Mar;51(1):15–8.
- 2. International Diabetes Federation (IDF) Clinical Guidelines Task Force. Global Guideline for Type 2 Diabetes. International Diabetes Federation; Brussels, Belgium: 2005.
- 3. Conn VS, Valentine JC, Cooper HM, Rantz MJ. Grey literature in meta-analyses. Nurs Res. 2003 Jul-Aug;52(4):256–61. doi: 10.1097/00006199-200307000-00008.
- 4. Hunter JE, Schmidt FL. Methods of Meta-analysis: Correcting Error and Bias in Research Findings. 2nd ed. Sage Publications; Thousand Oaks, CA: 2004.
- 5. Cooper HM. Research Synthesis and Meta-analysis: A Step-by-Step Approach. 4th ed. Sage Publications; Thousand Oaks, CA: 2010.
- 6. Author. Studies of educational interventions and outcomes in diabetic adults: a meta-analysis revisited. Patient Educ Couns. 1990 Dec;16(3):198–215. doi: 10.1016/0738-3991(90)90070-2.
- 7. Author. Quality of reporting in diabetes patient education research: 1954-1989. Res Nurs Health. 1990 Feb;13(1):53–62. doi: 10.1002/nur.4770130109.
- 8. Author. Effects of educational interventions in diabetes care: a meta-analysis of findings. Nurs Res. 1988 Jul-Aug;37(4):223–30.
- 9. Author, Hedges LV. Predicting metabolic control in diabetes: a pilot study using meta-analysis to estimate a linear model. Nurs Res. 1994 Nov-Dec;43(6):362–8.
- 10. Author, Upchurch S, Anding R, Winter M, Ramírez G. Promoting weight loss in type II diabetes. Diabetes Care. 1996 Jun;19(6):613–24. doi: 10.2337/diacare.19.6.613.
- 11. Conn VS, Rantz MJ. Research methods: managing primary study quality in meta-analyses. Res Nurs Health. 2003 Aug;26(4):322–33. doi: 10.1002/nur.10092.
- 12. Linde K, Scholz M, Ramírez G, Clausius N, Melchart D, Jonas WB. Impact of study quality on outcome in placebo-controlled trials of homeopathy. J Clin Epidemiol. 1999 Jul;52(7):631–6. doi: 10.1016/s0895-4356(99)00048-7.
- 13. Author, Upchurch SL, Acton GJ. A framework for developing a valid and reliable coding scheme for meta-analysis. West J Nurs Res. 2003 Mar;25(2):205–22. doi: 10.1177/0193945902250038.
- 14. Tramèr MR, Reynolds DJM, Moore RA, McQuay HJ. Impact of covert duplicate publication on meta-analysis: a case study. BMJ. 1997 Sep 13;315(7109):635–40. doi: 10.1136/bmj.315.7109.635.
- 15. Conn VS. Anxiety outcomes after physical activity interventions: meta-analysis of findings. Nurs Res. 2010 May-Jun;59(3):224–31. doi: 10.1097/NNR.0b013e3181dbb2f8.
- 16. Conn VS, Hafdahl AR, Mehr DR. Interventions to increase physical activity among healthy adults: meta-analysis of outcomes. Am J Public Health. 2011 Apr;101(4):751–8. doi: 10.2105/AJPH.2010.194381.
- 17. Lipsey MW, Wilson DB. Practical Meta-analysis. Sage Publications; Thousand Oaks, CA: 2001.