Journal of Neurotrauma. 2014 Aug 1;31(15):1354–1361. doi: 10.1089/neu.2014.3400

Minimum Information about a Spinal Cord Injury Experiment: A Proposed Reporting Standard for Spinal Cord Injury Experiments

Vance P. Lemmon 1, Adam R. Ferguson 2, Phillip G. Popovich 3, Xiao-Ming Xu 4, Diane M. Snow 5, Michihiro Igarashi 6, Christine E. Beattie 7, John L. Bixby 1
PMCID: PMC4120647  PMID: 24870067

Abstract

The lack of reproducibility in many areas of experimental science has a number of causes, including a lack of transparency and precision in the description of experimental approaches. This has far-reaching consequences, including wasted resources and slowing of progress. Additionally, the large number of laboratories around the world publishing articles on a given topic makes it difficult, if not impossible, for individual researchers to read all of the relevant literature. Consequently, centralized databases are needed to facilitate the generation of new hypotheses for testing. One strategy to improve transparency in experimental description, and to allow the development of frameworks for computer-readable knowledge repositories, is the adoption of uniform reporting standards, such as common data elements (data elements used in multiple clinical studies) and minimum information standards. This article describes a minimum information standard for spinal cord injury (SCI) experiments, its major elements, and the approaches used to develop it. Transparent reporting standards for experiments using animal models of human SCI aim to reduce inherent bias and increase experimental value.

Key words: axonal injury, axonal regeneration, MIASCI, spinal cord injury

Introduction

The lack of reproducibility in science has been recognized for decades.1–3 Recently, this issue has been highlighted by scientists in pharmaceutical companies who reported that the majority of published basic biomedical science experiments identifying potential therapeutic targets could not be replicated.4,5 Similarly, a project funded by the National Institute of Neurological Disorders and Stroke (NINDS) resulted in a high failure rate in replicating published pre-clinical results for treatment of spinal cord injury (SCI).6 Strikingly, studies using RhoA/ROCK inhibitors or stem cells to treat SCI report more favorable outcomes when the articles do not state whether investigators were blinded during behavioral testing.7,8 Challenges in interpreting primary endpoints of morphological and functional neurological regeneration make careful documentation of the methods used in SCI experiments essential.9–11 In addition, concerns have been raised about a number of issues in neuroscience publications, including inappropriate statistics12,13 and low power (calculated from 49 meta-analyses of 739 primary studies to be 21%14); these concerns have prompted calls to decrease p values from the commonly used 0.05 to 0.005–0.001.15 Changes in standard practices are needed to improve reproducibility and thus the translation of basic SCI research to the clinic.16

Leaders in the neuroscience community have recommended specific changes in the way basic science studies are conducted and reported.6,17,18 In the larger scientific community, there has been extensive discussion of strategies to improve reproducibility. For example, litter-to-litter variability can have a large impact on animal behavioral studies.19 Remarkably, between 2008 and 2013, there were at least 20 peer-reviewed publications presenting guidelines for in vivo pre-clinical studies.20 Common recommendations included appropriate sample size, randomization of animals to groups, the use of positive and negative controls, and blinding of scientists to treatments during outcome assessments.

In addition, the general lack of data availability is a major hindrance to interpretation and reanalysis of published experiments. Critical metadata regarding how experiments are done are often missing from published methods, and the raw data underlying graphs and figures are rarely included in publications. Many journals stipulate that primary data associated with a publication, such as microarray data, must be posted in public databases. However, an analysis of publications in the 50 journals with the highest impact factors in 2009 revealed poor compliance (only 47 of 500 articles) with full deposition of primary data.21

Scientists and clinicians conducting clinical trials have had to comply with standardized design and reporting guidelines for many years. For example, a common data element (CDE) system is in place for stroke and SCI trials.22,23 A group of editors representing some 400 journals has already set standards for publication of clinical trials through the CONSORT (Consolidated Standards of Reporting Trials) guidelines,24,25 and, similarly, journal editors are calling for improvements in the conduct and reporting of pre-clinical research.26 However, the CDE concept is being applied slowly in the pre-clinical arena. A parallel worldwide effort is being launched under the umbrella of the Minimum Information for Biological and Biomedical Investigations (MIBBI) project.27 Working groups associated with various organizations and ad-hoc groups have developed reporting standards. Perhaps the most widely used is the Minimum Information About a Microarray Experiment (MIAME).28 Similar approaches are being encouraged by the ARRIVE guidelines (Animal Research: Reporting of In Vivo Experiments)29 and the CAMARADES initiative (http://www.camarades.info/) for experimental models of multiple sclerosis, traumatic brain injury, and stroke.30–32 Finally, reporting standards can help reviewers evaluate and critique grants and manuscripts. It is worth noting that although reporting standards encourage the use of best practices by focusing attention on critical concepts in a particular experimental domain, they are not to be confused with experimental practice standards. Pre-clinical research requires innovation that might be hampered if only existing models or assessment methods were allowed.

The adoption of a reporting standard not only improves transparency of research and encourages the use of best practices, but it also has the additional benefit of facilitating the aggregation and interrogation of large data sets in a given domain by annotating data and metadata using standardized terminology.33 The number of publications each year already exceeds the capacity of individuals to find, read, and absorb them, even in relatively small research areas, such as SCI. It is also difficult for researchers to compare variables across studies because of the lack of consistency in reporting experimental design parameters. Consequently, to ensure that valuable data are not lost or needlessly duplicated, these data need to be collected into user-friendly knowledge bases. Moreover, as the Minimum Information about a Spinal Cord Injury Experiment (MIASCI) aims to reduce bias and thereby enhance the predictive value of experimental SCI modeling, the number of animals needed to verify or falsify a scientific hypothesis could be reduced. Such an effort is in line with the “3R” (Replacement, Refinement, and Reduction) initiative to increase animal welfare34 and reduce waste in biomedical research.35,36

To begin to address the problems outlined above, an international group of scientists studying SCI has worked collaboratively to develop a draft standard, termed MIASCI.

Methods and Results

In October 2012, a 3-day workshop, entitled “Growth Cones and Axon Regeneration: Entering The Age of Informatics” was held in New Orleans. The 35 participants (listed in Appendix B) spent substantial time in small working groups developing lists to describe important information relating to the performance of SCI experiments (i.e., the “metadata” concerning these experiments). After the meeting, the documentation from these discussions was collated and circulated to the meeting's advisory group for further review. A draft MIASCI checklist was assembled based on the guidelines recommended by the MIBBI project.27

The draft MIASCI was presented at four international meetings to obtain feedback from the SCI community.37–40 An online poll was then conducted to obtain specific feedback concerning the importance of individual items in the checklist. Seventy-six SCI experts (of 125 invited to participate) from the global SCI research community participated in this poll. Thirty-four percent of the participants had more than 10 years of experience in the SCI field and 24% had 4–10 years of experience. Finally, individuals in the SCI community were invited to become part of the MIASCI Consortium (Appendix A). They reviewed the MIASCI checklist and the manuscript, made suggestions, and agreed to be co-authors.

On December 11, 2013, a draft MIASCI was listed at the BioSharing portal (http://biosharing.org/bsg-000541), the current host of MIBBI checklists, and the MIASCI checklist (version 0.8) was posted at SourceForge (https://sourceforge.net/projects/miasci), a resource for open-source software development and distribution. The 1.0 version was posted at SourceForge on April 14, 2014, and is provided in the supplementary material. Please visit the MIASCI SourceForge site for the current version.

MIASCI data elements

Typical minimum information standards strive to minimize the number of required data elements to increase the likelihood that investigators will provide essential information. The average minimum information standard has approximately 50 required data elements. Because the design and analysis of SCI experiments vary widely, the draft MIASCI has approximately 250 required data elements. However, most SCI studies cover only a narrow range of these elements, and for a given study, the number of applicable data elements from the MIASCI is likely to be 100 or fewer.

The draft MIASCI has 11 major sections: investigator, organism, surgery, perturbagen, cell transplantation, biomaterials, histology, immunohistochemistry, imaging, behavior, and data analysis and statistics.
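As a sketch, this top-level organization can be represented as a machine-readable skeleton. The section names follow the list above, but the nested element names (`species`, `strain`) are illustrative stand-ins, not the official MIASCI data elements:

```python
# Illustrative sketch of a MIASCI-style metadata skeleton.
# The 11 section names follow the draft standard; nested keys are
# hypothetical examples, not official MIASCI element names.
MIASCI_SECTIONS = [
    "investigator", "organism", "surgery", "perturbagen",
    "cell_transplantation", "biomaterials", "histology",
    "immunohistochemistry", "imaging", "behavior",
    "data_analysis_and_statistics",
]

def new_miasci_record():
    """Return an empty record with one sub-dictionary per major section."""
    return {section: {} for section in MIASCI_SECTIONS}

record = new_miasci_record()
record["organism"]["species"] = "Rattus norvegicus"  # hypothetical element
record["organism"]["strain"] = "Sprague-Dawley"
```

Only the sections relevant to a given study would be populated, consistent with the expectation that most studies use well under half of the available data elements.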

The section on organism covers important information, such as animal source, strain, and congenic status. Correct identification of strain and details about congenic status are increasingly recognized as essential information. Different mouse strains show different responses to injury and different recovery patterns after SCI and optic nerve crush.41–43 The same rat strain from two different regional suppliers of the same commercial vendor can yield different outcomes in SCI experiments.44 Thus, the organism section also covers information about housing that has proven critical to the outcome of SCI experiments, such as animal batching, environmental enrichment, and light/dark cycle.10 Animal batching (randomized, randomized block, and so on) can be complicated,45 especially when working with transgenic mice that might be difficult to produce in large numbers. Nonetheless, reporting how this is managed is important. Details about control groups (littermates, congenic status, and so on) should also be documented.

The surgery and behavior sections include whether the surgeon(s) were blinded to treatment group (concealed allocation); the type of injury and the device used to inflict it, including whether compression was maintained after impact and for how long; and data elements regarding hydration and drugs (e.g., anesthetics, analgesics, and antibiotics) given to animals. Based on the recommendations of Landis and colleagues,17 these sections also cover animal batching methods, power calculation methods, estimation of effect size, and details about whether observers were blinded to the experimental intervention. Small sample size is a general problem in biological studies,14 but it can be a special problem in SCI research because animal experiments are especially time-consuming and expensive. Therefore, it is critical for scientists to consider and report this aspect of their work. In the behavior section, special attention is paid to collecting important metadata about the widely used BBB test,46 the BMS test,41 and various gait analysis methods.

A major focus of SCI research is in evaluating the ability of agents to improve recovery after injury. The drug discovery field has adopted the term “perturbagen” to refer to small molecules, peptides, antibodies, oligonucleotides, and so on, that alter a biological process by interfering with one or more molecular targets. Common perturbagens used in the SCI field are U.S. Food and Drug Administration–approved drugs, chemical compounds, naturally occurring bioactive agents, siRNAs, shRNAs, and cDNAs. The route (oral, intravenous, subcutaneous, minipump, and so on) and other aspects (time since injury and dose or doses) of administration of the perturbagen need to be documented. The Minimum Information About a Cellular Assay (MIACA) project (http://miaca.sourceforge.net) proposed a standard that included data elements about perturbagens, such as vendor, catalog number, stock concentration, storage, and solvent. MIACA also covers experiments using viruses (type and titer). In accord with best practices in MIBBI and ontology development, we have chosen to reuse relevant sections of MIACA to describe perturbagen use in MIASCI.
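A hedged sketch of how a perturbagen might be described, combining MIACA-style agent fields (vendor, catalog number, stock concentration, storage, solvent) with the administration details the draft MIASCI requires. The field names, the vendor, and the catalog number below are hypothetical illustrations, not official MIASCI or MIACA elements:

```python
from dataclasses import dataclass, field

@dataclass
class PerturbagenRecord:
    # MIACA-inspired description of the agent itself (field names illustrative)
    name: str
    vendor: str
    catalog_number: str
    stock_concentration: str
    storage: str
    solvent: str
    # Administration details covered by the draft MIASCI
    route: str              # e.g., oral, intravenous, subcutaneous, minipump
    time_since_injury: str  # when administration began relative to injury
    doses: list = field(default_factory=list)

# Hypothetical example: a ROCK inhibitor from a fictitious vendor.
example = PerturbagenRecord(
    name="Y-27632", vendor="ExampleBio", catalog_number="EB-1234",
    stock_concentration="10 mM", storage="-20 C", solvent="DMSO",
    route="intrathecal minipump", time_since_injury="immediately post-injury",
    doses=["0.3 mg/day for 14 days"],
)
```

Capturing these fields in a structured record, rather than free-text methods, is what makes later aggregation across studies tractable.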

The cell transplantation section requests information about cell source, isolation and purification methods, batch, and passage, inspired by good laboratory practice/good manufacturing practice standards. Additional information about growth factors and differentiation methods is included, as is information about any immunosuppression methods or supporting growth factors administered at the time of cellular delivery. Also required are the time since injury and the site, route, and volume of introduction, as well as the number of cells delivered, the number of deliveries, and the duration over which they occur.

Biomaterials are also covered, because they are commonly used in SCI experiments. Domain experts recommended that details about composition, mechanical properties, manufacturer, and manufacturing methods be collected along with details about release kinetics, as well as mode and rate of metabolism, if the material is used to deliver a compound or biologic and whether any safety data are available regarding biodistribution after introduction.

Histology, immunohistochemistry, and imaging are three highly interrelated areas often used in evaluating the results of SCI experiments. The histology section of MIASCI deals with fixation, tissue processing, and axonal tracing methods. The immunohistochemistry section is devoted to collecting information about antibodies used in the research project. The Journal of Comparative Neurology has set a very high bar for describing antibodies and verifying the specificity of antibody labeling.47 Most other journals do not have such high standards, so better documentation of antibodies and their specificity remains a major concern48 (source and catalog number, at least, are essential, whereas batch information is very desirable). The Neuroscience Information Framework (NIF) has an antibody registry (www.antibodyregistry.org), which provides unique identifiers for antibodies, including those available from approximately 200 vendors. MIASCI will use these identifiers and will link to the NIF search tool to facilitate data entry into MIASCI. Basic information about the imaging platform is requested, especially information about image acquisition and analysis software and settings, as well as the specific counting and measurement methods used.17 Software developers and computational biologists have discussed the critical importance of documenting software algorithms and settings.49–51 This includes archiving exact versions of programs used and version control of all custom scripts. When this logic is applied to image analysis, it requires documentation of software versions and settings, optical filters, and software filters, such as thresholds, used during image acquisition and analysis. We hope that the MIASCI will encourage the use of these imaging best practices.
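To illustrate how registry identifiers could support antibody documentation, the sketch below records the essential fields (source and catalog number) and the desirable batch field, and checks the identifier against an Antibody Registry–style pattern (e.g., AB_123456). The pattern, function name, and vendor are assumptions for illustration, not part of the MIASCI specification:

```python
import re

# Antibody Registry identifiers (www.antibodyregistry.org) take the form
# AB_<digits>; this pattern is an assumption for the sketch.
AB_ID_PATTERN = re.compile(r"^AB_\d+$")

def antibody_entry(target, source, catalog_number, registry_id, batch=None):
    """Build an antibody record: source and catalog number are essential,
    batch is desirable, and the registry ID links the entry to the
    NIF antibody registry."""
    if not AB_ID_PATTERN.match(registry_id):
        raise ValueError(f"unrecognized registry identifier: {registry_id}")
    return {
        "target": target,
        "source": source,
        "catalog_number": catalog_number,
        "registry_id": registry_id,
        "batch": batch,  # optional but very desirable
    }

# Hypothetical entry for an anti-GFAP antibody from a fictitious vendor.
entry = antibody_entry("GFAP", "ExampleVendor", "EV-0001", "AB_123456")
```

Validating identifiers at data-entry time is one way an annotation tool could enforce the minimum antibody documentation the text describes.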

The data analysis and statistics section is relatively brief, but covers concepts related to blinding of investigators, the kinds of technical and biological replicates used, prospective analysis plans and the statistical tests performed, normalization schemes, and positive and negative controls. These data elements are deemed critical by the SCI domain experts and are common recommendations in standards for life science research.20

An inevitable limitation is that a MIASCI statement relies on good faith and therefore cannot detect or prevent intentionally false statements. However, the utility of MIASCI could be tested experimentally, for example, by a prospective study comparing relevant publications before and after inauguration of the MIASCI standard. MIASCI acceptance and integration could be measured by analyzing whether experimental characterization according to the MIASCI statement is increasing. MIASCI efficacy could be investigated by determining whether the precision of the primary outcome measure effect size increased after inauguration of the MIASCI standard.

Summary

The MIASCI draft standard has been proposed to capture SCI experimental details, including the investigator, organism, behavior, surgery, perturbagens, histology, immunohistochemistry, imaging, biomaterials, cell transplantation, and data analysis. MIASCI provides a basis for establishing good laboratory practice in the modeling of SCI. This first version of MIASCI will require modifications and further development as new methodologies are applied to the study and treatment of SCI. In some cases, simply requiring the use of other standards, such as MIAME or minimum information about an RNA-Seq experiment, will be appropriate. In other cases, however, expanding or developing a new section may be needed. At present, the biomaterials section is minimal and there is no coverage of some molecular techniques, such as in situ hybridization or real-time polymerase chain reaction. If scientists, reviewers, and editors use the MIASCI when reporting SCI studies, this should not only improve reporting, but also promote the adoption of best practices, such as randomization of animals to treatment groups, appropriate use of power analysis, and the blinding of scientists to treatment conditions. To facilitate the adoption of MIASCI, an overall framework, including an ontology and simple-to-use annotation tools, is being built and will be freely available for data annotation and submission.
An important benefit of MIASCI adoption is that it will greatly facilitate the population of domain-specific databases, such as NIF, Regenbase,37 Adam Ferguson's SCI Database,52,53 Barbara Grimpe's SCI text mining initiative,54 the University of Düsseldorf Center for Neuronal Regeneration's SCI Database,55 and the CAMARADES project on meta-analysis of animal studies.30,32,36 Thus, establishment of transparent reporting standards in pre-clinical SCI research should facilitate experimental accuracy and lab-to-lab reproducibility, helping to speed up the development of novel therapeutic approaches to SCI.

Supplementary Material

Supplemental data
Supp_Data.pdf (675.6KB, pdf)

Appendix A

MIASCI Consortium

Saminda W. Abeyruwan, Computer Sciences, University of Miami, Coral Gables, Florida.

Michael S. Beattie, Neurological Surgery, University of California San Francisco, San Francisco, California.

John R. Bethea, Biology, Drexel University, Philadelphia, Pennsylvania.

Frank Bradke, Axon Growth and Regeneration, German Center for Neurodegenerative Diseases, Bonn, Germany.

Jacqueline C. Bresnahan, Neurological Surgery, University of California San Francisco, San Francisco, California.

Mary B. Bunge, Miami Project to Cure Paralysis, University of Miami, Miami, Florida.

Alison Callahan, Stanford Center for Biomedical Informatics Research, Stanford University, Stanford, California.

Samuel David, Center for Research in Neuroscience, McGill University, Montreal, Quebec, Canada.

Sarah A. Dunlop, Experimental and Regenerative, Neurosciences, University of Western Australia, Crawley, WA, Australia.

James W. Fawcett, Center for Brain Repair, Cambridge University, Cambridge, United Kingdom.

Michael G. Fehlings, Division of Neurosurgery, University of Toronto, Toronto, Ontario, Canada.

Itzhak Fischer, Neurobiology and Anatomy, Drexel University, College of Medicine, Philadelphia, Pennsylvania.

Paul Forscher, Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, Connecticut.

Roman J. Giger, Cell & Developmental Biology, University of Michigan, Ann Arbor, Michigan.

Yoshio Goshima, Department of Molecular Pharmacology & Neurobiology, Yokohama City University Graduate School of Medicine, Yokohama, Japan.

Barbara Grimpe, Department of Neurology, Applied Neurobiology Group, University Medical Center Düsseldorf, Düsseldorf, Germany.

Theo Hagg, Department of Biomedical Sciences, East Tennessee State University, Johnson City, Tennessee.

Edward D. Hall, Departments of Anatomy and Neurobiology, Neurosurgery, Neurology and Physical Medicine and Rehabilitation, University of Kentucky, Lexington, Kentucky.

Benjamin J. Harrison, Kentucky Spinal Cord Injury Research Center, University of Louisville, Louisville, Kentucky.

Alan R. Harvey, School of Anatomy, Physiology and Human Biology, The University of Western Australia, Crawley, WA, Australia.

Cheng He, Department of Neurobiology, Second Military Medical University, Shanghai, China.

Zhigang He, F.M. Kirby Neurobiology Center, Boston Children's Hospital, Harvard Medical School, Boston, Massachusetts.

Tatsumi Hirata, Division of Brain Function, National Institute of Genetics, Mishima, Japan.

Ahmet Hoke, Neurology and Neuroscience Center, Johns Hopkins University, Baltimore, MD USA

Claire E. Hulsebosch, Department of Neuroscience and Cell Biology, University of Texas Medical Branch, Galveston, Texas.

Andres Hurtado, Neurology and Neuroscience Center, Johns Hopkins University, Baltimore, Maryland.

Anjana Jain, Biomedical Engineering, Worcester Polytechnic Institute, Worcester, Massachusetts.

Ken Kadoya, Department of Neurosciences, University of California San Diego, San Diego, California.

Hiroyuki Kamiguchi, Laboratory for Neuronal Growth Mechanisms, RIKEN Brain Science Institute, Wako, Japan.

Mineko Kengaku, Institute for Integrated Cell-Material Sciences, Kyoto University, Kyoto, Japan.

Jeffery D. Kocsis, Department of Neurology, Yale University, West Haven, Connecticut.

Brian K. Kwon, Department of Orthopedics, University of British Columbia, Vancouver, BC, Canada.

Jae Lee, Miami Project to Cure Paralysis, Department of Neurological Surgery, University of Miami, Miami, Florida.

Daniel J. Liebl, Miami Project to Cure Paralysis, Department of Neurological Surgery, University of Miami, Miami, Florida.

Shao-Jun Liu, Department of Neurobiology, The Academy of Military Medical Sciences, Beijing, China.

Laura A. Lowery, Department of Biology, Boston College, Chestnut Hill, Massachusetts.

Shweta Mandrekar-Colucci, Center for Brain and Spinal Cord Repair, Department of Neuroscience, Ohio State University, Columbus, Ohio.

John H. Martin, Department of Physiology, Pharmacology, and Neuroscience, The City College of the City University of New York, New York, New York.

Carol A. Mason, Department of Pathology and Cell Biology, Columbia University, New York, New York.

Dana M. McTigue, Department of Neuroscience, Ohio State University, Columbus, Ohio.

Nassir Mokarram, The Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology, Atlanta, Georgia.

Lawrence D. Moon, Wolfson Center for Age-Related Diseases, King's College London, University of London, London, United Kingdom.

Hans W. Muller, Department of Neurology, Heinrich Heine University, Düsseldorf, Germany.

Takeshi Nakamura, Research Institute for Biomedical Sciences, Tokyo University of Science, Chiba, Japan.

Takashi Namba, Department of Cell Pharmacology, Nagoya University, Nagoya, Japan.

Mariko Nishibe, Department of Molecular Neuroscience, Osaka University, Osaka, Japan.

Izumi Oinuma, Laboratory of Molecular Neurobiology, Graduate School of Biostudies, Kyoto University, Kyoto, Japan.

Martin Oudega, Departments of Physical Medicine and Rehabilitation, Neurobiology, and Bioengineering, University of Pittsburgh, Pittsburgh, Pennsylvania.

David E. Pleasure, Department of Neurology, University of California Davis, Sacramento, California.

Geoffrey Raisman, Spinal Repair Unit, Institute of Neurology, University College London, London, United Kingdom.

Matthew N. Rasband, Departments of Neuroscience, Developmental Biology, and Molecular and Cellular Biology, Baylor College of Medicine, Houston, Texas.

Paul J. Reier, Department of Neuroscience, University of Florida, Gainesville, Florida.

Miguel Santiago-Medina, Department of Neuroscience, University of Wisconsin, Madison, Wisconsin.

Jan M. Schwab, Department of Neurology with Experimental Neurology, Charité Universitätsmedizin Berlin, Berlin, Germany.

Martin E. Schwab, Brain Research Institute, University of Zurich, Zurich, Switzerland.

Yohei Shinmyo, Department of Developmental Neurobiology, Kumamoto University, Kumamoto, Japan.

Jerry Silver, Department of Neurosciences, Case Western Reserve University, Cleveland, Ohio.

George M. Smith, Department of Neuroscience, Temple University, Philadelphia, Pennsylvania.

Kwok-Fai So, GHM Institute of CNS Regeneration, Jinan University, Guangzhou, China.

Michael V. Sofroniew, Department of Neurobiology, University of California Los Angeles, Los Angeles, California.

Stephen M. Strittmatter, Program in Cellular Neuroscience, Neurodegeneration and Repair, Yale University, New Haven, Connecticut.

Mark H. Tuszynski, Center for Neural Repair, Department of Neurosciences, University of California San Diego, San Diego, California.

Jeffery L. Twiss, Department of Biological Sciences, University of South Carolina, Columbus, South Carolina.

Ubbo Visser, Department of Computer Science, University of Miami, Miami, Florida.

Trent A. Watkins, Department of Neuroscience, Genentech, Inc., South San Francisco, California.

Wutian Wu, Department of Anatomy, The University of Hong Kong, Hong Kong, China.

Sung Ok Yoon, Department of Molecular and Cellular Biochemistry, Ohio State University, Columbus, Ohio.

Michisuke Yuzaki, Department of Neurophysiology, Keio University, Tokyo, Japan.

Binhai Zheng, Department of Neurosciences, University of California San Diego, San Diego, California.

Fengquan Zhou, Departments of Orthopedic Surgery and Neuroscience, Johns Hopkins University, Baltimore, Maryland.

Yimin Zou, Neurobiology Section, Division of Biological Sciences, University of California San Diego, San Diego, California.

Appendix B

Growth cones and axon regeneration: entering the age of informatics

A 3-day workshop sponsored by the National Institute of Neurological Disorders and Stroke, the Eunice Kennedy Shriver National Institute of Child Health and Human Development, and the Japan Society for the Promotion of Sciences.

New Orleans, Louisiana, October 10–12, 2012

Co-organizers

Michihiro Igarashi, Department of Neurochemistry and Molecular Cell Biology, Niigata University, Niigata, Japan.

Vance P. Lemmon, Miami Project to Cure Paralysis, Department of Neurological Surgery, University of Miami, Miami, Florida.

Participants

Saminda W. Abeyruwan, Department of Computer Sciences, University of Miami, Miami, Florida.

Christine E. Beattie, Department of Neuroscience, Ohio State University, Columbus, Ohio.

John L. Bixby, Miami Project to Cure Paralysis, Departments of Molecular and Cellular Pharmacology and Neurological Surgery, University of Miami, Miami, Florida.

Alison Callahan, Stanford Center for Biomedical Informatics Research, Stanford University, Stanford, California.

Adam R. Ferguson, Brain and Spinal Injury Center, Department of Neurological Surgery, University of California San Francisco, San Francisco, California.

Paul Forscher, Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, Connecticut.

Yoshio Goshima, Department of Molecular Pharmacology and Neurobiology, Yokohama City University, Yokohama, Japan.

Benjamin J. Harrison, Kentucky Spinal Cord Injury Research Center, University of Louisville, Louisville, Kentucky.

Zhigang He, F.M. Kirby Neurobiology Center, Boston Children's Hospital, Harvard Medical School, Boston, Massachusetts.

Tatsumi Hirata, Division of Brain Function, National Institute of Genetics, Mishima, Japan.

Ken Kadoya, Department of Neurosciences, University of California San Diego, San Diego, California.

Hiroyuki Kamiguchi, Laboratory for Neuronal Growth Mechanisms, RIKEN Brain Science Institute, Wako, Japan.

Mineko Kengaku, Institute for Integrated Cell-Material Sciences, Kyoto University, Kyoto, Japan.

Laura A. Lowery, Department of Biology, Boston College, Chestnut Hill, Massachusetts.

Shweta Mandrekar-Colucci, Center for Brain and Spinal Cord Repair, Department of Neuroscience, Ohio State University, Columbus, Ohio.

Carol A. Mason, Department of Pathology and Cell Biology, Neuroscience, and Ophthalmic Science, Columbia University, New York, New York.

Nassir Mokarram, The Wallace H. Coulter Department of Biomedical Engineering, Georgia Institute of Technology, Atlanta, Georgia.

Takeshi Nakamura, Research Institute for Biomedical Sciences, Tokyo University of Science, Noda, Chiba, Japan.

Takashi Namba, Department of Cell Pharmacology, Nagoya University, Nagoya, Japan.

Mariko Nishibe, Department of Molecular Neuroscience, Osaka University, Osaka, Japan.

Izumi Oinuma, Laboratory of Molecular Neurobiology, Graduate School of Biostudies, Kyoto University, Kyoto, Japan.

Phillip G. Popovich, Center for Brain and Spinal Cord Repair, Ohio State University, Columbus, Ohio.

Miguel Santiago-Medina, Department of Neuroscience, University of Wisconsin, Madison, Wisconsin.

Yohei Shinmyo, Department of Developmental Neurobiology, Kumamoto University, Kumamoto, Japan.

Diane M. Snow, Spinal Cord and Brain Injury Research Center, University of Kentucky, Lexington, Kentucky.

Oswald Steward, Reeve-Irvine Research Center, University of California Irvine, Irvine, California.

Jeffery L. Twiss, Department of Biological Sciences, University of South Carolina, Columbus, South Carolina.

Ubbo Visser, Department of Computer Science, University of Miami, Miami, Florida.

Trent A. Watkins, Department of Neuroscience, Genentech, Inc., South San Francisco, California.

Xiao-Ming Xu, Stark Neurosciences Research Institute, Indiana University, Indianapolis, Indiana.

Michisuke Yuzaki, Department of Neurophysiology, Keio University, Tokyo, Japan.

Binhai Zheng, Department of Neurosciences, University of California San Diego, San Diego, California.

Contributor Information

Collaborators: the MIASCI Consortium

Acknowledgments

The MIASCI was developed with the assistance of many people. It was inspired by the MIAME microarray reporting standard, the MIBBI Project, and the BioAssay Ontology Project. The original idea for MIASCI emerged while planning an international meeting in 2012 on axon growth and regeneration supported by the National Institutes of Health (NIH) and the Japan Society for the Promotion of Sciences. As with MIAME and other reporting standards, the MIASCI is a grassroots movement and has depended on the pro-bono work of stakeholders as well as support from the NIH (HD057632, NS080145, and U01HL111561) and the Miami Project to Cure Paralysis. Many of the articles on reproducibility were introduced to us by Carol Goble's keynote talk at ISMB/ECCB 2013 Berlin.56

Author Disclosure Statement

No competing financial interests exist.

References

  • 1. Sterling T.D. (1959). Publication decisions and their possible effects on inferences drawn from tests of significance—or vice versa. J. Am. Stat. Assoc. 54, 30–34
  • 2. Cohen J. (1962). Statistical power of abnormal-social psychological research: a review. J. Abnorm. Soc. Psychol. 65, 145–153
  • 3. Cohen J. (1994). The earth is round (p<.05). Am. Psychol. 49, 997–1003
  • 4. Prinz F., Schlange T., and Asadullah K. (2011). Believe it or not: how much can we rely on published data on potential drug targets? Nat. Rev. Drug Discov. 10, 712
  • 5. Begley C.G., and Ellis L.M. (2012). Drug development: raise standards for preclinical cancer research. Nature 483, 531–533
  • 6. Steward O., Popovich P.G., Dietrich W.D., and Kleitman N. (2012). Replication and reproducibility in spinal cord injury research. Exp. Neurol. 233, 597–605
  • 7. Watzlawick R., Sena E.S., Dirnagl U., Brommer B., Kopp M.A., Macleod M.R., Howells D.W., and Schwab J.M. (2014). Effect and reporting bias of RhoA/ROCK-blockade intervention on locomotor recovery after spinal cord injury: a systematic review and meta-analysis. JAMA Neurol. 71, 91–99
  • 8. Antonic A., Sena E.S., Lees J.S., Wills T.E., Skeers P., Batchelor P.E., Macleod M.R., and Howells D.W. (2013). Stem cell transplantation in traumatic spinal cord injury: a systematic review and meta-analysis of animal studies. PLoS Biol. 11, e1001738
  • 9. Steward O., Zheng B., and Tessier-Lavigne M. (2003). False resurrections: distinguishing regenerated from spared axons in the injured central nervous system. J. Comp. Neurol. 459, 1–8
  • 10. Fouad K., Hurd C., and Magnuson D.S. (2013). Functional testing in animal models of spinal cord injury: not as straight forward as one would think. Front. Integr. Neurosci. 7, 85
  • 11. Weishaupt N., Krajacic A., and Fouad K. (2013). Lipopolysaccharide can induce errors in anatomical measures of neuronal plasticity by increasing tracing efficacy. Neurosci. Lett. 556, 181–185
  • 12. Nieuwenhuis S., Forstmann B.U., and Wagenmakers E.J. (2011). Erroneous analyses of interactions in neuroscience: a problem of significance. Nat. Neurosci. 14, 1105–1107
  • 13. Burke D.A., Whittemore S.R., and Magnuson D.S. (2013). Consequences of common data analysis inaccuracies in CNS trauma injury basic research. J. Neurotrauma 30, 797–805
  • 14. Button K.S., Ioannidis J.P., Mokrysz C., Nosek B.A., Flint J., Robinson E.S., and Munafo M.R. (2013). Power failure: why small sample size undermines the reliability of neuroscience. Nat. Rev. Neurosci. 14, 365–376
  • 15. Johnson V.E. (2013). Revised standards for statistical evidence. Proc. Natl. Acad. Sci. U. S. A. 110, 19313–19317
  • 16. Kwon B.K., Okon E.B., Tsai E., Beattie M.S., Bresnahan J.C., Magnuson D.K., Reier P.J., McTigue D.M., Popovich P.G., Blight A.R., Oudega M., Guest J.D., Weaver L.C., Fehlings M.G., and Tetzlaff W. (2011). A grading system to evaluate objectively the strength of pre-clinical data of acute neuroprotective therapies for clinical translation in spinal cord injury. J. Neurotrauma 28, 1525–1543
  • 17. Landis S.C., Amara S.G., Asadullah K., Austin C.P., Blumenstein R., Bradley E.W., Crystal R.G., Darnell R.B., Ferrante R.J., Fillit H., Finkelstein R., Fisher M., Gendelman H.E., Golub R.M., Goudreau J.L., Gross R.A., Gubitz A.K., Hesterlee S.E., Howells D.W., Huguenard J., Kelner K., Koroshetz W., Krainc D., Lazic S.E., Levine M.S., Macleod M.R., McCall J.M., Moxley R.T., III, Narasimhan K., Noble L.J., Perrin S., Porter J.D., Steward O., Unger E., Utz U., and Silberberg S.D. (2012). A call for transparent reporting to optimize the predictive value of preclinical research. Nature 490, 187–191
  • 18. Vesterinen H.M., Sena E.S., Egan K.J., Hirst T.C., Churolov L., Currie G.L., Antonic A., Howells D.W., and Macleod M.R. (2014). Meta-analysis of data from animal studies: a practical guide. J. Neurosci. Methods 221, 92–102
  • 19. Lazic S.E., and Essioux L. (2013). Improving basic and translational science by accounting for litter-to-litter variation in animal models. BMC Neurosci. 14, 37
  • 20. Henderson V.C., Kimmelman J., Fergusson D., Grimshaw J.M., and Hackam D.G. (2013). Threats to validity in the design and conduct of preclinical efficacy studies: a systematic review of guidelines for in vivo animal experiments. PLoS Med. 10, e1001489
  • 21. Alsheikh-Ali A.A., Qureshi W., Al-Mallah M.H., and Ioannidis J.P. (2011). Public availability of published research data in high-impact journals. PLoS One 6, e24357
  • 22. Saver J.L., Warach S., Janis S., Odenkirchen J., Becker K., Benavente O., Broderick J., Dromerick A.W., Duncan P., Elkind M.S., Johnston K., Kidwell C.S., Meschia J.F., and Schwamm L.; National Institute of Neurological Disorders and Stroke Stroke Common Data Element Working Group. (2012). Standardizing the structure of stroke clinical and epidemiologic research data: the National Institute of Neurological Disorders and Stroke (NINDS) Stroke Common Data Element (CDE) project. Stroke 43, 967–973
  • 23. Biering-Sorensen F., Charlifue S., Devivo M.J., Grinnon S.T., Kleitman N., Lu Y., and Odenkirchen J. (2011). Incorporation of the International Spinal Cord Injury Data Set elements into the National Institute of Neurological Disorders and Stroke Common Data Elements. Spinal Cord 49, 60–64
  • 24. Boutron I., Moher D., Altman D.G., Schulz K.F., Ravaud P., and the CONSORT Group. (2008). Extending the CONSORT statement to randomized trials of nonpharmacologic treatment: explanation and elaboration. Ann. Intern. Med. 148, 295–309
  • 25. Moher D., Hopewell S., Schulz K.F., Montori V., Gotzsche P.C., Devereaux P.J., Elbourne D., Egger M., and Altman D.G.; Consolidated Standards of Reporting Trials Group. (2010). CONSORT 2010 explanation and elaboration: updated guidelines for reporting parallel group randomised trials. J. Clin. Epidemiol. 63, e1–e37
  • 26. Hoke A. (2013). Experimental neurology and state of preclinical research. Exp. Neurol. 239, A1
  • 27. Taylor C.F., Field D., Sansone S.A., Aerts J., Apweiler R., Ashburner M., Ball C.A., Binz P.A., Bogue M., Booth T., Brazma A., Brinkman R.R., Michael Clark A., Deutsch E.W., Fiehn O., Fostel J., Ghazal P., Gibson F., Gray T., Grimes G., Hancock J.M., Hardy N.W., Hermjakob H., Julian R.K., Jr., Kane M., Kettner C., Kinsinger C., Kolker E., Kuiper M., Le Novere N., Leebens-Mack J., Lewis S.E., Lord P., Mallon A.M., Marthandan N., Masuya H., McNally R., Mehrle A., Morrison N., Orchard S., Quackenbush J., Reecy J.M., Robertson D.G., Rocca-Serra P., Rodriguez H., Rosenfelder H., Santoyo-Lopez J., Scheuermann R.H., Schober D., Smith B., Snape J., Stoeckert C.J., Jr., Tipton K., Sterk P., Untergasser A., Vandesompele J., and Wiemann S. (2008). Promoting coherent minimum reporting guidelines for biological and biomedical investigations: the MIBBI project. Nat. Biotechnol. 26, 889–896
  • 28. Brazma A., Hingamp P., Quackenbush J., Sherlock G., Spellman P., Stoeckert C., Aach J., Ansorge W., Ball C.A., Causton H.C., Gaasterland T., Glenisson P., Holstege F.C., Kim I.F., Markowitz V., Matese J.C., Parkinson H., Robinson A., Sarkans U., Schulze-Kremer S., Stewart J., Taylor R., Vilo J., and Vingron M. (2001). Minimum information about a microarray experiment (MIAME)—toward standards for microarray data. Nat. Genet. 29, 365–371
  • 29. Kilkenny C., Browne W.J., Cuthill I.C., Emerson M., and Altman D.G. (2010). Improving bioscience research reporting: the ARRIVE guidelines for reporting animal research. PLoS Biol. 8, e1000412
  • 30. Sena E., van der Worp H.B., Howells D., and Macleod M. (2007). How can we improve the pre-clinical development of drugs for stroke? Trends Neurosci. 30, 433–439
  • 31. Howells D.W., Sena E.S., and Macleod M.R. (2014). Bringing rigour to translational medicine. Nat. Rev. Neurol. 10, 37–43
  • 32. Crossley N.A., Sena E., Goehler J., Horn J., van der Worp B., Bath P.M.W., Macleod M., and Dirnagl U. (2008). Empirical evidence of bias in the design of experimental stroke studies: a metaepidemiologic approach. Stroke 39, 929–934
  • 33. Schurer S.C., Vempati U., Smith R., Southern M., and Lemmon V. (2011). Bioassay ontology annotations facilitate cross-analysis of diverse high-throughput screening data sets. J. Biomol. Screen. 16, 415–426
  • 34. van Luijk J., Cuijpers Y., van der Vaart L., de Roo T.C., Leenaars M., and Ritskes-Hoitinga M. (2013). Assessing the application of the 3Rs: a survey among animal welfare officers in The Netherlands. Lab. Anim. 47, 210–219
  • 35. Macleod M.R., Michie S., Roberts I., Dirnagl U., Chalmers I., Ioannidis J.P., Al-Shahi Salman R., Chan A.W., and Glasziou P. (2014). Biomedical research: increasing value, reducing waste. Lancet 383, 101–104
  • 36. Chalmers I., Bracken M.B., Djulbegovic B., Garattini S., Grant J., Gulmezoglu A.M., Howells D.W., Ioannidis J.P., and Oliver S. (2014). How to increase value and reduce waste when research priorities are set. Lancet 383, 156–165
  • 37. Lemmon V.P. (2013). Functional genomics and spinal cord injury. In: The 15th International Spinal Research Network Meeting, September 6–7, London, UK
  • 38. Lemmon V.P., Ferguson A.R., Popovich P.G., and Bixby J.L. (2013). A draft minimal information standard for spinal cord injury experiments. In: The 31st Annual National Neurotrauma Symposium, August 4–7, Nashville, Tennessee
  • 39. Lemmon V.P., Ferguson A.R., Popovich P.G., Snow D.M., Mason C., Igarashi M., Xu X.-M., Beattie C., and Bixby J.L. (2013). A minimal information standard for spinal cord injury experiments. In: 3rd International Neural Regeneration Symposium, October 10–15, Shenyang, China
  • 40. Lemmon V.P., Ferguson A.R., Popovich P.G., Snow D.M., Mason C., Igarashi M., Xu X.-M., Beattie C., and Bixby J.L. (2013). A minimal information standard for spinal cord injury experiments. In: British Society for Developmental Biology, Axon Guidance and Regeneration, August 28–30, Aberdeen, UK
  • 41. Basso D.M., Fisher L.C., Anderson A.J., Jakeman L.B., McTigue D.M., and Popovich P.G. (2006). Basso Mouse Scale for locomotion detects differences in recovery after spinal cord injury in five common mouse strains. J. Neurotrauma 23, 635–659
  • 42. Kigerl K.A., McGaughy V.M., and Popovich P.G. (2006). Comparative analysis of lesion development and intraspinal inflammation in four strains of mice following spinal contusion injury. J. Comp. Neurol. 494, 578–594
  • 43. Cui Q., Hodgetts S.I., Hu Y., Luo J.M., and Harvey A.R. (2007). Strain-specific differences in the effects of cyclosporin A and FK506 on the survival and regeneration of axotomized retinal ganglion cells in adult rats. Neuroscience 146, 986–999
  • 44. Bunge M.B., and Pearse D.D. (2012). Response to the report, "A re-assessment of a combinatorial treatment involving Schwann cell transplants and elevation of cyclic AMP on recovery of motor function following thoracic spinal cord injury in rats". Exp. Neurol. 233, 645–648
  • 45. Keppel G., and Wickens T.D. (2004). Design and Analysis: A Researcher's Handbook. Pearson Prentice Hall: Upper Saddle River, NJ
  • 46. Basso D.M., Beattie M.S., and Bresnahan J.C. (1995). A sensitive and reliable locomotor rating scale for open field testing in rats. J. Neurotrauma 12, 1–21
  • 47. Saper C.B. (2005). An open letter to our readers on the use of antibodies. J. Comp. Neurol. 493, 477–478
  • 48. Rhodes K.J., and Trimmer J.S. (2006). Antibodies as valuable neuroscience research tools versus reagents of mass distraction. J. Neurosci. 26, 8017–8020
  • 49. Peng R.D. (2011). Reproducible research in computational science. Science 334, 1226–1227
  • 50. Sandve G.K., Nekrutenko A., Taylor J., and Hovig E. (2013). Ten simple rules for reproducible computational research. PLoS Comput. Biol. 9, e1003285
  • 51. Nekrutenko A., and Taylor J. (2012). Next-generation sequencing data interpretation: enhancing reproducibility and accessibility. Nat. Rev. Genet. 13, 667–672
  • 52. Ferguson A.R., Irvine K.A., Gensel J.C., Nielson J.L., Lin A., Ly J., Segal M.R., Ratan R.R., Bresnahan J.C., and Beattie M.S. (2013). Derivation of multivariate syndromic outcome metrics for consistent testing across multiple models of cervical spinal cord injury in rats. PLoS One 8, e59712
  • 53. Nielson J.L., Guandique C.F., Liu A.W., Burke D.A., Lash A.T., Moseanko R., Hawbecker S., Strand S.C., Zdunowski S., Irvine K.A., Brock J.H., Rosenzweig E.S., Nout Y.S., Gensel J.C., Anderson K.D., Segal M.R., Magnuson D.S.K., Whittemore S.R., McTigue D.M., Popovich P.G., Rabchevsky A.G., Scheff S.W., Steward O., Courtine G., Edgerton V.R., Tuszynski M.H., Beattie M.S., Bresnahan J.C., and Ferguson A.R. (in press). Development of a database for translational spinal cord injury research. J. Neurotrauma
  • 54. Ries A., Goldberg J.L., and Grimpe B. (2007). A novel biological function for CD44 in axon growth of retinal ganglion cells identified by a bioinformatics approach. J. Neurochem. 103, 1491–1505
  • 55. Brazda N., Kruse F., Kruse M., Kirchhoffer T., Klinger R., Cimiano P., and Muller H.W. (2013). The CNR preclinical database for knowledge management in spinal cord injury research. In: Society for Neuroscience, November 9–13, 2013, San Diego, California
  • 56. Goble C. (2013). Results may vary: reproducibility, open science and all that jazz. In: 21st Annual International Conference on Intelligent Systems for Molecular Biology, July 19–23, Berlin, Germany

Associated Data


Supplementary Materials

Supplemental data
Supp_Data.pdf (675.6KB, pdf)

Articles from Journal of Neurotrauma are provided here courtesy of Mary Ann Liebert, Inc.
