PLoS Biol. 2020 Jul 16;18(7):e3000737. doi: 10.1371/journal.pbio.3000737

The Hong Kong Principles for assessing researchers: Fostering research integrity

David Moher 1,2,*, Lex Bouter 3,4, Sabine Kleinert 5, Paul Glasziou 6, Mai Har Sham 7, Virginia Barbour 8, Anne-Marie Coriat 9, Nicole Foeger 10, Ulrich Dirnagl 11
PMCID: PMC7365391  PMID: 32673304

Abstract

For knowledge to benefit research and society, it must be trustworthy. Trustworthy research is robust, rigorous, and transparent at all stages of design, execution, and reporting. Assessment of researchers still rarely includes considerations related to trustworthiness, rigor, and transparency. We have developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity with a specific focus on the need to drive research improvement through ensuring that researchers are explicitly recognized and rewarded for behaviors that strengthen research integrity. We present five principles: responsible research practices; transparent reporting; open science (open research); valuing a diversity of types of research; and recognizing all contributions to research and scholarly activity. For each principle, we provide a rationale for its inclusion and examples of where it is already being adopted.


Assessment of researchers still rarely includes considerations related to trustworthiness, rigor, and transparency. This Essay presents the Hong Kong Principles (HKPs), developed as part of the 6th World Conference on Research Integrity, with a specific focus on the need to drive research improvement by ensuring that researchers are explicitly recognized and rewarded for behavior that leads to trustworthy research.

Introduction

In a quest to advance knowledge, researchers publish approximately 1.5 million journal articles each year. The assumption is that this literature can be used by other researchers, stakeholders, and the wider society because it is trustworthy, robust, rigorous, and complete [1].

The approach taken to validating research and its outcomes differs depending on the nature of the research. For example, to rigorously examine the effects of a health intervention, trial participants (human or animal) are typically required to be randomized between the intervention being studied and an appropriate comparator. Many researchers advocate registration of protocols as a way to ensure transparency, to reduce bias, to discriminate between exploratory and confirmatory modes of research, and to provide insight into ongoing research projects. Subsequently, the use of reporting guidelines can help ensure complete and transparent reporting of the researchers’ methods and results. When the research is disseminated, the research team would ensure that the associated data, materials, and any analytical code are made available as an integral part of publication. Such data sharing facilitates reanalysis of the data to check reproducibility and to perform secondary analyses.

Although some mechanisms exist to support researchers in ensuring transparency at all stages of design, execution, and reporting, these practices have not been widely adopted across all disciplines. There are many interwoven reasons for this. One contributing factor, we argue, is that little emphasis is placed on the rigor of research when hiring, reviewing, and promoting researchers. It seems to us that more emphasis is placed on novelty and perceived “impact” than on rigor [2]. Working together across the research sector as a whole to address this systemic issue, we believe, offers a global opportunity to improve research and its impact.

We developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity (WCRI) specifically to drive greater recognition for researchers who commit to robust, rigorous, and transparent practices, i.e., to ensure that their careers are advanced (see Fig 1). If implemented, the HKPs could play a critical role in evidence-based assessments of researchers, put research rigor at the heart of assessment, and open up research to the wider benefit of society.

Fig 1. Indicators of responsible research practices.

We propose five principles, each with a rationale for its inclusion. The principles target both exploratory and confirmatory types of research and analysis. They are likewise applicable to quantitative and qualitative research, although the focus is more on assessing researchers who engage in empirical research. The principles were formulated to reward behaviors that strengthen research integrity, with an emphasis on responsible research practices and the avoidance of detrimental research practices [3]. We illustrate these principles with examples where we know they exist. These examples are not exhaustive, and many are relevant to more than one principle. Together, they illustrate the breadth of approaches through which these principles can operate at the very highest levels of international research.

Early drafts of the HKPs were circulated to the 700 participants registered for the 6th WCRI. Further discussions took place during two sessions at the 6th WCRI, and a penultimate version was uploaded on the 6th WCRI website after the conference. More than 100 people provided input and feedback. We acknowledge all of these valuable contributions, as well as the global leadership of those working on the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto, and other initiatives to promote the responsible use of metrics, which have laid the foundations for much of our work [2,4,5,6,7].

The HKPs are formulated from the perspective of the research integrity community. We, like the DORA signatories, strongly believe that current metrics may act as perverse incentives in the assessment of researchers. However, the principles outlined in this Essay focus specifically on the undermining effect on research integrity [8]. We have used abbreviated versions of the wording of the HKPs below to facilitate dissemination; the complete wording of each principle is provided in Box 1.

Box 1. Complete wording of the HKPs

  • Principle 1: Assess researchers on responsible practices from conception to delivery, including the development of the research idea, research design, methodology, execution, and effective dissemination

  • Principle 2: Value the accurate and transparent reporting of all research, regardless of the results

  • Principle 3: Value the practices of open science (open research)—such as open methods, materials, and data

  • Principle 4: Value a broad range of research and scholarship, such as replication, innovation, translation, synthesis, and meta-research

  • Principle 5: Value a range of other contributions to responsible research and scholarly activity, such as peer review for grants and publications, mentoring, outreach, and knowledge exchange

Principles

Principle 1: Assess responsible research practices

Rationale

The numbers of publications, citations, and total volume of grants are often still the dominant metrics used by research institutions for assessing and rewarding their researchers [2,4,5,6]. Providing bonuses to academics for publishing in certain journals (i.e., merit pay) is also common in many parts of the world [9–11]. These assessment criteria tell assessors little about the researchers and the rigor of their work; thus, they are not particularly “responsible” metrics, although research cited thousands of times probably indicates some measure of impact. These metrics can also be unduly influenced by field and citation practices, and they provide little information about a publication’s (and therefore a researcher’s) contributions to research and society. Other criteria are required to provide a broader view of markers of best practices: for example, the extent to which a researcher develops research questions with the involvement of appropriate members of the public (see Fig 1). Researchers who engage in responsible research practices such as data sharing, which can take more time and resources, may disadvantage themselves compared with colleagues who do not. Career assessments need to acknowledge this issue.

Current implementation

The Canadian Institutes of Health Research’s Strategy for Patient-Oriented Research (SPOR) is a multimillion-dollar initiative to bring patients into a broad range of research activities across Canadian provinces and territories [12]. Patients are now active in the development of research projects, setting priorities and formulating study questions. The Ontario response (Ontario SUPPORT Unit) has included a series of articles with patients taking a leadership role in coauthoring the content [13]. In the United Kingdom, the James Lind Alliance, funded by the UK National Institute for Health Research (NIHR), is a successful example of including patients, carers, and clinicians in developing priority-setting partnerships [14] and formulating questions [15]. Other examples of citizen science across research disciplines also exist [16].

With a focus on enhancing reproducibility, the United States National Institutes of Health (NIH) has revised its application instructions and review criteria to strengthen scientific rigor and transparency [17]. One of the resources the NIH recommends is the Experimental Design Assistant (EDA), developed by the National Centre for the Replacement, Refinement & Reduction of Animals in Research (NC3Rs). This 10-module online tool was developed to assist researchers in the design and analysis of animal experiments. It includes dedicated support for randomization, blinding, and sample size calculation, and it can also be used to help researchers prepare the experimental design information and analysis plan requested in grant applications [18]. The EDA is one of many tools available to help ensure the rigor of proposals and of research more generally.

Other examples of alternative criteria include social media metrics as indicators of research dissemination [19], public lectures about the results of a research project, public engagement, and other types of events that bring together funders, researchers, and other stakeholders to work on an effective communication plan for a research program [20]. Organizations such as the Wellcome Trust are taking a holistic attitude to redefining their approach to engagement, explicitly to help people feel empowered to access, use, respond to, and create health research [21].

Principle 2: Value complete reporting

Rationale

Failure to publish all findings of all studies seriously distorts the evidence base for decision-making. For example, a systematic review of trials of reboxetine for treating depression found that almost three-quarters of included patients were in unpublished trials [22]; other examples across different disciplines also exist [23,24]. Selective publishing of research with positive results (i.e., publication bias) distorts science’s evidence base and has been demonstrated in a variety of disciplines, including economics, psychology, and clinical and preclinical health research (e.g., [25]). Furthermore, the frequency of other reporting biases (e.g., primary outcomes switched without disclosure, and spin) is around 30% [26]. This is unacceptably high and diminishes the trustworthiness and integrity of research [11]. It also appears that promotion and tenure committees (PTCs) generally do not give sufficient weight to registering protocols and data analysis plans, publishing completed studies in full, or making data, code, and materials available [27]. These activities deserve credit in the assessment of researchers because they are essential for replicability, for verifying what was done, and for enabling the reuse of data.

Current implementation

Study registration and reporting guidelines are useful tools to help improve the completeness and transparency of a very broad spectrum of research [28–31]. As part of the editorial policies of the Wellcome Trust’s open-access publishing platform (Wellcome Open Research [WOR]), authors are required to use reporting guidelines when submitting study protocols (e.g., SPIRIT) and completed studies (e.g., ARRIVE) [32]. Other funders, such as Gates Open Research [33], the NC3Rs Gateway [34], and the Association of Medical Research Charities [35], do likewise. To help reduce publication bias, WOR and other journals [36,37] use registered reports [38] (see “Participating journals” tab). Similarly, to promote the registration and publication of all research, the NIHR in the UK indicates that “when submitting an application to NIHR programmes for funding for a new clinical trial, the applicant must disclose past publication and trial Registration history for any relevant publications and research grants held, referenced in the application” [39]. While these are examples of best practice from funders, few research institutions have incorporated them into researcher assessments [27,40,41].

Several research institutions (e.g., the University of Toronto) now recommend that their clinical trialists use SEPTRE [42], a web-based protocol creation and management tool. When SEPTRE is used, protocol information for trials is automatically registered in ClinicalTrials.gov. This saves time and helps researchers, and their research institutions, maintain best publication practices (e.g., trial registration). Some journals in the social sciences, particularly psychology, use registered reports to help ensure that research is published regardless of its results [43,44].

Principle 3: Reward the practice of open science (open research)

Rationale

Openness (e.g., open access, open methods, open data, open code) in research is more than just access to research: it brings equality to the research process. It encompasses a range of practices across the entire life cycle of research [45]. Access to research should not depend on who has the resources to pay to see behind a paywall, typically that of a subscription journal. Healthcare and social policy decisions should be based on access to all research knowledge, rather than only a part of it [46]. A considerable amount of public funds is used for research, and its results can have profound social impact. Preclinical scientists are committing to openly sharing their laboratory notebooks [47] to streamline research, foster collaborations, and reduce unnecessary duplication. In an effort to deter questionable authorship practices, the Consortia Advancing Standards in Research Administration Information supports the CRediT taxonomy [48] as a way for authors to describe more openly how each person contributed to a research project.

Data sharing is another example of openness but is not yet common practice in clinical research (with some exceptions, such as genetics) [49], although patients seem supportive of sharing their data, at least for the randomized trials in which they have participated [50]. Data sharing is also not considered standard in many other disciplines. Without data sharing, it is difficult to check the selectivity of reports; data sharing is key to addressing concerns about reproducibility [51] and to building trust [1]. Estimates vary as to what proportion of research is made available through open-access routes, such as open-access journals and repositories or as preprints, but it is far from 100% [52]. It seems clear that the various modalities of open science need to be rewarded in the assessment of researchers, because these behaviors strongly increase transparency, a core principle of research integrity [45,53].

Current implementation

Ghent University, Belgium, has adopted data sharing guidance stating, “Sound data management is a basic requirement for this [academic analysis] and provides additional guarantees for a flawless methodology, for sharing and reusing data by other researchers in an Open Science context, and for the accountability of a researcher’s own academic integrity” [54]. The Nanyang Technological University (NTU), Singapore, implemented an Open Access Policy in 2011: all NTU faculty and staff must deposit the final peer-reviewed manuscripts of their journal articles and conference papers in the Digital Repository (DR-NTU), maintained by the library, upon acceptance of their publications. At NTU’s faculty of medicine, random data audits are conducted on the submitted (required) data management plans (DMPs), and checks are made to see whether the final data are indeed shared on NTU’s open-access data repository, DR-NTU. A coalition of funders plans to enforce open-access publishing in the near future [55].

To help facilitate data sharing, the University of Cambridge has introduced the concept of “data champions” [56]. Here, volunteers advise members of the research community on the proper handling of research data, supporting the use of the Findable, Accessible, Interoperable, and Re-usable (FAIR) research principles [57]. Delft University of Technology, the Netherlands, has taken this concept a step further and implemented it as a career assessment criterion [58]. The University of Glasgow’s academic promotion criteria explicitly allow data sharing as a research and scholarship output (to support replication) [59].

Some journals have also established strong data sharing policies. For example, the PLOS journals “require authors to make all data underlying the findings described in their manuscript fully available without restriction at the time of publication. When specific legal or ethical requirements prohibit public sharing of a dataset, authors must indicate how researchers may obtain access to the data. Refusal to share data and related metadata and methods in accordance with this policy will be grounds for rejection” [60]. The Center for Open Science’s Transparency and Openness Promotion initiative provides information on data transparency standards for journals across a wide variety of disciplines [61]. Given that societal benefit is part of emerging career assessment criteria, clinical researchers should also respond to the growing view that patients want their data shared [50].

Open research is supported by compliance with key infrastructure, such as requiring every researcher to have an Open Researcher and Contributor ID (ORCID), which identifies each researcher uniquely. A recent letter from global funders committing to implementing ORCIDs for all researchers is a significant step forward [62]. This was recently implemented at the Ottawa Hospital Research Institute, and in Australia and New Zealand a consortium supports ORCID nationally.

The NIH promotes the use of preprints in grant applications [63], as do all major UK public funders (e.g., the Medical Research Council) [64]. The Wellcome Trust has made them compulsory for work in health emergencies and promotes their use widely, in particular for early-career researchers [65].

Principle 4: Acknowledge a broad range of research activities

Rationale

A system that rewards benefit to society and encourages trustworthy and important research needs to take the different types of research into account: creating new ideas; testing them; replicating key findings; synthesizing existing research; developing and validating new tools, measures, or methods; and so on. Different indicators and criteria need to be developed that are relevant to these different types and stages of research (Fig 1). This includes different timeframes of assessment for different types of research.

Incentives that encourage one fixed idea of the “right kind” of research will slow down, or even stall, progress. So-called blue-sky research that builds on chance findings, and curiosity-driven research based on “out-of-the-box” thinking, should also be possible and encouraged in an academic reward system that values societal progress [66]. For example, the discovery of graphene at the University of Manchester, UK, was the result of Friday afternoon discussions outside the “normal” research activities [67]. Funders are also encouraging multidisciplinary, high-risk applications [68]. The short-term nature of academic reward cycles makes this kind of research less attractive for funders, institutions, and individual researchers. Equally, replication studies and research synthesis efforts are often not regarded as innovative enough in researcher assessments, despite their critical importance for the credibility of research and for a balanced and robust presentation of all available evidence, respectively [51,69]. This is not universally appreciated by PTCs.

Research on research (also known as meta-research) is practiced at, for example, METRICS (Stanford, CA, USA) [70], QUEST (Berlin, Germany) [71], whose focus is on clinical and preclinical meta-research, and the Meta-Research Center at Tilburg University (Tilburg, the Netherlands) [72], whose focus is on the social sciences. It is also practiced through the Open Science Collaboration and the ongoing Psychological Science Accelerator, which consists of contributors from hundreds of universities and independent nonprofit organizations working to evaluate the barriers to replicability in psychology, in preclinical cancer biology, and across the social sciences [73]. Such activities are important to inform and improve research practices and therefore contribute to making research more reliable and relevant. The issue is that we know very little about the drivers of detrimental and responsible research practices, and research on research is still underfunded. It is therefore important to explicitly reward this type of scholarship when assessing researchers.

Current implementation

Some funders have already recognized the relevance of a broad range of research activities. The Research Impact Assessment Platform (Researchfish) works to capture some of this diversity and can generate reports on the impact of a broad spectrum of funded research [74]. The Wellcome Success Framework highlights the importance of a long-term vision and shared objectives in order to take a more balanced approach to assessment [75]. The German Federal Ministry of Education and Research is funding preclinical confirmatory trials [76].

The Wellcome Trust has developed a new Longitudinal Population Studies Strategy, funded data reuse prizes [77], and supports research on research [78]. All of these approaches are aimed at valuing a broad range of scholarship and maximizing the value of research. The Netherlands Organization for Scientific Research has issued its third call for replication studies [79]. Research on research and meta-research are also gaining momentum and now have some formal outlets: for example, PLOS Biology and eLife have meta-research sections [80,81]. We were unable to find any academic institution that has incorporated replication or meta-research into its career assessment portfolio [27]. The NIHR requires the completion of a systematic review prior to funding any new research [82], and the NC3Rs have also promoted the importance of systematic reviews for providing a rationale for project proposals [83,84]; in the event that such a review does not exist, they provide funding to perform one.

Principle 5: Recognize other essential tasks, such as peer review and mentoring

Rationale

As discussed alongside Principle 1, research assessments frequently focus on a narrow range of easy-to-measure metrics, including publications, citations, and funding income [2,27]. For the research ecosystem to function optimally, other research activities are also essential. Peer review remains the cornerstone of quality assessment of grants, publications, and conferences. The quality of peer-review contributions to journals and funders should therefore be part of assessments for promotion and tenure, as should contributions to research infrastructure, oversight, and regulation. Equally, contributions to improvements that go beyond an individual-centered approach to assessment should be considered. These activities are currently largely missing from PTCs [27]. Contributions to developing the careers of others at all stages are critical, as is service on various committees related to research (e.g., assuming the role of an editor). How best to recognize these contributions without creating further barriers and bureaucracy, however, has long been debated [85].

Any reward system that has the whole research enterprise at heart and aims to foster a climate conducive to trustworthy and useful research with the highest regard to research integrity needs to find ways to incorporate these vital roles into its overall assessment structure. This is especially important because being a good role model as well as adequately supervising and mentoring early-career researchers are identified as top priorities in fostering research integrity [86].

Current implementation

Macquarie University, Sydney, Australia, has some exciting initiatives in its new academic promotion policy, which includes five pillars, one of which is leadership and citizenship. Here, researchers can show their alignment with the university’s values and their broader contribution to the university and its community [87]. Since this policy was introduced, it has been reported that the number of promotion applications increased by 50% and that the number of women promoted has also increased [88].

The University of Glasgow’s academic promotion criteria explicitly reward researchers for participation in peer review and other related activities (e.g., journal editorship) [59,89]. For this to occur, organizations are needed that can provide reviewers with a permanent identifier (a digital object identifier [DOI]) for journals that publish open reviews [90], which can be included in a researcher’s CV, or that can aggregate completed peer reviews [91]. Such policies might also help promote more meaningful involvement in peer-review training [91]. The University of Exeter, UK, has developed “Exeter Academic,” a hub to help its researchers navigate career progression [92]. Leadership and citizenship are two of its five major areas of focus; the former includes mentoring, and the latter includes avenues to disseminate research knowledge from the university’s researchers.

The Finnish Advisory Board on Research Integrity (TENK) template for researcher CVs includes a broad spectrum of contributions, including mentoring and “trust in society” [93]. As a measure of mentorship, Maastricht University, the Netherlands, assesses the career progression of its PhD graduates [94]. We were unable to identify research institutions that reward researchers who have participated in training courses on high-quality mentorship [27].

The Irish Health Research Board (HRB) has a knowledge exchange and dissemination grant program that provides existing HRB-funded researchers with an opportunity to seek supplementary funding for exchange and dissemination activities that can accelerate and maximize the potential translation and impact of their research findings, and of the learning gained, on policy, practice, and health outcomes [95]. Similar schemes exist at the Canadian Institutes of Health Research [96] and at the NC3Rs, through its Skills and Knowledge Transfer grants [97] and Crack IT open innovation platform [98].

Wellcome’s grant forms limit the number of publications applicants can submit and explicitly invite applicants to detail other achievements. This is combined with explicit guidance for panel members reminding them of the importance of taking a broad view when assessing individuals [99].

Discussion

The HKPs focus on promoting assessment practices that strengthen research integrity, concentrating primarily on what research institutions can do to modify the criteria used by PTCs for career assessments. The emphasis on responsible research practices and the avoidance of detrimental research practices is important because these behaviors are time and resource intensive and may result in fewer grants and publications. The HKPs send a clear message that behaviors that foster research integrity need to be acknowledged and rewarded. The five principles we have formulated are aimed at how research institutions should incentivize, reward, and assess individual researchers for behavior that fosters research integrity within their respective organizations. The HKPs do not address gender and other forms of diversity, inclusiveness, and related issues: these themes require the assessment of a group of researchers (e.g., a research institution) when making decisions about funding allocations or human resources policies. Furthermore, these issues concern the social justice and societal relevance of research rather than research integrity.

Dissemination

The WCRI Foundation [100] and the REduce research Waste And Reward Diligence (REWARD) Alliance [101] will make the HKPs available on their websites. This “home” will include the principles, the signatories, infographics, translations into several languages (ongoing), future implementation plans (ongoing), and, crucially, a place to highlight those who have endorsed the HKPs. Beyond journal publication, we are developing other synergistic dissemination routes.

Endorsement and uptake

Research institutions are key to the HKPs. They are the home of current and future researchers and the place where promotion and tenure assessments are carried out. To help the HKPs take hold “on the ground,” any implementation plan should include local key opinion leaders and their endorsement. The HKPs have been recognized by the Governing Board of the WCRI Foundation and the Steering Committee of the REWARD Alliance. We invite academic institutions, funders, other groups, and individuals to do likewise on the WCRI Foundation’s website.

We are inviting individuals and organizations to deliver brief (2–3 minutes) YouTube testimonials as to how they have implemented the HKPs (categorized by stakeholder group) and to discuss how they integrate HKPs into their, and other, initiatives. We will provide a link to these videos on the WCRI Foundation website. This approach can serve as a pragmatic way for individuals and organizations to show how they are endorsing and using the HKPs and as a nudge to others to do likewise.

Implementing some of these principles is likely to be straightforward, although this might not be the case for all of them. Doing so requires a better understanding of the complexities of today’s research environment, such as the availability of institutional infrastructure, whether current CV formats are optimal for capturing best practices, how to enable transparency about career assessment, and how to align more closely with funders’ policies.

We would like to evaluate our approach and develop tool kits for those interested in ways to implement the five principles. We will work with signatories to take this forward. We see the HKPs as an important step along the way to improving research integrity, and we encourage an ongoing dialog to support implementation of these important principles.

Acknowledgments

We thank the many participants in the 6th World Conference on Research Integrity who provided feedback on earlier versions of the document and actively participated in the focus group sessions during the conference. We thank Raymond Daniel for help with building the reference list database.

Abbreviations

DMP: data management plan

DOI: digital object identifier

DORA: San Francisco Declaration on Research Assessment

DR-NTU: NTU Digital Repository

EDA: Experimental Design Assistant

FAIR: Findable, Accessible, Interoperable, and Re-usable

HKPs: Hong Kong Principles

HRB: Health Research Board

NC3Rs: National Centre for the Replacement, Refinement & Reduction of Animals in Research

NIH: National Institutes of Health

NIHR: National Institute for Health Research

NTU: Nanyang Technological University

ORCID: Open Researcher and Contributor ID

PTC: promotion and tenure committee

REWARD: REduce research Waste And Reward Diligence

SPOR: Strategy for Patient-Oriented Research

WCRI: World Conference on Research Integrity

WOR: Wellcome Open Research

Funding Statement

PG is funded by an Australian National Health and Medical Research Council (NHMRC) Fellowship (APP1155009). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Footnotes

Provenance: Not commissioned; externally peer reviewed.

References

