Abstract
Many scientists are struggling with a deluge of requests to review grant applications that they increasingly have to refuse. A simple and well‐structured application format would make it easier to recruit reviewers and improve the quality and reliability of peer review.

Over the past decades, grants have become the standard mode of financing research in most European countries, which has generated a growing number of review requests to individual scientists. This has been exacerbated by the increasing submission of all kinds of grant proposals to international peer review, including small grants that had previously been dealt with by grant panels. Moreover, the low success rate of most grant schemes has pushed researchers to multiply the number of proposals they submit each year. As a result, many researchers, in particular internationally recognized specialists in their fields, receive ever more solicitations to serve as external reviewers, as panel members, or as chairpersons of grant review panels.
Many scientists react to this deluge of requests by refusing, simply because they no longer have the time to deal with all of them. Others do their best to follow the “call of duty” and end up overworked, which compromises their ability to carefully assess the proposed work and its potential to generate new knowledge. Highly solicited scientists may also pass the task on to younger researchers in their team, who, owing to insufficient experience, might be overcritical and thereby undermine the applicant's chances of getting funded. All these factors weaken the quality and reliability of scientific review, which particularly affects emerging research areas that require thorough assessment to appreciate the originality and feasibility of a proposal. The result is growing distrust in peer review, which in turn diminishes the appreciation of reviewers' work and further increases the reluctance of scientists to participate in grant review.
To maintain the quality of grant review, solutions are needed that allow highly solicited specialists to focus on reviewing grants and deliver competent evaluations. One way to encourage and help potential reviewers would be a simple and intuitive application format that makes reviewing a predictable and time‐efficient task. This is currently not the case: many grant proposals come with convoluted and user‐unfriendly formats, evaluation forms, and rules that make review a difficult, stressful, and time‐consuming ordeal.
Reviewing a grant nowadays means reading a lengthy document—proposals of more than 50 pages are not unusual—in which the information is often arranged in a non‐intuitive order. Many reviewers end up spending more time deciphering the format and the evaluation forms than assessing the proposal. On top of this, many national agencies require bilingual forms, which makes grant proposals even longer and information‐mining more difficult. In addition, scientific reviewers are increasingly asked to judge non‐scientific criteria, such as budget requirements—which can vary considerably among different countries—or the potential impact on a country's economy; these are the purview of grant panels or administrative officers with the appropriate expertise. Finally, many grant agencies use online systems that are non‐intuitive and require extra time and attention to fill out correctly. Some of these websites work only online, which makes it impossible to work on a grant proposal while traveling. A streamlined process that lets reviewers concentrate on judging the scientific quality of the proposal and the applicant would make the task faster and easier, and would thereby increase both the motivation to review and the quality of peer review. Conversely, funding agencies would find it much easier to recruit international experts by making their job as efficient and easy as possible.
Scientific projects are generally similar in format; what really distinguishes them is the call they respond to. Apart from this detail, all grant proposals are evaluated based on universal criteria: the quality of the project, the achievements of the applicant or consortium, feasibility, and the context of research in the domain. It would therefore be a great simplification to adopt a unified format that organizes the grant proposal and the evaluation report into specific sections of specified length.
A unified grant format would have to clearly distinguish between scientific arguments, administrative arguments—how the project fits the call, other details such as economic impact, and so on—and budget. A clearly structured format would be of great advantage for all three actors in the review process: the administration, the evaluators, and the applicants. The administration could request the precise information it needs by adapting a standard application form, which would simply mean less work for them. The evaluators would no longer need to decipher individual evaluation formats from different agencies and could fully focus on reviewing. The applicants would be able to use the same project for complementary purposes—for instance, a grant proposal for financing the laboratory and a PhD grant proposal based on the same scientific project. They would no longer need to rewrite the project proposal from scratch, but merely adapt the administrative information. There is little danger of abuse, as the administrative part still requires an individual justification of the proposal according to each call.
The greatest potential for simplification lies in the administrative sections of grant proposals. Despite a plethora of grant schemes, they all essentially ask for the same information, albeit in different formats, order, detail, and wording. These forms require a lot of attention from the applicants, but they equally challenge reviewers to extract the relevant information. It would therefore be a huge simplification if the “personal information” sections could be organized as a unified, single‐format “researcher's passport”, similar to the American Biosketch. To adapt the multitude of personal information in such a “passport” to the requirements of specific grant schemes, it would be best organized as an editable database from which applicants can choose which details to export, similar to generating a reference list with bibliographic software. The database should contain all relevant information: degrees, achievements—invited conferences, funding, consortium coordination, teaching, supervision, and so on—along with publication and funding records. It would allow grant agencies to request a precise subset of data, which the researcher could provide with a couple of mouse clicks. Reviewers would equally profit from a logical structure that corresponds precisely to the criteria they need to evaluate—for instance, all last‐author publications of the past 5 years—and would only need to spend a short time on this part of the review. Technically, ResearcherID and ORCID are already developing such approaches; however, a portable offline version of such databases would probably be required to implement this system globally for grant writing and evaluation.
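For illustration only, here is a minimal sketch of how such an exportable “researcher's passport” could be organized; the class and field names and the export filter are invented assumptions, not an existing standard or any agency's actual data model.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class Publication:
    title: str
    year: int
    authors: List[str]              # ordered author list, last author last
    journal: str = ""

@dataclass
class ResearcherPassport:
    """Hypothetical unified record of a researcher's career data."""
    name: str
    identifier: str                                         # e.g. an ORCID iD
    degrees: List[str] = field(default_factory=list)
    achievements: List[str] = field(default_factory=list)   # invited talks, coordination, teaching, ...
    publications: List[Publication] = field(default_factory=list)

    def last_author_publications(self, years_back: int = 5) -> List[Publication]:
        """Export only the subset a call asks for: last-author papers of the past `years_back` years."""
        cutoff = date.today().year - years_back
        return [p for p in self.publications
                if p.year >= cutoff and p.authors and p.authors[-1] == self.name]

# A funding agency requests last-author papers of the past 5 years;
# the applicant exports them with a single call instead of reformatting a CV.
passport = ResearcherPassport(
    name="A. Researcher",
    identifier="0000-0000-0000-0000",
    publications=[
        Publication("Paper A", 2024, ["B. Student", "A. Researcher"], "J. Example"),
        Publication("Paper B", 2015, ["A. Researcher", "C. Colleague"], "J. Example"),
    ],
)
print(passport.last_author_publications(years_back=5))  # prints only the recent last-author paper
```

The point of such a structure is that a new call would only require a new export filter, not a new CV format, which is exactly the reference-list analogy made above.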
A unified application form should also address a couple of practical issues to optimize its use. A mandatory online system might not be ideal, as firewalls or unstable internet connections can interfere with grant writing. Instead, a proposal form should be downloadable for offline use so that scientists and reviewers can write and comment on grants without needing internet access. It should also help the reviewer to focus on the relevant scientific details rather than administrative issues: for instance, whether the number and type of requested positions are appropriate, but not the overall budget requirements or economic impact. A further improvement would be to integrate the evaluation forms directly into the proposal form—in this way, the referee would find the precise criteria to be evaluated along with every part of the proposal and would not need to juggle many different forms at the same time. This is particularly important when evaluation is done out of the office. Finally, the system should be able to evolve, which would be easy in a database format, to accommodate new evaluation criteria if necessary.
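As a sketch only, and not any agency's actual format, embedding the evaluation criteria directly into each section of a downloadable form could look like the following; all section names, criteria, and file names are hypothetical.

```python
import json
from dataclasses import dataclass, field, asdict
from typing import List

@dataclass
class Section:
    """One part of a unified proposal form; the criteria are shown to the reviewer next to the text."""
    title: str
    max_pages: int
    text: str = ""                                      # filled in by the applicant
    criteria: List[str] = field(default_factory=list)   # evaluation questions for this section
    review_comment: str = ""                            # filled in by the reviewer, offline if needed

# Invented section names and criteria, purely for illustration.
proposal = [
    Section("Scientific project", max_pages=10,
            criteria=["Originality", "Feasibility", "Appropriateness of methods"]),
    Section("Requested positions", max_pages=1,
            criteria=["Are the number and type of positions appropriate for the work plan?"]),
]

# The whole form can be saved as a plain file, edited without internet access,
# and uploaded once the application or review is complete.
with open("proposal_draft.json", "w") as fh:
    json.dump([asdict(s) for s in proposal], fh, indent=2)
```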
Some of these aspects have already been implemented by various European funding agencies. For instance, many agencies use online forms that clearly define the length and content of each part of the grant application, but these are still part of a bewildering diversity of forms, formats, and requirements from different submission systems. Moreover, these systems often do not adapt to changing needs, as smaller funding agencies might not have the resources to further develop their online submission systems; for the same reason, they might be all the more willing to adopt a unified application format.
Overall, a unified grant submission and evaluation format adopted by all or most European funding agencies would greatly benefit scientists and science administrators. The simple fact that it would help funding agencies to recruit the best experts to review grant proposals should make it attractive enough to be implemented. Moreover, a unified grant scheme would make evaluation more transparent, thereby increasing public trust in science and how it is funded. In summary, simplifying the submission and review of grant proposals would considerably reduce the time scientists have to spend on these tasks, improve the quality of review and evaluation, and thus enhance the quality and efficiency of the whole research enterprise.
