Abstract
BACKGROUND
Federal research funding is decreasing, forcing specialty organizations to take an increasingly important role in developing and fostering research.1,2 As the research and innovation arm of the American Society of Plastic Surgeons, The Plastic Surgery Foundation (PSF) has a key role in supporting promising plastic surgery research. Understanding the grant review process, as well as the factors that distinguish well-written, fundable grant applications, is paramount for aspiring academic surgeons.
METHODS
All research grant applications submitted to The PSF in 2012 and 2013 were evaluated. Each reviewer comment was independently assessed by two study team members and classified into key weakness categories. Chi-square test compared results between funded and unfunded grants. Linear regression identified which critique elements corresponded to changes in scores, and logistic regression identified elements that predicted funding.
RESULTS
We analyzed 1,764 comments from 240 applications, 55 of which received funding. Funded grants had significantly fewer reviewer comments in 4 of 5 weakness categories. As expected, funded grants received better (lower) scores. Concerns in the “plan for execution” and “other grant elements/grantsmanship” categories significantly affected score as well as odds of funding.
CONCLUSION
Ensuring that a grant application addresses all required elements is important for receiving a favorable (low) reviewer score. Our study demonstrates that “plan for execution” and “grantsmanship” influence reviewer scoring more than other categories. Investigators must clearly address items associated with conducting their experiments and performing the analysis. Investigators must also give equal attention to elements of overall quality and completeness to optimize their chances of funding.
Keywords: research grants, grant funding, grant writing, pitfalls, career development
Introduction
Medical research is critical to the ongoing development of new clinical knowledge. Well-designed basic, translational, and clinical research provides the data and evidence needed for physicians to accept and adopt new treatments and to develop new cures. Funding medical research is necessary to ensure ongoing advancement. In 2012, over $130 billion was spent on medical and health research in the United States (US).3 The combined pharmaceutical, biotechnology, and medical technology industries underwrote the majority of this funding, contributing nearly $70 billion to research. The Federal government contributed approximately $41 billion. The remaining $20 billion came from universities, state and local governments, independent research institutes, philanthropic foundations, and voluntary health associations.
Despite what appears to be a substantial allocation of funds for medical research, the current level of research expenditure represents only 4.5% of total US health spending.3 Furthermore, competition among the research community for these funds has become increasingly difficult. In 2013, nearly 24,000 new R01-equivalent grant applications were submitted to the National Institutes of Health (NIH); however, only 3,387 received funding, for an overall funding rate of 14.0%. Of those that did receive funding, only 1,238 (13%) were awarded to new investigators.4
The Plastic Surgery Foundation (PSF) is the research and innovation arm of the American Society of Plastic Surgeons. One of the key missions of The PSF is to identify and support promising research in plastic and reconstructive surgery. Providing grant funding to young investigators early in their careers helps draw some of the brightest minds into plastic surgery research. Giving young investigators an opportunity to test their ideas and gather data that can be used to compete for additional, more substantial extramural funding is a key strategy used by The PSF to encourage innovation and build a researcher pipeline for the specialty. The PSF has played an integral part in funding high-impact projects that have advanced the specialty and benefitted patients. In the last five years alone, The PSF has invested over $3.5 million to support 157 meritorious research projects. In many cases, PSF-funded projects have generated sufficient pilot data to compete for NIH-level funding, and a growing number of PSF-funded investigators have gone on to receive NIH awards.5 Additionally, The PSF serves as an important funding vehicle for studies important to the care of plastic surgery patients that would likely not be funded by the NIH or other Federal agencies.
With limited funding available through The PSF’s research grant program, successfully obtaining research support is a highly competitive process. The funding success rate through The PSF is approximately 22%. Recognizing the importance of securing grant support in order to establish a research career and foster medical advances, the purpose of this study was to analyze and identify critical components that impact the likelihood of receiving grant funding from The PSF.
Methods
The PSF Grant Review Process
Each year, The PSF reviews approximately 140 research and training grant applications. In evaluating research grant proposals, The PSF uses a scientific peer-review process modeled closely on the one used by the National Institutes of Health.6 This facilitates consistency and objectivity, and helps investigators prepare for future applications to the NIH. Two to three reviewers critically evaluate each grant application and then present their impressions to a study section panel consisting of 12 to 30 members. Using standard criteria, the reviewers score each application on a scale from 1.0 (exceptional) to 9.0 (poor), considering the project’s significance, approach, innovation, investigator team, and the environment where the research will be completed.7
Reviewer Comment Database
Research project grant applications (e.g., Pilot and National Endowment for Plastic Surgery research grants) submitted and reviewed by The PSF’s 2012 and 2013 study sections were identified. Research Fellowship applications were excluded from this analysis because they were reviewed by different criteria with additional emphasis on the training modules. All grant applications were assigned a unique identification number. Then the grant critique forms from each application’s primary and secondary reviewer were compiled and read by a member of The PSF’s Grant Management Department. All reviewer comments pertaining to application weaknesses and deficits were abstracted and entered into the study database. Each comment was matched to the appropriate unique application identification number, so study team members were blinded to the names of grant applicants and their respective institutions.
Grant Reviewer Comment Classification
Taking into consideration categories identified by Agarwal et al., and after reviewing The PSF’s National Endowment for Plastic Surgery (NEPS) and Pilot research grant guidelines as well as The PSF Study Section Reviewer instructions, a list of 19 common grant application weaknesses (Table 1) was developed.8 Each comment in the Reviewer Comment Database was read by the same two study team members. Each study team member independently classified every comment into at least one of the 19 weakness types.
Table 1.
Results of comment review categorization by grant funding status (2012 – 2013)
| Area of Weakness | Funded (n=55) | Not Funded (n=185) |
|---|---|---|
| Project Concept | 76.4% | 83.2% |
| Study Significance/Relevance/Impact | 27.2% | 55.1% |
| Overlap with Current/Previous Work | 20.0% | 8.6% |
| Published Literature/Preliminary Data | 16.4% | 31.4% |
| Study Aims/Question | 32.7% | 50.3% |
| Hypothesis | 5.5% | 26.5% |
| Scope/Timeline/Feasibility to Complete | 23.6% | 22.2% |
| Project Design | 61.8% | 83.8% |
| Approach/Model/Design | 56.4% | 76.2% |
| Enrollment/Sample Size/Power | 18.2% | 35.1% |
| Consideration of Limitations/Alternatives | 10.1% | 18.4% |
| Plan for Execution | 61.8% | 82.7% |
| Research Strategy/Methods/Techniques | 36.4% | 58.9% |
| Procedures Not Explained Well | 10.9% | 31.9% |
| Questionable/Unclear Variables/Endpoints | 27.3% | 40.0% |
| Statistical Issues | 3.6% | 16.2% |
| Analysis Issues | 40.0% | 49.2% |
| Team Environment | 25.5% | 40.0% |
| Research Team Experience/Unclear Roles | 23.6% | 36.2% |
| Institutional Support/Mentor/Sponsor Support | 3.6% | 7.0% |
| Other Grant Elements | 32.7% | 49.7% |
| Poorly Written/Not Focused/Not Complete | 14.5% | 37.3% |
| Budget Issues | 20.0% | 20.5% |
| IRB/Ethical Concerns | 0.0% | 2.7% |
The initial comment classifications made by each study team member were compared. A conference call was used to discuss the instances where there were discrepancies in how each study team member classified a specific comment and to achieve final consensus. Considering the large number of comments that were reviewed, consensus on the discrepancies in classification was reached between the same two reviewers rather than bringing in a third reviewer who would have been far less familiar with the general nature of the comments and the classification system being used. Once final consensus was reached on all comments, application specific information (e.g., year of submission, application type, grant score, and funding status) was matched using the unique application identification number. This process produced the final dataset used for analysis.
Data Analysis
Each of the 19 weakness types was assigned to one of five general categories: project concept, project design, plan for execution, team environment, and other grant elements/grantsmanship (Table 1). These five categories correspond to the critical elements common to grant applications and to research study design, and formed the basis for our data analysis.
Descriptive statistics were used to evaluate how the grant applications were scored and which general categories were criticized most often in funded and unfunded grants. The chi-square test was used to compare results between funded and unfunded grants. Linear regression was then used to identify which elements of the grant critique corresponded to changes in grant scores. Logistic regression was used to identify which elements of the grant reviewer evaluation significantly predicted funding decisions. All analyses were performed using Stata 12.0.
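The original analyses were run in Stata 12.0. As an illustration only, the chi-square comparisons can be reproduced from the published rates; the sketch below is a minimal Python example, assuming counts back-calculated from Table 1 (32.7% of 55 funded and 49.7% of 185 unfunded applications had an “other grant elements” comment, i.e., 18/55 vs. 92/185):

```python
import math

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square test (1 df) for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, row2 = a + b, c + d
    col1, col2 = a + c, b + d
    stat = 0.0
    for obs, row, col in ((a, row1, col1), (b, row1, col2),
                          (c, row2, col1), (d, row2, col2)):
        exp = row * col / n                # expected count under independence
        stat += (obs - exp) ** 2 / exp
    # With 1 df, the chi-square survival function reduces to erfc(sqrt(x/2)).
    p = math.erfc(math.sqrt(stat / 2.0))
    return stat, p

# "Other grant elements" comments: 18 of 55 funded vs. 92 of 185 unfunded
# (counts are an assumption reconstructed from the percentages in Table 1).
stat, p = chi_square_2x2(18, 55 - 18, 92, 185 - 92)
print(round(stat, 2), round(p, 3))  # p ≈ 0.026, matching Table 3
```

This is a sketch of the test's mechanics, not the authors' code; the same comparison in Stata (`tabulate ... , chi2`) or SciPy (`chi2_contingency` with `correction=False`) yields the equivalent uncorrected statistic.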
Results
A total of 240 research project grant applications were reviewed by The PSF during the 2012 and 2013 grant cycles. All 480 critique forms from these grants were reviewed, with a total of 1,764 reviewer comments extracted. Final grant application scores ranged from 1 to 9, with scores distributed across the full range (Table 2, Figure 1). The average score was 3.25 for funded applications and 5.98 for unfunded applications. Fifty-five applications (22.9%) received funding from The PSF; of these, 13 were submitted through the NEPS mechanism and 42 through the Pilot Research Grant mechanism.
Table 2.
Grant score distribution (2012 – 2013)
| Grant Score | Number of Grants |
|---|---|
| 1.00 – 1.99 | 4 |
| 2.00 – 2.99 | 26 |
| 3.00 – 3.99 | 29 |
| 4.00 – 4.99 | 48 |
| 5.00 – 5.99 | 78 |
| 6.00 – 6.99 | 40 |
| 7.00 – 7.99 | 37 |
| 8.00 – 8.99 | 11 |
| 9 | 3 |
FIGURE 1.
Distribution of final priority scores for research project grants submitted to The PSF in 2012 and 2013.
All grant applications received at least one weakness comment. Eight grants had comments in only one of the 19 weakness areas, whereas 30 grants had comments in more than 10 of the 19 areas. After collapsing these data into the five general categories, 80% of all applications had at least one comment associated with project concept, 77% with project design, 74% with plan for execution, 39% with team environment, and 48% with other grant elements (Table 2). Chi-square analysis identified that funded grants had significantly fewer comments related to project design, plan for execution, team environment, and other grant elements compared with unfunded grants (Table 3). Funded grants also had fewer comments related to project concept, but this difference did not reach statistical significance. Nearly 83% of all unfunded applications had a weakness in the “plan for execution” category. The most common weakness areas within this category involved problems with the research project’s strategy/methods/techniques (58.9%), data analysis (49.2%), and unclear variables or endpoints (40.0%). Almost 50% of unfunded applications had weaknesses in the “other grant elements/grantsmanship” category; these weaknesses suggest applications that were hastily prepared, poorly written, confusing to read, incomplete, or did not follow the directions.
Table 3.
Chi-square results of reviewer category concerns between funded and not funded grants.
| Reviewer Category | Reviewer Comments in Funded Grants (%) | Reviewer Comments in Not Funded Grants (%) | Chi-square (p value) |
|---|---|---|---|
| Project Concept | 76.4 | 83.2 | 0.247 |
| Project Design* | 61.8 | 83.8 | <0.001 |
| Plan for Execution* | 61.8 | 82.7 | 0.001 |
| Team Environment* | 25.5 | 40.0 | 0.05 |
| Other Grant Elements/Grantsmanship* | 32.7 | 49.7 | 0.026 |

* significant result with p ≤ 0.05
Logistic regression confirmed that the score an application received was significantly associated with its likelihood of being funded, with each full-point increase in score resulting in a nearly 95% relative decrease in the odds of the application being funded (OR 0.058, p<0.001, 95% CI 0.024–0.135). Controlling for funding mechanism, regression analysis identified that concerns in the “plan for execution” and “other grant elements/grantsmanship” categories resulted in significant increases in grant score (Table 4). Furthermore, logistic regression showed that reviewer concerns in these same two categories significantly reduced the odds of a grant being funded (Table 5). Using a separate model, we also identified that the odds of having a grant funded decreased by nearly 50% for each additional general category with a reviewer concern (OR 0.527, p<0.001, 95% CI 0.41–0.70).
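For readers less familiar with odds ratios, the percentages above follow directly from the reported ORs, and ORs compound multiplicatively over repeated increments. A short illustrative check (not part of the original analysis):

```python
def pct_change_in_odds(odds_ratio: float) -> float:
    """Percent change in odds implied by an odds ratio."""
    return (odds_ratio - 1.0) * 100.0

# OR 0.058 per one-point worsening in score
print(pct_change_in_odds(0.058))   # ≈ -94.2%, the "nearly 95%" decrease

# OR 0.527 per additional general category with a reviewer concern
print(pct_change_in_odds(0.527))   # ≈ -47.3%, the "nearly 50%" decrease

# Odds ratios multiply: three flagged categories leave roughly
# 15% of the baseline odds of funding (0.527 cubed).
print(round(0.527 ** 3, 3))
```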
Table 4.
Linear regression results of reviewer-identified deficiencies predicting changes in grant score
| Category with Reviewer Concern | Coef.◊ | SE◊◊ | p value |
|---|---|---|---|
| Project Concept | 0.161 | 0.291 | 0.58 |
| Project Design | 0.289 | 0.335 | 0.389 |
| Plan for Execution | 1.005* | 0.284 | <0.001 |
| Team Environment | 0.308 | 0.207 | 0.139 |
| Other Grant Elements | 0.410* | 0.201 | 0.05 |

◊ Coefficient
◊◊ Standard error
* significant result with p ≤ 0.05
Table 5.
Logistic regression results of reviewer category deficiencies predicting grant being funded
| Deficiency Identified by Reviewer | Odds Ratio | SE◊◊ | p value |
|---|---|---|---|
| Project Concept | 0.681 | 0.299 | 0.38 |
| Project Design | 0.606 | 0.290 | 0.30 |
| Plan for Execution | 0.409* | 0.166 | 0.03 |
| Team Environment | 0.721 | 0.256 | 0.36 |
| Other Grant Elements/Grantsmanship | 0.439* | 0.149 | 0.02 |

◊◊ Standard error
* significant result with p ≤ 0.05
Discussion
In today’s competitive research environment, just having a new or innovative idea is not enough to secure the necessary funding to complete a research project. Most supporting agencies now require investigators to submit a formal grant proposal describing the need for the project, detailed methods about how experiments will be performed and data analyzed, as well as expected results and potential impact.9–12 To be effective, an investigator’s grant proposal must prove to the reviewers that there is high likelihood that the project will produce meaningful results.13
Similar to national funding agencies such as the NIH and the Agency for Healthcare Research and Quality, The PSF uses a peer review system to systematically evaluate the scientific merit of research grant applications. Through peer review, the critical evaluation of grant applications performed by independent experts helps funding organizations identify projects that have the highest potential for success. This study examined the critical comments submitted by scientific reviewers volunteering in The PSF’s peer review process for the purpose of identifying common weaknesses that affect the final funding decision of research grant applications submitted to The PSF.
The results of our study demonstrate that a grant application does not have to be flawless to receive funding. All the research grant applications evaluated for this study had at least one weakness identified during the peer review process. In fact, of the 55 research project grant applications that were funded by The PSF in 2012 and 2013, the average score was 3.25. Using the NIH rating scale for scoring grant applications, this corresponds to these applications being described as very strong with only some minor weaknesses.14 Conversely, the average score of non-funded applications was 5.98; these applications would be described as having one or more moderate weaknesses, or worse.
Although writing a flawless grant application is difficult, especially when expert grant reviewers are conducting the critical review, investigators need to limit the number and severity of mistakes. The data from our study confirm that, overall, funded applications have fewer problems or weaknesses when compared with those applications that did not receive funding from The PSF. Specifically, in 18 of the 19 weakness areas used to categorize the critical comments, a greater percentage of non-funded applications had problems in those respective areas compared to applications that received funding (Table 1). The only exception was the weakness area “overlap with current or previous studies.” It is possible that investigators submitting research grant applications with that particular weakness wrote well-structured applications, but the grant reviewers were concerned that the research project was not novel and was merely a continuation of a project that had already been funded or conducted.
Our study also highlights that certain types of mistakes likely impact the success or failure of achieving funding more than others. Weaknesses in the “plan for execution” and “other grant elements/grantsmanship” categories significantly impacted reviewers’ scoring tendencies. Deficiencies in either of these categories were found to negatively influence grant application scores as well as the odds of a grant being funded. Data from our study show that grant reviewers do not want to struggle through a poorly written or structured grant application. Grant applicants need to ensure that they closely follow the instructions provided in the grant guidelines.9 These guidelines are provided to help applicants address all of the information a grant reviewer needs to evaluate the proposal effectively. Grant applicants also need to pay particular attention to grammatical errors and misspellings; when reviewers identify these types of errors in a grant application, they commonly question the applicant’s ability and seriousness in carrying out the proposed research.
Furthermore, grant reviewers want to clearly see research methodologies that are focused and have tangible metrics/endpoints that directly test a project’s hypothesis. When reviewers have a difficult time following the science and how experiments will actually be conducted, they often lose enthusiasm for an application. Applications that only give brief or no attention at all to how the data will be analyzed also receive low scores. Particular attention should be given to ensuring an adequate sample size to convince the grant reviewers that the data produced by a particular set of experiments will have sufficient power to yield meaningful results. By following the directions listed in the grant guidelines and allowing adequate time to appropriately review the application prior to submission, as well as ensuring that the application clearly describes how the research and analysis plans address the aims, grant applicants will substantially increase their chances of receiving funding (Table 6).9,12,13
Table 6.
Elements of a successful grant application.
| Elements Needed for a Successful Grant Application |
|---|
| Excellent Writing and Organization |
| Complete Application with all Required Elements |
| Clear Central Hypothesis |
| Specific Aims |
| Background |
| Approach/Methods |
| Analysis |
This study has limitations. We reviewed two years of applications; although our results had adequate power to reach significance, it is possible that this narrow time window limited our analysis. However, using the two most recent funding cycles helps ensure the timeliness and value of our results. Although study team members were blinded during our analysis, the original grant reviewers were not blinded, and any bias or additional extenuating factors in their reviews cannot be controlled for in our study. Additionally, PSF team members familiar with The PSF grant and review process conducted this project, which may have introduced bias in the rating and grading of grant comments for our study.
Conclusion
This study provides the first comprehensive evaluation of common weaknesses identified by expert peer reviewers of research project grant applications submitted to The PSF. These findings should serve as a helpful resource for investigators planning to submit applications to The PSF or other funding agencies. By thoroughly addressing as many of the common weakness areas identified in this study as possible, grant applicants can strengthen their applications and improve their likelihood of receiving funding. Our study also demonstrates, in a quantitative way, that certain factors influence how reviewers score an application more than others. When writing grant applications, investigators should pay special attention to adequately addressing items associated with conducting the research experiments as well as the thoughtful analyses needed to generate meaningful results, because these factors significantly impact how a grant application is scored and, ultimately, its chances of funding. Additionally, investigators need to focus on overall grantsmanship and on understanding the grant guidelines; even small mistakes in grantsmanship make it easy for a reviewer to pass on funding an application. However, considering that over 20% of applications received funding in 2012 and 2013, it is encouraging for those planning to apply for PSF grants that investigators who submitted applications with few weaknesses in key areas were successfully funded.
Acknowledgments
Source of Funding:
Support for this work was provided (in part) by The Plastic Surgery Foundation® (to AMG). Additional support was provided by the National Institute of Arthritis and Musculoskeletal and Skin Diseases of the National Institutes of Health under Award Number 2K24-AR053120-06 (to KCC). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Footnotes
- Keith M. Hume, MA – reports no commercial associations or financial disclosures
- Aviram M. Giladi, MD – reports no commercial associations or financial disclosures
- Kevin C. Chung, MD, MS – reports no commercial associations or financial disclosures
References
- 1.Dorsey ER, de Roulet J, Thompson JP, et al. Funding of US biomedical research, 2003 – 2008. JAMA. 2010;303:137–143. doi: 10.1001/jama.2009.1987. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 2.Chakma J, Sun GH, Steinberg JD, Sammut SM, Jagsi R. Asia’s ascent – global trends in biomedical R&D expenditures. N Engl J Med. 2014;370:3–6. doi: 10.1056/NEJMp1311068. [DOI] [PubMed] [Google Scholar]
- 3.Truth and Consequences: Health R&D Spending in The U.S. (FY11-12) [Accessed May 30, 2014]; Available at http://www.researchamerica.org/uploads/healthdollarl2.pdf. [Google Scholar]
- 4.NIH Data Book Reports: R01-Equivalent grants, New (Type 1): Success rates, by career stage of investigator. [Accessed May 30, 2014]; Available at: http://report.nih.gov/nihdatabook/index.aspx.
- 5.Larson KE, Gastman B. Sources of Federal Funding in Plastic and Reconstructive Surgery Research. Plast Reconstr Surg. 2014;133(5):1289–1294. doi: 10.1097/PRS.0000000000000083. [DOI] [PubMed] [Google Scholar]
- 6.NIH Peer Review: Grants and Cooperative Agreements. [Accessed May 15, 2014]; Available at http://grants.nih.gov/grants/peerreview22713webv2.pdf.
- 7.NIH Reviewer Orientation. [Accessed May 15, 2014]; Available at http://grants.nih.gov/grants/peer/guidelines_general/reviewer_orientation.pdf.
- 8.Agarwal R, Chertow GM, Mehta RL. Strategies for Successful Patient Oriented Research: Why Did I (not) Get Funded? Clin J Am Soc Nephrol. 2006;1:340–343. doi: 10.2215/CJN.00130605. [DOI] [PubMed] [Google Scholar]
- 9.Chung KC, Shauver MJ. Fundamental principles of writing a successful grant proposal. J Hand Surg Am. 2008;33(4):566–572. doi: 10.1016/j.jhsa.2007.11.028. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 10.Kotsis SV, Chung KC. A guide for writing in the scientific forum. Plast Reconstr Surg. 2010;126(5):1763–1771. doi: 10.1097/PRS.0b013e3181ef8074. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 11.Wiseman JT, Alavi K, Milner RJ. Grant Writing 101. Clin Colon Rectal Surg. 2013;26(4):228–231. doi: 10.1055/s-0033-1356722. [DOI] [PMC free article] [PubMed] [Google Scholar]
- 12.Davidson NO. Grant Writing: Tips and Pointers from a Personal Perspective. Gastroenterology. 2012;142(1):4–7. doi: 10.1053/j.gastro.2011.11.005. [DOI] [PubMed] [Google Scholar]
- 13.Kaplan K. Funding: Got to Get a Grant. Nature. 2012;482(7385):429–431. doi: 10.1038/nj7385-429a. [DOI] [PubMed] [Google Scholar]
- 14.Interpreting new application scores and critiques. [Accessed May 15, 2014]; Available at http://enhancing-peer-review.nih.gov/docs/scoring_and_critique_overview_June2009.pdf. [Google Scholar]

