AMIA Annual Symposium Proceedings. 2025 May 22;2024:378–387.

Development and Usability Testing of a Web-Based Research Guide for Health Solutions Grant Writing

Maheswari Eluru 1, Aishwarya S Potturu 1, Matthew Scotch 1, Lisa Allen 1, Nancy Osgood 1, Ana Tello 1, Adela Grando 1
PMCID: PMC12099331  PMID: 40417530

Abstract

Young scientists, including postdocs and assistant professors, need access to grant writing resources for training and proposal development. To assist in this, we developed a web-based research guide providing centralized access to curated tools throughout the research funding process, from finding funding and preparing proposals to managing awards. Using consumer informatics principles, we enhanced the research grant repository's effectiveness, with lessons learned and insights generalizable to other institutions. Six faculty members completed nine tasks exploring the guide's ten sections. Participants found the guide highly usable, with an excellent System Usability Scale (SUS) score of 89.2. Suggestions included improving navigation, organizing content better, and providing education on award management processes. Well-liked features were the chronological organization of information, samples from successful grants, pre-populated templates, and mechanisms for ongoing feedback. These findings underscore the importance of usability in developing resources that effectively support faculty in grant writing and proposal development.

Introduction

In biomedical informatics research, usability studies have been instrumental in understanding the nuances of human-computer interaction in clinical and public health settings. However, little work illustrates how usability principles can be leveraged to enhance the effectiveness of research grant repositories for faculty developing health solutions.

Young scientists, including postdocs and assistant professors, must have access to grant writing information and resources for training and proposal development. However, it is challenging to integrate relevant information from multiple sponsors (e.g., NIH, NSF, foundations) in a usable manner. Preparing research proposals to acquire funding is highly competitive, and the percentage of grant applications funded by large funding organizations has declined [1]. Constantly changing requirements and hyper-competition have contributed to this decline, discouraging young investigators from being creative because success is not guaranteed and grant preparation consumes too much of their time [1].

An eight-year study examined the effectiveness of different grant writing workshops in improving faculty's ability to obtain research grants [2]. Its key findings indicated that institutions need to implement a network of resources for faculty to improve the quality of grants, such as providing examples, facilitating peer revision, and explaining the specifics of grant writing [2]. The ability to write research grants can be learned [3].

Web-based resources that assist researchers in preparing successful grants are not new. Research organizations are creating websites that provide online resources for grant writing [4,5]. For instance, the Marshfield Clinic implemented a website that offered centralized access to tools assisting researchers with everything from conceptualizing their research to preparing their grant proposals [6]. Similarly, Vanderbilt University created a comprehensive web-based resource to support its research faculty and conducted studies to ascertain the benefit of its website [6, 7].

Usability testing, an important aspect of website design, was lacking in both of these earlier efforts. It can help uncover errors and gaps in learnability, efficiency, and memorability to ensure a system is easy to use and accessible to all users [8]. The Marshfield Clinic recruited 25 testers to evaluate its website and determine its effectiveness. In contrast, Vanderbilt measured the benefit of its website by collecting usage analytics, such as the number of users, changes in that number, and major section activity over three years.

The College of Health Solutions (CHS) at Arizona State University (ASU), with over 200 faculty members, aims to enhance scientific research and translate it into practical solutions. Research faculty come from seven areas: biomedical informatics and diagnostics, healthcare systems, health/medical sciences, movement sciences, nutrition, population health, and speech and hearing sciences.

CHS developed the Research Guide to educate and train faculty on grantsmanship and increase funded research. The web-based CHS Research Guide, only accessible to those with ASU user accounts (faculty, staff, and students), provides centralized access to curated tools to assist CHS faculty throughout the research funding process—from finding funding to preparing proposals and managing awards. The intention behind this curated resource was to provide faculty with continuously updated information for constantly changing proposal guidelines. Equally important was the ease of navigation throughout the webpage.

This study aims to assess the research guide's usability, identify areas for improvement to better support faculty grantsmanship needs, and share lessons learned to help other institutions create effective education and training material for their faculty.

Methods

CHS Research Guide

The CHS research guide was designed to explain the research funding process. It is divided into ten sections (Table 1), each outlining a different step. The Guide follows a chronological flow to facilitate navigation.

Table 1:

Descriptions of the different sections in the CHS Research Guide

Section Description Example Resources
1. Getting Started Provides a brief overview of the intranet
  • Research Success Hub (team led by the CHS Assistant Dean for Research)

  • Research Advancement Team (pre- and post-award specialists)

2. Funding Describes the grant initiation phase and directs researchers to funding sources Funding searching tools:
  • federal (e.g., Grants.gov)

  • non-federal (e.g., Foundation Directory Online)

  • ASU funding (e.g., college seed grants)

3. Proposal Preparation Compiles resources (guidelines, samples, and templates) for various types of grants and describes all the steps taken during the proposal preparation phase
  • Intake forms to start the grant application

  • CHS resources to support research, like the Center for Health Information and Research (CHiR)

  • Samples, templates, and training for the major funding sources (e.g., NIH, NSF, DOD, DOE)

  • Password-protected access to past CHS-funded applications

4. Proposal Revision Lists and describes resources for researchers to improve the quality of their grants prior to submission
  • Free CHS resources (e.g., CHS Grant Review Committee)

5. Proposal Resubmission Details the process for resubmitting a grant with samples and templates
  • Resubmission guidance (e.g., NIH instructions)

  • Password-protected past CHS-funded resubmitted applications (original submission, response to reviewers, and resubmission)

6. Award Review & Acceptance Explains the Notice of Award and the process of reviewing the award acceptance
  • What to expect when getting an award

7. Award Setup Provides information about setting up a grant account and certain exceptions to this process such as an At-Risk Account
  • Grant account opening in the financial management system

8. Manage Award Guides researchers through the process of managing their award from making purchases to hiring others and cost-sharing
  • Pre- and post-kick-off meeting details

  • Post-award tasks (e.g., purchasing equipment)

9. Award Close Out Describes the award close-out process and lists the steps that must be taken to close out an award
  • Institutional policies on award close out

  • Grant account closing details

10. Dissemination Emphasizes the importance of sharing research findings and provides resources to ease this process
  • ASU resources to disseminate research

  • ASU after-award initiatives (e.g., commercialization)

Study Participants

The Institutional Review Board (IRB) at Arizona State University reviewed and approved this study (STUDY00017107). Participants consented to participate. We aimed to recruit six faculty members (two Assistant Professors or similar rank, two Associate Professors or similar rank, and two Professors or similar rank). We selected potential participants based on their rank and research area, aiming for diversity, and recruited them through flyers sent by email.

Study Procedures

We created a demographic questionnaire to collect participants’ gender, academic rank, academic program, experience writing grants, and awareness of the CHS Research Guide (Table 3).

We employed a within-subject experimental design for the usability study, giving participants a five-minute familiarization period with the CHS Research Guide, during which they explored the website's layout to become acquainted with its structure. Following this familiarization phase, participants completed nine tasks exploring different sections of the CHS Research Guide (Table 2). The tasks served to assess the ease of navigation throughout the website and to identify areas where it could be made more user-friendly and accessible. We gave participants a maximum of five minutes to complete each task, with no breaks between tasks, and they could skip any task if they wished.

Table 2:

Tasks presented to participants in the usability study and corresponding section in the CHS Research Guide

Tasks
Task 1: You need to find federal funding for your research. Where can you find the information on NIH types of grants?
Task 2: You are preparing a grant proposal. Can you find the Biostatistics Core Recharge Center for Collaborative Research Support?
Task 3: You are preparing a grant proposal again. Can you locate the CHS intake form to indicate your interest to submit a grant?
Task 4: You are preparing a grant proposal. Can you find an NIH biosketch sample?
Task 5: You are revising your grant proposal. Can you find information on CHS Grant Consultation services?
Task 6: You are revising your grant proposal again. What are the due dates for the CHS Grant Review Committee assistance?
Task 7: You are resubmitting your grant proposal. Can you find examples of grant resubmissions?
Task 8: You are managing your award. Can you find information on the kickoff meeting with the research advancement team?
Task 9: You are managing your award. Can you find information on how to pay a single invoice?

After completing the tasks, participants filled out a post-study satisfaction survey using a Likert scale ranging from 1 (Strongly disagree) to 5 (Strongly agree) and shared their overarching thoughts on and satisfaction with the website.

Data Collection and Analysis

We summarized the participants' demographics using descriptive statistics.

We conducted the usability study in person using Loop11 [9], a user experience testing software. Loop11 captured the participants' screens, audio, and video while they completed the usability tasks. It also captured responses to the pre-study demographic survey and the post-study satisfaction survey, and it measured the time and number of clicks required for successful task completion and the number of errors during task completion. We presented those results in a heat map.

For each task, we summarized usability issues, provided exemplary quotes, and quantified the severity and frequency. For each issue, we proposed improvements to the research guide to address it.

Participant satisfaction was measured through the post-study survey questions. We used descriptive statistics and the System Usability Scale (SUS) [10].
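For reference, SUS scores follow the standard Brooke scoring scheme: odd-numbered (positively worded) items contribute the response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the sum is scaled by 2.5 to a 0-100 range. A minimal sketch of that computation, using hypothetical responses rather than the study's data:

```python
def sus_score(responses):
    """Standard SUS scoring for ten Likert items rated 1-5."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded (score = r - 1);
        # even items are negatively worded (score = 5 - r).
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 sum to the 0-100 SUS range

# Hypothetical participant responses, not the study's data:
print(sus_score([5, 1, 5, 2, 4, 1, 5, 1, 5, 2]))  # -> 92.5
```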

Results

Participant Demographics

The participant demographics, summarized in Table 3, showcase key characteristics of the study’s sample group (N=6). More than half of the participants were male (66.66%), with a varied distribution across teaching program areas. Teaching ranks were evenly distributed among Professor, Associate Professor, and Assistant Professor roles. Most participants had submitted 1-5 grants (50.00%), with NIH grants being the most common (44.44%). About a third of participants were aware of the CHS Research Guide Website.

Table 3:

Participant demographics (N=6)

Demographics | Groups | Frequency | Percentage
Gender | Male | 4 | 66.66%
 | Female | 2 | 33.33%
Teaching Program Area(s) | Biomedical Informatics and Diagnostics | 4 | 36.36%
 | Healthcare Delivery and Behavioral Health | 3 | 27.27%
 | Movement Science | 2 | 18.18%
 | Population Health | 2 | 18.18%
Teaching Rank | Professor | 2 | 33.33%
 | Associate Professor | 2 | 33.33%
 | Assistant Professor | 2 | 33.33%
Grants Submitted | 1-5 | 3 | 50.00%
 | 6-15 | 1 | 16.66%
 | 16-25 | 0 | 0.00%
 | 25+ | 2 | 33.33%
Grant Type | Department of Defense (DOD) | 2 | 22.22%
 | Department of Education (DOE) | 0 | 0.00%
 | National Institutes of Health (NIH) | 4 | 44.44%
 | National Science Foundation (NSF) | 1 | 11.11%
 | Other | 2 | 22.22%
Aware of CHS Research Guide Website | Yes | 2 | 33.33%
 | No | 4 | 66.66%

Usability Study

Across the nine usability tasks, participants achieved a 100% success rate; every task was completed successfully. Examining the time participants spent on tasks revealed variations (Figure 1). Tasks 1 and 8 took notably longer than the rest, with average completion times of 111 and 95 seconds, respectively, compared to an overall average of 66.44 seconds across all tasks, suggesting potential areas for improving user efficiency on specific tasks. The severity scales in our study use a color scheme ranging from green to red to indicate different levels of usability issues: green for minor, yellow for moderate, and red for significant problems. This color-coded scheme helps quickly assess and prioritize the improvements needed to enhance the user experience. Qualitative analysis uncovered common themes in participant feedback (Table 4).

Figure 1: Heat map for task performance metrics based on average page views and average time taken (in seconds).
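As an illustration of how a heat map like Figure 1 can be produced from per-task averages, here is a minimal matplotlib sketch. Aside from the 111-second and 95-second averages for Tasks 1 and 8 reported above, the values are hypothetical placeholders rather than the study's Loop11 export.

```python
import numpy as np
import matplotlib.pyplot as plt

# Per-task averages: Tasks 1 and 8 use the times reported in the text;
# the remaining values are hypothetical placeholders, not the Loop11 data.
tasks = [f"Task {i}" for i in range(1, 10)]
avg_time = np.array([111, 52, 48, 60, 55, 50, 45, 95, 82], dtype=float)
avg_views = np.array([6, 3, 3, 4, 3, 3, 2, 5, 4], dtype=float)

# Normalize each metric so both rows share one green-to-red color scale,
# mirroring the severity color scheme described in the Results.
data = np.vstack([avg_time / avg_time.max(), avg_views / avg_views.max()])

fig, ax = plt.subplots(figsize=(8, 2.5))
im = ax.imshow(data, cmap="RdYlGn_r", aspect="auto")
ax.set_xticks(range(len(tasks)), labels=tasks, rotation=45, ha="right")
ax.set_yticks([0, 1], labels=["Avg. time (s)", "Avg. page views"])
fig.colorbar(im, ax=ax, label="Relative magnitude")
fig.tight_layout()
plt.show()
```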

Table 4:

Summary of usability issues for each task, including information on each issue’s severity, frequency, exemplary quotes, and suggested website improvements.

Task Comments Severity Freq. Improvements
1
  1. Appears as an embedded website; suggest navigating through federal tools to access NIH information.

  2. NIH section might be misunderstood as an ASU webpage; I recommend a separate, highlighted section with clear distinctions for different mechanisms like R01, R21.

Major (Fig. 2) 2
  1. Highlight distinctions for different grant mechanisms.

  2. Clarify NIH section as a separate, easily distinguishable entity.

  3. Streamline content organization.

2
  1. I suggest relocating this information; usage of certain features like ‘Center of Health Information and Research’ and ‘Recording Studio’ is occasional and might benefit from a separate section.

Moderate (Fig. 3) 1
  1. Consider creating a dedicated section for occasional features.

3
  1. Suggest relocating the intake form used to start the grant application process; it doesn’t align with the ‘Find Funding’ section.

  2. Observed redundancy in multiple locations; recommend streamlining and organizing content.

Minor 2
  1. Reorganize the intake form to align with the appropriate section.

  2. Streamline and organize content to avoid redundancy.

4
  1. As a grant applicant with limited experience, I submitted a grant through the Department of Defense and am unfamiliar with a ‘biographical sketch.’

  2. I propose a specific search function for this research segment or the inclusion of a site-wide search option.

  3. Noticing non-alphabetical organization of resources like sample applications listed before biosketch. Consider an alphabetical arrangement.

Moderate (Fig. 4) 3
  1. Provide clear explanations for unfamiliar terms.

  2. Implement a specific search function for research segments.

  3. Consider alphabetical organization of resources.

5
  1. While reviewing the content, I noticed a significant amount of red text, creating uncertainty about the correct page.

  2. Seeking information on proposal revision raised questions about the distinctions between revision and resubmission.

  3. I wondered about the response time for proposal revision assistance, whether it operates on a first-come, first-served basis, and the expected timeframe.

Moderate (Fig. 5) 3
  1. Minimize red text for clarity.

  2. Provide clear distinctions between revision and resubmission.

  3. Communicate expected response times for assistance.

6
  1. Suggesting breaking long text into bullet points.

  2. Asked for clearer presentation of NIH due dates in the table.

  3. Commenting on the order of due dates, highlighting the potential confusion regarding NIH deadlines that may vary based on specific proposal types. Proposing a more intuitive arrangement by emphasizing the initial due date for submission to CHS and subsequently detailing the panel meeting date associated with that deadline.

Minor 3
  1. Use visual cues for multiple examples.

  2. Maintain clear due date presentation.

  3. Emphasize initial due dates for clarity.

7
  1. Noticing potential oversight due to the lack of red highlighting, I skipped over the section that wasn’t emphasized as a main bullet point.

  2. Observing an abundance of dots, indicating potential complexity or information overload.

  3. Raising a question about the distinction between proposal revision and proposal resubmission, highlighting a point of uncertainty in the process.

Minor 3
  1. Use consistent highlighting for clarity.

  2. Simplify complex information presentation.

  3. Clearly define the differences between revision and resubmission.

8
  1. Questioning the logic of finding kickoff meeting information under ‘Award Setup’ for first-time grant recipients. Initial search led to ‘At-Risk Account,’ but clarity came under ‘Manage Award’ in ‘Pre to Post Kickoff Meeting.’ Suggesting ‘Award Setup’ for improved intuitiveness.

  2. Not anticipating kickoff meeting details under ‘Manage Award,’ but rather in ‘Award Setup.’ Proposing a more logical placement to enhance user expectations.

Major (Fig. 6) 2
  1. Clarify logic in information placement.

  2. Align with user expectations for information location.

9
  1. Suggesting a need for rephrasing the question asked for ‘You are kicking off your award […]’

  2. Overlooking ‘Purchasing’ under the assumption it involves invoices post-purchase. Questioning whether invoice setup precedes the actual purchase process.

Moderate (Fig. 7) 2
  1. Standardize terminology.

  2. Clarify question phrasing for user understanding.

  3. Provide guidance on invoice setup to eliminate uncertainty.

Usability issues were categorized based on severity, with ‘Task 1’ needing major improvement due to longer completion times and participant feedback. Prioritizing enhancements to ‘Task 8’ and refining the “Manage Award” process emerged as key areas to improve the overall user experience.

Participants appreciated the guide’s user-friendly layout and found the “Proposal Preparation” section effective, with the least time taken to complete the corresponding task. They liked having samples and templates for different grants. However, they felt the “Find Funding” process was complex and suggested adding a search engine and organizing NIH samples alphabetically.

Post Study Satisfaction Survey

Four of the six participants (66.66%) strongly agreed that the website’s title accurately reflected its content, ensuring clear expectations from the outset. Similarly, 66.66% found the information within the CHS Research Guide to be clearly presented, facilitating ease of understanding.

Moreover, most participants (83.33%) agreed that the information provided in the guide was highly relevant to their grant writing needs and purposes. This indicates that the resources and tools curated in the guide effectively supported their grant proposal development processes.

In terms of usability, all participants strongly agreed that they could navigate the site without difficulty, highlighting the guide’s intuitive design and user-friendly interface. They also found the links within the guide helpful and appropriately placed, enhancing their overall navigation experience. Importantly, all participants expressed their intention to use the research guide in future grant writing endeavors and would recommend it to other faculty members within CHS.
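The agreement percentages above are straightforward proportions of the six responses per survey item; the minimal sketch below shows that aggregation using hypothetical responses (rating 5 = Strongly agree), not the study's raw data.

```python
from collections import Counter

# Hypothetical Likert responses (1 = Strongly disagree ... 5 = Strongly agree)
# for a single survey item from six participants; not the study's raw data.
responses = [5, 5, 5, 5, 4, 4]

counts = Counter(responses)
n = len(responses)
for rating in range(5, 0, -1):
    pct = 100 * counts.get(rating, 0) / n
    print(f"Rating {rating}: {counts.get(rating, 0)}/{n} ({pct:.2f}%)")
# Rating 5: 4/6 (66.67%), matching the ~66.66% "strongly agreed" reported above
```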

The positive feedback from participants is further underscored by an average SUS score of 89.2, which falls within the “Excellent” range [12]. This score reflects participants’ high satisfaction with the CHS Research Guide, indicating that it was highly usable, easy to navigate, and instrumental in effectively meeting their grant writing needs.

Examining individual SUS scores revealed that even the lowest score, 77.5, still indicates a high level of usability satisfaction among participants. Overall, the SUS score analysis strongly supports the conclusion that the CHS Research Guide significantly enhances faculty members’ ability to navigate the complexities of grant writing, making it a valuable resource within CHS.

Discussion

This study demonstrates the application of usability testing, a consumer informatics technique, to improve a web-based repository that helps faculty prepare grant submissions more effectively. Lessons learned from this exercise are generalizable to other institutions and should be of value to medical informaticists and other professionals seeking funding to develop health solutions.

Lessons Learned

This usability study provided valuable lessons that could help other institutions create effective educational and training material to improve faculty grantsmanship. The features of the research guide that participants liked most were:

  • Chronological approach to organizing information (Table 1, Sections), making it easier to find information related to the life cycle of preparing a grant proposal and conducting funded research.

  • Samples of successful grant proposals and grant resubmissions from other college faculty (Table 1, Proposal Preparation and Proposal Resubmission sections), which are password-protected (only college faculty can access them) because they contain potentially sensitive information (e.g., budget or faculty salary).

  • Pre-populated templates that could be adapted and reused (e.g., Institution Facilities and Resources template) and that are curated and kept up to date by the institution.

  • Feedback mechanisms embedded on the website for faculty to provide comments and questions on the shared material.

The participants had challenges understanding administrative terms, forms, and processes related to pre-award, active-award, and post-award management (e.g., the pre-award kickoff meeting, invoice payment). This finding stresses the need to educate and support faculty on the institution’s specific award preparation and management requirements. Such support could include new-faculty training, newsletters or email updates on new or changed administrative processes, on-demand tutorials, and workshops with hands-on training.

Limitations

The study involved a small sample of six faculty members, potentially impacting the generalizability of the findings to a larger academic community. Nevertheless, according to Nielsen et al. [13], this sample size is acceptable.
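This aligns with the widely cited Nielsen-Landauer problem-discovery model, in which the expected proportion of usability problems found by n evaluators is 1 - (1 - λ)^n, where λ ≈ 0.31 is the commonly reported average per-evaluator discovery rate. A quick check under that assumption:

```python
# Nielsen-Landauer problem-discovery model: expected fraction of usability
# problems found by n independent evaluators, each finding a fraction `lam`
# of problems (lam ~= 0.31 is the commonly cited average).
def fraction_found(n, lam=0.31):
    return 1 - (1 - lam) ** n

print(f"{fraction_found(6):.0%}")  # ~89% of problems expected with six users
```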

This study may have limited applicability outside the CHS, particularly for researchers in different academic disciplines.

The five-minute familiarization period before usability tasks might only capture some aspects of the long-term user experience, potentially overlooking more nuanced usability issues. Despite this, time constraints are often a practical consideration.

Positive responses in the post-study survey may be influenced by participants’ awareness of being part of the study, potentially affecting the objectivity of the survey results. However, efforts were made to ensure participant anonymity and encourage honest feedback.

Future work

We will address specific participant feedback, such as incorporating a search engine, arranging content alphabetically, and refining the grant intake form.

We will conduct usability studies at regular intervals, at least once a year, to continuously evaluate the effectiveness of the research website. This ensures ongoing alignment with user needs and expectations. We will also directly integrate a user feedback mechanism into the website to facilitate real-time user input. This can provide valuable insights for immediate improvements and enhance user engagement.

Furthermore, we plan to track website usage (e.g., most viewed pages) to help guide website updates and improvement strategies.

Additionally, as part of our future goals, we aim to track the number of grant applications completed and the success rate of funding post-implementation of the research guide. This will provide concrete data on the guide’s impact on research funding activities and help us continuously improve the resource to better meet the needs of our faculty.

Conclusion

Valuable lessons, generalizable to other institutions, were learned from testing the usability of a web-based research guide. Suggestions for improvement included enhancing navigation for better usability and streamlining and organizing the content. Recommendations for others creating similar educational resources include a chronological approach to organizing information, sample grants from faculty, pre-populated templates maintained by the institution, mechanisms for ongoing feedback, and education on the institution’s specific grant preparation and management resources and requirements.

Figures & Tables

Figure 2: Post-study survey results.

References

