Abstract
Background
GeoHealth tools differ from other health-IT platforms, requiring analytical transformation tools, location-based discovery, and responsive design frameworks. While knowledge exists surrounding related platforms, there is a literature gap specific to GeoHealth tool end-user needs.
Methods
This qualitative focus group study identified design needs for GeoHealth tools. Seven focus groups included non-experts (three sessions, n = 15) and experts (healthcare professionals, four sessions, n = 16) from October 2024 to February 2025. Researchers conducted inductive thematic analysis, identifying emerging themes.
Results
Thirty-one participants completed seven sessions: 20 [65%] female, 16 [52%] White, and 16 [52%] with graduate degrees. Both groups identified similar facilitators and barriers: simple interfaces, contrasting colors, and cross-device functionality. Both valued filtering, customizing regions, downloading data, and chatbot integration. Non-experts reported frustrations with mobile use and content density, while experts emphasized integrating GeoHealth tools into clinical workflows for decision-making.
Discussion and Conclusion
End-user preferences are critical as GeoHealth tools expand. Key recommendations include: customizable features (filters, personalized regions, data layering, and export options), accessible design with high-contrast color schemes and intuitive navigation, and mobile optimization (tap-triggered overlays, optimized touch targets). Chatbot integration was valued when paired with transparent data sourcing. Healthcare professionals highlighted integrating tools into Health-IT systems for clinical decision-making. These findings can improve usability and acceptance, making health information more accessible and potentially improving health outcomes. Future work should validate findings through iterative usability testing with diverse samples and investigate technical pathways for Health-IT integrations and trustworthy chatbot development.
Keywords: Geospatial, GeoHealth, digital health, usability, focus group, health information
Introduction
End-user needs are often not fully assessed or considered during the development of GeoHealth tools.1,2 A GeoHealth tool combines geospatial visualizations (e.g. interactive maps) with health-related information.3,4 These tools differ from other health-IT platforms because they require analytical transformation tools, location-based discovery, and responsive design frameworks.5–7 They are also increasingly used,8 with major media outlets regularly employing interactive maps to share health data with the public.9 Given that survey data shows 75% of U.S. consumers search for health-related information online,10 ignoring user-centered design can create unnecessary difficulties for users seeking reliable health information.11–13
The primary end-users of GeoHealth tools are healthcare professionals (i.e. researchers and providers), as well as the general population seeking online health information. In healthcare, these tools can enhance the accuracy and transparency of findings14 while boosting overall analytical and evaluation performance.15 Examples include spatial epidemiology depicting disease distribution16 and tools measuring social determinants of health.17,18 Visualizations can deliver actionable insights, creating efficiencies in research by converting large data sets into scalable formats. Furthermore, the significant rise in academic citations of GeoHealth tools since 2019 demonstrates their benefits for these populations.19
Despite the increasing use of GeoHealth tools, their effectiveness depends on the user's level of engagement.20 Although they might meet usability standards, many tools are simply data repositories and are ineffective at communicating infection risks or encouraging compliance behaviors.21 While many frameworks include user feedback in the design process, this usually occurs late, resulting in generic visualizations that do not address the diverse needs of end users.3,22 Identifying the needs and preferences of end users earlier in development is essential to unlocking these tools' full potential.
Although there is a recognized need for user-centered design, a recent systematic review highlighted a critical need for more usability evaluations,23 as issues still persist in GeoHealth tools. Problems traditionally associated with Geographic Information System (GIS) software development have carried over to modern tools, including the exclusion of user input and concerns about the growth of extraneous features (i.e. feature creep).24,25 While efforts are made to involve targeted end users, recruiting a diverse group of participants remains challenging for researchers.20 Furthermore, while frameworks that focus on a user-centered approach in health-related interactive tools exist, particularly concerning public health dashboards,26 they often do not include the general population.27,28 Additionally, emphasizing visual appeal over functionality often leaves end users with inefficient methods to complete specific tasks.29
The current state of GIS participatory design and usability research suffers from a lack of GeoHealth-specific studies, which creates difficulty in comparing findings or building on previous research.30 A substantial amount of research is still needed to address the gaps in our digital health platforms to improve accessibility and equity across all users.31 These examples highlight a continued neglect of involving targeted end-users during the development of GeoHealth tools.
While related research exists surrounding the user needs and usability aspects of geospatial visualization tools (e.g. public health dashboards or GIS tools), to our knowledge, there are no published studies explicitly focusing on the user needs of GeoHealth tools. GeoHealth tools are inherently transdisciplinary, as they can integrate spatial data on social exposures, quantify place-based determinants of health, and enable analysis of health risks or inequities.32 By contrast, typical geospatial tools focus on visualization, mapping, and basic spatial queries rather than cross-domain analysis.33 To systematically address this literature gap, this study was guided by the following research questions:
RQ1: What are the key user experience dimensions that facilitate or inhibit engagement with GeoHealth tools among general population users and healthcare professionals?
RQ2: How do end-users across different populations perceive the usability, interpretability, and information design of existing GeoHealth tools?
RQ3: What adaptability and compatibility features do end-users identify as necessary to enhance the functionality and relevance of GeoHealth tools for their specific contexts?
These research questions guided our focus group guide to examine user perceptions across multiple dimensions of user experience. Dimensions include usability (e.g. ease of use and navigation), interpretability, information overload, adaptability (e.g. customization), and compatibility (e.g. device modality). Conducting an early investigation into the perceptions of a tool's potential users could inform its design and implementation, and ultimately have a positive impact on health outcomes. This qualitative study aimed to address this literature gap by examining the end-user needs of GeoHealth tools.
Methods
Study design
We conducted a qualitative study34 using focus groups across two target audiences: non-expert users (i.e. general population) and expert users (i.e. healthcare researchers and providers). Compared with other research methods, focus groups are well suited to capturing representative themes and insights through the conversational atmosphere this methodology promotes.35 Furthermore, research has shown that three to five focus groups are sufficient to identify the most prevalent themes, provided the study aligns with five factors that justify the generalizability of such findings.36 This study's design aligns with these factors, including a standardized instrumentation structure, homogeneous groups, simple topics with a common purpose, and a broad coding style.
Through semi-structured moderation via an IRB-approved focus group guide (Appendix 2), participants’ perceptions and preferences regarding GeoHealth tools were elicited. Each focus group session exposed participants to the same example tools: the Area Deprivation Index (ADI),37 the Child Opportunity Index (COI),38 and the New York Times COVID-19 map.9 These GeoHealth tools were selected to provide a broad sampling of design styles, layouts, functions, and features for participants to note, as they each have unique designs. After a brief site tour, participants were asked to share their thoughts on these elements. Specifically, we asked about the features and functions they liked and disliked the most, what functions participants would want to see integrated, and how they might see themselves using a GeoHealth tool (i.e. in what setting, modality, and for what purpose). Qualitative methods were guided and reported in compliance with the Consolidated Criteria for Reporting Qualitative Research (COREQ) reporting guideline (Appendix 1).39 The University of North Carolina at Chapel Hill Institutional Review Board (IRB 23-3207) approved all components of this study.
Participant selection
To obtain a representative sample of potential users of GeoHealth tools, we identified two target groups for recruitment through purposive sampling: the general population and healthcare professionals. For the general population, we aimed for broad representation in each session by recruiting individuals who are interested in health and information interaction, encompassing a range of educational levels, ages, and diverse racial and ethnic backgrounds. The healthcare professional target group included healthcare providers (e.g. MDs, NPs, RNs) and healthcare researchers. Physicians and nurse practitioners were our main focus, but we also included registered nurses to ensure broader representation. Healthcare researchers were categorized based on their experience: those with five or fewer years of experience and those with more than five years of experience.
Recruitment occurred from September 2024 to February 2025 through the university-affiliated Research for Me platform, flyers posted at public libraries, and targeted emails sent to researcher and provider populations. The exact reach is unknown due to the nature of public postings; however, a total of 1344 responses to the postings and screening survey were received. Of these responses, 881 (66%) were from the general population, 231 (17%) from healthcare researchers, and 216 (16%) from healthcare providers.
A Qualtrics40 screening survey was used to collect demographic information and determine participant eligibility, which required individuals to be at least 18 years old, proficient in English, and to confirm their occupational roles. Survey responses were screened to validate eligibility criteria. From this eligible pool, we employed maximum variation sampling to strategically select participants who would provide diverse perspectives across key dimensions. Participants were chosen to maximize variation in education, race, ethnicity, and digital literacy levels. Specifically, we prioritized selecting individuals with a high school education, older adults (e.g. those aged 50 and above), and individuals who self-reported lower digital literacy levels. These characteristics were mostly concentrated within the general population group, as all healthcare professionals have at least some college education.
A total of 31 participants were enrolled: 15 from the general population and 16 healthcare professionals. Participants were compensated, and there were no dropouts or refusals to participate.
Data collection
Focus groups were conducted virtually from October 2024 to February 2025 on the Zoom41 web-conferencing platform. JG and a trained research assistant facilitated each one-hour session. Seven focus group sessions were held: three with the general population and four with healthcare professionals. The general population groups consisted of five participants each and were organized based on their self-reported level of digital literacy, as determined from screening survey responses, with options including satisfactory, proficient, or expert. Focus groups with researchers each consisted of five participants and were categorized based on their experience level. Healthcare provider sessions were divided into two groups of three participants due to scheduling conflicts. No repeat sessions were conducted.
An IRB-approved focus group guide facilitated consistency throughout the study (Appendix 2). Participants consented to the study in writing and verbally before each session started, including agreeing to be audio-recorded. Participants were encouraged to keep their cameras on during the session to foster a conversational environment, although it was not required. During the sessions, the participants’ first names were used; however, transcripts and coded data excluded all names and identifiable information. All sessions were audio recorded on a standalone recording device, and not via the Zoom platform. This device was stored in accordance with IRB protocols in an institutional facility and was only accessible to authorized members of the research team. Additionally, transcripts were not shared with participants for review or correction.
Analysis
Focus group audio files were automatically transcribed using VookAI,42 de-identified, and manually reviewed before being uploaded to Dedoose,43 a web-based qualitative data analysis software. To ensure transcription accuracy, two members of the research team independently reviewed a subset of transcripts against the original audio recordings before proceeding with formal coding. Access to the Dedoose platform was restricted to authorized members of the research team via individual account credentials.
An inductive thematic research approach grounded in a constructivist perspective44 was used for analysis, with each unique population group serving as the unit of analysis.45 This approach allowed themes to emerge organically from the data rather than being imposed by predetermined theoretical frameworks. This was particularly appropriate given the exploratory nature of this study and the limited existing literature on end-user preferences of GeoHealth tools.
Two coders (JG and KM) created a codebook through written notes and initial transcript readings.46 Both coders coded one transcript as a sample, then reviewed discrepancies, clarified some code definitions, and merged similar codes for simplicity. Inter-coder reliability was assessed through this iterative process, with coders discussing and resolving all disagreements through consensus until achieving alignment on code application and definitions. Following this calibration phase, both coders independently coded all remaining transcripts, meeting regularly to compare coding decisions and ensure consistency throughout the analytical process. Codebook stability and thematic saturation were achieved after the analysis of the third focus group. Cohen's kappa was not formally calculated due to the exploratory, inductive nature of the analysis; however, the consensus-building approach ensured analytical rigor.
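For readers replicating this protocol who do wish to report a formal agreement statistic, Cohen's kappa compares the coders' observed agreement p_o with the agreement expected by chance p_e, as κ = (p_o − p_e) / (1 − p_e). A minimal sketch follows; the code labels and excerpt data are hypothetical, not drawn from this study:

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two coders' categorical labels on the same items."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: proportion of items both coders labeled identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: sum over codes of the product of each coder's marginals.
    ca, cb = Counter(labels_a), Counter(labels_b)
    p_e = sum(ca[c] * cb[c] for c in ca) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes applied by two coders to ten transcript excerpts
coder1 = ["ease", "ease", "color", "ads", "ease", "color", "ads", "ease", "color", "ads"]
coder2 = ["ease", "color", "color", "ads", "ease", "color", "ads", "ease", "ease", "ads"]
print(round(cohens_kappa(coder1, coder2), 2))  # prints 0.7
```

Values above roughly 0.6 are conventionally read as substantial agreement, which is why consensus-based calibration, as used here, is often paired with or substituted for the statistic in exploratory inductive work.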
After initial coding was completed, both coders engaged in a systematic process of code aggregation, identifying patterns and relationships among codes within and across focus group sessions. Thematic analysis was conducted collaboratively across codes to identify broader, crosscutting themes that reflected shared findings. Themes were refined through an iterative process that involved returning to the raw data to verify that they accurately captured participant experiences and that existing supporting evidence existed for each theme. Additionally, representative quotes were extracted for each theme to ground findings in participants’ own words and ensure authentic representation of their perspectives. Participants did not provide feedback on the results.
Results
Sample characteristics
A total of 31 participants took part in seven focus group sessions. Of the participants, 65% were female, 52% identified as White, and 62% fell within the 18–34 age range. Regarding level of education, four participants (13%) had a High School/GED, while 81% had a Bachelor's or Graduate-level degree. Four participants (13%) were non-native English speakers. Of the healthcare provider population, three were physicians, one was a nurse practitioner, and two were registered nurses. Table 1 contains a breakdown of participant demographics.
Table 1.
Participant characteristics.
| Characteristic | General population (n = 15) | Healthcare professionals (n = 16) | Total (N = 31) |
|---|---|---|---|
| Gender, n (%) | |||
| Male | 6 (40) | 5 (31) | 11 (35) |
| Female | 9 (60) | 11 (69) | 20 (65) |
| Age, n (%) | |||
| 18–34 years | 9 (60) | 10 (62) | 19 (62) |
| 35–49 years | 4 (27) | 4 (25) | 8 (26) |
| 50–64 years | 0 (0) | 2 (13) | 2 (6) |
| 65+ years | 2 (13) | 0 (0) | 2 (6) |
| Education, n (%) | |||
| High School/GED | 4 (27) | 0 (0) | 4 (13) |
| Associate Degree | 2 (13) | 0 (0) | 2 (6) |
| Bachelor's Degree | 7 (47) | 2 (13) | 9 (29) |
| Graduate-level | 2 (13) | 14 (87) | 16 (52) |
| Race, n (%) | |||
| White | 4 (27) | 12 (75) | 16 (52) |
| Black or African American | 7 (47) | 2 (13) | 9 (29) |
| Hispanic or Latino | 1 (6) | 1 (6) | 2 (6) |
| Asian | 1 (6) | 1 (6) | 2 (6) |
| Native Hawaiian or Pacific Islander | 2 (13) | 0 (0) | 2 (6) |
| Other | 0 (0) | 0 (0) | 0 (0) |
| Digital literacy level, n (%) | |||
| Satisfactory | 5 (33) | 0 (0) | 5 (16) |
| Proficient | 5 (33) | 10 (63) | 15 (48) |
| Expert | 5 (33) | 6 (37) | 11 (35) |
| Geospatial web tool experience, n (%) | |||
| Does not know what it is | 4 (27) | 4 (25) | 8 (26) |
| No experience | 4 (27) | 8 (50) | 12 (39) |
| Some experience | 5 (33) | 4 (25) | 9 (29) |
| Extensive experience | 2 (13) | 0 (0) | 2 (6) |
| Non-native English speaker, n (%) | 2 (13) | 2 (12) | 4 (13) |
| Healthcare profession | |||
| Physician | N/A | 3 (19) | 3 (19) |
| Nurse Practitioner | N/A | 1 (6) | 1 (6) |
| Registered Nurse | N/A | 2 (13) | 2 (13) |
We identified four major themes and 11 subthemes related to perceptions and preferences of the GeoHealth tools, highlighting the comparable concerns of the general population and healthcare professionals. These themes directly address the study's objective of understanding end-user experiences and informing design improvements for GeoHealth applications. With each theme supported by a majority of each group, and codebook stability attained after three transcripts, thematic saturation was achieved. The themes, subthemes, and representative quotations are presented in detail in Table 2, while Figure 1 depicts a broad illustration of the findings. The Facilitators and Barriers themes capture factors that facilitate or hinder user interaction, Adaptability addresses user-driven adaptability needs, and Compatibility examines how GeoHealth tools can be integrated with workflows and emerging technologies.
Table 2.
Themes, subthemes, and representative quotations.
| Theme/Subtheme | User group | Representative quotation |
|---|---|---|
| THEME 1: FACILITATORS | | |
| Ease of Use | | |
| | General Population | “…you're able to just hover your mouse over the specific county or city you're looking at…I found it easy to use.” (P09) |
| | Researcher | “I like the ones that just zoom in when you click on a region, or you select something because I usually look at these on my phone…I like it on the phone that when you click on an area, it just zooms in, and you can actually see it better.” (P25) |
| Ease of Understanding | | |
| | General Population | “You didn't just have the different gradients of colors to show the different prevalences of different social demographic information, but they also included dots to show different racial demographics. It was just very easy to digest.” (P12, regarding COI) |
| | Provider | “For me, I liked the ADI. I thought the red coloring somehow made it stand out differently to me as, like, not necessarily the association, but, like, a dark red is bad if I was reading the thing correctly.” (P30) |
| Site Features | | |
| | General Population | “I kind of liked the option where you could download the data…it's going to take me a lot more time writing down items than just being able to pull the entire data myself.” (P13) |
| | Researcher | “If you could select an area and then change the parameters within that area. In the NYT map, I liked how it zoomed into the Raleigh location when you typed in a zip code.” (P23) |
| Data Granularity | | |
| | General Population | “…if I live somewhere more rural…I could drive up to 30 miles to go somewhere for care. And so with that, I think that capability does make those sorts of geospatial tools a lot easier to use. And I think that applies to the full general public.” (P11) |
| | Provider | “I think county for me. If there was, like, school district level, it'd be pretty cool. But at least county, I think, would be helpful for my research. For more of the clinical side of my role…regions of state (level) would be enough.” (P30) |
| THEME 2: BARRIERS | | |
| Color Scheme | | |
| | General Population | “When the [COI] was first brought up, my first question was, why are certain areas the same color as the sea? That just kind of threw me off…and it was a little difficult to discern which blue was actually darker…made it harder to digest.” (P13) |
| | Researcher | “I couldn't tell what the border was for the zip code that I had typed in. So it was a little frustrating because there's there's a zillion different colors like green or red and I wasn't sure where the border was.” (P23) |
| Device Modality | | |
| | General Population | “…from a phone, there really isn't a hover option; you just have to click on it. I'll find myself spending a lot more time looking through items on a phone than I would on a computer.” (P14) |
| | General Population | “…I wanted to go back and swipe the phone to get back to where I was, it'll just start me back at the very top again and I have to continue to scroll down to where I was…that'll eventually deter me from wanting to click into any of the other hyperlinks to get more information. I'll end up just closing out because I get frustrated.” (P11) |
| Advertisements | | |
| | Researcher | “I agree with ads being annoying, and if it's too much, videos start playing, or it's interfering with the page loading quickly enough. And I just don't like the experience enough and that I'll just go find another source since there's so many sources.” (P22) |
| | Researcher | “…if there's too many ads and just all over the page, I don't even bother looking at it really, I just kind of move on…How many ads are there depends on how long I stay on that page for me personally.” (P17) |
| Overwhelming Information | | |
| | General Population | “…if I'm trying to find information quickly, I want to be able to just kind of jump right into it, because I'm guessing if I'm looking up something, I know what I'm looking for. But that one (ADI) was very wordy for me, and I didn't like that.” (P02) |
| | General Population | “…it's a lot of information and trying to decide what's actually valuable to the person reading it. It can be kind of overwhelming looking at all the different colors…So I would find too much information to be overwhelming.” (P12) |
| THEME 3: ADAPTABILITY | | |
| Suggestions for Improvement | | |
| | General Population | “…it's important data to…see changes in public health policies and simultaneously see changes in the public health outcomes in certain areas, such as vaccination rates. You might see changes in poverty, abortion access, maternal health.” (P15) |
| | Provider | “It'd be nice if you could click, select, drag over a certain area and capture…copy and paste into PowerPoint. If I'm going to make a pitch to somebody…(and) the AI gives you the breakdown…you're also able to copy and paste whatever the AI says.” (P27) |
| | General Population | “…if I'm researching something, I want to do screenshots or easily grab stuff and put it in a separate file on my computer so that I can go back to it. So, something that would facilitate ease of saving information from the site would really be useful.” (P07) |
| | Provider | “These are great tools, and they tend to grow stale…if you could get media outlets to use it as clickbait, you're going to have more traffic to it…then the researchers might get ideas and the policymakers will know where to go.” (P27) |
| THEME 4: COMPATIBILITY | | |
| Implications for Usage | | |
| | General Population | “If you're looking for a medical specialist, a service provider, an oncologist or a gynecologist it could be useful to have geographic information so that you're not driving halfway across the state to find that…” (P07) |
| | General Population | “I think the primary benefit kind of ties into that people who might not have as much knowledge about a subject or as information are able to see something and get a surface level takeaway from it.” (P09) |
| | Researcher | “I think it's very useful to have this quick snapshot that provides tangible data to support your claims or your questions…I can just take a screenshot of it and put it on a slide, and everyone immediately sees where I'm going with my project.” (P18) |
| | Provider | “I'm looking at websites to see kind of what distance rates are…and where they grew up to get a sense of what is their environment like. Because where you grew up and how you grew up impacts cognition across your lifetime.” (P30) |
| Chatbot Integration | | |
| | General Population | “I think that there could be some benefit… if someone could just say ‘can you explain to me what it means if I'm seeing darker shades here versus lighter shades here?’ And it could just be a way to clarify things that someone may not understand.” (P09) |
| | General Population | “I think that it makes life a lot easier with the bots as well. Just being able to ask questions and not spend a lot of time trying to figure it out on your own. Something that you don't necessarily know what's going on.” (P02) |
| | Researcher | “I think it would be super cool to very quickly say…'compare these two places, or describe the general trends in the state or, you know, geographically across the US over the past however many years,’ that would be very cool if it could quickly summarize it.” (P23) |
| | Provider | “…it's helping you synthesize a whole ton of data. I can see that being pretty helpful then…So having some sort of tool that could help with retrieving that quickly would be helpful.” (P27) |
Figure 1.
Study themes.
Throughout the focus group discussions, we allowed participants to articulate their interpretations of terms such as “accessible” and “user-friendly” rather than providing definitions, which aligns with this study's inductive approach. When describing positive experiences, participants referenced specific observable features such as intuitive navigation or visual clarity. Conversely, when tools frustrated them, participants identified barriers at the opposite end of the spectrum, such as confusing navigation and poor color contrast. These participant-generated usability criteria structured the analysis and themes presented below.
Facilitators
Both groups demonstrated an awareness of each site's usability and an ability to comprehend its features and content. They equally recognized and appreciated the intuitive site navigation, interactive elements like hover-over descriptions, and content organization. Specifically, the ability to filter and search by zip code and to zoom into a selected region was praised by many participants, as it introduced efficiency to the process. Both groups also valued features such as searching within select timeframes, downloading data, and being able to customize the scale (e.g. census tract or county level).
Most of the general population appreciated each tool's capacity to present data visually, particularly those who had never seen or interacted with one. Additionally, this group preferred graphics and other visualizations over reading text. A few general population participants explicitly mentioned their appreciation for one site's incorporation of instructional videos for using the tool.
Within the healthcare professional group, a highly contrasting color scheme, filter customizations, and the ability to download data dominated their remarks. Both researchers and providers specifically noted enjoying the ability to zoom in and out of a selected region. Overall, facilitators centered on ease of use, ease of understanding, and functional site features.
Barriers
A majority of participants from both groups shared common frustrations regarding ill-defined or confusing color schemes. For example, many found distinguishing data variance and significance difficult when shades of blue blended into the background or when legends were unclear or missing. The COI was the leading culprit, confusing users with its varying shades of blue, with some likening land features to water. Conversely, the ADI was praised for its high color contrast. Regarding device modality, both groups preferred using a desktop or laptop to access GeoHealth tools.
Members of the general population expressed challenges using mobile devices, citing the inability to return to previous pages without repeatedly scrolling each time they entered a new section. Interactive maps often feature hover-over functions that do not translate well to smartphones. Additionally, many participants found some sites overwhelming at first glance, which created uncertainty about the tool's purpose and how to use it, prompting them to leave.
Unique to the healthcare professional group were statements regarding advertisements, pop-ups, and self-starting videos. Too many of these distractions or too much extraneous content would prompt them to leave a site. Barriers to effective GeoHealth tool use clustered around four primary issues: poor color scheme design, limited mobile device compatibility, intrusive advertisements, and overwhelming information presentation. Each of these elements impedes user engagement and tool effectiveness.
Adaptability
Statements regarding the improvement of GeoHealth tools provide insight into adapting the experience to the end user's context. This theme highlights user-driven recommendations for enhancing tool functionality and ensuring its relevance over time.
Participants from the general population expressed the need for speed when navigating mapping tools and for removing extraneous visuals, such as regions or countries unrelated to the dataset. This group also noted a desire to drill down into a data point through successive clicks. For example, clicking on a region would populate some basic data, with a subsequent click providing additional details.
Healthcare professionals resoundingly expressed a need to customize the data for their purposes. This included exporting visualizations into other media for presentations, clicking and dragging the cursor over a custom area, and adding additional data onto the map (e.g. income levels). None of the three GeoHealth tools examined had this level of customization. One provider focus group agreed that marketing and showcasing these tools is essential to maintaining their value as resources over time, rather than letting them fade into irrelevance.
These adaptability preferences highlight a gap between current tool capabilities and user expectations, particularly regarding advanced customization features that would enable users to tailor GeoHealth tools to their specific analytical and presentation needs.
Compatibility
Despite some skepticism about integrating a chatbot into a GeoHealth tool, participants across both groups generally viewed it as an innovative element that added value. Both groups also cited a chatbot's capacity to quickly retrieve and synthesize information, clarify geospatial visualizations, and assist users unfamiliar with the tool. However, a few participants from both groups expressed concern about a chatbot's accuracy and reliability, emphasizing a need for transparency in the sources referenced.
Healthcare professionals cited potential use cases of leveraging a chatbot to compare different locations against each other or provide general trends of a region over time. The added efficiency was an attractive feature to this group as it can save them time when conducting research or adding context to a patient's living area for clinical decision-making.
Regarding the implications of GeoHealth tool use, both audiences balanced practical applications with broader impacts for informing research and policy. General population participants noted how the tools could streamline access to services, such as locating healthcare providers, and provide key, surface-level takeaways that serve as a first step to learning about a topic.
Healthcare professionals discussed the importance of integrating patient location data to improve care delivery and how they might provide evidence to support claims about health disparities, vaccine access, and demographic trends across multiple settings (e.g. academic, clinical, and business). Notably, healthcare professionals specifically mentioned the potential for GeoHealth tools to be integrated into existing Health-IT systems such as EHRs.
Discussion
This study investigated end-user perceptions and preferences regarding GeoHealth tools, an area with little existing literature. By addressing this gap, our research provides foundational knowledge and actionable recommendations about how both non-expert and expert users experience and interact with GeoHealth tools, a significant step toward integrating user-centered design principles in this rapidly growing field. Findings were grouped into four overarching themes: (1) facilitators and (2) barriers when interacting with GeoHealth tools, (3) adaptability of such tools, including suggestions for improvement, and (4) compatibility, covering implications for usage and the integration of a chatbot feature.
Both groups provided comparable feedback regarding the facilitators and barriers encountered when interacting with GeoHealth tools. Capturing these insights is essential to improving user engagement and, consequently, a tool's effectiveness in achieving its intended purpose.20 Appreciation of customizable data visualizations, intuitive navigation, and data export features can elevate the user experience so that users can efficiently explore, learn, and inform decisions. Map visualizations bring data to life, making it accessible to a broader audience. These findings align with prior research showing that data visualizations improve cognitive understanding and can enhance effective communication.47,48 By validating these principles within the GeoHealth tool context, our study extends existing theory on information visualization, demonstrating that spatially referenced health data requires particular attention to customization and interactivity to maximize user comprehension and decision-making capacity.
Notable barriers included confusing color schemes, limited mobile device access, overwhelming amounts of site content, and the annoyance of advertisements. Users desired higher color contrast and more distinct color differentiation to ease readability. Well-designed color contrasts enable users to distinguish between data classes and identify spatial patterns.49 However, color-blind users must be accounted for when developing a color scheme; published guidelines and online resources offer suitable color schemes that promote inclusive accessibility.50 From an implementation perspective, these findings illustrate the necessity of adopting established accessibility standards during the initial design phase rather than as retroactive modifications.
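As an illustration of the high-contrast guidance above, the WCAG 2.1 contrast-ratio formula can be used to screen candidate map colors during design. The following Python sketch uses hypothetical color values, not palettes from the tools studied:

```python
def relative_luminance(rgb):
    """WCAG 2.1 relative luminance of an 8-bit sRGB color."""
    def channel(c8):
        c = c8 / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """Contrast ratio between two colors, ranging from 1:1 to 21:1."""
    lighter, darker = sorted(
        (relative_luminance(color_a), relative_luminance(color_b)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Screening adjacent map classes; WCAG recommends at least 3:1
# for non-text graphical objects such as choropleth fills.
print(round(contrast_ratio((255, 255, 255), (0, 0, 0)), 1))  # 21.0
```

A check like this could run over every adjacent pair of legend classes so low-contrast combinations are caught before deployment rather than retrofitted.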
Many interactive map features are not compatible with mobile devices: hover-over interactions do not function on touchscreens, and smaller screens create navigation challenges and touch imprecision.51,52 Our findings revealed that these problems persist in GeoHealth tools in use today and should be addressed when developing responsiveness for these highly interactive tools. Mobile usability design recommendations might include replacing hover-over features with tap-triggered overlays, optimizing touch targets, and integrating responsive layering. The persistence of mobile compatibility issues across existing GeoHealth platforms suggests systemic challenges in translating desktop-optimized geospatial interfaces to mobile environments. Developers should therefore prioritize mobile-first or responsive design approaches, given the increasing prevalence of mobile device usage for health information seeking.53–55
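The "optimizing touch targets" recommendation can be made concrete with an automated audit step. This minimal sketch, with hypothetical control names and a commonly cited 44-pixel minimum (per WCAG 2.5.5 and platform guidelines), flags map controls that are too small for reliable tapping:

```python
from dataclasses import dataclass

MIN_TARGET_PX = 44  # commonly cited minimum touch-target size in CSS pixels

@dataclass
class Target:
    name: str
    width: float   # rendered size in CSS pixels
    height: float

def undersized_targets(targets):
    """Return the names of interactive elements below the minimum tap size."""
    return [t.name for t in targets
            if t.width < MIN_TARGET_PX or t.height < MIN_TARGET_PX]

# Hypothetical map controls, not taken from the tools studied.
controls = [Target("zoom-in", 48, 48), Target("region-marker", 12, 12)]
print(undersized_targets(controls))  # ['region-marker']
```

In practice such a check would read rendered element sizes from browser automation tooling, but the screening logic is the same.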
Research has shown that intrusive digital content, such as advertisements, harms user trust and retention.56 Participants from both groups reported feeling overwhelmed by the content or perceived complexity when first interacting with the web tools. Users of GeoHealth tools want to understand what they see and begin using the tool immediately; an excess of content, options, advertisements, and pop-ups can drive users away and should be used with discretion.
Our study showed a strong preference for increased customization from both groups, such as exporting visualizations, personalizing map regions, and layering data onto the map. Giving users the ability to tailor their experience helps them extract the most value from the data. Adapting a geospatial application's capabilities to user needs has been shown to enhance its utility and support specialized requirements.57 Moreover, these preferences reveal that users conceptualize GeoHealth tools as interactive platforms that should adapt to their specific needs and workflows. This user-driven customization paradigm has significant implications for how developers allocate resources: for example, a flexible, modular platform architecture may produce greater user satisfaction and tool adoption than a more sophisticated but rigid interface.58,59
Findings on the compatibility theme show that a GeoHealth tool can have widespread influence, as illustrated by healthcare professionals' mention of its potential integration into current medical reporting systems such as EHRs. This is an important finding because future integration of GeoHealth tools with clinical systems must be considered during the design process. Although some studies emphasize the importance of geospatial technologies in public health planning,60 there is a knowledge gap about how to incorporate these tools into clinical workflows. Our findings demonstrate healthcare professionals' interest in EHR integrations and their recognition of the potential of GeoHealth tools to inform clinical decision-making. Policy considerations must address this interest to support the use of geospatial data and platform integration in clinical settings.
Integrating a chatbot feature into a site was widely viewed as beneficial. Both groups saw value in a chatbot's ability to retrieve and synthesize data, clarify visualizations, or simplify insights. Comments on the reliability and transparency of chatbots support existing studies that emphasize the importance of source attribution for building user trust.61 Participant concerns about accuracy and transparency, such as misinformation or missing source attribution, can be addressed by explaining how the chatbot was developed and how it operates; knowing that a chatbot was custom-built for a specific site and uses only internally sourced data can ease most of these concerns. The overwhelmingly positive reception of chatbot integration represents a significant opportunity for advancing GeoHealth tool functionality, particularly as artificial intelligence capabilities continue to evolve. However, implementation should be approached thoughtfully, and organizations should establish applicable governance frameworks.
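The transparency pattern participants favored, answering only from a site's own data and always attributing sources, can be sketched as a retrieval step that returns citations alongside every answer. This toy Python example uses an invented two-document internal corpus and naive keyword matching purely for illustration; a production chatbot would use proper indexing and a language model over the same internally sourced data:

```python
# Hypothetical internal dataset; a real tool would index its own published data.
INTERNAL_DOCS = {
    "county-asthma-2024": "Asthma ED visit rates were highest in rural counties.",
    "vaccine-access-map": "Vaccination sites cluster within 10 miles of urban centers.",
}

def answer_with_sources(question):
    """Answer only from internal documents and attach source IDs.

    Declines to answer, rather than guessing, when nothing internal matches,
    which is the transparency behavior participants asked for.
    """
    words = question.lower().split()
    hits = [(doc_id, text) for doc_id, text in INTERNAL_DOCS.items()
            if any(word in text.lower() for word in words)]
    if not hits:
        return {"answer": "No internal data found for this question.", "sources": []}
    return {"answer": " ".join(text for _, text in hits),
            "sources": [doc_id for doc_id, _ in hits]}

print(answer_with_sources("Where are vaccination sites located?")["sources"])
```

The key design choice is that the `sources` list is never omitted, so users can always trace an answer back to the tool's own data rather than an opaque model.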
Beyond individual tool improvements, our findings suggest the need for broader ecosystem-level changes. Organizations should prioritize resources for user experience research and ensure user-centered design principles are being implemented from the start of any GeoHealth tool development. This study provides user-centered insights into enhancing the usability and accessibility of GeoHealth tools and their potential integration into existing Health-IT systems. Developers and healthcare professionals should collaborate to implement these evidence-based improvements, focusing on visual clarity, responsiveness, and facilitating access across device types and individual needs (i.e. digital literacy levels, disabilities, etc.).
Future work should concentrate on applying iterative user-centered design to continually improve GeoHealth tools. Usability testing should involve the tool's likely end users, ensuring a diverse group that includes individuals with visual impairments and access across various platforms (e.g. mobile phones or tablets).62 Future research should also aim for broader representation of people with lower digital literacy and education levels. Additionally, organizations need to explore the technical aspects and policy pathways for integrating their GeoHealth tools into Health-IT systems such as EHRs. Moreover, the overwhelmingly positive results regarding chatbot integration in GeoHealth tools highlight the need for further research on developing and promoting trustworthy, transparent chatbots. Collectively, these efforts can enhance the usefulness and impact of GeoHealth tools for both users and organizations.
Limitations
There are some limitations to this study. First, the virtual environment of the focus group sessions may have limited cross-conversation and engagement among participants, which are typically benefits of in-person group settings. However, the facilitators created an inclusive atmosphere, and participants frequently engaged in dialogue with one another. Additionally, all sessions were conducted virtually, ensuring consistency across the study. Second, although participants shared their video via webcam during each session, only the audio was recorded. As a result, our analysis lacked body language and non-verbal cues that could have enhanced our findings. Third, there is potential bias in this study, as certain demographics are overrepresented, notably the 81% of participants who have a college degree. Despite our targeted recruiting efforts, recruitment challenges resulted in a final sample skewed toward higher educational attainment and younger age groups. This affects the generalizability of the findings to the broader population, since a wide range of educational backgrounds was not included. Fourth, random sampling was not used due to the small sample size required for this qualitative study. Participants who volunteered for this study may have had a pre-existing interest in health information or GeoHealth tools, which could potentially skew responses more positively or lead to higher engagement levels. This self-selection limits our ability to capture perspectives from those who are disinterested in or skeptical of such technologies. Furthermore, individuals interested in virtual focus groups were likely those with sufficient digital literacy, which may limit the applicability of the results to the general population. Fifth, while the four overarching themes identified in this study provide valuable insights, they may not capture the full spectrum of user experiences and preferences across all GeoHealth tool contexts. 
The themes that emerged from our analysis may require validation across broader populations and diverse tool types. Lastly, to mitigate potential researcher bias, two researchers independently coded the data and reached consensus through discussion. Through strategies such as employing two independent coders and conducting focus group sessions with two target audiences, our research team worked to mitigate these limitations. However, we acknowledge that these efforts do not fully eliminate the limitations, and future research will be necessary to validate and extend our findings.
Conclusion
The growth and use of GeoHealth tools continue; therefore, the preferences of their end users cannot be ignored. Our study investigated end-user perceptions and preferences regarding GeoHealth tools to provide actionable recommendations for improving their design and effectiveness. Participant feedback revealed distinct pathways from user frustrations to design solutions. Key findings from non-expert and expert users emphasize the importance of incorporating customizable features (filters, personalized regions, data layering, and export options), driven by participants’ expressed need to “drill down” into data points through successive clicks and customize visualizations for reports. Participants’ frequent struggles to distinguish data variance due to blended color schemes directly informed recommendations to ensure accessibility through high-contrast color schemes and intuitive navigation. Frustrations among general population users with mobile access, including the inability to navigate back to previous pages and nonfunctional hover-over features, led to recommendations to optimize mobile responsiveness by replacing hover-over features with tap-triggered overlays and improving touch targets. The integration of chatbots was universally valued when paired with transparent data sourcing, as participants expressed both enthusiasm for efficiency gains and concerns about accuracy and source reliability. Finally, healthcare professionals highlighted the need to design tools with future integration into Health-IT systems (e.g. EHRs) to improve clinical decision-making.
Applying these findings can boost tool usability and acceptance, making health-related information more accessible and potentially leading to better health outcomes. Future work should validate these findings through iterative usability testing with larger, more diverse samples, specifically by testing prototype implementations of the recommended customization and mobile-optimized interaction features across varied user demographics and digital literacy levels. Additionally, research should investigate technical and policy pathways for Health-IT integration through pilot studies examining interoperability requirements and workflow integration points. Lastly, chatbot features should be evaluated for transparency, accuracy, and reliability by both the general population and healthcare professionals.
Supplemental Material
Supplemental material, sj-docx-1-dhj-10.1177_20552076261415910 for User perceptions and preferences for GeoHealth tools: A qualitative focus group study of non-expert and expert users by John Geracitano, Kaushalya Mendis, Christopher M Shea, Fei Yu, David McSwain and Saif Khairat in DIGITAL HEALTH
Supplemental material, sj-docx-2-dhj-10.1177_20552076261415910 for User perceptions and preferences for GeoHealth tools: A qualitative focus group study of non-expert and expert users by John Geracitano, Kaushalya Mendis, Christopher M Shea, Fei Yu, David McSwain and Saif Khairat in DIGITAL HEALTH
Footnotes
ORCID iDs: John Geracitano https://orcid.org/0009-0003-6029-1778
Kaushalya Mendis https://orcid.org/0000-0003-2668-3404
Christopher Shea https://orcid.org/0000-0002-7437-7607
Fei Yu https://orcid.org/0000-0003-1079-1590
David McSwain https://orcid.org/0000-0002-8831-4666
Saif Khairat https://orcid.org/0000-0002-8992-2946
Ethical approval: This study was approved by the University of North Carolina at Chapel Hill Institutional Review Board.
Consent to participate: All study participants provided written and verbal consent in accordance with institutional protocols.
Author contributions: JG: manuscript development and writing, data analysis, principal investigator. KM: data analysis, expert consultation. CS, FY, DM: expert consultation, manuscript review. SK: expert consultation, mentorship, manuscript review.
Funding: The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This study was supported by the National Center for Advancing Translational Sciences of the National Institutes of Health (NIH) under Award Number RC2TR004380 and the Health Resources and Services Administration (HRSA) grant 6U3GRH40003-01-01. The content of this article is solely the responsibility of the authors and does not necessarily represent the official views of the NIH and HHS, nor does the mention of department or agency names imply endorsement by the US government.
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Data availability: The data underlying this study (e.g. transcripts, etc.) are not publicly available due to Institutional Review Board restrictions. The data are available from the corresponding author upon reasonable request.
Guarantor: JG accepts full responsibility for the execution and content of this study and controlled the decision to publish.
Supplemental material: Supplemental material for this article is available online.
References
- 1. Pang PC-I, Chang S, Verspoor K, et al. Designing health websites based on users' web-based information-seeking behaviors: a mixed-method observational study. J Med Internet Res 2016; 18: e145.
- 2. Beard R, Wentz E, Scotch M. A systematic review of spatial decision support systems in public health informatics supporting the identification of high risk areas for zoonotic disease outbreaks. Int J Health Geogr 2018; 17: 38.
- 3. Joshi A, Amadi C, Katz B, et al. A human-centered platform for HIV infection reduction in New York: development and usage analysis of the ending the epidemic (ETE) dashboard. JMIR Public Health Surveill 2017; 3: 95.
- 4. Osei FB, Sasidharan S. Geospatial health (GeoHealth): current trends, methods, and applications. Trop Med Infect Dis 2023; 8: 1–3. DOI: 10.3390/tropicalmed8070366
- 5. Luan H, Law J. Web GIS-based public health surveillance systems: a systematic review. ISPRS Int J Geoinf 2014; 3: 481–506.
- 6. Davenhall WF, Kinabrew C. GIS in health and human services. In: Kresse W, Danko DM (eds) Springer handbook of geographic information. Berlin, Heidelberg: Springer, 2012, pp.557–578.
- 7. Aanestad M, Grisot M, Hanseth O, et al. Strategies for building eHealth infrastructures. In: Aanestad M, Grisot M, Hanseth O, et al. (eds) Information infrastructures within European health care: working with the installed base. Cham: Springer, 2017, pp.34–51.
- 8. Malone JB, Bergquist R, Martins M, et al. Use of geospatial surveillance and response systems for vector-borne diseases in the elimination phase. Trop Med Infect Dis 2019; 4: 1–16. DOI: 10.3390/tropicalmed4010015
- 9. The New York Times. Coronavirus in the U.S.: latest map and case count. https://www.nytimes.com/interactive/2021/us/covid-cases.html (2023, accessed 29 October 2024).
- 10. Finney Rutten LJ, Blake KD, Greenberg-Worisek AJ, et al. Online health information seeking among US adults: measuring progress toward a Healthy People 2020 objective. Public Health Rep 2019; 134: 617–625.
- 11. Lowndes AM, Connelly DM. User experiences of older adults navigating an online database of community-based physical activity programs. Digit Health 2023; 9: 20552076231167004.
- 12. Fiksdal AS, Kumbamu A, Jadhav AS, et al. Evaluating the process of online health information searching: a qualitative approach to exploring consumer perspectives. J Med Internet Res 2014; 16: e224.
- 13. Tzimourta KD. Human-centered design and development in digital health: approaches, challenges, and emerging trends. Cureus 2025; 17: e85897.
- 14. Lech M, Uitto JI, Harten S, et al. Improving international development evaluation through geospatial data and analysis. Int J Geospatial Environ Res 2018; 5: 1–15.
- 15. Bamberger M, Raftree L, Olazabal V. The role of new information and communication technologies in equity-focused evaluation: opportunities and challenges. Evaluation 2016; 22: 228–244.
- 16. Rai PK, Nathawat MS. Health care system and geospatial technology: a conceptual framework of the study. In: Geoinformatics in health facility analysis. Cham: Springer International Publishing, 2017, pp.1–28.
- 17. Centers for Disease Control and Prevention. CDC/ATSDR SVI 2020 documentation.
- 18. Geracitano J, Barron L, McSwain D, et al. How is digital health suitability measured for communities? A systematic review. Digit Health 2024; 10: 20552076241288316.
- 19. Noelke C, McArdle N, DeVoe B, et al. Child Opportunity Index 3.0 technical documentation. Waltham, MA: diversitydatakids.org, Brandeis University, 2024.
- 20. Honary M, Fisher NR, McNaney R, et al. A web-based intervention for relatives of people experiencing psychosis or bipolar disorder: design study using a user-centered approach. JMIR Ment Health 2018; 5: e11473.
- 21. Tieman J, Nicholls S. Enhancing the efficacy of healthcare information websites: a case for the development of a best practice framework. BMJ Open 2024; 14: e088789.
- 22. Timpka T, Olvander C, Hallberg N. Information system needs in health promotion: a case study of the safe community programme using requirements engineering methods. Health Inf J 2008; 14: 183–193.
- 23. Kurniawan D, Rosa Indah D, Sari P, et al. Understanding the landscape of usability evaluation in geographic information systems: a systematic literature review. J Appl Sci Eng Technol Educ 2023; 5: 35–45.
- 24. Roth R, Ross K, MacEachren A. User-centered design for interactive maps: a case study in crime analysis. ISPRS Int J Geoinf 2015; 4: 262–301.
- 25. Duncan IK, Gastner MT. Comparative evaluation of the web-based contiguous cartogram generation tool go-cart.io. PLoS ONE 2024; 19: e0298192.
- 26. Yanovitzky I, Stahlman G, Quow J, et al. National public health dashboards: protocol for a scoping review. JMIR Res Protoc 2024; 13: e52843.
- 27. Sorapure M. User perceptions of actionability in data dashboards. J Bus Tech Commun 2023; 37: 253–280.
- 28. Ivanković D, Barbazza E, Bos V, et al. Features constituting actionable COVID-19 dashboards: descriptive assessment and expert appraisal of 158 public web-based COVID-19 dashboards. J Med Internet Res 2021; 23: e25682.
- 29. Zakkar M, Sedig K. Interactive visualization of public health indicators to support policymaking: an exploratory study. Online J Public Health Inform 2017; 9: e190.
- 30. Doerwald F, Stalling I, Recke C, et al. A rapid review of digital approaches for the participatory development of health-related interventions. Front Public Health 2024; 12: 1461422.
- 31. Crowe-Cumella H, Nicholson J, Aguilera A, et al. Editorial: digital health equity. Front Digit Health 2023; 5: 1184847.
- 32. American Geophysical Union. GeoHealth: a transdisciplinary science for global environmental and human health. AGU position statement.
- 33. Davenhall W, Kinabrew C. GIS in health and human services. Springer handbook of geographic information.
- 34. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol 2006; 3: 77–101.
- 35. Barbour RS. Making sense of focus groups. Med Educ 2005; 39: 742–750.
- 36. Guest G, Namey E, McKenna K. How many focus groups are enough? Building an evidence base for nonprobability sample sizes. Field Methods 2017; 29: 3–22.
- 37. University of Wisconsin School of Medicine and Public Health. Area Deprivation Index. https://www.neighborhoodatlas.medicine.wisc.edu/ (2022, accessed 1 November 2024).
- 38. Institute for Child, Youth and Family Policy at the Heller School for Social Policy and Management at Brandeis University. Child Opportunity Index (COI). https://www.diversitydatakids.org/child-opportunity-index (2021, accessed 1 November 2024).
- 39. Tong A, Sainsbury P, Craig J. Consolidated criteria for reporting qualitative research (COREQ): a 32-item checklist for interviews and focus groups. Int J Qual Health Care 2007; 19: 349–357.
- 40. Qualtrics. Qualtrics software. Provo, UT: Qualtrics, 2005.
- 41. Zoom Communications, Inc. Zoom software. San Jose, CA: Zoom Communications, Inc., 2025.
- 42. Vook.ai. Vook software transcription service. Chicago, IL: Vook.ai, 2025.
- 43. SocioCultural Research Consultants, LLC. Dedoose: cloud application for managing, analyzing, and presenting qualitative and mixed method research data. www.dedoose.com, 2023.
- 44. Creswell JW, Creswell JD. Research design: qualitative, quantitative, and mixed methods approaches. 6th ed. Los Angeles: Sage Publications, Inc, 2022.
- 45. Chapman AL, Hadfield M, Chapman CJ. Qualitative research in healthcare: an introduction to grounded theory using thematic analysis. J R Coll Physicians Edinb 2015; 45: 201–205.
- 46. Attride-Stirling J. Thematic networks: an analytic tool for qualitative research. Qual Res 2001; 1: 385–405.
- 47. Dowlatabadi S, Preim B, Meuschke M. Visualization of age distributions as elements of medical data-stories. arXiv preprint.
- 48. Guo G, Stasko J, Endert A. What we augment when we augment visualizations: a design elicitation study of how we visually express data relationships. In: Conati C, Torre I, Volpe G (eds) Proceedings of the 2024 international conference on advanced visual interfaces. New York, NY: ACM, 2024, pp.1–5.
- 49. Castro Noblejas H, Sortino Barrionuevo JF, Orellana Macías JM. Mapping method for the integrated analysis of gentrification and touristification: the case of Málaga (Spain). Cuadernos Geográficos 2023; 62: 109–129.
- 50. World Wide Web Consortium. Making the web accessible: Web Accessibility Initiative. https://www.w3.org/WAI/ (accessed 5 June 2024).
- 51. Pendell KD, Bowman MS. Usability study of a library's mobile website: an example from Portland State University. ITAL 2012; 31: 45–62.
- 52. Fenley S. Multimedia design decisions, visualisations and the user's experience. In: Deliyannis I (ed) Interactive multimedia. Oxford, United Kingdom: InTech, 2012, pp.159–175.
- 53. Cajas V, Urbieta M, Rossi G, et al. Challenges of migrating legacies web to mobile: a systematic literature review. IEEE Latin Am Trans 2020; 18: 861–873.
- 54. Aljunid SS, bin Mohd Satar NS, Razali R. Usability guidelines for designing m-health clinical coding and grouping application. MJPHM 2023; 23: 350–358.
- 55. Weichbroth P. Factors influencing the perceived usability of mobile applications. arXiv, 2025. DOI: 10.48550/arxiv.2502.11069
- 56. McCoy S, Everard A, Galletta DF, et al. Here we go again! The impact of website ad repetition on recall, intrusiveness, attitudes, and site revisit intentions. Inf Manage 2017; 54: 14–24.
- 57. Kelly GC, Hii J, Batarii W, et al. Modern geographical reconnaissance of target populations in malaria elimination zones. Malar J 2010; 9: 89.
- 58. Lacroix D, Wullenkord R, Eyssel F. Who's in charge? Using the personalization vs. customization distinction to inform HRI research on adaptation to users. In: Companion of the 2023 ACM/IEEE international conference on human-robot interaction. New York, NY: ACM, 2023, pp.580–586.
- 59. Wood R, Griffith M, Jordan JB, et al. "Creatures of habit": influential factors to the adoption of computer personalization and accessibility settings. Univ Access Inf Soc 2023; 23: 1–27.
- 60. Tandy CB, Odoi A. Geographic disparities and socio-demographic predictors of pertussis risk in Florida. PeerJ 2021; 9: e11902.
- 61. Ng SWT, Zhang R. Trust in AI chatbots: a systematic review. Telemat Inform 2025; 97: 102240.
- 62. Lim PC, Lim YL, Rajah R, et al. Usability questionnaire for standalone or interactive mobile health applications: a systematic review. BMC Digit Health 2025; 3: 11.