Abstract
According to recent Congressional testimony by the Secretary of Veterans Affairs, improving the timeliness of services is one of five current priorities for the Department of Veterans Affairs (VA). A comprehensive access measure, grounded in veterans’ experience, is essential to support VA’s efforts to improve access. In this article, the authors describe the process they used to develop the Perceived Access Inventory (PAI), a veteran-centered measure of perceived access to mental health services. They used a multiphase, mixed-methods approach to develop the PAI. Each phase built on and was informed by preceding phases. In Phase 1, the authors conducted 80 individual, semistructured, qualitative interviews with veterans from three geographic regions to elicit the barriers and facilitators they experienced in seeking mental health care. In Phase 2, they generated a preliminary set of 77 PAI items based on Phase 1 qualitative data. In Phase 3, an external expert panel rated the preliminary PAI items in terms of relevance and importance, and provided feedback on format and response options. Thirty-nine PAI items resulted from Phase 3. In Phase 4, veterans gave feedback on the readability and understandability of the PAI items generated in Phase 3. Following completion of these four developmental phases, the PAI included 43 items addressing five domains: logistics (five items), culture (three items), digital (nine items), systems of care (13 items), and experiences of care (13 items). Future work will evaluate concurrent and predictive validity, test/retest reliability, sensitivity to change, and the need for further item reduction.
Keywords: access, mental health, veteran, patient-centered
Ensuring access to high-quality health care, including mental health care, remains a high priority for the Department of Veterans Affairs (VA). Over the past several decades, the VA has invested heavily in several strategies to improve access, including the introduction of community-based outpatient clinics (CBOCs), primary care mental health integration, intensive case management, increased use of mobile clinics, widespread use of telehealth, and contracting with community providers (Department of Veterans Affairs, 2014c; Kehle, Greer, Rutks, & Wilt, 2011). The VA needs to be able to evaluate the outcomes of these and future strategies by assessing changes in access over time (e.g., pre-post policy change or intervention implementation).
In 2010, VA’s Health Services Research and Development Service held a State-of-the-Art Conference (SOTA 2010) to identify the knowns and unknowns about the relationships among health care access, individual patient needs and characteristics, community networks and their characteristics, health care services utilization (VA and non-VA), and patient outcomes (Fortney, Burgess, Bosworth, Booth, & Kaboli, 2011). SOTA 2010 reconceptualized access to services, adding a fifth domain (digital access) to the four commonly recognized domains of access: geographical, temporal, financial, and cultural. In keeping with this model, SOTA 2010 defined access as “the potential ease of having virtual or face-to-face encounters with a broad array of health care providers and resources including clinicians, caregivers, peers, and computer applications” (Fortney, Burgess, et al., 2011, p. S641).
In 2012, the VA Office of Inspector General noted the need for relevant measures of access to mental health care, recommending that VA “reevaluate alternative measures or combinations of measures that could effectively and accurately reflect the patient experience of access to mental health appointments.” Current VA access measures include wait times according to a preferred date (Department of Veterans Affairs, 2014b, 2016), veteran ratings of mental health appointment access (Department of Veterans Affairs, 2014a), veterans’ perspectives as reflected in the Survey of Health Care Experiences of Patients (SHEP), which includes items from the Consumer Assessment of Health Care Providers and Systems (Department of Veterans Affairs, 2017b; Wright, Craig, Campbell, Schaefer, & Humble, 2006), and Strategic Analytics for Improvement and Learning measures (Department of Veterans Affairs, 2017a). There are also a variety of access measures reported in the research literature (Bauer, Williford, McBride, McBride, & Shea, 2005; Clement et al., 2012; Cunningham et al., 1995; Eakin & Strycker, 2001; Hoge et al., 2004; Lingley-Pottie & McGrath, 2011; Ouimette et al., 2011; Pepin, Segal, & Coolidge, 2009; Tanielian et al., 2008). However, these research measures do not appear to be comprehensive or grounded in veteran experience. In addition, an individual’s perception of access can improve or worsen with care experience (Ajzen, 1991; Fortney, Tripathi, Walton, Cunningham, & Booth, 2011); therefore, the VA also needs a measure that is sensitive to changes in perceived access over time.
In part because most of the frequently used measures of access predate SOTA 2010, none encompasses all five of the SOTA 2010 domains. For instance, perceived access to digital health technologies, other than encounters by telephone, is largely absent from existing measures referenced above. We believe that a comprehensive measure grounded in veterans’ experience is essential to support VA’s efforts to increase access. Without extensive input from veterans who have sought VA mental health care, a measure may fail to capture the full range of issues that matter most to VA patients and thus be inadequate for identifying modifiable barriers to service use.
In this article, we describe the process used to develop the Perceived Access Inventory (PAI), a veteran-centered measure of perceived access to mental health services that encompasses all domains in the SOTA 2010 model.
Methods and Results
We used a multiphase, mixed-methods approach (Creswell, 2013) to develop the PAI. Each phase built on and was informed by preceding phases. In Phase 1, we conducted individual, semistructured, qualitative interviews with veterans to explore their experiences and elicit the barriers and facilitators they faced in seeking mental health care. In Phase 2, we generated a preliminary set of survey items based on Phase 1 qualitative data. In Phase 3, an external expert panel rated preliminary PAI items in terms of relevance and importance and provided feedback on format and response options. In Phase 4, veterans gave feedback on the readability and understandability of the PAI item set generated through Phase 3.
Because the results of each phase informed the methods and content of subsequent phases, the methods and results sections are combined into a single section, organized by study phase, rather than described independently. All phases of the study were reviewed and approved by the VA Central Institutional Review Board.
Phase 1: Qualitative Interviews
Recruitment procedures
Two groups of veterans were recruited for the qualitative interviews. The first group (n = 8) was recruited using opt-out letters and clinician referral from a VA Medical Center in Arkansas to assess qualitative interview pacing and content; these veterans were also included in the final sample. Once the interview guide was finalized, it was used to collect data from a second group of 72 veterans across three Veterans Integrated Service Networks (VISNs; VISN 1 in the Northeast, VISN 16 in the Central South, and VISN 21 in the West). The opt-out letter recruitment strategy used VA administrative data to select a sampling frame that would allow us to include participants from specific subpopulations (i.e., women, members of racial/ethnic minorities, a balance of rural and urban residents, and a balance of veterans with and without a history of mental health service use; Miller et al., 2017). Potential participants for the second group were mailed a study packet (n = 585) that included an invitation letter describing the study together with an opt-out letter they could use to decline further contact with study personnel. Veterans could opt out either by calling study personnel or by returning the opt-out letter (89/585, 15.2%). Two weeks after the study packets were mailed, study staff called potential participants who had not opted out to assess interest, confirm eligibility, and answer questions. A substantial proportion of those who did not opt out could not be reached (162/585, 27.7%) because of a wrong address (n = 30), a wrong phone number (n = 34), no answer (n = 87), or because the veteran was deceased (n = 3). Of the veterans who were reached by phone (258/585, 44.1%), 27 were deemed ineligible, 159 declined to participate, and 72 (27.9% of those reached by phone) agreed to participate, either in person (n = 66) or by phone (n = 6). Written informed consent was obtained from face-to-face participants; with VA Central Institutional Review Board approval, verbal consent was obtained for interviews conducted by phone.
Sites and participants
For Phase 1, we recruited 80 veterans from a VA Medical Center in Arkansas and VA CBOCs in Arkansas, Northern California, and Maine. Most of the participants were recruited from CBOCs to facilitate recruitment of rural veterans. To be eligible to receive an opt-out letter for this phase of the study, participants needed to be United States military veterans between the ages of 18 and 70 years and to have had at least one positive screen for posttraumatic stress disorder (PTSD), alcohol use disorder, or depression documented in their VA medical record in the previous year. We did not send opt-out letters to veterans with psychosis or dementia documented in the problem list of their medical record because of concerns that these conditions could limit the recall and cognitive function necessary to complete a meaningful qualitative interview. During the recruitment phone call, potential participants were asked about reliable access to a phone and any stress-related or emotional problems related to PTSD, alcohol use, or depression in the past year. We excluded veterans who reported no reliable phone access or no stress-related or emotional problems related to PTSD, alcohol use, or depression. Participants were compensated $30 for completing the interview and an additional $20 if they had to travel more than 30 min one way to complete an in-person interview and did not have any other appointments that day that provided travel pay.
To ensure geographic diversity within our sample, we recruited participants from three separate clinics within each of three VISNs (VISN 1 in the Northeast, VISN 16 in the Central South, and VISN 21 in the West), for a total of nine clinics. Sampling was stratified by geographic area, rural/urban residence, use/nonuse of mental health services in the past year, and gender (female veterans were oversampled). Participants who had not used mental health services were included to identify barriers associated with screening positive for a mental health problem and not accessing mental health services. The mean age of the total sample (n = 80) was 46.7 years (SD = 13.8), 75% were male, 61% were non-Hispanic White, 46% were rural, and 66% had used mental health services within the past year (see Table 1). Compared to all veterans who were mailed opt-out packets, the second-group participants (n = 72) were more likely to have a history of mental health service use (63.9% vs. 47.4%, p = .01) and tended to be younger (44.7 vs. 48.0 years, p = .07). Otherwise, there were no statistically significant differences in gender, race, ethnicity, or rural/urban address (Miller et al., 2017).
Table 1. Sociodemographic and Clinical Characteristics of Phase 1 Participants by VISN
Variable | VISN 1 (n = 25) | VISN 16 (n = 30) | VISN 21 (n = 25) | TOTAL (N = 80) |
---|---|---|---|---|
Age, M (SD) | 47.3 (14.4) | 43.5 (14.1) | 47.2 (12.6) | 45.8 (13.7) |
Gender | ||||
Male | 18 (22.5%) | 22 (27.5%) | 20 (25%) | 60 (75%) |
Female | 7 (8.7%) | 8 (10%) | 5 (6.2%) | 20 (25%) |
Race** | ||||
Non-Hispanic White | 20 (25.3%) | 12 (15.2%) | 17 (21.5%) | 49 (62.0%) |
All others | 4 (5.1%) | 18 (22.8%) | 8 (10.1%) | 30 (38.0%) |
Education | ||||
High school or less | 5 (6.2%) | 11 (13.7%) | 2 (2.5%) | 18 (22.5%) |
Tech school/some college | 11 (13.7%) | 13 (16.2%) | 12 (15%) | 36 (45%) |
College grad or higher | 9 (11.2%) | 6 (7.5%) | 11 (13.7%) | 26 (32.5%) |
Marital status | ||||
Married/live as married | 13 (16.2%) | 12 (15%) | 12 (15.0%) | 37 (46.2%) |
Separated/divorced | 8 (10.0%) | 14 (17.5%) | 10 (12.5%) | 32 (40%) |
Widow/Never married | 4 (5.0%) | 4 (5.0%) | 3 (3.7%) | 11 (13.7%) |
Living situation | ||||
On own | 6 (7.5%) | 10 (12.5%) | 11 (13.7%) | 27 (33.7%) |
Spouse/partner | 14 (17.5%) | 12 (15.0%) | 12 (15.0%) | 38 (47.5%) |
All others | 5 (6.2%) | 8 (10.0%) | 2 (2.5%) | 15 (18.7%) |
Employment | ||||
Employed (full-time or part-time) | 7 (8.7%) | 13 (16.2%) | 10 (12.5%) | 30 (37.5%) |
Sick leave, disabled, or SSI/SSDI | 10 (12.5%) | 13 (16.2%) | 9 (11.2%) | 32 (40.0%) |
All others | 8 (10.0%) | 4 (5.0%) | 6 (7.5%) | 18 (22.5%) |
Annual household income | ||||
≤$25,000 | 1 (2.1%) | 3 (6.4%) | 5 (10.6%) | 9 (19.1%) |
>$25,000 | 13 (31.9%) | 12 (25.5%) | 11 (23.4%) | 38 (80.8%) |
Current residential location | ||||
Rural | 9 (11.2%) | 13 (16.2%) | 15 (18.7%) | 37 (46.2%) |
Urban | 16 (20%) | 17 (21.2%) | 10 (12.5%) | 43 (53.7%) |
Perceived Barriers, M (SD) | 33.6 (7.9) | 34.2 (9.5) | 36.5 (12.0) | 34.7 (9.9) |
Readiness for Treatment, M (SD) | 45.1 (13.7) | 42.8 (11.3) | 36.6 (13.9) | 41.6 (13.2) |
Client Satisfaction, M (SD) | 24.4 (6.2) | 22.2 (5.2) | 25.9 (4.4) | 23.9 (5.5) |
AUDIT-C, M (SD) | 3.6 (2.5) | 3.6 (3.4) | 5.4 (2.4) | 4.2 (2.9) |
PHQ-9, M (SD) | 12.1 (5.7) | 15.5 (5.5) | 10.1 (6.9) | 12.7 (6.4) |
GAD-7, M (SD) | 10.5 (5.6) | 13.6 (4.3) | 9.4 (6.7) | 11.3 (5.8) |
PCL, M (SD) | 48.2 (15.1) | 56.9 (13.5) | 42.7 (17.4) | 49.7 (16.2) |
Note. SSI/SSDI = Supplemental Security Income/Social Security Disability Insurance; AUDIT-C = Alcohol Use Disorders Identification Test; PHQ-9 = Patient Health Questionnaire 9-item depression module; GAD-7 = 7-item generalized anxiety disorder screener; PCL = PTSD Checklist-Civilian version.
* p < .05 comparing Veterans Integrated Service Networks (VISN) means.
** p < .01 comparing VISN means.
Quantitative measures and analysis
Before each qualitative interview began, quantitative data were collected on sociodemographics, residence (rural/urban), perceived treatment barriers, symptom severity (PTSD, depression, alcohol use, and generalized anxiety, with higher scores indicating more severe symptoms), treatment satisfaction, and readiness for treatment. Residential status (rural/urban) was defined using census-tract-based Rural Urban Commuting Area codes. Perceived barriers to mental health treatment were measured using Hoge’s 13-item measure (possible score range 13 to 65, with higher scores indicating more severe barriers; Hoge et al., 2004). Mental health symptom severity (higher scores indicate greater symptom severity) was measured using the 17-item PTSD Checklist-Civilian version (possible range 17 to 85; Keen, Kutter, Niles, & Krinsley, 2008), the Patient Health Questionnaire 9-item depression module (possible range 0 to 27; Spitzer, Kroenke, & Williams, 1999), the Alcohol Use Disorders Identification Test (possible range 0 to 12; Bradley et al., 2007), and the 7-item generalized anxiety disorder screener (possible range 0 to 21; Spitzer, Kroenke, Williams, & Lowe, 2006). Patient satisfaction with VA mental health care was measured using the eight-item Client Satisfaction Questionnaire (possible range 8 to 32, with higher scores indicating greater satisfaction; Larsen, Attkisson, Hargreaves, & Nguyen, 1979). We used a modified six-item readiness ruler to assess motivation for treatment (possible range 0 to 60, with higher scores indicating greater motivation for treatment; Center for Substance Abuse Treatment, 1999). Quantitative data were analyzed descriptively and used to characterize the sample (see Table 1). Chi-square tests were used to compare categorical variables across VISNs, and generalized linear models were used to compare continuous variables.
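To make the analytic comparisons concrete, the sketch below illustrates this type of analysis in Python. It is a minimal illustration, not the authors’ code: the data frame, variable names, and values are hypothetical, and the text does not specify the software actually used.

```python
# Minimal sketch of the Table 1 comparisons: chi-square tests for
# categorical variables and generalized linear models for continuous
# variables across VISNs. All data below are hypothetical.
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "visn":  ["1", "16", "21"] * 10,
    "rural": [0, 1, 1, 0, 1, 0] * 5,       # categorical: rural (1) vs. urban (0)
    "phq9":  [12, 15, 10, 14, 16, 9] * 5,  # continuous: PHQ-9 depression score
})

# Chi-square test comparing a categorical variable across VISNs
table = pd.crosstab(df["visn"], df["rural"])
chi2, p, dof, expected = chi2_contingency(table)
print(f"rural x VISN: chi2 = {chi2:.2f}, p = {p:.3f}")

# Generalized linear model comparing a continuous variable across VISNs
glm = smf.glm("phq9 ~ C(visn)", data=df).fit()
print(glm.summary())
```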
Qualitative interview and analysis
Qualitative interviews were conducted at each of the study sites/CBOCs by experienced investigators with training in qualitative data collection and analysis. A semistructured interview guide was developed based on the access literature and on the research team’s clinical experience with veterans with mental health problems. The interview guide asked the veteran about their experience using or attempting to use VA mental health care (interview guide available from first author on request). Interviews were digitally recorded and professionally transcribed. Transcripts were entered into the Atlas.ti software program to facilitate management and analysis of qualitative data (Muhr & Friese, 2004).
Qualitative data analysis used blended deductive (model testing) and inductive (model development) content analysis techniques. Qualitative analysis began with a provisional list (Saldaña, 2015) of deductive codes derived from the SOTA 2010 model. As data collection proceeded, the qualitative team met biweekly to analyze interview content using an interdisciplinary team-based approach (MacQueen, McLellan, Kay, & Milstein, 1998) designed to maximize creativity, credibility, and reliability of coding (Hall, Long, Bermbach, Jordan, & Patterson, 2005). An iterative analysis process was used in which all team members first read the same transcripts, discussed the applicability of deductive codes, and developed and refined inductive codes. This team-based process helped establish joint understanding of the code definitions and applicability across sites (Ryan & Bernard, 2003). After approximately three months of reading and discussing interviews as a group, the qualitative team reached consensus on the application of both deductive and inductive codes. After this period, investigators began reading and coding interviews from their own geographical area, because they were more familiar with geographical references and local culture.
Throughout the independent coding process, the team regularly evaluated consistency in code assignment and coder agreement by auditing approximately 20% of the transcripts. The auditing team consisted of qualitative team leaders from each site. Intercoder agreement was obtained by resolving differences in code application through discussion until consensus was reached among audit team members. The qualitative team also held regular meetings where code definitions and practices were discussed to maintain consistency among team members. To help communicate qualitative findings with the larger team, the qualitative team generated summaries of the codes within each subdomain and indicated the frequency with which each code was identified. These summaries, which provided an overview of interview content, were used in Phase 2 to guide development of PAI items as discussed below.
Results
A total of 80 veterans participated in Phase 1: the 8 veterans from Arkansas with whom we piloted the interview guide and 72 subsequent veterans from all sites. Most interviews were conducted in person (92.5%, 74/80); the remaining participants (n = 6) preferred to complete their interviews by phone because of travel distance or disability. The sociodemographic and clinical characteristics of the sample are summarized in Table 1. Participant characteristics were similar across VISNs except that VISN 16 participants were significantly less likely to be non-Hispanic White and more likely to report higher Patient Health Questionnaire 9-item depression module, 7-item generalized anxiety disorder screener, and PTSD Checklist-Civilian version scores. VISN 21 participants reported significantly higher Alcohol Use Disorders Identification Test scores.
There was a total of 8,955 coded segments in the qualitative transcripts (see Table 2). A coded segment consisted of a question asked by the interviewer and the respondent’s answer(s) to that question. Coded segments were organized into subdomains and subdomains into domains. The final codebook consisted of 49 subdomains related to access to VA mental health services across six domains: logistics (which encompasses the geographical, temporal, and financial domains in the SOTA 2010 model), culture, digital, systems of care, experiences of care, and experiences of treatment. Three of these domains (systems of care, experiences of care, and experiences of treatment) were not explicitly included in the perceived access to care definition of the SOTA 2010 model. Table 2 includes brief descriptive definitions of each of these domains, the number of coded segments for each domain (frequency), and the number of participants who contributed at least one coded segment to each domain (prevalence). Frequency and prevalence for the digital subdomains are likely to be overestimated because digital platform use and barriers were specifically probed for in the qualitative interviews.
Table 2. Access Domains and Subdomains Identified in the Phase 1 Qualitative Interviews
Domain | Definition | Subdomain | Segments coded | Participants (N = 80) |
---|---|---|---|---|
Logistics | Geographical, temporal, and financial issues that are perceived by veterans to affect their use of VA mental health services | Convenience | 106 | 35 (44%) |
Distance | 147 | 58 (73%) | ||
Finances | 227 | 68 (85%) | ||
Physical limitations | 45 | 45 (56%) | ||
Time | 131 | 44 (55%) | ||
Transportation | 101 | 37 (46%) | ||
Culture | Veterans’ beliefs, values, and attitudes regarding mental health symptoms and treatments | Meaningful activity | 37 | 15 (19%) |
Identity | 206 | 55 (69%) | ||
Job/work | 420 | 62 (78%) | ||
Self-care | 154 | 44 (55%) | ||
Religion/spirituality | 71 | 21 (26%) | ||
Social relationships | 713 | 69 (86%) | ||
Stigma | 152 | 48 (60%) | ||
Stigma denial MH symptoms | 82 | 37 (46%) | ||
Digital | Any existing technology veterans use that affects communication with the VA | Call/calling/phone | 300 | 57 (71%) |
Computer/laptop/tablet | 228 | 55 (69%) | ||
E-mail | 73 | 42 (53%) | ||
General | 133 | 58 (73%) | ||
Internet | 256 | 76 (95%) | ||
Mobile App | 111 | 51 (64%) | ||
My HealtheVet | 111 | 53 (66%) | ||
Smartphone | 172 | 60 (75%) | ||
Social media | 138 | 59 (74%) | ||
Telehealth/distance health | 186 | 59 (74%) | ||
TV | 76 | 33 (41%) | ||
Systems of care | VA mental health structures and processes and how those affect veterans’ experiences or perceptions of access to mental health services | Benefits and service connection | 180 | 50 (63%) |
Bureaucracy | 129 | 44 (55%) | ||
Comparison | 112 | 40 (50%) | ||
Having choice in services/providers/etc. | 115 | 36 (45%) | ||
Lack of services | 49 | 22 (28%) | ||
Mistrust in VA systems | 79 | 27 (34%) | ||
No health care | 18 | 11 (14%) | ||
Non-VA care | 156 | 43 (54%) | ||
Time appointment day delays | 33 | 18 (23%) | ||
Time scheduling appointments | 195 | 53 (66%) | ||
Time waiting for appointments | 181 | 54 (68%) | ||
Experiences of care | Veterans’ experiences about past or current interactions with VA or non-VA providers and facilities | Care continuity | 251 | 64 (80%) |
Getting the ball rolling—getting MH info | 299 | 69 (86%) | ||
Outreach—non-VA | 70 | 18 (23%) | ||
Outreach—VA | 65 | 29 (36%) | ||
Patient centeredness | 296 | 60 (75%) | ||
Sustained engagement | 108 | 33 (41%) | ||
Therapeutic alliance | 373 | 67 (84%) | ||
Experiences of treatment | Veterans’ perspectives about symptoms, interventions, side effects, and outcomes related specifically to mental health | Treatment effectiveness | 290 | 64 (80%) |
Treatment experiences mental health | 497 | 68 (85%) | ||
Treatment general | 181 | 55 (69%) | ||
Treatment side effects | 128 | 40 (50%) | ||
Substance use | 200 | 46 (58%) | ||
Symptom experiences | 573 | 74 (93%) | ||
Total | 8,955 |
Note. VA = Veterans Affairs; MH = mental health.
Not all domains and subdomains listed in this table are represented in the current version of the Perceived Access Inventory (see text for details).
Phase 2: Development of Initial Items
Ten members of the research team used Phase 1 findings to generate a list of potential PAI items. This team included five qualitative interviewers (Christopher J. Miller, Patricia Wright, Kara Zamora, Christopher J. Koenig, and Regina Stanley), a coleader and another participant in the VA SOTA 2010 conference (John C. Fortney and James F. Burgess Jr.), a psychometrician (P. Adam Kelly), an epidemiologist (Ellen P. Fischer), and a psychiatrist (Jeffrey M. Pyne).
Procedures
As described above, Phase 1 qualitative interviewers presented analytic summaries for each subdomain to the larger research team. Each analytic summary included descriptions of the codes representing perceived access barriers and facilitators for a subdomain, along with key illustrative quotes to give a feel for the issue in participants’ own words. The larger research team reviewed each analytic summary as a group to identify codes and quotes that were both relevant to access and important to participants, and to discuss how best to convert these into survey items. Based on these discussions, a subgroup of the larger team (item-generation subteam) drafted potential PAI items. Once all the analytic summaries had been discussed, the potential items were aggregated into an initial master list of 167 potential items across the six domains listed above. The research team then performed initial item reduction (combining related items, eliminating duplicates, and eliminating items that were not considered to be easily modifiable through VA policies), standardized item phrasing, and considered alternative response formats.
Results
By eliminating duplicates and combining items at this stage, we reduced the initial master list from 167 to 108 items. We then eliminated items that were outside the influence of practical health care-related VA policies. Examples in this category included military mental health experience, barriers to using digital technology (e.g., unable to use apps, not having Internet access at home), changes in VA policy (e.g., changes in service connection rating, eligibility, access to certain medications, budgets), and VA benefits barriers (e.g., delay in benefits process, lack of VA mental health services for spouse and children). Although these items were relevant for access, addressing them was not considered practical (e.g., changing past military experience, determining location of Internet networks, and changes in national VA policy and benefits). Items not directly related to access also were eliminated (e.g., quality of care, veteran unemployment, motivation to seek care, and availability of other health insurance). Eliminating these additional items resulted in a list of 77 potential items in six domains: logistics (10 items), culture (10 items), digital (seven items), systems of care (21 items), experiences of care (23 items), and experiences of treatment (seven items). We then compared the wording of these items with the wording of relevant items in the existing measures item bank (n = 253; Bauer et al., 2005; Clement et al., 2012; Cunningham et al., 1995; Eakin & Strycker, 2001; Hoge et al., 2004; Lingley-Pottie & McGrath, 2011; Ouimette et al., 2011; Pepin et al., 2009; Tanielian et al., 2008) and adopted standard phrasing to the extent possible. We also considered various response sets (e.g., 5-point Likert-type scales of “none of the time” to “all of the time” and “strongly agree” to “strongly disagree,” and an 11-point Likert-type scale of “does not interfere at all” to “completely interferes”).
Phase 3: External Expert Panel Review
In Phase 3, subject-matter experts (e.g., VA policymakers, advocates, and administrators) who had not been involved with the project previously took part in a web-based, modified Delphi process to evaluate and rank the 77 potential PAI items remaining after Phase 2. These expert panel participants were selected by the research team and invited to participate via e-mail. Most (14/17, 82.3%) agreed to participate and were sent a link to a web-based survey. The survey was administered electronically using the SurveyMonkey platform. Round 1 of the survey was completed by 11 participants; 10 of the 11 also completed Round 2.
In Round 1, panel members were asked to evaluate the appropriateness and importance of each of the 77 potential PAI items. Participants were asked to respond to three questions for each item. The first question was whether the item was consistent with the definition of perceived access (yes/no). Perceived access was operationally defined as “the perceived ease with which the patient is able to initiate and sustain interaction with desired mental health services.” The second question used a 1 (almost no fit) to 10 (almost perfect fit) rating scale to indicate how well the item fit the definition of its preassigned domain (see Table 2 for domain definitions). The third question asked how important the item was to an understanding of perceived access to mental health services, using a 1 (not important) to 10 (very important) rating scale (Linstone & Turoff, 1975; Park et al., 1986; Pyne et al., 2008). An importance score of zero was given to any item rated as not consistent with the definition of perceived access. Open-text boxes were also included to allow panel members to comment on readability, accuracy, and other concerns they had about an item (e.g., not being modifiable through practical changes in VA policies). We allowed 2 weeks for participant responses to Round 1, sending e-mail reminders after 5, 10, and 12 days to those who had not yet responded, as needed.
After the Round 1 response period closed, the research team summarized and reviewed Round 1 results. Forty-four items were evaluated in Round 2 (see Results section below). For each item in Round 2, the participant received the following Round 1 summary: (a) the frequency of “Yes” and “No” responses to the perceived access definition question, (b) the domain-match and importance scores the respondent him/herself had given during Round 1, (c) the external panel mean and median scores on domain-match and importance, (d) the 25th and 75th percentiles for the panel’s domain-match and importance scores, and (e) a deidentified list of participant comments on the item. Panel members used this information to repeat the Round 1 evaluation process; they were free to keep or change their scores and to provide additional comments. Those who scored an item in Round 2 outside the Round 1 25th–75th importance percentile range (defined as outliers) were asked to provide a written rationale for their score. Following Round 2, results were summarized and presented to the expert panel for a final opportunity to make written comments on items. The final product of this Delphi process was an item-by-item assessment by subject-matter experts of the content validity of the proposed PAI items.
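As a concrete illustration of this feedback loop, the sketch below computes the Round 1 summary statistics fed back to panelists and flags Round 2 outliers. It is a hypothetical Python sketch with invented scores; the survey itself was administered via SurveyMonkey, and the authors do not describe how they computed these summaries.

```python
# Hypothetical sketch: summarizing one item's Round 1 importance scores
# and flagging Round 2 outliers per the 25th-75th percentile rule above.
import numpy as np

round1 = np.array([7, 8, 5, 9, 6, 7, 8, 4, 7, 6, 8])  # 11 Round 1 raters

summary = {
    "mean":   round(float(round1.mean()), 1),
    "median": float(np.median(round1)),
    "p25":    float(np.percentile(round1, 25)),
    "p75":    float(np.percentile(round1, 75)),
}
print(summary)  # shared with each panelist alongside their own Round 1 score

def needs_rationale(round2_score: float) -> bool:
    """Round 2 scores outside the Round 1 25th-75th percentile range were
    treated as outliers, and the rater was asked for a written rationale."""
    return not (summary["p25"] <= round2_score <= summary["p75"])

print(needs_rationale(3))  # True: below the 25th percentile
print(needs_rationale(7))  # False: within the interquartile range
```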
Results
In Round 1, 27 items were considered to meet the perceived access definition by >90% of participants and received a mean overall importance score >6.0. These were considered “keepers” that did not need to be reconsidered or rescored in Round 2. None of the experiences of treatment items met the “keeper” criteria after Round 1. Six items were dropped at the end of Round 1 because <60% of participants considered them to meet the perceived access definition and they received a mean overall importance score <5.0. The remaining 44 items were considered to meet the definition by 60–90% of participants and/or had mean importance scores of 5.1–5.9 and were included in Round 2.
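The Round 1 triage rules can be restated compactly in code. The sketch below is a hypothetical restatement of the thresholds just described, not software used in the study; the item names and scores are invented.

```python
# Hypothetical sketch of the Round 1 triage rules described above.
def triage_round1(prop_meeting_definition: float, mean_importance: float) -> str:
    """Classify an item after Delphi Round 1.

    prop_meeting_definition: proportion of panelists rating the item as
        consistent with the perceived-access definition (0.0-1.0).
    mean_importance: panel mean on the 1-10 importance scale.
    """
    if prop_meeting_definition > 0.90 and mean_importance > 6.0:
        return "keeper"   # retained without Round 2 rescoring
    if prop_meeting_definition < 0.60 and mean_importance < 5.0:
        return "dropped"  # eliminated at the end of Round 1
    return "round2"       # rescored by the panel in Round 2

# Invented example items
for name, (prop, imp) in {
    "transportation barrier": (0.95, 7.2),
    "group-only therapy":     (0.55, 4.1),
    "wait time":              (0.80, 5.5),
}.items():
    print(name, "->", triage_round1(prop, imp))
```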
Following the second round of scoring, 15/44 (34.1%) of the Round 2 items were kept based on the scoring criteria above. Of these 15 items, one experiences of treatment item was eliminated (i.e., “I prefer individual treatment but it seems that only group therapy is available”) because veteran treatment preferences were considered not directly related to access. This was the only experiences of treatment item that met inclusion criteria following Round 2. The research team also reviewed the Round 2 items that were not kept based on the scoring criteria and added back six: one item each from the experiences of care, logistics, and systems of care domains, and three digital items created when items that included more than one digital platform were split into individual items. Therefore, 20 items were kept from the Round 2 list.
The research team also reviewed the Round 1 “keeper” list and decided to eliminate eight items, ultimately keeping 19/27 (70.4%). The eight that were eliminated included three from the experiences of care domain that were facilitator versions of existing barrier items (e.g., receiving follow-up calls, primary care providers facilitating access to mental health, and scheduling convenient mental health appointment times). Two items from the logistics domain were eliminated (one concerned physical health problems making it difficult to get to appointments; the other, mental health problems doing so), and two were combined to form one item (the evening and weekend appointment items). Two items from the systems of care domain were combined with items from other domains (i.e., a cost item from the logistics domain and a lack-of-awareness-of-available-services item from the experiences of care domain). In total, 39 items across five domains (logistics, culture, digital, systems of care, and experiences of care) survived Phase 3 (19 items from the Round 1 keeper list and 20 items from the Round 2 list).
Phase 4: Feedback From Veterans
Participants and recruitment
Eleven Phase 4 participants were recruited from the CBOCs involved in Phase 1. The Phase 4 eligibility criteria and opt-out recruitment strategy (Miller et al., 2017) were identical to those used in Phase 1.
Process and findings
In-person interviews were conducted by the same group of investigators as in Phase 1. In future psychometric testing, the PAI will be administered by telephone; therefore, to mimic the conditions of a telephone interview, interviewers read the PAI questions to participants in Phase 4. Whereas Phases 1–3 focused on the nature and scope of veterans’ perceptions of access to VA mental health care, Phase 4 focused more on using veteran feedback to improve item format. Modifications were made to the PAI in Phase 4 as data collection progressed (e.g., changes to response format, question format, and question order).
In the initial Phase 4 PAI, items were formatted as single questions about interference with getting needed VA mental health care; responses were on an 11-point (0–10) scale. The phrase “mental health care you need” was chosen deliberately to focus attention on the veteran’s perception of need and to distinguish needed care from desired mental health care. The 11-point Likert scale was adapted from the Consumer Assessment of Health Care Providers and Systems survey, in which respondents were asked to rate their overall experiences with care in general and with their provider on a 0 (worst) to 10 (best) scale. It rapidly became apparent that veterans did not like the 11-point Likert scale because its many options proved confusing. In addition, they reported problems in responding because the response options started from “did not interfere at all,” which implied that the barrier was present. The single-item approach also conflated the prevalence of a barrier with its impact: a barrier may be highly prevalent but have negligible impact on service use, or vice versa. To be important at the population level, a barrier would have to both be prevalent and have a substantial impact on utilization. Based on this feedback, we replaced each single-question item with two questions: the first addressed the occurrence of the issue as a barrier and used a yes/no response format (e.g., “In the past 12 months, was transportation to VA mental health care a problem for you?”). Only when a potential barrier was endorsed as a problem was a second question asked addressing the extent to which the barrier interfered with getting needed mental health care (e.g., “How much did that interfere with getting the VA mental health care you needed?”). This second question used a shorter, 5-point Likert scale, ranging from 1 (not at all) to 5 (completely).
Initially, the items on digital modalities (e.g., secure messaging, smart phone apps) used a complicated, four-option response format. Because participants reported that this response set was difficult to use, we changed the digital domain questions to a two-question branching format. The first question was “Have you ever used …?” If they had, participants were asked whether it was helpful; if they had not, participants were asked whether they had access to the modality. Participants reported that this format was easy to answer. It also provided more nuanced information, capturing use of, access to, and perceived usefulness of a wide variety of digital platforms available to veterans.
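To illustrate both response formats, the sketch below administers one barrier item and one digital item using the branching logic just described. It is a minimal hypothetical sketch of the question flow, not the study’s interview procedure or software; prompts are abbreviated from the Appendix.

```python
# Hypothetical sketch of the two-question branching formats described above.

def ask_yes_no(prompt: str) -> bool:
    return input(prompt + " (yes/no): ").strip().lower().startswith("y")

def barrier_item(stem: str) -> dict:
    """Part 1 asks whether the barrier occurred (yes/no); Part 2 asks about
    interference (1 = not at all ... 5 = completely) only if it did, so
    prevalence and impact are captured separately."""
    present = ask_yes_no("In the past 12 months, " + stem)
    interference = None
    if present:
        interference = int(input("How much did that interfere with getting "
                                 "the VA mental health care you needed? (1-5): "))
    return {"present": present, "interference": interference}

def digital_item(stem: str) -> dict:
    """Ever-used branches to helpfulness; never-used branches to access."""
    used = ask_yes_no("Have you ever " + stem)
    if used:
        return {"used": True, "helpful": ask_yes_no("When you did, was it helpful?")}
    return {"used": False, "access": ask_yes_no("Do you have access to it?")}

print(barrier_item("was transportation to VA mental health care a problem for you?"))
print(digital_item("used smart phone apps related to your mental health care?"))
```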
There were three instances in which a pair of PAI items appeared to be addressing the same issue. For each pair, after participants had responded to each item, they were asked whether the items seemed to be addressing the same or different issues. One example was a pair of items that addressed personal stigma: “ … did you ever feel that you should just suck it up and drive on and not seek mental health care?” and “ … did you ever feel that you were weak because you might need mental health care?”. Veterans suggested using the phrase “tough it out” rather than “suck it up.” In addition, although they reported that the two items were similar, they thought it was important to include both because “tough it out” addressed a behavioral expectation whereas “weakness” was a personal attribution/feeling. We retained both items in the preliminary PAI, pending further psychometric testing.
Veterans suggested splitting some items (e.g., items about mental health care providers not genuinely caring about veterans, mental health care staff failing to show respect, and veterans lacking trust in any of their VA mental health care providers). For the first two, they suggested asking separately about providers and about staff; for the last, they suggested having separate items for providers and for the VA health care system.
Not all suggestions were adopted. Some veterans reported difficulty with questions about the VA and the Department of Defense (DoD) sharing medical records because they were not aware of this happening. However, we decided to retain this item because it could provide insight into veterans’ perceptions of current efforts to streamline access to VA and DoD health records. In addition, the research team reintroduced an item about fear of losing the right to own a gun as a potential barrier to using VA mental health services. Although the expert panel did not give this item a high importance score, it was raised by veterans in Phase 1, mentioned again in Phase 4, and seemed to be very important to those veterans it affected.
Following completion of these four developmental phases, the PAI included 43 items addressing five domains: logistics (five items), culture (three items), digital (nine items), systems of care (13 items), and experiences of care (13 items). The 43-item version is available in the Appendix.
Discussion
Our intent in developing the PAI was to generate a veteran-centric access measure that would capture and reflect veterans’ experiences accessing VA mental health care. We intentionally used veterans’ own words to formulate questions that would be authentic and engaging. Comments from veterans participating in Phase 4 (e.g., “I have not been asked these questions before” and “These are good questions”) validated that decision. Our veteran-centered approach also led to identification of three domains (systems of care, experiences of care, and experiences of treatment) not well represented in the VA SOTA 2010 definition of perceived access to care. Only one of the experiences of treatment items survived the external expert panel review. The research team considered that item, which reflected a preference for individual versus group treatment, an indirect measure of access and dropped it. Thus, only two of the three domains not included in the VA SOTA 2010 model were incorporated into the current version of the PAI; these two domains account for more than half the total number of PAI items.
The importance of including items from the systems of care and experiences of care domains is underscored by findings from a recent comparison of VA and non-VA inpatient care. This comparison found that, although VA facilities outperform non-VA facilities on many quality metrics, they underperform on patient-experience metrics such as provider communication, responsiveness, quietness (while on the inpatient ward), pain management, and whether the patient would recommend the hospital to others (Blay, DeLancey, Hewitt, Chung, & Bilimoria, 2017). These inpatient-experience metrics are similar to some of the outpatient items in the Systems of Care and Experiences of Care domains of the PAI.
In Phase 1, interviewers specifically probed for digital domain issues, which likely inflated the frequency and prevalence with which digital subdomains appear in Table 2. However, according to a 2016 Pew Research Center survey, the number of American adults with access to various digital platforms is steadily increasing. For example, approximately 90% of American adults use the Internet (89% urban, 81% rural), 95% of American adults own a cellphone (95% urban, 94% rural), and 77% of Americans own a smartphone (77% urban, 67% rural) (Pew Research Center, 2017a, 2017b). Rural and urban patients were also found to be similarly receptive to telehealth interventions (Bashshur, Shannon, Bashshur, & Yellowlees, 2015; Grubaugh, Cain, Elhai, Patrick, & Frueh, 2008). Veterans being treated for mental health problems also report access to various digital platforms (Miller, Mclnnes, Stolzmann, & Bauer, 2016). The general acceptance of digital communication and the pace of digital development suggest that the variety of digital communication modalities will only increase in the future. As the paradigm of health care delivery evolves to encompass delivery of more care via digital health technologies, it is imperative that our access models and measurement instruments keep pace (Fortney, Burgess, et al., 2011). The dynamic growth in digital modalities means that the PAI will need to be similarly dynamic as digital platforms move on- and offline.
The PAI differs from access measures currently in use in the VA (see the introduction). For example, wait times are averages calculated from administrative data that may not reflect an individual veteran’s experience attempting to get an appointment. The Veteran Satisfaction Survey asks veterans about the timeliness of mental health appointments but does not ask about specific access barriers. The SHEP asks veterans about the timeliness of mental health appointments and about a limited number of barriers (e.g., inconvenient appointment times, transportation problems, cost). Strategic Analytics for Improvement and Learning measures include items from the Veteran Satisfaction Survey and SHEP questionnaires plus composite measures of continuity of care and experiences of care. In contrast, the PAI includes a comprehensive list of specific perceived access items across five domains derived from veterans’ experience accessing VA mental health services.
Although the PAI currently addresses mental health care in the VA overall, items could be easily reframed in terms of access to a specific mental health service at a given site (e.g., prolonged exposure therapy for PTSD delivered at a specified VA facility) or to evidence-based care more broadly, thus allowing it to be applied to assessing access to a specific treatment or categories of treatment. In addition, as mentioned in the recent VA Office of Inspector General Review of Veterans’ Access to Mental Health Care, the “data and measures needed by decision makers for planning and service provision may differ at the national, VISN, and facility level.” At the national or VISN level, PAI items could be incorporated into surveys such as the SHEP, managed by the VHA Office of Analytics and Business Intelligence, to better understand perceived access to mental health care. At the more local level (facility or clinic), the PAI could be used to identify access intervention targets, inform intervention development, and assess the impact of single or multicomponent interventions to improve access to mental health care.
Additional work is needed, including formal assessment of test/retest reliability, validity, and sensitivity to change. At 43 items, the current version is associated with substantial respondent burden; further item reduction work is underway. A two-question format was chosen to capture the prevalence and impact of a given barrier. This format increases the length of the questionnaire; however, by avoiding any suggestion that interviewers assume a given barrier is present, it may provide a more accurate assessment of the impact of the barrier for an individual veteran and a more sensitive measure of change over time. These assumptions can be tested in future work. Individual items and combinations of items also need to be evaluated for their ability to predict service utilization and treatment engagement.
There are limitations to the PAI. Qualitative interviews were conducted at one VA Medical Center and eight CBOCs and therefore may not have captured all the barriers to accessing VA mental health services. The VA increasingly is purchasing care using non-VA community providers (e.g., through the Veterans Choice Act). While the current version of the PAI does not include Choice Act-specific items, qualitative interviews are underway to identify Choice Act-specific barriers. We will use these data to create a Choice Act-specific version of the PAI. Finally, there is likely overlap between perceived access to physical health care and mental health care, but the extent of that overlap from the veteran perspective is unknown.
Despite these limitations, several implications arise from PAI development work to-date. First, veterans reported that the items included in the preliminary version of the PAI represented their “voice” and included important barriers for VA to consider. This suggests that veterans will be willing partners in developing and testing interventions to address those barriers. Second, improving access to VA mental health services may require increased attention to Systems of Care and Experiences of Care barriers. Third, digital resources are likely to change over time which will require corresponding changes to digital items.
Conclusion
The PAI offers a comprehensive, veteran-centered perceived mental health access measure for the 21st century. The PAI will allow administrators, policymakers, and researchers to identify access barriers, design interventions to address them, and measure the impact of the interventions over time. Forthcoming work on item reduction and the creation of a Veterans Choice Act-specific version of the PAI will ensure that this tool keeps pace with evolving needs for assessment of mental health care access.
Appendix
Items 1–34 and 43 | |
Part 1: (administered to all respondents) | |
“In the past 12 months, …” [insert item stem from list below] | |
Yes | 1 |
No | 0 |
Part 2: (administered only to respondents who respond “Yes” in Part 1a) | |
“How much did that interfere with getting the VA mental healthcare you needed?” | |
Completely | 5 |
A great deal | 4 |
Somewhat | 3 |
A little bit | 2 |
Not at all | 1 |
Item | Item Stem: “In the past 12 months …” |
1 | … Did you have to travel a long distance to get VA mental healthcare? |
2 | … Was transportation to VA mental healthcare a problem for you? |
3 | … Did you have to spend a lot of money on travel to get to VA mental healthcare? |
4 | … Did you lose income because of taking time off from work to get VA mental healthcare? |
5 | … Did your VA mental health facility have clinic hours on evenings or weekends? |
6 | … Did your VA mental health facility have convenient appointment times? |
7 | … Did you have to spend a lot of money overall to get VA mental healthcare? |
8 | … Did you have to spend a lot of time in the waiting room before your VA mental health appointments? |
9 | … Did you have to wait a long time between your VA mental health appointments? |
10 | … Have you felt that your VA mental health appointments were short? |
11b | … Did you have to wait a long time to get that first VA mental health appointment? |
12 | … Did any of your VA mental healthcare providers lack knowledge of military culture? |
13 | … Did you ever feel that you should just “tough it out” and not seek mental healthcare? |
14 | … Did you ever feel that you were weak because you might need the help of a mental healthcare provider? |
15 | … Did you notice differences in cultural, religious or personal values between yourself and your VA mental healthcare providers? |
16 | … Did you have problems getting in touch with your VA mental healthcare providers between appointments? |
17 | … Did you get reminder calls about your VA mental health appointments? |
18 | … Did any of your VA mental healthcare providers fail to take your mental health problems seriously? |
19 | … Did any of your VA mental healthcare providers fail to ask for your opinion about treatment options? |
20 | … Were you able to see the same VA mental healthcare providers consistently over time? |
21 | … Did you have to repeat your story to new VA mental healthcare providers over and over? |
22 | … Did you ever feel that your VA mental healthcare providers did not genuinely care about you? |
23 | … Did you ever feel that VA mental healthcare staff did not genuinely care about you? |
24 | … Did you ever feel stuck in VA “red tape” or paperwork? |
25 | … Did you ever need to find childcare so that you could get to a VA mental health appointment? |
26 | … Have you felt comfortable that you were aware of all the VA mental health services that were available to you? |
27 | … Did you notice any problems with the military and VA sharing medical records concerning your mental healthcare? |
28 | … Did you ever lack trust in any of your VA mental healthcare providers? |
29 | … Did you ever lack trust in the VA healthcare system? |
30 | … Have you ever felt that your VA mental healthcare providers were not available to you as soon as you needed them? |
31 | … Have any of your VA mental healthcare providers failed to show you respect? |
32 | … Have any of the VA mental healthcare staff failed to show you respect? |
33c | When you go to the VA, do you see other Veterans who you feel you can share your experiences with? |
34 | … Did you think your right to own a gun might be taken away if you used VA mental health services? |
43c,d | Considering all the technologies we’ve just discussed, have you had any concerns about your privacy when using these technologies for your mental healthcare? The technologies are My HealtheVet, smart phone apps, televideo, telephone, internet chat rooms, or searching the internet. |
Items 35–42: | |
Part 1: “Have you ever …” [insert item stem from list below] | |
Yes | |
No | |
Part 2: If “Yes” to Part 1, ask: “When you did, was it/were they helpful?” | |
Yes | |
No | |
If “No” to Part 1, ask: “Do you have access to …” [insert item stem from list below] | |
Yes | |
No | |
Item | Item Stem: “Have you ever …” |
35 | … Used My HealtheVet to share information with your VA mental healthcare providers? |
36 | … Used secure messaging, which is a feature of My HealtheVet that lets you exchange messages online with your mental healthcare providers? |
37 | … Used smart phone apps related to your mental healthcare? |
38 | … Used televideo at a VA outpatient clinic, which is using a camera on a TV or computer screen at a clinic to talk with VA mental healthcare providers at another facility? |
39 | … Used televideo at home, which is using a camera on a TV or computer screen at home to talk with VA mental healthcare providers? |
40 | … Used Internet chat rooms to share information online with Veterans or other people about specific issues related to your mental healthcare? |
41 | … Met with any of your VA mental healthcare providers by telephone? |
42 | … Searched the internet for information related to your mental healthcare? |
a For Items 5, 6, 17, 20, and 26, only respondents who respond “No” in Part 1 continue to Part 2.
b Item 11 includes a prescreening question, “Did you have your first VA mental health appointment in the past 12 months?” Only those who respond “Yes” proceed to Part 1 of the item.
c Items 33 and 43 do not start with the clause “In the past 12 months.”
d Item 43 includes a follow-up free-response question, “Which technologies have caused you the most concern about your mental healthcare privacy?”
Contributor Information
Jeffrey M. Pyne, Central Arkansas Veterans Healthcare System, North Little Rock, Arkansas, and University of Arkansas for Medical Sciences.
P. Adam Kelly, Southeast Louisiana Veterans Health Care System, New Orleans, Louisiana, and Tulane University.
Ellen P. Fischer, Central Arkansas Veterans Healthcare System, North Little Rock, Arkansas, and University of Arkansas for Medical Sciences.
Christopher J. Miller, VA Boston Healthcare System, Boston, Massachusetts, and Harvard Medical School.
Patricia Wright, University of Arkansas for Medical Sciences.
Kara Zamora, San Francisco VA Health Care System, San Francisco, California.
Christopher J. Koenig, San Francisco VA Health Care System, San Francisco, California, and San Francisco State University.
Regina Stanley, Central Arkansas Veterans Healthcare System, North Little Rock, Arkansas.
Karen Seal, San Francisco VA Health Care System, San Francisco, California.
James F. Burgess Jr., VA Boston Healthcare System, Boston, Massachusetts, and Boston University.
John C. Fortney, VA Puget Sound Health Care System, Seattle, Washington, and University of Washington.
References
- Ajzen I (1991). The theory of planned behavior. Organizational Behavior and Human Decision Processes, 50, 179–211. 10.1016/0749-5978(91)90020-T [DOI] [Google Scholar]
- Bashshur RL, Shannon GW, Bashshur N, & Yellowlees PM (2015). The Empirical Evidence for Telemedicine Interventions in Mental Disorders. Telemedicine and e-Health, 22, 87–113. 10.1089/tmj.2015.0029 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bauer MS, Williford WO, McBride L, McBride K, & Shea NM (2005). Perceived barriers to health care access in a treated population. The International Journal of Psychiatry in Medicine, 35, 13–26. 10.2190/2FU1D5-8B1D-UW69-U1Y4 [DOI] [PubMed] [Google Scholar]
- Blay E Jr., DeLancey JO, Hewitt DB, Chung JW, & Bilimoria KY (2017). Initial public reporting of quality at Veterans Affairs vs non-Veterans Affairs hospitals. Journal of the American Medical Association Internal Medicine, 177, 882–885. 10.1001/jamainternmed.2017.0605 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Bradley KA, DeBenedetti AF, Volk RJ, Williams EC, Frank D, & Kivlahan DR (2007). AUDIT-C as a brief screen for alcohol misuse in primary care. Alcoholism: Clinical and Experimental Research, 31, 1208–1217. 10.1111/j.1530-0277.2007.00403.x [DOI] [PubMed] [Google Scholar]
- Center for Substance Abuse Treatment. (1999). Enhancing motivation for change in substance abuse treatment (Treatment Improvement Protocol [TIP] Series, No. 35 HHS Publication No. [SMA] 13–4212). Rockville, MD: Substance Abuse and Mental Health Services Administration; Retrieved from https://store.samhsa.gov/shin/content/SMA13-4212/SMA13-4212.pdf [PubMed] [Google Scholar]
- Clement S, Brohan E, Jeffery D, Henderson C, Hatch SL, & Thornicroft G (2012). Development and psychometric properties the Barriers to Access to Care Evaluation scale (BACE) related to people with mental ill health. BMC Psychiatry, 12, 36 10.1186/1471-244X-12-36 [DOI] [PMC free article] [PubMed] [Google Scholar]
- Creswell JW (2013). Research design: Qualitative, quantitative, and mixed methods approaches (4th ed.). Thousand Oaks, CA: Sage. [Google Scholar]
- Cunningham WE, Hays RD, Williams KW, Beck KC, Dixon WJ, & Shapiro MF (1995). Access to medical care and health-related quality of life for low-income persons with symptomatic human immunodeficiency virus. Medical Care, 33, 739–754. 10.1097/00005650-199507000-00009 [DOI] [PubMed] [Google Scholar]
- Department of Veterans Affairs. (2014a). Independent 2013 survey shows veterans highly satisfied with VA care higher rating than private-sector hospitals on average (News Release) Retrieved from https://www.va.gov/opa/pressrel/pressrelease.cfm?id=2537
- Department of Veterans Affairs. (2014b). Publication of wait-times for the Department for the Veterans Choice Program [Notice]. Federal Register, 79, 65771–65773. Retrieved from https://www.federalregister.gov/articles/2014/11/05/2014-26274/publication-of-wait-times-for-the-department-for-the-veterans-choice-program
- Department of Veterans Affairs. (2014c). Veterans Access, Choice, and Accountability Act of 2014 Section 204: Improvement of access of veterans to mobile vet centers and mobile medical centers of the Department of Veterans Affairs [Fact Sheet]. Retrieved from https://www.va.gov/opa/choiceact/documents/Fact-Sheet-Mobile-Vet-Centers.pdf
- Department of Veterans Affairs. (2016). Outpatient scheduling processes and procedures (VHA Directive 1230). Retrieved from https://www.va.gov/vhapublications/ViewPublication.asp?pub_ID=3218
- Department of Veterans Affairs. (2017a). Strategic Analytics for Improvement and Learning (SAIL) value model measure definitions. Retrieved from https://www.va.gov/QUALITYOFCARE/measure-up/SAIL_definitions.asp
- Department of Veterans Affairs. (2017b). What veterans say about access to care. Retrieved from https://www.accesstoshep.va.gov/Main/Results
- Eakin EG, & Strycker LA (2001). Awareness and barriers to use of cancer support and information resources by HMO patients with breast, prostate, or colon cancer: Patient and provider perspectives. Psycho-Oncology, 10, 103–113. 10.1002/pon.500
- Fortney JC, Burgess JF Jr., Bosworth HB, Booth BM, & Kaboli PJ (2011). A re-conceptualization of access for 21st century healthcare. Journal of General Internal Medicine, 26(Suppl. 2), 639–647. 10.1007/s11606-011-1806-6
- Fortney JC, Tripathi SP, Walton MA, Cunningham RM, & Booth BM (2011). Patterns of substance abuse treatment seeking following cocaine-related emergency department visits. The Journal of Behavioral Health Services & Research, 38, 221–233. 10.1007/s11414-010-9224-9
- Grubaugh AL, Cain GD, Elhai JD, Patrick SL, & Frueh BC (2008). Attitudes toward medical and mental health care delivered via telehealth applications among rural and urban primary care patients. Journal of Nervous and Mental Disease, 196, 166–170. 10.1097/NMD.0b013e318162aa2d
- Hall WA, Long B, Bermbach N, Jordan S, & Patterson K (2005). Qualitative teamwork issues and strategies: Coordination through mutual adjustment. Qualitative Health Research, 15, 394–410. 10.1177/1049732304272015
- Hoge CW, Castro CA, Messer SC, McGurk D, Cotting DI, & Koffman RL (2004). Combat duty in Iraq and Afghanistan, mental health problems, and barriers to care. The New England Journal of Medicine, 351, 13–22. 10.1056/NEJMoa040603
- Keen SM, Kutter CJ, Niles BL, & Krinsley KE (2008). Psychometric properties of PTSD Checklist in sample of male veterans. Journal of Rehabilitation Research and Development, 45, 465–474. 10.1682/JRRD.2007.09.0138
- Kehle SM, Greer N, Rutks I, & Wilt T (2011). Interventions to improve veterans’ access to care: A systematic review of the literature. Journal of General Internal Medicine, 26(Suppl. 2), 689–696. 10.1007/s11606-011-1849-8
- Larsen DL, Attkisson CC, Hargreaves WA, & Nguyen TD (1979). Assessment of client/patient satisfaction: Development of a general scale. Evaluation and Program Planning, 2, 197–207. 10.1016/0149-7189(79)90094-6
- Lingley-Pottie P, & McGrath PJ (2011). Development and initial validation of the treatment barrier index scale: A content validity study. Advances in Nursing Science, 34, 151–162. 10.1097/ANS.0b013e3182186cc0
- Linstone HA, & Turoff M (1975). The Delphi method: Techniques and applications. Reading, MA: Addison-Wesley.
- MacQueen KM, McLellan E, Kay K, & Milstein B (1998). Codebook development for team-based qualitative analysis. Cultural Anthropology Methods, 10, 31–36.
- Miller CJ, Burgess JF Jr., Fischer EP, Hodges DJ, Belanger LK, Lipschitz JM,…Pyne JM (2017). Practical application of opt-out recruitment methods in two health services research studies. BMC Medical Research Methodology, 17, 57. 10.1186/s12874-017-0333-5
- Miller CJ, McInnes DK, Stolzmann K, & Bauer MS (2016). Interest in use of technology for healthcare among veterans receiving treatment for mental health. Telemedicine and e-Health, 22, 847–854. 10.1089/tmj.2015.0190
- Muhr T, & Friese S (2004). User’s manual for ATLAS.ti 5.0 (2nd ed.). Berlin, Germany: Scientific Software Development. Retrieved from https://socthesis.fas.harvard.edu/files/socseniorthesis/files/atlas_manual.pdf
- Ouimette P, Vogt D, Wade M, Tirone V, Greenbaum MA, Kimerling R,…Rosen CS (2011). Perceived barriers to care among Veterans Health Administration patients with posttraumatic stress disorder. Psychological Services, 8, 212–223. 10.1037/a0024360
- Park RE, Fink A, Brook RH, Chassin MR, Kahn KL, Merrick NJ,…Solomon DH (1986). Physician ratings of appropriate indications for six medical and surgical procedures. American Journal of Public Health, 76, 766–772. 10.2105/AJPH.76.7.766
- Pepin R, Segal DL, & Coolidge FL (2009). Intrinsic and extrinsic barriers to mental health care among community-dwelling younger and older adults. Aging & Mental Health, 13, 769–777. 10.1080/13607860902918231
- Pew Research Center. (2017a). Internet/broadband fact sheet. Retrieved from http://www.pewinternet.org/fact-sheet/internet-broadband/
- Pew Research Center. (2017b). Mobile fact sheet. Retrieved from http://www.pewinternet.org/fact-sheet/mobile/
- Pyne JM, Asch SM, Lincourt K, Kilbourne AM, Bowman C, Atkinson H, & Gifford A (2008). Quality indicators for depression care in HIV patients. AIDS Care, 20, 1075–1083. 10.1080/09540120701796884
- Ryan GW, & Bernard HR (2003). Techniques to identify themes. Field Methods, 15, 85–109. 10.1177/1525822X02239569
- Saldaña J (2015). The coding manual for qualitative researchers (3rd ed.). Thousand Oaks, CA: Sage.
- Spitzer RL, Kroenke K, & Williams JB (1999). Validation and utility of a self-report version of PRIME-MD: The PHQ primary care study. Journal of the American Medical Association, 282, 1737–1744. 10.1001/jama.282.18.1737
- Spitzer RL, Kroenke K, Williams JBW, & Löwe B (2006). A brief measure for assessing generalized anxiety disorder: The GAD-7. Archives of Internal Medicine, 166, 1092–1097. 10.1001/archinte.166.10.1092
- Tanielian T, & Jaycox LH (Eds.). (2008). Invisible wounds of war: Psychological and cognitive injuries, their consequences, and services to assist recovery. Santa Monica, CA: RAND Corporation. Retrieved from https://www.rand.org/content/dam/rand/pubs/monographs/2008/RAND_MG720.pdf
- Wright SM, Craig T, Campbell S, Schaefer J, & Humble C (2006). Patient satisfaction of female and male users of Veterans Health Administration services. Journal of General Internal Medicine, 21(Suppl. 3), S26–S32. 10.1111/j.1525-1497.2006.00371.x