Author manuscript; available in PMC 2014 Nov 18.
Published in final edited form as: Nurs Res. 2013 Nov-Dec;62(6):383–393. doi: 10.1097/NNR.0000000000000005

Patient-Centered Communication and Health Assessment with Youth

Kristy K Martyn 1,, Michelle L Munro 2, Cynthia S Darling-Fisher 3, David L Ronis 4, Antonia M Villarruel 5, Michelle Pardee 6, Hannah Faleer 7, Nicole M Fava 8
PMCID: PMC4235338  NIHMSID: NIHMS636097  PMID: 24165214

Abstract

Background

Patient-centered communication is the hallmark of care that incorporates the perspective of patients to provide tailored care that meets their needs and desires. However, patient-provider communication involving youth has received limited evaluation to date.

Objectives

This report presents results from a secondary analysis of data obtained during a participatory research-based randomized controlled trial designed to test a sexual risk event history calendar (EHC) intervention with youth. Two research questions were addressed: (a) Based on the EHC’s inclusion of contextual factors, does the EHC demonstrate improved communication outcomes (i.e., amount, satisfaction, mutuality, client involvement, client satisfaction, patient-provider interaction, and patient-centeredness) compared with the Guidelines for Adolescent Preventive Services (GAPS) tool? and (b) How do patients and providers describe the characteristics of each tool with regard to patient-centered communication?

Method

A sequential explanatory mixed methods approach was used to evaluate communication. A split-plot design with one between-subjects factor (i.e., communication structure: EHC versus GAPS) and one within-subjects factor (i.e., time: pretest versus posttest) was used for analyses of data collected from male and female youth (n=186) and providers (n=9). Quantitative analysis of survey data evaluated changes in communication from pretest to posttest. Qualitative data collected from open-ended questions, audiotaped visits, and exit interviews were used to enhance interpretation of the quantitative findings.

Results

Patient-centered communication using assessment tools (EHC and GAPS) with youth demonstrated improved communication outcomes both quantitatively and qualitatively. Additional analyses with subgroups of males and Arab-Americans demonstrated better post-intervention scores among the EHC group in certain aspects of communication. Qualitative results revealed that the EHC demonstrated improved outcomes in the four components of patient-centered communication: validation of the patient’s perspective; viewing the patient within context; reaching a shared understanding of needs and preferences; and helping the patient share power in the healthcare interaction.

Discussion

Though both tools provided a framework from which to conduct a clinical visit, the integrated time-linked assessment captured by the EHC enhanced the patient-centered communication in select groups compared to GAPS.

Keywords: patient-centered communication, youth, event history calendars


During the developmental period spanning adolescence and young adulthood, patterns of health promotion and risk behavior are established that can affect short- and long-term health outcomes. Recent research underscores that adolescence (ages 11-17), emerging adulthood (ages 18-25), and young adulthood (ages 26-30s) are key timeframes for health promotion and risk reduction among youth (Arnett, 2007; Ozer, Urquhart, Brindis, Park, & Irwin, 2012; Sawyer et al., 2012). However, the ability of healthcare providers to communicate with and engage these youth regarding their healthcare needs remains limited by time constraints and provider comfort (Ozer et al., 2012). New approaches are needed to facilitate communication between youth and providers. This study reports on an innovative health history tool and its potential for enhancing patient-centered communication and improving clinical practice.

Background

Patient-Centered Communication

Patient-centered care focuses on the patient and their experience of health and illness rather than centering on the disease, technology, or the provider (Stewart, 2001). This requires: (a) understanding the illness from a broad psychosocial context; (b) taking into consideration the patient’s experience of illness; (c) promoting an equal relationship between the provider and patient; (d) forming a therapeutic alliance; and (e) being aware of the provider’s influence on the healthcare interaction in order to tailor care to the patient’s individual needs (Mead & Bower, 2000). Communication is an essential component of this process.

The Institute of Medicine (IOM) identifies seven basic principles and expectations of patient-provider communication, which are considered to be essential for attaining quality patient-provider communication (Paget et al., 2011). These include respect, mutual goals, a supportive environment, shared decision making, correct information, transparency, and a process of continuous learning (Paget et al., 2011). Despite the use of patient-centered communication by nurses for centuries, the modern healthcare environment has now begun to emphasize patient-centered communication as an integral component of high quality healthcare that should be used by all healthcare providers (IOM, 2001). In fact, patient-centered care has become a national imperative, underscoring the importance of patient-provider communication, patient participation in healthcare decisions, and healthcare that integrates patients’ values and preferences (IOM, 2008; Paget et al., 2011).

Communication within the patient-centered visit is central and can encompass verbal behaviors such as avoiding interruptions, encouraging patient participation, and offering support, as well as non-verbal cues like maintaining eye contact and avoiding distracting movements (Epstein & Street, 2007). Research on patient-centered approaches to care and communication has shown improved patient outcomes in adherence, emotional health, physical function, recovery, and physiological outcomes (Sequist et al., 2008; Stewart, 1995; Stewart et al., 1999). Use of patient-centered approaches has also demonstrated improved patient and provider satisfaction, resulting in more positive patient-provider relationships (Little et al., 2001; Stewart et al., 1999). Although patient-centered communication has the potential to increase visit length, it has been associated with lower diagnostic testing expenditures and lower total standardized expenditures (Epstein et al., 2005a).

Healthcare Assessment Tools for Youth

Several tools have been developed to improve healthcare interactions with youth with a focus on health promotion and prevention. These include the Guidelines for Adolescent Preventive Services (GAPS), the current standard of care, and the event history calendar (EHC). The GAPS was developed to provide a screening tool for routine evaluation of adolescent psychosocial problems, health risk behaviors, and biomedical problems (Low, 2003), and it provides a framework for organizing preventive healthcare recommendations (American Medical Association, 1997; Levenberg, 1998). The GAPS is a four-page self-administered questionnaire of 60-72 yes-no items that serves as a comprehensive assessment of the leading causes of morbidity among adolescents (Levenberg, 1998). It is often considered a model for adolescent health assessment and a gold standard comprehensive screening tool (Low, 2003; Yi, Martyn, Salerno, & Darling-Fisher, 2009).

The EHC provides an alternative to the GAPS. It is a structured yet flexible approach to interviewing that facilitates recall of past events by using past experiences as cues to remembering. The EHC collects social and risk behavior information in horizontal rows across a four-year period, with vertical columns representing the past two years, the current year, and the future. In completing the EHC history, participants use open-ended questions about context and risk behaviors, autobiographical memory cues, and retrieval cycles that encourage reflection on their time-linked, integrated sexual risk history and prepare them to discuss their risk behavior history with the provider (Martyn & Belli, 2002). Past research has demonstrated that, compared with traditional survey methods, EHCs improve data quality, use of retrieval cues, cognitive abilities, and conversational engagement (Belli, Lee, Stafford, & Chow, 2004). High agreement has been found when comparing retrospective reports on EHCs to survey reports obtained one (Belli, Shay, & Stafford, 2001), five (Freedman et al., 1988), and 18 years earlier (Furstenberg et al., 1987). The EHC has also been recommended as a tool for clinical assessment of health risk patterns and triggers (Caspi et al., 1996; Martyn & Martin, 2003; Martyn, Reifsnider, & Murray, 2006). Thus, EHCs have generated quality data about activities, behaviors, events, and transitions occurring over extensive time periods.

Theoretical Framework

The overarching framework for this study is the Interaction Model of Client Health Behavior (IMCHB), which was developed by Cox (1982) as a way to assess patient-provider interactions initiated by the patient. The underlying assumption of the model is that the patient has control to make decisions regarding their healthcare, and the healthcare interaction is assessed with regard to three key constructs: elements of patient singularity, elements of the patient-professional interaction, and elements of health outcomes (Cox, 1982). Within each construct of the IMCHB, the patient’s role is to: (a) see their own risk behavior and goals in life context and compare their behavior change over time; (b) perceive that the provider understands them; and (c) mutually identify strengths, risks, and solutions with the provider. The provider’s role includes: (a) identifying the patient’s behavior and goals in context to determine content of communication; (b) reviewing and clarifying the patient’s history to identify needs and frame meaningful messages; and (c) mutually identifying strengths, risks, and solutions with the patient. This framework therefore supports the processes and outcomes that would result from a successful patient-clinician encounter, including: (a) confirming the patient’s perspective; (b) recognizing the patient within context; (c) achieving a common understanding of the patient’s problem and desired outcomes; and (d) mutual participation in decision-making (Epstein & Street, 2007).

Purpose

Despite the call to implement patient-centered care and communication, there are challenges in measuring this construct. Communication during the healthcare visit must be evaluated using objective descriptions of communication in the patient-provider interaction and must also include the subjective experiences of patients and providers during the interaction (Epstein et al., 2005b). Additionally, there are limited data on the effectiveness of interventional tools to improve patient-centered communication among youth. Previous work with adolescents has evaluated the interaction between an adolescent patient and their healthcare provider (APPIS; Woods et al., 2006), while past work with adults has investigated the patient’s perception of patient-centeredness (PPPC; Stewart, Meredith, Ryan, & Brown, 2004). Moreover, the EHC has demonstrated improved communication scores about sexual risk behaviors in the domains of amount, satisfaction, support, client involvement in decision-making, and client satisfaction with interpersonal style due to the inclusion of open-ended responses and contextually-relevant cues (Martyn et al., 2012a). However, there is a gap in the literature evaluating patient-centered communication among youth during health promotion visits using interventional tools and both objective and subjective reports.

The purpose of this report is to evaluate patient-centered communication during adolescent and emerging adult health promotion visits, using a sequential explanatory mixed methods design to evaluate communication before and after a clinical interaction using the GAPS or EHC. This manuscript reports results from a secondary analysis of data obtained during a participatory research-based randomized controlled trial (PR-RCT) designed to test a sexual risk EHC intervention with youth, addressing the following research questions: (a) Based on the EHC’s inclusion of contextual factors, does the EHC demonstrate improved communication outcomes (i.e., amount, satisfaction, mutuality, client involvement in decision-making, client satisfaction with interpersonal style, patient-provider interaction, and patient-centeredness) compared with the GAPS? and (b) How do patients and providers describe the characteristics of each tool with regard to patient-centered communication? This work supplements the primary goal of the present study, to evaluate sexual risk behavior outcomes, and extends our understanding of patient-centered communication among youth during health promotion visits.

Methods

Design

The present study was a secondary analysis of patient-provider communication data from a PR-RCT designed to evaluate the potentially differential effect of the EHC and GAPS clinical assessments on male and female youths’ cognitive appraisal of risk, sexual risk behavior and intentions, and quality of communication with providers. The study was designed, conducted, and evaluated using a community-based participatory framework in which participants and clinic staff were incorporated as partners throughout the research process (Trinh-Shevrin et al., 2007). Institutional review board approval was obtained from each institution involved with data collection, and a Certificate of Confidentiality was obtained. With the Certificate of Confidentiality, researchers cannot be forced to share information that may identify participants. However, the Certificate of Confidentiality cannot be used to resist a demand for information from personnel of the United States Government for auditing or evaluation of federally funded projects, nor does it prevent the researchers from reporting to authorities to prevent serious harm to participants or others. This manuscript uses a sequential explanatory mixed methods approach, with the qualitative aspect of the study embedded in the quantitative design, to evaluate patient-centered communication. A split-plot design with one between-subjects factor (i.e., communication structure: EHC versus GAPS) and one within-subjects factor (i.e., time: pretest versus posttest) was used for analyses.

Recruitment and Procedures

Sample and setting

The sample for this study was obtained from three health clinics in the Midwest: (a) a university health center; (b) a sexually transmitted infection (STI) testing clinic; and (c) a community health center that primarily provides healthcare to Arab-Americans. Inclusion criteria were: adolescents or emerging adults (15-27 years old) who were new patients to the clinic, or providers (nurse practitioners or physicians) with adolescent health experience; all participants had to be able to speak and read English. We report on the communication patterns of the full sample of participants: youth (n=186) and providers (n=9) within these clinics.

Randomization

At the beginning of the study, providers were randomized within each clinic to either the EHC or GAPS condition. Thus, each clinic had at least one EHC provider and one GAPS (control) provider (i.e., a provider who would use that particular instrument). One clinic had two EHC providers enrolled from the outset because both worked part-time. Additionally, during the course of the study, two EHC providers left their respective practices; the new providers were integrated into the study based on the randomization of the original provider.

Further randomization occurred at the participant level, whereby participants were randomly assigned within clinic to minimize selection biases associated with providers. Random assignment to providers was made at the time the appointment was scheduled or when the participant agreed to enroll (e.g., if the participant appeared at a walk-in clinic). The assignment was made using a computer-generated sequence blocked by time and stratified by gender. Assignments were sealed in two stacks of envelopes, one for males and one for females, so that the participant’s condition assignment remained unknown until the instruments were completed by the participant. Early in the study, if the assigned provider was not available, the patient was rescheduled or considered a lost case (n=2; Martyn et al., 2013a). However, given time constraints for enrollment toward the end of the study, individuals seeking walk-in appointments who expressed interest in participating were attended to by the first available provider, regardless of randomization procedures. This resulted in the EHC group having a larger sample size, a consequence the research team was aware of but chose to accept given the value of enrolling more participants.
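To make the allocation scheme concrete, the sketch below generates a blocked, gender-stratified assignment sequence of the kind that could be printed and sealed into two envelope stacks. This is a minimal illustration only; the block size, seeds, and stack sizes are assumptions, not the parameters actually used in the study.

```python
import random

def blocked_assignments(n_blocks, block_size=4, arms=("EHC", "GAPS"), seed=0):
    """Generate a blocked allocation sequence with equal arms within each block."""
    rng = random.Random(seed)
    sequence = []
    per_arm = block_size // len(arms)
    for _ in range(n_blocks):
        block = list(arms) * per_arm   # balanced block, e.g., EHC, GAPS, EHC, GAPS
        rng.shuffle(block)             # randomize order within the block
        sequence.extend(block)
    return sequence

# One sealed-envelope stack per gender stratum (stack sizes are illustrative).
envelopes = {
    "male": blocked_assignments(n_blocks=20, seed=1),
    "female": blocked_assignments(n_blocks=28, seed=2),
}
print(envelopes["male"][:8])   # first eight envelopes in the male stack
```

Blocking keeps the two arms approximately balanced over enrollment time, and stratifying by gender keeps them balanced within each gender, consistent with the procedure described above.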

Procedures for providers

Providers, nurse practitioners (n=6) and physicians (n=3), were recruited from each of the three locations. Consent was obtained from all providers, who were then randomly assigned to, and trained on, either the EHC or GAPS condition as the tool used to review the patient’s risk behavior history during the clinic visit. Providers completed a pre-training survey that assessed past experience with assessment tools and their usual communication with youth in regard to amount, satisfaction, patient-provider interaction, and patient-centeredness. After each patient visit, providers completed a post-intervention survey related to treatment recommendations and the providers’ perceptions of the quality of communication. Finally, at the completion of the study, providers completed a post-study survey and interview to evaluate their perceptions about the patient-provider communication (i.e., amount, satisfaction, patient-provider interaction, and patient-centeredness) and the clinical assessment using their respective history tools. Providers were reimbursed at a rate of $60/hour with a gift card at two time points: (a) after completion of the study training (2 hours) and (b) at the end of study procedures during their exit interview (1 hour).

Procedures for youth

Youth participants were recruited via posted flyers in the clinics and affiliated schools. The clinic staff assisted with recruiting by telling new clients and students about the study and providing information packets and contact information. The research team members were notified by clinic staff of eligibility and reviewed the study procedures and consent with each participant. Assent for those under age 18, or consent for those 18 and older, was obtained from all youth participants. A waiver of parental consent was obtained for those under age 18.

Youth participants completed a pre-intervention questionnaire that collected information about their risk behaviors and sexual activity (using items from the Youth Risk Behavior Surveillance Survey, YRBSS; CDC, 2009), past communication with healthcare providers (amount, satisfaction, mutuality, client involvement in decision-making, client satisfaction with interpersonal style, patient-provider interaction, and patient-centeredness), intentions, risk perceptions, attitudes, and open-ended risk perception questions about using the assessment tool during the visit. The participants were then randomized to one of two interventions, which involved completing either the EHC or the GAPS questionnaire. The EHC intervention consisted of a single patient-provider encounter (10-15 minutes), during which the patient completed the self-administered EHC and the provider then tailored communication with the patient about the risk behaviors identified on the EHC. Participants assigned to the control condition completed a standard self-administered GAPS history in preparation for a single patient-provider encounter (10-15 minutes). After the interaction with the provider, participants completed a post-intervention questionnaire that again assessed risk behaviors and sexual activity, current communication with the healthcare provider (amount, satisfaction, mutuality, client involvement in decision-making, client satisfaction with interpersonal style, patient-provider interaction, and patient-centeredness), intentions, risk perceptions, attitudes, and open-ended risk perception questions about using the assessment tool during the visit. We audiotaped randomly selected visits during the study to ensure fidelity of the research protocol. All participants involved in the audiotaped visits gave consent and were aware that the visit was being recorded. This practice has the potential to change provider behavior during the recorded visits; however, this risk was considered minimal because only 5.9% of visits were audiotaped. Participants received a $25 gift certificate after the study clinic visit.

Measures

Quantitative

Past work using EHCs with adolescents has demonstrated improved communication scores on five scales constructed by Martyn and colleagues (2012a) to evaluate amount of communication, satisfaction with communication, mutuality of communication, client involvement in decision-making, and client satisfaction with interpersonal style. The independent variables included in this study were the youth participant’s intervention group (EHC versus GAPS) and descriptive demographic measures (e.g., age, race, ethnicity, gender, education, employment, living situation, health insurance, and past healthcare interactions).

The dependent variables in this study were the seven communication scales: (a) amount of communication; (b) satisfaction with communication; (c) mutuality of communication; (d) client involvement in decision-making; (e) client satisfaction with interpersonal style; (f) patient-provider interaction; and (g) patient perception of patient-centeredness. This study utilized modified versions of the communication scales originally used by Martyn and colleagues (2012a), in which problematic items, or items that lowered reliability in the previous study, were revised or removed. All items are scored on a Likert response scale from 1 to 5, with a score of 5 indicating maximum quantity (e.g., amount of communication) or quality (e.g., satisfaction with communication).

We also used two pre-existing scales: the Adolescent Patient-Provider Interaction Scale (APPIS; Woods et al., 2006) and the Patient-Perception of Patient-Centeredness Questionnaire (PPPC; Stewart et al., 2004). The APPIS was developed from the Roter and Hall model of patient-provider interaction (Woods et al., 2006). It is a 9-item scale, eight items of which are measured on a Likert scale, designed to measure patient-provider communication and empowerment, with good internal consistency (Cronbach’s alpha = .83). The remaining item asks who made the decisions in the interaction and is scored 1 = both, 2 = you, 3 = provider, 4 = neither, 5 = don’t know; this item was analyzed independently. The PPPC exists in 14-item and 9-item versions. The 14-item questionnaire has demonstrated good reliability (Cronbach’s alpha = .71), and its validity has previously been established through significant correlations with patient health outcomes (Stewart et al., 2004). For our study, we used the 14-item PPPC as well as two additional questions that matched the constructs within the 14-item PPPC (see Table 1 for conceptual definitions, number of items, and reliabilities of the scales and Table 2 for bivariate scale correlations).

Table 1.

Communication Scales

Name of Scale Conceptual Definitions Number of Items Pre-test Reliability Post-test Reliability

Amount of Communication How much a patient talked to their healthcare provider about certain topics such as risk behaviors, sexual behaviors, opinions, and general health. 14 .91 .93
Example: “How much have you talked with the provider about your thoughts, opinions, or concerns about your past life?”

Satisfaction with Communication How the patient felt about the visit overall including such topics as time spent with the provider, how the provider listened, and things that were important to the patient to discuss. 7 .87 .95
Example: “The provider talked to me about things that were important to me.”

Mutuality of Communication Measured components of supportive communication including empowerment, responsiveness to the patient’s problems, and explanation of processes of care. 11 .86 .82
Example: “The provider told me what I could do to take care of myself.”

Client Involvement in Decision-Making Explored the equality of the exchange between the healthcare provider and patient. 5 .84 .81
Example: “The provider asked me if I felt comfortable following advice that they gave me.”

Client Satisfaction with Interpersonal Style Explored how the provider treated the patient. 9 .75 .69
Example: “The provider made sure I understood what they were saying before going on.”

Adolescent Patient-Provider Interaction Scale Explored what occurred during the patient-provider interaction, including communication/interaction quality items (i.e., listen, explain, respect, control) and satisfaction with communication. 9 (8 Likert Scale) .81 .84
Example: “There was equal exchange of information with the healthcare provider.”

Patient Perception of Patient-Centeredness Focused on patient-centered communication within the clinic visit. 16 .90 .89
Example: “The provider listened carefully to me.”
Table 2.

Correlation Matrix of Communication Scales

Amount of Communication Satisfaction with Communication Mutuality of Communication Client Involvement in Decision-Making Client Satisfaction with Interpersonal Style Adolescent Patient-Provider Interaction Scale Patient Perception of Patient-Centeredness
Amount of Communication -
Satisfaction with Communication r=.375, p<.001 -
Mutuality of Communication r=.342, p<.001 r=.761, p<.001 -
Client Involvement in Decision-Making r=.252, p=.001 r=.575, p<.001 r=.663, p<.001 -
Client Satisfaction with Interpersonal Style r=.362, p<.001 r=.730, p<.001 r=.767, p<.001 r=.673, p<.001 -
Adolescent Patient-Provider Interaction Scale r=.294, p<.001 r=.745, p<.001 r=.889, p<.001 r=.646, p<.001 r=.829, p<.001 -
Patient Perception of Patient-Centeredness r=.453, p<.001 r=.851, p<.001 r=.838, p<.001 r=.875, p<.001 r=.805, p<.001 r=.780, p<.001 -
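For readers less familiar with the internal-consistency coefficients reported in Table 1 and in the scale descriptions above, the following sketch shows one common way Cronbach’s alpha is computed from a respondents-by-items matrix. The toy responses are invented for illustration and are not study data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items matrix (rows = respondents)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                               # number of items in the scale
    item_vars = items.var(axis=0, ddof=1).sum()      # sum of the item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of the scale totals
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Toy Likert responses (1-5) from five respondents on a four-item scale.
responses = [[4, 5, 4, 5],
             [3, 3, 4, 3],
             [2, 2, 3, 2],
             [5, 4, 5, 5],
             [3, 4, 3, 4]]
print(round(cronbach_alpha(responses), 2))
```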

Qualitative

Open-ended responses obtained from the history tools (EHC and GAPS) and from survey measures were utilized for qualitative analysis. Open-ended questions from the survey measures assessed participants’ thoughts about their health, their worries, the good things in their life, and what would happen if their life (and sexual behavior) continued as it is; there was also space for additional comments. Transcriptions from patient-provider interactions (n=11) and from the providers’ exit interviews (n=7) were assessed. All qualitative analyses were undertaken to address both research questions: (a) Based on the EHC’s inclusion of contextual factors, does the EHC demonstrate improved communication outcomes (i.e., amount, satisfaction, mutuality, client involvement in decision-making, client satisfaction with interpersonal style, patient-provider interaction, and patient-centeredness) compared with the GAPS? and (b) How do patients and providers describe the characteristics of each tool with regard to patient-centered communication?

Data analyses

Quantitative

All quantitative analyses were completed using the Statistical Package for the Social Sciences (SPSS) version 19.0. To ensure fidelity of data entry, all of the data were double-entered by two research assistants. Communication scale scores were computed in SPSS using a 75% rule: if a participant completed at least 75% of the scale items, a mean score was computed (Martyn et al., 2012a). Mixed effects analysis of variance (ANOVA) was used to compare mean scale scores between groups at pretest and posttest for those dependent variables that met the necessary statistical assumptions, including normality. The significance level was set at p<.007 based on a Bonferroni correction.
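A brief sketch of the scoring rule and the Bonferroni threshold described above is given below. The analyses were actually run in SPSS; the Python code, column names, and generated data here are hypothetical and serve only to illustrate the logic.

```python
import numpy as np
import pandas as pd

def scale_score(df, item_cols, completion_rule=0.75):
    """Mean scale score per participant, computed only when >=75% of items are answered."""
    answered = df[item_cols].notna().mean(axis=1)     # proportion of items completed
    means = df[item_cols].mean(axis=1)                # mean of the answered items
    return means.where(answered >= completion_rule)   # missing when the rule is not met

# Bonferroni-corrected threshold for the seven communication scales: .05 / 7 ≈ .007
print(round(0.05 / 7, 3))

# Hypothetical item columns for the 7-item satisfaction scale.
rng = np.random.default_rng(0)
df = pd.DataFrame(rng.integers(1, 6, size=(10, 7)).astype(float),
                  columns=[f"satisfaction_{i}" for i in range(1, 8)])
df.loc[0, "satisfaction_1"] = np.nan   # 6 of 7 items answered (86%) still meets the rule
print(scale_score(df, list(df.columns)).head())
```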

Additional analyses were completed to examine potential differences between groups by gender and ethnicity, based on past work suggesting that the utility and experience of the assessment tools may differ according to these individual characteristics (Martyn, Saftner, Darling-Fisher, & Schell, 2012b; Martyn, Munro, Tate, Saftner, & Darling-Fisher, 2013b). Following these analyses, data from providers were also analyzed. First, we compared the two provider groups’ (EHC and GAPS) pre-study and post-study communication scores using Wilcoxon signed rank tests. Then we compared the communication scores that providers in the GAPS and EHC groups reported after each clinical interaction with a youth participant (n=186), once the intervention was completed, using paired t-tests.
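The sketch below illustrates the two kinds of provider-level comparisons just described: a Wilcoxon signed rank test on paired pre/post scores and a t-test comparing scores between the two provider groups. The arrays are invented placeholders, the exact pairing structure of the original t-tests is not reproduced, and the actual analyses were run in SPSS.

```python
from scipy import stats

# Hypothetical pre- and post-study scores for one provider group, paired by provider.
pre  = [3.1, 3.4, 3.0, 3.6, 3.2, 3.5]
post = [4.1, 4.3, 3.9, 4.5, 4.0, 4.6]
w, p = stats.wilcoxon(pre, post)
print(f"Wilcoxon signed rank: W={w:.1f}, p={p:.3f}")

# Hypothetical visit-level post-intervention scores for the two provider groups.
gaps_scores = [3.8, 3.7, 3.9, 3.6, 3.8, 3.7, 3.9, 4.0]
ehc_scores  = [3.6, 3.5, 3.7, 3.4, 3.6, 3.5, 3.8, 3.7]
t, p = stats.ttest_ind(gaps_scores, ehc_scores, equal_var=False)   # Welch-type comparison
print(f"t-test: t={t:.2f}, p={p:.3f}")
```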

Qualitative

Qualitative analysis using the constant comparative method (Glaser, 1978, 1992) was employed to facilitate interpretation of the quantitative results using two forms of data: (a) open-ended data from the history tools (i.e., GAPS and EHC) and surveys, and (b) transcriptions from patient-provider interactions and provider exit interviews. The constant comparative method was used to identify themes related to perceptions of EHC and GAPS risk assessment and communication with adolescents. Credibility of qualitative results was ensured during data collection and analysis using audit trails and member checking with participants and with clinician and research colleagues (Lincoln & Guba, 1985).

Results

Sample

The sample of youth (n=186) for this study was a diverse group with an age range of 15-27 years. Youth were split into two developmental age groups for analysis: adolescents (ages 15-17) and emerging adults (ages 18-27). Because the sample included only five participants over the age of 25, these individuals were included in the emerging adult group. Table 3 shows the characteristics of the sample and compares the EHC and GAPS groups. The EHC group was older on average (M=20.06, SD=2.74) than the GAPS group (M=19.28, SD=2.71); t(184)=1.94, p=.054. Consistent with this, the EHC group included more emerging adults than the GAPS group; χ2(1)=5.71, p=.017. As would be expected with an older sample, more EHC participants lived with friends or a partner than their GAPS counterparts; χ2(2)=6.72, p=.035. Notably, the majority of participants (80.7% of GAPS and 77.7% of EHC) had seen a healthcare provider in the last year, indicating that these participants had access to healthcare services and prior interactions with healthcare providers.

Table 3.

Description of the Sample

Variables, % (n) GAPS (n=83) EHC (n=103) t-test or chi-square comparing groups
DEMOGRAPHICS
Mean Age (Standard Deviation) 19.28 (2.71) 20.06 (2.74) t(184)=1.94, p=.054

Age Group
 Adolescent (15-18 years old) 28.9 (24) 14.6 (15) χ2(1)=5.71, p=.017
 Emerging/Young Adult (18 – 27 years old) 71.1 (59) 85.4 (88)

Clinic Type
 STI Clinic 12.0 (10) 7.8 (8) χ2(2)=2.85, p=.241
 Community Health Clinic 39.8 (33) 32.0 (33)
 University Health Clinic 48.2 (40) 60.2 (62)

Gender
 Male 43.4 (36) 39.8 (41) χ2(1)=.24, p=.623
 Female 56.6 (47) 60.2 (62)

Ethnicitya
 Hispanic/Latino 2.5 (2) 7.1 (7) χ2(2)=3.97, p=.137
 Not Hispanic/Latino 55.6 (45) 62.6 (62)
 Arab 42.0 (34) 30.3 (30)

Racea
 Black/African American 28.2 (22) 33.3 (33) χ2(5)=5.06, p=.408
 White 64.1 (50) 62.6 (62)
 Asian 3.8 (3) 3.0 (3)
 American Indian/Alaska Native 1.3 (1) 0.0 (0)
 Native Hawaiian/Pacific Islander 2.6 (2) 0.0 (0)
 Other (Middle Eastern) 0.0 (0) 1.0 (1)

Education
 6-8 grades 1.2 (1) 0.0 (0)
 9th grade 3.6 (3) 0.0 (0) χ2(5)=8.38, p=.136
 10th grade 6.0 (5) 3.9 (4)
 11th grade 16.9 (14) 9.7 (10)
 12th grade (graduated high school)/GED 16.9 (14) 19.4 (20)
 Some college/technical school 55.4 (46) 67.0 (69)

Occupationb
 Student 91.6 (76) 92.2 (95) χ2(1)=.03, p=.868
 Work full-time or part-time 44.6 (37) 41.7 (43) χ2(1)=.15, p=.698
 Unemployed currently 6.0 (5) 8.7 (9) χ2(1)=.49, p=.486

Living Situationa
 Live alone 9.9 (8) 22.3 (23) χ2(2)=6.72, p=.035
 Live with parents/family 55.6 (45) 39.8 (41)
 Live with friends or partner 34.6 (28) 37.9 (39)

Eligible for Subsidized Meals at Schoola
 Yes 50.6 (41) 45.6 (47) χ2(1)=.45, p=.501
 No 49.4 (40) 54.4 (56)

Type of Health Insurancea
 Medicaid 25.3 (20) 18.2 (18) χ2(1)=3.08, p=.379
 Private 34.2 (27) 33.3 (33)
 None 13.9 (11) 23.2 (23)
 I don’t know 26.6 (21) 25.3 (25)

Last Time you Saw a Healthcare Provider
 Within the last year 80.7 (67) 77.7 (80) χ2(1)=1.49, p=.828
 1-2 years ago 9.6 (8) 14.6 (15)
 3-4 years ago 2.4 (2) 1.9 (2)
 More than 5 years ago 2.4 (2) 2.9 (3)
 I do not know 4.8 (4) 2.9 (3)
a Responses may not add up to 100% due to missing data.
b Denotes categories where males and females marked more than one response.

The sample of healthcare providers for this study was a diverse group of eight females and one male. They ranged in age from 34-55 years, with a mean age of 43 (SD=7.10). All self-identified as White, with two reporting Arab ethnicity and seven reporting that they were not Hispanic. The providers’ educational backgrounds included: (a) two providers with a Master of Science degree and Family Nurse Practitioner certification; (b) three providers with a Master of Science in Nursing and Adult Nurse Practitioner certification; (c) one provider with a PhD and Adult Nurse Practitioner certification; and (d) three medical doctors (MDs). Providers had a range of backgrounds and experience, including 3-18 years of experience working with youth (M=9.17 years, SD=5.06). They spent 5-30 minutes with adolescents during their visits (M=21.11 minutes, SD=8.21). Only two providers reported previous training in using assessment tools, specifically the GAPS tool; one of these providers was randomized to the GAPS arm and the other to the EHC arm.

Quantitative

Table 4 displays the descriptive statistics and mixed effects ANOVA results for each scale. Sphericity is assumed in each model because the within-subjects factor has only two levels. There were no significant differences between the two groups (EHC vs. GAPS) in changes in communication scores on any of the seven scales. As presented in Table 4, both the EHC and GAPS youth participants significantly improved communication in all domains (i.e., amount of communication, satisfaction with communication, mutuality of communication, client involvement in decision-making, client satisfaction with interpersonal style, adolescent patient-provider interaction, and patient perception of patient-centeredness). All communication scales demonstrated a significant improvement across time (from pretest to posttest; p<.001). However, there were no significant group-by-time interactions, indicating that both groups improved similarly and that the EHC did not improve communication more than the GAPS. For the question “Who made the decisions during the visit?”, both groups showed an increase in the percentage of participants reporting that the healthcare provider and patient made the decisions together (GAPS: pre 37.3%, post 62.5%; EHC: pre 29.4%, post 67.6%), implying more shared decision making in the intervention visit than in prior visits with healthcare providers.

Table 4.

Comparing Communication between the GAPS and EHC Groups

GAPS n=83 EHC n=103 ANOVA

Measure n Pretest Mean (SD) Posttest Mean (SD) n Pretest Mean (SD) Posttest Mean (SD) Time by group Time
Amount of Communication 80 1.99 (.64) 3.16 (.88) 102 1.89 (.68) 3.25 (.85) Time by group: F(1, 180)=1.64, p=.202
Time: F(1, 180)=334.95, p<.001

Satisfaction with Communication 80 3.63 (.77) 4.35 (.62) 101 3.68 (.60) 4.42 (.64) Time by group: F(1, 179)=.004, p=.948
Time: F(1, 179)=144.76, p<.001

Mutuality of Communication 80 3.27 (.63) 3.67 (.48) 102 3.27 (.55) 3.73 (.53) Time by group: F(1, 180)=.23, p=.630
Time: F(1, 180)=75.80, p<.001

Client Involvement in Decision Making 79 2.52 (.79) 3.06 (.72) 103 2.38 (.74) 3.07 (.80) Time by group: F(1, 180)=1.12, p=.291
Time: F(1, 180)=69.22, p<.001

Client Satisfaction with Interpersonal Style 79 2.83 (.58) 3.22 (.47) 103 2.89 (.50) 3.22 (.52) Time by group: F(1, 180)=.29, p=.591
Time: F(1, 180)=56.31, p<.001

Adolescent Patient-Provider Interaction Scale 80 3.80 (.76) 4.30 (.57) 101 3.83 (.61) 4.41 (.63) Time by group: F(1, 179)=.39, p=.534
Time: F(1, 179)=84.78, p<.001

Patient-Perception of Patient-Centeredness 80 3.14 (.72) 3.90 (.59) 103 3.11 (.63) 4.00 (.69) Time by group: F(1, 181)=1.15, p=.285
Time: F(1, 181)=190.78, p<.001

Additional analyses using mixed effects ANOVA among subgroups demonstrated some noteworthy differences between groups that approached significance. Subgroup analyses were completed with (a) males based on past work that has demonstrated that the EHC increased communication among males and providers (Martyn et al., 2012b) and (b) Arab-American ethnicity based on work that has demonstrated that the EHC also may increase communication among Arab-American youth and providers (Martyn et al., 2013b). Among males (n=76), there was an interaction between intervention groups (EHC and GAPS) and time on the patient-provider interaction scale that approached significance with the EHC group demonstrating a larger increase in patient-provider interaction scores; F(1, 74)=3.49, p=.066. There was also a substantial main effect for time on the patient-provider interaction scale; F(1, 74)=34.26, p<.001. Among Arab-American youth (n=62), there was a significant interaction between intervention groups (EHC and GAPS) and time on the amount of communication scale with the EHC group demonstrating a larger increase in amount of communication scores; F(1, 60)=4.90, p=.031. There was also a substantial main effect for time on the amount of communication scale among Arab-Americans; F(1,60)=128.91, p<.001.
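Because the within-subjects factor here has only two levels (pretest, posttest), the group-by-time interaction tested in these ANOVAs is equivalent to comparing pre-to-post change scores between the two groups. The sketch below illustrates that equivalence with simulated data; the numbers are invented and the actual analyses were run in SPSS.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated pre/post scores for two groups (illustrative values only).
gaps_pre, gaps_post = rng.normal(3.1, 0.6, 83), rng.normal(3.9, 0.6, 83)
ehc_pre, ehc_post = rng.normal(3.1, 0.6, 103), rng.normal(4.0, 0.6, 103)

# The group-by-time interaction reduces to a between-groups test on change scores,
# and the ANOVA F statistic equals the square of the pooled-variance t statistic.
gaps_change = gaps_post - gaps_pre
ehc_change = ehc_post - ehc_pre
t, p = stats.ttest_ind(gaps_change, ehc_change)
print(f"group-by-time interaction: F={t**2:.2f}, p={p:.3f}")
```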

Table 5 presents communication scores from the providers using the GAPS and EHC interventions. Despite the small sample size (n=9), providers in the EHC group demonstrated differences between pre- and post-study that approached significance on the scales Amount of Communication (Wilcoxon Signed Rank Test, p=.068) and Patient-Perception of Patient-Centeredness (Wilcoxon Signed Rank Tests, p=.066). However, further analyses of communication scores using paired t-tests completed post-intervention on all participants (n=186), which are not presented in table format, demonstrated that the GAPS providers reported more collaboration in decision making [M=1.40(SD=.61)] when compared to EHC providers [M=2.35(SD=1.42); t(142.84)=6.11, p<.001]. Additionally, GAPS providers demonstrated higher mean communication scores after the intervention when compared to EHC providers in: (a) Amount of Communication [M=3.78(SD=.55) versus M=3.60(SD=.75); t(181.43)=1.98, p=.049]; (b) Satisfaction with Communication [M=4.47(SD=.37) versus M=4.28(SD=.53); t(178.26)=2.87, p=.005]; and (c) Patient-Perception of Patient-Centeredness [M=4.28(SD=.36) versus M=4.11(SD=.51); t(179.98)=2.57, p=.011].

Table 5.

Communication Scores for Guidelines for Adolescent Preventive Services and Event History Calendar Providers Preintervention and Postintervention

Measure GAPS (n=3) EHC (n=9)
Pretest, M (SD) Posttest, M (SD) pa Pretest, M (SD) Posttest, M (SD) pa
Amount of communication 4.13 (0.21) 4.13 (0.15) 1.00 3.28 (0.40) 4.22 (0.57) .068
Satisfaction with communication 4.06 (0.25) 4.61 (0.35) .180 4.11 (0.29) 4.41 (0.42) .197
Patient perception of patient-centeredness 4.14 (0.13) 4.42 (0.43) .180 3.92 (0.27) 4.40 (0.24) .066
Adolescent patient-provider interaction scale 4.67 (0.14) 4.63 (0.45) .655 4.38 (0.44) 4.56 (0.39) .180

Note. GAPS=Guidelines for Adolescent Preventive Services; EHC=event history calendar; M=mean; SD=standard deviation.

a Wilcoxon signed rank test.

Qualitative

Analysis of patients’ and providers’ descriptions of the assessment tools revealed four themes, referring to the EHC as opposed to the GAPS, that were consistent with the four components of patient-centered care: (a) understanding and validating the patient’s perspective; (b) viewing the patient within their unique contextual factors; (c) reaching a shared understanding of needs and preferences; and (d) helping the patient share power in health decisions (Epstein & Street, 2007). Notably, the patients did not provide any positive feedback about the GAPS tool in their written responses to the open-ended questions that asked for additional information about the study.

Understanding and validation of the patient’s perspective

Patients and providers both described the history tools as serving as a bridge to initiate communication that led to an improvement in understanding the patient’s perspective. For instance, one provider who utilized the GAPS history tool noted in their exit interview that, “It certainly helped… because we asked the questions in a different way, to zero in directly with that person.” Alternatively, providers saw the EHC as a way to understand and validate the patient’s perspective through increased disclosure and communication. As noted by one provider, “I have had a couple cases where somebody on their calendar didn’t disclose something but then when we started talking, it was kind of like having thought about those questions or just the fact that the questions were there made them think, well maybe I can, what the heck maybe I can just ask the provider about it. Even though it wasn’t something they would ordinarily do…” This provider found that the process of completing the calendar helped the patient to think about their history and led them to disclose sensitive information. Validation of patient needs and desires was also illustrated by a male patient in the EHC condition, “…the doctor communication was very good and how a real doctor should be. More communication and more connecting.” Understanding and validation of the patient’s perspective requires openness, disclosure, and meeting patient needs and expectations.

Viewing the patient within their unique context

While discussing the use of the EHC in the clinical visit, both patients and providers recognized factors that provided context and meaning to the patient’s behaviors and demonstrated the good and the bad within their lives. As one female participant noted, “I found it very helpful to write down the calendar of events. It gave me a better perspective of where my life was and where it was headed.”

Providers also noted that looking at the calendar with the participant allowed the provider to highlight the context in which certain behaviors occurred. This is illustrated by one provider’s comment, “I think that was mostly good for patients being able to provide insight as to identifying ‘oh, when I’m drinking I become a little more risky’ type of thing.” Providers appreciated how the EHC helped to relate the patient’s behaviors in context to their goals as part of sexual risk prevention communication. One provider stated, “The way it [EHC] was designed it made it a lot easier to just say, well in the context of everything, you’re saying that you are not using birth control but yet you plan on going to nursing school, and have you thought about how that might happen if you become pregnant? So, it was a nice approach.” Providers also noted that the history gave them a more comprehensive picture of each individual patient with examples of positive behaviors, risky behaviors, and support systems. After completing the EHC with the provider one female participant wrote, “I’m not what I used to be. I’m better and getting even better.” The providers noted that it was refreshing to also discuss the positive behaviors, with one provider proclaiming, “that was my favorite part!”

In comparison, the GAPS tool was viewed as a more general assessment tool that would help the provider identify risk behaviors or health promotion subjects to focus on. However, according to provider descriptions the GAPS tool seemed to lack the contextual details provided by the EHC. One provider specifically noted, “And the GAPS is a general questionnaire. But, at least it would give us an idea. It’s a good guideline, you know, for us to know how is this teenager behaving.”

Reaching a shared understanding of needs and preferences

Patients and providers looked together at the patient’s risk behaviors, support system, and positive behaviors on the EHC while discussing what forms of health promotion and prevention needed to be adopted. Viewing the visual risk history together enhanced cognitive appraisal, affective support, and decisional control in order to identify patient strengths, risks, and solutions. Providers recognized that it is the patient’s needs that really matter, as echoed by one provider who stated, “even though they write it down and it looks like something is supportive, or stressful, or risky, it’s their perception that is really what matters.” The mutual discussion elicited by the EHC allowed the patient and provider to review the decisions together and to then be able to “tailor what their specific needs were.” Recognition of the importance of patient perception was important for facilitating shared understanding.

In contrast, GAPS providers focused on the ability of the tool to gather general information and start a conversation and noted that it lacked detail and the ability to engage participants. Providers who utilized the GAPS during clinic visits also spoke to the need for more direct questions on sexual health and sexual risk behaviors. One provider said, “So, I guess a few questions could have been more geared toward really pinpointing direct questions about their sexual risks that they have taken in the past.”

Helping the patient share the power in health decisions

Both patients and providers engaged in meaningful interaction that allowed for shared power and decision-making regarding health decisions. In order for providers to share power with patients, mutual understanding of patient needs and desires is necessary. Providers noted that the EHC visual display assisted in reaching a shared understanding. For example, one EHC provider shared, “I thought it was helpful for them to look at where they were in their lives and how they have come along. I think it was encouraging to them to see that and to plan it out.” Providers also recognized the patient’s power to choose not to disclose, which influenced decision-making. One provider explained, “I think there were a few cases, some people I got the feeling that they kept certain things to themselves and they were going to, which is their right.” Recognition of a patient’s decisions is important in sharing power. Through using the EHC and talking about risk behaviors with a provider, patients were able to make meaningful choices related to their health. For example, when asked about what would happen if the patient’s sexual behavior remained the same, one female participant wrote, “I will be fine but I will start making my boyfriend wear condoms. Birth control is not enough.”

Providers utilizing the GAPS tool also recognized the importance of allowing patients to share whatever they felt comfortable with. For instance, one GAPS provider noted, “But, I definitely think it makes a difference if they fill it out. You know, we still go over it, but the fact that they have to think about and it starts the thought processes going in their head. And then, they may or may not share it with us.” Providers utilizing the GAPS history tool specifically valued that completing the GAPS before the visit allowed patients to think about their behaviors and what they did or did not want to discuss with the provider.

Discussion

In summary, this in-depth mixed methods analysis of patient-centered communication with youth demonstrated improved communication outcomes using both assessment tools. More specifically, quantitative analysis demonstrated higher post-intervention communication scores with both the GAPS and EHC assessment tools. While the GAPS is a gold standard assessment tool, the EHC has not yet been established as a clinical tool. These results provide support for using the EHC as a comparable assessment tool with adolescents and emerging adults to communicate about their health. Further, although both tools provided a framework from which to conduct a clinical visit, the integrated time-linked assessment captured by the EHC appeared to enhance patient-centered communication in select groups compared with the GAPS. Additional analyses with subgroups of males demonstrated better post-intervention communication scores among the EHC group in patient-provider interaction. This may be related to the fact that most males have never talked to anyone about their sexual history (Martyn et al., 2012b). Past work utilizing the EHC in a male population demonstrated that males found the EHC made it “easy to talk with the NP [because it]… basically puts it all out there and asked what you wanted to find out” (Martyn et al., 2012a, p. 6). These results suggest that the EHC history assessment tool may have greater potential to engage adolescent and emerging adult males, an often hard-to-reach group.

Additional analyses by ethnicity identified that Arab-American participants reported more communication in the post-intervention period when using the EHC. Arab-Americans generally have limited communication regarding risk behaviors (Kridli, 2002; Yosef, 2008); therefore, use of a retrospective history tool may have made it easier for these participants to communicate with healthcare providers. Providing care that is tailored to gender and culture is a hallmark of patient-centered care and communication that is sensitive and considerate of the needs of each individual patient (Wilkerson, Fung, May, & Elliott, 2010).

Qualitative analysis confirmed that the EHC enhanced understanding of patient-centered care and communication. Taken together, these analyses demonstrate that utilizing an assessment tool (i.e., GAPS or EHC) with youth has the potential to improve communication; however, the EHC may be superior in attaining patient-centered communication that allows the provider to reach a shared understanding with patients about their health.

The participants’ opinions about using the GAPS or the EHC differed. The GAPS participants did not comment on their visit with the provider while the EHC participants made specific notes about changing their sexual behavior and the communication they had with the provider in the open-ended risk perception questions. Even though both tools provided a framework from which to conduct a clinical visit, it seems that the contextually linked assessment captured by the EHC enhanced the patient-centered communication between patients and providers.

Limitations

The results presented here focus only on the self-reported outcomes of patient-centered communication and not on the health outcomes of sexual risk behaviors, intentions, and attitudes. Epstein and colleagues (2005b) note that patient-centered care is a complex construct and that care should be taken in linking patient-centered care with distal outcomes that are theoretically grounded. This study took a step forward in evaluating patient-centered communication among a unique group of youth using a mixed methods analysis. Future work should continue to expand upon these findings to explore the link between measures of patient-centered care and more distal outcomes such as health status and utilization (Epstein et al., 2005b).

Additionally, the decision to have each provider use either the GAPS or EHC tool, rather than both, means that the communication tool and provider were confounded. However, because of the training involved in utilizing each tool, it was necessary to have each provider use only one tool. To assess fidelity to the study protocols, each provider had one or two visits audiotaped. All audiotaped visits were assessed for adherence to the history tool’s respective protocol and were found to be adequate. The final limitation was the unequal randomization totals that resulted from constraints of community-based research.

Conclusion

In the report, Adolescent Health Care: Missing Opportunities, the authors note that development, context, and provider skill all matter within the healthcare visit (National Research Council and Institute of Medicine, 2009). Additionally, youth must be engaged in the healthcare visit so that providers can correctly identify their needs and desires. The EHC has demonstrated that it may be an important tool to use during the adolescent and emerging adult years because it captures the social determinants of health and provides a basis from which to educate youth about health behaviors that can have long-lasting effects (Sawyer et al., 2012). This tool therefore has the potential to enhance patient-provider communication to ensure more comprehensive care of the high risk youth population.

Acknowledgments

We gratefully acknowledge grant support from the National Institutes of Health, National Institute of Mental Health, R34-MH-082644 (PI: Kristy K. Martyn). Support to Michelle Munro was provided by the National Institutes of Health, National Institute of Nursing Research, grant number F31NR012852.

The authors acknowledge the Access Community Health and Research Center, Washtenaw County Public Health Department, and Eastern Michigan University – University Health Services for their assistance with recruitment and data collection.

Footnotes

Conflicts of Interest

The authors declare no potential conflicts of interest with respect to authorship or publication of this article.

Contributor Information

Kristy K. Martyn, Email: kmartyn@emory.edu, Nell Hodgson Woodruff School of Nursing, Emory University, 1520 Clifton Rd., Rm. 346, Atlanta, GA 30322.

Michelle L. Munro, Email: mlmunro@umich.edu, University of Michigan School of Nursing, 400 North Ingalls Building, Room 3188, Ann Arbor, MI 48109-5482.

Cynthia S. Darling-Fisher, Email: darfish@umich.edu, University of Michigan School of Nursing, 400 North Ingalls Building, Room 3176, Ann Arbor, MI 48109-5482.

David L. Ronis, Email: dronis@umich.edu, University of Michigan School of Nursing, 400 North Ingalls Building, Room 4330, Ann Arbor, MI 48109-5482.

Antonia M. Villarruel, Email: avillarr@umich.edu, University of Michigan School of Nursing, 400 North Ingalls Building, Room 4320, Ann Arbor, MI 48109-5482.

Michelle Pardee, Email: milopa@umich.edu, University of Michigan School of Nursing, 400 North Ingalls Building, Room 3185, Ann Arbor, MI 48109-5482.

Hannah Faleer, Email: hfaleer@umich.edu, Department of Psychology, Northern Illinois University, DeKalb, IL.

Nicole M. Fava, Email: nicole.fava@wayne.edu, Wayne State University, 71 East Ferry, Detroit, MI 48202.

References

  1. American Medical Association. Guidelines for adolescent preventive services (GAPS): Recommendations monograph. Chicago, IL: American Medical Association; 1997.
  2. Arnett JJ. Emerging adulthood: What is it and what is it good for? Society for Research in Child Development. 2007;1(2):68–73.
  3. Belli RF, Lee EH, Stafford FP, Chow CH. Calendar and question-list survey methods: Association between interviewer behavior and data quality. Journal of Official Statistics. 2004;20:185–196.
  4. Belli RF, Shay WL, Stafford FP. Event history calendars and question list surveys: A direct comparison of interviewing methods. Public Opinion Quarterly. 2001;65:45–74. doi: 10.1086/320037.
  5. Caspi A, Moffitt TE, Thornton A, Freedman D, Amell JW, Harrington H, Smeijers J, Silva PA. The life history calendar: A research and clinical assessment method for collecting retrospective event-history data. International Journal of Methods in Psychiatric Research. 1996;6(2):101–114.
  6. Cox CL. An interaction model of client health behavior: Theoretical prescription for nursing. Advances in Nursing Science. 1982;5(1):41–56. doi: 10.1097/00012272-198210000-00007.
  7. Epstein RM, Franks P, Shields CG, Meldrum SC, Miller KN, Campbell TL, Fiscella K. Patient-centered communication and diagnostic testing. Annals of Family Medicine. 2005a;3(5):415–421. doi: 10.1370/afm.348.
  8. Epstein RM, Franks P, Fiscella K, Shields C, Meldrum SC, Kravitz RL, Duberstein PR. Measuring patient-centered communication in patient-physician consultations: Theoretical and practical issues. Social Science & Medicine. 2005b;61:1516–1528. doi: 10.1016/j.socscimed.2005.02.001.
  9. Epstein RM, Street RL. Patient-centered communication in cancer care: Promoting healing and reducing suffering. Bethesda, MD: National Cancer Institute, NIH Publication No. 07-6225; 2007.
  10. Freedman D, Thornton A, Camburn D, Alwin D, Young-DeMarco L. The life history calendar: A technique for collecting retrospective data. In: Clogg CC, editor. Sociological methodology. Vol. 18. San Francisco, CA: Jossey-Bass; 1988. pp. 37–68.
  11. Furstenberg FJ, Brooks-Gunn J, Morgan SP. Adolescent mothers in later life. New York, NY: Cambridge University Press; 1987.
  12. Kridli SA. Health beliefs and practices among Arab women. The American Journal of Maternal/Child Nursing. 2002;27(3):178–182. doi: 10.1097/00005721-200205000-00010.
  13. Glaser BG. Advances in the methodology of grounded theory: Theoretical sensitivity. Mill Valley, CA: Sociology Press; 1978.
  14. Glaser BG. Basics of grounded theory analysis. Mill Valley, CA: Sociology Press; 1992.
  15. Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
  16. Levenberg PB. GAPS: An opportunity for nurse practitioners to promote the health of adolescents through clinical preventive services. Journal of Pediatric Health Care. 1998;12:2–9. doi: 10.1016/s0891-5245(98)90023-2.
  17. Lincoln YS, Guba EG. Naturalistic inquiry. Beverly Hills, CA: Sage; 1985.
  18. Little P, Everitt H, Williamson I, Warner G, Moore M, Gould C, Payne S, et al. Observational study of effect of patient centeredness and positive approach on outcomes of general practice consultations. British Medical Journal. 2001;323:908–912. doi: 10.1136/bmj.323.7318.908.
  19. Low LK. Guidelines for adolescent preventive services (GAPS). Journal of Midwifery & Women's Health. 2003;48:231–233. doi: 10.1016/s1526-9523(03)00071-0.
  20. Martyn KK, Belli RF. Retrospective data collection using event history calendars. Nursing Research. 2002;51(4):270–274. doi: 10.1097/00006199-200207000-00008.
  21. Martyn KK, Munro ML, Fava NM, Darling-Fisher CS, Ronis DL, Villarruel AM, Pardee M. Testing a sexual risk event history calendar intervention with youth. 2013a. Manuscript in preparation.
  22. Martyn KK, Munro ML, Tate NH, Saftner MA, Darling-Fisher CS. What are we missing? Risk behaviors among Arab-American adolescents and emerging adults. 2013b. Manuscript in preparation. doi: 10.1002/2327-6924.12352.
  23. Martyn KK, Darling-Fisher C, Pardee M, Ronis DL, Felicetti IL, Saftner MA. Improving sexual risk communication with adolescents using event history calendars. Journal of School Nursing. 2012a;28(2):108–115. doi: 10.1177/1059840511426577.
  24. Martyn KK, Martin R. Adolescent sexual risk assessment. Journal of Midwifery & Women's Health. 2003;48:213–219. doi: 10.1016/s1526-9523(03)00064-3.
  25. Martyn KK, Reifsnider E, Murray A. Improving adolescent sexual risk assessment with Event History Calendars: A feasibility study. Journal of Pediatric Health Care. 2006;20(1):19–26. doi: 10.1016/j.pedhc.2005.07.013.
  26. Martyn KK, Saftner MA, Darling-Fisher CS, Schell MC. Sexual risk assessment using event history calendars with male and female adolescents. Journal of Pediatric Health Care. 2012b. In press. doi: 10.1016/j.pedhc.2012.05.002.
  27. Mead N, Bower P. Patient-centeredness: A conceptual framework and review of the empirical literature. Social Science & Medicine. 2000;51:1087–1110. doi: 10.1016/s0277-9536(00)00098-8.
  28. Lawrence RS, Appleton Gootman J, Sims LJ, editors. National Research Council and Institute of Medicine. Adolescent Health Services: Missing Opportunities. Washington, DC: The National Academies Press; 2009.
  29. Ozer EM, Urquhart JT, Brindis CD, Park MJ, Irwin CE. Young adult preventive health care guidelines: There but can't be found. Archives of Pediatrics & Adolescent Medicine. 2012;166(3):240–247. doi: 10.1001/archpediatrics.2011.794.
  30. Paget L, Han P, Nedza S, Kurtz P, Racine E, Russell S, Von Kohorn I, et al. Patient-clinician communication: Basic principles and expectations. Washington, DC: Institute of Medicine; 2011. Discussion paper.
  31. Sawyer S, Afifi RA, Bearinger LH, Blakemore S, Dick B, Ezeh AC, Patton GC. Adolescence: A foundation for future health. Lancet. 2012;379:1630–1640. doi: 10.1016/S0140-6736(12)60072-5.
  32. Sequist TD, Schneider EC, Anastario M, Odigie EG, Marshall R, Rogers WH, Gelb Safran D. Quality monitoring of physicians: Linking patients' experiences of care to clinical quality and outcomes. Journal of General Internal Medicine. 2008;23(11):1784–1790. doi: 10.1007/s11606-008-0760-4.
  33. Stewart M. Towards a global definition of patient-centered care. British Medical Journal. 2001;322:444–445. doi: 10.1136/bmj.322.7284.444.
  34. Stewart MA. Effective physician-patient communication and health outcomes: A review. Canadian Medical Association Journal. 1995;152(9):1423–1433.
  35. Stewart M, Brown JB, Boon H, Galajda J, Meredith L, Sangster M. Evidence on patient-doctor communication. Cancer Prevention and Control. 1999;3(1):25–30.
  36. Stewart M, Meredith L, Ryan BL, Brown JB. The patient perception of patient-centeredness questionnaire (PPPC). Working Paper Series, Paper #04-1, 2nd ed. London, Ontario, Canada: Thames Valley Family Practice Research Unit and Centre for Studies in Family Medicine; 2004.
  37. Wilkerson L, Fung C, May W, Elliott D. Assessing patient-centered care: One approach to health disparities education. Journal of General Internal Medicine. 2010;25(Suppl 2):86–90. doi: 10.1007/s11606-010-1273-5.
  38. Trinh-Shevrin C, Islam N, Tandon SD, Abesamis N, Hoe-Asjoe H, Rey M. Using community-based participatory research as a guiding framework for health disparities research centers. Progress in Community Health Partnerships. 2007;1(2):195–205. doi: 10.1353/cpr.2007.0007.
  39. Woods ER, Klein JD, Wingood GM, Rose ES, Wypij D, Harris SK, DiClemente RJ. Development of a new Adolescent Patient-Provider Interaction Scale (AAPIS) for youth at risk for STDs/HIV. Journal of Adolescent Health. 2006;38:753e1–753e7. doi: 10.1016/j.jadohealth.2005.08.013.
  40. Yi CH, Martyn KK, Salerno J, Darling-Fisher C. Development and clinical use of the Rapid Assessment for Adolescent Preventive Services (RAAPS) questionnaire in school-based health centers. Journal of Pediatric Health Care. 2009;23(1):2–9. doi: 10.1016/j.pedhc.2007.09.003.
  41. Yosef ARO. Health beliefs, practices, and priorities for health care of Arab Muslims in the United States: Implications for nursing care. Journal of Transcultural Nursing. 2008;19:284–291. doi: 10.1177/1043659608317450.