Author manuscript; available in PMC: 2015 Sep 3.
Published in final edited form as: Eval Rev. 2014 Mar 3;38(1):3–28. doi: 10.1177/0193841X14524956

Tailored Panel Management: A Theory-Based Approach to Building and Maintaining Participant Commitment to a Longitudinal Study

Mica Estrada 1, Anna Woodcock 1, P Wesley Schultz 1
PMCID: PMC4153798  NIHMSID: NIHMS612771  PMID: 24590918

Abstract

Many psychological processes unfold over time, necessitating longitudinal research designs. Longitudinal research poses a host of methodological challenges, foremost of which is participant attrition. Building on Dillman’s work, we provide a review of how social influence and relationship research informs retention strategies in longitudinal studies. Objective: We introduce the tailored panel management (TPM) approach, which is designed to establish communal norms that increase commitment to a longitudinal study, and this commitment, in turn, increases response rates and buffers against attrition. Specifically, we discuss practices regarding compensation, communication, consistency, and credibility that increase longer term commitment to panel participation. Research design: Throughout the article, we describe how TPM is being used in a national longitudinal study of undergraduate minority science students. TheScienceStudy is a continuing panel, which has 12 waves of data collected across 6 academic years, with response rates ranging from 70% to 92%. Although more than 90% of participants have either left or graduated from their undergraduate degree program, this highly mobile group of people remains engaged in the study. TheScienceStudy has usable longitudinal data from 96% of the original panel. Conclusion: This article combines social psychological theory, current best practice, and a detailed case study to illustrate the TPM approach to longitudinal data collection. The approach provides guidance for other longitudinal researchers, and advocates for empirical research into longitudinal research methodologies.

Keywords: longitudinal research, panel management, commitment, response rates, attrition, online study


Determining the impact of a program or intervention often involves measuring the attainment of future milestones. Longitudinal research that tracks the trajectories of a cohort or panel of participants is the obvious methodological choice to answer such questions, but longitudinal studies are often plagued with shortcomings that threaten the validity of findings. For example, one of the most common critiques of longitudinal studies is that participant attrition rates are large and unexplained. Yet, longitudinal data collection is essential to answer important questions regarding the development and maintenance of a multitude of human behaviors, including academic and career choices. Across many areas of social and behavioral research, there are certain types of research questions that simply cannot be answered adequately without the use of a longitudinal methodology. In this article, we address the issue of retaining participants in multiwave, longitudinal studies.

Overview

Dillman’s (2007) total design method (TDM) is widely referenced as an effective strategy for achieving good survey response rates. However, TDM focuses on one-time surveys and includes procedures that increase subjects’ perceptions of rewards, reduce their perceptions of costs, and increase trust (Dillman 1978). Across many years of research, this method has consistently produced high response rates, especially for postal surveys (Dillman 1991, 2007). Importantly, Dillman has continued to refine his approach in response to the changing nature of surveys, which are now conducted in a world of technological change and creative multimodal techniques. Social exchange theory guides Dillman’s approach, which asserts: “actions of individuals are motivated by the return these actions are expected to bring” (p. 14). While the approach has been very successful at inducing compliance with one-time requests, longitudinal research designs require more than a single weighing of the costs and benefits of completing a survey; they require commitment to ongoing participation in the research across time. As a result, shifting the emphasis from social exchange to communal exchange can improve longitudinal research response rates and overall survey coverage across time.

We call this communal-based approach to longitudinal studies tailored panel management (TPM). As with all methodologies, heuristic reasoning influences participants’ decisions to complete or ignore a survey request. In longitudinal panel research, however, a series of unrelated “yes/no” survey completion decisions does not fully capture the experience. Acknowledging this difference, the TPM approach borrows Dillman’s “tailored” concept, but instead of tailoring the approach to each surveyed population, it tailors the research experience to the individuals in the panel. Thus, the need to maximize per-wave survey responses across the life of a longitudinal study drives the TPM approach, and it is met by shifting emphasis to the researcher–participant relationship. For the purpose of maintaining a thriving panel, this article focuses in particular on how TPM establishes and reinforces the communal exchange norms that foster commitment.

Communal exchange and commitment

The overarching aim of the TPM approach is to foster and maintain participant commitment to a longitudinal research project. Commitment is the relative strength of an individual’s identification with, and involvement in, a group or community (Steers 1977). A study can be thought of as a type of social group or community that participants join and, in longitudinal research, maintain a relationship with. According to Salancik (1977), an individual will tend to adhere to the norms and conform to the values and expectations of the groups to which he or she is committed. Social science research thus suggests that participants who are committed to a longitudinal research panel will be more likely than less committed participants to comply with the panel norm of providing data when asked.

This focus on commitment is consistent with research on communal relationships, where commitment to a longer term relationship structure, such as occurs in families, friendships, and romantic partnerships, results in compliance with communal norms and noncontingent engagement in relationship-sustaining behaviors (Clark and Mills 2012). The relationships literature suggests that in longer term healthy relationships, people switch from pure social exchange norms, in which costs and benefits are counted, to more noncontingent response patterns (Clark et al. 2008; Clark et al. 2010; Clark and Mills 2012). This framework is more appropriate for longitudinal panel studies, where the investigator–participant relationship ceases to be short term. In shifting from the social exchange paradigm to a more communal normative framework, the TPM approach emphasizes mutual commitments, whereby investigators fulfill their commitments to the participants and participants routinely fulfill their survey commitments. Focusing on commitment to the research relationship contrasts with focusing solely on promoting participants’ one-time decisions to engage in a survey. The TPM approach therefore also distinguishes itself from leverage-saliency theory, which Groves first introduced (Groves and Couper 1998) and later refined (Groves, Singer, and Corning 2000), and which likewise focuses on one-time decisions to comply (or not) with an investigator’s request. Promoting communal exchange norms among participants to leverage longer term commitment to a longitudinal research panel is the cornerstone of the TPM approach.

A case study

To illustrate the TPM approach and its various components, we refer throughout this article to an ongoing longitudinal panel study, TheScienceStudy. TheScienceStudy is a national longitudinal study of 1,420 minority science students that began in 2005. Its purpose was to investigate the long-term impact of engagement in undergraduate minority science training programs on the pursuit of a doctoral-level biomedical research career. One such program is the National Institutes of Health–funded Research Initiative for Scientific Enhancement (RISE) program. Data were collected from each participant twice per year via a 30- to 45-min online survey. Participants have been tracked and surveyed 12 times since the inception of the study. Using a prospective, quasi-experimental design, the panel consisted of RISE program members (now former members) and a matched sample of nonprogram members, and was built specifically for this study (rather than recruited from an existing online marketing panel).

Response rates across the 12 waves of online data collection have ranged from 70% to 92% of those surveyed.1 Our contact with participants, to encourage participation, has been strictly through e-mail, telephone, and postal mail (in order of frequency). In the first four waves, we used a multimodal approach in which participants could complete the survey either online or over the phone, but fewer than 1% used the phone option, so it was dropped thereafter. In order to maintain the anonymity of our participants, we have not used any emerging social media to track down participants and connect them to the project via a Facebook or LinkedIn group site. Instead, we have relied upon participants to update their own contact information through e-mail, telephone, or a secure online portal (www.TheScienceStudy.com). While our panel began as university students from 50 campuses across the United States, at the time of this writing over 90% of the panel had left university (most with, and a few without, their baccalaureate degree) and were either working in professional careers or pursuing graduate-level degrees. Despite the changing demographics and geographic locations, overall survey coverage of the 1,420-member panel of predominantly African American and Latino/Hispanic participants remains high (see Table 1 for full demographic information). We have usable longitudinal survey data from 96% of the initial panel.

Table 1.

TheScienceStudy Demographics.

                                   Undergraduates   Graduate students   Left
Age, median (range)                22 (18–48)       25 (19–47)          24 (18–40)
Gender (%)
 Male                              29               27                  29
 Female                            71               73                  71
Ethnicity (%)
 African American/Black            42               40                  51
 Asian                             5                5                   5
 Hawaiian/Pacific Islander         1                2                   2
 Hispanic/Latino/Latina            43               40                  33
 Native American/Alaskan Native    <1               1                   1
 White, non-Hispanic               5                8                   6
 Other ethnicity                   1                1                   0
 Multiracial                       3                3                   2

The TPM approach was developed and implemented at the inception of the panel and continues to guide TheScienceStudy protocol. We drew from the existing survey response literature, and social psychological theory and research to develop a protocol that promotes commitment to the panel. The original intention was to follow the TPM approach throughout the life of the research project. Consistent with those aims, we have followed the original protocol with very few modifications across the life of the panel, as the per wave response rates never showed significant signs of decline (see Table 2). Our intention was not to use the panel to test hypotheses related to longitudinal survey methodology—but to develop a survey protocol that reflected current “best practices.” Therefore, we offer our experiences with TheScienceStudy as a case study to illustrate the TPM approach.

Table 2.

TheScienceStudy Survey Response Rates Waves 1 Through 12.

Wave   Response (%)      Wave   Response (%)      Wave   Response (%)
 1     84.7               5     76.1               9     69.0
 2     78.9               6     73.5              10     68.9
 3     78.0               7     70.0              11     71.3
 4     75.4               8     71.1              12     70.4
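As a simple illustration (not part of the study protocol), the per-wave rates in Table 2 can be summarized programmatically; the snippet below just restates the table's figures.

```python
# Per-wave response rates transcribed from Table 2 (waves 1-12).
rates = [84.7, 78.9, 78.0, 75.4, 76.1, 73.5,
         70.0, 71.1, 69.0, 68.9, 71.3, 70.4]

lowest, highest = min(rates), max(rates)
mean_rate = round(sum(rates) / len(rates), 1)

# Summarize the range and average across the 12 waves.
print(f"range: {lowest}-{highest}%, mean: {mean_rate}%")
```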

Summary

At the conclusion of each section, we summarize how the TPM principle was used to guide TheScienceStudy panel management choices. Specifically, we describe the practices regarding compensation, communication, consistency, and credibility that increase longer term commitment to panel participation (see Figure 1) and a reliable response pattern (see Figure 2). While the approach is theoretically grounded, a secondary objective of the article is to be useful to those who are designing or conducting longitudinal research. In addition, there are several testable hypotheses that emerge from the described case study that could be studied empirically in future research.

Figure 1.

Figure 1

Tailored panel management components that contribute toward building participant commitment.

Figure 2.

Figure 2

Response rates across every other wave of TheScienceStudy data collection following a consistent protocol. Note. This graph reflects response rates for the entire original panel.

Compensation

If there is a carrot and a stick involved in persuading people to participate in a study, compensation is the carrot: it gets the cart rolling and rewards a person for participating. Compensation is a reinforcer. When determining which compensation provides the greatest reinforcement, and thus the greatest compliance with requests to participate in surveys, research suggests investigators consider three issues:

  1. the amount of compensation to be given,

  2. the timing of when and how often to allocate the compensation, and

  3. how to tailor the compensation type to best fit panel participants’ values—the form of compensation.

Amount

According to Dillman’s work on compensation and Singer’s work on leverage-saliency theory, when encouraging people to participate in a study, an important consideration is the benefit of participating relative to the cost. Generally, the greatest cost to the participant is time, and the most tangible benefit is typically some type of payment or incentive. On one hand, if the compensation is too high, people may feel coerced (rather than persuaded), although Singer’s work suggests there is little evidence of coercion, since risk taking does not increase as compensation increases. Importantly, coercing a person to participate is considered unethical and is not acceptable under Institutional Review Board guidelines (American Psychological Association 2010). While some researchers have argued that greater compensation does not induce compliance with riskier requests (and therefore that coercion is not occurring; Singer and Couper 2008), rates of compliance have not been tested across a wide range of incentives and risks. Importantly, research does show that when people attribute their behaviors to external rewards, intrinsic motivation declines (Deci, Koestner, and Ryan 1999).

On the other hand, providing too small a compensation can also affect response rates. Just as investigators do not want their participants to feel coerced, they also do not want them to feel “taken advantage of.” Research shows that some forms of compensation for a person’s time are not only beneficial to initiating participation in a range of activities but can also reinforce desired behaviors across the experimental time frame (Galizio and Buskist 1988). There is some research demonstrating that a systematic increase in payment across time can also contribute to sustained participation rates (James and Bolstein 1992).

Timing

Classic behavioral theory dictates that compensation should follow compliance and serve as a reward or reinforcer. However, two meta-analyses have shown that prepayment is more effective than promising an incentive following completion of a survey (Church 1993; Singer et al. 1999). Dillman (2007) reports that giving a small token before the request to complete a survey increases response rates. From a social exchange perspective, Dillman argues that prior compensation works because it increases benefits and trust. While Dillman may be correct, another social process is also operating: prior compensation primes reciprocation norms. Research on reciprocation suggests that there are times, such as when we are building a relationship, when prior compensation results in greater compliance with future requests. There are strong norms to reciprocate kindness (Cialdini 1993; Whatley et al. 1999), and these norms can be utilized in the context of engaging participants in a longitudinal panel. Further, this noncontingent giving conveys the norms associated with more communal relationships, which may support longer term as well as immediate commitment to engage in the study.

Social psychological research suggests that shifting the focus to communal norms fosters levels of engagement that reflect greater intrinsic motivation and greater internal commitment to the endeavor. Although seemingly risky, because the investment of compensation dollars may exceed the return, consistent with previous research, payment enclosed or included with the request for survey data consistently yields significantly higher response rates that more than justify the investment (Kropf and Blair 2005; Szelényi, Bryant, and Lindholm 2005; Millar and Dillman 2011). Based on these prior studies, we hypothesize that the “pay first, participate later” framework contributes to establishing a norm of reciprocation and trust that has important long-term effects in building panel commitment across years of data collection.

Tailored

Central to the TPM approach is the tailoring of many aspects of the longitudinal research experience to individual participants. While the research design allows little flexibility in the questions that are asked or the timing of data collection, other features of the research experience can be tailored without compromising the integrity of the data collected: for example, the type of compensation. Researchers may assume that one type of compensation should be used for all participants. However, providing options capitalizes on the power of choice. In the present era of technology, participants can easily be compensated in a variety of ways. Prior research has shown that cash is typically the most effective means of encouraging participation in a mailed or online study, even among computer-savvy participants (Birnholtz et al. 2004). Yet there is reason to expect that offering a choice of payment may encourage longer term commitment and involvement in a longitudinal survey panel. Theoretically, providing choice promotes the building of participant commitment to the study across time. An additional benefit of multiple forms of payment, such as PayPal or gift cards, is that they give participants an incentive to keep their e-mail and/or home address current. Other forms of compensation, such as a lottery system in which only a few participants actually receive awards, or donations to a charity (Warriner et al. 1996), have not been found to promote high response rates as consistently as a guaranteed reinforcer (Ulrich et al. 2005; Zangeneh, Blaszczynski, and Turner 2008). A large body of research on compensation is beyond the scope of this article; however, testing the impact of a tailored payment approach may be worth further investigation.
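The bookkeeping behind tailored compensation can be sketched as a preference lookup with a guaranteed fallback. The following is a hypothetical illustration, not TheScienceStudy's actual system; the method names and the check default are illustrative assumptions.

```python
# Hypothetical payment-preference registry for a tailored compensation scheme.
# Supported methods and the default are illustrative assumptions.
SUPPORTED_METHODS = {"check", "paypal", "gift_card"}
DEFAULT_METHOD = "check"  # a guaranteed cash-equivalent reinforcer

preferences = {}  # participant_id -> preferred payment method

def set_preference(participant_id, method):
    """Record a participant's payment choice if it is a supported method."""
    if method not in SUPPORTED_METHODS:
        raise ValueError(f"unsupported payment method: {method}")
    preferences[participant_id] = method

def payment_method(participant_id):
    """Return the participant's tailored method, falling back to the default."""
    return preferences.get(participant_id, DEFAULT_METHOD)
```

A design note: routing every payment through a preference lookup with a default means no participant goes uncompensated for lack of a stated choice, which preserves the guaranteed-reinforcer property discussed above.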

Case study

Figure 3 provides a description of how compensation was specifically done in TheScienceStudy.

Figure 3.

Figure 3

TheScienceStudy: Compensation.

Communication

As with any relationship, communication is a necessary component of the survey administration process. Communication of expectations, rights, and the explicit agreement is central to ethically conducted research projects (Committee on Science, Engineering, and Public Policy 2009). Tactics for communicating have been examined in a variety of research settings, with strong evidence that communicating that the benefits of participation outweigh the costs is one of the strongest predictors of engagement in a one-time survey (Groves, Singer, and Corning 2000). For longitudinal studies, communication is ongoing: a communication breakdown reduces an investigator’s ability to contact participants and can result in substantial attrition, whereas strong and successful communication can result in participants’ long-term commitment and perhaps even enthusiasm for the study (Given et al. 1990). Three key components of the communication strategy are (1) accessibility, (2) multimodal methods, and (3) a personal tailored touch.

Accessibility

Communication in many one-time studies is predominantly unidirectional. The investigator informs a sample about a study and asks them to participate. If the person agrees, the investigator directs them on how to fulfill the obligations of the study, with instructions provided in writing or verbally. The investigator then debriefs the participant, and communication ceases. Although participants are typically given a phone number or e-mail address with which to contact investigators if they have questions or a problem (as required by standards for the ethical treatment of participants), anecdotal evidence indicates that the number of participants who actually exercise this privilege in a minimal-risk study is quite small. In a longitudinal study, the communication continues with additional requests to participate and direction on how to do so. Thus, if participants change their contact information between waves of survey administration, they can be lost to the study investigators, and the longer the interval between data collection waves, the greater the risk that contact information will change.

Relationship research suggests that two-way communication patterns build familiarity and trust. This focus on relationship relates to Dillman’s (2007) discussion of how important trust is to the process of increasing participation. However, in longitudinal panel studies, a type of communal trust is emphasized in which participants know there is an easily accessible means to contact a person from the study and it is safe to do so. In addition, accessibility means that the participants can communicate with the study easily and update their own contact information without too much effort. Capitalizing on the power of Internet communication possibilities, communication can be enhanced by setting up a study website for participants that allows them to check and update their personal information with ease and without restriction.

Multimodal methods

Implied by the preceding discussion, though not explicitly stated, is the notion that providing and utilizing multiple modes of communication with participants is essential. Classically, longitudinal survey studies have been conducted by telephone, mail, or person to person. Currently, however, web-based surveys and e-mail are increasingly used to maintain panels and solicit ongoing participation, with varying degrees of success. While several authors have written about the advantages and disadvantages of recruiting participants through mail, phone, and e-mail, the discussion here concerns how using multiple modes of communicating with participants promotes commitment to a longitudinal study after the initial recruitment.

Clearly, people differ in their preference for how to communicate and with what frequency. Previous research has shown that providing different response modes sequentially can increase participation rates (Millar and Dillman 2011). When participants are drawn from a population with computer access and literacy, the least expensive form of communication is currently e-mail. A protocol that includes e-mail solicitation to participate can be partnered with other forms of communication. Specifically, for participants who do not respond to e-mail requests to participate in a survey, phone calls can be used to contact them and remind them of the survey. The calls are also a chance to update contact information, to help with any technical issues participants have with the survey, and to maintain a communal relationship between the participant and the project. Finally, if e-mail and phone calls do not achieve contact with participants, mail can be used to maintain contact and request participation.
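The e-mail, then phone, then mail escalation just described can be sketched as a simple ordered progression. This is a hypothetical illustration of the contact sequence, not code from the study.

```python
# Hypothetical escalation order for reaching nonrespondents:
# e-mail first, then phone, then postal mail.
CONTACT_MODES = ["email", "phone", "mail"]

def next_contact_mode(attempted):
    """Return the next mode to try given the modes already attempted,
    or None once every mode has been exhausted."""
    for mode in CONTACT_MODES:
        if mode not in attempted:
            return mode
    return None
```

In practice each escalation step is also an opportunity: a phone call that fails to elicit a survey response can still update contact details and resolve technical problems, sustaining the communal relationship.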

The web–phone–mail approach is in contrast to the web plus mail approach that Messer and Dillman (2011) describe, in which participants are first e-mailed and then sent a mailed request to participate. Interestingly, Messer and Dillman’s article also shows that a mail-only approach among demographically similar participants resulted in response rates higher than the web plus mail approach. Some research has shown that switching the mode of survey collection can also result in higher completion rates. However, recent research demonstrates that participants can develop loyalty to their chosen mode of taking a survey (Kovac et al. 2009). The key may be responsiveness to participant preferences, which is facilitated by providing easy means of communication.

In addition to having several modes for communicating with participants, cultivating committed participants involves providing multiple methods by which participants can communicate with the research team if they so choose. While multimodal approaches to collecting data have been well studied (see Messer and Dillman 2011 or Millar and Dillman 2011 for recent research on the topic), what we are describing here are multimodal methods for panel participants to contact the study staff between and during survey administrations, which is far less studied. To facilitate this sort of easy communal communication pattern, provide a readily available e-mail address, phone number, and postal address to all participants. To promote two-way communication, all communications with participants can include basic contact information for the project. Effective communal-style communication is also facilitated by having a designated project manager who checks e-mail, phone, and mail daily so as to respond to participants quickly and efficiently. A professional and well-maintained web portal and phone response system reinforce the identity and legitimacy of the study, and a project website that participants can find with a simple search of the study name enables them to update their contact information (e-mail, phone, or address).

Tailored communication

Dillman used the term “tailored” to describe how to modify a solicitation approach to fit the population whose participation you want to gain (Dillman 2007). In the case of maintaining a long-term panel, however, tailored refers to how one approaches each panel member. At the most basic level, “tailored” means using a person’s proper name, whether in e-mails, phone calls, or postal communications. Proper pronunciation and name usage are important elements of any personalized communication. A mispronounced or misspelled name can immediately alert a person that whoever is communicating with them is not familiar, and can prime a person, in a matter of seconds, to dismiss a call. Research shows that a personal request is much more likely to be fulfilled than one from a stranger, owing to differences in the need to self-present positively (Millar 2002) and the evoking of a sense of obligation (Roloff et al. 1988). Keeping notes on each participant, including name preferences and participation history, can contribute to a more personalized communication experience and reinforce the communal norms of the situation. In this way, Dillman’s (2007) recommendation to personalize the contact is followed. Tailored communication methods convey norms of friendship and a more communal tone, as opposed to norms of stranger interaction (Duncan 1979). Participants are not just names on a list, but partnered contributors to the research.
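The per-participant notes described above can be as minimal as a record of name preference and participation history. The sketch below is purely hypothetical; the record fields and the example participant are invented for illustration.

```python
# Hypothetical per-participant notes supporting tailored communication.
# All fields and the sample record are invented for illustration.
records = {
    "p001": {
        "legal_name": "Katherine Alvarez",
        "preferred_name": "Katie",      # name preference noted by staff
        "waves_completed": [1, 2, 3],   # participation history
    },
}

def greeting(participant_id):
    """Open a message with the participant's preferred name,
    falling back to the legal name when no preference is noted."""
    record = records[participant_id]
    name = record.get("preferred_name") or record["legal_name"]
    return f"Dear {name},"
```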

In all forms of communication, research shows that members of the general public are typically more likely to read and respond to communication attempts that are personalized (Dillman 2007). A response is also more common when potential participants are approached as part of a group with which they share a positive identity (Dillman et al. 2007). For instance, a person is more likely to open an envelope with a handwritten address, and to read a handwritten note, than preprinted materials.

Case study

Figure 4 provides a description of how communication occurred in TheScienceStudy.

Figure 4.

Figure 4

TheScienceStudy: Communication.

Consistency

Research on commitment tells us that, for most people, it is easier to commit to something or someone predictable, which invokes communal relationship norms, than to something or someone unpredictable, which invokes more exchange-based norms; there is greater comfort with that which is known (Fehr 1999). Consistency allows norms to develop and makes the expected response patterns clear, promoting a normalized sequence of events.

Message

When a person agrees to complete a one-time survey, the commitment is relatively short and typically immediate. In contrast, when people agree to participate in a longitudinal study, they make a multitude of commitments, and the investigators have an obligation to describe clearly what those commitments entail. We found the solicitation to be effective when the initial message engenders commitment to the study across time. Just as with one-time surveys, we found it effective to describe the specific costs and benefits in a manner that shows that the benefits exceed the costs and that the request comes from a trustworthy source. We add to this short-term cost–benefit description by also emphasizing the costs and benefits of long-term commitment to the study. Following the Tripartite Integration Model of Social Influence (Kelman 1958, 2006; Estrada et al. 2011), the initial message entailed three attributes (rules, roles, and values) to clearly set the foundation for participants’ longer term commitment. First, the message clearly described the rules of engagement in the study and the long-term commitment to the panel. For instance, we clearly stated that participants would be asked repeatedly to complete surveys and that, in exchange, the investigators would provide compensation for their participation. This level of description is typically given in all studies, along with a comprehensive description of the costs and benefits of participating and of subjects’ rights to withdraw at any time, in order to comply with Institutional Review Board requirements (Brody 2001; York 2003). For a one-time study, the message given to participants usually has no reason to include anything beyond the costs and benefits. But for longitudinal studies, we found it effective to incorporate two further levels of information.

When asking people to participate in a longitudinal study, we asked them to become part of an ongoing panel. We hypothesized, based on in-group/out-group research, that language identifying people as participants in an important panel would help to establish commitment. For that reason, the message we provided at the onset clearly and sincerely conveyed that each person is an important part of the study. In short, the message intentionally engendered a type of identification with the study for each participant. The means of cultivating identification can vary, but crafting an initial solicitation message that begins a process of personal identification with the study can be as simple as naming the group one is joining. This communication can occur through verbal or written recruitment processes.

Finally, we crafted a message that conveyed the value of participating. As with most studies, the informed consent forms can begin to establish the values of participating by describing the benefits of the study to society, to the immediate community, to the next generation of science students, and even to the participant. In addition, with an ongoing panel, there is an opportunity to place heavy emphasis on how the longitudinal nature of the study will help answer important questions. Based on previous research on value messaging, researchers can hypothesize that when participants internally value the purpose of the study, they will be more likely to be intrinsically motivated to comply with requests to participate. However, future research is needed to establish whether a focus on long-term value, by itself (and not in combination with these other variables), has a positive effect on commitment to research participation.

Across time

One key characteristic of communal relationships is the development of familiarity with one another. Translated to a longitudinal study, familiarity becomes possible by keeping the message consistent across time. For instance, if the initial agreement is to complete one survey a year and participants are suddenly asked to do two, they may feel that the investigators have violated the agreement. In contrast, consistently reiterating the agreement to participate in the panel, sincerely conveying that each person is a valued member of the panel, and reiterating that the results have value in the wider community can, over time, build familiarity with the research project and its goals. In addition, as participants become positively familiar with the study and their role in it, their identity as panel members and their internalization of the community’s values increase.

Timing

There are few empirically tested guidelines about how often to communicate with participants. If participants are contacted too often, there is a risk that they will feel overburdened. At the same time, if they are not contacted often enough, they may not feel that the investigators are committed to their involvement. From the relationship literature, there is some evidence that the quality rather than the quantity of communication is key. Also, some level of predictability helps to establish trust and commitment (Ross and LaCroix 1996).

Branding

Another way to promote familiarity is to give the “product” an appealing look (Ribisl et al. 1996)—in this case, the product is the study. While academic researchers may be unaccustomed to “selling” participation, most are familiar with the idea of branding through a consistent and appealing look. A simple method of branding a study is to adopt a logo and color scheme that can be used across all modalities of communication. Research shows that the connection between a logo and a meaningful concept can be enduring (Buttle and Westoby 2006). For instance, placing the study logo on letters, envelopes, e-mails, the website, and the survey itself ties all communications together. Communication specialists emphasize that a logo that establishes a meaningful image can be powerful. Attention to branding is consistent with Dillman’s (2007) emphasis on survey design principles; however, we extend the issue of design from survey content to the overall image of the research study.

Although not appropriate for some research, learning theory suggests that creative methods for establishing a positive association between the “brand” and participation in the study may be useful. Displaying the brand on a product that participants use, such as a flash drive, pen, or business card holder, can support participant identification with the panel and greater commitment to continuing involvement. While there is no empirical research showing that panel identification increases participation in longitudinal studies, research does show that identification with a group is associated with higher engagement in that group’s normative behaviors (Terry and Hogg 1996; White et al. 2009). Future research testing the impact of branding in longitudinal studies would be a valuable next step toward better understanding this issue.

Case study

Figure 5 provides a description of how consistency was maintained in TheScienceStudy.

Figure 5. TheScienceStudy: Consistency.

Credibility

Unlike the first two characteristics of the TPM approach—compensation and communication—the latter two characteristics are qualities that can permeate all aspects of a study. As discussed previously, consistency appears in the execution of both compensation and communication; likewise, credibility can permeate all aspects of the study, and consistency can help to build both credibility and familiarity. Dillman (2007) describes the importance of trust, which is certainly a characteristic of legitimacy. In social psychological research, credibility (also referred to as legitimacy) is associated with greater compliance with requests, as Dillman describes, and has also been found to increase commitment to groups (Tyler 2006). Communal relationships, too, are built on trust and belief in the credibility of one’s friend or partner. In the next sections, we describe three levels at which credibility can be established and maintained. Our theoretical reasoning for focusing on these attributes was drawn primarily from the empirical literature on compliance and social influence.

Legitimacy of the requester

Previous research has shown that recruiting participants for a study is easiest when the request for participation comes from a credible (i.e., trusted) source (Patch 1988; Cialdini 1984; Dillman 2007). For instance, if you receive a phone call from a student at your alma mater, you are more likely to listen than if it comes from an anonymous telemarketer at an unfamiliar agency (Albaum 1987; Houston and Nevin 1977). As Dillman (2007) describes, there is strong evidence that research sponsored by a well-known and respected agency, such as the U.S. Census, is more likely to appear worthy of a person’s time than research from a small, unknown research center (Heberlein and Baumgartner 1978). The classic Milgram experiment demonstrated the impact of legitimacy by showing greater compliance with experimenter requests to shock another participant when the researcher was from Yale University than from an undesignated institution (Milgram 1974). In addition to association with a credible institution, investigators may garner legitimacy through their reputation or simply by having PhD after their name. However it is established, the credibility of the person making the initial request, whether derived from the university at which they work or from their titles, is a critical predictor of compliance with requests (Aronson, Turner, and Carlsmith 1963; Milgram 1974).

Credibility of the request

Research on persuasion has shown that even if the requester is perceived as legitimate and credible, it is equally important that the request itself be perceived as legitimate (Kelman and Hamilton 1989). If a history professor asks a student to turn in an assignment, the request would be considered legitimate; a request falling outside the professor’s role would not carry the same weight.

Credibility of the study

The credibility of a study is closely connected to a variety of features, including consistency, good communication, and the reliable administration of compensation.

Case study

Figure 6 provides a description of how credibility was specifically developed in TheScienceStudy.

Figure 6. TheScienceStudy: Credibility.

Conclusion

This article has sought to articulate the TPM approach, which is being utilized in an ongoing longitudinal study of college students as they complete their undergraduate education and proceed into graduate school or professional careers. Building on Dillman’s TDM, we suggest that maintaining a longitudinal online panel draws more heavily on communal than on social exchange norms. Communal norms rest on the belief that all parties are committed to the long-term health of the relationship. In TheScienceStudy, we conveyed and reinforced this type of commitment through compensation, communication, consistency, and credibility. These principles may be equally applicable to maintaining a panel that is not engaged in an online study. Future research could examine the applicability of this approach across many types of longitudinal panels.

Although we have continued to cultivate a type of communal relationship with participants, we have taken precautions not to convey to participants what we would consider “good” or “bad” responses to any item on the survey. The only socially desirable response they can provide is to complete the survey, and the content of the survey does not lend itself to inferring which responses the researchers would most desire. Nevertheless, a randomized study with one panel managed under a TPM design and another panel following a more traditional approach would test our assumptions and be an important contribution to the literature.
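Such a randomized comparison could begin with a simple random split of a recruited pool into the two panel-management conditions. The sketch below is purely illustrative: the condition labels, pool size, and seed are our assumptions, not part of TheScienceStudy’s design.

```python
import random


def assign_conditions(participant_ids, seed=42):
    """Randomly split a recruited pool into two hypothetical
    panel-management conditions: TPM versus traditional."""
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    ids = list(participant_ids)
    rng.shuffle(ids)
    half = len(ids) // 2
    return {"tpm": ids[:half], "traditional": ids[half:]}


# Example: a hypothetical pool of 100 recruited participants.
groups = assign_conditions(range(100))
```

Each participant appears in exactly one condition, and retention rates across waves could then be compared between the two groups.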

In describing TheScienceStudy’s TPM approach and its application of each principle, we have endeavored to show how each principle involves both key considerations and flexible execution. Many of the approaches taken build upon previous research on communal relationships and social influence, and undoubtedly upon the wealth of research on survey design. Some of the suggestions are less experimentally grounded, drawn instead from the best practices of the TheScienceStudy case study. This combination of empirical and case study research makes this article unique and, we hope, useful to those who seek to maintain high response rates across time using online survey methodology. In addition, the theoretical foundation provided here can stimulate new techniques for survey researchers and generate testable hypotheses for future research.

Acknowledgments

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Funding for this work was provided by a grant from the National Institutes of Health (R01-GM075316).

Biographies

Mica Estrada is Research Faculty at California State University, San Marcos. Her area of expertise is social influence, including the study of identity, forgiveness, intergroup relations, and integrative education. Recent studies have focused on how underrepresented minority science students integrate into the scientific community and careers. A common characteristic of Dr. Estrada’s work is designing and empirically testing interventions that can change individual behavior, social norms, well-being, and community consciousness.

Anna Woodcock is research faculty at California State University, San Marcos. Her research interests lie in the broad areas of diversity, prejudice and stereotyping. She is currently investigating: the impact of implicit racial and gender bias on behavior, the processes by which stereotype threat operates, and the psychological processes underlying the underrepresentation of women and minorities in science, technology, engineering, and math (STEM) careers.

P. Wesley Schultz is Professor of Psychology at California State University, San Marcos where he teaches courses in social psychology and statistics. His research focuses on social influence, and the application of psychology to understand and solve social issues. Recent studies have focused on science training programs, and the role of undergraduate research experience in sustaining interest in scientific careers among underrepresented students. He also maintains an active program of research on environmental conservation programs and climate change education.

Footnotes

1. Only 55 participants have withdrawn from the study permanently. Response rates were calculated as the number of participants who completed the survey at a given wave divided by the total number of panel members surveyed at that wave.

The article was initially presented in 2009 at the 21st meeting of the Association for Psychological Science, San Francisco, CA. Our appreciation goes to Paul R. Hernandez, Randie Chance, Maria Aguilar, Perla Rivas, Victor Rocha, Richard Serpe, Brian McDonald, Dave Morolla, Adam Zaleski, Priscilla Fernandez, Lilibeth Flores, and the Survey Research Lab at Kent University.

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

References

  1. Albaum G. Do Source and Anonymity Affect Mail Survey Results? Journal of the Academy of Marketing Science. 1987;15:74–81. doi: 10.1007/BF02722173.
  2. American Psychological Association. Ethical Principles of Psychologists and Code of Conduct. 2010. http://www.apa.org/ethics/code/index.aspx (accessed April 2, 2011).
  3. Aronson E, Turner JA, Carlsmith JM. Communicator Credibility and Communication Discrepancy as Determinants of Opinion Change. Journal of Abnormal and Social Psychology. 1963;67:31–36. doi: 10.1037/h0045513.
  4. Birnholtz JP, Horn DB, Finholt TA, Bae SJ. The Effects of Cash, Electronic, and Paper Gift Certificates as Respondent Incentives for a Web-based Survey of Technologically Sophisticated Respondents. Social Science Computer Review. 2004;22:355–62. doi: 10.1177/0894439304263147.
  5. Brody BA. Making Informed Consent Meaningful. Ethics and Human Research. 2001;23:1–5. doi: 10.2307/3564045.
  6. Buttle H, Westoby N. Brand Logo and Name Association: It’s All in the Name. Applied Cognitive Psychology. 2006;20:1181–94. doi: 10.1002/acp.1257.
  7. Church AH. Estimating the Effect of Incentives on Mail Survey Response Rates: A Meta-analysis. Public Opinion Quarterly. 1993;57:62–79. doi: 10.1086/269355.
  8. Cialdini RB. Influence: The New Psychology of Modern Persuasion. New York: Quill; 1984.
  9. Cialdini RB. Influence: Science and Practice. 3rd ed. New York: Harper Collins College; 1993.
  10. Clark MS, Graham SM, Williams E, Lemay EP. Understanding Relational Focus of Attention May Help Us Understand Relational Phenomena. In: Forgas JP, Fitness J, editors. Social Relationships: Cognitive, Affective, and Motivational Processes. New York: Psychology Press; 2008. pp. 131–46.
  11. Clark MS, Lemay EP, Graham SM, Pataki SP, Finkel EJ. Ways of Giving Benefits in Marriage: Norm Use, Relationship Satisfaction, and Attachment-related Variability. Psychological Science. 2010;21:944–51. doi: 10.1177/0956797610373882.
  12. Clark MS, Mills JR. A Theory of Communal (and Exchange) Relationships. In: Van Lange PM, Kruglanski AW, Higgins E, editors. Handbook of Theories of Social Psychology. Vol. 2. Thousand Oaks, CA: Sage; 2012. pp. 232–50.
  13. Committee on Science, Engineering, and Public Policy. On Being a Scientist: A Guide to Responsible Conduct in Research. 3rd ed. Washington, DC: National Academies Press; 2009.
  14. Deci EL, Koestner R, Ryan RM. A Meta-analytic Review of Experiments Examining the Effects of Extrinsic Rewards on Intrinsic Motivation. Psychological Bulletin. 1999;125:627–68. doi: 10.1037/0033-2909.125.6.627.
  15. Dillman DA. Mail and Telephone Surveys: The Total Design Method. New York: Wiley-Interscience; 1978.
  16. Dillman DA. The Design and Administration of Mail Surveys. Annual Review of Sociology. 1991;17:225–49.
  17. Dillman DA. Mail and Internet Surveys: The Tailored Design Method. Hoboken, NJ: John Wiley; 2007.
  18. Dillman DA, Lesser V, Mason R, Carlson J, Willits F, Robertson R, Burke B. Personalization of Mail Surveys for General Public and Populations with a Group Identity: Results from Nine Studies. Rural Sociology. 2007;72:632–46. doi: 10.1526/003601107782638693.
  19. Duncan W. Mail Questionnaires in Survey Research: A Review of Response Inducement Techniques. Journal of Management. 1979;5:39–55. doi: 10.1177/014920637900500103.
  20. Estrada M, Woodcock A, Hernandez PR, Schultz PW. Toward a Model of Social Influence that Explains Minority Student Integration into the Scientific Community. Journal of Educational Psychology. 2011;103:206–22. doi: 10.1037/a0020743.
  21. Fehr B. Laypeople’s Conceptions of Commitment. Journal of Personality and Social Psychology. 1999;76:90–103. doi: 10.1037/0022-3514.76.1.90.
  22. Galizio M, Buskist W. Laboratory Lore and Research Practices in the Experimental Analysis of Human Behavior: Selecting Reinforcers and Arranging Contingencies. The Behavior Analyst. 1988;11:65–69. doi: 10.1007/BF03392457.
  23. Given BA, Keilman LJ, Collins C, Given CW. Strategies to Minimize Attrition in Longitudinal Studies. Nursing Research. 1990;39:184–86. doi: 10.1097/00006199-199005000-00018.
  24. Groves RM, Couper MP. Nonresponse in Household Interview Surveys. New York: John Wiley; 1998.
  25. Groves RM, Singer E, Corning A. Leverage-saliency Theory of Survey Participation. Public Opinion Quarterly. 2000;64:299–308. doi: 10.1086/317990.
  26. Heberlein TA, Baumgartner R. Factors Affecting Response Rates to Mailed Questionnaires: A Quantitative Analysis of the Published Literature. American Sociological Review. 1978;43:447–62.
  27. Houston MJ, Nevin JR. The Effects of Source Appeal on Mail Survey Response Patterns. Journal of Marketing Research. 1977;14:374–78. doi: 10.2307/3150777.
  28. James JM, Bolstein R. Large Monetary Incentives and Their Effect on Mail Survey Response Rates. Public Opinion Quarterly. 1992;56:442–53. doi: 10.1086/269336.
  29. Kelman HC. Compliance, Identification, and Internalization: Three Processes of Attitude Change. Journal of Conflict Resolution. 1958;2:51–60. doi: 10.1177/002200275800200106.
  30. Kelman HC. Interests, Relationships, Identities: Three Central Issues for Individuals and Groups in Negotiating Their Social Environment. Annual Review of Psychology. 2006;57:1–26. doi: 10.1146/annurev.psych.57.102904.190156.
  31. Kelman HC, Hamilton V. Crimes of Obedience: Toward a Social Psychology of Authority and Responsibility. New Haven, CT: Yale University Press; 1989.
  32. Kovac MD, Rogers B, Mooney GM, Trunzo D. Mode of Choice in a Longitudinal Mail, Web, and Telephone Survey. Paper presented at the 64th Annual Conference of the American Association for Public Opinion Research; Hollywood, FL. 2009.
  33. Kropf ME, Blair J. Eliciting Survey Cooperation: Incentives, Self-interest, and Norms of Cooperation. Evaluation Review. 2005;29:559–75. doi: 10.1177/0193841X05278770.
  34. Messer BL, Dillman DA. Surveying the General Public over the Internet Using Address-based Sampling and Mail Contact Procedures. Public Opinion Quarterly. 2011;75:429–57. doi: 10.1093/poq/nfr021.
  35. Milgram S. Obedience to Authority: An Experimental View. New York: Harper & Row; 1974.
  36. Millar M. Effects of a Guilt Induction and Guilt Reduction on Door in the Face. Communication Research. 2002;29:666–80. doi: 10.1177/009365002237831.
  37. Millar MM, Dillman DA. Improving Response to Web and Mixed-mode Surveys. Public Opinion Quarterly. 2011;75:249–69. doi: 10.1093/poq/nfr003.
  38. Patch MA. Differential Perception of Source Legitimacy in Sequential Request Strategies. Journal of Social Psychology. 1988;128:817–23. doi: 10.1080/00224545.1988.9924559.
  39. Ribisl KM, Walton MA, Mowbray CT, Luke DA, Davidson WS, Bootsmiller BJ. Minimizing Participant Attrition in Panel Studies through the Use of Effective Retention and Tracking Strategies: Review and Recommendations. Evaluation and Program Planning. 1996;19:1–25. doi: 10.1016/0149-7189(95)00037-2.
  40. Roloff ME, Janiszewski CA, McGraff MA, Burns CS, Manrai LA. Acquiring Resources from Intimates when Obligation Substitutes for Persuasion. Human Communication Research. 1988;14:364–96. doi: 10.1111/j.1468-2958.1988.tb00161.x.
  41. Ross W, LaCroix J. Multiple Meanings of Trust in Negotiation Theory and Research: A Literature Review and Integrative Model. International Journal of Conflict Management. 1996;7:314–60. doi: 10.1108/eb022786.
  42. Salancik GR. Commitment Is Too Easy. Organizational Dynamics. 1977;6:62–80. doi: 10.1016/0090-2616(77)90035-3.
  43. Singer E, Bossarte RM. Incentives for Survey Participation: When Are They ‘Coercive’? American Journal of Preventive Medicine. 2006;31:411–18. doi: 10.1016/j.amepre.2006.07.013.
  44. Singer E, Couper MP. Do Incentives Exert Undue Influence on Survey Participation? Experimental Evidence. Journal of Empirical Research on Human Research Ethics. 2008;3:49–56. doi: 10.1525/jer.2008.3.3.49.
  45. Steers RM. Antecedents and Outcomes of Organizational Commitment. Administrative Science Quarterly. 1977;22:46–56.
  46. Szelényi K, Bryant AN, Lindholm JA. What Money Can Buy: Examining the Effects of Prepaid Monetary Incentives on Survey Response Rates among College Students. Educational Research and Evaluation. 2005;11:385–404. doi: 10.1080/13803610500110174.
  47. Terry DJ, Hogg MA. Group Norms and the Attitude-behavior Relationship: A Role for Group Identification. Personality and Social Psychology Bulletin. 1996;22:776–93. doi: 10.1177/0146167296228002.
  48. Tyler TR. Psychological Perspectives on Legitimacy and Legitimation. Annual Review of Psychology. 2006;57:375–400. doi: 10.1146/annurev.psych.57.102904.190038.
  49. Ulrich CM, Danis M, Koziol D, Garrett-Mayer E, Hubbard R, Grady C. Does It Pay to Pay? A Randomized Trial of Prepaid Financial Incentives and Lottery Incentives in Surveys of Nonphysician Healthcare Professionals. Nursing Research. 2005;54:178–83.
  50. Warriner K, Goyder J, Gjersten H, Hohner P, McSpurren K. Charities, No; Lotteries, No; Cash, Yes: Main Effects and Interactions in a Canadian Incentives Experiment. Public Opinion Quarterly. 1996;60:542–62. doi: 10.1086/297772.
  51. Whatley MA, Webster JM, Smith RH, Rhodes A. The Effect of a Favor on Public and Private Compliance: How Internalized Is the Norm of Reciprocity? Basic and Applied Social Psychology. 1999;21:251–59. doi: 10.1207/S15324834BASP2103_8.
  52. White KM, Smith JR, Terry DJ, Greenslade JH, McKimmie BM. Social Influence in the Theory of Planned Behaviour: The Role of Descriptive, Injunctive, and In-group Norms. British Journal of Social Psychology. 2009;48:135–58. doi: 10.1348/014466608X295207.
  53. York DH. Protection of Human Subjects in Research Trials. American Journal of Electroneurodiagnostic Technology. 2003;43:54–59.
  54. Zangeneh M, Blaszczynski A, Turner N. In the Pursuit of Winning: Problem Gambling Theory, Research and Treatment. New York: Springer; 2008.