Qualitative Research in Medicine & Healthcare. 2021 Oct 5;5(2):9724. doi: 10.4081/qrmh.2021.9724

Assessing collaboration among team scientists within a triadic research center partnership

Linda S Behar-Horenstein 1, Joyce M Richey 2, Ukamaka Diké Smith 3
PMCID: PMC10336870  PMID: 37441668

Abstract

Few studies have utilized qualitative methods to assess the perceived effectiveness of collaboration among research center interdisciplinary team scientists. Stages of team development served as the theoretical framework to characterize minority-serving institution (MSI) and predominantly White institution (PWI) participants’ challenges and successes during a National Institutes of Health (NIH) sponsored cancer health disparities training and research program. We present the findings of an inductive analysis of four open-ended survey questions across two years. Fostering an awareness of the inherently taxing, yet central, nature of group (team) development may advance an understanding of team dynamics and lead to increased team cohesion and productivity. In conclusion, we provide recommendations to assist multiple principal investigators who embark on team development.

Key words: Interdisciplinary research teams, collaboration, team development, cancer health disparities training and research, evaluation, MSI-PWI partnerships, team science, inductive analysis

Introduction

Research is increasingly being conducted by teams in nearly all disciplines.1 Multi-university collaborations, among the most complicated, are growing the fastest.2 Yet, there is a dearth of qualitative studies focused on interdisciplinary team scientists’ perceptions of collaboration effectiveness within federally funded research center partnerships. For this study, a “center” is defined as a partnership between two or more universities that includes a minority-serving institution (MSI) and predominantly White institutions (PWIs). Collaboration is broadly viewed as the ability and willingness to share resources, knowledge, and outcomes in a manner that benefits participating institutions: augmenting existing services, building capacity for nascent or unavailable services, and contributing to the longer-term societal outcomes associated with these programs.

With the aid of a person-centered approach, using surveys or interviews, participant perceptions of collaboration effectiveness can be ascertained. Alternatively, collaboration effectiveness can be assessed with a system-centered approach by analyzing the extent of shared publications, using bibliometrics or media products. Engaging interdisciplinary researchers, biomedical and social scientists, clinicians, and community health educators in team science across geographically dispersed institutions and academic departments, whose values for teaching, scholarship, and research may vary widely, will likely influence a center’s capacity to achieve grant outcomes and benchmarks. Working across universities, researchers are often expected to collaborate in teams, with one or more representatives from each institution.

Few field studies have explored team science development processes among interdisciplinary researchers at the onset of funded projects. The costs associated with studies that do not rely on survey methodology can be prohibitively labor intensive. Qualitative inquiry into team science is further complicated by a lack of conceptual consensus regarding the characteristics of each developmental stage. While others have used psychometric measures to assess collaboration, qualitative studies are infrequent.3 The importance of qualitative inquiry cannot be overstated, as it has the potential to elucidate the complex interacting factors in real-world environments. Engaging social scientists in research on the emergent state of team science can enrich inquiry via the development of theoretical models to inform researchers of its unique dynamic and contextual factors.4

Herein, we employed the long-standing typology of group development (akin to team development) proposed by Tuckman and by Tuckman and Jensen to assess collaborative team science dynamics.5,6 Tuckman and Tuckman and Jensen categorized the stages as: i) forming, ii) storming, iii) norming, and iv) performing. During forming, participants are typically engaged in establishing norms to guide interactions and work as they become acquainted with processes and goals. Characteristics of storming include responding to conflict, identifying differing views about the center’s mission, determining how work should be organized, and negotiating how much influence one member may hold over others. While in norming, participant energy is often dedicated towards building consensus and developing cohesion, identifying the mission, and establishing operational practices to guide work. As participants develop a sense of mutual respect and camaraderie, they enter the stage of performing, in which shared leadership and the assignment of roles and tasks based on members’ skills and expertise are often observed.5,6

This framework is widely accepted in psychological studies of group dynamics.7 Groups are typically composed of people who come together to resolve and remedy issues that influence the quality of their day-to-day living or impact relationships. In these settings, individuals are often afforded opportunities to draw upon the insights that the therapist and other individuals offer. Though the motivation for working in this type of group is different from that which brings team scientists together, the dynamics that characterize interpersonal group interactions are similar. Team scientists, like groups, tend to organize themselves by adopting rules or norms that regulate behavior. Members often align themselves with allies according to shared values and what is in their best interests. All groups naturally create a division of labor. The degree to which team members collaborate is likely influenced by i) perceived norms, ii) institutional demands, iii) perceptions of mutual benefits, iv) the degree to which a member feels validated by other members, v) the level of trust, safety, and cohesion a member feels, and vi) individual perceptions of the equitable distribution of power, control, and contributions. Assessing collaboration may broaden our understanding of how each person’s actions and contributions fit within the larger context of the aims of the grant. After all, effective teams are not accidental. They must be meticulously formed by bringing together highly skilled, highly motivated individuals who have a clear picture of their goals and can recognize tangible evidence of their achievements.7

Researchers have argued that how PWI and MSI team scientists perceive their collaboration is not well understood. Warren, Behar-Horenstein, and Heard, and Davis, Warren, and Behar-Horenstein suggest that these relationships, typically, are not true partnerships.8,9 In a 20-year review of MSIs and PWIs, Davis et al. (in press) reported a pattern of inequitable partnerships and mechanisms that averted or prevented MSIs from maximizing their research scholarship, faculty productivity, and funding capacities.9 They recommended studying MSI-PWI partnerships while they are evolving rather than at the conclusion of grant funding. In this study, we assessed MSI-PWI team scientists’ perceptions of interdisciplinary collaboration effectiveness within the context of the group development theoretical framework. To the best of our knowledge, qualitative studies such as this have not been undertaken while a partnership is evolving.

Studies of interdisciplinary team science

A growing interest in enhancing cross-disciplinary collaboration among health scientists has prompted several federal agencies, including the NIH, to establish large, multicenter initiatives to foster collaborative research and training. Yet, few studies have focused on the impact or effectiveness of interdisciplinary collaboration in science teams. Unique factors create an imperative to study these teams in context by assessing the processes related to working in transdisciplinary teams. Providing such opportunities can advance our understanding of varied forms of collaboration.10 In a review of quantitative studies, Hall et al. provide a comprehensive overview of team science characteristics including its values, team formation and composition, processes central to effective team functioning, and organizational and institutional factors that impact the success and effectiveness of collaborations.10 New strategies for evaluating research processes and products, as well as the longer-term societal outcomes (i.e., public health improvements) associated with these programs, are essential to assessing the effectiveness of collaborative scientific initiatives.11-14 In this paper, we describe a qualitative approach that was used to assess the effectiveness of collaboration among MSI-PWI multi-disciplinary science teams.

Previous evaluation studies have assessed collaborative processes and outcomes during the mid-term or later stages of an initiative, while Hall et al. assessed antecedent factors present at the outset of an initiative using indexes of collaborative readiness, along with additional measures of near-term collaborative processes that may influence the effectiveness of team collaboration over the duration of the program.12 Studies of teams, groups, organizations, and management in industry and the military provide a body of evidence for effective teaming.12 Previous studies of teams, undertaken in laboratories, aviation and military settings, and complex organizational environments,13-15 were guided by the input-process-outcome model of teamwork.16 Facets of collaboration such as individual attitudes, information sharing, solution identification, and relationships among concepts were assessed to determine how they impact team effectiveness. In recent years, multilevel analyses have offered a more holistic understanding of collaboration.17,18 However, the increased demand for scientific collaborations has outpaced an understanding of the factors that are needed to support teams in science, such as institutional structures, policies, and culture. Therefore, assessing the effectiveness of collaboration among team science researchers across their disciplinary, organizational, and cultural boundaries is vital to address increasingly complex challenges and opportunities in science and society.10

Background information about this MSI-PWI partnership

The center (hereafter, “Center” or “CaRE2”) where we conducted this study is composed of one MSI—the Florida Agricultural and Mechanical University (FAMU)—and two PWIs—the University of Florida (UF) and the University of Southern California Norris Comprehensive Cancer Center (USC-NCCC). The goals of our grant are to reduce cancer disparities in Blacks and Latinos, train and increase the pool of underrepresented Black and Latino scientists conducting health disparity research, increase research capacity at a minority-serving institution (i.e., FAMU), and increase cancer disparity research at UF and USC-NCCC. Two full projects (funded across the entire grant period) and a pilot project (funded for years one to three of the grant) included investigators from the Center institutions to augment the number of specimens collected from Blacks and Latinos, representing a wide range of subpopulations within these minority groups. The Center conducted prostate cancer and pancreatic cancer studies to generate new findings about the impact of these cancers on understudied Black and Latino populations. The aims of these projects were to transfer cutting-edge and innovative technologies to FAMU and to provide research training opportunities across the partnership to 120 trainees, including doctoral, graduate, and postbaccalaureate students, postdoctoral fellows, and Early-Stage Investigators (ESIs). Overall, the aim of the Center was to expand the research focus at UF and USC-NCCC and address health disparities in prostate and pancreatic cancers. The research projects were supported by six shared workgroups (Administrative, Biostatistics and Methods, Community Outreach, Research Education, Planning and Evaluation, and Tissue Modelling). Each institution hired its own coordinator to oversee the administrative tasks of the grant. As many as 48 individuals, including key investigators, site coordinators, and internal advisory committee members, comprised Center personnel.

The Center was funded by the National Cancer Institute U54 Comprehensive Partnerships to Advance Cancer Health Equity (CPACHE) program. The premises for this funding mechanism are to i) reduce the burden of cancer disparities while building capacity for biomedical research and training at MSIs, ii) increase the diversity of underrepresented minorities (URMs) in biomedical sciences, and iii) foster formal collaborations between research-intensive universities and MSIs. Partnerships between research-intensive institutions and MSIs in cancer disparities research were initiated to: i) identify biomarkers of cancer incidence, ii) reduce cancer occurrence, iii) provide resources to community health advocates, and iv) empower community members whose lives are impacted by cancer. Currently, 16 CPACHE centers are funded across the US.

As required by the National Institutes of Health (NIH), annual self-evaluation is conducted through the submission of progress reports from the individual projects under the program funding mechanism. Each program monitors and reports the degree to which it has attained proposed objectives. Researchers may assess the quality of programs using outcome measures or evidence of impact. Program outcome measures may include i) reporting the number of publications that trainees and mentors co-author, ii) reporting the number of trainees who matriculate, continue in academic cancer-related research careers, or co-author publications, iii) specifying the number and list of presentations/meetings that include trainees, iv) reporting the number of program graduates who receive R01 or career development awards, v) documenting the number who hold positions as professors or leadership roles on NCI committees, or vi) describing the quality of mentoring, among others. However, beyond annual self-reporting, there are no requirements for assessing the effectiveness of internal team scientists’ dynamics or collaborative engagement.

Materials and Methods

We used a person-centered approach to data collection by surveying participants. The resource and labor intensity of interviewing rendered survey use more feasible. During the summers of 2019 and 2020, we asked Center key investigators to respond to four open-ended questions in a Qualtrics survey. The survey did not collect demographic information or personal identifiers; all responses were anonymous. The questions were designed to assess the Center’s communications and interactions (Table 1). During Summer 2019 (Year 1), 38 individuals were invited; 21 (55%) completed the survey. In Summer 2020 (Year 2), 48 were invited, and 33 (69%) participated. The larger number of invitees in Year 2 reflected, in part, an increased number of internal advisory committee members. The total number of participants at each institution ranged from 10 to 15 during Year 1 and from 14 to 17 during Year 2. The number of invited participants per institution included 10 from FAMU, 13 from UF, and 15 from USC during Year 1. During Year 2, there were 14 invited participants from FAMU and 17 each from UF and USC. Participants, who came from diverse racial and ethnic backgrounds, included basic and social scientists, clinicians, and community health educators who were tenured, tenure-accruing, or non-tenure-accruing.

For purposes of this study, the overall number of participants was considered the unit of analysis; participant data were analyzed as a single entity. As the authors, we are the principal investigators (PIs) of the Planning and Evaluation Core (PEC), a group of researchers assigned to provide an objective assessment of the Center’s outcomes. After de-identified data were downloaded into Excel, we analyzed the dataset inductively, identified emergent themes/subthemes characteristic of respondents’ replies, and developed conceptual definitions. Inductive analysis, also referred to as “in vivo coding,” draws its codes from the actual language used by the participants themselves in the qualitative data.19 We used the participants’ written responses in our analysis. In vivo coding is appropriate for studies that highlight and give priority to participants’ voices.20
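As an illustration of how such an inductive, in vivo coding workflow might be organized, the sketch below shows one possible way to tag de-identified, open-ended responses with phrases drawn from participants’ own words. It is a hypothetical Python example: the file name, column names, and candidate phrases are ours for illustration only, and the analysis described above was carried out by the researchers themselves rather than by software.

```python
# Hypothetical sketch of organizing de-identified open-ended survey responses
# for in vivo coding; the file name, columns, and phrases are invented.
import pandas as pd

# Load de-identified responses exported from the survey platform.
responses = pd.read_excel("center_survey_year1_deidentified.xlsx")

# Each row holds one participant's answer to one open-ended question.
# In vivo codes are short phrases taken verbatim from the response text.
codes = []
for _, row in responses.iterrows():
    text = str(row["response_text"]).lower()
    # Flag candidate in vivo codes wherever the participant's own words appear;
    # interpretation and grouping into themes remain the analysts' task.
    for phrase in ["mutual respect", "regular meetings", "pulling their weight"]:
        if phrase in text:
            codes.append({"question": row["question_id"],
                          "code": phrase,
                          "excerpt": row["response_text"]})

coded = pd.DataFrame(codes)
# Tally how often each participant-derived phrase recurs per question,
# as a starting point for developing categories and emergent themes.
print(coded.groupby(["question", "code"]).size())
```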

Data analysis

As researchers, we are a senior professor with expertise in qualitative research, a senior basic scientist with a wealth of experience in creating, implementing, and evaluating pipeline/pathway programs, and an early-stage investigator (ESI) in pharmacy practice with experience in community health initiatives. Prior to the analysis, we read participants’ responses to each of the four survey questions as separate entities. We met to discuss our independent notions of the emergent themes. Then, we used open coding while reading line by line. We compared open coding to ensure that a systematic approach to analysis was occurring before moving on to develop categories and emergent themes.

To ensure validity checking, one of us analyzed the content of two of the four questions. Another author checked the thematic interpretation of the first author’s analysis. This process was repeated for the other two questions across both datasets. We discussed the emergent themes and conceptual definitions and reached consensus. Themes for each survey question for Years 1 and 2 are shown in Table 1. We reached consensus on the identification of the categories as well as the themes that emerged in this study. The use of two independent coders helped us reduce the potential for bias. Reading line by line and coding segments helped us locate supporting quotations for the categories that were identified.
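To illustrate the kind of independent-coder comparison described above, the hypothetical snippet below lists the excerpts on which two coders’ theme assignments differ so that they can be brought to a consensus discussion. The excerpt identifiers, theme labels, and the simple percent-agreement figure are ours for illustration; the authors report reaching consensus through discussion rather than by computing agreement statistics.

```python
# Hypothetical comparison of two coders' theme assignments; excerpt IDs
# and theme labels are invented for illustration only.
coder_a = {"r01": "communication frequency", "r02": "respect", "r03": "point person"}
coder_b = {"r01": "communication frequency", "r02": "collaboration", "r03": "point person"}

agreements, disagreements = [], []
for excerpt_id in sorted(set(coder_a) | set(coder_b)):
    a, b = coder_a.get(excerpt_id), coder_b.get(excerpt_id)
    (agreements if a == b else disagreements).append((excerpt_id, a, b))

# Simple percent agreement; the disagreements are the items brought to a
# consensus meeting, mirroring the peer-debriefing step described above.
total = len(agreements) + len(disagreements)
print(f"Percent agreement: {len(agreements) / total:.0%}")
for excerpt_id, a, b in disagreements:
    print(f"Discuss {excerpt_id}: coder A -> {a!r}, coder B -> {b!r}")
```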

Table 1.

Themes/subthemes and conceptual definitions associated with survey questions.

1. What is working well within your core/project team?
• Year 1 themes/subthemes: Collaboration; Respect; Communication frequency; Point person; Meeting agendas/protocols
• Year 2 themes/subthemes: Regular meetings; Communication frequency
• Conceptual definition: Processes, practices, and behaviors characteristic of teamwork

2. Do you have any recommendations to improve productivity?
• Year 1 themes/subthemes: Reduce administrative tasks; Efficient/user-friendly data/information management; Equity in workload; Effective and efficient communication
• Year 2 themes/subthemes: Lack of centralized workspace; Lack of procedural governance; Inefficient work allocation
• Conceptual definition: Protocols and practices perceived as linked to successful trainee and research outcomes

3. What evidence can you provide that demonstrates active collaboration with CaRE2 partners? Please provide examples.
• Year 1 themes/subthemes: Shared responsibility/leadership; Collaboration (infrastructure resources, research projects, scholarship, actualizing new projects); Cross institutional training
• Year 2 themes/subthemes: Increased productivity; Training activities
• Conceptual definition: Practices and processes aligned with teamwork

4. How can collaborations be strengthened?
• Year 1 themes/subthemes: Enhance workflow efficiency (provide advanced notice and clarity for grant needs, require timely email response, post Center-wide resource documents in a Center-wide folder); Forge partnership identity; Strengthen collaborations
• Year 2 themes/subthemes: Strengthen collaboration (promote cross core-research project contributions, increase intra-institutional collaboration networks and authorships, increase intra-institutional engagement of MSI student trainees); Address conflict resolution
• Conceptual definition: Practices and processes perceived as integral to successful benchmark attainment

This rigorous and systematic approach allowed us to feel confident that what we report is representative of participants’ perspectives. Attention to credibility and confirmability facilitated establishing trustworthiness. Credibility, or confidence in the truth of the findings, was achieved through triangulation and peer debriefing. Triangulation was accomplished by i) using three analysts, ii) reviewing multiple responses to each survey question, iii) using qualitative line-by-line coding, and iv) peer debriefing to ensure the accuracy of interpretations. Confirmability was achieved by engaging more than one person in analyzing the data.21

We also report the Center’s benchmark attainment by providing baseline data at the inception of funding and evidence of publications, presentations, funded projects, grant submissions and awards, the number of URM student trainees and ESIs, and other awards at the close of Year 2.

Results

In this section, we provide an overview of the themes and subthemes, along with anecdotal support, for each survey question for Years 1 and 2. Next, we provide a comparative analysis of the findings and describe the association between findings and the stages of group development. We conclude this section with a summary of baseline and Year 2 outcomes, referred to as “benchmark attainment.”

Working well within cores/project teams

Year 1

Five themes emerged from the analysis of what is working well within the core/project teams: collaboration, respect, communication frequency, point person, and meeting agendas/protocols. These themes were identified as processes, practices, and behaviors characteristic of teamwork.

Collaboration

Regarding collaboration, participants described the impact of “teamwork” and characterized it as a “good division of labor.” During the first year of the grant, they reported that, “collaboratively we established and later revised a protocol to guide policies and practices (effort, documentation/record keeping, communications, guidelines for sending and receiving communications, and workflow).”

Respect

Respect was exemplified by the perception of a sense of “strong camaraderie” as well as “mutual respect and commitment of the core members” devoted to “reducing cancer health disparities.”

Communication frequency

Communication frequency, experienced via email, phone, regularly scheduled meetings, and Zoom videoconferencing/recordings, was exemplified by the practice of “regular meetings” and the use of “basic video-conferencing systems” that supported what participants described as “efficient and effective” communication.

Point person

The assignment of a point person—someone who schedules and keeps the group on task—was appreciated for his role in “Having someone [to] arrange conferences and remind us of deadlines for progress report.” Remarking about core leadership, one participant pointed out that having this type of leadership provides “glue” to the communication that facilitated honest and forthright communication and served to keep “the group on task and held accountable.”

Meeting agendas/protocols

Advanced preparation and notification of meeting agendas/protocols ensured the implementation of “comprehensive [and planful] agendas” as well as “protocols [that provided guidelines for implementing] policies and practices.”

Year 2

During the second year of funding, only two themes were noted: regular meetings and communication frequency.

Regular meetings

Participants reported that “regular meetings and having joint activities that we plan together” were evidence of the role that communication plays in successful Center initiatives.

Communication frequency

Many respondents noted “communicating and cooperating with each other via emails and virtual meetings” were “critical to come together as a more cohesive unit.” One participant remarked that “communication [was] held to a high standard which keeps us on track.” Another stated that, “the professional relationship is strong, effective and pleasant.” Others pointed out the meeting frequency promoted working together and “fluid,” cooperative communication.

Comparison between Years 1 and 2

In Year 1, collaboration, respect, communication frequency, point person, and meeting agendas/protocols, characteristic of the stage of forming, were described as what is working well. Interestingly, collaboration and respect were not evident in Year 2. This subsequent emphasis on frequency of meetings and communication may have resulted from the firm establishment of practices set in Year 1, suggesting that participants had begun norming.

Recommendations to improve productivity

Year 1

Four themes emerged from an analysis of recommendations to improve productivity: reduce administrative tasks, equity in workload, effective and efficient communication/action-driven meetings, and efficient/user-friendly data/information management. The focus of these themes was on implementing protocols and practices that were perceived as linked to successful trainee and research outcomes.

Reduce administrative tasks

Participants reported a need to reduce administrative tasks. For example, participants stated that there needed to be a stronger focus on the “success of our trainees [by] providing [and supporting] training. Excessive administrative demands such as reports” were viewed as a distraction to that goal.

Equity in workload

Participants expressed considerable concern about equity in workload. Although they observed a practice of shared leadership, several reported that not everyone was “pulling their weight.” This was exemplified by relevant core members who did not attend scheduled conference calls or complete assigned tasks and neglected to notify group members beforehand. Others remarked that “productivity would be improved by greater focus on action items during meetings—what will be done, by whom [specifying the] due date and improved follow-through.” Notably, “inequity in workload completion, is beginning to affect the team outcomes and needs to be resolved.” Unsurprisingly, a greater “focus on process and less on outcomes” is likely to “affect the sustainability of the Center [and] needs to change if we want to be successful.”

Some participants suggested that “meeting agenda items [needed to focus] on short-term goals as well as longer-term goals [to] support productivity that is proactive rather than reactive.” One participant pointed out that [we] “have been in crisis mode for too long and need to get more proactive.” This participant recommended aligning meeting agendas with the work-plan presented in the grant timeline, creating a “check-in” regarding progress made for each aim, and establishing “due dates for all cores and projects would improve proactive productivity.”

Along the same line, participants pointed out that “most of the submissions [such as required reports] are being done last minute, which reduces the quality of assessment.” To remedy this, a participant suggested preparing documents “at least a week prior to the deadline so that other group members [could] provide their feedback.”

Effective and efficient communication/action-driven meetings

Participants pointed out gaps in communication and asserted the need for effective and efficient communication guided by action-driven meetings. They asked for “better and more in time communication and [increased] reminders of grant requirements.” There was broad disagreement about the frequency of meetings. Some called for “increasing the number of meetings so productivity could improve, since many topics cannot be covered in a single meeting and need follow-up meetings.” Others suggested reducing “the number of meetings and rely[ing] more on email exchange.”

Collectively, participants pointed out there was an absence of awareness regarding the roles and functions of cores and projects. To remedy this, they suggested distributing “a monthly newsletter/update ... to feature progress as well as impediments and to share information” about the role and accomplishments of each core and project in a brief communication. Similarly, some requested increasing clarity about the scientific projects and providing more guidance on “eligibility criteria for tissue [procurement] in order to carry out their scientific aims.” Another suggestion pertained to identifying “who is responsible for tracking samples provided to investigators,” while one individual asked if this process was going to be “centralized or [assigned] per study.” Another participant pointed out that “we have little or no interaction with other cores and teams. We don’t know what kind of resources and services are available. If each core can provide a simple list of services and expertise, that will help.” Along the same lines, another suggestion was to develop a “central contact liaison from each core, master calendar, [and] find an alternative solution to Dropbox.”

Others opined that there seemed to be a lack of preparation for monthly meetings. Excessively long agendas and lengthy discussions of each item were also regarded as impediments to productivity.

Sometimes I feel as if our core is not always prepared for monthly meetings. By that, I mean, it feels as though they are each considering agenda items for the first time and spend a lot of time discussing the pros and cons of the smallest decision in exquisite detail, when really by the time we are at the meeting this issue should be down to a couple of points in favor or against and the group should be making a decision one way or the other. Sometimes important decisions don’t get finalized in the interest of time, but I don’t know if that course of action really is in the interest of making the most of our time.

Efficient/user-friendly data/information management

Several participants expressed a need for efficient, user-friendly data/information management. They identified the major issue as “file/data management” and reported the difficulty of “managing documents, especially when multiple people need to review and edit the same document.” One recommendation was to “find an alternative solution to Dropbox” and switch over to Basecamp, regarded as a “more efficient platform for sharing tasks and for everyday communication.”

Year 2

During Year 2, three themes emerged from analysis of recommendations to improve the Center’s productivity: lack of centralized workspace, lack of procedural governance, and inefficient work allocation. These themes identified effective project management, organizational governance, and internal communication as essential to improving productivity.

Lack of centralized workspace

Participants suggested developing a centralized workspace. They asserted that this platform would enable shared meeting minutes and facilitate better management of assigned tasks to track progress. For example, participants suggested documenting and circulating “action items” to ensure “that individuals are held responsible.” Another participant proffered “building a task list with deadlines to keep track of outstanding/ongoing projects.” Database management platforms such as a “productivity-based platform” or a “platform designed for teams… such as Slack or Basecamp” were recommended as mechanisms to remedy this challenge. Others suggested such an investment would enhance Center productivity by “making it easier to find threads, exchanges, and documents.”

Lack of procedural governance

A few participants indicated a lack of procedural governance and adherence to standard operating procedures (SOPs) hindered productivity. Not adhering to meeting agendas was cited as an impediment to productivity. A desire to fix matters without necessary planning such as setting up processes “in the heat of the moment when faced with impending deadlines resulted in, things becom[ing] chaotic.” Participants urged developing protocols “agree[d] upon beforehand … to ensure all are comfortable with them,” followed by “adherence.”

Inefficient work allocation

Several participants commented on the lack of efficient work allocation. Central concerns were an incongruence between the percentage of time allotted to work on projects and the actual time needed to complete tasks. Some pointed out that “hiring of additional [personnel] to complete the work” would increase productivity. Another suggestion was “to engage more members for the core.” Along the same lines, one suggestion was to utilize “students that are assigned to the core who want to be engaged in publications and [work] on projects … as it would also help alleviate the workload.”

Some participants reported that the frequency of redundant email communications and the number of emergency emails that “necessitate a 24-hour turn-around” were excessive and diminished their availability to spend time on “actual core or project scientific tasks.” One participant pointed out that a missing or unspecified subject in an email thread, coupled with “... different topics [that] are discussed in one same email, or a topic is discussed in an email trail that has a different subject,” made it challenging “to find exchanges [which then resulted in] repeated emails.” Another participant suggested reducing the frequency of emergency emails and migrating “from email correspondence to a productivity-based platform like Slack, Basecamp ... so that communications can be organized by topic.”

Comparison between Years 1 and 2

Themes related to providing formative feedback and processes for managing scientific projects (characteristic of forming) were absent during Year 2, suggesting that these activities had become normative. Exemplifying storming, participants often expressed dissatisfaction with administrative tasks, such as the number of reports requested, during Year 1; this was not apparent in Year 2. While Year 1 efforts focused on developing processes and protocols, typical of norming, the lack of procedural governance and adherence to SOPs notable during Year 2 suggested that participants had entered the stage of storming.

Participants’ comments over Years 1 and 2 were similar. For example, Year 1 suggestions to shift to a more efficient and user-friendly data/information management system became more frequent in Year 2. During this time, participants emphasized a need for more centralized workspaces to effectively manage tasks, share documents, and review communication exchanges. This concern was characteristic of norming. Participants’ expressions of increased concern about inefficient communication practices, unnecessarily frequent and redundant emails, and emergency meetings are characteristic of storming. Workload equity was a common theme during both Year 1 and Year 2.

Evidence of active collaboration

Year 1

Three themes emerged from an analysis of the evidence that demonstrated active collaboration among the CaRE2 partners: shared responsibility/leadership, collaboration (four subthemes: infrastructure resources, research projects, scholarship, and actualizing new projects), and cross institutional training. Conceptually, these themes denoted practices and processes aligned with teamwork.

Shared responsibility/leadership

This theme was exemplified by a collaborative agreement among key investigators to rotate institutional leadership responsibility every four months. Synergistic activities were exemplified by PEC’s distribution of draft evaluation plans for all cores and research projects and their provision of formative feedback. Others described the presence of teamwork as evidenced by “key investigators’ active collaboration in writing and submitting annual reports” and a willingness to engage beyond the initial scheduled meeting times, by “following-up in our next meetings or via e-mail.”

Collaboration

Evidence of this theme was demonstrated by sharing infrastructure resources, research projects, scholarship, and actualizing new projects.

Infrastructure resources. This subtheme was described as sharing technology across the universities and convening cross-institutional trainings, meetings, and meeting presentation submissions. Key investigators’ cooperative “preparation of [the annual report], reviews, and submission” as well as “engagement in frequent phone calls, face-to-face meetings, communication via emails and Zoom meetings” exemplified collaboration.

Research Projects. Participants described the nature of their shared research efforts as Center projects working together to ensure the completion of “sequencing, determining the sample size, and sequencing coverage depth.” Others ensured that they obtained cells from UF and acquired a “modified version of a drug from FAMU.” Research project leads guaranteed the timely receipt of “samples to FAMU to characterize and provide FFPE tissue blocks.”

Scholarship. Collaboration in scholarship was evidenced by the dissemination through “conference presentations [and] publishing papers together,” submitting grant supplements and manuscripts while including members of different cores and projects as co-authors.

Actualizing New Projects. This subtheme was described as the “potential for new collaborations” and an intention to identify new areas of research through cross university communication. Participants mentioned that the Center offered support for new lines of investigation by “submitting Developmental Research Program (DRP) applications.”

Cross institutional training

Evidence of this practice was observed in “exchanging students across Center institutions.” Specifically, the Center provided support for “FAMU students and faculty [to travel and] participate in research projects at UF and USC.”

Year 2

Two themes emerged during the second year: increase in productivity and increase in training activities.

Increase in productivity

Participants were nearly uniform in sharing Center successes. They highlighted the noticeable increase in the number of publications, grant submissions, and presentations as evidence of burgeoning productivity. “Continued collaboration among core and project teams” led “to grants, presentations and publications.” Additionally, “we hosted a virtual summit to promote research collaborations Center-wide.” Participants also noted the surge in “joint publications, poster presentations, group discussions, [and] seminar presentations by core leaders.” Another key indicator of increased productivity was the generation of “new manuscript collaborations underway for various cores and research projects.”

Increase in training activities

Trainee engagement also increased in Year 2 as more “trainees participated in community activities.” All of the project PIs served as mentors to undergraduate and graduate students as well as to ESIs. Additionally, training “included advisory planning committees, patient advocates, and organizations” that fostered program expansion.

Comparison between Years 1 and 2

As expected during the first year of group (team) development and the stage of forming, participants focused on understanding available resources. Consistent with the stage of norming, they identified requisite infrastructure to actualize collaborations. Once these factors were identified, by Year 2, team development rapidly segued to performing and producing outcomes, marked by increased publications, presentations, and grants. Considering that change processes generally take three to five years, realizing an uptick in productivity within a bi-coastal institutional partnership is remarkable.

Strengthening collaborations

Year 1

Three themes emerged from an analysis of how collaborations could be strengthened: enhance workflow efficiency, forge a partnership identity, and strengthen collaborations. Conceptually, these themes referred to practices and processes perceived as integral to successful benchmark attainment.

Enhance workflow efficiency

To enhance workflow efficiency, subthemes included suggestions to provide advanced notice and clarity for grant needs, require timely email responses, and post Center-wide resource documents.

Advanced notice and clarity for grant needs

Participants asked for timely communication and reminders about grant report preparation requirements (i.e., how to format reports and present clearer expectations and details about when something should be completed). One suggestion was to ask PEC liaisons to meet with each core/research project team at least once per quarter to identify requisite data/documents needed to evaluate progress towards benchmark attainment.

Timely Email Response. Participants stressed the need to ensure timely responses. Some called for a standing practice to “respond to emails within 24 hours (one business day).” Others pointed out how failure to respond adversely impacted “evaluation processes,” resulting in lost opportunities for data collection from trainees and delaying or averting the “potential for scholarly dissemination.” Garnering grant funding in a highly competitive field is often viewed as a stellar accomplishment. Our Center received grant funding following submission of its initial proposal, placing it among the few CPACHE centers to achieve such an accolade.

Post Center Wide Resource Documents. Many of the participants requested making documents readily accessible and easy to locate by posting Center-wide resource documents in a designated folder. Among documents requested were: i) a master contact list with the phone numbers of all key investigators, ii) quarterly updates to a master list of project collaborators and potential mentors for trainees, iii) a list of the expectations for each core and for the overall Center, iv) institutional IRB approval letters for each partnership site, and v) a list of all evaluation activities and outcomes alongside CaRE2 benchmarks. Maintaining a running record of Center outcomes was suggested to facilitate quarterly and annual reporting.

Forge a partnership identity

Others recommended forging a partnership identity by attributing outcomes to the Center rather than giving accolades to individual institutions. Increasing efforts to document and publish educational outcomes from trainee and ESI assessments of their program and mentoring experiences was mentioned as a way to strengthen the partnership’s recognition. Several participants called for expanding the Center’s research repertoire beyond its initial support for the prostate cancer and pancreatic cancer projects. One participant suggested applying experience from pancreatic cancer to prostate cancer to access fresh prostate cancer cells/tissue and seeking additional funding to make the program projects stronger. Others recommended increasing the “collection of samples,” “identifying topics of interest among researchers,” or “specific projects that have value to all involved.” Another suggestion was that lead core members “become better acquainted with investigators at the other two sites.”

Strengthen collaborations

Suggestions aimed at strengthening collaborations pertained to infrastructure concerns, resolving Center problems, and data collection. Participants asserted that there was a need to foster equality in partnership activities. They recommended establishing a process to ensure equitable contributions and implementing action plans to resolve circumstances when this is not happening. They advised strengthening the infrastructure by scheduling meetings to “ensure findings are shared with the administrative core and other cores as needed” and holding more “face to face interactions” and “brainstorming sessions.” Others recommended “fostering authentic conversations to address valid concerns when things are not working” to ensure that problems impacting Center work would be addressed in a timely manner. Setting aside “separate meeting time to address problematic concerns” was recommended to ensure that special circumstances received appropriate attention and to avert distractions from the business that needed to be discussed during regularly scheduled meetings. Participants also offered suggestions to reduce survey burden by ensuring that trainee baseline measures were completed during onboarding.

Year 2

During Year 2, two themes emerged from the analysis of how to strengthen collaborations: strengthen collaboration and address conflict resolution. Subthemes for strengthen collaboration included promoting cross core-research project contributions, increasing intra-institutional collaboration networks and authorships, and establishing opportunities for the intra-institutional engagement of MSI trainees.

Strengthen collaboration

“Strengthen collaborations” refers to identifying practices that would promote, encourage, and support continuous integration and collaboration throughout the Center.

Promote cross core-research project contributions. Participants suggested increasing the frequency of interactions between cores and research projects to augment contributions to each other. Additionally, participants noted that adopting effective collaboration principles and practices such as fostering “mutual respect,” working as a “Center-based team rather than individuals,” and implementing team-based strategies would encourage “cross-team collaboration.” One participant stated that

Collaborations can be strengthened by effective communication, building trust, ensuring accountability, understanding each other’s role, and outlining expectations in the beginning. I believe sharing experiences and expertise can also assist in strengthening collaborations as everyone has something to bring to the table.

Having “more discussions amongst projects” beyond “a single meeting” was recommended to strengthen collaboration and as a mechanism that would encourage “brain-storming.” One recommendation was “to schedule a Center-wide meeting in addition to [the] annual investigator meeting . . . exchange and discuss findings” through a half-day retreat. These suggestions were grounded in a hope that such meetings would facilitate the emergence of new and innovative ideas and foster new collaborations. Other suggestions to strengthen collaborations were to “[write] grants together [that would] benefit our partners [and to develop] a mini grants program [to provide] funding to partners that would extend [the cores’] activities.”

Increase intra-institutional collaboration networks and authorships. Participants also suggested increasing intra-institutional collaboration networks and authorships. They remarked that there was an interest in developing future grant proposals and publications. Additionally, one participant suggested having “more opportunities for research summits or symposium.”

Increase intra-institutional engagement of MSI trainees. Participants offered suggestions for expanding the pipeline of future cancer health disparities researchers. To address this goal and further strengthen collaboration across the Center, one participant felt it was important to establish opportunities focused on developing a more “structured mechanism [for MSI trainees] to engage with cancer researchers” at the partner institutions.

Address conflict resolution

Regarding the theme of addressing conflict resolution, participants suggested reinforcing SOPs to ensure organizational governance. They also commented that SOP adherence could indirectly foster Center benchmark achievement. Some participants emphasized the importance of conflict resolution and pointed out the essential role of SOP adherence. One participant recommended that conflict resolution could be achieved by “a more detailed path for resolving conflicts” articulated within SOPs. Another participant suggested that “continuous improvement principles need to be followed in every activity.” This participant suggested listening to concerns expressed and then taking steps to make improvements. This individual emphasized the importance of acknowledging others’ observations stating that, “If one expresses a challenge, address and discuss it with the team members, and adopt the appropriate solution to make it better.”

Comparison between Years 1 and 2

The theme of enhancing workflow efficiency identified during Year 1 was not mentioned during Year 2. Representative of norming, participants expressed the importance of collaborative interactions within cores and research projects. Similarly, they also suggested increasing intra-institutional networks and authorships. This observation underscored researchers’ understanding that strengthening collaborations is valuable to successful benchmark achievement. During Years 1 and 2, while exemplifying norming, participants continued to stress the importance of expanding the Center’s research repertoire. Engaging MSI student trainees and addressing conflict resolution to strengthen collaborations were new themes that arose in Year 2, perhaps signifying participant recognition of their integral connection to continued performing.

Benchmark attainment

At the inception of funding for this Center, researchers had two publications. By the end of Year 2, the Center had produced 14 publications and 103 presentations, funded three projects, submitted 13 grant proposals, received nine grant awards, trained 63 URM students and ESIs, and received 10 other awards. This summary of the Center’s benchmark attainment provides a context for understanding how collaboration among interdisciplinary science teams may impact productivity.

Discussion

In this study, we used qualitative inquiry to provide a unique and nuanced level of insight that cannot be afforded by the sole use of statistical analysis. We codified team scientists’ written responses to open-ended survey questions to assess their perceptions of collaboration effectiveness within an MSI-PWI triadic research Center partnership. The findings offer insight into their feelings, beliefs, and actions relative to interdisciplinary, multi-university collaborative research endeavors. Building a contextualized, real-time understanding of how and why team scientists perceive collaboration effectiveness, and of their subsequent responses, may augment the rate and pace of future Center productivity as we use these findings to reify and normalize team development processes.8 After all, we cannot improve that which we cannot assess or understand. Specifically, the findings helped identify practices that are working optimally and those which are not supporting grant outcome attainment, and illuminated how well protocol and policy supported the Center’s aims.

Optimal practices

Many of the procedural processes, such as shared leadership responsibility, scheduled meetings, and frequent communication effectively served participants’ needs for sharing information. The use of Zoom technology, the primary mode of communication, may be useful for some project outcomes, such as developing new tools or methods, yet for other initiatives, this venue may not be advantageous for team scientists.22,23

A sense of teamwork, camaraderie, and mutual respect was described as characteristic of Center interactions in Year 1, but not during Year 2. In Year 1, participants reported that shared leadership was evidenced in the Center’s infrastructure, across research projects via triadic scholarship, and in cross institutional training. By Year 2, their responses seemed more nuanced. Evidence of increased publications and presentations, grant submissions, training activities, and internal documentation of benchmark attainment supported their perceptions. Consistent with the literature, our findings showed that scientific collaborations that span organizational (i.e., academic departments), geographic,2,24-26 and institutional boundaries2,24-28 generally enhance research impact. Our findings also support previous evidence that diversity (i.e., disciplinary expertise, gender, rank, race, and ethnicity) among members of a science team has certain advantages.25 However, we did not directly assess or find, as others have reported, that ethnic homophily was associated with increased likelihood of coauthorship.29-30

Practices that necessitate improvement

During the first year, participants remarked that there was a need to ensure equitable individual contributions among all core/research project investigators. By the second year, this issue was not mentioned, suggesting that perhaps resolution had been achieved. Also, during the first year, participants recommended aligning meeting agendas with the work plans in the grant proposal and checking to ensure that proposed activities were accomplished.

During the second year, their responses concerning the challenges impacting Center-wide work were more specific. They reported a lack of access to shared documents and a lack of procedural governance. Others have reported the positive effect of knowledge sharing, information acquisition, and information dissemination on team learning and team performance.31 Participant concern about a lack of access to Center-wide information is supported by previous findings.

Coordination, communication, trust, conflict, shared goals, and the availability of resources play crucial roles in team effectiveness and may be central to mediating team science productivity and impact.31-32 A lack of face-to-face meetings and conference attendance opportunities within the Center may have limited the success of longer-distance collaborations. Such venues may be critical for inspiring new collaborations.33-34

Participants described the burdens associated with too many meetings and emergency emails. Conversations with the multiple PIs revealed that unforeseen funding agency requests unwittingly contributed to the participants’ perceptions of feeling overloaded by persistent and poorly timed requests.

During partnership meetings, the multiple PIs stressed the importance of supporting MSI faculty through co-authorship and asserted that this was critical to fortifying the Center’s identity as an entity rather than three separate institutions. The Center’s competition for full and pilot research awards forged collaborative research projects across the partner institutions. To ensure the sharing of Center-wide announcements in a timely manner, institutional coordinators now send important reminders using the calendar function. As Cummings et al. noted, multi-university research centers that use fewer coordination mechanisms yield poorer outcomes.32 They may encounter an initial lag in productivity compared with other research groups.35 Coordination behaviors, such as establishing the division of responsibility for tasks and knowledge transfer among researchers and institutions, are predictive of project outcomes (i.e., new knowledge production, new tools, and student training outcomes).32

From a theoretical perspective, the findings mirror many characteristics associated with the stages of group development. Most of the interactions reported in Year 1 are characteristic of forming and norming. Year 2 interactions align more with storming, norming, and performing. The movement between these stages was variable, yet typical of group development, as participants moved in and out of, and back again to, various stages. It takes time for team scientists from various training paradigms and disciplinary expertise to find commonalities and to engender mechanisms that support collaborative work. There can be little doubt that team science researchers who have varied personalities, preferred work habits, and modes of communication, and who work in settings characterized by diverse institutional missions, will not instantaneously find harmony in interdisciplinary teams.

Benchmark attainment across the Center was expected to ensure continuous NIH funding. Meeting annual productivity objectives depended on the totality of output across the Center rather than expecting each of the shared workgroups to produce the same number of each product. The availability of site-based, lab-related resources and personnel to generate data, as well as faculty and trainee commitment to the Center, likely impacted the development of products, publications, presentations, funded projects, grant submissions and awards, the number of URM student trainees and ESIs, and other awards. Faculty and trainee effort was influenced by institutional willingness and ability to provide faculty release time from other academic responsibilities, such as teaching, to dedicate time to Center research projects, manuscript development and conference presentations, and mentored assistance to trainees. That this Center moved from two publications at the start of the grant to 14 publications by the close of Year 2 is testimony to participants’ diligent efforts as they continue building more cohesive teams.

These findings may assist newly created U54 CPACHE centers by informing them of issues that may hinder collaboration and productivity and by suggesting processes that can be implemented to ensure mutual benefits to MSIs and PWIs alike. A concise set of strategies for multiple principal investigators is provided (see Table 2).

Over time, team science members are likely to develop a shared culture grounded in a negotiated set of norms and values that support and constrain members’ behavior.4 We recommend that funded center researchers implement repeated evaluation to assess team scientists’ perceptions of collaboration effectiveness via qualitative or quantitative methods. Specifically, for this Center, we recommend repeating data collection near the end of Years 3, 4, and 5 of funding to determine whether these findings were representative of the initial stages of partnership building. Replication may also allow participants and team scientists to see whether raising awareness of these time-bound findings leads to changes in communication practices. Ideally, evaluative measures should be administered over the entire course of multi-year initiatives and across multiple sites simultaneously, including the beginning, near-term, and later phases. For prospective studies, we recommend offering participants a primer on team science competencies and assessing the relationships between their knowledge of team science competencies, collaboration, and productivity. In recognition of the paucity of literature on science team diversity, particularly where there is considerable cultural, national, and racial/ethnic diversity, we recommend future investigation.10

As shown here, how individuals perceive teamwork effectiveness is likely to impact their productivity. Assessing the perceived effectiveness of collaboration in team-based science research, while challenging and labor-intensive, is crucial to ensuring that the investments made through grant funding result in research centers achieving their programmatic goals. More importantly, it has the potential to advance our understanding of how well an investment in training has resulted in the anticipated outcomes.10-12 An in-depth analysis of individuals’ collective experiences can advance knowledge regarding how the effectiveness of science teams across a full range of profiles and contexts can be maximized.

Assessing the presence, or absence, of team competencies was not a focus of our research initiative. Moreover, this study includes only two years of data. Thus, we invite readers to interpret the findings while bearing this in mind. Whether requisite team competencies are enacted, and whether participants’ normative institutional cultures align with team science characteristics, are variables that future researchers may wish to investigate.10

We note that the findings are temporal and limited solely to those who participated in the online survey. We cannot determine whether the findings are representative of the views of individuals who declined participation, nor can we determine the reasons for their non-participation. The use of a convenience sample in a study of a single program is another limitation. The survey did not provide definitions of the terms “productivity” or “collaboration.” However, expectations for productivity and examples of collaboration were discussed in every written and verbal communication, so we have ample reason to believe that Center participants understood these terms. Nonetheless, we recognize that this assumption represents a potential limitation of this study.

Conclusions

The first two years of any partnership are inherently impacted by unforeseen challenges that are made more complex by institutional cultural differences and diverse personalities among team scientists (biomedical and social scientists, clinicians, and community health educators) whose backgrounds, experience, and training are often influenced by dissimilar paradigms. Expending effort to qualitatively assess collaboration effectiveness, while perhaps laborious, offers an opportunity to take corrective action, to foster productivity by building an understanding of issues as they arise, and to strategically minimize the deleterious impact of team dynamics and relationships that do not serve grant aims or that might otherwise sabotage them. The need to assess and cultivate an understanding of collaborative effectiveness through a qualitative lens of team development is not widely recognized, yet the findings in this study support the merits of its application. It is unsurprising that participants suggested the Center’s initial efforts were concentrated on processes and protocol. Raising awareness about what is working satisfactorily and identifying ensuing challenges might be helpful to investigators who experience weariness when they do not recognize the inherently taxing nature of group (team) development. Such insight may assist advancement toward increased team science cohesion and motivate investigator effort in garnering additional grants and publications.

Table 2.

Guiding principles for MSI-PWI partnership development.

1. Ensure transparency across the MSI and PWI partners by making all documents accessible via a cloud application.
2. Require equitable institutional commitment and accountability. Hold regularly scheduled meetings; create and review a checklist of tasks; identify and hold team members accountable for task completion. Hold Center-wide meetings quarterly to generate collaborations, share information, and address challenges.
3. Establish transparency during communications. Invite all leadership team members to articulate their needs and the challenges experienced while working together, and encourage conflict resolution in real time.
4. Develop mutually agreed-upon benchmarks of attainment for scholarly products and dissemination, and share them on the Center website. Ensure this information is also shared in a Center-wide webinar and during regularly scheduled core and research project meetings.
5. Communicate time-sensitive deadlines using the calendar function.
6. Frame the challenges and successes associated with team development as normative.
7. Use the stages of group development as an analytical framework to assess interactions and to guide formative changes.

Funding Statement

Funding: Research reported in this publication was supported by NIH/National Cancer Institute awards U54CA233444, U54CA233465, and U54CA233396. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. Availability of data and materials: Not applicable.

References

1. Wuchty S, Jones BF, Uzzi B. The increasing dominance of teams in production of knowledge. Science 2007;316:1036–9.
2. Jones BF, Wuchty S, Uzzi B. Multi-university research teams: Shifting impact, geography, and stratification in science. Science 2008;322:1259–62.
3. Farrell M, Schmitt M, Heinemann G. Informal roles and the stages of interdisciplinary team development. J Interprof Care 2001;15:281–95.
4. Fiore SM, Carter DR, Asencio R. Conflict, trust, and cohesion: Examining affective and attitudinal factors in science teams. In: Salas E, Vessey WB, Estrada AX, eds. Team cohesion: Advances in psychological theory, methods and practice. Bingley: Emerald Group Publishing Limited; 2015:271–301.
5. Tuckman BW. Developmental sequence in small groups. Psych Bull 1965;63:384–99.
6. Tuckman BW, Jensen MA. Stages of small group development revisited. Group Organ Studies 1977;2:419–27.
7. Johns G. Social behaviour and organizational processes. In: Johns G, ed. Organizational behaviour: Understanding and managing life at work. Harper Collins College Publishers; 1996.
8. Warren RC, Behar-Horenstein LS, Heard TV. Individual perspectives of majority/minority partnerships: Who really benefits and how? J Healthcare Poor Underserved 2019;30:102–15.
9. Davis AA, Warren RC, Behar-Horenstein LS. Review of HBCU and PWI partnership studies (1998-2018). J Negro Ed. (in press).
10. Hall K, Vogel AL, Huang GC, et al. The science of team science: A review of the empirical evidence and research gaps on collaboration in science. Am Psych 2018;73:532–48.
11. Masse LC, Moser RP, Stokols D, et al. Measuring collaboration and transdisciplinary integration in team science. Am J Prev Med 2008;35:S151–60.
12. Hall KL, Stokols D, Moser RP, et al. The collaboration readiness of transdisciplinary research teams and centers: Findings from the National Cancer Institute's TREC year-one evaluation study. Am J Prev Med 2008;35:S161–72.
13. Bell BS, Tannenbaum SI, Ford JK, et al. 100 years of training and development research: What we know and where we should go. J Appl Psych 2017;102:305–23.
14. Guzzo RA, Dickson MW. Teams in organizations: Recent research on performance and effectiveness. Ann Rev Psych 1996;47:307–38.
15. Ilgen DR, Hollenbeck JR, Johnson M, Jundt D. Teams in organizations: From input-process-output models to IMOI models. Annu Rev Psych 2005;56:517–43.
16. Klein KJ, Kozlowski SW. Multilevel theory, research, and methods in organizations: Foundations, extensions, and new directions. San Francisco, CA: Jossey-Bass; 2000.
17. Kozlowski SW, Ilgen DR. Enhancing the effectiveness of work groups and teams. Psych Sci Public Interest 2006;7:77–124.
18. Mathieu JE, Hollenbeck JR, van Knippenberg D, Ilgen DR. A century of work teams in the Journal of Applied Psychology. J Appl Psych 2017;102:452–67.
19. Strauss AL. Qualitative analysis for social scientists. Cambridge: Cambridge University Press; 1987.
20. Saldaña J. The coding manual for qualitative researchers. 4th ed. Los Angeles, CA: SAGE Publications Ltd; 2021.
21. Creswell JW, Guetterman TC. Educational research: Planning, conducting, and evaluating quantitative and qualitative research. 6th ed. MA: Pearson; 2018.
22. Cummings JN, Kiesler S. Collaborative research across disciplinary and organizational boundaries. Soc Studies Sci 2005;35:703–22.
23. Vasileiadou E, Vliegenthart R. Research productivity in the era of the internet revisited. Res Pol 2009;38:1260–8.
24. Abbasi A, Jaafari A. Research impact and scholars’ geographical diversity. J Informetrics 2013;7:683–92.
25. Bales ME, Dine DC, Merrill JA, et al. Associating co-authorship patterns with publications in high-impact journals. J Biomed Inform 2014;52:311–8.
26. Birnholtz J, Guha S, Yuan YC, et al. Cross-campus collaboration: A scientometric and network case study of publication activity across two campuses of a single institution. J Assoc Infor Sci Tech 2013;64:162–72.
27. Jeong S, Choi JY. Collaborative research for academic knowledge creation: How team characteristics, motivation, and processes influence research impact. Sci Pub Pol 2015;42:460–73.
28. Mayrose I, Freilich S. The interplay between scientific overlap and cooperation and the resulting gain in co-authorship interactions. PLoS ONE 2015;10:e0137856.
29. Dahlander L, McFarland DA. Ties that last: Tie formation and persistence in research collaborations over time. Admin Sci Qtrly 2013;58:69–110.
30. Freeman RB, Huang W. Collaborating with people like me: Ethnic coauthorship within the United States. J Labor Econ 2015;33:S289–318.
31. Xia L, Ya S. Study on knowledge sharing behavior engineering. Systems Engineering Procedia 2012;4:468–76.
32. Cummings JN, Kiesler S. Coordination costs and project outcomes in multi-university collaborations. Res Pol 2007;36:1620–34.
33. Binz-Scharf MC, Kalish Y, Paik L. Making science: New generations of collaborative knowledge production. Am Behav Sci 2015;59:531–47.
34. Freeman RB, Ganguli I, Murciano-Goroff R. Why and wherefore of increased scientific collaboration. NBER Working Paper No. 19819. Cambridge, MA: National Bureau of Economic Research; 2014. Available from: https://www.nber.org/system/files/working_papers/w19819/w19819.pdf
35. Hall KL, Stokols D, Stipelman BA, et al. Assessing the value of team science: A study comparing center- and investigator-initiated grants. Am J Prev Med 2012;42:157–63.
