Evaluation and Program Planning. 2023 Jan 16;98:102239. doi: 10.1016/j.evalprogplan.2023.102239

Remotely and collaboratively evaluating a campus-based therapy dog program during the COVID-19 pandemic

Shaneice Fletcher-Hildebrand a, Linzi Williamson a, Karen Lawson b, Colleen Dell c
PMCID: PMC9841739  PMID: 37086706

Abstract

The COVID-19 pandemic adversely affected the wellbeing of university students and adults in general, emphasizing the need for mental health programming that was compliant with physical distancing mandates. The present evaluation investigated mental health and social connection within the context of COVID-19 by remotely evaluating a virtual animal-assisted activity at the University of Saskatchewan – PAWS Your Stress. The purpose of this article is to outline our evaluation methods and findings, while calling specific attention to the collaborative strategies that were implemented within a remote, time-sensitive context. The evaluation findings revealed that remote animal-assisted programming can facilitate connections with humans and animals, and promote multiple mental health benefits, despite the lack of physical interaction with the animals. Our lessons learned indicate that remote program logic modelling workshops are feasible when suited to audience demographics. Further, our experience suggests that the Most Significant Change technique (a qualitative, participatory, storytelling method that elicits outcome data) can be useful in time-restricted evaluations, and the necessity of central steps in the process may vary depending on evaluation goals. This project has implications for future evaluation work, by demonstrating the effective use of remote methods that allowed for successful stakeholder collaboration.

Keywords: Remote programming, Animal-assisted activity, Mental health programming, Collaborative evaluation, Most significant change, Program logic modelling

1. Introduction

The COVID-19 pandemic has negatively affected mental health for university students (Elmer et al., 2020, Hamza et al., 2020) and adults in general (Dozois, 2021, O’Connor et al., 2021; see Banks, Fancourt, & Xu, 2021 for a review of the literature on COVID-19 and mental health). Data from focus groups suggest that, for adults, reduced social interaction during the pandemic has been associated with a variety of challenges, such as loss of motivation, loss of self-worth, and feelings of depression or anxiety (Williams, Armitage, Tampe, & Dienes, 2020). Further, various pandemic-related uncertainties (e.g., educational, employment, financial, general) were found to play a role in reduced goal progression, decreased creativity, and increased feelings of anxiety, depression, fear, and stress for university students and adults in general (Bussolari et al., 2021, Ela et al., 2021, Wilson et al., 2020, Yoon et al., 2021). This evidence indicates a need for mental health and wellbeing supports in order to reduce the negative influence of COVID-19 on university students and the general population.

The current project examined mental health and social connection in the pandemic context by evaluating a remote animal-assisted activity (AAA) offered primarily to students, staff, and faculty at the University of Saskatchewan in Saskatoon, Saskatchewan, Canada. Mental health and social connection contribute to university student success in terms of academic performance (Akgun and Ciarrochi, 2003, Eisenberg et al., 2009, Mattanah et al., 2012, Owens et al., 2012) and retention rates (Bowles and Brindle, 2017, Eisenberg et al., 2009, Hjorth et al., 2016, Hilde Ramsdal et al., 2018, Nicpon et al., 2006). For instance, university students who took part in a group-based social support intervention that covered topics such as creating new social ties and work-life balance demonstrated higher GPAs compared to the control group (Mattanah et al., 2012). More recently, qualitative and clinical interviews revealed that people who dropped out of college were more likely to encounter mental health challenges and have mental illnesses compared to those who were in their last year of college (Hilde Ramsdal et al., 2018). Moreover, the students in their final year of college often discussed their positive experiences with social support, while those who dropped out of college commonly mentioned a lack of social support (Hilde Ramsdal et al., 2018). University students also tend to experience higher levels of mental illness, such as anxiety and depression (Eisenberg et al., 2007, Evans et al., 2018, Garlow et al., 2008, Ibrahim et al., 2013, Robinson et al., 2016), compared to the general population (Lim et al., 2018, O’Donnell et al., 2015, Remes et al., 2016). This highlights the importance of implementing and evaluating novel programming, such as AAAs, that targets mental health and social connection, especially within the restrictive pandemic context.

This article outlines the unique experience of conducting an entirely remote, collaborative, and utilization-focused evaluation of a remote intervention. Limited time was available to complete the evaluations, as rapid feedback was needed to make programming decisions for upcoming academic terms. We outline the methods that were chosen for process and outcome evaluations of the AAA, which were guided by the remote, time-restricted context, and discuss evaluation findings and lessons learned. The main objectives of our evaluation efforts were to: a) determine how the newly remote program could be improved, and b) decide whether the remote program format should continue in some capacity when university students return to in-person learning.

1.1. Animal-assisted activities

We use the term animal-assisted activity (AAA) to describe programs in which “informal interactions/visitations [are] often conducted on a volunteer basis by the human-animal team for motivational, educational and recreational purposes” (IAHAIO, 2018, p. 4). Animal-handler teams in AAAs often have some form of training, but handlers do not conduct their professional duties with the animal (IAHAIO, 2018). For example, AAAs do not include healthcare professionals who work with animals in their practices (e.g., mental health therapies).

This paper focuses on evaluations of PAWS Your Stress, an AAA that operates out of the University of Saskatchewan (USask) in partnership with the local St. John Ambulance therapy dog program in Saskatoon. In the program, certified animal-handler teams volunteer to visit community members at locations such as hospitals, seniors’ homes, nursing homes, and USask. Therapy dogs are a type of assistance animal that work alongside handlers to provide beneficial outcomes for people other than their handlers through programs such as AAAs (McMichael & Singletary, 2021). In-person AAAs can provide multiple health and wellness benefits for different groups of individuals, such as university students, including feelings of comfort, love, support, acceptance, happiness, connection, and reduced feelings of stress, anxiety, and loneliness (Barker et al., 2016, Binfet, 2017, Binfet et al., 2018, Crossman et al., 2015, Dell et al., 2015, Grajfoner et al., 2017, Lalonde et al., 2020, Sokal et al., 2021, Ward-Griffin et al., 2018).

1.2. PAWS Your Stress: in-person sessions to virtual sessions

The USask PAWS Your Stress program was founded by Dr. Colleen Dell in 2015 and aims to improve overall mental health and wellbeing by providing attendees with feelings of comfort, love, and support. Before the onset of the pandemic, the program was delivered in-person every two weeks, typically for one hour at libraries across the USask campus. During student final exams, sessions were offered twice per week for two hours each. While USask students were the primary target audience, USask staff and faculty were welcome to attend. During the session, USask community members visited with St. John Ambulance therapy dog handler-animal volunteers in a drop-in style, in which attendees interacted with the dogs (e.g., petting, scratching, talking, etc.), watched the dogs follow commands, talked with the handlers, and connected with other peers.

During the COVID-19 pandemic, the team adapted the program for remote (i.e., live Zoom sessions) and online (i.e., social media) formats to comply with pandemic safety mandates. In this paper, we will focus on the evaluation of the remote program delivery. One-hour, drop-in style Zoom sessions were offered to the USask community once every two weeks from September – November 2020 and January – March 2021, increasing to twice a week during final exams (December 2020 and April 2021). Because the sessions were advertised on social media platforms, the broader community also had access to the sessions (i.e., people located worldwide who were not USask staff, students, or faculty).

Program coordinators moved attendees when requested between two or three Zoom breakout rooms which each contained two or more handler-animal teams. Peer health mentors (volunteering students from USask’s peer health initiative) were also often present in the PAWS Your Stress Zoom sessions for student support if requested. In addition to St. John Ambulance handlers and therapy dogs, other individuals associated with the USask One Health and Wellness office volunteered as handler-animal teams with their non-therapy dogs or other pets (e.g., bearded dragon, cat, guinea pigs). These non-therapy dog teams were considered special guests in the Zoom sessions.1 Handler-animal teams welcomed session attendees when they joined the Zoom platform, facilitated conversation with others, and answered questions.2

1.3. The current evaluations

A previous evaluation of PAWS Your Stress completed by Dr. Linzi Williamson focused on the online program (i.e., social media efforts).3 Research on the in-person version of the program has also been completed (see Dell et al., 2015; Griffith, 2016; Lalonde et al., 2020). The current process and outcome evaluations were conducted by the first author (SFH) and focused on the remote program (i.e., the Zoom sessions). This paper contributes to the literature regarding both AAAs and program evaluation. Although literature on the positive effects of AAAs is abundant for different populations, including university students (e.g., Barker et al., 2016; Binfet, 2017; Binfet et al., 2018; Crossman et al., 2015; Dell et al., 2015; Grajfoner et al., 2017; Lalonde et al., 2020; Nepps, Stewart, & Bruckno, 2014; Sokal et al., 2021; Ward-Griffin et al., 2018), this article presents a unique, remote AAA, in which attendees experienced multiple benefits despite the lack of physical interaction with dogs and special guest animals. With regard to evaluation research, this article contributes to understanding a heavily modified and rapid version of the Most Significant Change (MSC) technique (Dart and Davies, 2003, Davies and Dart, 2005). The MSC technique is a qualitative, participatory, storytelling method that elicits outcome data and can be helpful for identifying unexpected outcomes (Dart and Davies, 2003, Davies and Dart, 2005). The second column in Table 1 briefly outlines the steps involved in the MSC process. While numerous published evaluations have used this technique (e.g., Aisiri et al., 2020; Limato, Ahmed, Magdalena, Nasir, & Kotvojs, 2018), and modified versions of it (e.g., Connors et al. 2017; Ho, Labrecque, Batonon, Salsi, & Ratnayake, 2015), our time-restricted, remote approach required further adaptations.

Table 1.

Comparison of the 10-Step Most Significant Change technique with Our Approach.

| Most Significant Change technique step | Brief description of step according to Davies and Dart (2005) | Our approach |
| --- | --- | --- |
| 1: How to start and raise interest | Familiarize stakeholders with the approach | Describe the technique in a remote meeting using a brief PowerPoint |
| 2: Defining domains of change | Identify categories of significant changes that program participants may experience | Step omitted because domains are not essential when handling small numbers of significant change stories (Davies & Dart, 2005) |
| 3: Defining the reporting period | Determine the frequency and timeline in which stories will be collected | Collect one round of stories between March and April 2021 |
| 4: Collecting significant change stories | Choose a data collection method that will elicit significant change stories | Semi-structured qualitative interviews with program attendees and qualitative questionnaires with animal handlers |
| 5: Selecting the most significant of the stories | To establish valued program outcomes in an organization, hierarchical groups select the stories they perceive as most significant, starting with the lowest hierarchy | Step omitted in interviews with attendees because establishing shared program values was not an objective of this evaluation (see the section below on lessons learned about the approach); step included in qualitative questionnaires with handlers to choose three potential follow-up interview participants |
| 6: Feeding back the results of the selection process | Share results of the selection process with participants who shared MSC stories | Step omitted because no single significant change story was selected in Step 5 |
| 7: Verification of stories | Those who took part in the selection process (Step 5) judge the accuracy of the significant change stories | Step omitted because Step 5 was not completed and, based on our program knowledge, no significant change stories included inaccuracies |
| 8: Quantification | Analyze qualitative data using quantitative approaches | Step omitted because quantitative data were not needed to address the evaluation questions |
| 9: Secondary analysis and meta-monitoring | Conduct different types of analyses in addition to the selection process in Step 5 | Thematic coding, positive and negative change coding, and PLM coding were completed (Davies & Dart, 2005)a |
| 10: Revising the system | Make changes to the MSC technique to improve the process | Step omitted because we collected only one round of significant change stories |

a Only the PLM coding results are presented in this article. The results of the other coding strategies can be viewed in the outcome evaluation report at www.therapydogs.ca/evaluation

Methods, analyses, and results are discussed according to three phases of the evaluation: program logic modelling workshops, process evaluation, and outcome evaluation. Both evaluations were collaborative and utilization-focused (Patton, 2008, Rodríguez-Campos, 2018), whereby the PAWS Your Stress team (i.e., the program founder, coordinators, and other volunteering team members) were involved in the evaluation planning, data collection, and final report. For example, questionnaires and interview guides were co-developed with the team. Including intended program users in the evaluation, especially in the planning stages, creates a mutual understanding of the evaluation purpose (Patton, 2008). This agreed-upon purpose guides methodological choices that are conducive to the use of the evaluation findings for the intended users (Patton, 2008).

2. Phase 1: program logic modelling workshops

Program logic models (PLM) are graphics that include program components (e.g., resources and activities) that are expected to produce certain outcomes (benefits) for the target audience, and illustrate the links between different expected outcomes (Knowlton & Phillips, 2013). It was necessary to update the in-person PLM for PAWS Your Stress to more accurately reflect the remote version of the program. Multiple inputs, activities, and outcomes from the in-person model were no longer relevant, and there were many new aspects of the remote program that were essential to include. Stakeholder-involved PLM development efforts vary widely with regard to style, length, audience size, and audience composition (Afifi et al., 2011, Cooksy et al., 2001, Chanfreau-Coffinier et al., 2019, Green, 2005, Helitzer et al., 2010, Porteous et al., 2002, Shakman and Rodriguez, 2015). Due to the remote context of the PAWS Your Stress evaluation, we did not follow any previously established PLM workshop guides (e.g., Green, 2005; Porteous et al., 2002; Shakman & Rodriguez, 2015). These approaches assume in-person gatherings, and therefore, suggest techniques that are suited to face-to-face interactions (e.g., using flipcharts or paper handouts). Previously established procedures also recommend detailed discussions of PLM elements, which was not necessary in our case as all team members were familiar with PLMs. Thus, our approach accounted for team members’ PLM knowledge, the number of team members involved in the evaluation (six to seven), an appropriate length of time given team members’ responsibilities, and the remote context due to the COVID-19 pandemic. These factors indicated that short, remote, meeting-style workshops would be appropriate to adjust the existing PLM and refresh team members on how PLMs are developed.
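Conceptually, a PLM is a directed mapping from program components (inputs, activities, target audience) to linked layers of expected outcomes. The sketch below represents this structure as a small Python data structure; the component names are purely illustrative paraphrases drawn from the program description in this article, not the actual contents of the PAWS Your Stress model in Fig. 1.

```python
# A program logic model represented as a minimal data structure.
# Component names are illustrative only, loosely paraphrased from this
# article's program description; they are not the actual PLM contents.
plm = {
    "inputs": [
        "Zoom platform",
        "St. John Ambulance handler-dog teams",
        "program coordinators",
    ],
    "activities": [
        "biweekly one-hour drop-in Zoom sessions",
        "twice-weekly sessions during final exams",
    ],
    "target_audience": ["USask students", "USask staff and faculty"],
    "short_term_outcomes": ["feelings of connection", "reduced stress"],
    "intermediate_outcomes": ["improved mental health and wellbeing"],
}

# Links between outcome levels: each short-term outcome is expected to
# contribute to an intermediate-term outcome.
outcome_links = {
    "feelings of connection": "improved mental health and wellbeing",
    "reduced stress": "improved mental health and wellbeing",
}

# Simple consistency check: every linked outcome must appear in the model,
# mirroring the revision work done in the workshops (removing stale links,
# adding new ones).
for short, intermediate in outcome_links.items():
    assert short in plm["short_term_outcomes"]
    assert intermediate in plm["intermediate_outcomes"]
```

Representing the model this way makes the workshop task concrete: revising the PLM amounts to editing the component lists and the outcome links while keeping the links consistent.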

To align with a collaborative approach, two 1.5-hour PLM development workshops were completed via Zoom with the PAWS Your Stress team in November 2020 (n = 6) and February 2021 (n = 7). The first focused on updating process-related PLM aspects (inputs, activities, target audience), while the second concentrated on revising program outcomes. The workshops also functioned to build evaluation capacity within the team, as it “involve[d] the design and implementation of teaching and learning strategies to help individuals, groups, and organizations, learn about what constitutes effective, useful, and professional evaluation practice” (Preskill & Boyle, 2008, p. 444). Capacity-building is a common approach used in evaluation initiatives with the goal of program improvement (Patton, 2008) and is a common practice across stakeholder-involved evaluation approaches (Fetterman, Rodríguez-Campos, Wandersman, O’Sullivan, & Zukoski, 2018). This approach offers stakeholders a chance to build their own evaluation skills to use in future scenarios (Preskill & Boyle, 2008).

For both sessions, team members were emailed the in-person version of the PLM two weeks in advance and asked to reflect on how items changed or stayed the same in the remote program. During the sessions, SFH refreshed team members on PLM definitions (e.g., inputs, activities, target audience, and short-term, intermediate-term, and long-term outcomes). While the in-person PLM was shown using the screenshare feature on Zoom, team members were asked to discuss how the PLM components needed to be altered in order to more accurately portray the remote version of the program. Following both workshops, team members were provided with meeting minutes and were allowed to clarify and/or add any points. The resulting PLM specific to USask students is depicted in Fig. 1.

Fig. 1.

Fig. 1

Program logic model for the remote version of PAWS Your Stress.

2.1. Lessons learned

The PLM workshops were effective in establishing a revised, stakeholder-informed model, but warrant reflection on why the method was successful and how it could have been improved. We identified three lessons learned based on our completely remote meeting-style sessions. First, our approach demonstrated the importance of establishing a workshop style that was appropriate for the audience, corroborating previous work (Green, 2005). Additionally, collaborative evaluation principles indicate that developed techniques should give thought to stakeholder qualifications (Rodríguez-Campos, 2018). As all team members were familiar with PLMs, the evaluator did not need to describe PLMs in-depth, such as in other more detailed approaches (Porteous et al., 2002). Considering the team’s existing knowledge, their additional work responsibilities, and the number of team members, a short workshop was practical and allowed ample time to refresh individuals on PLM components and consider PLM revisions. Our approach supports Green’s (2005) findings, revealing that logic modelling workshops do not require resource intensive strategies to produce effective PLMs.

Second, remote PLM workshops can be feasible, despite the traditional practice of in-person gatherings (e.g., Chanfreau-Coffinier et al., 2019; Porteous et al., 2002; Shakman & Rodriguez, 2015). The screen sharing feature on Zoom allowed everyone to view the in-person version of the PLM and work jointly to suggest amendments. It is important to bear in mind that, in the present case, team members worked from an existing PLM. Other virtual software, such as Dylomo, may be more useful when creating a new PLM, in which components can easily be moved and adjusted in real time. In terms of session length, collaborative PLM activities are often longer than 1.5 hours in total (e.g., Chanfreau-Coffinier et al., 2019; Green, 2005; Shakman & Rodriguez, 2015). However, 1.5 hours was appealing and practicable for our workshops given the virtual nature of the interaction. The term Zoom fatigue has recently been used to describe exhaustion felt from using videoconferencing platforms (Bailenson, 2021). To minimize feelings of Zoom fatigue, it was crucial to limit the workshop length, while allowing sufficient time to discuss the PLM.

Finally, it may have been advantageous to measure evaluation capacity-building outcomes regarding uptake of PLM knowledge and skills. Only half of published evaluations using capacity-building strategies report on stakeholder learning outcomes (Labin, Duffy, Meyers, Wandersman, & Lesesne, 2012), despite recommendations to include this step (Preskill & Boyle, 2008). Because capacity-building was not the main purpose of the workshops, we did not think it necessary to collect such data. However, we believe that capacity-building outcome data could have been used to improve future endeavors by exploring what stakeholders learned, and which procedures facilitated learning (and sustained knowledge). Capacity-building strategies have the potential to elicit cognitive, behavioral, and affect-related outcomes for stakeholders (see Preskill and Boyle, 2008 for a review of literature), rendering this meta-evaluation approach useful for informing future PLM workshop sessions.

3. Phase 2: process evaluation

The process evaluation focused on the assessment of program functioning, including program inputs, activities, and target audience. The evaluation was formative, with the objective of program improvement (Scriven, 1981).

3.1. Data collection

Two online methods were chosen to examine program operation, improvement, and satisfaction to comply with physical distancing mandates. First, links to online questionnaires with open- and closed-ended questions were sent via email to four groups of stakeholders that attended the sessions (program coordinators, volunteering team members, animal handlers, and peer health mentors). The goal was to obtain program feedback and learn more about session promotion, the roles of all involved stakeholders, session activities, perceptions of participant engagement, and satisfaction levels. Eighty-five questionnaires were submitted across these groups, resulting in an 80% overall response rate (volunteering team members = 100%; program coordinators = 100%; peer health mentors = 73%; animal handlers = 71%).
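The reported response rates are simple submitted-to-invited ratios. The snippet below reproduces the arithmetic using hypothetical per-group invitation counts chosen to be consistent with the reported figures (85 submissions, roughly 80% overall); the actual numbers of invitations per group are not reported here.

```python
# Hypothetical (submitted, invited) counts per stakeholder group; the
# invitation counts are invented for illustration and chosen only to be
# consistent with the rates reported in the text.
groups = {
    "volunteering team members": (22, 22),  # 100%
    "program coordinators": (10, 10),       # 100%
    "peer health mentors": (11, 15),        # ~73%
    "animal handlers": (42, 59),            # ~71%
}

# Per-group response rates as rounded percentages.
rates = {g: round(100 * sub / inv) for g, (sub, inv) in groups.items()}

# Overall rate pools submissions and invitations across all groups.
total_submitted = sum(sub for sub, _ in groups.values())
total_invited = sum(inv for _, inv in groups.values())
overall_rate = round(100 * total_submitted / total_invited)

print(rates)
print(total_submitted, overall_rate)  # 85 submissions, 80% overall
```

Note that the overall rate is a pooled ratio, not an average of the per-group percentages, so groups with more invitations weigh more heavily.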

Second, SFH conducted, recorded, and transcribed semi-structured qualitative interviews via Zoom with 10 PAWS Your Stress attendees from November to December 2020. The interviewed attendees identified as women (n = 9) and men (n = 1), and ranged in age from 19 to 66 years, with an average age of 38 years. Six interviewed attendees were USask students, while two were staff members, one was a community member, and one was both a student and staff member. Within the collaborative approach, the team decided that the amount of data collected from 10 people was sufficient to answer the evaluation questions. Moreover, recruitment resulted in a 16% response rate, which may be considered high compared to others who have used email-based recruitment strategies (Koo and Skinner, 2005, Temple and Brown, 2011). Following consent, participants answered questions about their perceptions of program purpose (“In your opinion, what do you think is the purpose of the online version of the USask PAWS Your Stress therapy dog program?”), reasons for attendance (“Why do you visit with the therapy dogs online?”), perceived program target audience (“Who do you think would benefit from visiting online with the dogs? Why?”), satisfaction with various program components (e.g., “How satisfied are you with frequency of the remote sessions? Why?”), and barriers and facilitators to attending the sessions (“Are there any barriers that prevent you from visiting the therapy dogs online?” and “What might make it easier for you to visit with the therapy dogs online in the future?”).4

3.2. Analysis

Descriptive statistics were used to summarize closed-ended questions from the questionnaire (i.e., questions about attendance, promotion, technical issues, activities, engagement, and satisfaction). Responses to open-ended questions in the questionnaire were analyzed using a general inductive approach, which involved creating codes, subcodes, and themes that were informed by the evaluation objectives (Thomas, 2006). Responses to the open-ended questionnaire questions were initially coded into the broad themes of attendee engagement and session satisfaction. Thomas’ (2006) inductive approach was then used to create subcodes that reflected categories within each theme. Subcodes were not combined into broader categories, as there were only a small number of them. This aligns with Thomas’ (2006) suggestion that between 3 and 8 resulting categories is adequate to summarize the data. Regarding program feedback, data were initially coded into “consider” (potential areas to improve existing program elements), “reflect” (suggestions for new program elements), or “continue” (appreciated program elements) themes. Then, subcodes were created using Thomas’ (2006) inductive strategy.
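As a concrete illustration of the two-level coding scheme, the sketch below tallies coded feedback passages into the consider/reflect/continue themes and their inductively derived subcodes. The passages and subcode labels are invented for illustration; only the three theme names come from the evaluation itself.

```python
from collections import Counter, defaultdict

# Invented examples of coded feedback passages as (theme, subcode) pairs.
# Themes follow the article's scheme: "consider" (improve existing
# elements), "reflect" (suggest new elements), "continue" (appreciated
# elements). Subcode labels are hypothetical.
coded_passages = [
    ("consider", "breakout room logistics"),
    ("consider", "session timing"),
    ("reflect", "evening sessions"),
    ("continue", "handler engagement"),
    ("continue", "handler engagement"),
    ("continue", "positive atmosphere"),
]

# Frequency of each broad theme across all passages.
theme_counts = Counter(theme for theme, _ in coded_passages)

# Subcode frequencies nested under each theme.
subcode_counts = defaultdict(Counter)
for theme, subcode in coded_passages:
    subcode_counts[theme][subcode] += 1

print(dict(theme_counts))
print(subcode_counts["continue"].most_common(1))  # most frequent appreciated element
```

Tallies like these correspond to the descriptive summaries in the results section; the qualitative interpretation of each subcode remains a manual analytic step.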

The general inductive approach (Thomas, 2006) was also used to analyze the interview data. Passages were coded according to the interview question and/or prompt that they answered (e.g., program purpose, satisfaction levels with different program elements, etc.). An “Other” code was created for topics that did not neatly fit into the interview question topics. These codes were merged into broader categories to simplify the presentation of the results: program advertising, target audience, program purpose, remote program contributions, and program feedback. As with the qualitative questionnaire responses, “consider,” “reflect,” and “continue” codes were used to deductively organize program feedback, and the general inductive approach (Thomas, 2006) was used to establish subcodes.

3.3. Results

The results address the formative goal of program improvement, and introduce preliminary evidence of program outcomes.5 The process evaluation results are based on the program stakeholder questionnaires (n = 85 responses) and qualitative interviews with 10 program attendees.

Online questionnaires are commonplace in research (Dillman et al., 2014b, Lee et al., 2008); thus, this was not an unusual aspect of the remote data collection we completed to comply with COVID-19 social distancing mandates. However, we found that conducting remote interviews, which is less common, was beneficial. For example, attendees reported that video-conference interviews were more convenient, more accessible, and allowed people from locations outside of the host city to easily attend.

3.3.1. Program satisfaction and improvement

Program stakeholder groups and attendees were all largely satisfied with the remote PAWS Your Stress program. Program stakeholders were either “very satisfied” (program coordinator = 90%, peer health mentor = 83%, volunteering team member = 77%, handler = 49%) or “somewhat satisfied” with the session they attended. Stakeholders most often related their satisfaction levels to connections that occurred within the session, special event sessions, and program operation (e.g., using Zoom rooms). Overall, stakeholders enjoyed the positive atmosphere in the sessions and various aspects of program operation, such as having someone moderate the Zoom rooms and having multiple people to technologically and socially support the sessions. In terms of program improvement, questionnaire responses focused on various aspects of program operation, session attendance, and attendee engagement. For instance, some stakeholders spoke about technical aspects: “It was hard for handlers who were using their [mobile devices] to see participants questions and faces.” One person noted the challenge of breakout rooms on Zoom: “I feel that sometimes going into another room the handlers and students are deep into a conversation that you may wait a while to get introductions to the dogs.” Some handlers proposed new suggestions for program operation: “One team per room” and “[Cut] the time or [have] teams come for just half an hour…it’s hard to keep [my dog]…in front of the screen for an hour.” Stakeholders also wished that more students would attend (“Would always like to see more students!”) and that the handlers would engage attendees more (“Maybe the silence was good to a certain extent but I think engaging the students…has a higher chance of reducing anxiety and increasing the sense of well-being”).

Program feedback from attendee interviews was predominantly positive. Attendees were mostly satisfied with the dates and times of the session, session frequency, having special guests (e.g., a bearded dragon), and using Zoom as the platform. They were especially appreciative of the animal handlers: “They're doing such a good job of learning this platform…I really commend them on making the effort to having to completely adjust.” Suggestions for improvement mostly involved session dates, times, and frequency: “I wonder if some people might be interested in an evening time, just because you’re relaxed, you’re at home… if you’re a student 1 pm on a Thursday you may be busy,” and “They could probably almost have [the program] every day during exams I think.”

3.3.2. Social connection

Social connection was a dominant theme in stakeholder questionnaires and attendee interviews. All stakeholder groups valued the conversations and social connections that took place within the sessions. Attendee engagement was one of the central topics present when stakeholders described their satisfaction levels: “The participants were more engaged during this session than previous sessions I had attended” and “Connections were made between visitors and the handlers.” When asked about what went well within the sessions, the conversation-style was frequently discussed: “The handlers were so great at engaging everyone and keeping conversation going.” Similar themes were evident in attendee interviews.

Attendees’ perceptions of program purpose and personal reasons for attendance frequently centred on connection. Some attended to connect with other humans: “Communication and interaction… finding ways to interact with other people and finding commonalities.” Others tied the program purpose to larger concepts, such as connection to community and culture: “It has a sense of community about it,” and “I guess this connection to a caring culture…caring for others, caring for pets, caring for self.” These quotes provide initial evidence that program attendees feel connected as a result of the program, which factored into our decision to include “connection” as a short-term outcome in the PLM.

4. Phase 3: outcome evaluation

The outcome evaluation had summative purposes, as the team sought evidence that would inform whether the remote program should continue when in-person activities resume. To obtain program feedback and examine the extent to which outcomes were achieved, SFH pursued qualitative methods with session attendees and animal handlers. The qualitative focus was conducive to obtaining detailed understandings of program outcomes in a small amount of time (Flick, von Kardorff, & Steinke, 2004).

4.1. Data collection

Qualitative data were obtained from two groups of individuals to assess short-term outcomes in the remote PAWS Your Stress PLM (Fig. 1) and investigate the summative evaluation inquiry. First, individuals who attended at least one remote PAWS Your Stress session between December 2020 and March 2021 (referred to as “attendees”) were eligible to participate in a qualitative interview. Five attendees completed interviews via Zoom, representing a 20% response rate. Together with the PAWS Your Stress team, we decided that this sample size was sufficient, given the depth of information collected and literature suggesting that 20% response rates are average (Dillman, Smyth, & Christian, 2014a) or high for email recruitment (Koo and Skinner, 2005, Temple and Brown, 2011). All interviewed attendees were women. Four individuals shared their ages, which ranged from 21 to 32 years (average age of 28 years). Four were USask students and one was a USask staff member.

Prior to the interview, attendees completed an online questionnaire in which they supplied personal demographics (e.g., age, gender, ethnicity, place of residence, current pet status), and information related to the PAWS Your Stress program (e.g., attendance, satisfaction, following on social media, connection to USask community, past engagement with the in-person program, future engagement intentions). Following consent at the time of the interview, attendees responded to questions about how attending the remote sessions affected their mental health and wellbeing (e.g., “What do you feel when you attend the remote therapy dog sessions?”), feelings of connection (e.g. “In the context of the COVID-19 pandemic, do you feel an increased sense of connection after attending the remote therapy dog sessions?”), and potential feelings of loneliness and uncertainty related to the COVID-19 pandemic (e.g., “Have you felt isolated/lonely at any point during the COVID-19 pandemic? If so, do the therapy dog Zoom sessions make you feel less isolated/lonely?”). Further, the MSC technique (Dart and Davies, 2003, Davies and Dart, 2005) was incorporated into the semi-structured interviews.

The MSC technique can be described as a qualitative, participatory, storytelling method that elicits outcome data (Dart and Davies, 2003, Davies and Dart, 2005). In the outcome evaluation interviews, attendees were asked “Providing as much detail as possible, what is the most significant change that has happened in your life as a result of the remote PAWS Your Stress sessions?” This approach is helpful for identifying unexpected outcomes (Davies & Dart, 2005), which we anticipated due to the novel, unique, and exploratory nature of the remote program. Additionally, the resulting stories can provide an in-depth overview of how the program affects individuals, avoiding simplistic explanations (Davies & Dart, 2005). This was advantageous because there was limited time to collect data, and this method provided a means to achieve a thorough understanding of program outcomes in addition to other qualitative questions. Although Davies and Dart (2005) outline a detailed ten-step guide, a heavily modified and rapid version of the method was used to accommodate the limited timeframe. Table 1 compares the 10-step technique with our approach.

It was important to gain perspectives from the animal handlers, who had observational evidence of program outcomes. Eleven of the 19 handlers who volunteered in at least one remote program session completed a questionnaire via email during March 2021. The questionnaire consisted of open-ended questions aimed at eliciting handlers’ perceptions of the most significant change that program attendees experienced as a result of the remote program (“What do you perceive to be the most significant change (effect) that people experience as a result of attending PAWS Your Stress remotely via Zoom?” and “Why do you perceive this to be the most significant change?”). Because questionnaire responses to the MSC question were brief (average word count = 132), a subset of handlers was contacted for follow-up semi-structured interviews to obtain more detailed responses. To select handlers to interview, the PAWS Your Stress team took part in Step 5 of the MSC technique: selecting the most significant of the stories (Davies & Dart, 2005; see Table 1). Team members read de-identified copies of handlers’ MSC responses and voted via online questionnaire on their top three stories. Based on the votes, four handlers were invited to participate; three completed interviews. Each interview included different questions and prompts based on the handler’s questionnaire response.

4.2. Analysis

Thomas’ (2006) general inductive approach guided the analysis of qualitative interviews with PAWS Your Stress attendees. Passages were coded according to the question the attendee answered, and the resulting codes were merged into broader themes based on their relatedness. Two exceptions to this coding method were made. First, two questions were broad; responses to these questions were therefore dispersed into codes associated with other questions where appropriate. The second exception was the MSC question, which was analyzed according to a strategy outlined by Davies and Dart (2005): short-term outcomes in the PLM became codes that were aligned with passages in attendees’ significant change stories. The same MSC analysis was used to examine the questionnaire and interview data from the handlers. Analysis of handler questionnaires and interviews was combined because the interview questions merely prompted details about the questionnaire responses.

4.3. Results

The outcome evaluation results are derived from a mixture of interviews and questionnaire responses from five session attendees, qualitative questionnaires with 11 animal handlers, and follow-up qualitative interviews with three handlers. These results speak to short-term outcomes in the PLM and summative evaluation objectives.

4.3.1. Program satisfaction

Attendees found value in the remote PAWS Your Stress sessions and were thoroughly satisfied with the program: “After attending PAWS Your Stress [remotely], I felt like a little happy glow for a good day or so, it was nice,” and “I think it’s just nice to have [the Zoom sessions] as an option from USask and from the PAWS Your Stress…organization.” They expressed appreciation for the handlers and PAWS Your Stress team: “[Handlers are] actually a really big component of this [program]…Whenever someone joined the session, they used our names to welcome us and ask us ‘How’s it going?’. They were really great” and “I… appreciate the work that goes into [the Zoom sessions].” Attendees also identified multiple advantages unique to the remote program, including greater accessibility, the ability to connect with multiple handlers and dogs, the option to control the auditory and visual experience, and greater suitability for those with dog allergies. Additionally, three of the five attendees stated that they would access the PAWS Your Stress program both in person and remotely once the in-person program restarts. Together, these data led to the recommendation that the remote program should continue in some capacity in the future. The assessment of short-term outcomes, outlined below, further indicates the remote program’s merit.

4.3.2. Mental health & wellbeing

Attendees shared several mental health and wellbeing benefits that they attributed to their attendance at the remote sessions. These benefits included comfort, support, happiness/uplifted mood, connectedness, and reduced feelings of depression, loneliness, and anxiety. For instance, one attendee felt comforted by the “building of community” over multiple sessions, while another felt supported by the USask and PAWS Your Stress organizations: “[It’s] a nice thing to know that you are being supported for your mental health and your stress levels and…[to] have those kinds of options available.” All interviewed attendees reported feelings of happiness that emerged during the sessions for several reasons, such as watching the dogs and observing connections form within the sessions. Social connection was another evident outcome for most attendees. One person stated that the program “grounded [them] in the desire for connectivity,” while another thought the sessions “were a really nice place for community and interaction with likeminded people.” Some individuals shared that the sessions helped them feel less depressed and/or anxious: “At the time I was really depressed… I guess momentarily it does uplift your spirit, your mood…Even those moments count.” With respect to anxiety, one attendee stated: “I think it could help to reduce anxiety. It did reduce mine.” These positive effects extended to COVID-19 related outcomes in the PLM, as outlined below.

Most interviewed attendees agreed that the remote PAWS Your Stress session(s) decreased their feelings of isolation/loneliness during the COVID-19 pandemic, increased their sense of connectedness, and provided relief from the uncertainty of the COVID-19 context. They felt less lonely from connecting with others, relating to others, and feeling a sense of community. One staff member attributed their reduced feelings of loneliness during the pandemic to having real-time interactions: “There’s really something to be said for time where people are together interacting in real time…it definitely bridges a gap for isolation.” Most attendees also felt an increased sense of social connection in relation to the pandemic: “It was nice to get to know people because of the pandemic, honestly, I don’t even know my classmates that well.” Finally, some individuals felt the sessions provided them with a sense of relief from “isolation,” “discussions of [the pandemic],” and “monotonous days.” The passages in this section support several expected short-term PLM outcomes, despite the lack of physical touch with the animals that is a salient component of the in-person program version.

4.3.3. Most significant change

Most PAWS Your Stress attendees described multiple changes they experienced. One student did not report experiencing any significant changes due to their personal history with dogs. Instead, they described significant changes others might experience. Table 2 includes quotes that accompanied interviewed attendees’ most significant changes organized by PLM outcome.

Table 2.

Most Significant Change Evidence from Program Attendees Organized by PLM Outcome.

Program logic model outcome Evidence
Increased desire to manage personal stress
  • The one session…made me think of different ways we approach…I wanna use the word healing. It’s definitely opened up that sense of awareness from what I would burden as conventional paths to recognizing that there’s a lot of different ways to come to these different states.

  • You can get so fixated or focused on work and your personal life and trying to still have somewhat of a social life and stay healthy…I would say it did encourage just to actually see how simple it was. It’s not another task to do. Then that kind of bridges with other supports that are available…like the yoga on Tuesdays or Wednesdays.

Increased feelings of support AND Increased sense of being valued by others
  • It’s just nice to know the different supports that are available, especially during this time. I think that was really nice. It makes me feel supported.

  • It’s nice working for, or being part of a larger organization ‘cause I’ve worked for smaller ones where obviously they don’t have the resources to be able to have these [supports] available.

Increased sense of connection (with people outside of the session)
  • [The Zoom session] was a brand-new experience for me. I got to talk about it to family and friends. That was an interesting anecdote from me.

  • One of the challenges of social interaction in the time of COVID is that we really aren’t doing much…When you’re having a conversation with your parents it’s a little bit like ‘Well, I can’t really say I’ve done all that much. I did a bunch of school stuff, but you don’t really care about the school stuff that I’m doing.’ So what do you talk about after the first five minutes? One of the big impacts of having PAWS Your Stress is it gives me a whole bunch of other things to talk to people about.

Increased feelings of happiness and uplifted mood
  • Bursts of serotonin from seeing a dog…When you’re staring at a dog smiling at you it’s pretty hard not to smile back and get outta your head for a bit…I think the happiness part is the most significant because whatever it is about dogs that makes you happy, it’s different than everything else.

Decreased feelings of stress
  • I think the main thing would be stress relief…The human interaction and dogs are something different…I think that’s where the stress relief comes in.

Decreased feelings of depression (during and after session)
  • It was this beautiful session, but then I felt that maybe I’m complicit [in relation to query about animal ethics] because I participated…‘cause I was depressed I needed some mood uplifting.

  • Passage from earlier portion of interview: At the time I was really depressed… [the session] definitely helped me…I felt good for about an hour, which means a lot for a depressed person.

Decreased feelings of anxiety (outside of the session)
  • For me as someone with a level of social anxiety, having something to have a conversation about is really, really useful. It is… much easier for me to walk into [a] situation knowing that I have something people would be interested in hearing about then to be in that situation and not necessarily know where to go or what I should be talking about. I think…that’s the big thing…it’s something that I can use as a tool to help me in additional social situations.

Increased awareness of therapy dogs and companion animals
  • I have an animal, I’ve engaged and I love my animal…There’s a greater awareness of it… with the whole therapeutic process going [on] around it. It actually elevated me when I went home and I was like, I don’t think I’ve been as aware.

Similar to attendees, most handler questionnaire responses alluded to multiple PLM outcomes. The most common PLM outcomes that handlers thought attendees would experience from the remote program included: a) decreased stress, b) reduced anxiety, c) increased happiness, d) increased sense of connection with humans and animals, and e) increased desire to manage personal stress. Increased feelings of happiness, desire to manage personal stress, and increased sense of connection were outcomes discussed by both attendees and handlers. Table 3 provides quotations that elucidate handlers’ perceptions of significant changes.

Table 3.

Examples of Handlers’ Perceptions of Most Significant Change Organized by PLM Outcome.

Program logic model outcome Evidence
Increased sense of connection with humans and animals (inside and outside of the session)
  • I’ve had a number of people say that they are looking for ways to connect socially during the pandemic, and that it is nice to see the dogs but it is also nice to see the humans.

  • It’s like if you go to the coffee shop and…you’re listening to other people’s conversations… sometimes you’ll come away from that sitting at the coffee shop and being like ‘Oh I heard this crazy story at the coffee shop today’ and I think it’s the same thing in these sessions where they can go back to someone and be like ‘I watched the PAWS Your Stress team today and [dog] is such a goofball.’

  • I’ve definitely seen some people in the sessions talk to each other about their pets. Almost like making a Zoom friend where they can talk about things that they’re going through.

  • I don’t think it’s the dogs alone. There’s something there with the handlers too. It’s the way that they create a very welcoming space, the way that they really encourage people to be connected as well and to be connected with the others that are there, but also trying to connect them with the dogs too.

  • My guess is that they feel…more connected to USask.

Decreased feelings of stress and/or anxiety
  • I think that Zoom Paws Your Stress acts to decrease stress by giving participants a welcome break from studies, Covid stress, etc.

  • People have said they’ve felt more relaxed and calm after spending some time with the dogs than they were before coming into the zoom room.

  • Decreased stress and anxiety. I’ve been told by students in the past (and even some handlers) that interacting with the dogs and each other has a calming effect.

  • Having a bit of social time just to talk about the weather and the dogs and how are your exams going and that kind of thing. I think that’s stress relieving actually. A lot of people talk to relieve stress.

Increased feelings of happiness and uplifted mood
  • I see participants smile, engage in conversation, and often turn their cameras on even if for a short bit to say hi when we first greet them…I see what I perceive as more joyful body language once they see the dogs/handlers and others in the room.

  • The most significant change that people experience as a result of attending PAWS Your Stress remotely is an increase in happiness and a decrease in stress. I notice that after a few minutes of watching the dogs and talking with the handlers, participants are more open, talking more freely and often smile and laugh at what the dogs are doing.

  • Smiles, positive memories… one participant said… she was excited to see [animal] on zoom, lots of participants share about their pets or experiences with animals.

Increased desire to manage personal stress
  • We see many repeat visitors which suggests that they are reaping some benefit from the encounter.

  • We have a lot of people that come back. We have a lot of repeat visitors. I see some of the same people week after week and they’re obviously coming for a reason.

  • Because [students are] in school there’s a lot of pressure, there’s a lot with assignments and competing things that are supposed to be within their brains. I think it’s related to stress and needing to…for a moment be in conscious awareness of something else. In that way, it’s sort of like what meditation might do…by really figuring out what needs the attention and how much, and what you can tune out.

  • We all have a ton of things no matter who we are that have our attention on a daily basis. The fact that [attendees] take time away from those other things to come to the sessions, I probably would link that to stress management. Needing to walk away and take a break.

Decreased feelings of loneliness
  • That’s definitely the whole goal of the program, is to do whatever we can to make people feel like they are less alone.

  • My guess is that they feel less isolated, less lonely.

  • I think the other thing I seen is some of the students talking amongst themselves… I definitely think it gives them other people to relate to, so that they’re feeling less alone, less isolated ‘cause there’s other people going through the same thing as they are.

Decreased feelings of social isolation in the context of COVID-19 AND Increased sense of connection with humans and animals
  • I think a significant effect for visitors to Paws Your Stress on Zoom might be a sense of connectedness in this pandemic where people are likely to feel very disconnected.

  • I focused on connectedness because we’re in a situation that we’ve never experienced before. Students who have been on campus and being physically on campus you would feel a sense of connection just with the buildings, with your classmates, with the students… Nobody can do that anymore. I think the connection is different, but it’s equally important, perhaps more important right now during the pandemic.

  • I think the effect that our Zoom sessions are having on individuals is to help them not feel so alone during this pandemic. It can be very lonely to be isolated in your home with no family/friends or even your pets.

Increased awareness of therapy dogs and companion animals
  • International students joining in from other countries, they often ask about our pets and they often ask about breeds. They often ask about special treatment of pets or [if] you have to buy special food. That’s kind of what I meant about a learning experience.

  • Students did ask quite a few questions about the therapy dog breeds, habits, duties. I'm sure students got lots of new knowledge about therapy dogs out of the zoom sessions.

Increased feelings of support AND Increased sense of being valued by others
  • I remember one person attended because she lost her dog recently, it passed, she showed [us] a photo and we provided our condolences and support and asked some questions about the dog. Maybe she just wanted to come to a like-minded community at this time in her life.

4.4. Lessons learned: most significant change technique

The MSC technique allowed for an in-depth understanding of program attendees’ perceptions of the program outcomes they experienced. First, we found that the technique was suitable once adjusted to meet the time-sensitive evaluation needs of the small-scale PAWS Your Stress program. Some evaluators suggest that the technique is not appropriate for time-sensitive evaluations (Serrat, 2010, Rabie and Burger, 2019), as time is essential for holding discussions with multiple groups of program stakeholders and repeating the process as required. In our case, one round of significant change data was sufficient to investigate potential outcomes of the newly remote program. Further, the story selection phase was included only for handlers’ data, requiring consultation with just one stakeholder group. Therefore, our modifications allowed for efficient use of the method within a rapid evaluation.

Second, our experience suggests that the necessity of the story selection step (see Table 1) may vary depending on evaluation goals. While Davies & Dart (2005) describe this step as fundamental, story selection was not completed for attendees’ significant change stories. The purpose of story selection is to determine which outcomes are valued by program stakeholders, which can then lead to discussions, for example, about different values that emerge between hierarchical groups in an organization (Davies & Dart, 2005). In the evaluations of the remote PAWS Your Stress program, determining valued outcomes was not a project goal. Instead, we used the MSC technique to solidify expected program outcomes and identify any unexpected outcomes. These goals were more in line with our outcome evaluation objectives. For attendees’ data, we felt the voting process was non-essential. On the other hand, a voting process (i.e., story selection) was completed with handlers’ MSC data. In this case, we believed that identifying stories valued by team members was useful for purposively choosing participants to follow up with. This method also allowed the team to actively participate in the evaluation, a strategy that can improve evaluation use (Patton, 2008).

Our third lesson corroborates reflections made by others, namely that responding to MSC questions can be challenging for some participants (Kelaher et al., 2013, Wilder and Walpole, 2008). Previous evaluators observed that participants had difficulty choosing one significant outcome (Kelaher et al., 2013) and explaining why the outcome they identified was the most significant (Wilder & Walpole, 2008). Our experience was similar: most participants (attendees and handlers) alluded to multiple changes in their responses, and their reasons for significance were not always explained coherently, if explained at all. Explanations were especially absent in email responses from handlers, which suggests that interview methodology may be more beneficial because it allows for prompting. The following prompts may help participants identify one significant change and articulate why it is significant: “What are some other changes that you or others might experience? Why is the change you first identified more significant than these other changes?”

Lastly, we suspect that giving participants ample time to reflect on how the program has affected them would boost the quality of MSC responses in terms of length and detail. On average, handler questionnaire responses in our evaluation were 132 words and discussions around the MSC question in attendee interviews lasted four minutes. Limato and colleagues (2018) recommended using a longitudinal approach to data collection in order to observe changes over time. We think that a longitudinal approach would allow participants adequate time to consider how they have been affected, and in particular, which outcome is perceived as the most significant and why. For time-limited projects, it may be helpful to provide participants with the interview schedule ahead of time.

5. Conclusion

Evaluation of mental health resources is crucial in the context of COVID-19, as the pandemic has negatively impacted the wellbeing of university students and the general population (Bussolari et al., 2021, Ela et al., 2021, Williams et al., 2020, Wilson et al., 2020, Yoon et al., 2021). Animal-assisted activities are one type of intervention that can positively affect people’s mental health and wellbeing (Barker et al., 2016, Binfet, 2017, Binfet et al., 2018, Crossman et al., 2015, Dell et al., 2015, Grajfoner et al., 2017, Lalonde et al., 2020, Ward-Griffin et al., 2018). The present evaluations of a remote AAA (PAWS Your Stress) were conducted to a) better understand how the newly developed program functioned, b) elicit program feedback, c) examine the extent to which short-term program outcomes were met, and d) determine if the remote program should persist when in-person activities are permitted. We found that, despite the lack of physical interaction with the dogs, attendees experienced many mental health and wellbeing benefits, contributing to our recommendation that the remote program should continue to some extent. While Sokal and colleagues (2021) found that physical touch is a key mechanism facilitating improved mental health and wellbeing for those who attend AAAs, our evaluation suggests that physical touch is not necessary to produce positive outcomes (e.g., social connection, reduced anxiety, increased happiness) for individuals in the remote context of the COVID-19 pandemic. Our evaluation efforts further suggest that completely remote collaborative evaluations can be feasible and effective. The PLM workshops resulted in a successfully revised PLM, suggesting that our short meeting-style workshops were suitable. As well, our modified version of the MSC technique (Dart and Davies, 2003, Davies and Dart, 2005) demonstrated that the story selection process can be adjusted to meet specific evaluation needs.

The limitations to our evaluation pertain to the methods and the COVID-19 context. First, the evaluations were time-restricted in order to provide the PAWS Your Stress team with timely feedback that would allow them to make programming decisions for future academic terms. This limited the methods we could employ and the amount of data that could be handled. However, we chose a mostly qualitative approach to data collection, which facilitated rich, in-depth responses needed to address the evaluation objectives (Flick et al., 2004). This predominantly qualitative approach allowed for an adequate amount of data collection within a short period of time.

Second, the sample sizes for the qualitative interviews with program attendees were small. Although some suggest that there is no optimal sample size in qualitative research (Patton, 2002), recruitment was limited to the number of people who attended the remote PAWS Your Stress sessions during the school year, and by the amount of time we had to recruit. To mitigate this challenge, and in line with the collaborative approach, we consulted with team members in group meetings to discuss the amount of data that had been collected. Based on these discussions, team members felt that sufficient data had been gathered to address the evaluation objectives.

Finally, the COVID-19 pandemic undoubtedly influenced the evaluation results we obtained, yet in ways that we are uncertain of. We suspect the results reflect this very specific period of time, and may not speak to future post-pandemic experiences. For example, remote program attendees often felt an increased sense of connection and decreased feelings of loneliness – outcomes that may not emerge if they were to attend the remote version of PAWS Your Stress in a non-pandemic time period. Our results, instead, speak to the beneficial outcomes of remote programs in a time when physical distancing mandates inhibit in-person gatherings and connections.

This article provides one example of how evaluation, and specifically collaborative evaluation, can be successful in a remote context. As Green’s (2005) dissertation work suggests, it is worth evaluating the extent to which stakeholder-involved PLM development efforts are advantageous and contribute to increased evaluation capacity-building. This may be especially worthwhile for remote workshops, as they can be a more accessible option for collaborative work. We anticipate using the PLM workshop approach outlined in this article to lead further workshops with local animal-service organizations, with the goal of building their evaluation knowledge and capacity. In addition, our modified use of the MSC technique can inform future evaluation work, specifically projects that require in-depth data within a short timeframe. However, more evidence is needed to support our recommendations. While the manner in which evaluations are conducted will likely change post-pandemic, this work has implications for these post-pandemic endeavors, indicating that evaluators can effectively design projects that engage stakeholders through distanced, virtual methods.

Funding source

This work was supported by the Saskatchewan Health Research Foundation [no grant number].

Competing interests statement

The authors declare that they have no competing interests.

Acknowledgments

We would like to acknowledge and express our appreciation towards all of the volunteering animals, handlers, and peer health mentors who participated in the remote version of the program. Our program efforts and evaluation work would not be possible without their participation. Further, we thank the PAWS Your Stress team members for the time and energy they invested in the evaluation.

Biographies

Shaneice Fletcher-Hildebrand, BA (Hons.): Shaneice Fletcher-Hildebrand is a graduate student in the Applied Social Psychology program at the University of Saskatchewan, who completed evaluation training with Dr. Karen Lawson (PhD, CE). To date, her evaluation work has focussed on dementia education programming for acute healthcare professionals, online programming for caregivers of people living with dementia, a housing program for people living with HIV/AIDS, and, most recently, therapy dog programming. Her primary evaluation interests include community-based evaluation and stakeholder-involved approaches.

Linzi Williamson, PhD: Dr. Linzi Williamson completed her PhD in Applied Social Psychology at the University of Saskatchewan (USask). Currently, she is a Canadian Institutes of Health Research (CIHR)-funded postdoctoral fellow in the Office of One Health and Wellness with the Department of Sociology at USask. Her current research interests include the human-animal bond, effectiveness of canine-assisted interventions, and canine welfare.

Karen Lawson, PhD, CE: Karen Lawson is a professor of psychology at the University of Saskatchewan and the current chair of the Canadian Consortium of Universities for Evaluation Education (CUEE). Trained as an applied social psychologist, she focuses on conducting community-based program development and evaluation research across a variety of sectors. She teaches program evaluation at the graduate level and has supervised over 40 program evaluation–related graduate student internships in the community.

Colleen Dell, PhD: Colleen Anne Dell is a Professor and Centennial Enhancement Chair in One Health and Wellness at the University of Saskatchewan. Her research is grounded in a community-empowered participatory approach. Her research interests include the relationship between identity and healing from drug addiction, animal-assisted interventions, and evaluation. Her research areas are specific to Indigenous peoples, criminalized women, and drug-using populations. She is a Senior Research Associate with the Canadian Centre on Substance Use and Addiction. In 2017 she was named to the Order of St. John in recognition of her research and community work involving animal-assisted interventions.

Footnotes

1. In the remote version of the program, it was not necessary to include only therapy dog teams from St. John Ambulance, since physical safety was not a concern. Safety is one reason it is important to have formally trained therapy dogs for in-person interactions. We do not believe that including non-therapy animals affected the dynamics of the remote sessions, since all handlers interacted with session attendees in a similar way and took part in similar activities with their pets (e.g., petting them).

2. You can learn more about the remote version of PAWS Your Stress by visiting https://www.therapydogs.ca

3. See the final report and infographics regarding the evaluation of the online program components by visiting https://www.therapydogs.ca/evaluation

4. Please note that we did not seek approval from our institutional review board, as program evaluation work is exempted from ethics review in Canada. Similarly, ethics approval was not sought for data collection in the outcome evaluation (Phase 3).

5. Complete results are presented in the process evaluation report on the PAWS Your Stress website: https://therapydogs.ca/evaluation.

References

1. Afifi R.A., Makhoul J., El Hajj T., Nakkash R.T. Developing a logic model for youth mental health: Participatory research with a refugee community in Beirut. Health Policy and Planning. 2011;26:508–517. doi: 10.1093/heapol/czr001.
2. Aisiri A., Fagbemi B., Akintola O.A., Abodunrin O.S., Olarewaju O., Laleye O., Edozieuno A. Use of the most significant change technique to evaluate intervention in promoting childbirth spacing in Nigeria. African Evaluation Journal. 2020;8(1): Article a426. doi: 10.4102/aej.v8i1.426.
3. Akgun S., Ciarrochi J. Learned resourcefulness moderates the relationship between academic stress and academic performance. Educational Psychology. 2003;23(3):287–294. doi: 10.1080/0144341032000060129.
4. Bailenson J.N. Nonverbal overload: A theoretical argument for the causes of Zoom fatigue. Technology, Mind, and Behavior. 2021;2(1). doi: 10.1037/tmb0000030.
5. Banks J., Fancourt D., Xu X. Mental health and the COVID-19 pandemic. In: Helliwell J.F., Layard R., Sachs J.D., De Neve J., Aknin L.B., Wang S., editors. World Happiness Report 2021. Sustainable Development Solutions Network; 2021. pp. 107–130. 〈https://happiness-report.s3.amazonaws.com/2021/WHR+21.pdf〉.
6. Barker S.B., Barker R.T., McCain N.L., Schubert C.M. A randomized cross-over exploratory study of the effect of visiting therapy dogs on college student stress before final exams. Anthrozoös. 2016;29(1):35–46. doi: 10.1080/08927936.2015.1069988.
7. Binfet J. The effects of group-administered canine therapy on university students’ wellbeing: A randomized controlled trial. Anthrozoös. 2017;30(3):397–414. doi: 10.1080/08927936.2017.1335097.
8. Binfet J., Passmore H., Cebry A., Struik K., McKay C. Reducing university students’ stress through a drop-in canine-therapy program. Journal of Mental Health. 2018;27(3):197–204. doi: 10.1080/09638237.2017.1417551.
9. Bowles T.V., Brindle K.A. Identifying facilitating factors and barriers to improving student retention rates in tertiary teaching courses: A systematic review. Higher Education Research & Development. 2017;36(5):903–919. doi: 10.1080/07294360.2016.1264927.
10. Bussolari C., Currin-McCulloch J., Packman W., Kogan L., Erdman P. “I couldn’t have asked for a better quarantine partner!”: Experiences with companion dogs during COVID-19. Animals. 2021;11: Article 330. doi: 10.3390/ani11020330.
11. Chanfreau-Coffinier C., Peredo J., Russell M.M., Yano E.M., Hamilton A.B., Lerner B., Provenzale D., Knight S.J., Voils C.I., Scheuner M.T. A logic model for precision medicine implementation informed by stakeholder views and implementation science. Genetics in Medicine. 2019;21(5):1139–1154. doi: 10.1038/s41436-018-0315-y.
12. Connors S.C., Nyaude S., Challender A., Aagaard E., Velez C., Hakim J. Evaluating the impact of the Medical Education Partnership Initiative at the University of Zimbabwe College of Health Sciences using the most significant change technique. Academic Medicine. 2017;92:1264–1268. doi: 10.1097/ACM.0000000000001519.
13. Cooksy L.J., Gill P., Kelly P.A. The program logic model as an integrative framework for a multimethod evaluation. Evaluation and Program Planning. 2001;24:119–128. doi: 10.1016/S0149-7189(01)00003-9.
14. Crossman M.K., Kazdin A.E., Knudson K. Brief unstructured interaction with a dog reduces distress. Anthrozoös. 2015;28(4):649–659. doi: 10.1080/08927936.2015.1070008.
15. Dart J., Davies R. A dialogical, story-based evaluation tool: The most significant change technique. American Journal of Evaluation. 2003;24(2):137–155. 〈https://www.mande.co.uk/wp-content/uploads/2003/MSCAJEfinalB.pdf〉.
16. Davies R., Dart J. The ‘most significant change’ (MSC) technique: A guide to its use. 2005. 〈https://www.wikifplan.org/WIKIPLAN/1%201%20151%20-%20Most_significant_change_methodology_pa_abril%202005.pdf〉.
17. Dell C.A., Chalmers D., Gillett J., Rohr B., Nickel C., Campbell L., Hanoski R., Haugerud J., Husband A., Stephenson C., Brydges M. PAWSing student stress: A pilot evaluation study of the St. John Ambulance Therapy Dog Program on three university campuses in Canada. Canadian Journal of Counselling and Psychotherapy. 2015;49(4):332–359. 〈https://cjc-rcc.ucalgary.ca/article/view/61079〉.
18. Dillman D.A., Smyth J.D., Christian L.M. Reducing people’s reluctance to respond to surveys. In: Internet, phone, mail, and mixed-mode surveys: The tailored design method. 4th ed. Wiley; New Jersey: 2014. pp. 19–55.
19. Dillman D.A., Smyth J.D., Christian L.M. Sample surveys in our electronic world. In: Internet, phone, mail, and mixed-mode surveys: The tailored design method. 4th ed. Wiley; New Jersey: 2014. pp. 1–18.
20. Dozois D.J.A. Anxiety and depression in Canada during the COVID-19 pandemic: A national survey. Canadian Psychology. 2021;62(1):136–142. doi: 10.1037/cap0000251.
21. Eisenberg D., Golberstein E., Hunt J.B. Mental health and academic success in college. The B.E. Journal of Economic Analysis & Policy. 2009;9(1): Article 40. doi: 10.2202/1935-1682.2191.
22. Eisenberg D.E., Gollust S.E., Golberstein E., Hefner J. Prevalence and correlates of depression, anxiety, and suicidality among university students. American Journal of Orthopsychiatry. 2007;77(4):534–542. doi: 10.1037/0002-9432.77.4.534.
23. Ela M.Z., Shohel T.A., Shovo T., Khan L., Jahan N., Hossain M.T., Islam M.N. Prolonged lockdown and academic uncertainties in Bangladesh: A qualitative investigation during the COVID-19 pandemic. Heliyon. 2021;7(2): Article e06263. doi: 10.1016/j.heliyon.2021.e06263.
24. Elmer T., Mepham K., Stadtfeld C. Students under lockdown: Comparisons of students’ social networks and mental health before and during the COVID-19 crisis in Switzerland. PLoS ONE. 2020;15(7): Article e0236337. doi: 10.1371/journal.pone.0236337.
25. Evans T.M., Bira L., Gastelum J.B., Weiss L.T., Vanderford N.L. Evidence for a mental health crisis in graduate education. Nature Biotechnology. 2018;36(3):282–284. doi: 10.1038/nbt.4089.
26. Fetterman D.M., Rodríguez-Campos L., Wandersman A., O’Sullivan R.G., Zukoski A.P. Similarities across the three approaches: Principles and practices in common. In: Fetterman D.M., Rodríguez-Campos L., Zukoski A.P., editors. Collaborative, participatory, and empowerment evaluation: Stakeholder involvement approaches. Guilford Press; New York: 2018. pp. 118–134.
27. Flick U., von Kardorff E., Steinke I. What is qualitative research? An introduction to the field. In: Flick U., von Kardorff E., Steinke I., editors. A companion to qualitative research. Sage; Thousand Oaks: 2004. pp. 3–11.
28. Garlow S.J., Rosenberg J., Moore D., Haas A., Koestner B., Hendin H., Nemeroff C.B. Depression, desperation, and suicidal ideation in college students: Results from the American Foundation for Suicide Prevention College Screening Project at Emory University. Depression and Anxiety. 2008;25:482–488. doi: 10.1002/da.20321.
29. Grajfoner D., Harte E., Potter L.M., McGuigan N. The effect of dog-assisted intervention on student well-being, mood, and anxiety. International Journal of Environmental Research and Public Health. 2017;14: Article 483. doi: 10.3390/ijerph14050483.
30. Green E.L. Reinventing logic modeling: A stakeholder-driven group approach [Unpublished doctoral dissertation]. University of Cincinnati; 2005. 〈http://rave.ohiolink.edu/etdc/view?acc_num=ucin1123692726〉.
31. Griffith R.M. PAWS Your Stress: Student perspectives of an animal assisted activity program [Unpublished doctoral dissertation]. University of Saskatchewan; 2016. 〈https://harvest.usask.ca/handle/10388/7405〉.
32. Hamza C.A., Ewing L., Heath N.L., Goldstein A.L. When social isolation is nothing new: A longitudinal study of psychological distress during COVID-19 among university students with and without preexisting mental health concerns. Canadian Psychology. Advance online publication. 2020. doi: 10.1037/cap0000255.
33. Helitzer D., Urquieta de Hernandez B., Sanders M., Roybal S., Van Deusen I. Evaluation for community-based programs: The integration of logic models and factor analysis. Evaluation and Program Planning. 2010;33:223–233. doi: 10.1016/j.evalprogplan.2009.08.005.
34. Hilde Ramsdal G., Bergvik S., Wynn R. Long-term dropout from school and work and mental health in young adults in Norway: A qualitative interview-based study. Cogent Psychology. 2018;5(1): Article 1455365. doi: 10.1080/23311908.2018.1455365.
35. Hjorth C.F., Bilgrav L., Sjørslev Frandsen L., Overgaard C., Torp-Pedersen C., Nielsen B., Bøggild H. Mental health and school dropout across educational levels and genders: A 4.8-year follow-up study. BMC Public Health. 2016;16: Article 976. doi: 10.1186/s12889-016-3622-8.
36. Ho L.S., Labrecque G., Batonon I., Salsi V., Ratnayake R. Effects of a community scorecard on improving the local health system in Eastern Democratic Republic of Congo: Qualitative evidence using the most significant change technique. Conflict and Health. 2015;9: Article 27. doi: 10.1186/s13031-015-0055-4.
37. International Association of Human-Animal Interaction Organizations (IAHAIO). The IAHAIO definitions for animal assisted intervention and guidelines for wellness of animals involved in AAI. 2018. 〈https://iahaio.org/wp/wp-content/uploads/2020/07/iahaio_wp_updated-2020-aai-adjust-1.pdf〉.
38. Ibrahim A.K., Kelly S.J., Adams C.E., Glazebrook C. A systematic review of studies of depression prevalence in university students. Journal of Psychiatric Research. 2013;47:391–400. doi: 10.1016/j.jpsychires.2012.11.015.
39. Kelaher M., Dunt D., Berman N., Curry S., Joubert L., Johnson V. Evaluating the health impacts of participation in Australian community arts groups. Health Promotion International. 2013;29(3):392–402. doi: 10.1093/heapro/das073.
40. Knowlton L., Phillips C. The logic model guidebook: Better strategies for great results. 2nd ed. Sage Publications; 2013. 〈http://www.sagepub.com/upm-data/23938_Chapter_3___Creating_Program_Logic_Models.pdf〉.
41. Koo M., Skinner H. Challenges of internet recruitment: A case study with disappointing results. Journal of Medical Internet Research. 2005;7(1): Article e6. doi: 10.2196/jmir.7.1.e6.
42. Labin S.N., Duffy J.L., Meyers D.C., Wandersman A., Lesesne C.A. A research synthesis of the evaluation capacity building literature. American Journal of Evaluation. 2012;33(3):307–338. doi: 10.1177/1098214011434608.
43. Lalonde R., Dell C., Claypool T. PAWS Your Stress: The student experience of therapy dog programming. Canadian Journal for New Scholars in Education. 2020;11(2):78–90. 〈https://journalhosting.ucalgary.ca/index.php/cjnse/index〉.
44. Lee R.M., Fielding N., Blank G. The internet as a research medium: An editorial introduction. In: Fielding N., Lee R.M., Blank G., editors. The SAGE handbook of online research methods. Sage; London: 2008. pp. 3–20.
45. Lim G.Y., Tam W.W., Lu Y., Ho C.S., Zhang M.W., Ho R.C. Prevalence of depression in the community from 30 countries between 1994 and 2014. Scientific Reports. 2018;8: Article 2861. doi: 10.1038/s41598-018-21243-x.
46. Limato R., Ahmed R., Magdalena A., Nasir S., Kotvojs F. Use of most significant change (MSC) technique to evaluate health promotion training of maternal community health workers in Cianjur district, Indonesia. Evaluation and Program Planning. 2018;66:102–110. doi: 10.1016/j.evalprogplan.2017.10.011.
47. Mattanah J.F., Brooks L.J., Brand B.L., Quimby J.L., Ayers J.F. A social support intervention and academic achievement in college: Does perceived loneliness mediate the relationship? Journal of College Counseling. 2012;15:22–36. doi: 10.1002/j.2161-1882.2012.00003.x.
48. McMichael M.A., Singletary M. Assistance, service, emotional support, and therapy dogs. Veterinary Clinics of North America: Small Animal Practice. 2021;51(4):961–973. doi: 10.1016/j.cvsm.2021.04.012.
49. Nepps P., Stewart C.N., Bruckno S.R. Animal-assisted activity: Effects of a complementary intervention program on psychological and physiological variables. Journal of Evidence-Based Complementary & Alternative Medicine. 2014;19(3):211–215. doi: 10.1177/2156587214533570.
50. Nicpon M.F., Huser L., Hull Blanks E., Sollenberger S., Befort C., Robinson Kurpius S.E. The relationship of loneliness and social support with college freshmen’s academic performance and persistence. Journal of College Student Retention. 2006;8(3):345–358. doi: 10.2190/A465-356M-7652-783R.
51. O’Connor R.C., Wetherall K., Cleare S., McClelland H., Melson A.J., Niedzwiedz C.L., O’Carroll R.E., O’Connor D.B., Platt S., Scowcroft E., Watson B., Zortea T., Ferguson E., Robb K.A. Mental health and well-being during the COVID-19 pandemic: Longitudinal analyses of adults in the UK COVID-19 Mental Health & Wellbeing study. The British Journal of Psychiatry. 2021;218:326–333. doi: 10.1192/bjp.2020.212.
52. O’Donnell S., Vanderloo S., McRae L., Onysko J., Patten S.B., Pelletier L. Comparison of the estimated prevalence of mood and/or anxiety disorders in Canada between self-report and administrative data. Epidemiology and Psychiatric Sciences. 2015;25(4):360–369. doi: 10.1017/S2045796015000463.
53. Owens M., Stevenson J., Hadwin J.A., Norgate R. Anxiety and depression in academic performance: An exploration of the mediating factors of worry and working memory. School Psychology International. 2012;33(4):433–449. doi: 10.1177/0143034311427433.
54. Patton M.Q. Qualitative research and evaluation methods. 3rd ed. Sage; Thousand Oaks: 2002.
55. Patton M.Q. Utilization-focused evaluation. 4th ed. Sage; Los Angeles: 2008.
56. Porteous N.L., Sheldrick B.J., Stewart P.J. Introducing program teams to logic models: Facilitating the learning process. The Canadian Journal of Program Evaluation. 2002;17(3):113–141. 〈http://med-fom-familymed-research.sites.olt.ubc.ca/files/2012/03/faciliter_modele_logiques_CJPE-2002_f.pdf〉.
57. Preskill H., Boyle S. A multidisciplinary model of evaluation capacity building. American Journal of Evaluation. 2008;29(4):443–459. doi: 10.1177/1098214008324182.
58. Rabie B., Burger A. Benefits of transport subsidisation: Comparing findings from a customer perception survey and most significant change technique interviews. African Evaluation Journal. 2019;7(1): Article 371. doi: 10.4102/aej.v7i1.371.
59. Remes R., Brayne C., van der Linde R., Lafortune L. A systematic review of reviews on the prevalence of anxiety disorders in adult populations. Brain and Behavior. 2016;6(7): Article e00497. doi: 10.1002/brb3.497.
60. Robinson A.M., Jubenville T.M., Renny K., Cairns S.L. Academic and mental health needs of students on a Canadian campus. Canadian Journal of Counselling. 2016;50(2):108–123. 〈https://cjc-rcc.ucalgary.ca/article/view/61100〉.
61. Rodríguez-Campos L. Essentials of collaborative evaluation. In: Fetterman D.M., Rodríguez-Campos L., Zukoski A.P., editors. Collaborative, participatory, and empowerment evaluation: Stakeholder involvement approaches. Sage; 2018. pp. 10–20.
62. Scriven M. Evaluation thesaurus. 3rd ed. Edgepress; 1981. 〈https://files.eric.ed.gov/fulltext/ED214952.pdf〉.
63. Serrat O. The most significant change technique. Asian Development Bank; 2010. 〈https://ecommons.cornell.edu/handle/1813/87749〉.
64. Shakman K., Rodriguez S.M. Logic models for program design, implementation, and evaluation: Workshop toolkit. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance, Regional Educational Laboratory Northeast & Islands; 2015. 〈https://ies.ed.gov/ncee/edlabs/regions/northeast/pdf/REL_2015057.pdf〉.
65. Sokal L., Bartel B., Martin T. Effects of touch on students’ stress, happiness, and well-being during animal-assisted activities. Journal of Education and Development. 2021;5(1):111–122. doi: 10.20849/jed.v5i1.887.
66. Temple E.C., Brown R.F. A comparison of Internet-based participant recruitment methods: Engaging the hidden population of cannabis users in research. Journal of Research Practice. 2011;7(2): Article D2. 〈http://jrp.icaap.org/index.php/jrp/article/view/288/247〉.
67. Thomas D.R. A general inductive approach for analyzing qualitative evaluation data. American Journal of Evaluation. 2006;27(2):237–246. doi: 10.1177/1098214005283748.
68. Ward-Griffin E., Klaiber P., Collins H.K., Owens R.L., Coren S., Chen F.S. Petting away pre-exam stress: The effect of therapy dog sessions on student well-being. Stress and Health. 2018;34:468–473. doi: 10.1002/smi.2804.
69. Wilder L., Walpole M. Measuring social impacts in conservation: Experience of using the most significant change method. Oryx. 2008;42(4):529–538. doi: 10.1017/S0030605307000671.
70. Williams S.N., Armitage C.J., Tampe T., Dienes K. Public perceptions and experiences of social distancing and social isolation during the COVID-19 pandemic: A UK-based focus group study. BMJ Open. 2020;10: Article e039334. doi: 10.1136/bmjopen-2020-039334.
71. Wilson J.M., Lee J., Fitzgerald H.N., Oosterhoff B., Sevi B., Shook N.J. Job insecurity and financial concern during the COVID-19 pandemic are associated with worse mental health. Journal of Occupational and Environmental Medicine. 2020;62(9):686–691. doi: 10.1097/JOM.0000000000001962.
72. Yoon S., McClean S.T., Kim J.K., Koopman J., Rosen C.C., Trougakos J.P., McCarthy J.M. Working through an ‘infodemic’: The impact of COVID-19 news consumption on employee uncertainty and work behaviors. Journal of Applied Psychology. 2021;106(4):501–517. doi: 10.1037/apl0000913.

Articles from Evaluation and Program Planning are provided here courtesy of Elsevier