Author manuscript; available in PMC: 2018 Mar 1.
Published in final edited form as: J Pers Assess. 2015 Sep 25;99(2):126–135. doi: 10.1080/00223891.2015.1077336

Trainee and Client Experiences of Therapeutic Assessment in a Required Graduate Course: A Qualitative Analysis

Justin D Smith 1, Kaitlyn N Egan 2
PMCID: PMC4808493  NIHMSID: NIHMS718437  PMID: 26407831

Abstract

Surveys indicate that practice and training in psychological assessment, and to a lesser degree personality assessment (PA), have been stable or increasing over the past quarter century. However, its future arguably remains threatened due to changes in doctoral training programs and beliefs in the field concerning the utility of PA for treatment success. In order to increase interest in and use of PA, studies of training methods that include trainees’ perspectives are needed. This study evaluated the experiences of ten graduate trainees and their clients who were trained in and conducted a brief Therapeutic Assessment (TA). Qualitative responses to a self-evaluation administered post-TA were coded using directed content analysis. Results indicated that trainees viewed TA/PA as having clinical utility, had positive feelings about TA/PA, and desired or intended to use or continue learning about TA/PA. Clients’ responses reflected positive feelings about the TA, having gained new self-awareness or understanding, and having a positive relationship with the assessor. The findings suggest that teaching PA from a TA perspective could produce positive benefits for psychology trainees.

Keywords: personality assessment, therapeutic assessment, training, student perspective


Although there has been a historical decline in the use of and training in psychological assessment, in some ways this defining professional practice has demonstrated remarkable resilience in the face of criticism from the 1960s to the early 1990s (Watkins, 1991). The rise of managed care organizations during this period represented a critical threat to psychological assessment (e.g., Stedman, Hatch, & Schoenfeld, 2001), which continues to this day. However, recent surveys indicate that psychological assessment remains a noteworthy portion of the professional practice of psychologists (Norcross, Karpiak, & Santoro, 2005). Over time, these historical trends and factors have not only affected the professional practice of assessment, but have also likely trickled down to graduate-level training.

Graduate Training in Psychological Assessment

As stated by Ready and Veague (2014), “Assessment competencies are exclusive to psychologists and as such, training is valued and highly regarded by the profession” (p. 278). According to their survey of APA-accredited clinical psychology doctoral programs, self-reported assessment training has been either stable or increasing in the last decade, with the exception of the continued decline in performance-based (i.e., projective) assessment measures (Ready & Veague, 2014). Piotrowski's (2015b) findings concerning worldwide use of projective tests indicate a similarly steady trend, though at a lower level than a half century ago. However, in a separate survey, Piotrowski (2015a) found that assessment training in doctoral programs in the United States highly favors non-projective assessment instruments, indicating a shift in the focus of graduate training that may not align with trends in professional psychology. Overall, these trends are generally promising after previous decades of perceived decline (Handler & Smith, 2012; Martin, 2009; Stedman et al., 2001). However, questions concerning the quality of assessment training remain, necessitating evaluation of ways to improve training methods and approaches while also increasing trainees’ interest.

Handler and Smith (2012) outlined 13 reasons why it is important to teach and to learn psychological assessment. Of note is the utility of assessment in contexts such as research, forensics, medical settings, and work-related settings. Within the context of psychological intervention, assessment can facilitate the development of the therapeutic relationship and aid in diagnosis, treatment planning, tracking treatment progress, and managing risk. Programs vary widely in the quality of assessment training, such as whether they cover the full range of assessment topics (i.e., psychometrics, diversity, ethics) or adequately link theory and supervised practice (Krishnamurthy et al., 2004). As such, graduate trainees are far too often unprepared in psychological assessment for the predoctoral internship year, let alone professional practice (Clemence & Handler, 2001). This discrepancy is evident in the gap between the importance internship directors place on assessment training and the level of experience of incoming interns. Data from 30 sites revealed that internship directors regard psychodiagnostic training as highly important and that a strong psychodiagnostic training background can make graduate students more competitive for securing an internship slot. However, one study estimated that only 25% of graduate students in clinical and counseling psychology doctoral programs receive training in report writing adequate to be considered ready for predoctoral internship (Stedman et al., 2001). The gap between graduate training, internship, and professional practice is evident to trainees as well. The results of national surveys of clinical, counseling, and school psychologists indicated that many psychologists believe that their predoctoral coursework, practica, or internship left them unprepared for the next phase of assessment training or practice (e.g., Clemence & Handler, 2001; Curry & Hanson, 2010).
Unfortunately, these shortcomings in graduate training programs mean that the burden of producing psychologists competent in performing assessments falls on internship programs (Stedman et al., 2001) and, to some extent, postdoctoral training.

Even though multiple studies have been conducted concerning assessment training, the focus has been almost exclusively on reports from training program directors, internship directors, and practicing psychologists. The absence of the trainee's perspective is a major shortcoming in the current body of research, yet it might be one of the most important factors in whether future generations of psychologists become interested and invested in learning and practicing psychological assessment. This additional perspective could illuminate the strengths and deficits of current assessment training models, provide targets for improvement, and help shape efforts at reform, if indicated.

Therapeutic Assessment

In the late 1970s, led by the pioneering work of Constance Fischer (1985/1994), a paradigm shift began in the field of psychological assessment toward a more humanistic and phenomenological approach. This shift gave way to the development of a subfield of psychological assessment referred to collectively as collaborative/therapeutic assessment (for a discussion of the distinctions among the various approaches in this subfield, see Finn, Fischer, & Handler, 2012). In brief, practitioners using a collaborative/therapeutic approach conduct psychological assessment with the explicit aim of directly improving client distress, symptomatology, and functioning. Empirical research supports its effectiveness for these aims (e.g., Finn & Tonsager, 1992; Newman & Greenway, 1997; Smith, Handler, & Nash, 2010) as well as for enhancing clinically meaningful therapeutic processes (e.g., Ackerman, Hilsenroth, Baity, & Blagys, 2000; De Saeger et al., 2014; Hilsenroth, Peters, & Ackerman, 2004; Ougrin, Ng, & Low, 2008; Smith, Eichler, Norman, & Smith, 2015). The core values of the model reflect its therapeutic aims: humility, respect, compassion, collaboration, and openness and curiosity (Finn, 2009).

Therapeutic Assessment (TA) emerged from this paradigm shift. Martin (2009) noted that TA is one avenue through which psychological assessment has begun to reemerge in graduate training programs. In many ways the tenets and core values of the TA model align with the current trend in healthcare toward more collaborative and client-centered approaches (e.g., Laine & Davidoff, 1996). The TA approach also appeals to psychologists interested in understanding the “whole person” and not simply in determining the client's psychiatric diagnosis. Anecdotally, graduate trainees report that TA is less pathologizing, more natural, and more clinically useful than a traditional approach to psychological assessment (Finn, 1998; Fowler, 1998). Ougrin, Zundel, Ng, Habel, and Latif (2013) evaluated the outcomes of training 24 clinicians in the use of TA as a brief intervention for adolescents who self-harm. Although their primary focus was fidelity to the TA model and clinical efficacy, they found that training in TA was associated with improved quality and the trainees reported high satisfaction.

Current Study

The purpose of this study was to examine graduate student experiences of learning TA and conducting a brief TA in the context of a required doctoral course in personality assessment (PA). The course design and objectives were largely a product of the context in which the course was being taught. When approached to teach this course, the instructor (JDS) was informed that the training program valued evidence-based practice, diversity, and social justice; that the personality assessment course had received abysmal course evaluations in recent years; and that psychological assessment was routinely criticized by the program's trainees as being reductionistic, culturally insensitive, pathologizing, and not useful to clinical practice. Thus, the instructor designed a course that would introduce students to PA from a TA perspective. The formal course objectives included: (1) Foster interest and curiosity in the practice of TA/PA; (2) Demonstrate the clinical utility of TA/PA; and (3) Provide a positive experience of TA/PA for trainees and clients. The course culminated in each trainee conducting a brief TA and completing a self-evaluation consisting of open-ended questions concerning their experience with the project. A qualitative coding system using a directed content analysis approach based on the course objectives and the core values of TA was developed and used to code the trainees’ responses. Client responses were coded according to the four subscales of the Assessment Questionnaire, Version 2 (AQ–2; Finn, Schroeder, & Tonsager, 2000): (1) New self-awareness/understanding; (2) Positive accurate mirroring; (3) Positive relationship with the examiner; and (4) Negative feelings about the assessment.

Method

Procedure

Ten graduate student trainees in a counseling psychology doctoral program were enrolled in a required graduate course titled “Psychological Assessment II – Personality.” The course covered administration, scoring, and interpretation of personality assessment instruments and other tests for psychopathology in adults. The principles, procedures, and techniques of the TA model were covered via readings, lecture, and videotaped sessions of the instructor conducting a TA with an adult client. The final course project consisted of trainees conducting a brief TA of a volunteer client. Volunteer clients were recruited from the master's program in Couples and Family Therapy via a program-wide electronic mail message offering a no-cost psychological assessment that could help the student with issues related to their clinical training and clinical work, as well as current interpersonal and mental health concerns. No additional incentive was offered for participation, and volunteer participants were not screened.

The TA comprised an initial session, one or two test administration sessions, a summary and discussion session, and the preparation of a TA letter for the client. In keeping with the TA model, in the initial session the client and trainee collaboratively developed assessment questions and discussed background related to each question, which guided the selection of psychological tests appropriate to answering the assessment questions. Examples of assessment questions posed in this study include, “How are my romantic relationships affected by codependency and approval-seeking and how can I be more balanced?”, “How can I start new friendships and maintain current friendships more easily and with less anxiety?”, and “Why might I worry so much and what might be potentially helpful coping strategies?” For this project, trainees were required to administer a broadband self-report measure, such as the Minnesota Multiphasic Personality Inventory–2 Restructured Form (Tellegen & Ben-Porath, 2008) or the Personality Assessment Inventory (Morey, 1991); the Early Memories Procedure (Bruhn, 1992); and a symptom or condition specific measure, such as the Beck Depression Inventory-II (Beck & Steer, 1987). Trainees were encouraged to administer additional tests when necessary to answer the assessment questions and to use extended inquiry techniques when appropriate. Those selected by trainees included the Wechsler Adult Intelligence Scales, Fourth Edition (Wechsler, 2008) and the Dissociative Experiences Scale (Carlson & Putnam, 1993). A summary and discussion session was then completed, which followed the procedures described by Finn (2007) and Smith and Finn (2014). This session focuses on answering the client's assessment questions based on the findings of the assessment and the client's history and current context. 
This brief version of the TA model did not include the assessment intervention session, which Finn (2007) considers to be the most advanced step in the model and notes is not always indicated. For this reason, some controlled empirical studies of TA have elected not to include it (e.g., De Saeger et al., 2014).

Trainees were required to complete two one-hour sessions of individual supervision with the instructor. The instructor (JDS) is well trained in TA procedures and psychological assessment: as a member of the training faculty of the Therapeutic Assessment Institute, he conducts professional workshops in the TA model and has published numerous peer-reviewed articles on the topic. Supervision sessions occurred following the initial interview and again prior to the summary and discussion session. An evaluation of trainees’ competence in test administration, accuracy in scoring, and basic interpretation of the results was incorporated into the supervision sessions. Worksheets from the TA training workbook (Smith & Finn, 2011) were completed by students prior to supervision to help the assessor conceptualize the case, prepare for the summary discussion, and focus supervision meetings. Trainees then wrote a technical psychological report and a TA letter. The TA letter, once reviewed and approved by the instructor, was mailed in hard copy to the client. The TA letter is written in first-person, nontechnical language and is intended as a communication to the client and a written summary of material already discussed in face-to-face sessions. It contains a summary of the testing conducted and formally answers the client's assessment questions. The technical report was a course requirement in order to meet a program competency requirement, which was developed to be consistent with the expectations for graduate trainees put forth in the American Psychological Association's Test User Qualifications (Turner, DeMers, Fox, & Reed, 2001). The technical report is not a mandatory aspect of the TA model but is often completed when a referring professional or entity requires it (e.g., the school system for determining educational accommodations). After completing the TA, trainees were required to complete a self-evaluation.
During the summary and discussion session, trainees asked their clients to provide feedback about the assessment; a form with open-response questions was then provided via electronic mail.

Participants

Of the ten graduate student trainees (90% female) enrolled in the course, four were in their second year of the doctoral program, five were in their third year, and one was in the fourth year. Two of the trainees held master's-level degrees and had practiced as independently licensed counselors prior to enrolling in the doctoral program. The graduate student volunteer clients (N = 10) from the master's program in Couples and Family Therapy were 100% female and ranged in age from 22 to 30 years (Mean = 23.6, SD = 1.77). Two volunteer clients were currently receiving psychotherapy.

Measures

Trainee self-evaluation

The trainee self-evaluation consisted of five open-response questions developed by the course instructor. The questions were developed with the aim of fostering self-reflection and a critical evaluation of strengths and areas for improvement in the context of delivering TA. A secondary aim of the evaluation was to indirectly assess the effectiveness of the instructor's teaching methods and supervisory practices leading up to the final project. The content of the self-evaluation was not used in the determination of the trainee's grade in the course but was required for successful course completion. All ten self-evaluations were returned. The following questions were sent to students via electronic mail following completion of the final project:

  1. What was your experience of conducting the assessment with your client?

  2. What was the most important thing you learned about the assessment process through this project?

  3. What did you find most helpful to you about the project in your learning of assessment and psychotherapy more generally?

  4. What areas of the project do you feel you did well at? Why?

  5. What areas do you see as needing improvement? Why?

Client feedback form

The client feedback form consisted of five open-response questions that were developed by the course instructor to inform trainees of their clients’ experiences of the TA. Five of the ten client feedback forms were returned. No incentive was offered to the client for completing the feedback form. The client feedback form was intended for the graduate trainee and was not used as a method of trainee evaluation in the course. The client feedback form included the following questions:

  1. Briefly, what did you expect from the assessment and how well did the assessment meet your expectations?

  2. What part(s) of the assessment did you find most valuable?

  3. What part(s) of the assessment were least valuable?

  4. What suggestions do you have for improving the way I [the trainee] do assessments?

  5. Please give any other comments.

Qualitative Coding, Reliability, and Analysis

The open-ended responses from the trainee self-evaluations and client feedback forms were qualitatively coded using a directed content analysis approach (Hsieh & Shannon, 2005). In this approach, coding categories are predetermined and developed from prior research or existing theory. These initial coding categories are then used to code the data, and any data that cannot be coded is set aside for further analysis as either a subcategory or a representation of a new category. In this study, the core values of TA (Finn, 2009) and the course objectives were used to guide the development of coding categories for the trainee self-evaluations. The coding categories for the client feedback forms were developed using the four subscales of the AQ–2 (Finn et al., 2000), a 48-item measure used to evaluate client experiences of psychological assessment: (1) New self-awareness/understanding; (2) Positive accurate mirroring; (3) Positive relationship with the examiner; and (4) Negative feelings about the assessment. These four factors were directly translated into coding categories. The initial and final coding systems for the trainee self-evaluations and client feedback forms are presented in Table 1. Each independent clause provided by the respondents was eligible for coding within one of the categories. Therefore, the same code could be given to multiple independent clauses provided by the respondents. A clause could not be included in two coding categories.

Table 1.

Qualitative Coding System: Initial and Final Coding Categories

Trainee Responses:
  1. Interest or curiosity
  2. Shifting perspective of TA/PA
  3. Clinical utility
  4. Intent/desire to use/learn TA/PA
  5. Aptitude in TA techniques/skills: a) competence; b) challenges
  6. Positive feelings about: a) TA/PA; b) course/instructor; c) assessment project
  7. Congruence with value system*

Client Responses:
  1. New self-awareness or understanding
  2. Positive accurate mirroring
  3. Relationship with the assessor: a) positive*; b) negative*
  4. Feelings about the assessment: a) positive*; b) negative*
  5. Practical/relevant to life*

Note. * added during coding.

Next, a doctoral-level graduate student (KNE) independently coded one case from both the trainee and client questionnaires. Coding was conducted by extracting key phrases in a line-by-line manner from the raw data. Data were then either coded using an existing category or identified as a potential new category or subcategory. The researchers then conducted consensus coding on these two cases by establishing exemplars, modifying existing categories, and deriving new categories as needed. After consensus coding was repeated for another case from the trainee self-evaluations, the graduate student conducted independent coding of the remaining cases. The first author (JDS) remained blind to the coding and conducted a reliability coding of three trainee self-evaluations and three client feedback forms selected at random. Reliability analyses yielded a Krippendorff's (2011) alpha of .84 for trainee evaluations and .80 for client feedback coding. Codes were entered into an Excel file, and frequencies and descriptive statistics for each coding category were calculated. We examined the frequency of codes at the macro level as well as the respondent and item levels. Based on the content of the codes within categories, we derived thematic descriptors of subcategories.
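For readers who wish to reproduce this style of reliability check, the following is a minimal sketch of Krippendorff's alpha for two coders assigning nominal categories with no missing data. The function name and example data are illustrative only; the authors' actual computation may have relied on dedicated statistical software rather than this hand-rolled formula.

```python
from collections import Counter
from itertools import permutations

def krippendorff_alpha_nominal(coder_a, coder_b):
    """Krippendorff's alpha for two coders, nominal categories, no missing data.

    alpha = 1 - (n - 1) * D_o / D_e, where D_o is the observed (off-diagonal)
    coincidence count and D_e is the chance-expected disagreement.
    """
    assert len(coder_a) == len(coder_b), "both coders must rate every unit"
    # Build the coincidence matrix: each unit contributes both ordered pairs.
    o = Counter()
    for a, b in zip(coder_a, coder_b):
        o[(a, b)] += 1
        o[(b, a)] += 1
    # Marginal totals per category.
    n_c = Counter()
    for (c, _k), count in o.items():
        n_c[c] += count
    n = sum(n_c.values())  # total pairable values (2 * number of units)
    # Observed disagreement: off-diagonal coincidences.
    d_o = sum(count for (c, k), count in o.items() if c != k)
    # Expected disagreement: cross-products of marginals over unequal pairs.
    d_e = sum(n_c[c] * n_c[k] for c, k in permutations(n_c, 2))
    if d_e == 0:
        return 1.0  # only one category ever used; alpha is trivially perfect
    return 1.0 - (n - 1) * d_o / d_e
```

For example, two coders who agree on three of four units (categories 1/1, 1/1, 0/0, 0/1) yield an alpha of about .53, while perfect agreement yields 1.0.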

Results

Trainee Self-Evaluations

All trainee responses were categorized within the a priori categories, with one exception. Table 2 contains the percentage of overall coded responses within each category as well as the proportion of trainees who provided a response in each category. The three positive feelings categories were aggregated into one category because the majority (64.52%) of these responses reflected positive feelings about the assessment project.

Table 2.

Trainee Self-Evaluations: Frequency of Coded Responses and Number of Respondents by Category

Coding Category Frequency Respondents (N = 10)
Clinical utility 24.71% 10
Positive feelings towards TA/PA 18.24% 8
Intent/desire to use/learn TA/PA 17.65% 10
Competence in TA/PA techniques/skills 15.88% 10
Shifting perspective of TA/PA 10.00% 9
Congruence with value system 8.24% 6
Needs improvement in TA/PA skills 5.29% 6
Interest or curiosity in TA/PA 0.00% 0

Clinical utility

Nearly a quarter (24.71%) of the overall coded responses reflected the clinical utility of TA/PA. All of the trainees (100%) provided at least one response coded to indicate clinical utility. Responses in this category highlighted the clinical benefits of TA. Trainees described how TA: (1) can be clinically meaningful and helpful; (2) facilitates the therapeutic alliance; (3) emphasizes collaboration; and (4) allows clients to be understood at a deep, therapeutic level. For example, four out of the ten trainees expressed that TA is a powerful therapeutic tool, and two trainees described TA as non-pathologizing. Other examples included facilitating greater depth of exploration (“I was able to talk with my client about her concerns on a deeper level than she has talked about with her own psychotherapist” [ID #1]; “The collaborative development of assessment questions lent itself to a genuine discussion of what was happening for my client and how she could address her concerns” [ID #10]); and reducing anxiety about the assessment:

“My client was fairly transparent about her discomfort opening up about personal experiences as a part of the assessment, and presented as guarded during the beginning of our first meeting. I kept this in mind throughout our work together, making sure to check in with my client about her experiences, and empathizing with her hesitations and emotional discomfort. By our final session she appeared much more relaxed and expressed her increased comfort with the assessment process.” (ID #2)

Positive feelings toward TA/PA

A notably large number of responses conveyed positive feelings toward TA/PA, the assessment project, and the course instructor (18.24% of codes; 80% of respondents). Responses in this category reflected an appreciation for the opportunity to conduct a brief TA and described TA as rewarding, enjoyable, and helpful. Five of the ten trainees expressed their positive feelings directly; for example, “It was great!” (ID #8) or “I had a very positive experience” (ID #7). Other representative responses included: “I really appreciated the opportunity to practice collaborative assessment” (ID #2), “It was rewarding to watch [my client] make novel connections based on the assessment results” (ID #5), and “It was motivating for me to see how powerful the experience was for my client” (ID #8).

Intent/desire to use/continue learning about TA/PA

All trainees expressed an intent or desire to use or continue learning about TA/PA (17.65% of codes; 100% of respondents). The responses in this category directly stated a desire to continue working with TA/PA, such as, “I would definitely do another one of these again” (ID #1). Other responses stated this intent more indirectly by identifying TA/PA skills and components in which they would like to gain more experience in the future. For example, “I want to increase my ability to use metaphors with clients” (ID #10). Other examples included: “Given that this was my first therapeutic assessment project, I believe that my comfort and skill level will continue to improve as I gain additional experience” (ID #4); “I would like to be able to communicate better with clients during feedback” (ID #8); and “Now that I have seen how to do TA, and got to try it myself, I can't imagine doing assessment any other way” (ID #9).

Competence in TA/PA skills

All trainees also expressed competence in TA/PA skills and techniques (15.88% of codes; 100% of respondents), which is expected given the nature of the fourth question, “What areas of the project do you feel you did well at? Why?” Four of the ten trainees stated that they did well during the feedback portion of the assessment, and five trainees reported being successful in collaborating and/or building rapport with their clients (e.g., “I did well developing rapport” [ID #3]). Other areas of competence included developing assessment questions, having respect for clients, using clinical skills appropriately (e.g., “I feel like I really ‘got’ the case conceptualization” [ID #9]), and executing a specific component of TA/PA (e.g., “The clinical interview [is an area] I am proud of” [ID #6]).

Shifting perspective of TA/PA

Nearly every trainee recognized a shift in their perspective of TA/PA (10% of codes; 90% of respondents). This category is defined by responses that indicate the respondent's view of PA had changed since taking the course and conducting the assessment project. For example, two trainees learned that TA/PA could be a collaborative experience, contrary to their previously held view that clinicians must maintain the detached expert role during an assessment. Three trainees discovered that the assessment could be clinically useful (e.g., “It felt much more like therapy than an assessment, and that's not something I expected at the beginning of the term” [ID #1]). Other characteristic responses included: “[I discovered] that I can conduct an assessment while maintaining my integrity as a person and approach as a psychologist” (ID #3); “Assessment can be very specific; I had no idea that an assessment could be so thorough and comprehensive” (ID #6); and:

“I learned that, contrary to popular belief, the assessment process can be quite collaborative. Popular opinion of testing often portrays assessment as unsupportive of client perspective or feedback. I envision a clinician spouting off a bunch of results that are not meaningful to a client's life, leaving the client feeling upset, unheard, and disempowered. However, it is now obvious that assessment does not have to mimic this stereotype.” (ID #5)

Congruence with personal value system

This category emerged during the coding procedures as a distinct theme. Some codes reflected that TA/PA techniques were congruent with the trainee's personal value system (8.24% of codes; 60% of respondents). These responses typically included descriptions of TA/PA as valuable, meaningful, or important. A consistent response was that TA techniques are congruent with how the trainee wants to function as a clinician: “The chance to develop assessment questions that were helpful to my client – and in doing so, balance the power differential between myself and my client – was more congruent with my typical therapeutic orientation” (ID #2); “I think that being well-versed in assessment is important for any mental health clinician” (ID #5); “I learned that assessment can be done in a way that fits who I am as a psychologist” (ID #9).

Challenges in using TA/PA skills

A small portion of codes identified areas of difficulty or struggle in learning TA/PA skills and conducting the brief TA (5.29% of codes; 60% of respondents). Codes in this category expressed difficulty, frustration, or struggle in certain areas or with specific components of TA/PA. Areas of difficulty varied by respondent, and no overarching themes or patterns emerged. Some examples are as follows: “I have a hard time with the concept of holding “knowledge” about my client and deciding within a session that they might not be in an appropriate place to hold that same information” (ID #2); “As I worked through the assessment process and received supervision this term, it became quite obvious that my understanding of the theoretical bases for different clinical perspectives is lacking” (ID #5); and “I forgot to administer a BAI or BDI, and felt rushed throughout this whole process” (ID #7).

Interest or curiosity

No codes were categorized as interest or curiosity in TA/PA.

Client Feedback

Table 3 includes the percentage of codes that fell into each category as well as the number of clients who endorsed each category.

Table 3.

Client Feedback: Frequency of Coded Responses by Category and Respondent

Coding Category Respondents (N = 5) Frequency
New self-awareness or understanding 4 17.78%
Positive accurate mirroring 5 11.11%
Relationship with the assessor
    a) positive* 5 14.44%
    b) negative* 0 0%
Feelings about the assessment
    a) positive* 5 37.78%
    b) negative* 4 6.67%
Practical/relevant to life* 4 12.22%

Note. * added during coding.

Feelings about the assessment

Over one third of the coded responses reflected positive feelings about the assessment (37.78% of codes), a category all five clients endorsed. Four of the five clients found the assessment as a whole to be valuable, and two clients reported that the experience exceeded their expectations. Other clients expressed appreciation for the feedback session, the TA letter, or other specific components of TA/PA. Exemplars of this category included: “I really enjoyed this process” (ID #1); “Actually hearing about my results and how they relate to my questions and concerns was really interesting and valuable” (ID #3); and “It was a great experience and I wish I could do it again!” (ID #5).

In contrast, negative feelings about the assessment were reflected by only 6.67% of codes (4/5 clients). The particular aspect of TA/PA that clients disliked or felt uncomfortable with varied across respondents. Examples included, “I didn't like talking about the early childhood memories” (ID #1) and “I felt that it moved a little quickly from one thing to the next a couple of times” (ID #4).

New self-awareness or understanding

A noteworthy proportion of responses were coded in the category of new self-awareness or understanding (17.78%, 4/5 clients). This category captured client accounts of gaining new insight about themselves or learning something unexpected from the assessment. Exemplars included: “I didn't think three sessions would have as profound of an effect as they did” (ID #2); “It gives me tremendous insight into who I am and why I act the way I do” (ID #3); and “I really see myself in a different way than I did before the assessment” (ID #4).

Relationship with the assessor

Notably, no clients expressed a negative relationship or negative aspects of their relationship with their assessor. Instead, all five clients expressed a positive relationship with their assessor (14.44% of codes). Responses ranged from direct positive statements, such as “My assessor was fantastic” (ID #1), to more specific aspects of the therapeutic alliance, such as “[The assessor] really listened to me and helped me to feel comfortable and safe” (ID #5). Other examples included: “In terms of working with [my assessor], I was truly impressed” (ID #2), “I felt [my assessor] conducted the entire assessment process professionally and empathetically” (ID #3), and “I was a little bit closed off when I went into the assessment but [my assessor] made me feel very comfortable and that I would be safe to share personal things and also explore my personality and relationships without fear” (ID #4).

Practical or relevant to life

This category, which emerged as distinct from the predetermined categories during the initial coding process, captured 12.22% of coded responses (4/5 clients). Responses described the impact of the assessment on clients’ current functioning or the utility of the assessment experiences or findings in their everyday lives. In a testament to the utility of the assessment, three clients suggested that the assessment project be required of or offered to all beginning psychology trainees. Other exemplars included: “[My assessor] did a wonderful job relating [the assessment] to my childhood and current functioning” (ID #3); “I noticed myself doing things differently (for the better) immediately after even the first session” (ID #4); and “I think anyone who wants to become a counselor should do this” (ID #5).

Positive accurate mirroring

All five clients reported an experience of positive accurate mirroring (11.11% of codes). Positive accurate mirroring refers to the ability of the assessment or assessor to accurately capture and reflect aspects of the client that are congruent with their self-concept. Three clients stated that they felt understood by their assessor, and two clients felt that their concerns and feelings were sufficiently validated. Other examples included: “I think much of the assessment results were things I knew about myself” (ID #3); “I was so impressed at how well understood I felt” (ID #5); and

“There were several lines [in the TA letter] that I had to re-read several times because they struck such chords (one was ‘The findings show that you might be minimizing the impact of your experiences.’). Very simple statement, but very true, and no one has put it to me like that before” (ID #2).

Discussion

Although recent research demonstrates that training in psychological assessment has either remained stable or increased in the past decade, more research on student experiences of training in psychological assessment could help perpetuate an upward trend by identifying areas in need of improvement and more effective methods of training and conducting assessment. This qualitative study examined trainees’ experiences of learning and then conducting a brief TA as part of a required doctoral course in personality assessment. The results indicated high acceptability of PA when taught from the TA perspective. For example, all of the trainees conveyed the clinical utility of TA/PA, felt competent in particular skills and techniques of TA, and expressed an intent or desire to continue using or learning about TA/PA. The majority of the trainees’ responses concerned the clinical utility of TA/PA and their positive feelings towards assessment. There was also a notable shift in student perspectives of TA/PA from dehumanizing, hierarchical, and pathologizing to collaborative and therapeutic. Finally, trainees expressed the unanticipated sentiment that TA was congruent with their personal and professional values. Further, the formal course evaluations administered by the university indicated high trainee ratings for the course and the instructor (course quality = 4.17, instructor = 4.30; 5 = exceptional, 1 = poor).

The results indicated not only a positive response from the trainees but also an equally favorable response from their clients. Of those who responded to the follow-up questionnaire, the majority of responses conveyed positive feelings toward the assessment and the assessor, and no negative feelings toward the assessor were expressed. These results suggest that the collaborative stance of the assessor, along with the other core values of TA, likely elicited the clients’ positive responses. This interpretation of the results is supported by empirical studies concerning the assessor-client relationship, which is fostered by the structure and techniques of the TA model (e.g., De Saeger et al., 2014; Hilsenroth et al., 2004). Further, client reports of experiencing positive accurate mirroring, indicating that the assessor made the client feel understood and validated, reflect a key component of the theory of change in TA: During the summary and discussion of findings, clients are first presented with information that is congruent with their self-view; in other words, they experience positive accurate mirroring. Then, in accord with self-verification theory, clients are presented with information that is slightly discrepant from their self-schemas (Finn, 2007; Smith & Finn, 2014). When clients become actively involved in changing and revising the ways in which they view themselves through the collaborative processes of TA, their self-schemas are more amenable to change (Finn et al., 2012). The results from this study are consistent with this theory of change, in that client responses indicated experiencing new self-awareness and understanding. This explicit goal differs from traditional approaches to psychological assessment and highlights the clinical utility of the TA model and its effectiveness as a brief intervention.

No pattern emerged from the responses concerning areas of TA/PA in which the trainee could improve or needed additional training; each response seemed to reflect a unique area of potential growth for the trainee. Similarly, no patterns emerged from the client responses concerning negative feelings toward the assessment process; each instead reflected a personal area of discomfort for the client. Last, interestingly, no trainee responses were coded in the a priori category of “interest or curiosity.” In our experience of coding trainee responses, sentiments reflecting an implicit interest or curiosity in TA/PA appeared in phrases that were ultimately coded under “intent/desire to use/continue to learn about TA/PA.” Future research might consider posing more direct and distinct questions concerning these two areas.

Although there was no control group for comparison in this study, previous research supports the hypothesis that the core values of TA may be responsible for the high acceptability and positive feelings expressed by clients. In multiple studies comparing TA to traditional assessment (e.g., Ackerman et al., 2000; Hilsenroth et al., 2004) or another brief intervention (De Saeger et al., 2014), clients in the TA condition reported acceptability or satisfaction as high as or higher than the comparison condition.

The category that emerged in the client feedback, which we labeled “practical or relevant to life,” is an interesting result. This finding overlaps substantially with the trainee perspective of the TA having clinical utility; that is, it is useful in the endeavor of helping the client with his or her concerns. It remains to be tested whether this theme would emerge with a clinical population, who arguably expect that an intervention with a psychologist will be relevant to their lives, or whether it is an anomaly attributable to the client sample and the assessors involved. Given the fairly pervasive rhetoric concerning the limited therapeutic benefit of psychological assessment, this sample of student “clients” may well have entered the TA with a set of expectations about the process and outcome. It is plausible that clients’ finding the TA useful reflected a shift in perspective similar to the sentiment expressed by the assessors; this is deserving of further inquiry. Further, clients who were also in training in a mental health field might have been more tolerant of the assessor's anxieties and could have identified with the assessor's efforts. This study needs to be replicated with a clinical population.

Although this study is limited to one course in a particular type of doctoral training program, the results are likely applicable to two types of training programs that would benefit from the integration of a TA perspective in teaching psychological assessment: humanistically and interpersonally oriented programs, arguably more often core to programs adhering to the practitioner-scholar training model, and clinical science programs. Concerning the former, TA emphasizes collaboration and renders psychological assessment less pathologizing, which is consistent with the values of training programs that are more interpersonal or humanistic in nature. The historical development of TA from the school of humanistic psychology (Finn, 2007; Fischer, 1985/1994) attests to the complementarity of this approach within a humanistic training program. Our results indicating a shift in perspective, particularly trainees’ surprise at how collaborative and therapeutic assessment can be, are evidence of the division between the ideology of humanistic training and the marginalization and portrayal, inaccurate in many ways, of psychological assessment. Our findings indicate that TA runs contrary to the depiction of psychological assessment as dehumanizing, unsupportive, and meaningless to the client and to the endeavor of effective intervention and reduction of suffering. The unanticipated theme arising in trainee responses concerning the congruence of TA with their personal and professional values further supports our assertion that TA fits within the interpersonal and humanistic traditions.

Clinical science programs may also benefit from teaching psychological and personality assessment from the TA perspective. A widespread view among clinical science programs is that psychological assessment has little treatment and clinical utility (e.g., Hunsley & Mash, 2007). Our results unequivocally indicated that trainees saw the value and clinical utility of TA/PA. Additionally, the growing body of research on TA supports its utility for a variety of clinically meaningful outcomes, from symptom improvement (e.g., Finn & Tonsager, 1992; Smith et al., 2010) to engagement in indicated services (e.g., Ougrin et al., 2008) and preparation for treatment (e.g., De Saeger et al., 2014; Smith et al., 2015). TA could serve as a bridge past antiquated views concerning the usefulness of assessment and achieve acceptability among trainees and practicing psychologists, in turn increasing its use and improving training.

Despite the potential benefits and positive student responses to teaching from a TA perspective, implementing a quality curriculum will be a challenge without training instructors in how to teach and supervise TA. The Therapeutic Assessment Institute currently offers training for clinical application of the TA model; however, formal training in TA instruction and supervision has yet to be developed. Clinical training in the model to the competence or certification level, followed by consultation with instructors teaching courses on TA or from a TA perspective, is perhaps the best available pathway at this time. Further, there is some evidence that clinicians being trained in TA are sometimes challenged personally as well as professionally when working with clients (Finn, 1998; Haydel, Mercer, & Rosenblatt, 2010) and that confronting such challenges is an integral part of learning to practice TA effectively (Finn, 2005). Nevertheless, we suggest incorporating TA principles and procedures into the existing required sequence of psychological assessment courses. The way the course was taught in this study demonstrates that TA can be incorporated into existing coursework.

Limitations

The primary limitations of this study are the characteristics of the sample and the lack of a control group. The sample is small, with only ten trainees and five clients, and these trainees may have held a view of assessment that differs from that of trainees in other programs. For this reason, understanding the context of the program and how trainees view assessment is critical to effectively teaching a course such as this; assessing this early on, both in class and with program faculty, is highly recommended. Further, the sample comes from one program at one university, with a single instructor. The type of program limits the generalizability of the findings, as it may differ greatly from other training programs. An interesting question for future research is a comparison of counseling and clinical psychology training programs, which could differ in their respective views of psychological assessment and the messages conveyed to trainees.

The course instructor was particularly well trained to teach a course from the TA perspective, having extensive experience and training in the model as well as consultation from experts in the TA community concerning teaching TA. As already mentioned, the issue of training instructors in the TA model is a barrier to implementation. Further, the study design poses some limitations, including its retrospective nature and the use of a measure designed by the course instructor but not intended for qualitative analysis. As such, we were unable to query respondents to clarify the meaning of their responses, which might have helped align our theoretically derived coding system with the measure; nevertheless, the researchers found very few examples where a query would have been beneficial. Similarly, the client feedback form was created by the course instructor to assess the client's perspective of the TA in order to inform the trainee. Coding of the responses was based on the four factors of the AQ–2. However, the results indicated that client responses mapped well onto the AQ–2 and that coding could be done with high interrater reliability.
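The article reports high interrater reliability for the coding but does not reproduce the computation, and its reference list points to Krippendorff (2011) on coder agreement. As a minimal, hypothetical sketch only (the category labels below are invented for illustration and are not the study's data), one common chance-corrected agreement statistic for two coders assigning nominal codes, Cohen's kappa, can be computed as:

```python
from collections import Counter

def cohens_kappa(coder1, coder2):
    """Cohen's kappa for two coders assigning nominal category labels."""
    assert len(coder1) == len(coder2) and len(coder1) > 0
    n = len(coder1)
    # Observed agreement: proportion of units both coders labeled identically.
    observed = sum(a == b for a, b in zip(coder1, coder2)) / n
    # Chance agreement: product of each coder's marginal category proportions.
    c1, c2 = Counter(coder1), Counter(coder2)
    expected = sum((c1[k] / n) * (c2[k] / n) for k in set(c1) | set(c2))
    return (observed - expected) / (1 - expected)

# Hypothetical labels for ten coded response units (illustration only).
a = ["utility", "positive", "utility", "shift", "positive",
     "utility", "values", "positive", "shift", "utility"]
b = ["utility", "positive", "utility", "shift", "positive",
     "utility", "positive", "positive", "shift", "utility"]
print(round(cohens_kappa(a, b), 2))  # → 0.85
```

Kappa discounts agreement expected by chance from the coders' marginal distributions, which is why it is preferred over raw percent agreement when some categories dominate.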

Replication of these results with treatment-seeking, rather than volunteer, clients is an important next step. Relatedly, there is some inherent risk in not screening the volunteers for certain neurological and psychiatric conditions, suicidality, and other concerns that could have affected the assessment and therefore the training aims and generalizability of the findings. The low rate of client response is also an issue, although client responses were largely intended to supplement trainee experiences of the TA, and a number of studies already report the high acceptability of the TA model from the client's perspective (e.g., Tharinger et al., 2009). Last, we did not conduct a joint feedback session between the assessor, the client, and the client's psychotherapist, as would be best practice in the TA approach (see Smith et al., 2015); however, this arrangement is not always feasible and would have applied to only two of the ten clients.

Conclusion

Despite the limitations of this study and the existing barriers to implementing a TA-based assessment training curriculum, the results provide evidence of positive trainee experiences with PA when taught from a TA perspective. This study also offers an example of how TA can be integrated into an existing required graduate course in PA in a manner that renders psychological assessment more acceptable and clinically useful to graduate trainees. Last, the coding categories offer a blueprint for the development of a measure to assess trainees’ experiences of learning and practicing TA. Monitoring the experiences of trainees, as well as assessing the effectiveness of teaching methods via evaluations of fidelity to the model and clinical effectiveness, would undoubtedly lead to improvements in teaching practices and curriculum development. This is an area of training in TA and PA that is currently underdeveloped. Early training experiences that foster interest, shift perspectives, and lead to an appreciation of the clinical utility of psychological assessment are an important factor in continuing the revival of assessment in professional practice.

Figure 1. Trainee Self-Evaluations: Percent of Codes Reflected by Each Category

Figure 2. Client Feedback: Percent of Codes Reflected by Each Category

Acknowledgments

Funding

Justin D. Smith received support from National Institute on Drug Abuse grant DA027828 awarded to C. Hendricks Brown for the Center for Prevention Implementation Methodology for Drug Abuse and Sex Risk Behavior.

Contributor Information

Justin D. Smith, Center for Prevention Implementation Methodology, Department of Psychiatry and Behavioral Science, Northwestern University Feinberg School of Medicine

Kaitlyn N. Egan, Department of Psychology & Neuroscience, Baylor University

References

1. Ackerman SJ, Hilsenroth MJ, Baity MR, Blagys MD. Interaction of therapeutic process and alliance during psychological assessment. Journal of Personality Assessment. 2000;75(1):82–109. doi: 10.1207/S15327752JPA7501_7.
2. Beck AT, Steer RA. Beck Depression Inventory manual. The Psychological Corporation; San Antonio, TX: 1987.
3. Bruhn AR. The Early Memories Procedure: A projective test of autobiographical memory, part 1. Journal of Personality Assessment. 1992;58(1):1–15. doi: 10.1207/s15327752jpa5801_1.
4. Carlson EB, Putnam FW. An update on the Dissociative Experiences Scale. Dissociation: Progress in the Dissociative Disorders. 1993;6(1):16–27.
5. Clemence AJ, Handler L. Psychological assessment on internship: A survey of training directors and their expectations for students. Journal of Personality Assessment. 2001;76:18–47. doi: 10.1207/S15327752JPA7601_2.
6. Curry KT, Hanson WE. National survey of psychologists’ test feedback training, supervision, and practice: A mixed methods study. Journal of Personality Assessment. 2010;92(4):327–336. doi: 10.1080/00223891.2010.482006.
7. De Saeger H, Kamphuis JH, Finn SE, Smith JD, Verheul R, van Busschbach JJV, Horn E. Therapeutic Assessment promotes treatment readiness but does not affect symptom change in patients with personality disorders: Findings from a randomized clinical trial. Psychological Assessment. 2014;26(2):474–483. doi: 10.1037/a0035667.
8. Finn SE. Teaching Therapeutic Assessment in a required graduate course. In: Handler L, Hilsenroth M, editors. Teaching and learning personality assessment. Erlbaum; Mahwah, NJ: 1998. pp. 359–373.
9. Finn SE. How psychological assessment taught me compassion and firmness. Journal of Personality Assessment. 2005;84(1):29–32. doi: 10.1207/s15327752jpa8401_07.
10. Finn SE. In our client's shoes: Theory and techniques of Therapeutic Assessment. Erlbaum; Mahwah, NJ: 2007.
11. Finn SE. Core values in Therapeutic Assessment. 2009. Retrieved from http://www.therapeuticassessment.com.
12. Finn SE, Fischer CT, Handler L, editors. Collaborative/therapeutic assessment: A casebook and guide. John Wiley & Sons, Inc.; Hoboken, NJ: 2012.
13. Finn SE, Schroeder DG, Tonsager ME. The Assessment Questionnaire–2 (AQ–2): A measure of clients’ experiences with psychological assessment. Unpublished manuscript, available from Stephen Finn, Austin, TX. 2000.
14. Finn SE, Tonsager ME. Therapeutic effects of providing MMPI-2 test feedback to college students awaiting therapy. Psychological Assessment. 1992;4(3):278–287. doi: 10.1037/1040-3590.4.3.278.
15. Fischer CT. Individualizing psychological assessment. Erlbaum; Mahwah, NJ: 1985/1994.
16. Handler L, Smith JD. Education and training in psychological assessment. In: Graham JR, Naglieri JA, editors. Handbook of Assessment Psychology. 2nd ed. Vol. 10. Wiley; New York, NY: 2012. pp. 211–238.
17. Haydel ME, Mercer BL, Rosenblatt E. Training assessors in Therapeutic Assessment. Journal of Personality Assessment. 2010;93(1):16–22. doi: 10.1080/00223891.2011.529004.
18. Hilsenroth MJ, Peters EJ, Ackerman SJ. The development of therapeutic alliance during psychological assessment: Patient and therapist perspectives across treatment. Journal of Personality Assessment. 2004;83(3):332–344. doi: 10.1207/s15327752jpa8303_14.
19. Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qualitative Health Research. 2005;15(9):1277–1288. doi: 10.1177/1049732305276687.
20. Hunsley J, Mash EJ. Evidence-based assessment. Annual Review of Clinical Psychology. 2007;3:29–51. doi: 10.1146/annurev.clinpsy.3.022806.091419.
21. Krippendorff K. Agreement and information in the reliability of coding. Communication Methods and Measures. 2011;5(2):93–112. doi: 10.1080/19312458.2011.568376.
22. Krishnamurthy R, VandeCreek L, Kaslow NJ, Tazeau YN, Miville ML, Kerns R, Benton SA. Achieving competency in psychological assessment: Directions for education and training. Journal of Clinical Psychology. 2004;60(7):725–739. doi: 10.1002/jclp.20010.
23. Laine C, Davidoff F. Patient-centered medicine: A professional evolution. Journal of the American Medical Association. 1996;275(2):152–156.
24. Martin H. A bright future for psychological assessment. Psychotherapy Bulletin. 2009;44:23–26.
25. Morey LC. Personality Assessment Inventory professional manual. Psychological Assessment Resources; Odessa, FL: 1991.
26. Newman ML, Greenway P. Therapeutic effects of providing MMPI-2 test feedback to clients at a university counseling service: A collaborative approach. Psychological Assessment. 1997;9(2):122–131. doi: 10.1037/1040-3590.9.2.122.
27. Norcross JC, Karpiak CP, Santoro SO. Clinical psychologists across the years: The Division of Clinical Psychology from 1960 to 2003. Journal of Clinical Psychology. 2005;61(12):1467–1483. doi: 10.1002/jclp.20135.
28. Ougrin D, Ng AV, Low L. Therapeutic Assessment based on cognitive-analytic therapy for young people presenting with self-harm: Pilot study. Psychiatric Bulletin. 2008;32:423–426. doi: 10.1192/pb.bp.107.018473.
29. Ougrin D, Zundel T, Ng AV, Habel B, Latif S. Teaching Therapeutic Assessment for self-harm in adolescents: Training outcomes. Psychology and Psychotherapy: Theory, Research and Practice. 2013;86(1):70–85. doi: 10.1111/j.2044-8341.2011.02047.x.
30. Piotrowski C. On the decline of projective techniques in professional psychology training. North American Journal of Psychology. 2015a;17(2):259.
31. Piotrowski C. Projective techniques usage worldwide: A review of applied settings 1995–2015. Journal of the Indian Academy of Applied Psychology. 2015b;41(3, Special issue):9–19.
32. Ready RE, Veague HB. Training in psychological assessment: Current practices of clinical psychology programs. Professional Psychology: Research and Practice. 2014;45(4):278–282.
33. Smith JD, Eichler W, Norman K, Smith SR. The effectiveness of a therapeutic model of assessment for psychotherapy consultation: A pragmatic replicated single-case study. Journal of Personality Assessment. 2015;97(3):261–270. doi: 10.1080/00223891.2014.955917.
34. Smith JD, Finn SE. Workbook for training and supervision in Therapeutic Assessment. Therapeutic Assessment Institute; Austin, TX: 2011.
35. Smith JD, Finn SE. Therapeutic presentation of multimethod assessment results: Empirically supported guiding framework and case example. In: Hopwood CJ, Bornstein RF, editors. Multimethod clinical assessment of personality and psychopathology. Guilford Press; New York, NY: 2014. pp. 403–425.
36. Smith JD, Handler L, Nash MR. Therapeutic Assessment for preadolescent boys with oppositional-defiant disorder: A replicated single-case time-series design. Psychological Assessment. 2010;22(3):593–602. doi: 10.1037/a0019697.
37. Stedman JM, Hatch JP, Schoenfeld LS. The current status of psychological assessment training in graduate and professional schools. Journal of Personality Assessment. 2001;77(3):398–407. doi: 10.1207/S15327752JPA7703_02.
38. Tellegen A, Ben-Porath YS. The MMPI-2-RF. Pearson Assessments; Minneapolis, MN: 2008.
39. Tharinger DJ, Finn SE, Gentry L, Hamilton AM, Fowler JL, Matson M, Walkowiak J. Therapeutic Assessment with children: A pilot study of treatment acceptability and outcome. Journal of Personality Assessment. 2009;91(3):238–244. doi: 10.1080/00223890902794275.
40. Turner SM, DeMers ST, Fox HR, Reed G. APA's guidelines for test user qualifications: An executive summary. American Psychologist. 2001;56(12):1099–1113. doi: 10.1037/0003-066X.56.12.1099.
41. Watkins CE. What have surveys taught us about the teaching and practice of psychological assessment? Journal of Personality Assessment. 1991;56(3):426–437. doi: 10.1207/s15327752jpa5603_5.
42. Wechsler D. Wechsler Adult Intelligence Scale, 4th Edition (WAIS-IV). Psychological Corporation; New York, NY: 2008.
