Abstract
Despite growing interest in data visualization and graphically aided reporting, the evaluation literature could benefit from additional guidance on systematically integrating visual communication design and marketing into comprehensive communication strategies to improve data dissemination. This article describes the role of targeted communication strategies—based on visual communications, design, and marketing theory—in producing more effective reports. In evaluation practice, well-synthesized and translated reports often require the integration of data from multiple sources, methods, and/or time points to communicate complex findings in ways that elicit productive responses. Visual communication strategies, such as project branding or designing actionable tools with marketing principles in mind, can be applied to optimize effective reporting of complex evaluation findings. This article references a longitudinal, mixed-method evaluation of public school administrators in Michigan to illustrate the application of a systematic communication design framework to produce several graphically aided project materials and subsequent findings reports.
Keywords: dissemination, data visualization, communication, design, reporting
Buzzwords associated with graphically aided data dissemination flood the air at evaluation conferences and fill the meeting rooms of evaluation practitioners. "Data visualizations" and "infographics," for example, are tools that evaluation clients want to see and that evaluators want to learn how to produce. Clients and evaluators alike are becoming more invested in the promise of well-designed visual tools to end the era of dust-gathering findings reports destined to take up residence on the shelf-of-good-intentions. Yet, as the field makes strides to justify and define the use of these tools in evaluation, systematic methods of integrating data visualization and graphically aided reporting into comprehensive communication and marketing strategies remain largely undefined and elusive. Some evaluators acknowledge the need to take an interdisciplinary approach to embedding design in evaluation reporting and to explore how basic design principles can be applied to create better graphics and visuals (e.g., Azzam, Evergreen, Germuth, & Kistler, 2013; Evergreen, 2011a; Evergreen, 2011b; Evergreen & Emery, 2016; Evergreen & Metzner, 2013; Henderson & Segal, 2013; Lysy, 2013; Pankaj & Emery, 2016). Additionally, program evaluators have written comprehensive guidebooks on the topic, providing practitioners with instruction on applying design principles to craft better charts, graphs, or presentations (Evergreen, 2017, 2018). Building on these works, which illustrate how to create more effective products by integrating basic design principles into reporting, the field can advance further by better understanding the theoretical underpinnings of visual communication design and marketing. Rather than positioning data visualization or infographics as products that evaluators must distribute to be effective, these theories suggest that data visualization and the creation of infographics are processes that can be embedded in larger systems of communication (Noble & Bestley, 2011).
To clarify this distinction, it is helpful to take a closer look at these buzzwords themselves. Data visualization is both a product (e.g., a well-constructed graph) and a process studied and developed within entire disciplines (Friedman, 2003). Infographics are interesting and easily understood products that tell stories in attractive ways, but the science of information graphics lies in processes refined through visual communication studies and by practitioners such as the innovative journalists who popularized them in rapid-reporting news magazines. The power of information graphics in these contexts is rooted not in visual appeal but in the use of diagramming and similar graphic communication tools, which transcend the linear constraints of text-based learning through their ability to convey complex relationships, nonlinear processes, and simultaneity of information (Frascara, 2001).
For evaluators, features that allow for the clear interpretation of complex relationships or nonlinear processes are critical when faced with synthesizing longitudinal, multimethod, or diverse sets of data and effectively translating meaningful stories that evoke actionable learning or appropriate responses. This desired learning or audience response encapsulates evaluators’ ultimate goal of data use, which current literature argues may rely heavily on the communication of a clear message via easily interpretable reports (Evergreen, 2011b; Evergreen & Metzner, 2013). To date, the evaluation literature has largely focused on understanding the interpretive processes of the reader (e.g., the impacts of culturally appropriate colors, visual literacy, and working memory) to create comprehensible or captivating reports (e.g., Evergreen, 2011a, 2011b, 2018; Evergreen & Metzner, 2013). This article, on the other hand, shifts the focus to the front-end synthesis and translation processes of the communicator (i.e., the evaluator) to illustrate a systematic approach, grounded in visual communication design and marketing theories, to disseminating complex evaluation findings. More specifically, the communication plan and corresponding findings reports from the Michigan School Program Information (MiSPI) project evaluation will be used to illustrate a systematic framework employing strategies of brand, objective, narrative, and design (BOND). In sum, evaluators need more formal strategies and tools to help improve the processes of visual communication design and the dissemination of evaluation findings. The BOND framework offers potential solutions to these problems by (1) providing evaluators with an approach to guide the synthesis and translation of complex data, (2) integrating visual communication and marketing principles into report design to improve the dissemination of evaluation data, and (3) presenting a comprehensive framework focused on the process of systematic visual communication design.
An Applied Example: The MiSPI Project
The MiSPI project focuses on understanding how public school administrators find information about school programs and how they use this information to decide which programs to implement in their districts (Neal, Mills, McAlindon, Neal, & Lawlor, in press; Neal, Neal, Kornbluh, Mills, & Lawlor, 2015; Neal, Neal, Lawlor, Mills, & McAlindon, 2018; Neal, Neal, Mills, & Lawlor, in press). As part of this project, the MiSPI team conducted a longitudinal statewide evaluation of public school administrators from Fall 2015 to Spring 2017, with initial pilot data collected in 2013. The purpose of the evaluation was to understand the use and flow of research evidence and other information about school programs among administrators in Michigan’s public school systems. Overall, the project generated data from 334 school districts via 152 qualitative interviews and 948 surveys of school administrators, information brokers, and researchers. Analyses of evaluation questions were conducted using several approaches, including social network analysis, spatial analysis, qualitative open coding, and various descriptive statistics.
Throughout the development and completion of the MiSPI evaluation, it became clear that managing the large amount of data—to be reported at multiple times, on multiple levels, to multiple audiences, and from multiple sources—would present a communication challenge. This challenge, although common in many evaluations, is not well addressed in the current literature or in conventional practitioner training. Thus, evaluations of this size or duration often leave the evaluators feeling as though they are “drowning in data” (Neal & Neal, 2017). In other words, moving from Point A (a wealth of diverse raw data) to Point B (concise answers to relevant questions, delivered to the right people in actionable ways) is not currently guided by an established standard of practice.
A Process for Applying Visual Communication Design and Marketing Theory to Strategic Evaluation Reporting: The BOND Framework
To address this challenge, MiSPI evaluators opted to explore the science related to visual communication design and marketing to help construct the BOND framework, which would provide an effective communication strategy. Drawing on literature and case examples from social marketing (e.g., Andreasen, 2002; Kotler & Zaltman, 1971; Lefebvre, 2011), graphic design (e.g., Cornish, Goodman-Deane, Ruggeri, & Clarkson, 2015; Noble & Bestley, 2016), visual communications (e.g., Frascara, 2004; Frascara, 2001), and data visualization (e.g., Azzam et al., 2013; Evergreen, 2011a; Evergreen & Emery, 2016; Evergreen & Metzner, 2013; Henderson & Segal, 2013; Onwuegbuzie & Dickinson, 2008; Strecker, 2012), we combined four guiding strategies for more effectively synthesizing and translating complex evaluation data into one cohesive framework. The strategies within the BOND framework include branding (building and presenting the identity of the evaluation), objective setting (thinking beyond communication of the findings to the desired responses), narrative formation (synthesizing and translating compelling stories), and design (simplifying complexity into manageable products). Although outlined here in the order B–O–N–D, the elements of the framework are intended to be considered iteratively throughout the report-building process. The following sections on each strategy outline (1) why to use the strategy and the theoretical rationale and (2) how to use each one, with guidance for use in evaluation contexts as well as examples from a real-world, practical application with the MiSPI project (see Table 1 for a summary of the BOND framework definitions, strategies, and examples).
Table 1.
The BOND Framework Definitions, Strategies, and Examples.
| Component | Definition | Strategies | MiSPI Examples |
|---|---|---|---|
| Brand | Building and presenting the identity of the evaluation and the findings | Use color schemes, graphic elements, value statements, and so on in early correspondences, data collection tools, and informational materials | Use of MSU branding within MiSPI logo, consistent colors, style, language (see a in Figures 1–3 and http://www.mispi.org) |
| Objective | Thinking beyond the communication of the findings to the desired actionable responses | Use actionable questions like “what needs to happen as a result of the report?” to guide objective setting | Use of multiple data points/sources to support the same objective; all data points related to one clear desired response (see b and c in Figures 1–3) |
| Narrative | Synthesizing and translating compelling stories from the data | Use maps, storyboards, diagrams, and so on to organize themes and assign data points to each theme; connect the themes into a full story | Use of storyboard grid to arrange, emphasize, and connect chosen data points into one cohesive message (see Figure 4) |
| Design | Simplifying complex information into a manageable and intelligible product | Minimize design elements and use every graphic on the page with purpose, whether for segmenting, grouping, or guiding the eye | Use of color, proximity, and shading to connect themes across report pages (see e in Figures 1–3); symbols or pictures to minimize text (see f in Figures 1 and 3) |
Note. MiSPI = Michigan School Program Information; BOND = brand, objective, narrative, and design.
Brand: Building an Identity
Why to use: Brand
The fields of public health communications (Evans & Hastings, 2008), social marketing (Andreasen, 2002; Lefebvre, 2011), and visual communication design (Noble & Bestley, 2011) stress the importance of a “brand” or the identity attached to products or messages that need to be promoted. Brands are commonly defined as “a set of associations linked to a name, mark, or symbol associated with the product or service” (Calkins, 2005, p. 1; Evans & Hastings, 2008). While the term may elicit thoughts of flashy logos and catchy slogans, the practice of branding is in fact much more involved and intentional (Lefebvre, 2011; Noble & Bestley, 2011), such that it can be used to help systematically improve the communication of evaluation findings.
The purpose of branding is to give meaning, relevance, or a positional stance to the product it represents. In social marketing, for example, which involves the marketing of individual changes to promote the adoption of social causes, the desired product is not the materials or programs created but individual changes in attitudes or behavior (e.g., donating to the cause, recycling more frequently, getting vaccinated; Andreasen, 1995; Kotler & Zaltman, 1971). Branding those actions can help make them more recognizable, appealing, safe, or relatable, and the success of social marketing campaigns hinges on the success of the branding (Lefebvre, 2011). Similarly, when an evaluator creates a findings report, the desired final product is not the report itself, but the changes in attitudes or behaviors that it promotes (e.g., clients become aware of local health disparities, or clients adopt or support a new intervention). Thus, evaluators can leverage the capabilities of branding to make these changes more memorable, familiar, or personally applicable. In a discussion of branding within social marketing, Andreasen (2002) noted that “approaches to solving any problem … gain favor when they are widely perceived as superior to the alternatives” (p. 5). Thus, the point of making new information or actions more appealing via the branded presentation of relevant data is to make accepting the information, or enacting the actions, more favorable than the alternatives (i.e., dismissing the information or choosing not to respond).
How to use: Brand
So, how does an evaluator use branding to increase the saliency of their findings? Again, the answer is not necessarily found in flashy catchphrases or logos but in building memorable, relatable, and influential identities throughout the evaluation. Building a brand for an evaluation can start by incorporating the images, colors, layouts, or values associated with the project into the design of planning or recruitment materials (Evergreen & Metzner, 2013). For example, this could include drawing imagery (icons, photos, logos, and language) from the program being evaluated. This imagery can be used throughout the project on recruitment documents to increase response rates, on data collection materials for subsequent waves to promote recall, or on compensation letters to encourage appreciation for the project and elicit further interest. Finally, when it is time to present the data, using this imagery and identity to design findings reports will link the audience back to data collection, helping them to connect their personal efforts (i.e., filling out the survey with the recallable imagery) to the reports or to tap into the feelings of familiarity that have formed from getting to know the brand over time. This will also help the audience relate findings reports back to a shared value (e.g., a desire for evidence-based programs to be used in schools) capable of inspiring the desired response to the findings (e.g., raising awareness of the need for an intervention promoting research use among school administrators).
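To make this concrete for report graphics, the sketch below shows one way a project “brand” could be encoded once as a reusable style so that every chart produced for an evaluation shares the same palette and typography. It is a minimal, hypothetical Python example using matplotlib; the colors, font, and the apply_brand helper are illustrative assumptions, not the MiSPI brand specification.

```python
# Minimal sketch: encode a hypothetical evaluation "brand" once and apply it to all charts.
import matplotlib.pyplot as plt
from cycler import cycler  # cycler ships as a matplotlib dependency

BRAND = {
    "primary": "#1B4D3E",    # hypothetical project green
    "accent": "#7BBD00",
    "neutral": "#6E6E6E",
    "font": "DejaVu Sans",   # bundled with matplotlib, so the sketch runs anywhere
}

def apply_brand(brand: dict) -> None:
    """Push the brand palette and font into matplotlib defaults for subsequent figures."""
    plt.rcParams.update({
        "font.family": brand["font"],
        "axes.prop_cycle": cycler(color=[brand["primary"], brand["accent"], brand["neutral"]]),
        "axes.edgecolor": brand["neutral"],
        "axes.labelcolor": brand["primary"],
        "text.color": brand["primary"],
    })

apply_brand(BRAND)
fig, ax = plt.subplots(figsize=(4, 3))
ax.bar(["Wave 1", "Wave 2", "Wave 3"], [52, 61, 58])  # illustrative response rates only
ax.set_title("Survey response rate by wave (%)")
fig.savefig("branded_chart.png", dpi=150)
```

Keeping the palette in one place means recruitment materials, surveys, and findings reports can reuse identical visual elements, which is the consistency the branding strategy depends on.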
For the MiSPI evaluation, the stage was set early and intentionally to make sure that when findings reports were delivered, they would be recognizable and intriguing based on the ubiquitous MiSPI brand. The primary tools used to do this were the MiSPI logo and a website dedicated to the evaluation. The MiSPI logo is simple and minimally designed, but elements like the block “S” representing Michigan State University help convey feelings of legitimacy, education, and trustworthiness by harnessing the brand identity of the well-known institution, which many of the respondents attended or are personally familiar with as Michigan residents (see a in Figures 1–3). Further, the MiSPI website was the platform for building, maintaining, and adjusting the brand as needed. In e-mail communications for recruitment, for example, readers were directed to the site, which outlined the goals and values of the evaluation, offered biographies and contacts for members of the research team, provided ways to get involved in the research, and hosted a short video (all branded with the same colors, icons, logos, phrases, value statements, etc.). The website was updated as the evaluation progressed, again allowing a place for interim reports, updates, and a second video to be showcased and shared. To add to the visibility and familiarity of the MiSPI brand, the same branding schemes (colors, layouts, visuals, and language) were taken from the earlier materials to design a unique online survey that was recognizable as connected to the evaluation. Thus, the logo, videos, correspondences, and Web materials helped make the MiSPI brand familiar and connected to the values of the respondents (school administrators’ value of evidence-based practices), while the surveys made the MiSPI branding not only recognizable but also connected to the work and efforts of the respondents (visit http://www.mispi.org for more and see the reports in Figures 1–3 to compare branding elements).
Figure 1.
MiSPI Annotated Evaluation Report Fall 2016 Front Page (Reprinted with permission from MiSPI, Under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Color version available online). MiSPI = Michigan School Program Information.
Figure 3.
MiSPI Annotated Evaluation Report Winter 2017 (Reprinted with permission from MiSPI, Under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Color version available online). MiSPI = Michigan School Program Information.
Objective: Thinking Beyond Communication
Why to use: Objective
When the time arrives to send out a well-branded and recognizable report, it can be overwhelming to decide what to include, how to synthesize that information, and how to design a report that translates information in an effective way. Many dissemination and implementation frameworks across various fields attempt to illuminate these challenges (see Tabak, Khoong, Chambers, & Brownson, 2012, for a review), and the literature suggests that the synthesis and translation of information can benefit from the incorporation of social marketing and communication into dissemination practices (McAlindon, 2017; Noonan, Wilson, & Mercer, 2012). According to marketing literature, an effective and actionable communication tool must be driven by a clear objective or a desired behavioral or attitudinal change in the viewer (Andreasen, 2002).
Setting an objective is a key component of a marketing strategy. In social marketing, the objective is called the “product,” and it is the desired response to the message or the change in attitudes or behaviors that the information or service needs to elicit (Andreasen, 2002). Understanding this concept in the context of evaluation reporting is helpful for setting actionable objectives that will guide what information gets included in reports, how to communicate it, and to whom (McAlindon, 2017; Winett, Altman, & King, 1990). Likewise, in visual communication design, starting with a clear and actionable objective is important in order to systematically design with intention (Noble & Bestley, 2011). For instance, imagine a report designed based on three pieces of data (e.g., area income rates, student achievement rates, and differences in curriculum content across local school districts). While the report may catch readers’ attention, the overall objective of presenting these data is still absent if they cannot be translated into appropriate responses. Now, think about that same report designed based on three actions the evaluator wants to encourage (e.g., raised awareness of disparities across neighborhood schools, signing of a petition, and participation in a teacher-led intervention). Starting with this information allows the evaluator greater flexibility and guidance to interpret the data and add figures or findings that motivate these actions (e.g., a breakdown of achievement rates by area income, the percentage of residents who have already signed the petition, and preintervention indicators of the need for the new program). Starting with a well-defined behavioral or attitudinal objective also allows the designer to know who will be reading the reports or if certain groups need unique reports to suit their needs (e.g., should teachers get a different report than others, highlighting the need for the intervention that they will ultimately have to lead and how it benefits their students?). The aim is to identify an objective with the desired change in mind and then make the findings work for that goal and work with those who need to enact it.
How to use: Objective
The idea of establishing an objective early on is perhaps why “intentionality” is such a common term in graphic design (Noble & Bestley, 2011). Beyond their visual appeal, graphics should ideally elicit action, thought, or emotion. This focus on intentionality was present throughout the design of the MiSPI findings reports. The first evaluation team meeting about the reports was guided by questions including “what do we want to happen as a result of the reports?” and “who will be viewing the reports and what do they need to do with them?” as opposed to “what did we find in the data?” or “what were the most positive outcomes?” The reports were created based on two objectives: (1) to make administrators more aware of the importance of knowing where they get information and (2) to spur interest in the possibility of an intervention to shift these patterns.
Two reports designed to meet these two objectives were issued (one in Fall 2016 and one in Winter 2017). To achieve Objective 1, both reports contained different sets of data designed to collectively raise awareness of where educators get information related to school programs. For example, the Fall 2016 report presented descriptive data showing that 20% of administrators do not go to anyone else for information and that 28% go to someone in their district (see b in Figure 1). Likewise, the Winter 2017 report used network maps that also described where administrators go for information but instead displayed complete paths of information seeking leading to an actual researcher on the topic (see b in Figure 3). Thus, although each display carried its own key message (e.g., some administrators do not go to anyone for information on school programs, or some are removed from the research), both used different data and visualizations to drive home the same objective: making administrators more aware of where they go for information and where that information is coming from. To meet Objective 2, both reports contained a simple section explaining that the next phase of the evaluation would explore intervention possibilities and offered response options like “contact us” (see c in Figures 2 and 3).
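For readers who build such network maps programmatically, the following is a minimal sketch, with entirely fabricated nodes and edges, of how an information-seeking path that reaches a researcher might be drawn and visually emphasized. It assumes the Python networkx and matplotlib libraries and is not the MiSPI analysis or reporting code.

```python
# Minimal sketch: draw fabricated information-seeking paths and emphasize the one
# that reaches a researcher, muting the rest (toy data, not MiSPI findings).
import matplotlib.pyplot as plt
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("Administrator A", "Curriculum director"),
    ("Curriculum director", "ISD consultant"),
    ("ISD consultant", "Researcher"),
    ("Administrator B", "Peer administrator"),  # a path that stops short of research
])
G.add_node("Administrator C")                   # goes to no one for information

on_path = {"Administrator A", "Curriculum director", "ISD consultant", "Researcher"}
node_colors = ["#1B4D3E" if n in on_path else "#C9C9C9" for n in G.nodes]

pos = nx.spring_layout(G, seed=3)               # fixed seed for a reproducible layout
nx.draw_networkx(G, pos, node_color=node_colors, node_size=900,
                 font_size=7, edge_color="#888888", arrows=True)
plt.axis("off")
plt.tight_layout()
plt.savefig("information_seeking_paths.png", dpi=150)
```

Muting off-path nodes in gray is one way to keep a single objective in focus even when the underlying network is large.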
Figure 2.
MiSPI Annotated Evaluation Report Fall 2016 Back Page (Reprinted with permission from MiSPI, Under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. Color version available online). MiSPI = Michigan School Program Information.
Narrative: Synthesis and Storytelling
Why to use: Narrative
All design, especially graphic design, involves the construction of narratives, that is, the synthesis of information into a cohesive story. Even the simplest designs with no words are meant to tell elaborate stories (Frascara, 2001). Think of the small, embroidered graphics on the breast of a shirt. Although simple symbols, they need to tell the story of a brand and the product you are wearing; is it nautical (e.g., an anchor), sporty (e.g., a golf club), or adventurous (e.g., a mountain peak)? There is power in images and graphics to tell influential and complex stories with few or no words at all. Learning via alphanumeric text is bound to linear, isolated thinking, but graphics and diagrams open up abilities for the mind to see and understand complex relationships, hierarchies, and simultaneous action (Frascara, 2001). As Cross (2004) explains, “a key competency of [a design expert] is the ability to mentally stand back from the specifics of the accumulated examples, and form more abstract conceptualisations” (p. 432). This competency is what allows designers, and can help evaluators, to take large volumes of information and synthesize it piece by piece into processed and translated chunks.
The key process in crafting a narrative is more systematic than it may seem; it is, essentially, analysis (Frascara, 2001). Crafting a story requires one to alternate between zooming out to the whole of the data and zooming in to meaningful findings (Lennertz, 2000). Analysis of this sort is something evaluators are already comfortable with. In the case of data design, the next step, translating that analyzed information, simply needs to be visual rather than written. Because of the advantages of graphic learning described by Frascara (2001), translating information graphically is often much clearer than writing the same conclusions as text. Storyboarding, for example, is particularly helpful for visually crafting narratives as applied in the BOND framework. After an evaluator decides upon an objective, it is helpful to first write it at the top of a sheet of paper and then make boxes across the remaining space on the page. Just as if making a storyboard for a comic strip, the evaluator can use each box to insert a finding, its corresponding data point, and some graphic elements, perhaps emphasizing a number or linking the finding to those in another box on the page. By filling in and iteratively adjusting the story in each box, the evaluator can slowly craft a full, cohesive story based on the objective and then design the report based on the squares’ contents and their connections. Here, as the evaluator synthesizes, they are doing the required cognitive processing of the raw data for the viewer. Further, as the corresponding story is built, the evaluator is performing the cognitive task that occurs when we look at numbers: figuring out what they mean and how they connect. By doing this cognitive work up front and placing the outcome in front of the viewer, processing time can be dedicated to understanding the story in a more actionable and personally applicable way (Evergreen, 2011b; Evergreen & Metzner, 2013).
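One lightweight way to work with a storyboard before opening any layout software is to treat it as a plain data structure, where each box holds a finding, its supporting data point, and a note on emphasis, and reordering the list is the rearranging step described above. The Python sketch below is a hypothetical illustration of that idea; the box contents and the narrate helper are assumptions for the example, not the MiSPI storyboard itself.

```python
# Minimal sketch: a storyboard as a plain data structure. Box contents are hypothetical.
from dataclasses import dataclass

@dataclass
class StoryboardBox:
    finding: str      # the translated, plain-language finding
    data_point: str   # the figure or statistic that supports it
    emphasis: str     # how the box is sized, highlighted, or linked to the next box

objective = "Raise administrators' awareness of where their program information comes from"

storyboard = [
    StoryboardBox("Some administrators make these decisions alone",
                  "20% go to no one else for information",
                  "enlarge the 20%; arrow to the next box"),
    StoryboardBox("Many who do ask stay close to home",
                  "28% go to someone in their own district",
                  "group visually with the box above"),
    StoryboardBox("Few information paths reach researchers",
                  "network maps show long chains to researchers",
                  "full-width box anchoring the bottom of the section"),
]

def narrate(objective: str, boxes: list) -> str:
    """Read the boxes in order as one connected 'sentence' to test the narrative flow."""
    lines = [f"OBJECTIVE: {objective}"]
    for i, box in enumerate(boxes, start=1):
        lines.append(f"  Box {i}: {box.finding} ({box.data_point}) -> {box.emphasis}")
    return "\n".join(lines)

print(narrate(objective, storyboard))  # reorder the list and reprint to try new arrangements
```

Reading the printed narration aloud is one way to apply the “full set of squares as a sentence” test described in the next section.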
How to use: Narrative
For the MiSPI reports, storyboarding was an essential strategy for creating a cohesive story as well as visually sizing and positioning each piece of the story. For example, Figure 4 displays the process of arranging data for a section on the front page of the Fall 2016 report (see d in Figures 1 and 4). To begin, three data points were identified that needed to be included in the report, so they were placed into a simple storyboard (see a in Figure 4). Next, the storyboard format allowed for the data to be easily rearranged, such that a clear, cohesive story could be written connecting each square to the next (see b in Figure 4). For this step, it is helpful to think of the full set of storyboard squares as a sentence and to try to narrate each square with phrasing that connects to the next square. This way, a narrative begins to form that emphasizes critical connections between key points for the reader. Finally, the storyboard squares were rearranged to group the top two similar points together and emphasize that, combined, they made up the bottom figure (see c in Figure 4). This process produced both a guiding narration and an arrangement that highlighted a narrative about the notably high percentage of administrators who either did not make decisions about school programming or did not have anywhere to go for information about school programs (see d in Figures 1 and 4). This storyboarding strategy helped weave the report narrative and drew on higher order analysis processes, allowing the evaluators to understand and display connections between the key points and thus eliminating this otherwise taxing step for the reader.
Figure 4.
Michigan School Program Information storyboarding process example.
Design: Simplicity Communicates Complexity
Why to use: Design
While guiding the designers of an influential historical exhibition at the Metropolitan Museum of Art, acclaimed artistic director Wong Kar-Wai advised simply in a documentary of the event, “when you say too much, you say nothing” (Rossi, 2016). The purpose of design is not solely to beautify with lots of attractive visual elements but to simplify (Evergreen & Metzner, 2013; Frascara, 2001). Simplicity is so critical that “the main difference between effective and ineffective data displays is their inability to communicate the evaluator’s key message in a clear and straightforward way such that it does not overload a viewer’s working memory capacity” (Evergreen & Metzner, 2013, p. 6). From a graphic design perspective, simplicity in the form of isolated variables or reduced data is often used to increase the “scientific quality” of graphics, where quality refers to clear, correctly interpretable information (Frascara, 2001, p. 166). Again, there are a number of comprehensive guides available that provide detailed and directive approaches to designing graphs, charts, presentation slides, or report pages (see Evergreen, 2017, 2018; Wong, 2013, as examples). Here, we aim to synthesize some of the basics that have been discussed within the context of evaluation reporting, while focusing on the value of simplicity for clearly and succinctly presenting information within a larger communication framework. The primary idea behind simplicity in data design is to make sure that everything on the page has a purpose. Are words or numbers on the page actually facts or figures that the readers need to know? Do graphics or lines serve a direct purpose to segment information or guide the eye? Using only purposeful graphic elements and “need to know” information in data reports is critical to assist comprehension and avoid overload, confusion, or misinterpretation (Evergreen, 2011a, 2011b; Evergreen & Metzner, 2013). When data reports are pared down to essential numbers and facts, the evaluator can also start adding elements like enlarged figures, boxes, or bullets to help emphasize some pieces of essential information over others (Evergreen & Metzner, 2013).
Thus, to enact the principle of simplicity, an evaluator must consider the purpose of everything on the page. For example, p values are considered important in scientific reporting because they allow researchers to assess the statistical significance of a finding. But, do these values need to be included in brief evaluation reports for organizations to understand the message? Will the p values serve a purpose to inspire the right response of learning or adopting new practices? If not, evaluators may try removing them from action-oriented reports. Instead, it may be more appropriate to save information (e.g., p values, exact dates, raw numbers) for other outlets (e.g., appendices, detailed findings guides, journal articles) if such information does not directly relate to the objective and integrity of that report (Evergreen & Emery, 2016; Evergreen & Metzner, 2013). Similarly, the evaluator must also consider the purpose of visual elements (e.g., are bullets necessary to group statements together if they are already in a box together?). Shapes or lines that look attractive or fill white space can be hard to part with, but unless they serve a purpose like segmentation or organization, even the simplest graphic elements can add to cognitive confusion when viewed alongside a full page of visual stimuli (Evergreen, 2018; Evergreen & Emery, 2016; Evergreen & Metzner, 2013). Common strategies for simplifying to aid data design include using symbols to replace words, using colors to replace boxes or organizational elements (Evergreen, 2011a, 2011b; Evergreen & Emery, 2016), or reducing presented data even if it requires limiting the scope of the report (Frascara, 2001). Again, evaluators can use design concepts to simplify rather than beautify. Color can be used to group themes scattered across the page or emphasize relationship strength or hierarchies, proximity can be used to connect quantitative figures with qualitative corroboration, and symbols can be used to carry concepts throughout the report without restating or adding text (Evergreen, 2018; Noble & Bestley, 2011).
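As a concrete, hypothetical illustration of these simplification strategies in chart form, the sketch below uses Python’s matplotlib to produce a single-message bar chart with direct labels in place of gridlines and axis ticks, one emphasis color, and no statistical detail beyond what the objective needs. The category labels and values are invented for the example, not MiSPI results.

```python
# Minimal sketch: a pared-down chart following the simplicity strategies above
# (illustrative values only, not MiSPI results).
import matplotlib.pyplot as plt

sources = ["No one", "Someone in district", "ISD or county office", "University researcher"]
percent = [20, 28, 35, 8]
highlight = "No one"  # the single point the narrative emphasizes

colors = ["#1B4D3E" if s == highlight else "#C9C9C9" for s in sources]

fig, ax = plt.subplots(figsize=(5, 3))
bars = ax.barh(sources, percent, color=colors)
ax.invert_yaxis()  # read categories top to bottom in the order listed

# Direct data labels replace the x-axis scale and gridlines.
for bar, value in zip(bars, percent):
    ax.text(bar.get_width() + 1, bar.get_y() + bar.get_height() / 2,
            f"{value}%", va="center", fontsize=9)

ax.set_xticks([])  # drop scales the reader does not need
for spine in ["top", "right", "bottom"]:
    ax.spines[spine].set_visible(False)
ax.set_title("One in five administrators goes to no one for program information", loc="left")

fig.tight_layout()
fig.savefig("simplified_chart.png", dpi=150)
```

Note that the title states the key message rather than describing the axes, which keeps the single emphasized bar and the headline working toward the same objective.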
How to use: Design
For the MiSPI reports, simplicity was the driving design principle during planning. Strategies of data reduction and the use of color, proximity, and symbols were all implemented to make each page more elegantly designed. For example, Figures 1 and 2 (see e) display a column of varying data, all emphasized in green, that carried a theme all the way down the front page and onto the back page (color version available at http://www.mispi.org). Using color here allowed the evaluators to emphasize one storyline at a time, so that when readers view the report, they can either follow the color down through the report and get one story about a specific source of information or read through the report in full with conceptual storylines made distinct by switching colors. According to Evergreen (2011b), “readers expect that a change in color indicates a change in meaning,” so it is easier to grasp these shifts when they are coded by shifting colors as in the MiSPI report (p. 10). Returning to Figures 1 and 2 (see f), a map of the state of Michigan (and the United States in Figure 2) is displayed with dots representing evaluation participants’ locations. While seemingly straightforward, this symbol actually conveys a lot of information, including the regional concentration of participants and the vast scope of the data collection. By using this symbol, a reader can instantly pick up on the concentration and scope of respondents instead of reading about them in a footnote under the sample size. In both cases, the evaluators took important messages (themes in the story and sample details) and thought of ways to simplify them with design strategies (a color scheme and a symbol). Finally, given the need to describe findings related to network tendencies in more detail, the evaluators incorporated qualitative data using proximity and were able to present and bridge two data sources, both enriching the personal significance of the data and serving this need (see g in Figure 2).
Taken together, although we chose to order the components of the framework to align with a memorable mnemonic (BOND), it is important to remember that each individual component is as critical as the last when it comes to crafting the most effective reports. For example, while one might argue that the careful analysis and synthesis involved in narrative formation is the most critical component for effective reporting, without intentional and simple design, the message of the narrative could be lost among cognitively complicated visual information. Further, the BOND components are meant to be iterative and considered in tandem throughout the process. Branding, for example, is described first in this article to denote the importance of forming a coherent and identifiable brand for the evaluation itself well before the creation of reports. In practice though, the concept will likely be relevant throughout the BOND process, as it will certainly be a consideration among the design elements of the report. Overall, we recommend being mindful of the iterative implications of each component throughout the process.
Discussion
Reflections on MiSPI Reporting Processes and the BOND Framework
The adaptive website, promotional posters, instructional videos, and findings reports received positive feedback from both the evaluation audience (school administrators and project funders) and colleagues. The findings reports were sent out via e-mail to administrators, presented at evaluation meetings and professional conferences, and made available on the website throughout the dissemination period. The evaluators received unsolicited feedback on many occasions describing the reports as “very helpful” or “very interesting” and requesting that we “please, continue to share your data.” We attribute the positive response to the thoughtful communication strategy constructed using the BOND framework. Still, we acknowledge room for improvement, both in the design and dissemination of the reports and in the framework. First, in order to more accurately understand the effectiveness of the BOND framework in supporting the reporting of MiSPI evaluation findings, a great deal of insight could be obtained from evaluating uptake and use. To do so, future efforts by the MiSPI team could include the examination of website analytics, digital report downloads, or resulting shifts in actions or attitudes by report readers. Second, the design discipline and evaluators using design principles have acknowledged that good design is inherently participatory (Cornish et al., 2015; van Bon-Martens, van de Goor, & van Oers, 2017). We recognize that gathering feedback throughout the planning and design of the reports would have made them more personalized, usable, and intelligible to the MiSPI evaluation audience. Unfortunately, at the time of report creation, resource and time constraints limited our ability to seek feedback from the targeted audiences.
An important consideration when using the BOND framework is that participatory design allows for contextually sensitive, user-centered tools that will present further opportunities to collaboratively meet goals and disseminate useful and actionable findings. For example, while gathering stakeholder input on narrative or design may seem logical (e.g., asking “how does the report look?” or “do you like the arrangement?”), gathering input on each component is critical. The objective, for instance, involves identifying an audience and pursuing an attitudinal or behavioral change in that audience. Thus, it may be helpful to elicit stakeholder feedback on, say, the appropriateness of or desire for that objective. Also, audiences vary in their interpretation styles and preferences in reporting. While the MiSPI reports were disseminated to school administrators (i.e., participants in the study and their colleagues), reports for more technical audiences, such as researchers or funding entities, may require greater detail, the consideration of different objectives, or even the use of formats other than infographics or data visualizations. In such cases, stakeholder feedback is important for tailoring the report to the needs and desires of the audience. Future research or practice using the BOND framework could explore the benefits of incorporating more participatory strategies.
The Future of Design and Communication in Evaluation Practice
Both the field of evaluation and the larger science of dissemination studies have expressed the need to further integrate design and communication theories into reporting practices (Evergreen & Metzger, 2013; Noonan et al., 2012). One way to answer this call is to embed more visual communication design and marketing theory and practice into the training of evaluators. To achieve this, evaluators must challenge themselves to reach out and create more opportunities for transdisciplinary collaboration. Collaborating with designers or communications experts to define new theories and practices is a critical step to establishing a well-founded approach to comprehensive communication design in evaluations. The field requires new knowledge to improve the quality of data dissemination and reporting. The addition of frameworks like the BOND framework to the evaluation literature is one step toward defining more systematic processes for creating valuable and sustainable evaluation products.
The BOND framework addresses topics that are important and relevant to a broad range of evaluators as well as current evaluation practice and research. Specifically, the BOND framework can contribute to evaluation training or practice guides by offering a roadmap for incorporating visual communication design and marketing theory into reporting. Evaluation practitioners, reporting to a broad range of stakeholders in various formats, can benefit from the incorporation of these principles into intentional, theory-driven report design. Furthermore, formal assessments of the framework, its components, and its effectiveness can contribute to building a more substantial, empirically based knowledge base for communicating information to evaluation audiences. Thus, these principles and concepts are also applicable to evaluation researchers and the advancement of evidence-based practice in reporting. Moving forward, when reporting findings or translating evaluation messages, we challenge evaluators to apply the strategies in the BOND framework, collaboratively with stakeholders and with an assessment component, to produce more intentional, audience-centered products that improve the knowledge of communication design in evaluation practice.
Acknowledgments
Funding
The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This study was funded by an R21 research grant from the National Institute of Mental Health (#1R21MH100238-01A1) and a Use of Research Evidence Award (#183010) from the William T. Grant Foundation.
Footnotes
Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
This research was approved by Michigan State University’s IRB (#x14-706e, #x14-1173e), and all procedures performed in studies involving human participants were in accordance with ethical standards.
References
- Andreasen AR. Marketing social marketing in the social change marketplace. Journal of Public Policy & Marketing. 2002;21:3–13.
- Azzam T, Evergreen S, Germuth AA, Kistler SJ. Data visualization and evaluation. New Directions for Evaluation. 2013;2013:7–32.
- Cawthon N, Moere AV. The effect of aesthetic on the usability of data visualization. In: Proceedings of the 11th International Conference on Information Visualization (IV ’07); 2007. pp. 637–648.
- Cornish K, Goodman-Deane J, Ruggeri K, Clarkson PJ. Visual accessibility in graphic design: A client–designer communication failure. Design Studies. 2015;40:176–195.
- Cross N. Expertise in design: An overview. Design Studies. 2004;25:427–441.
- Evans WD, Hastings G, editors. Public health branding: Applying marketing for social change. Oxford, England: Oxford University Press; 2008.
- Evergreen SD. Death by boredom: The role of visual processing theory in written evaluation communication (Unpublished doctoral dissertation). Kalamazoo: Western Michigan University; 2011a.
- Evergreen SD. Eval + comm. New Directions for Evaluation. 2011b;131:41–46.
- Evergreen SD. Effective data visualization: The right chart for the right data. Los Angeles, CA: Sage; 2017.
- Evergreen SD. Presenting data effectively: Communicating your findings for maximum impact. 2nd ed. Thousand Oaks, CA: Sage; 2018.
- Evergreen SD, Emery AK. Data visualization checklist. 2016. Retrieved from http://www.annkemery.com/checklist
- Evergreen SD, Metzner C. Design principles for data visualization in evaluation. New Directions for Evaluation. 2013;140:5–20.
- Frascara J. Graphic design: Fine art or social science? Design Issues. 1988;5:18–29.
- Frascara J. Diagramming as a way of thinking ecologically. Visible Language. 2001;35:164.
- Frascara J. Communication design: Principles, methods, and practice. New York, NY: Allworth Press; 2004.
- Friedman K. Theory construction in design research: Criteria, approaches, and methods. Design Studies. 2003;24:507–522.
- Galle P. Design as intentional action: A conceptual analysis. Design Studies. 1999;20:57–81.
- Henderson S, Segal EH. Visualizing qualitative data in evaluation research. New Directions for Evaluation. 2013;2013:53–71.
- Lefebvre R. An integrative model for social marketing. Journal of Social Marketing. 2011;1:54–72.
- Lysy C. Developments in quantitative data display and their implications for evaluation. New Directions for Evaluation. 2013;2013:33–51.
- McAlindon K. Selling innovations like soap: The interactive systems framework and social marketing. American Journal of Community Psychology. 2017;60:242–256. doi:10.1002/ajcp.12157
- McConney A, Rudd A, Ayres R. Getting to the bottom line: A method for synthesizing findings within mixed-method program evaluations. American Journal of Evaluation. 2002;23:121–140.
- Neal JW, Mills KJ, McAlindon K, Neal ZP, Lawlor JA. Multiple audiences for encouraging research use: Uncovering a typology of educators. Educational Administration Quarterly. In press. doi:10.1177/0013161X18785867
- Neal JW, Neal ZP. Drowning in data: How to prioritize. Workshop led at the William T. Grant Foundation Mixed Methods Meeting; Los Angeles, CA; 2017, April 21.
- Neal JW, Neal ZP, Kornbluh M, Mills KJ, Lawlor JA. Brokering the research-practice gap: A typology. American Journal of Community Psychology. 2015;56:422–435. doi:10.1007/s10464-015-9745-8
- Neal JW, Neal ZP, Lawlor JA, Mills KJ, McAlindon K. What makes research useful for public school educators? Administration and Policy in Mental Health and Mental Health Services Research. 2018;45:432–446. doi:10.1007/s10488-017-0834-x
- Neal ZP, Neal JW, Mills KJ, Lawlor JA. Making or buying evidence: Using transaction cost economics to understand decision-making in public school districts. Evidence and Policy. In press. doi:10.1332/174426416X14778277473701
- Noble I, Bestley R. Visual research: An introduction to research methods in graphic design. London, England: Bloomsbury; 2016.
- Noonan RK, Wilson KM, Mercer SL. Navigating the road ahead: Public health challenges and the interactive systems framework for dissemination and implementation. American Journal of Community Psychology. 2012;50:572–580. doi:10.1007/s10464-012-9534-6
- Onwuegbuzie AJ, Dickinson WB. Mixed methods analysis and information visualization: Graphical display for effective communication of research results. The Qualitative Report. 2008;13:204–225.
- Pankaj V, Emery AK. Data placemats: A facilitative technique designed to enhance stakeholder understanding of data. New Directions for Evaluation. 2016;2016:81–93.
- Rossi A, director. The First Monday in May [Motion picture]. New York, NY: Magnolia Pictures; 2016.
- Smith VS. Data dashboard as evaluation and research communication tool. New Directions for Evaluation. 2013;2013:21–45.
- Strecker J. Evaluating IDRC results: Communicating research for influence, data visualization in review: Summary. Ottawa, Canada: International Development Research Centre; 2012. Retrieved from http://www.idrc.ca/EN/Documents/Summary-Report-English-Final-7-May-2012.pdf
- Tabak RG, Khoong EC, Chambers DA, Brownson RC. Bridging research and practice: Models for dissemination and implementation research. American Journal of Preventive Medicine. 2012;43:337–350. doi:10.1016/j.amepre.2012.05.024
- van Bon-Martens MJ, van de Goor IA, van Oers HA. Concept mapping as a method to enhance evidence-based public health. Evaluation and Program Planning. 2017;60:213–228. doi:10.1016/j.evalprogplan.2016.08.014
- Winett RA, Altman DG, King AC. Conceptual and strategic foundations for effective media campaigns for preventing the spread of HIV infection. Evaluation and Program Planning. 1990;13:91–104.
- Wong DM. The Wall Street Journal guide to information graphics: The dos and don’ts of presenting data, facts, and figures. New York, NY: W. W. Norton; 2013.




