Forensic Science International: Synergy. 2020 Nov 4;2:317–324. doi: 10.1016/j.fsisyn.2020.10.003

Backlogs are a dynamic system, not a warehousing problem

Max M Houck 1
PMCID: PMC7662868  PMID: 33225253

Abstract

Addressing casework backlogs would seem to represent “low hanging fruit” for increasing offender apprehension and improving justice. Yet, after years of grant funding for backlog reduction and capacity building, backlogged cases, especially DNA cases, continue to increase in U.S. forensic laboratories. Why? This paper suggests a shift from linear, mechanical thinking to a systems thinking approach may help to see ways to leverage laboratories from dysfunctional operational states burdened by history to new ways of seeing themselves as part of a system of systems. The A3 method is offered as a practical approach to initiating a systems approach.

Keywords: Backlogs, Management, Systems thinking, A3 process

1. Introduction

An appropriate amount of concern is expressed over unworked or backlogged cases in forensic laboratories, especially in the U.S., and particularly with sexual assault cases. Sexual assault kits (SAKs) are a defined unit of analysis and an index of potential criminal activity. For example, an effort by the Michigan State Police Forensic Laboratory completed testing of over 1,595 backlogged sexual assault kits. Completing the testing of kits is just the start, however; jurisdictions that only finish testing to meet public or legislative demands miss the significant benefits of uploading profiles into DNA databases, like the Combined DNA Index System (CODIS). The 1,595 SAKs that the Michigan State Police Forensic Laboratory processed yielded 455 CODIS hits and identified 127 serial sexual assaults [1]. Backlogs therefore represent a significant criminal justice issue with both personal and societal impact [2]. Doleac demonstrates that uploading a DNA profile is many times more cost-effective than a conventional policing solution and represents a financial benefit to society of $20,096 for each profile uploaded. In 2010, 761,609 offender profiles were uploaded to CODIS, representing an overwhelming return on investment to society and justice [3].
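
As a back-of-the-envelope illustration of the scale of that return (using only the two figures quoted above; the arithmetic is illustrative, not a new estimate):

```python
# Illustrative arithmetic only, using the figures quoted in the text above.
benefit_per_profile = 20_096      # estimated societal benefit per uploaded profile (USD) [2]
profiles_uploaded_2010 = 761_609  # offender profiles uploaded to CODIS in 2010 [3]

total_benefit = benefit_per_profile * profiles_uploaded_2010
print(f"Estimated societal benefit: ${total_benefit:,.0f}")  # ≈ $15.3 billion
```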

Backlogs, therefore, would seem to represent “low hanging fruit” for increasing offender apprehension and improving justice. Yet, after multiple years of grant funding for backlog reduction and capacity building, amounting to well over US$1 billion, backlogged cases, especially DNA cases, persist in U.S. forensic laboratories. Why? A myriad of external and internal issues dog forensic laboratories, including but not limited to unfunded mandates via legislation, lack of adequate resources, the time needed to train new employees to competency, successful cases that encourage more submissions, and older cases that may not be a submitting agency’s priority. New legislation regarding sexual assault kits resulted in a 150% increase in kit submissions for one laboratory, and another saw nearly three times that increase [4]. In addition to more SAKs being submitted, the forensic technology is improving. If a laboratory switches to Y-screening, more kits will test positive for male DNA; cases can be screened faster, but more will yield a male-positive result and move on to full DNA analysis. With a higher referral rate to DNA, the laboratory has to work more samples and more cases, and screening may become a critical step in triaging samples. With the improvements in mixture interpretation offered by probabilistic genotyping comes more analysis time and concomitant court time (to explain the new results); validation of these new statistical methods also takes time, adding to the cumulative personnel hours and case turnaround time lags. In many jurisdictions, adequate resources to supplement these increases in case workloads have not yet been provided [5]. The result is a persistent increase in backlogs, exemplified by DNA (Fig. 1).

Fig. 1. Trends in DNA testing of forensic cases [6].

Mechanistic, linear thinking would lead to more grant funding, which has not worked. Giving a person a fish is all well and good; hoping they learn how to fish after being given one is overly optimistic, and expecting them to learn to fish better while demanding more fish from them courts disaster. The efforts to date are laudable, and many laboratories have taken outstanding strides to combat their backlog creep [5], but a strategy of “more of the same” will not succeed without a new viewpoint, one that shifts away from machine-age thinking and sees the problems in a more holistic fashion.

1.1. Systems thinking

The current but waning paradigm of machine-age thinking, where problems are taken apart like a mechanism to understand how the immediate components work, fares poorly in the face of ever-increasing and more connected information. Individual parts no longer equate to the larger whole, the system; the environment and externalities are as important and dynamic as the thing under study [7,8]. Systems thinking takes the perspective that systems come about due to the interactions and relationships among their elements. These interactions, and their emergent behaviors and unintended consequences, are as much a part of the system as the individual components. A system is any group of interacting, interrelated, or interdependent parts that form a complex and unified whole with a specific purpose [9]. The purpose defines the system as a discrete entity, binding its elements.

A simple example should help demonstrate this concept. Fig. 2 shows a bathtub as a system. Water enters the tub (input) and begins to fill it (capacity). At some level, the tub will reach its limit, and water will need to be intentionally drained from it (output). The desired water level is determined through feedback, indicating how well (effectiveness) and how quickly (efficiency) the goal is reached. Every system has some leakage or waste (a crack in the tub leaks water and reduces both effectiveness and efficiency). Likewise, a forensic laboratory is a system (Fig. 3). Submissions enter the laboratory (input), are completed at a certain rate (capacity), and completed reports are sent out (output). The laboratory has some waste (cases that remain unworked or need to be re-worked), affecting effectiveness and efficiency. Note that the analogy is not completely consistent: Legislation and submission rates can effectively increase not just the flow of the water into the bathtub but also the size of the spigot; while the laboratory can increase the size of its bathtub, it is really only increasing its stock size and not its output capacity. These variances in analogy, however, become important later.
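
A minimal stock-and-flow sketch of this analogy, written in Python, may make the dynamics concrete. The numbers (submission rate, capacity, waste rate) are arbitrary illustrations, not laboratory data: when inflow exceeds effective capacity, the pending “stock” grows every period even though the laboratory is fully occupied.

```python
def simulate_lab(weeks, submissions_per_week, capacity_per_week, waste_rate=0.05):
    """Toy stock-and-flow model of a laboratory as a system (illustrative only)."""
    pending = 0.0                                    # the "water level": cases awaiting completion
    history = []
    for week in range(weeks):
        pending += submissions_per_week              # inflow (the spigot)
        completed = min(pending, capacity_per_week)  # outflow is limited by capacity (the drain)
        rework = completed * waste_rate              # leakage: re-worked or rejected cases return
        pending -= completed - rework                # net drain on the stock
        history.append(pending)
    return history

# Inflow (110/week) slightly exceeds effective capacity (100/week minus 5% re-work),
# so the pending stock grows steadily even though the laboratory is "busy" every week.
print(simulate_lab(weeks=10, submissions_per_week=110, capacity_per_week=100))
```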

Fig. 2. A bathtub is a simple model of a system.

Fig. 3. A forensic laboratory is a system.

Systems cannot be easily or meaningfully separated into independent parts; therefore, every element of a system loses some properties if removed from the system, and the system, as a whole, has essential properties that none of its elements do [10]. A faucet is not a bathtub, to put it simply. A problem can be thought of as a system with a critical value proposition attached: “Every problem interacts with other problems and is, therefore, part of a set of interrelated problems, a system of problems” [11]. A problem has an external context and an internal structure, and these can be addressed through a process of systemic inquiry, such as Checkland’s soft systems methodology (SSM). Systems may exist within or among other systems. A criminal justice system is in reality a system of systems (Fig. 4), and forensic laboratories are a system within that system of systems.

Fig. 4. A criminal justice system as a system of systems.

1.2. What is backlog, really?

No industry-standard definition of a backlogged case exists. The U.S. National Institute of Justice defines a backlogged case as one that has yet to be worked. Project FORESIGHT, through a consensus of forensic laboratories, defines backlogged cases as those that remain unworked for 30 calendar days or more. The difference is practical: By one definition, a case becomes backlog as soon as it is submitted to the laboratory; by the other, the laboratory has 30 days to work the case before it becomes backlogged. Arguably, the second definition gives the laboratory a working-capacity window of 30 calendar days to complete a case and forestall an increase in backlog. Yet another kind of backlog exists, one that has nothing to do with the laboratory’s ability to do work but rather with its environmental awareness of its stakeholders’ needs. So-called artificial backlogs [12] represent submissions to a laboratory that do not need to be completed (the charges were dropped, the accused pled, among other reasons). The case remains on the laboratory’s active list because the stakeholder failed to inform the laboratory that its services were no longer needed, or because requested samples, like known DNA swabs, were never submitted; thus, the case stays on the “to do” list unnecessarily. This false information skews the laboratory’s perception of demand, potentially leading to an inefficient allocation of resources to solve a problem that does not exist. While a laboratory may increase its output to meet this apparent “demand,” the criminal justice outcome rate may not increase proportionately or even significantly; a laboratory that doubles its output may see only a fractional increase in prosecutions or pleas in the relevant jurisdiction.

How many cases should a particular laboratory, with a specific staffing complement and validated processes, be able to work? If a laboratory’s capacity (in essence, its production function baseline) could be known, then variances in production could be more easily assessed and addressed. Otherwise, the organization is flying blind, relying on trial and error as a quality and productivity method. This approach does not have a history of success [13].

Starting in 2017, Project FORESIGHT began to publish efficient capacity possibilities in its annual report, including tables of the optimal cost per case for various output levels (Fig. 5). These efficient output levels are recalculated each year in the FORESIGHT project, with the addition of tabular detail on efficient productivity levels per full-time equivalent (FTE) employee for given caseloads. The industry-average total cost curve has been estimated by a series of non-linear regressions. When a laboratory performs on or near the curve, this indicates efficiency for the corresponding caseload. When an efficient performance sits near the bottom of the U-shaped curve, the laboratory exhibits cost-effective performance as it approaches perfect economies of scale. Each of the average cost curves is illustrated with a corresponding table of values for the cost per case at various caseloads. Note also that productivity, in the form of cases per FTE versus the corresponding caseload, exhibits a curve that is the inverse of the average cost curve. Research to date suggests that the level of productivity for any caseload is the most critical component in the DuPont breakdown for explaining efficiency in the laboratory: an efficient laboratory is one that exemplifies high productivity for its caseload.
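
A minimal sketch of how such a curve might be estimated is shown below. The caseload and cost figures are synthetic placeholders, and a quadratic in log caseload is only one plausible functional form; the published FORESIGHT estimates use their own non-linear specifications and real laboratory data.

```python
# Sketch: estimating a U-shaped average-cost curve (synthetic data, illustrative form only).
import numpy as np

caseloads = np.array([200, 500, 1000, 2000, 4000, 8000, 16000])   # cases/year (synthetic)
cost_per_case = np.array([900, 650, 480, 400, 390, 430, 520])     # USD/case (synthetic)

x = np.log(caseloads)
coeffs = np.polyfit(x, cost_per_case, deg=2)     # fit cost ≈ a*ln(c)^2 + b*ln(c) + d
fitted = np.poly1d(coeffs)

# The caseload at the bottom of the U approximates the scale with the lowest cost per case.
grid = np.linspace(x.min(), x.max(), 500)
optimal_caseload = np.exp(grid[np.argmin(fitted(grid))])
print(f"Estimated minimum-cost caseload: ~{optimal_caseload:,.0f} cases/year")
```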

Fig. 5. Efficient Frontier for DNA Casework: Efficient Cost/Case for Various Caseloads (table) and Average Total Cost v. Cases Processed (chart). Source: [14].

1.3. Casework flow as process and lag

Thinking of a laboratory as a system, the inputs, cycle (or turnaround) times, and outputs define the core of the process. Inefficiencies, like those addressed through process improvement methods, build up over time, resulting in process lags. Hysteresis is used here to mean the dependence of a system not only on its current environment but also on its past environment. If a given input, like case submissions, alternately increases and decreases, the output tends to form a loop. Loops may also occur because of a dynamic lag between input (submissions) and output (completed reports). In rate-independent hysteresis, when an input variable X cycles from X0 to X1 and back, the output Y(X) may be Y0 initially but a different value Y2 on the return (Fig. 6). The values of Y(X) depend on the values that X passes through (how many and what kind of cases are submitted), but not on the rate of change of X. Because “history matters,” predictable amplifications of small lags become a disproportionate cause of later circumstances; in the long run, this “historical hangover” leads to inefficiency. A system will ratchet towards a state that is, to one degree or another, path-dependent, difficult to reverse, and inefficient.

Because the system is history-dependent, approaching an asymptotic state, it moves towards one or more attractors (such as methods, budgets, staff, managers, policies, among many others) governed largely by externalities (budget approval, available staff, accreditation, standards, etc.) (Fig. 6). Those attractors keep the system from visiting all of its possible states; that is, the system is not ergodic. Unless the system can better manage the allocation of internal resources, it will settle into a less-than-efficient lock-in state (or inefficiency space) from which recovery can be costly: The attractors have made the system non-ergodic, meaning it cannot, on its own, reach some of its potentially or actually more efficient states. Exit and opportunity costs become barriers to moving out of that non-ergodic lock-in state. For example, inefficiencies due to inferior processes persist because of the legacy those processes accrue (“That’s the way we’ve always done it.”). Changing only the process (mechanistic thinking) ignores the inter-relatedness and inter-dependency of other factors (staffing, quality, adoption of new processes, budgets, etc.) which create entry and exit costs. Technically, hysteresis occurs when a path-dependent stochastic process has an asymptotic distribution that emerges as a consequence of that non-ergodic process.
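
A toy simulation can illustrate the loop described above (the capacity and submission values are assumed for illustration, not drawn from any laboratory): submissions are ramped up and then back down, but because completions are capped by capacity and unworked cases carry over as a backlog with “memory,” the output on the way down does not retrace the output on the way up.

```python
def hysteresis_loop(capacity=100):
    """Toy demonstration that output depends on the history of inputs,
    not just their current value (illustrative parameters only)."""
    backlog = 0.0
    loop = []  # (submissions, reports) pairs traced as input rises and then falls
    ramp = list(range(60, 161, 20)) + list(range(160, 59, -20))  # X: 60 -> 160 -> 60
    for submissions in ramp:
        demand = submissions + backlog    # new work plus the accumulated past
        reports = min(demand, capacity)   # output is capped by capacity
        backlog = demand - reports        # unworked cases carry over: the "memory"
        loop.append((submissions, reports))
    return loop

for submissions, reports in hysteresis_loop():
    print(f"submissions={submissions:3d}  reports={reports:5.1f}")
# On the way up, reports track submissions until capacity binds; on the way back down,
# reports stay pinned at capacity while the backlog drains, so Y(X) differs on the return leg.
```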

Fig. 6. Hysteresis in a forensic laboratory system.

A backlog is therefore more than a simple inefficiency, like a warehouse full of boxes; rather, it is the cumulative historical result of inadequately supplied or poorly managed resource allocations (of any type, including technology, training, and unfunded legislative mandates) and process changes. In the forensic laboratory, this history-dependent state results in a backlog, meaning backlogs are hysteretic in nature. New approaches to measuring input, process, output, and feedback are needed to reduce backlogs systemically rather than mechanically. Backlogs are not only cases to be worked but also involve the laboratory’s capacity for doing work (its production function) and case submission rates. Metrics and examples for doing this exist in the forensic literature [15,16,17,18].

1.4. Productivity

If a productivity indicator (PI) can be defined as:

PI = S − R

where S is “submissions” and R is “reports”, the rate of work being done under the current environment can be assessed, in units of cases. Backlog can then be described as a rate-independent hysteresis (because it has memory), and the difference between inputs and outputs indicates process (in)efficiency. Some relation of the two should be an indicator of capacity: what could be achieved if the inefficiencies of the backlog were removed. One measure of this would relate R to S as a ratio (R/S), creating a Productivity Index (PIx). Because the PIx is dimensionless, it can be used at any level throughout a hierarchy, from analyst to laboratory, allowing managers to relate productivity across units that may have extremely different quantitative metrics of work completed (drug analysis compared to trace evidence, for example). As the PIx exceeds 1.0 (output exceeding input), work begins to chip away at backlogged cases; if the PIx drops below 1.0, then internal or external factors are affecting productivity and need to be identified and addressed (see sample data in Table 1). In the example, Units 1 and 3 are struggling to meet parity on PIx, but for different reasons. While Unit 3 is seeing more than double the submissions, the absolute number of cases coming in is lower than for Unit 1; this could model the difference between a drug analysis unit (#3) and a trace unit (#1). A manager might want to assign simpler or fewer cases to Analysts #3 and #8 to allow them to catch up (assuming competency). Likewise, Analyst #6 shows an outlier PI and has either found a highly efficient system for working cases or is cutting corners; absent Analyst #6’s apparent productivity, Unit 2 is at par. Note that this is a “best case” scenario, in that the analysts are only doing casework and not supervision, quality tasks, validations, or professional outreach; casework time allocations can help sort out how much work is being done by which employees and help to balance analysis time with other valuable tasks.

Table 1. Example data to demonstrate the utility of PIx.

Analyst      Submissions   Reports      PI     PIx
Laboratory                                     0.69
Unit 1                                         0.40
  1              121          45         76    0.37
  2              125          55         70    0.44
  3              131          42         89    0.32
  4              119          56         63    0.47
Unit 2                                         1.08
  5              214         201         13    0.94
  6              199         255        −56    1.28
  7              222         227         −5    1.02
Unit 3                                         0.57
  8               18           7         11    0.39
  9               12           8          4    0.67
  10              15          10          5    0.67
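
A minimal sketch of the PI and PIx calculations, reproducing the Unit 2 figures from Table 1 (the code and names below are illustrative, not a published tool):

```python
# PI = S - R (in cases); PIx = R / S (dimensionless), so it rolls up from analyst
# to unit to laboratory regardless of how different units count their work.
unit2 = {  # analyst: (submissions, reports), taken from Table 1
    5: (214, 201),
    6: (199, 255),
    7: (222, 227),
}

for analyst, (s, r) in unit2.items():
    print(f"Analyst {analyst}: PI = {s - r:+d} cases, PIx = {r / s:.2f}")

total_s = sum(s for s, _ in unit2.values())
total_r = sum(r for _, r in unit2.values())
print(f"Unit 2: PIx = {total_r / total_s:.2f}")   # ≈ 1.08, matching Table 1
```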

In a real-world macro example for the forensic industry, Fig. 7 shows 2018 data from FORESIGHT [31]. Plotting the PIx (reports completed divided by cases submitted) against reported backlogs (cases still unworked 30 days from submission) demonstrates those investigative areas in the industry where backlogs are being slowly reduced (a value of 1.0 or greater), those remaining at par with submissions (0.99–0.95), and those where submissions exceed the capacity of the areas to meet demand (0.94 or lower). Information like this may suggest policy decisions about service provision, areas of focus for improvement or innovation, and how the industry is coping with exigent circumstances like the opioid crisis [19], changes in evidence testing (such as shifting to testing all sexual assault kits [3]), or pandemics.
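
A sketch of how such a plot could be produced is shown below; the investigative areas and values are synthetic placeholders rather than FORESIGHT data, and the layout only approximates Fig. 7.

```python
# Sketch of a Fig. 7-style plot: PIx (reports/cases, log scale) vs. backlog as a
# percentage of casework. Data points are synthetic placeholders, not FORESIGHT data.
import matplotlib.pyplot as plt

areas = {  # investigative area: (PIx, backlog % of casework) -- illustrative values only
    "DNA casework": (0.91, 28.0),
    "Seized drugs": (0.99, 12.0),
    "Toxicology": (1.03, 6.0),
    "Latent prints": (0.96, 15.0),
}

fig, ax = plt.subplots()
for name, (pix, backlog_pct) in areas.items():
    ax.scatter(backlog_pct, pix)
    ax.annotate(name, (backlog_pct, pix), textcoords="offset points", xytext=(5, 3))

ax.set_yscale("log")                          # log scale reduces overlap among points near 1.0
ax.axhline(1.0, linestyle="--", linewidth=1)  # parity: reports keep pace with submissions
ax.set_xlabel("Backlog as a percentage of casework")
ax.set_ylabel("Reports/Cases (PIx, log scale)")
plt.show()
```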

Fig. 7. Reports/Cases (log) vs. Backlog as a Percentage of Casework. Plotting the PIx against reported backlogs (cases still unworked 30 days from submission) demonstrates those investigative areas in the industry where backlogs are being slowly reduced (1.0 or greater), those remaining at par with submissions (0.99–0.95), and those where submissions are exceeding the capacity of the areas to meet demand (lower than 0.94). Reports/Cases was plotted on a log scale due to data point overlap.

Effective systems are more than the sum of their parts; they are a product of the quality of the ways their people, resources, and processes interact. Each individual’s activity in a system can be measured, but the true indicator of a system’s success is how those individuals’ work comes together, interacts, and interlaces. What drives these individual and sub-system interactions is an agreed-upon, single, shared purpose for the whole system. All within the system must be committed to it above all other goals or aims. The agendas of individual actors within a system must be considered, of course, but the organization’s overarching shared purpose predominates. The purpose of an organization is the reason it exists, the service it was created to perform. This differs from mission, which is “what business the organization is in (and what it isn’t) both now and projecting into the future” ([20]; page 285). Values are how the individuals within the system act as they work, be it competitively, cooperatively, collaboratively, or in other ways. Values make up the organization’s culture, providing its “compass” [20]. Ultimately, a vision statement is based on the organization’s purpose, mission, and values. While this simplifies and clarifies the historically cloudy or vague depiction of an organization offered by the typical vision statement, the core concept here is purpose: why these people and resources are marshalled together. Purpose is what defines and drives a system.

Structural and political obstacles have historically prevented a systemic approach to forensic science. State and local governments are jurisdictionally fragmented, promoting an event-level perspective. Even at the Federal level, where one might expect a higher-order viewpoint given its larger scope of operations, departments, agencies, and bureaus are restricted operationally to their tasks, and cross-unit collaboration can be difficult at best. These governmental directives and restrictions disaggregate what could otherwise be a strategic platform. Because forensic science has no historical or traditional enterprise or strategic models from which to work, it has rarely, if ever, seen itself as a science-based (rather than police-based) system or enterprise.

1.5. Moving from event-level thinking to strategic thinking

Performance requirements for leaders can be outlined as direct, general, and strategic, each of which differs in complexity, time horizon, and focus, among other dimensions of the leadership environment. Most forensic professionals spend their careers leading at the direct (or tactical) level within an agency. They interact with the same people each day, following policies, utilizing resources, and directing casework toward structured, defined goals. The time frame for this work is short, normally less than one month to one year. Communications are typically limited to the same organization and focus on the same topics for the same audiences (sections within a laboratory or within a police agency, for example). Activities become routines; workers spend more time at the tactical level than any other, becoming habituated to and comfortable with the regular pattern of work activity. These daily routines can become misleading, lulling skilled workers into responding only to immediate concerns. Managing only short-term goals obscures the larger view that is necessary to understand the system within which that work takes place. Downstream or follow-on effects are either ignored or not considered because workers are managing only their own positions. Reaction to events, not planning, becomes the primary activity. For forensic scientists, this level of thinking is continually reinforced by the incessant submission of and subsequent demand for casework. Operational thinking is important to achieving short-term goals, certainly, but can rob employees and organizations of the ability to look beyond the “now” and take a strategic path to improving their work, their organization, and their profession.

This issue is true not only for individual leaders but also at the organization or laboratory level and for the profession. Many forensic service providers exist beyond the traditional laboratory setting, sometimes comprising the bulk of casework (nearly two-thirds of fingerprinting is done outside of laboratories, for example). Roughly 3,120 counties exist in the U.S., and many have their own or shared forensic capabilities; this does not count cities, municipalities, or other governmental agencies with a forensic capability, such as public health agencies or medical examiners’ offices. Jurisdictionally, and thus operationally, local and state governments are fragmented, promoting an event-level perspective. Even at the Federal level, where one might expect a higher-order viewpoint given its larger scope of operations, departments and agencies are restricted operationally to their remits, tasks, and charges. Cross-unit communication, let alone collaboration, can be difficult at best. Governmental directives and boundaries disaggregate what could otherwise be a strategic platform from which to offer leadership. Moreover, the vast majority of forensic service providers are subsumed under law enforcement agencies, with a mismatch of professional cultures (science professionals and police) creating its own problematic disjunctions. The patchwork structure of jurisdictions leads to a kind of “tragedy of the anticommons,” the breakdown of coordination among numerous rights-holders that frustrates the achievement of a desirable outcome with multiple, usually public, benefits [21]. Because no one agency “owns” the strategic domain of forensic science, none can take up the banner of leadership, and communal needs go unmet.

1.6. What strategic leadership is and what it means to forensic science

Strategic relates to a plan or plans intended to achieve a goal and, in its broadest sense, is used to indicate primary, overarching importance. A strategic plan, then, is a way to address the most advantageous, complex, difficult, or potentially damaging challenge and to determine the ends, the best ways, and how to apply the most effective means for addressing the challenge [22]. Strategic leadership decisions have effects lasting decades and touching people’s lives broadly, deeply, or both; this is especially true for leaders in high reliability organizations, like forensic service providers.

Strategic leadership is central to forensic science for three reasons. First, it is almost uniformly provisioned through governmental agencies and, as such, holds a primary stewardship role for public resources and goods. Second, forensic science has an enormous potential effect on the criminal justice system and individuals’ lives and liberties. Finally, the consumers of and stakeholders in forensic science are almost exclusively non-scientists. Therefore, forensic scientists have a greater responsibility to be able to explain their fundamental science, methods, and interpretations. This responsibility of forensic leaders extends to stakeholders who may not normally be considered by the forensic profession (because of event-level thinking or political restrictions), like policy makers and governmental entities, who need to understand forensic methods and consequences to make sound, equitable decisions.

Moving from event-level to systems-level strategies requires practical applications of process improvements, regardless of the reasons for the desired changes. The practical applications must push beyond the event level, however, so that the same approaches work at the pattern and strategic levels as well. Financial shortfalls are event level, but budget planning is pattern level, for example. The strategic level involves both structure and mental models. Structural issues look at the causes of patterns (“What is causing our laboratory to be underfunded each year?”), while mental models address the beliefs, attitudes, expectations, and values that allow structures to continue functioning as they do (“The lab is underfunded because our parent agency does not identify with our mission.”). Thus, any practical application that intends to improve an organization through systems-level thinking needs to move the process beyond events and reaction. The application must use patterns to anticipate issues, structures to design in flexibility and resiliency, and mental models to transform the organization, keeping it ahead of issues and changes.

1.7. Boundary objects and the A3 process

As one practical example of moving from event-level thinking and management to a systems-level one (pattern + structure + mental model), the remainder of this paper will offer the use of boundary objects and their utility in the A3 process. Boundary objects are physical artifacts that provide a common language and promote shared understanding about a problem or a situation among a group of individuals to reach a mutually satisfactory resolution; they can be most useful for crossing boundaries of professional knowledge and perceived intellectual territory [23]. For example, a medical chart in a hospital is a physical artifact presented in a standardized format that serves as a boundary object; physicians and nurses from different functional disciplines read the chart, record information, and discuss among themselves the best possible treatment for the patient. The boundary object acts as a “professional passport,” focusing the discussion on the boundary object as a token and allowing passage of information between or within potentially varied or conflicting professional groups.

An A3 report1 can act as a boundary object and can be adapted to nearly any process. Developed by the Toyota Motor Corporation, the A3 provides engineers, supervisors, and managers with a structured problem-solving approach that facilitates knowledge sharing and collaboration [24,29]. Organizational problem solving is often addressed superficially, a kind of “first-order problem-solving” that answers the immediate objective but does not address the root causes of the problem so as to prevent it from happening again. Not addressing the root cause will inevitably lead to the same problem, or the same type of problem, recurring with no systemic changes [25,28]. The A3 process pushes problem-solvers to address the root causes of problems in almost any organization and improves the chances of solving the problem; it forces the shift from the event level to considering pattern, structure, and mental models in order to anticipate, prevent, or repair problems. The A3 process is only outlined here as an explanatory example; the interested reader is directed to Refs. [24,26,27] for a more detailed treatment.

The steps of the A3 Process, numbered here as they are referenced in the text below, are:

  • 0. Identify a problem or need
  • 1. Better understand the current situation
  • 2. Conduct root cause analysis
  • 3. Devise countermeasures to address the root cause
  • 4. Develop a target state
  • 5. Create an implementation plan
  • 6. Develop a follow-up plan with predicted outcomes
  • 7. Engage others in the change improvement process
  • 8. Implement the plan
  • 9. Evaluate the outcomes

The current way work is happening is not ideal, one or more goals or objectives are not being met, and, thus, there is a problem (or, more neutrally, a need). The recognition of the unmet goal or objective is Step 0, identifying the problem. To better understand how this came to be (or not be), inquiry or investigation is required (Step 1). Toyota suggested that the work processes be observed and documented first hand, not through hearsay or rumor. Visualizing how the work is currently done can be very helpful; formal software or artistic talent is not necessary: stick figures and arrows can make the point just as well and much more cheaply. Quantifying the problem (lag time in days, percent unworked, reports per analyst, cost per case, and so on) is key to later steps, such as target states, predicted outcomes, and variances. Step 2 is conducting a root cause analysis; [25] outlines such a process for forensic organizations. Once this is complete, countermeasures are devised to address the root cause or causes (Step 3). Countermeasures are changes to be made in the work processes that will move their outputs more in line with the desired target state (Step 4) and should:

  • Specify the desired outcome, the tasks needed to attain it, and their sequence,

  • Draw clear, direct connections between requests and provision of goods and services, and

  • Reduce or eliminate re-work, workarounds, and lags or delays.

The countermeasures addressing the root cause(s) of the problem should lead to one or more new ways of accomplishing what is desired, the target state. The countermeasures should be listed with the expected improvement predicted quantitatively so it can be measured. The target state is a description of how the tasks are accomplished with the proposed countermeasures in place. Like the inquiry into the problem, the target state should be diagrammed (the new work flow visualized) and included in the A3 report discussed later. Step 5 is the creation of an implementation plan, which includes a list of what has to happen to put the countermeasures in play and achieve the target state. The implementation plan allocates task responsibilities by individual, with deadlines (and costs, if necessary). As a future check on the success of the implementation plan, a follow-up plan with predicted outcomes is created (Step 6); it is, in essence, what the implementation plan would look like if it were successful. The follow-up plan has several functions: clarifying that everyone involved understands the current condition and what “success” will look like, ensuring that the implementation plan was executed properly, and determining whether the target state was met and to what degree. At this point, the results of Steps 0 through 6 can be documented in an A3 report (Fig. 8). No formal template exists for the A3 report, but the goal is to have one piece of paper that outlines Steps 0 through 6.
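
Because there is no standard A3 template, the one-page report can be sketched as a simple structured record covering Steps 0 through 6; the field names and example below are illustrative assumptions, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class A3Report:
    """One-page summary of Steps 0-6; field names are illustrative, not a standard template."""
    problem: str                    # Step 0: the problem or unmet need
    current_condition: str          # Step 1: how the work is actually done today (quantified)
    root_causes: list[str]          # Step 2: results of the root cause analysis
    countermeasures: list[str]      # Step 3: process changes addressing the root causes
    target_state: str               # Step 4: how the work will flow with countermeasures in place
    implementation_plan: list[str]  # Step 5: who does what, by when (and at what cost)
    follow_up: str                  # Step 6: predicted, measurable outcomes to check against

# Hypothetical example for a DNA backlog problem:
report = A3Report(
    problem="DNA unit PIx has been below 0.95 for six consecutive months",
    current_condition="Median turnaround 62 days; 30% of active cases are 'artificial' backlog",
    root_causes=["No submission-status feedback loop with prosecutors' offices"],
    countermeasures=["Quarterly case-status reconciliation with submitting agencies"],
    target_state="Active list reflects only cases still needed; PIx at or above 1.0",
    implementation_plan=["Unit supervisor drafts reconciliation procedure by Q2"],
    follow_up="Re-measure PIx and artificial-backlog share after two quarters",
)
print(report.problem)
```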

Fig. 8. The A3 provides a simple and strict approach that guides problem solving. The approach typically uses a single sheet of ISO A3-size paper, which is the source of its name; there is no standard template for the report form. Source: Wikimedia Commons.

Shook summarizes the form concisely ([29]; page 30):

A3s are deceptively simple. An A3 is composed of a sequence of boxes (seven in the example) arrayed in a template. Inside the boxes the A3’s “author” attempts, in the following order, to: (1) establish the business context and importance of a specific problem or issue; (2) describe the current conditions of the problem; (3) identify the desired outcome; (4) analyze the situation to establish causality; (5) propose countermeasures; (6) prescribe an action plan for getting it done; and (7) map out the follow-up process.

The A3 report is a one-page, physical manifestation of the work, inquiry, and proposed outcomes for the problem (Steps 0–6); being physical, it can be shared and used as a boundary object to engage any entity that may be a part of the intended solution in a discussion about the proposed solution (Step 7). The use of the A3 as a boundary object is critical to building and maintaining consensus throughout the process, investigating the problem further, and discussing what is proposed. The A3 as a boundary object:

  • Records the thinking, inquiry, decisions, rationale, and planning involved with solving the problem,

  • Encourages communication across professional knowledge boundaries, and

  • Structures the problem-solving to facilitate a solution and add to the organizational memory.

Management must be involved in Step 7, not only because of the chain of command or organizational hierarchy, but because the “transformation is everybody’s job” [13]. Once approved, the plan is implemented (Step 8) and the results evaluated (Step 9). If the actual outcomes differ from those expected, additional inquiry is needed to determine why, to modify the implementation plan, and to try again until the target state is achieved. In this way, the A3 process follows the PDCA (Plan-Do-Check-Act) cycle: Steps 1 through 6 are the Plan, 7 and 8 are the Do, 9 is the Check, and the adjust-and-retry loop after evaluation serves as the Act.

The core of the A3 report is storytelling that synthesizes information into an easily communicated situation and solution [29]. The process of collecting this information and documenting the connections and implications (the story) is key to understanding how the problem affects the organization as a system. The A3 process is tied to the purpose, mission, and values of the organization, acknowledging that the target state is the best outcome for the organization as a system.

2. Conclusions

The adage, “If you give a person a fish, they can eat for a day; if you teach them to fish, they can eat for a lifetime,” sounds nice, but the reality is that if the productivity index does not reach parity (a PIx of 1.0 or greater), the backlog, measured as cases unworked after 30 days, may never be reduced by working “harder” or “smarter”. Overtime, a discussion of which merits its own separate analysis, may not be beneficial in the medium to long term because of work-life balance and employee burnout; overtime is no substitute for an adequately staffed and resourced organization working at high efficiency. The laboratory will have reached a hysteretic state that may not be reversible, due not only to internal processes but also to externalities (such as increased case submissions). The laboratory needs to learn how to fish differently: fishing “the way we’ve always done it” or being given fish (like “backlog reduction” or “capacity building” grants with no standard measures for “reduction” or “capacity”) does not solve the systemic problem. One approach to solving system problems is the A3 process, which creates a coherent narrative about a current unwanted state, a plan for how to get to a desired one, and a tool for communicating the process across multiple knowledge boundaries.

Forensic laboratories face a daunting challenge: flat or reduced budgets, increased submissions, and intense scrutiny from politicians and advocates. The need for strategic leadership in forensic science is critical, and the lack of a historical systems-level view has slowed the development of strong strategic leadership. Accountability is a moral, if not legal, imperative for civil servants, and the public deserves to know how its money is spent. Backlogs represent the potential for many ills, including under-resourced laboratories, lack of political support, inefficient processes, poorly trained scientists, and, worst of all, unidentified criminals. By establishing metrics for capacity and productivity in a systems view, laboratories can gain knowledge of what they do and how they do it [30,31]. Communicating this information fearlessly to stakeholders and the public would represent a sea change for forensic agencies, one that is necessary, if problematic. Demonstrating efficiency and economical stewardship of the public’s money might actually persuade decision-makers to reduce an organization’s budget instead of increasing it, on the rationale that if the money is not being spent, it is not needed. An organization should not be penalized or fear retribution for doing well. Using productivity indicators as outlined in this paper could establish a rationale for what is needed and, most importantly, why. Without this information, forensic laboratories will continue to flounder in a sea of mechanically minded, one-off grants or thirst for sustainable budgets, neither of which will reduce backlogs nor build capacity in an out-of-kilter system of systems.

Declaration of competing interest

The author has no financial considerations or conflicts of interest to claim.

Footnotes

1

The term “A3” derives from the dimensions of the A series paper sizes, as defined by the ISO 216 standard; A3 is the metric equivalent to 11″ × 17″ paper.

References

  • 1. Campbell R., Fehler-Cabral G., Pierce S., Sharma D., Bybee D., Shaw J., Horsford S., Feeney H. The Detroit Sexual Assault Kit (SAK) Action Research Project (ARP), Final Report. National Institute of Justice Report 248680. U.S. Department of Justice; Washington, DC: 2015.
  • 2. Doleac J.L. The effects of DNA databases on crime. Am. Econ. J. Appl. Econ. 2017;9(1):165–201.
  • 3. Speaker P.J. The jurisdictional return on investment from processing the backlog of untested sexual assault kits. Forensic Sci. Int.: Synergy. 2019;1:18–23. doi: 10.1016/j.fsisyn.2019.02.055.
  • 4. Gamette M.J. Promoting Justice for Victims of Crime: Examining the Federal Investment in DNA Analysis. U.S. Senate Judiciary Committee; Washington, DC: 2018. Retrieved from https://www.judiciary.senate.gov/imo/media/doc/07-18-18%20.Gamette%20Testimony.pdf
  • 5. National Institute of Justice (NIJ). Comprehensive Needs Assessment of Forensic Laboratories and Medical Examiner/Coroner Offices Points to Solutions for a System under Stress. US Department of Justice, Office of Justice Programs; 2020.
  • 6. LaPorte G., Waltke H., Heurich C., Chase R. NIJ fiscal year 2017 funding for DNA analysis, capacity enhancement, and other forensic activities. 2018. Retrieved from https://www.ncjrs.gov/pdffles1/nij/251445.pdf
  • 7. Edson R. Systems Thinking. Applied. ASysT Institute; Arlington, VA: 2008.
  • 8. Senge P.M. The Fifth Discipline: The Art and Practice of the Learning Organization. Doubleday/Currency; New York City: 2006.
  • 9. Meadows D.H. Thinking in Systems: A Primer. Chelsea Green Publishing; 2008.
  • 10. Ackoff R. Redesigning the Future. John Wiley and Sons; Hoboken: 1974.
  • 11. Ackoff R.L. Ackoff’s Best: His Classic Writings on Management. Wiley; New York City: 1999.
  • 12. Strom K.J., Hickman M.J., Smiley McDonald H.M., Ropero-Miller J.D., Stout P.M. Crime laboratory personnel as criminal justice decision makers: a study of controlled substance case processing in ten jurisdictions. Forensic Sci. Pol. Manag.: Int. J. 2011;2(2):57–69.
  • 13. Deming W.E. Out of the Crisis. Massachusetts Institute of Technology; Cambridge, MA: 1986.
  • 14. Speaker P. Project FORESIGHT Annual Report, 2017-2018. 2019. https://researchrepository.wvu.edu/faculty_publications/1139/
  • 15. Houck M.M., Riley R. FORESIGHT: a business approach to improving forensic science services. Forensic Sci. Pol. Manag. 2009;1:85–95.
  • 16. Speaker P. Key performance indicators and managerial analysis for forensic laboratories. Forensic Sci. Pol. Manag.: Int. J. 2009;1(1):32–42.
  • 17. Speaker P. The decomposition of return on investment for forensic laboratories. Forensic Sci. Pol. Manag. 2009;1:96–102.
  • 18. Maguire C., Houck M., Williams R., Speaker P. Efficiency and the cost effective delivery of forensic science services: in-sourcing, out-sourcing, and privatization. Forensic Sci. Pol. Manag. 2012;2(3):28–35.
  • 19. Ropero-Miller J.D., Speaker P.J. The hidden costs of the opioid crisis and the implications for financial management in the public sector. Forensic Sci. Int.: Synergy. 2019;1:227–238. doi: 10.1016/j.fsisyn.2019.09.003.
  • 20. Kenny G. Your company’s purpose is not its vision, mission, or values. Harv. Bus. Rev. 2014;3:285–306.
  • 21. Buchanan J.M., Yoon Y.J. Symmetric tragedies: commons and anticommons. J. Law Econ. 2000;43(1):1–14.
  • 22. Guillot W.M. Strategic leadership: defining the challenge. Air Space Power J. 2003;17(4):67.
  • 23. Gal U., Yoo Y., Boland R., Jr. The dynamics of boundary objects, social infrastructures and social identities. ECIS 2005 Proceedings. 2005. Retrieved from https://aisel.aisnet.org/ecis2005/57/
  • 24. Sobek D.K., Smalley A. Understanding A3 Thinking: A Critical Component of Toyota’s PDCA Management System. CRC Press/Productivity Press; Boca Raton, FL: 2008.
  • 25. Houck M.M. Risk, reward, and redemption: root cause analysis in forensic organizations. Forensic Sci. Pol. Manag.: Int. J. 2016;7(3–4):106–112.
  • 26. Shook J. Managing to Learn: Using the A3 Management Process to Solve Problems, Gain Agreement, Mentor and Lead. Lean Enterprise Institute; Cambridge, MA: 2008.
  • 27. Matthews D.D. The A3 Workbook: Unlock Your Problem-Solving Mind. CRC Press/Productivity Press; New York: 2011.
  • 28. Checkland P. Systems Thinking, Systems Practice. Wiley; Chichester, Sussex; New York City: 1993.
  • 29. Shook J. Toyota’s secret: the A3 report. MIT Sloan Manag. Rev. 2009;50(4):30–33.
  • 30. Speaker P. Financial management of forensic science laboratories: lessons from Project FORESIGHT 2011-2012. Forensic Sci. Pol. Manag. 2015;6(1–2):7–29.
  • 31. Speaker P. Project FORESIGHT Annual Report, 2016-2017. 2018. https://researchrepository.wvu.edu/faculty_publications/1140/
