
Strategic Foresight in the Federal Government: A Survey of Methods, Resources and Institutional Arrangements

Joseph M. Greenblott*, Thomas O’Farrell*, Robert Olson+, Beth Burchard*

Abstract

The goal of this study is to help identify approaches and practices for improving Federal foresight efforts and the means for integrating foresight work into strategic planning and decision making. We present the results of semi-structured interviews with people involved with foresight in 19 Federal agencies, and with two non-Federal experts on foresight in government. We discuss where agencies use similar approaches and where their approaches diverge, and we include particularly insightful quotes from interviewees. We also discuss “broader observations” regarding the state of strategic foresight efforts across the Federal government and how our findings relate to opportunities and challenges to institutionalizing foresight in the Federal government.

Keywords: strategic foresight, federal government, strategic planning, decision making

Introduction

Importance of Strategic Foresight

Strategic foresight is a planning tool for developing the critical thinking, planning, and management competencies needed to consider the impact of long-term uncertainties on near-term decision making. One foresight expert, Richard Antcliff,1 argues that what sets foresight work apart from other planning approaches is that it results in a “changed mindset” and organizational “culture shift.” This kind of shift to a long-term and broader decision-making paradigm benefits the full range of organizational planning and management processes, including strategic planning, budgeting, human resources, evaluation, assessment, training, and procurement. Antcliff gives the example of workforce hiring where, as positions open, the normal tendency is to backfill positions with people of similar skills. With a changed mindset, innovative managers look for individuals with the skill sets necessary to address future needs.

Leon Fuerth,2 a prominent national security and government foresight expert, points out that the U.S. Government has a history of long-range thinking and actions that benefit the country, even when they entail short-term costs. Fuerth stresses that it is important to foster and revive the kind of long-range strategic thinking—foresight—that produced much of the nation’s economic, technological, and social progress. Examples include the:

  • Marshall Plan to speed European recovery from World War II.

  • GI Bill to make it possible for veterans returning from the war to get a college education.

  • Creation of land grant universities.

  • Financing of the transcontinental railroad line.

  • Building of the Panama Canal.

  • Construction of the interstate highway system.

  • Purchase of Alaska.

  • Creation of the international financial system after World War II.

The Office of Management and Budget (OMB) encourages Federal agencies to incorporate strategic foresight into the strategic planning and review process as one method for facilitating the achievement of long-term goals (Office of Management and Budget 2017), emphasizing that:

strategic foresight is a method for systematically considering a longer time horizon and broader scope of issues than other forms of planning. Integrating strategic foresight in the planning process also facilitates a systems approach to problem solving and may help an agency better prepare for future threats or take early advantage of emerging opportunities. The systems approach of strategic foresight also encourages organizational communication to avoid the “silo effect,” in which problems are viewed in isolation. Foresight methodologies may vary by agency depending on its mission and operating environment, but examples of strategic foresight methodologies include scanning, trend analysis, and scenario planning. Opportunities for cross-agency foresight coordination are also encouraged to be explored where appropriate. (Section 230–2)

OMB also advises agencies that annual strategic reviews serve to:

inform long-term strategic decision-making by agency leadership and key stakeholders, including OMB and Congress; and inform the development of the Strategic Plan at the beginning of each new Administration. Strategic foresight methodologies, conceptualized as the capacity to think systematically about the future to inform strategy development, represent one such approach to inform long-term decision-making and can be used as a planning tool to prepare for change. Agencies are encouraged to think and, where applicable, apply core elements of strategic foresight as a part of their review process, including framing, environmental scanning, forecasting, identifying probable and plausible future scenarios, and using those scenarios to inform the development of strategic actions. (Section 270–6)

Background for This Study: Foresight at Environmental Protection Agency

In the 1980s, several U.S. Environmental Protection Agency (EPA) offices began to explore the use of horizon scanning, scenarios, and other tools of strategic foresight. These efforts, however, generally were ‘one-shot’ activities and not sustained as part of an ongoing, systematic foresight process. Beginning in 1995, advisory bodies, including the National Academy of Sciences/National Research Council (2011, 2012, 2014), the EPA’s Science Advisory Board (1995) and the EPA’s National Advisory Council for Environmental Policy and Technology (2002, 2009, 2012), consistently recommended that the EPA institutionalize strategic foresight and connect it to the Agency’s strategic planning. To link strategic foresight and planning, beginning in 1998, the EPA sponsored modest horizon scanning, scenario planning, and networking activities. While these efforts identified several emerging issues that were discussed in EPA’s 2006–2011, 2011–2015, and 2014–2018 Strategic Plans (U.S. Environmental Protection Agency 2006, 2010, 2014)—including population growth and water scarcity, nanotechnology, waste from mining rare earth elements, remote sensing, distributed sensor networks, information technology, climate change, sea-level rise and storm surge, biotechnology, genomics, computational toxicology, pharmaceuticals in wastewater, renewable energy and biofuels—these foresight efforts were individual activities that EPA did not sustain and that were not fully integrated into EPA planning and decision-making processes.

To reinvigorate and demonstrate the usefulness of foresight, the EPA initiated a Strategic Foresight Pilot Project in the summer of 2015. The project involved the creation of a multidisciplinary “Lookout Panel” with representatives from EPA’s headquarters offices and three regional offices. The Panel undertook a formal horizon scanning effort, interviewed “thought-leaders” within and outside the EPA, and identified priority issues. The Lookout Panel’s findings were shared broadly among the EPA’s leadership and referenced in internal guidance for developing the EPA’s 2018–2022 Strategic Plan. In addition, the Pilot Project created an ongoing Strategic Foresight Community of Practice (COP) with programs on foresight open to all EPA staff.

The Pilot Project identified several emerging issues and technologies that were discussed in EPA’s 2018–2022 Strategic Plan (U.S. Environmental Protection Agency 2018), including monitoring, sensing, measurement and information technologies, advances in the fields of information technology and social science research, and emerging chemical and biological contaminants. While the Pilot Project again demonstrated that strategic foresight can inform planning and management decisions at the EPA, it did not address how to incorporate foresight systematically into the Agency’s planning and management processes. Thus, we decided to seek the advice of foresight experts and explore how other government organizations address this challenge.

The goal of the study reported here is to help identify approaches and practices for improving Federal foresight efforts and the means for integrating foresight work into strategic planning and decision making.

Method

We reviewed literature on strategic foresight and conducted semi-structured interviews with representatives from 19 Federal agencies, and with two non-Federal experts on foresight in government. Interviews involved up to four representatives from a range of Federal defense, intelligence, and civilian agencies (Table 1). We selected organizations and/or individuals to interview using an initial convenience sample of individuals who participate in the Federal Foresight Community of Interest (FFCOI).3 We supplemented this with a snowball sample of individuals and organizations recommended by interviewees. While we believe our sample represents a broad range of foresight activities and knowledge across the Federal government, it is by no means comprehensive.

Table 1.

Federal Agencies/Participants

Government participants: Agency Number of interviewees
1. U.S. Air Force (USAF) 1
2. Bureau of Prisons (BOP) (U.S. Department of Justice) 2
3. Bureau of Safety and Environmental Enforcement (BSEE) (Department of the Interior) 2
4. Central Intelligence Agency (CIA) 1
5. U.S. Coast Guard (USCG) (Department of Homeland Security) 1
6. Department of Veterans Affairs (VA) 2
7. Environmental Protection Agency (EPA) 3
8. Federal Bureau of Investigation (FBI) - Futures Working Group 1
9. Federal Emergency Management Agency (FEMA) 1
10. U.S. Forest Service (USFS)/Northern Research Station (Department of Agriculture) 2
11. Government Accountability Office (GAO) 3
12. U.S. Marine Corps (USMC) - Futures Assessment Division 1
13. National Aeronautics and Space Administration (NASA) 1
14. National Geospatial-Intelligence Agency (NGA) 2
15. National Guard Bureau (NGB) - Strategic Foresight Group 4
16. National Intelligence Council (NIC) 1
17. Office of Management and Budget (OMB) (Executive Office of the President) 2
18. Office of Net Assessment (ONA) (Department of Defense) 1
19. Office of Personnel Management (OPM) 1
Total 32

Nongovernment participants: Organization Number of interviewees

20. The Project on Forward Engagement 1
21. IBM Center for the Business of Government 1
Total 2

Each interview lasted approximately one hour and was conducted in person or by telephone. We structured the interviews around the following set of questions, which we sent to each interviewee in advance, while also allowing other topics to be explored.

  • What is the history of foresight in your organization?

  • How do you currently organize your strategic foresight efforts?

  • What resources—full-time equivalents (FTEs) and dollars—do you estimate your organization devotes to strategic foresight?

  • What foresight methods do you use? For each method, who is involved and how do you decide on their involvement (e.g., management, management and staff, open to all, invitations targeted to certain staff, people from outside the agency)?

  • What is the timeframe of your foresight efforts, vis-à-vis matching up with annual and strategic planning and enterprise risk management?

  • How do you integrate the findings from strategic foresight into annual planning, strategic planning, enterprise risk management, etc.?

  • What aspects of your strategic foresight efforts have worked well and why? What would you improve upon?

  • What are exemplary accomplishments you can attribute to your strategic foresight efforts?

We took detailed notes during each interview and developed one- to three-page summaries following each interview. We used the interviews and published sources to identify foresight practices across the Federal government and to derive key insights from interviewees on what aspects of their foresight efforts worked well, what their most difficult challenges were, and where they saw opportunities for improvement. We sent an annotated draft of the interview summaries to each interviewee to validate the information and address follow-up questions. All organizations that participated in the interviews replied, and we incorporated their revisions and suggestions into the summaries (see Supplemental Information).

Results: U.S. Government Foresight Practices

Institutional Arrangements and Resources

Staffing and extramural funding.

Table 2 shows the range and median staffing levels and extramural funding dedicated to strategic foresight across civilian and defense-intelligence agencies for which interviewees provided estimates. Interviewees from several agencies could not provide estimates because their funding was intermingled with, and inseparable from, other planning and management activities or because the information was sensitive or classified.

Table 2.

Agency Strategic Foresight Resources.

Minimum Maximum Median Number of agencies reporting
FTEs
 Civilian 0.5 4 1 9
 Defense-intelligence 1 15 7.5 6
 Overall n/a n/a 2 15
Extramural funding
 Civilian $0 $750,000 $50,000 9
 Defense-intelligence $500,000 $20,000,000 $1,590,000 4
 Overall n/a n/a $225,000 13

Note. FTEs = full-time equivalents.

Most agencies have 0.5 to 2.0 FTEs responsible for leading foresight activity, but the full range runs from a couple of staff devoting a minor percentage of their time to fifteen analysts devoting most of their time to foresight. Dedicated foresight staffing for defense and intelligence agencies is consistently greater than for the civilian agencies we interviewed.
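For readers who want to reproduce the kind of range and median summaries shown in Table 2, the following minimal sketch (in Python) computes those statistics from per-agency estimates. The per-agency figures used here are hypothetical placeholders for illustration only; the study does not publish agency-level numbers.

```python
from statistics import median

# Hypothetical per-agency FTE estimates (illustrative only, not the study's data).
reported_ftes = {
    "Civilian": [0.5, 1.0, 1.0, 2.0, 4.0],
    "Defense-intelligence": [1.0, 5.0, 7.5, 10.0, 15.0],
}

def summarize(estimates):
    """Return the minimum, maximum, median, and number of agencies reporting."""
    return min(estimates), max(estimates), median(estimates), len(estimates)

for group, estimates in reported_ftes.items():
    lo, hi, mid, n = summarize(estimates)
    print(f"{group}: min={lo}, max={hi}, median={mid}, n={n}")
```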

Most Federal foresight programs also use contractor support. Contractors range from large business consulting firms to small, specialized academic foresight programs. Except for the Department of Veterans Affairs (VA), the extramural foresight budgets for military and intelligence agencies are greater than those for civilian agencies. Several programs have extramural budgets (i.e., beyond the cost of Federal salaries) in the range of $600,000 to over $2,000,000 per year. For example, the annual budget for contractor support for the U.S. Coast Guard (USCG) Evergreen Project has varied between $500,000 and $750,000 per year. Our interviewee from the U.S. Marine Corps (USMC) reported that a recent annual budget for strategic foresight allotted $2.3 million for contractor support. The Office of Net Assessment (ONA) is an outlier with an annual budget of $15 to $20 million.

Contractors typically provide expertise on foresight methods, assist in research, facilitate workshops, and aid in presenting findings in written reports and other media. The ONA uses more than one consultant if funding is available. The ONA also retains some contractors and rotates others to get different perspectives. Several interviewees, however, cautioned against overreliance on contractors. As one interviewee reflected, “Contractor support has been important, but it was good to become more organic, to develop a better in-house capacity to use foresight processes.”

Some foresight programs also use a larger core group to supplement dedicated foresight staff, including staff who volunteer a portion of their time (e.g., collateral duty) to participate in the design, operation, and evaluation of the foresight program. Compared to civilian agencies, defense organizations generally have larger core groups (9 to 15, see Table 3) to provide for continuity, as uniformed personnel often rotate through the programs every two to three years. One interviewee argued that “a small group of five to nine people is best for getting things done,” and that is the size many foresight programs have found useful.

Table 3.

Participation in Agency Foresight Projects.

Minimum* Maximum* Median* Number of agencies reporting
Core group
 Civilian 5 20 8 4
 Defense-intelligence 12 25 15 3
 Overall n/a n/a 12 7
Active participants
 Civilian 50 200 70 7
 Defense-intelligence 40 200 100 3
 Overall n/a n/a 85 10
* Number of participants.

Core group members are selected for different reasons, such as representing various parts of the organization or bringing in different backgrounds and skills. Regardless of the selection criteria used, involving people who believe foresight is important, want to participate, and have an aptitude for the kind of thinking involved is essential. One interviewee argued that “identifying core team members with the needed knowledge, bent of mind, and interest must be a top staff priority.” Another interviewee stressed the importance of “avoiding people whose negative attitude can get in the way.” Here we must distinguish between skeptics and what might be called “rejectionists” or “negative influencers.” Skeptics who question methods, challenge assumptions, or contest the plausibility of scenarios are wanted and needed. A rejectionist or negative influencer is someone who won’t contribute to the process in a constructive way, is constantly acting in ways that tend to impede the process, and sometimes may not even believe in the idea of foresight. A skeptic pushes others to explain and justify their position; a negative influencer typically discounts views that depart from the status quo.4

Foresight activities usually involve active participants from across an individual agency and frequently involve staff and managers from other agencies and non-Federal experts. The number of participants involved depends on the type of activity and can include as many as two hundred individuals.

Some interviewees highlighted the importance of thinking in terms of a “Foresight Ecosystem” within an organization, not just a specific foresight staff. This idea reflects a recognition that there is some look-ahead thinking going on in different parts of virtually every agency, but these efforts have not been labeled as foresight or organized together. Locating these efforts and creating linkages among them and across organizational silos helps the organization benefit from its own knowledge and resources. Drawing people together under the umbrella of a “Foresight Ecosystem” also helps build support for foresight and improves its usefulness in planning and decision making.

Location of foresight in the organization.

There is no standard organizational location for a foresight function in Federal agencies. Locations range from near the top of the organization and reporting directly to senior leadership, to research branches remote from decision makers. Most organizations locate foresight at some in-between level. John Kamensky, with the IBM Center for the Business of Government, observed that the most successful national foresight programs (e.g., Policy Horizons Canada and the Center for Strategic Futures in Singapore) are located “off to the side at the top.” He believes that this location can work for many Federal agencies.

Foresight Methods

Table 4 shows the various foresight methods used by individual Federal agencies. Horizon (environmental) scanning and scenario planning are the most frequently used methods.

Table 4.

Foresight Methods Used by Federal Agencies.

USAF BOP BSEE CIA USCG VA EPA FBI FEMA USFS GAO USMC NASA NGA NGB NIC ONA OPM Total
Horizon Scanning/Trend Analysis X X X X X X X X X X X X X X X 15
Scenarios X X X X X X X X X X X X X X X 15
COP/Speaker Series X X X X 4
Delphi X X X 3
Formal Foresight Training X X 2
Simulation/Models X 1
Backcasting X 1
Futures Wheel X 1
Assumption Testing X 1

Note. USAF =U.S. Air Force; BOP = Bureau of Prisons; BSEE = Bureau of Safety and Environmental Enforcement; CIA = Central Intelligence Agency; USCG = U.S. Coast Guard; VA= Veterans Affairs; EPA = U.S. Environmental Protection Agency; FBI = Federal Bureau of Investigation; FEMA = Federal Emergency Management Agency; USFS = U.S. Forest Service; GAO = Government Accountability Office; USMC = U.S. Marine Corps; NASA = National Aeronautics and Space Administration; NGA = National Geospatial-Intelligence Agency; NGB = National Guard Bureau; NIC = National Intelligence Council; ONA = Office of Net Assessment; OPM = Office of Personnel Management; COP = Community of Practice.

Horizon scanning/trend analysis.

Horizon Scanning (also called Environmental Scanning) is a systematic process for gathering and analyzing information on trends and emerging or potential developments that may be important for an organization, including new threats, additional responsibilities, or untapped opportunities. Scanning often focuses on developments that are at the margins of current thinking and planning but may prove important to an organization. It also can identify new developments and insights related to persistent challenges faced by an organization.

All but three of the organizations interviewed reported they conduct horizon scanning. For a few organizations, scanning has been the only method used (e.g., the Central Intelligence Agency [CIA] Emerging Trends Program). For many of the organizations surveyed, horizon scanning has a dual function: it is important in itself with findings typically summarized in reports for the organization’s leadership, and it is also a first step in various forms of scenario planning.

Many Federal organizations conduct scanning on a continuing basis. One interviewee argued, “You can’t scan once and then live off it for several years – change is too fast. Some amount of scanning should be done yearly even if bigger efforts are only done periodically.” Some foresight programs do their major scans on a regular multi-year schedule. The VA and the USCG do an intensive round of scanning every four years, as the first step in a four-year planning cycle. They continue smaller scanning efforts throughout the cycle to stay abreast of important changes and new developments.

Scanning benefits from a wide range of perspectives gained by involving people from many parts of an organization and relevant outside participants. Some organizations found it effective to engage a relatively small number of people (fewer than 25) to form a scanning team, if the people involved are diverse and highly qualified. Other organizations have had good experiences with larger groups. In their most recent scan, the VA involved 80 to 100 people from across the organization working in several small teams. The U.S. Air Force (USAF) Strategic Studies Group involved participants from each of the USAF’s 150 organizations. Participants used an online tool to recommend emerging issues.

One interviewee stressed the importance of involving people from outside the organization in scanning efforts. The Federal Emergency Management Agency (FEMA), for example, reached out to its whole community—partners at the state and local level, non-profits, community groups, the private sector, and think tanks—to create a scanning group of 70 people.

All the organizations surveyed focused their scanning efforts on both their specific field of activity and broader trends. The USMC Futures Assessment Division, for example, focused on reports by the National Intelligence Council (NIC), the United Kingdom’s Ministry of Defence, and national security-related think tanks, but they also looked at a wide range of reports on trends and changes in the larger society. The Bureau of Safety and Environmental Enforcement (BSEE) used a Social, Technological, Economic, Environmental, Political (STEEP) framework to systematically evaluate external trends and emerging developments that could affect their organization.
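To make the STEEP framing concrete, the short sketch below tags hypothetical horizon-scan items with one or more STEEP dimensions and groups them for review. The item titles, sources, and data structure are illustrative assumptions, not a description of BSEE’s actual process or tooling.

```python
from dataclasses import dataclass
from enum import Enum

class Steep(Enum):
    SOCIAL = "Social"
    TECHNOLOGICAL = "Technological"
    ECONOMIC = "Economic"
    ENVIRONMENTAL = "Environmental"
    POLITICAL = "Political"

@dataclass
class ScanItem:
    title: str
    source: str
    categories: tuple  # one or more Steep dimensions

# Hypothetical scan items used only to illustrate the tagging scheme.
items = [
    ScanItem("Distributed sensor networks mature", "trade press", (Steep.TECHNOLOGICAL,)),
    ScanItem("Coastal population growth accelerates", "think-tank report",
             (Steep.SOCIAL, Steep.ENVIRONMENTAL)),
]

# Group items by STEEP dimension so a core team can see which dimensions are
# well covered and which need more scanning attention.
by_dimension = {dim: [i.title for i in items if dim in i.categories] for dim in Steep}
print(by_dimension)
```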

Agencies used a wide variety of scanning sources, including media sources, specialist magazines, journals and newsletters, think tank reports, internet searches, participation in meetings and conferences, and interviews with experts. Several agencies found outside input extremely valuable because outside contributors often see interactions among different areas and developments beyond the areas of in-house expertise.

Conversations with leading-edge outside thinkers can be especially useful. In a recent strategic foresight project, the EPA received many scanning results from interviewing people in universities, non-governmental organizations, and think tanks. The National Geospatial-Intelligence Agency (NGA) elicited the perspectives of futurists, science-fiction writers, and visionary technologists for its 2017 scan. Analysts in the CIA Emerging Trends Program often travel to where developments of interest are happening to study them first-hand.

In every agency, the scanning process produces a large number of issues that a smaller group, usually the core group, narrows down by combining overlapping ideas and making decisions about importance and relevance to the organization. In a recent scan, the VA narrowed a list of scanning “hits” from hundreds to twenty. A scanning effort conducted by the EPA resulted in a list of 80 individual topics that were narrowed to eight issue areas, which were further developed and shared in a report to agency leadership.

Some foresight programs only report their scanning results and intentionally avoid policy recommendations, while others recommend options for dealing with challenging developments identified in their scans. One interviewee cautioned not to jump to solutions and recommendations too early, believing there needs to be a sufficient period of divergence and exploration before moving toward convergence on solutions.

Scenarios.

Scenarios are alternative descriptions of how the future might unfold. They synthesize information about divergent trends and possibilities into explicit, coherent, internally consistent descriptions of plausible alternative futures. Scenarios are not predictions and do not forecast the likelihood of any particular alternative future. Most interviewees agreed that the most important result of scenario planning is how it changes participants’ mindsets about their ability to control and respond to future conditions and events. Scenario planning helps organizations deal with uncertainty, adapt to rapid change, and clarify their priorities.

Agencies use different approaches for developing scenarios. In their latest round of foresight activity, the USCG initially identified 16 scenarios based on the potential interaction of four major dimensions of change. Their leadership then selected a few scenarios for further development. The USMC used a different approach, creating a single base scenario and then two variants that accelerated the trends considered most mutable in the base scenario. One scenario accelerated water scarcity and international migration; the other accelerated biohacking and economic crisis. Regardless of the number of scenarios generated using different approaches, three to five scenarios ultimately formed the basis for planning in every agency, which interviewees considered the upper limit that groups can fully consider in subsequent scenario planning stages.
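The combinatorics behind the USCG example are worth spelling out: four dimensions of change, each considered at two contrasting endpoints, yield 2^4 = 16 candidate scenario combinations. The sketch below enumerates such combinations; the dimension names and endpoints are invented for illustration and are not those the USCG used.

```python
from itertools import product

# Four illustrative dimensions of change, each with two contrasting endpoints.
dimensions = {
    "technology adoption": ("incremental", "disruptive"),
    "economic conditions": ("sustained growth", "prolonged stagnation"),
    "climate stress": ("moderate", "severe"),
    "governance": ("cooperative", "fragmented"),
}

# Every combination of endpoints is a candidate scenario: 2**4 = 16 in total.
candidate_scenarios = list(product(*dimensions.values()))
assert len(candidate_scenarios) == 16

# In practice, leadership then selects a handful (typically three to five) of the
# most distinct and decision-relevant combinations for full development.
```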

Scenarios are used differently among Federal agencies. The NGA develops short scenarios for use in internal work groups. The ONA develops short scenarios to set the stage for war games. The U.S. Forest Service (USFS) Foresight Research Work Unit developed project-based scenarios for work in specific areas like future forest fires or climate adaptation planning for tribal communities. In past work, the Bureau of Prisons (BOP) brought together groups of 40–50 subject matter experts, staff, prison wardens, and other high-level officials for scenario training and development sessions and then used the scenarios to brainstorm projects to be carried out by smaller teams.

The most common use of scenarios is as a tool to improve strategic planning. Foresight programs at the VA, the USCG, and the USMC use scenarios as the centerpiece of a multi-step, three- or four-year planning cycle:

  1. The first step is a major horizon scan, with less intensive scanning efforts continuing throughout the planning cycle.

  2. The second step uses the results of the horizon scan as input for the development of scenarios that explore a plausible range of future conditions their organizations may encounter. Scenarios typically are developed by an organization’s smaller core foresight group with contractor support.

  3. Once their scenarios are constructed, larger groups of people with different expertise and authority are recruited for a third step: workshops on each of the scenarios. In these workshops, participants explore and become familiar with each scenario, mentally living in them and deriving implications for their organization.

  4. The fourth step is to identify imperatives, that is, the capabilities necessary for the organization to succeed across the range of potential future circumstances described by the scenarios.

Agencies proceed in various ways once they identify imperatives. At the VA, the imperatives are expressed in the form of “therefore, the VA will need to…” statements. These imperatives are shared with the VA’s planning office, which conducts a gap analysis to compare the current state of the VA with future needs. The analysis aims to formulate goals based on the gaps deemed most important and to identify available options to fill those gaps. These goals then become embedded in the VA’s strategic plan. Interviewees from the VA said they consider their best achievement to be making this scenario-based foresight process the foundation of their strategic planning.

At the USCG, a four-year cycle of foresight activity is timed to correspond with the four-year term of Commandants. A description of the strategic foresight process and recommendations are communicated in a report for the incoming Commandant. The foresight report is a major input into the “Commandant’s Intent” document, which is the USCG’s de facto strategic plan issued by each Commandant within a few months of taking office. The USMC foresight program presents its leadership with information on what the future may be like, without making recommendations.

All scenario-based planning efforts face a challenge of achieving a proper balance between exploring potential external developments and staying relevant to the organization. The VA’s initial effort at scenario planning was reportedly too internally focused, thereby missing several external trends and developments that affect their operations. Their second scenario-based planning effort was too externally focused, making it difficult to relate to specific issues of importance to the VA and veterans. The VA foresight team now feels they have found the proper balance, and they urge others to pay attention to the issue of balance from the start.

The USMC found that including vignettes — stories of daily life — made their scenarios more vivid for decision-makers. In 2015, the USMC partnered with experienced science fiction writers to produce narratives depicting their baseline and alternative future scenarios. These professional writers mentored volunteers within the USMC as they wrote the scenarios. This was identified as one of the most successful aspects of their 2015 strategic foresight process, largely because the stories were engaging; they immersed people in the future environments. The scenarios were printed in the Marine Corps Gazette and made available online, allowing them to be read and discussed widely (Futures Directorate 2015).

Several interviewees emphasized the importance of having scenarios and other work products reviewed by many people both inside and outside the organization, including academics and people in the private sector. The USCG’s scenario descriptions have an up-front “Contributor’s Page” that lists all the people who helped write and review the scenarios, which adds to their credibility.

Other foresight methods.

In addition to horizon scanning and scenario planning, there are a variety of other foresight methods used by Federal agencies, including the following:

  • Futures Wheel (or Implications Wheel) exercises help people explore the second- and third-order (or higher) consequences of a change. The USFS Foresight Research Work Unit conducted several futures wheel exercises and recently assisted the EPA and the Department of Energy in conducting a futures wheel exercise on emerging energy technologies.

  • Backcasting is a method used by several agencies that involves defining a desirable future state and then working backwards to identify policies and programs to connect that specified future state to the present.

  • The Federal Bureau of Investigation (FBI) Futures Working Group experimented with Delphi Forecasting, an anonymous survey method using iterative feedback to pool expert opinion on the future (a minimal sketch of one Delphi round appears after this list).

  • The NIC recently used a variety of Key Assumptions Testing methods to review underlying assumptions in its previous reports. The NGA and the ONA use one of these methods called Red Cells, where groups are set up specifically to examine and challenge an organization’s assumptions to improve its effectiveness. The ONA, the USMC and other parts of the military also convert scenarios into war games that test the effectiveness of tactical responses to different kinds of threats and future conditions.
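To illustrate the Delphi mechanics mentioned above, the following minimal sketch summarizes one round of a numeric Delphi exercise, in which anonymous expert estimates are pooled and the group’s median and interquartile range are fed back to the panel. The estimates and the stopping rule are assumptions for illustration, not details of the FBI working group’s exercise.

```python
from statistics import quantiles

def delphi_round(estimates):
    """Summarize one round of anonymous expert estimates for feedback to the panel."""
    q1, q2, q3 = quantiles(estimates, n=4)  # quartiles of the panel's responses
    return {"median": q2, "interquartile_range": (q1, q3)}

# Hypothetical expert estimates (e.g., years until a technology is widely adopted).
round_1 = [5, 8, 10, 12, 20, 30]
feedback = delphi_round(round_1)

# Panelists see the anonymized feedback, may revise their estimates, and the
# process repeats until responses stabilize or a preset number of rounds is reached.
round_2 = [8, 9, 10, 12, 15, 18]
print(feedback, delphi_round(round_2))
```

Real Delphi studies also circulate anonymized rationales between rounds, which is omitted from this sketch for brevity.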

COPs.

Several agencies created a COP to involve people throughout the organization, share information about strategic foresight, and encourage participation in foresight activities. This can involve something as simple as a speaker program focused on foresight methods, specific emerging issues relevant to the organization, and foresight programs at other agencies. COP members also can participate in horizon scanning efforts, scenario workshops, and other foresight activities.

A COP also is a way to involve newer employees in strategic foresight. One interviewee stressed the importance of exposing new staff to foresight activities, stating “There are sharp limits of how much change can be made in the near future, but changing the thinking and culture of younger people can influence the organization over time.”

Forward-Looking Time-Frame

While there is a wide range of forward-looking timeframes (Figure 1), most Federal foresight programs look roughly twenty to forty years into the future, long enough to identify potential developments and stimulate reflection on the consequences of continuing trends, the social impacts of technological changes, and the potential for major innovations and discontinuities.

Figure 1. Forward-looking time frames of federal agencies’ foresight activities.

Note. BSEE = Bureau of Safety and Environmental Enforcement; NASA = National Aeronautics and Space Administration; OPM = Office of Personnel Management; NGA = National Geospatial-Intelligence Agency; BOP = Bureau of Prisons; VA = Veterans Affairs; EPA = U.S. Environmental Protection Agency; FBI = Federal Bureau of Investigation; FEMA = Federal Emergency Management Agency; GAO = Government Accountability Office; NGB = National Guard Bureau; ONA = Office of Net Assessment; NIC = National Intelligence Council; USAF = U.S. Air Force; USCG = U.S. Coast Guard; USMC = U.S. Marine Corps; USFS = U.S. Forest Service.

Different topics and purposes, however, may merit different time frames.

  • The CIA’s Emerging Trends Program looks for developments already underway that potentially will affect the intelligence community within three to five years. Agencies looking at cyber-terrorism find it difficult to look more than five years ahead because of the rapid pace of change in technology.

  • The BSEE foresight program looks ten years out, which corresponds with the lifecycle of offshore energy development technology.

  • The USCG normally looks ahead twenty years, except for a project on the Arctic that looked ahead more than thirty years based on climate projections. Several defense organizations look twenty-five to thirty years ahead, because it takes that long to procure new weapons systems and realize other long-term capital investments.

  • The USMC Futures Assessment Division thinks in terms of efforts and investments across three time-frames and recommends that other organizations think in similar terms: 1) immediate challenges and needs, 2) mid-term needs and goals, and 3) long-term needs and goals.

  • The USFS sometimes works with very long time-frames: fifty to one hundred years for timber projections, 150 years for growth and yield models. The USFS also used the Intergovernmental Panel on Climate Change scenarios, which reach through the end of this century.

As several interviewees stressed, regardless of the forward-looking time-frame, the challenge for Federal agencies is to bring back insights gained from taking a long view to the present to inform decision making over the next five years.

Training

Contractors can help provide training in foresight methods, but it is also possible to learn from a variety of other sources. The USFS Foresight Research Work Unit developed an in-house training program in foresight capability by working with the Institute for Alternative Futures, the futures studies program at the University of Hawaii, and other futures groups. All members of the USMC Futures Assessment Division take the University of Houston futures studies certificate course to provide a common frame of reference and vocabulary. Many organizations enroll their staff in short courses and support participation in foresight conferences.

Some organizations, like the EPA and the BOP, have found it useful to provide internal training sessions to introduce foresight methods and information sources, and give participants in foresight activities a sense of how their work fits into larger planning and management processes. The EPA has given training as live and recorded webinars, available on an internal EPA web site.

Leadership Involvement

There is agreement across all interviewees that it is crucial to involve and effectively communicate with an organization’s leadership to have a successful strategic foresight program. Agencies use different approaches to connect with their leadership, but directly involving decision makers in the strategic foresight process is considered particularly effective.

  • The VA holds a workshop with senior staff in each round of its foresight process to help them understand how implications and imperatives (recommendations that must be implemented to achieve the VA’s goals) are developed and to draw implications from scenarios.

  • In its first round of foresight activity, the NGA held a full-day offsite meeting with the agency’s director and senior staff.

  • The BOP held foresight training programs for senior executives and interested staff from across the BOP. For several years (2013–2016), the BOP offered a forward-thinking component, involving Delphi exercises and scenario exploration, in annual refresher training for all staff and headquarters’ executives.

  • At the FEMA, senior career executives became “Priority Champions” within the foresight process. The FEMA enlisted nine executives who “owned” or “co-owned” a strategic objective and were responsible for developing an integrated approach for addressing that objective. In some cases, the strategic objectives were built into their individual performance standards.

  • The FEMA’s foresight team also developed a “Placemat” format for meetings with senior leadership. The 11-inch by 17-inch placemats are intended to be attractive and display key trends/emerging developments, implications and potential strategic objectives.

  • The USFS Foresight Research Work Unit ran a series of Implications Wheels for the Northern Research Station’s leadership that focused on the Station’s research and development program. The project helped the Station’s leaders see potential long-term and second- and third-order impacts of different decision options.

  • The ONA’s staff interact directly with the Secretary of Defense and other senior leaders when they believe they have something important to share.

  • In the EPA’s recent horizon scan of emerging issues, staff engaged with senior leaders throughout the process and discussed issues identified through the scanning effort with experts and senior leaders in the different parts of the agency to collectively identify the issues of greatest interest and importance.

  • The CIA Emerging Trends Program develops short, hard-hitting, and visually appealing reports, with high-quality paper, an attractive cover, and good graphics, to increase the likelihood that they will be read by leadership. The CIA program also amplifies the attention its work receives by often partnering with other units in the CIA in conducting research and producing reports. Most of the program’s staff are long-time CIA employees who have close working and personal relationships with senior leadership.

One interviewee stressed the importance of breaking down organizational silos in conducting foresight as a strategy for leadership involvement. Political leadership and senior staff changes sometimes will result in less support for foresight activities. An interviewee stressed the importance of persevering by helping leadership and colleagues understand the value foresight efforts can provide. When leaders in one part of an organization engage in a foresight activity, their involvement can help engage leaders in other parts of the organization. Another interviewee suggested that Federal agency leadership would be better prepared to participate in foresight activities if the Federal Executive Institute included foresight training in its curriculum.

In some of the longest running foresight programs, a benefit of involving large numbers of people in scanning activities and scenario workshops is that they understand and support foresight activities when they become part of their organization’s senior leadership. In the USCG, recent Commandants were involved in the Evergreen Program earlier in their careers.

Regardless of the approach used, efforts to involve leadership need to be a high priority. As one interviewee stated, “The success of strategic foresight depends on the support of people at the top of the organization.”

Integration into Planning and Management Processes

One of the biggest challenges for foresight groups across the government is linking strategic level insights and guidance with resources and policy decision making. Many agencies we interviewed do not yet have formal processes for integrating the results of strategic foresight activities into planning or other management processes. The sole mission of some intelligence agencies is to produce foresight products for others, and not necessarily to influence their own organization’s planning efforts. For most agencies with broader missions, however, the objective of strategic foresight is to inform their organization’s planning and management. The interviewee from the USAF Strategic Studies Group said that the struggle of the USAF, and of government in general, is how to take higher-level imperatives that are the products of strategic foresight and translate them into detailed decisions about where to invest or disinvest resources.

Several Federal agencies have had varying degrees of success in integrating foresight and mission planning and management. For example,

  • In the VA, strategic foresight is in the Office of Enterprise Integration, which also encompasses strategic planning, enterprise risk management (ERM), performance management, and data governance. They have made strategic foresight the foundation for building their strategic plans. The results of their foresight work inform the development of the VA’s strategic objectives. They then build performance measures for each of their objectives, currently starting with goals for FY 2023, and then develop annual plans to achieve those objectives. In addition, the VA puts newly published information on emerging developments and trends into annual planning guidance for all VA offices. The VA also uses strategic foresight as a tool for identifying enterprise risks. They estimate that a quarter to a third of the risks in their profile come from foresight work.

  • The NGA Office of Strategic Operations is responsible for providing strategic guidance for the NGA. They conduct strategic foresight, performance management and ERM at the level of the entire organization (other parts of the organization have their own ERM studies). The NGA’s Executive Committee has a “Plans and Programs Director” with whom they work directly to integrate their recommendations into plans, programs and the budget structure.

  • The USCG Evergreen Program’s four-year cycle of work is timed to be completed shortly before a new Commandant takes office. The goal of Evergreen is to inform the development of the Commandant’s Intent document. As mentioned previously, this document serves as the de facto USCG strategic plan since the USCG is included in the Department of Homeland Security’s strategic plan.

  • The ONA interacts directly with the Secretary of Defense. As a result, it can have a substantial influence on the Department of Defense’s planning even though it does not participate in the Quadrennial Defense Review, the Defense Planning Guidance, or any other formal planning activities.

  • At the EPA, strategic foresight is in the office that manages agency-wide strategic planning (the Office of the Chief Financial Officer), and results of foresight activity have been reflected to a limited extent in recent agency strategic plans.

  • The USAF Strategic Studies Group contributed to past strategic planning and Quadrennial Defense Reviews. They currently are engaging with financial management staff to help identify how money can best be spent to reach the goals identified through strategic foresight.

Intergovernmental Coordination

There is broad agreement among interviewees that the main foresight efforts need to occur within individual agencies because each deals with different threats and opportunities relevant to its mission. In addition, agency personnel should play a central role in foresight efforts to effectively influence organizational planning and management. Several interviewees also suggested that foresight efforts in individual agencies would be enhanced and more likely sustained if a central foresight body existed in the Federal government that supported and championed strategic foresight.

Many interviewees would like to see the OMB provide greater leadership in advancing the use of strategic foresight within the Federal government. Doing so could help agencies use foresight to optimize implementation of their missions. The OMB already has started conversations on how foresight can help improve performance and welcomes input on specific roles it could play to help make strategic foresight a systematic and routine process. The Executive Office of the President has conducted meetings for strategic planning leads of different agencies, with outside experts providing a lesson plan with key strategic planning concepts, and different agencies sharing how they apply those concepts. They also have held sessions on setting long-term strategic objectives using backcasting to identify the steps needed to reach those objectives, an approach that differs significantly from the common planning practice of projecting forward from what is being done today rather than working backward from long-term goals and objectives.

In addition to encouraging the use of foresight in agency strategic planning and reviews, the OMB also implicitly encourages the use of foresight in ERM (Office of Management and Budget 2016): “Risk management practices must be forward-looking and designed to help leaders make better decisions, alleviate threats and to identify previously unknown opportunities to improve the efficiency and effectiveness of government operations.” (p.1) The OMB instructs agencies to use “a structured and systematic approach to recognizing where the potential for undesired outcomes or opportunities can arise” and “to include surveillance of leading indicators of future risk from internal and external environments.” (p.11) ERM in many agencies is still relatively new and evolving; several interviewees expressed the opinion that it is an ideal time for the OMB to increase emphasis on identifying and characterizing long-term risks and opportunities and encourage the use of strategic foresight specifically as an integral component of ERM.

Several interviewees suggested that the OMB might also operate a foresight “hub,” which would increase recognition of strategic foresight’s importance as a planning and management tool, set expectations for agencies’ use of foresight, foster networking and collaboration, and provide other support functions to enhance agency foresight capacity. The National Academy of Public Administration’s Presidential Transition 2016 effort recommended “Creating a central executive hub that brings together the futures work of multiple agencies and manages the integration of siloed knowledge about future risks and opportunities held in various parts of the bureaucracy” (Redburn and Breul 2016). Another suggested approach is to model a hub on the Joint Terrorism Task Force created after “9/11,” which helped foster anti-terrorism activities across the government. Another approach could be similar to one proposed by the Project on National Security Reform’s Visioning Work Group Report and Scenarios: establishing a “Center for Strategic Analysis and Assessment” that would exist and operate within the Executive Office of the President (Project on National Security Reform 2010).

Broad Observations

Strategic foresight activity is increasing across the Federal government, but is not fully institutionalized.

A few agencies have made foresight an integral part of their standard operations and some have made it the foundation of their strategic planning, but foresight in other agencies is still fragile and has limited influence. Few government organizations have effectively integrated their strategic foresight work into planning and management processes such as ERM and performance management. Unless foresight is institutionalized as a complement to and/or component of agency planning and management processes, efforts to sustain it will be vulnerable to changes in leadership, and foresight will have limited influence.

Individual Federal agencies are in very different places on a “maturity scale” of foresight efforts.

At one end of the scale are organizations where foresight activity has been limited to occasional internal speaker programs on emerging issues. At the other end, a few agencies like the VA and the USCG have well-established foresight programs that employ a variety of sophisticated methods and whose work is integrated into their organizations’ planning processes.

There is a widely shared aspiration among interviewees to expand the quality and influence of foresight efforts.

Virtually all interviewees would like to develop their foresight efforts to a level where they are appreciated and utilized by senior management and where foresight is integrated with both strategic planning and enterprise risk management.

The FFCOI has been a significant factor in the growth of foresight in the Federal government.

The FFCOI is an informal government-wide network initiated in 2013 by leaders of the foresight effort at the VA and the BOP. People from organizations across the Federal government have been coming together in quarterly meetings to share what they are doing, learn from each other’s successes and challenges, and support each other’s strategic foresight efforts.

There is no standard organizational location for a foresight function.

Locations range from near the top of the organization to research branches remote from decision makers.

Horizon scanning and scenario-based planning are the most widely used foresight methodologies.

Horizon scanning is viewed as the most immediately useful foresight method and the basis for most other methods. Scenario-based strategic conversations and planning are the next most commonly used foresight tools. Defense organizations often integrate forecasts and scenarios into a war game format. Other methods mentioned in the interviews are Delphi forecasting, futures wheels, and backcasting.

Different foresight programs use different methods to connect with leadership.

Some foresight organizations have forged strong connections with senior leaders; for others, this is a critical area where improvement is needed.

There is agreement among interviewees on the importance of looking ahead beyond conventional planning horizons.

Most foresight programs look twenty years or more into the future.

All Federal foresight programs share the underlying assumptions that it is impossible to predict “the future” and that human action can shape the future to some extent.

The widespread use of scenarios reflects a recognition that there is no single certain future ahead that is predictable and can be planned for. But it is possible to examine trends and emerging and potential developments, forecast alternative plausible futures, and use them for contingency planning, for identifying actions that could work well across a wide range of potential future conditions, and for envisioning and moving toward preferred futures.

Discussion

Overall, the findings of our study are consistent with those in the literature: while some foresight efforts have been strongly supported by their organizations, support for others waxes and wanes and remains tenuous. As Havas et al. (2010, p.91) commented, “the perspectives for the future use and impacts of foresight are far from clear.” This may appear surprising, given the recent emphasis on foresight from such powerful Federal organizations as the OMB and the Government Accountability Office (GAO) and the acknowledged success of many foresight programs and projects across a range of Federal defense, intelligence and domestic agencies. Yet significant and unresolved challenges remain to fully institutionalize the use of foresight in the Federal government.

Consistent with the observations of Gerasimov (2016) and the findings of Roberge (2013), U.S. government foresight programs appear to have a more stable footing in defense and intelligence agencies than in civilian agencies. Militaries “are more likely to be tasked with specific goals related to the protection of national sovereignty, for which the analysis of multiple future scenarios and potential future risks can provide important value for shaping strategic planning” (Dreyer and Stang 2013, p.17). The challenges of institutionalizing foresight in civilian agencies also may be exacerbated by their smaller budgets relative to defense and intelligence agencies. In times of austerity, writes Roberge (2013, p.540), “bureaucracies may simply not have the capacity and/or resources needed to properly conduct foresight.” In addition, the consequences of poor planning in civilian agencies are perceived as less severe than for security agencies (Roberge 2013). More generally, foresight as a discipline is not well-recognized in academia or the Federal government and is therefore at a disadvantage when competing with conventional social science disciplines for intellectual and financial resources and the attention of and influence on policy makers (Fuerth 2009).

Paradoxically, the focus on future issues may give rise to the most persistent challenges to successfully institutionalizing foresight in civilian agencies. The ability of foresight to help “ensure that the organization’s purpose and outcomes are situated within a longer-term context” (Tully 2015, p.13) is often in direct conflict with demands on and the natural inclinations of policy makers to achieve short-term goals. To successfully engage in foresight, government organizations need to “overcome pressures to crisis manage, or to build policy responses based on the demands of round-the-clock media coverage, and instead develop longer-term strategies to tackle ‘wicked problems’” (Solem 2011). Roberge (2013, p.539) points out that “foresight is often at odds with the political imperative…Politicians and citizens alike often expect quick results, and have very little time for a view of the long term.” Even within research programs, when policy goals and plans “may be complex, unclear and uncertain”, program managers may “deliberately avoid treating particularly difficult albeit important problems” and instead “concentrate on the more easily understood and more easily solved problems” (Solem 2011, p.25).

One of the most powerful strategic planning attributes of foresight is that it provides an explicit process that challenges assumptions about the future and helps overcome what behavioral economics describes as being anchored or invested in a position. Foresight provides a counterweight to our “inbuilt tendency to favor corroborative data and ignore challenging data” and enables organizations to “take the time out to reflect on alternatives…and to build the space for conversations that do not shut down alternative viewpoints” (Tully et al. 2017). This requires a “readiness to listen to foresight and to consider action; ability to maintain a ‘protected space’ within which analysts feel empowered to present their views; and rich exchanges between producers and consumers of foresight” (Fuerth 2009).

Of course, critique of strongly held beliefs about the future may trigger cognitive dissonance and resistance. As Gowing and Langdon (2016) observe, although data and signals of emerging issues frequently are available, "for a variety of internal reasons those signs–and a frank assessment of what they suggest–are often marginalized or buried by systems that discourage the airing of unwelcome assessments." Their research revealed that "there is a fear among staff that bosses will 'chop the legs off' those whose advice might seem to be off the wall or 'wacky'." If the results of foresight challenge established paradigms, policy makers may perceive a loss of control and may attempt to exercise inappropriate influence on the foresight process to defend their interests. Policy makers who participate in the foresight process may represent their institutional points of view, may be reluctant to communicate hidden agendas, may disagree about short- and long-term needs, and may seek to use foresight to justify policies that have already been decided (Da Costa et al. 2008, Havas et al. 2010). Grant et al. (1988) also warn that policy makers could attempt to manipulate the membership of a foresight organization to ensure it endorses a particular policy outcome. Nevertheless, to promote leadership support for the foresight process, its results, and its recommendations, there is general agreement in the literature that policy makers should be involved throughout the foresight effort, from the early identification of issues through the discussion of policy implications and implementation options (Coates 1985, Da Costa et al. 2008, Solem 2011, Dreyer and Stang 2013, Calof and Smith 2010).

The organizational location of foresight activities may be critical to achieving the appropriate balance between policy-maker involvement and the need to maintain the foresight effort's scientific integrity. John Kamensky's observation in the present study that the most successful foresight organizations are located "off to the side at the top" is consistent both with the need to establish policy makers' direct interests and stakes in the effort and with "the need for positioning at least part of the foresight function at a certain distance from the intrusiveness of day-to-day politics and administrative interference" (Solem 2011). In addition to locating it near the top of an organization's management, Coates (2010) emphasizes that to be effective, the foresight organization needs adequate staff, resources, facilities, and authority; access to decision makers, stakeholders, and information; and the ability to distribute information.

By helping to identify “weak signals, threats and opportunities which can occur and to monitor the constant transformation of capabilities, cultures, structures and management processes in a coherent way” (Martinet 2010, p.1486), foresight “can also help to identify redundancies and conflicts that need to be addressed” (Walker 2007, p.22) and the need to coordinate among organizations to address multi-dimensional issues. Intellectual and financial resources, responsibilities, and authorities, however, frequently are allocated among relevant government organizations “and these organizations are not always keen on co-operation—to put it mildly” (Da Costa et al. 2008, p.7). In addition, because foresight efforts often extend beyond an organization’s span of control, responsibility, and/or authority, it is often difficult to impose accountability (Havas et al. 2010, Horton 2012). Without a willingness by relevant organizations to coordinate resources and actions, the recommendations of foresight projects may go unheeded.

Much of the underlying resistance to institutionalizing foresight may stem from its inherent limits on knowledge and from its ambition to enable "quantum leaps" in policy making, both of which may be perceived as at odds with traditional policy analysis and development processes. Foresight is designed to address complex social science problems that are frequently ill-structured, characterized by the co-existence of many variables and interactions, and marked by qualitative and/or holistic aspects (Fuerth 2009, Horton 2012). Foresight is based on complexity theory, which focuses on the interactions among issues, policies, and the consequences of policies and helps identify the potential for abrupt and discontinuous changes. These capabilities differ from the more familiar incremental approach to policy (Havas et al. 2010), a form of "bounded rationality," called "muddling through" by Charles Lindblom, necessitated by the impracticality of quantifying, valuating, and cognitively considering all possible variables and alternative outcomes related to complex policy issues (Lindblom 1959, Simon 1955). While there may be "increasing dissatisfaction with 'muddling through' approaches" (Tully 2015, p.5), according to Horton (2012, p.294) "we are trained to think in terms of linear causality; doing anything else is difficult, disturbing, and different."

Broad agreement exists that the value and credibility of foresight rest on the process's ability to inform and improve priority setting, policy, and decision making (Calof and Smith 2010, Da Costa et al. 2008, Grant et al. 1988). Yet the usefulness of foresight is hard to assess in the short term, beyond the quality of the process and methods, since its success may only become evident years after the fact. It also is difficult to prove the counterfactual: what might have happened in the absence of foresight (Roberge 2013, Tully 2015). According to Tully (2015, p.11), "a successful process means the participants internalize and come collectively to a common set of policies and actions. This often means the strategy process becomes 'invisible' as the journey through the process means that by the end its conclusions are seen as 'inevitable' and 'common-sense'" as they become part of the collective consciousness of participants. In addition, the specific contributions of foresight activities may become obscured by the mosaic of information and opinions considered in what are often lengthy decision-making processes. The success of foresight would ideally be "measured in changed behavior, thinking, resources, budgets, communications, etc. [and]…evidence that a particular decision-path indicated by a strategic foresight project achieved the tangible desired result" (Tully 2015, p.7–8). To this end, Rohrbeck and Kum (2018) published a well-designed longitudinal study that provides compelling evidence of a significant positive impact of foresight on corporate performance, as measured by both profitability and market capitalization. There is, however, a paucity of published work on the impact of foresight studies in government agencies (Havas et al. 2010), and little evidence of what works outside of ad hoc and incomplete case studies and narratives.

To fully institutionalize foresight in the Federal government, it will be necessary for policy makers and organizations to seek a longer and broader context for near-term decision making, i.e., to "switch their frame of mind mode from dealing with short-term urgencies into long-term and holistic thinking" (Da Costa et al. 2008, p.2). Organizationally, it will be necessary to "establish programs rather than one-off projects. There is a learning curve to doing foresight work. Programs allow for learning processes and personnel continuity" (Dreyer and Stang 2013, p.28). Two recommendations from participants in the current study may help in this regard. The first is to begin educating Federal executives by incorporating foresight into the curriculum of the Federal Executive Institute (FEI). In fact, the FEI Senior Executive Service (SES) Enterprise Leadership Lab sponsored a workshop in 2017 on "Using Strategic Foresight to Influence Strategic Decision-Making," and FEI is piloting additional classes on scenario planning that would be offered as an elective in FEI's "Leadership for a Democratic Society" program for Federal executives. In addition, FEI is providing some facilitation support for individual agency foresight events.5

The second, recommended by several participants in our study, is the development of a central Federal "foresight hub" located in or affiliated with the OMB. The concept of a central foresight organization is not new, but past efforts to conduct foresight from the White House were not sustained (Grant et al. 1988). Importantly, the idea of a central foresight hub is not analogous to the centralized foresight efforts of countries such as the United Kingdom, Singapore, France, and the Netherlands, which "often have central foresight agencies taking the lead on government efforts and responding to requests from central policy bodies" (Dreyer and Stang 2013, p.22). The current concept builds on lessons learned from earlier efforts in the United States and focuses more on counseling, advising, and supporting agencies' foresight programs through "coordination, facilitation, information exchange, and quality control" (Coates 1985, p.47). If a foresight hub were established, it would be important to articulate its roles and responsibilities vis-à-vis the FFCOI. An important area for future research is to understand the extent to which, and under what circumstances, centralized national foresight efforts are sustainable and successful.

Based on our research, we conclude that foresight has taken root in the U.S. Federal government. For the reasons we discuss, however, its foothold in many organizations is tenuous. To better understand how to more fully institutionalize foresight in the Federal government, research is needed that directly assesses policy makers' perceptions of long-term risks and uncertainty, and the government's role in managing that future uncertainty. Additional studies of how successful foresight programs become valued in the private sector and in other countries could provide important insights. Adaptation of methodologies like those used by Rohrbeck and Kum (2018) is needed to assess the maturity of, need for, and impact of foresight programs within individual Federal agencies. Also needed is further study of how best to integrate foresight into Federal planning and management processes, including ERM and human resource planning. Our findings indicate that Federal civilian agencies generally rely on only a few well-known foresight methodologies; research therefore is needed on approaches for introducing and selecting from among alternative foresight methodologies, including newer techniques such as crowdsourcing, forecasting tournaments, prediction markets, game-like engagements, algorithmic analysis, simulations and modeling, computational predictive analysis, and human-machine hybrid forecasting systems and methods. Finally, more research is needed in the area of foresight training (what little scholarly research has been published is limited largely to corporate managers). While some Federal executive-level foresight training is ongoing, as described above, more research is needed to improve the effectiveness of executive foresight training and delivery mechanisms, with the goal of Federal executives embracing foresight as an essential part of decision making.

Every person we interviewed is convinced of the importance of foresight and the need for improving its use in the U.S. Federal government. Thanks in large part to the FFCOI, foresight activity appears to be increasing in the Federal government, with more agencies involved and more people engaged in foresight. The most fully developed programs are having a significant influence on their organizations and provide a model for others. Despite these impactful efforts, however, foresight remains an underdeveloped and fragile enterprise in many government organizations. In light of a rapidly changing world, the critical challenge ahead is to institutionalize foresight so that it becomes part of routine Federal agency operations that result in more robust and resilient planning and management outcomes.

Supplementary Material

sup 1

Acknowledgments

We thank the participants in this study for their willingness to share their time, expertise and candid opinions. We thank Kevin Teichman (EPA Office of Research and Development) for his leadership in chairing EPA’s Strategic Foresight Lookout Panel and for his helpful review of an earlier draft of this paper. We also acknowledge the unwavering support for this project from Kathy O’Brien and John Hall, director and former acting deputy director (respectively) of EPA’s Office of Planning, Analysis and Accountability, and Tom Sinks and Mary Greene, director and deputy director (respectively) of EPA’s Office of the Science Advisor.

Author Biographies

Joseph M. Greenblott has served at the U.S. Environmental Protection Agency since 1991 and currently is the Associate Director of the Analysis Division in the Office of Planning, Analysis, and Accountability, Office of the Chief Financial Officer. Among his duties, Dr. Greenblott manages EPA’s strategic foresight program and serves on the Leadership Council of the Federal Foresight Community of Practice. He holds a BA in Psychology from the State University of New York at Binghamton, an M.Sc. in Environmental Biology from the Hebrew University of Jerusalem, Israel, and a Ph.D. in Environmental Science and Public Policy from George Mason University, Virginia.

Thomas O’Farrell received his Ph.D. in Biochemistry from the University of Tennessee, Memphis. He joined the EPA Office of Research and Development in 2006. Since 2014, he has worked in the Office of the Science Advisor, supporting cross-agency science policy. Recently, he has helped coordinate efforts to institutionalize Strategic Foresight at EPA with staff from the Office of the Chief Financial Officer.

Robert Olson is an independent consultant and a Senior Fellow with the Institute for Alternative Futures in Alexandria, Virginia where he served as Director of Research for fifteen years. Much of his work has focused on environmental foresight, including projects with the Environmental Law Institute, the U.S. Environmental Protection Agency, the U.S. Forest Service, and the World Resources Institute. Previously, he was a project director and consultant to the Director of the Office of Technology Assessment of the U.S. Congress. He has been an adjunct professor at the American University and a Resident Fellow at the University of Illinois’ Center for Advanced Study.

Beth Burchard is a Program Analyst in the U.S. EPA’s Office of Planning, Analysis, and Accountability, Office of the Chief Financial Officer. She has worked in the federal government for 27 years, holding a variety of positions at the EPA, the Library of Congress, and the Executive Office of the President. She holds a BA in Economics from the University of Vermont and a Master of Public Policy from the University of Maryland.

Notes

1. Richard Antcliff (former Chief Strategist, Office of the Director, Langley Research Center-National Aeronautics and Space Administration) in discussion with authors, 2018.

2. Leon Fuerth (Director, Project on Forward Engagement), in discussion with authors, 2018.

3. The Federal Foresight Community of Interest (FFCOI) is a voluntary forum for managers and staff in Federal agencies involved in strategic foresight to share and learn from each other.

4. Eric Popiel (Commander and Evergreen Program Manager, Office of Emerging Policy, U.S. Coast Guard, retired) in email to authors, July 26, 2018.

5. Marci Ledlow (Faculty Member, Federal Executive Institute, Center for Leadership Development), in discussion with the author, August 22, 2018.

i. First published online: 12/13/2018. Print: 2019, World Futures Review, 11(3), 245–266. https://doi.org/10.1177/1946756718814908.

References

1. Calof, Jonathan, and Jack E. Smith. 2010. “Critical success factors for government-led foresight.” Science and Public Policy 37 (1): 31–40.
2. Coates, Joseph F. 1985. “Foresight in federal government policymaking.” Futures Research Quarterly 1 (2): 29–53.
3. Coates, Joseph F. 2010. “The future of foresight—A US perspective.” Technological Forecasting and Social Change 77 (9): 1428–1437.
4. Da Costa, Olivier, et al. 2008. “The impact of foresight on policy-making: insights from the FORLEARN mutual learning process.” Technology Analysis & Strategic Management 20 (3): 369–387.
5. Dreyer, Iana, and Gerald Stang. 2013. “Foresight in governments – practices and trends around the world.” In Yearbook of European Security, 7–32. Paris: EU Institute for Security Studies.
6. Fuerth, Leon S. 2009. “Foresight and anticipatory governance.” Foresight 11 (4): 14–32. https://doi.org/10.1108/14636680910982412.
7. Futures Directorate. 2015. Marine Corps Security Environment Forecast: Futures 2030–2045. Quantico, VA: U.S. Marine Corps.
8. Gerasimov, Valery. 2016. “The Value of Science Is in the Foresight: New Challenges Demand Rethinking the Forms and Methods of Carrying out Combat Operations.” Military Review (January–February): 24–29.
9. Gowing, Nik, and Chris Langdon. 2016. “Groupthink is depriving the West of vision.” The World Today, June and July. https://www.chathamhouse.org/publications/twt/how-groupthink-depriving-west-vision.
10. GPRA Modernization Act of 2010. 124 Stat. 3866.
11. Grant, Lindsey, et al. 1988. Foresight and National Decisions: The Horseman and the Bureaucrat. Lanham, MD: University Press of America.
12. Havas, Attila, et al. 2010. “The impact of foresight on innovation policy-making: recent experiences and future perspectives.” Research Evaluation 19 (2): 91–104. https://doi.org/10.3152/095820210X510133.
13. Horton, Averil. 2012. “Complexity science approaches to the application of foresight.” Foresight 14 (4): 294–303. https://doi.org/10.1108/14636681211256080.
14. Lindblom, Charles E. 1959. “The science of muddling through.” Public Administration Review 19 (2): 79–88.
15. Martinet, Alain-Charles. 2010. “Strategic planning, strategic management, strategic foresight: The seminal work of H. Igor Ansoff.” Technological Forecasting and Social Change 77 (9): 1485–1487.
16. National Advisory Council on Environmental Technology and Policy. 2002. The Environmental Future. Washington, DC: U.S. Environmental Protection Agency.
17. National Advisory Council on Environmental Technology and Policy. 2009. Outlook for EPA. Washington, DC: U.S. Environmental Protection Agency.
18. National Advisory Council on Environmental Technology and Policy. 2012. First Advice Letter on Incorporating Sustainability at EPA. Washington, DC: U.S. Environmental Protection Agency.
19. National Research Council. 2011. Sustainability and the U.S. EPA. Washington, DC: National Academy of Sciences.
20. National Research Council. 2012. Science for Environmental Protection: The Road Ahead. Washington, DC: National Academy of Sciences.
21. National Research Council. 2014. Sustainability Concepts in Decision-Making: Tools and Approaches for the US Environmental Protection Agency. Washington, DC: National Academy of Sciences.
22. Office of Management and Budget. 2016. OMB Circular No. A-123, Management’s Responsibility for Enterprise Risk Management and Internal Control. Washington, DC: Executive Office of the President.
23. Office of Management and Budget. 2017. Circular No. A-11: Preparation, Submission, and Execution of the Budget. Washington, DC: Executive Office of the President.
24. Project on National Security Reform. 2010. Vision Working Group Report and Scenarios. Edited by Sheila R. Ronis. Washington, DC: U.S. Army Strategic Studies Institute.
25. Redburn, F. Stevens, and Jonathan D. Breul. 2016. “Linking Foresight to Decision Making.” National Academy of Public Administration. Last modified June 28, 2016. http://napat16.org/blog/16-linking-foresight-to-decision-making.html.
26. Roberge, Ian. 2013. “Futures construction in public management.” International Journal of Public Sector Management 26 (7): 534–542.
27. Rohrbeck, René, and Menes Esingue Kum. 2018. “Corporate Foresight and its Impact on Firm Performance: A Longitudinal Analysis.” Technological Forecasting and Social Change 129 (April): 105–116. https://doi.org/10.1016/j.techfore.2017.12.013.
28. Science Advisory Board. 1995. Beyond the Horizon: Using Foresight to Protect the Environmental Future (EPA-SAB-EC-95-007). Washington, DC: U.S. Environmental Protection Agency.
29. Simon, Herbert A. 1955. “A Behavioral Model of Rational Choice.” The Quarterly Journal of Economics 69 (1): 99–118.
30. Solem, Knut Erik. 2011. “Integrating foresight into government. Is it possible? Is it likely?” Foresight 13 (2): 18–30. https://doi.org/10.1108/14636681111126229.
31. Tully, Catarina. 2015. Stewardship of the Future: Using Strategic Foresight in 21st Century Governance. Singapore: UNDP Global Centre for Public Service Excellence.
32. Tully, Catarina, et al. 2017. “Strategic foresight can make the future a safer place.” The World Today, April and May. https://www.chathamhouse.org/publications/twt/strategic-foresight-can-make-future-safer-place.
33. U.S. Environmental Protection Agency, Office of the Chief Financial Officer. 2006. 2006–2011 EPA Strategic Plan: Charting Our Course (EPA-190-R-06-001). Washington, DC.
34. U.S. Environmental Protection Agency, Office of the Chief Financial Officer. 2010. 2011–2015 EPA Strategic Plan: Achieving Our Vision (EPA-190-R-10-002). Washington, DC.
35. U.S. Environmental Protection Agency, Office of the Chief Financial Officer. 2014. 2014–2018 EPA Strategic Plan (EPA-190-R-14-006). Washington, DC.
36. U.S. Environmental Protection Agency, Office of the Chief Financial Officer. 2018. FY 2018–2022 EPA Strategic Plan (EPA-190-R-18-003). Washington, DC.
37. Walker, David M. 2007. “Foresight for government.” The Futurist 41 (2): 18–22.
