ABSTRACT
Differentiating performance components across military occupations is critical to developing assessments and other applications for matching people to occupations in the military. However, identifying occupation-specific performance components is challenging and resource intensive. The current paper summarizes different methods the services use to define and identify occupation-specific performance components. The paper concludes with recommendations and future directions for advancing the military’s needs for information and data on occupation-specific performance components.
KEYWORDS: Job performance, job analysis, person-job match, job tasks
What is the public significance of this article?—Defining occupation-specific performance components facilitates matching people to military jobs. Mismatching people to jobs negatively impacts service members and the organizations they serve. However, defining occupation-specific performance components is challenging and resource intensive. This paper summarizes methods that minimize burden and yield useful performance components to enable person-job matching.
Introduction
Occupation-specific performance components support many first-term enlisted talent management functions from job qualification to occupational training to promotion. Occupation-specific performance components are especially important for improving person-occupation matching. At the individual level, the right person-occupation match can influence (a) an applicant’s decision to join the military; (b) the probability that the recruit will complete their initial military training; (c) how effectively the new service member performs in-unit, post-initial training; and (d) the probability that the new service member will persist through their first term of service and make the military a career. At the organizational level, the right person-occupation match can impact (a) first-term service members’ engagement and retention; (b) the overall strength and health of an occupation; and (c) operational readiness and mission success, in the aggregate. Accordingly, effectively defining occupation-specific performance components and profiling occupations on the components is foundational to making the right person-occupation match.
Despite the need for occupation-specific performance components, there are many challenges to identifying them. These challenges include but are not limited to (a) the large number of occupations; (b) difficulties in defining an occupation (or job); (c) difficulties in differentiating occupations from roles (e.g., leader); (d) changes in occupational structure or job requirements in response to continually evolving strategic and operational demands; (e) resources (time, money, access to job incumbents or subject matter experts [SMEs]); and (f) the requirement to keep detailed occupation-specific performance components (e.g., task lists) up to date. Various methods of defining performance components are differentially useful depending upon objectives and circumstances.
The purpose of this paper is to summarize and discuss different methods for defining and profiling occupation-specific performance components. First, we define occupation-specific performance components and present examples of how the U.S. military has used inductive and deductive methods to define and profile occupations on these components. We next present methods that make use of worker-related descriptors (e.g., knowledges, skills, abilities or aptitudes, personality-temperament, occupational or work interests) or the use of existing occupational data (or information) with minimal to no job incumbent involvement. Finally, we conclude the paper with recommendations and future directions for effectively defining and differentiating job performance components in military occupations.
Defining occupation-specific performance components
Performance is work behavior (J. P. Campbell et al., 1993). Accordingly, performance components refer to behaviors or actions (e.g., tasks, duties) people perform at work. As shown in Table 1, job performance is a function of three main determinants: (a) declarative knowledge (DK), (b) procedural knowledge and skill (PKS), and (c) motivation (M). DK is knowledge of facts, such as medical knowledge or knowledge of computer technology. PKS refers to knowing what to do and how to do it – for example, knowing how to change a tire and having the skill to do it. Motivation is choosing to invest energy in performing a task (or duty) or in learning the DK and PKS needed to perform a task or duty. Temperament and interests are antecedents to motivation. In sum, proficiency in performing a duty or task is a function of knowing the facts and principles relevant to the task, knowing what to do and how to do it, and expending the effort needed to perform the task effectively.
Table 1.
Examples of performance components.
| Job Performance Components (Behaviors) | = | Declarative Knowledge (DK) | x | Procedural Knowledge and Skill (PKS) | x | Motivation (M) |
| --- | --- | --- | --- | --- | --- | --- |
| Maintaining alertness to enemy and environment threats | = | Knowledge of possible threats | x | Knowing how to identify threats and skill in perceiving them | x | Willingness and vigilance in looking for possible threats over time |
| Keeping weapons and equipment in combat-ready condition | = | Knowledge of weapons and equipment, their parts, and how they work | x | Knowing how and being able to safely perform maintenance on weapons and equipment | x | Level of effort in maintaining weapons and equipment |
Note. Based on definitions of performance components from “A theory of performance” by J. P. Campbell, R. A. McCloy, S. H. Oppler, and C. E. Sager, 1993, in N. Schmitt and W. C. Borman (Eds.), Personnel Selection in Organizations. Jossey-Bass Publishers.
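Campbell et al.’s (1993) determinants are often summarized multiplicatively; one common rendering of the relation depicted in Table 1 is:

```latex
PC_i = f_i\left( DK \times PKS \times M \right)
```

where \(PC_i\) denotes the \(i\)th job performance component. On this reading, a near-zero value on any one determinant (e.g., no motivation to perform the task) implies the behavior will not be performed effectively, however high the other determinants may be.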
Performance components can be common to all or most occupations (or jobs) or specific to an occupation. Occupation-specific performance components are those elements of performance that are important to an occupation (or cluster of occupations with similar requirements) and that uniquely differentiate performance in the occupation(s) from other occupations.1
Overall, there are two general approaches to defining occupation-specific performance components: (a) inductive and (b) deductive (Peterson & Jeanneret, 2007). Inductive methods define performance components from the bottom up, using descriptive materials about the occupation (e.g., job descriptions, training manuals) and discussions with SMEs to elicit the occupation-specific performance components, sometimes followed up with a within-occupation survey to refine the components. Conversely, deductive methods define performance components from the top down, starting with a single cross-occupation taxonomy for describing all occupations (e.g., the Department of Labor’s Occupational Information Network, O*NET; https://www.onetonline.org/; Peterson et al., 1999) and drilling downward to create occupation-specific descriptions (or profiles).
Volumes have been written about occupational analysis and how to conduct one (e.g., Brannick et al., 2007; Harvey, 1991; McCormick, 1976). The current paper is not intended to serve as a primer on conducting an occupational analysis. Instead, our focus is on summarizing and discussing different methods the U.S. military has used to define occupation-specific performance components. The following examples illustrate the U.S. military’s use of different inductive and deductive methods to define occupation-specific performance components.
Examples of inductive methods
Using task analysis
The U.S. military has extensive experience applying inductive methods to define occupation-specific performance requirements. The Air Force has conducted job task surveys to define training needs for enlisted occupations, inductively, for decades (see Geimer et al., 2007). Currently organized under the HQ Air Education & Training Command’s (AETC) Studies and Analysis Squadron, a designated Occupational Analysis division conducts such surveys for enlisted specialties on a regular and recurring basis – normally every three to five years for each specialty, or as requested by career field managers based on substantial changes in mission, equipment, or weapons systems (United States Air Force, 2014, 2019a, 2019b). The Army documents and periodically updates the individual tasks and related requirements of its enlisted occupations, primarily to support institutional and occupation-specific training (Geimer et al., 2007). Similarly, the Navy Manpower Analysis Center (2021) conducts occupational analysis to define standards for selection and classification, and the Naval Education and Training Command (2013) conducts occupational analysis to inform training programs. The Marine Corps Front End Analysis (FEA) is intended to support occupational structure decisions and training curricula (Headquarters, U.S. Marine Corps, 2012). The Coast Guard conducts job task analysis and occupational analysis, both of which are part of a broader system of analyses in support of its training and promotion programs (United States Coast Guard, 2018). Finally, the Space Force is in the process of establishing its own personnel systems; in the meantime, it relies on Air Force tools for comparable occupations.
It is our understanding that, with the exception of the Air Force, which routinely conducts occupational analyses, the services tend to conduct occupational analysis on an as-needed basis (e.g., when occupations merge or in response to a request from the career field).
While occupation-specific task surveys clearly support the primary function they were intended for – prioritizing tasks for inclusion in each career field’s technical training curriculum – the data do not readily lend themselves to differentiating performance requirements across occupations. The services do not maintain “master lists” of tasks across occupations, and when there is overlap in tasks across multiple occupations, essentially the same task may be labeled differently in the job task inventory for each occupation or defined with different levels of granularity. For example, the current Air Force job task inventory for Dental Assistants and Hygienists (4Y0X1) lists “Take and record vital signs” (United States Air Force, 2018), while the job task inventory for Surgical Service (4N1X1) members separately lists “Obtain and record preoperative or postoperative patient vital signs manually” and “ … using automated methods” (United States Air Force, 2013).
Further, the sheer number of tasks identified for any given occupation (upwards of hundreds of distinct tasks in a typical inventory) does not lend itself to a manageable number of job performance components for criterion measure development. Although statistical factor analysis of the hundreds of task frequency and importance ratings collected as part of these surveys may yield insights into the types of jobs within an occupation that perform various tasks (Johnson, 2000), it is unlikely to address the underlying performance components within a given job (Cranny & Doherty, 1988).
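The limitation noted above can be illustrated with a small simulation. In the hedged sketch below, a synthetic incumbent-by-task rating matrix is factor analyzed; the factors recover clusters of tasks that are performed together by the same incumbents (i.e., job types), which is descriptive of who does what, not of the underlying performance components. All data, task counts, and cluster structure here are hypothetical.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)

# Hypothetical ratings: 200 incumbents rate how frequently they perform
# 12 tasks. Two latent job-type propensities generate the ratings.
loadings = np.zeros((12, 2))
loadings[:6, 0] = 1.0   # tasks 0-5 cluster with job type A
loadings[6:, 1] = 1.0   # tasks 6-11 cluster with job type B
propensity = rng.normal(size=(200, 2))
ratings = propensity @ loadings.T + rng.normal(scale=0.3, size=(200, 12))

fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
fa.fit(ratings)

# Each rotated factor loads predominantly on one task cluster: the
# analysis reveals which tasks co-occur across incumbents (job types),
# not why performance on those tasks is effective.
print(np.round(fa.components_, 1))
```

The point of the sketch is that even a clean factor solution only groups co-occurring tasks; it says nothing about the DK, PKS, and motivation determinants underlying them.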
Geimer et al. (2007) performed an extensive review of the services’ occupational analysis practices. They obtained documentation including technical reports, presentations, and occupational analysis surveys from the U.S. Army, Air Force, Navy, Marine Corps, and Coast Guard. They examined websites, attended a biannual occupational analyst workshop, and conducted interviews with military occupational analysts. Based on that review, they concluded that the services’ operational occupational analysis practices were not aligned with the needs of selection and classification systems because the focus is typically on detailed occupation-specific tasks with no organizing rubric across occupations, and information about needed knowledge, skills, abilities, and other characteristics (KSAOs) is rarely provided (see also, National Academies of Sciences, Engineering, and Medicine, 2020).
Using critical incidents
Another inductive approach, the critical incident method (Flanagan, 1954), was initially developed for debriefing pilot trainees. The method’s primary advantage is that it captures descriptions of effective and ineffective work behaviors in a rich context. The method has been used extensively in the military for the development of performance rating scales and situational judgment tests. The main drawback is that the full-blown critical incident approach is laborious. If followed to the letter, the approach involves (a) asking SMEs to write hundreds of critical incidents, (b) sorting incidents to form behavioral categories, (c) asking SMEs to “retranslate” the critical incidents into dimensions as a check on the validity of the dimensions (Smith & Kendall, 1963), and (d) analyzing the results to affirm the behavioral dimensions. Consequently, researchers often deviate from the original method. For example, instead of starting from scratch to identify combat/deployment performance dimensions, Wasko et al. (2012) drafted dimensions based on a review of the literature on combat performance. Then, they asked SMEs to provide critical incidents targeting those dimensions. Research staff condensed the critical incidents to form 39 concise behavioral summary statements. In turn, Army SMEs retranslated the 39 behavioral statements (instead of hundreds of critical incidents).
As illustrated by the definitions of the combat/deployment performance dimensions in Table 2, the critical incident approach yields behavioral categories that can be occupation-specific or can cut across occupations. Once developed using an inductive approach, critical incident categories could be used in a deductive way. For example, military occupations could be profiled on performance dimensions.
Table 2.
Examples of critical incident-based performance dimensions.
| Dimension | Definition |
| --- | --- |
| A. Field/Combat Judgment | Thinks rationally under pressure. Makes sound on-the-spot decisions in the field based on prior training. Applies correct rules (e.g., rules of engagement [ROE], escalation of force) to the situation. Immediately and correctly performs required warrior tasks and drills. |
| B. Field Readiness | Keeps self, weapons, and equipment in combat-ready condition. Maintains positive control and accountability of weapons, equipment, tools, and munitions. Follows procedures for handling equipment and weapons safely. |
| C. Physical Endurance | Is capable of meeting the demands of physical or environmental challenges or stressful situations. Sustains performance as long as the situation requires. |
| D. Physical Courage | Overcomes fears of bodily harm. Takes necessary risks in spite of fears. Does not act recklessly or place self or others at unwarranted risk. |
| E. Awareness and Vigilance | Maintains sense of awareness and alertness to enemy and environment threats. Acts as constant sensor to unusual or threatening persons or conditions. Remains focused and alert despite sleep deprivation, extended missions, and difficult environmental conditions. |
Note. Adapted from “Development of the Combat/Deployment Performance Rating Scales” by L. Wasko, K. Owens, R. Campbell, and T. Russell, 2012, in D. J. Knapp, K. S. Owens, & M. T. Allen (Eds.), Validating future force performance measures (Army Class): In-unit performance longitudinal validation (Technical Report 1314). Fort Belvoir, VA: U.S. Army Research Institute for the Behavioral and Social Sciences.
Examples of deductive methods
Using major duties as performance components
One example of the deductive approach was used in a joint-service project to assess the work-relevance of the Armed Services Vocational Aptitude Battery (ASVAB; Waters et al., 2009). The performance components were major duties (MDs) – higher-order categories of tasks that are designed to cover all major types of entry-level enlisted job duties. Example MDs appear in Table 3.
Table 3.
Examples of major duties.
| MD # | Label | Definition |
| --- | --- | --- |
| 1 | Engage in hand-to-hand combat | Uses offensive and defensive maneuvers to combat and protect self and others from hostile individuals. |
| 18 | Manage and control traffic | Manages and coordinates the departing, en route, arriving, and holding of traffic (land or sea) by monitoring equipment, and communicating with vehicles and other traffic control units. |
| 22 | Collect information from and on individuals and groups | Collects and gathers information from and on individuals and groups using a variety of techniques (e.g., interviews, focus groups, observations). |
| 41 | Inspect and maintain vehicles and mechanical equipment | Inspects and maintains vehicles and mechanical equipment (e.g., trailers, generators, construction equipment, sheet metal machines, engines, fuel systems, gear boxes). Conducts scheduled services to maintain equipment. |
Note. Adapted from Development of a methodology for linking ASVAB content to military job information and training curricula (FR-09-64) by S. Waters, M. Shaw, T. Russell, M. Allen, W. Sellman, and J. Geimer, 2009, Human Resources Research Organization.
The current list of cross-service MDs (Waters et al., 2009) originated in Army projects. In the 1990s, Peterson and colleagues (Peterson et al., 2001) generated initial MDs by aggregating job-specific tasks into higher-order task categories that were generalizable to more than one job. Those MDs were updated and expanded in another project where they were used to assess the efficacy of O*NET’s generalized work activities for use by the Army (Russell et al., 2008). Waters et al. (2009) further expanded the MDs for cross-service use. Specifically, Waters and colleagues (a) edited the language of the MDs to make them generalizable across services; (b) revised and expanded MDs based on training materials for a sample of seven Army, five Navy, and seven Air Force occupations; and (c) reviewed MDs with SMEs. This process resulted in a list of 84 MDs. Military SMEs representing the 19 occupations indicated which MDs were relevant to their occupations. Waters and colleagues subsequently mapped the MDs to ASVAB content areas to assess occupation-relevance of ASVAB content domains.
The primary advantages of using MDs as occupation-specific performance components are that MDs (a) condense tasks to a manageable set, (b) generalize across occupations, and (c) characterize performance components that are likely to be a function of a wide array of KSAOs. However, the taxonomy is now fairly dated, and the MDs currently are only linked to a few occupations. SMEs would need to be involved to update the taxonomy and populate a database with ratings of the importance or relevance of MDs to their occupations.
Using O*NET descriptors as performance components
The Army sponsored a project to evaluate the use of O*NET for describing Army jobs (Russell et al., 2008). They found that SMEs were able to use the O*NET descriptor sets reasonably well, although some military-specific tailoring would improve the relevance of O*NET work context and generalized work activities to the military. For example, descriptors having to do with deployment, such as working with little sleep and working overseas, could be added to the work context variables.
Other deductive occupation analysis projects have assessed the use of O*NET for a wide variety of other applications, such as for (a) matching veterans to civilian occupations likely to require qualities they obtained or applied in military jobs (Wenger et al., 2017), (b) identifying job conditions affecting service members’ health and performance (Gadermann et al., 2014), (c) assessing changes in future military workforce requirements by comparing ratings of future occupations on O*NET descriptors to extant O*NET data (Levy et al., 2001), and (d) evaluating ASVAB content domains (Waters et al., 2009). Overall, the results of these evaluations show the potential value of using an extant, well-established taxonomy of cross-occupation descriptors like the O*NET to describe and profile military occupations for one or more talent management applications.
Summary
The National Research Council (1999) concluded that the military needs an occupational analysis system that (a) is feasible to maintain, (b) allows for systematic comparisons across occupations within a service and ideally across services, and yet (c) still allows for sufficient service- or occupation-specific detail. Deductive methods are advantageous when many occupations must be compared because all occupations are described according to the same descriptors (behaviors, tasks, KSAs, etc.). Importantly, deductive methods facilitate use of job component and synthetic validation approaches (Hoffman et al., 2007; Johnson, 2007; Peterson et al., 2001) which can be cost-effective alternatives to traditional criterion-related validation of selection and classification measures. Inductive methods are most useful when one occupation is the focus and/or detailed information is needed about that occupation (such as for the development of training programs). Even so, it should be noted that detailed information can be created to supplement more generic cross-job descriptors. For example, the O*NET system identifies detailed, occupation-specific work activities and tasks as well as the cross-job descriptors of generalized work activities, abilities, and skills.
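The job component (synthetic) validation idea mentioned above can be sketched numerically: given each occupation’s importance weights on a common set of performance components and each predictor’s estimated validity for each component, an occupation-level validity estimate is synthesized as an importance-weighted combination, without running a new criterion study per occupation. The component names, weights, and validity values below are purely illustrative assumptions, not figures from any cited study.

```python
import numpy as np

# Hypothetical importance weights: occupation x performance component.
components = ["equipment maintenance", "information analysis", "customer service"]
occupations = {
    "Vehicle Mechanic":     np.array([0.8, 0.1, 0.1]),
    "Intelligence Analyst": np.array([0.1, 0.8, 0.1]),
}

# Hypothetical component-level validities for two predictor scales.
predictor_validity = {
    "mechanical knowledge": np.array([0.45, 0.10, 0.05]),
    "verbal reasoning":     np.array([0.10, 0.40, 0.20]),
}

# Synthesized occupation-level validity: importance-weighted combination.
for occ, weights in occupations.items():
    for pred, validity in predictor_validity.items():
        estimate = float(weights @ validity) / weights.sum()
        print(f"{occ:>20s} x {pred}: {estimate:.2f}")
```

The design choice that makes this cost-effective is exactly the one the deductive methods provide: because every occupation is profiled on the same components, the component-to-predictor links need to be established only once and can then be reused across all occupations.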
Using worker-related information to define occupation-specific performance components
Worker-related descriptors describe worker attributes that determine performance (e.g., knowledges, skills, abilities or aptitudes, personality-temperament). They can be developed and used inductively or deductively. That is, an existing worker-related taxonomy, such as the abilities and skills in O*NET, can be used deductively to describe occupations, or a new set of worker attributes can be created inductively and, in turn, used to describe all occupations. Worker-related descriptors that are most proximal or closely linked to performance can serve as useful surrogates or indicators of performance. They can be especially advantageous for describing future jobs or future changes to jobs with sufficient generalizability or for efficiently describing within-occupation differences (e.g., by role, by mission). Example methods that use worker-related information (or descriptors) include (a) competency modeling and (b) occupational (or work) interest taxonomies.
Using competency modeling
Competency modeling is usually thought of as an inductive occupation analysis approach intended to identify performance components that differentiate high and low performers (Schippmann et al., 2000). Although the term “competencies” can generically refer to any collection of behaviors or KSAOs needed for effective performance (see reviews in Campion et al., 2011; Pearlman, 2002; Schippmann et al., 2000), within the U.S. government the term has been used to reference “softer skills” such as flexibility, teamwork, and creative thinking (Rodriguez et al., 2002), rather than tasks and knowledge defined at a level of granularity that does not allow for direct comparison across jobs.
Neither the U.S. Office of Personnel Management (OPM) nor the individual military services have widely adopted the time-consuming and labor-intensive behavioral event interview methodology central to the “competency modeling” approaches advocated by McClelland (1998) and Spencer and Spencer (1993). Even so, the competency taxonomy used widely within the U.S. government for “top down” deductive job analysis of civilian positions (Rodriguez et al., 2002) ultimately includes many of the same, or similar, soft skill competencies that Spencer and Spencer (1993) identified as common across civilian jobs using the behavioral event interview approach, which empirically differentiates “top” performers from average ones.
A recent example is work by the Air Force to identify and define Foundational Competencies, the competencies most essential to Air Force jobs. The Air Force recently evaluated the relevance of many of the “soft skill” competencies previously identified by Spencer and Spencer (1993) for Air Force careers (Barelka et al., 2019; Barron & Rolwes, 2020a). These studies highlight the importance of soft skills (identical or closely related to those identified in Spencer & Spencer, 1993) in military jobs, underscoring the importance of extending beyond the traditional focus on narrowly defined job tasks, knowledge, and highly technical skills that make cross-job comparisons difficult.
Though the Air Force’s Foundational Competencies were identified based on their widespread importance for success across the Air Force (rather than for differentiating among career fields), there are occupational distinctions in the relative importance of each Foundational Competency for performance in specific career fields or career field clusters. An initial study of 35 enlisted career fields (Barron & Rolwes, 2020b) directly queried SMEs on the extent to which each Foundational Competency was viewed as “more or less important for your career field than in other enlisted career fields.” They found, for example, that competencies such as Strategic Thinking, Communication, and Analytical Thinking were rated most important for Intelligence career fields, Inspector General inspectors, and Financial Management personnel, while the competencies of Flexibility and Resilience were rated most important for enlisted aviation (e.g., Flight Engineer, In-Flight Refueling, Aircraft Loadmaster) and Special Investigations personnel.
Systematic matching of recruits or in-service members to military career fields by their perceived or assessed relative strengths on competencies like these has not historically been done. We suspect such an approach would yield quite different results than the traditional approach of matching to military jobs based on one’s relative scores on ASVAB subtests (i.e., within-person comparison of scores on subtests such as Mechanical Comprehension, General Science, or Word Knowledge).
The Air Force Research Laboratory (AFRL) also developed and employs a competency-based work analysis method, mission-essential competencies, specific to flying and command and control positions. AFRL’s method is a blended approach that combines defining mission-essential competencies and important knowledges, skills, and experiences, inductively, with the identification of important supporting competencies (e.g., decision-making, adaptability, situational awareness), deductively (Alliger et al., 2007). Mission-essential competencies describe work activities performed by individuals or teams that are required for successful mission completion (e.g., intelligence, surveillance, and reconnaissance missions). Mission-essential competencies are contextualized to one or more missions (or mission types) but are defined at a higher level of abstraction than tasks. Defining mission-essential competencies and important knowledges, skills, and experiences can be potentially time consuming and SME intensive. Nonetheless, the rating scales used in the surveys of incumbents (experts and non-experts) could have broader applicability for efficiently distinguishing the level of competency needed in any military job or occupation (e.g., “Basic,” “Intermediate,” or “Advanced”). The scales used to define basic, intermediate, or advanced levels of proficiency on each competency in generic terms have demonstrated appropriate inter-rater reliability across target populations and have shown expected relationships to job tenure (Alliger, Beard, Bennett, Symons, & Colegrove, 2013).
Summary. Existing soft skill taxonomies like those used by OPM and in the Air Force’s Foundational Competencies have been shown to be useful for differentiating occupations – even if the soft skill lists were initially developed based on their overall applicability service-wide. Approaches that differentiate occupations based on both the applicability of a given soft skill competency and the skill or mastery level needed in a specific occupation may be useful for person-job matching or other applications. One advantage of taxonomies like the Foundational Competencies is breadth of coverage. The competencies are related to a wide array of KSAOs and non-cognitive attributes. On the downside, broad, soft skill competencies that are not linked to job tasks require SMEs to make inferences about the relevance of the competency to their work (Schippmann et al., 2000).
Using occupational interest areas
Another method is to use occupational (or work) interest areas as surrogates for or to infer occupation-specific performance components. Occupational or work interests are linked to job performance and long-term career success (Nye et al., 2012, 2017; Van Iddekinge et al., 2011). In brief, people perform better and persist longer in occupations or jobs that match their interests because they are intrinsically motivated to perform the tasks comprising the job, invest effort in developing or acquiring the requisite knowledge and skills to be successful in a job, and self-select for challenging or stretch assignments and other opportunities to advance their career (Ingerick & Rumsey, 2014).
Occupational or work interests are expressed in terms of preferences for work activities and environments. With that in mind, the Navy, the Air Force, and the Army have developed work taxonomies to support the development of occupational interest measures. While the primary goal is to measure an individual’s interests, the descriptors comprising these taxonomies also represent a means for describing and profiling occupations on performance components.
For example, the Navy developed a taxonomy of work activities to serve as the foundation for occupational interests in its Job Opportunities in the Navy (JOIN) interest measure. Watson (2020) analyzed Navy occupation descriptions for words representing four domains: process (i.e., task verbs such as “maintain”), content (i.e., objects of the process such as “aircraft”), community (i.e., the group of occupations that share a function or process such as “aviation”), and work style (i.e., the type or context of the work, such as indoor versus outdoor, mental versus physical). The resulting taxonomy appears in Table 4. It has 41 items – 25 work activities (process-content pairs) organized into seven scales, eight communities, and eight work styles. Watson mapped all Navy occupations according to this taxonomy and used it as the basis for JOIN.
Table 4.
Job opportunities in the Navy taxonomy.
| Scale/Activities | Scale/Activities | Communities |
| --- | --- | --- |
| Hands-on | Electronics/electrical | Aviation |
| Direct aircraft | Maintain electrical equipment | Construction |
| Maintain facilities | Maintain electronic equipment | Health care |
| Maintain mechanical equipment | Operate electrical equipment | Intelligence |
| Make facilities | Operate electronic equipment | Support |
| Make mechanical equipment | Weapons | Special Warfare |
| Admin/Supply | Maintain security | Submarine |
| Maintain documents | Maintain weapons | Surface |
| Maintain supplies | Operate weapons | Work styles |
| Make documents | Emergency response | Indoor |
| Operate facilities | Direct emergency response | Outdoor |
| Serve customers | Respond to emergencies | Industrial |
| Analysis | Train people | Office |
| Analyze communications | Train people | Mental |
| Analyze data | | Physical |
| Analyze documents | | Independent |
| Make communications | | Team |
Note. Adapted from Job Opportunities in the Navy (JOIN) by S. E. Watson, 2020, Military Psychology, 32(1), 101–110.
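The kind of description mining Watson (2020) describes can be illustrated with a toy keyword tagger. The lexicons and the sample description below are hypothetical stand-ins, not the actual JOIN word lists or Navy occupation text:

```python
# Tag an occupation description with process verbs, content objects, and
# communities by simple keyword matching (hypothetical lexicons).
PROCESS = {"maintain", "operate", "analyze", "make", "direct"}
CONTENT = {"aircraft", "weapons", "documents", "electronic equipment"}
COMMUNITY = {"aviation", "surface", "submarine", "intelligence"}

def tag(description: str) -> dict:
    text = description.lower()
    return {
        "process":   sorted(w for w in PROCESS if w in text),
        "content":   sorted(w for w in CONTENT if w in text),
        "community": sorted(w for w in COMMUNITY if w in text),
    }

print(tag("Maintain and operate electronic equipment aboard aviation units."))
```

A production version would of course need stemming, phrase handling, and SME review of the resulting tags, but the sketch conveys how process-content pairs (e.g., "maintain" + "electronic equipment") can be extracted from occupation descriptions to build a work-activity taxonomy.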
The Air Force has also conducted targeted job analysis to develop scoring specifications for a vocational interest assessment specific to its career fields. Designed to help potential recruits identify which (of the over 130) Air Force enlisted specialties most closely match their personal interests, the Air Force Work Interest Navigator (AF-WIN) relies on career field SME ratings of the applicability of 62 “job markers” for the tool’s scoring. AF-WIN job markers were developed based on the same general taxonomy used for the Navy’s JOIN to incorporate each career field’s mission and organizational focus (“functional community”), as well as applicable job contexts and work activities (Johnson et al., 2020). The specific job markers were then adapted and revised based on the unique structure and needs of the Air Force; markers include work activities such as “Direct air and spacecraft” and “Operate weapons,” as well as more generic job context descriptors (e.g., Indoor or Outdoor, Hazardous or Non-Hazardous, Predictable or Unpredictable). We note that, while the current AF-WIN taxonomy includes a relatively small number of work activities, past Air Force work to establish more comprehensive taxonomies could help refine the tool for the future.2
AF-WIN is scored by multiplying the recruit’s level of interest in each job marker by the extent to which career field SMEs endorse a given job marker as applicable to their specialty.3 The sum across job markers then provides a fit score used to rank specialties by relative person-job match. The tool has demonstrated useful criterion-related validity. Incumbents whose current specialty is identified as a stronger fit based on the tool report greater levels of job satisfaction and re-enlistment intent than other incumbents (Johnson et al., 2020; Romay et al., 2019).
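To make the scoring logic concrete, it can be sketched in a few lines of Python. This is a minimal illustration of the interest-by-endorsement computation described above, not the operational AF-WIN implementation; the marker names, rating values, and specialty labels are hypothetical.

```python
# Sketch of AF-WIN-style fit scoring: a recruit's interest in each job
# marker is multiplied by the mean SME applicability rating for that
# marker in a specialty, summed into a fit score, and specialties are
# ranked by that score. All data below are illustrative only.

def fit_score(interest, endorsement):
    """Sum of interest x SME endorsement across the recruit's job markers."""
    return sum(interest[m] * endorsement.get(m, 0.0) for m in interest)

def rank_specialties(interest, specialty_profiles):
    """Rank specialties by descending fit with the recruit's interests."""
    scores = {name: fit_score(interest, profile)
              for name, profile in specialty_profiles.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical recruit interest ratings and SME endorsement profiles.
recruit = {"Operate weapons": 3, "Indoor": 1, "Direct air and spacecraft": 2}
profiles = {
    "Specialty A": {"Operate weapons": 0.9, "Indoor": 0.2},
    "Specialty B": {"Direct air and spacecraft": 0.8, "Indoor": 0.7},
}
ranked = rank_specialties(recruit, profiles)  # best-fitting specialty first
```

Summing the products across markers, rather than (say) correlating profiles, mirrors the additive scoring the tool's developers describe; a real implementation would also need to handle scale normalization across specialties.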
The Army has also formulated a work activity and job context taxonomy to support the development of an interest measure, the Adaptive Vocational Interests Diagnostic (AVID). Although the AVID’s measurement approach and scoring is different from the JOIN and AF-WIN, the Army is similarly using its taxonomy to describe and profile Army enlisted jobs for ultimate use in matching recruits to jobs (Nye et al., 2019).
Summary. The cross-occupation taxonomies of descriptors used for the Navy’s JOIN (Watson, 2020), the Air Force’s AF-WIN (Johnson et al., 2020), and the Army’s AVID (Nye et al., 2019) have proven useful for matching recruits to jobs based on occupational interests. Their primary advantages as performance components are that (a) the Navy, Air Force, and Army have profiled, or are profiling, their occupations on the taxonomic elements; and (b) they were designed to capture work in a way that differentiates occupations to facilitate person-occupation matching. If these taxonomies were sufficiently fleshed out, they might also be linked to other KSAs and so facilitate synthetic or job component validation. For example, if these taxonomies could be linked to the cognitive abilities measured by the ASVAB, they could help identify subtests for use in person-job matching or reveal gaps where the ASVAB does not currently assess a relevant ability or knowledge. On the downside, linking these work taxonomies to non-cognitive constructs may prove more difficult.
Using existing occupational-jobs data to describe or profile occupation-specific performance components
Making use of existing occupational-jobs data represents another potentially useful approach to describing and profiling occupation-specific performance components. Thanks to the increasing proliferation of data, this approach is not limited to specific sources or types of data. The source for these data can be military or nonmilitary, service- or non-service specific. Similarly, the type of data analyzed can be quantitative (e.g., importance ratings on one or more occupational- or job descriptors) or qualitative (e.g., text job descriptions). The following examples illustrate different methods for using existing occupational-jobs data to (a) identify occupation-specific performance components and (b) synthesize or transport occupational information from occupations (or jobs) with data to occupations (or jobs) without data.
Using existing data
One method for identifying occupation-specific performance components inductively with existing data, related to the preceding discussion of occupational (or work) interests, is to analyze existing job analysis data collected on worker-related descriptors (e.g., importance ratings on knowledge, skills, abilities, work styles, and work-vocational interests). An individual’s performance on the components comprising their occupation (or job) is a function of worker-related characteristics (J. P. Campbell et al., 1993). Individuals higher on the worker-related characteristics relevant to an occupation (or job) will perform better and persist longer in that occupation (or job) than individuals lower on those same characteristics. Accordingly, data on descriptors of these worker-related characteristics can be reflective of one or more performance components, particularly in combination. Cross-domain analyses of existing data on different worker-related descriptors (e.g., exploratory or confirmatory factor analyses) can be especially useful for identifying performance components.
For example, Putka and colleagues (Putka et al., 2020) conducted an iterative series of principal component and exploratory factor analyses of existing data (importance ratings) on multiple worker-related descriptors (knowledges, skills, abilities, work styles, and work-vocational interests) for a sample of 50 Army jobs, in research sponsored by the Army. A total of 22 performance components for describing and profiling Army jobs were identified from the cross-domain analyses of worker-related descriptor data. Putka and colleagues organized the 22 performance components into two categories. Selection-focused performance components were less occupation-specific and were important to most or all Army jobs (e.g., Adaptability, Dependability, Interpersonal), while Classification-focused components were more occupation-specific, important to some Army jobs and more useful for matching people to specific Army jobs than for selection into the service (e.g., Administrative-Clerical, Data and Information Handling, Equipment Repair & Maintenance, Weapons Operation).
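The flavor of such a cross-domain analysis can be sketched with a toy example: a jobs-by-descriptors matrix of importance ratings is correlated and the first principal component extracted by power iteration. This is only a minimal, single-component illustration under made-up data; Putka et al. (2020) used iterative principal component and exploratory factor analyses across far more descriptors and 50 Army jobs.

```python
# Toy cross-domain analysis: correlate descriptor importance ratings
# across jobs, then extract the leading principal component (descriptor
# loadings) via power iteration. Data and descriptor labels are invented.
import math

def correlation_matrix(X):
    """Pearson correlations between columns of a jobs-x-descriptors matrix."""
    n, p = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(p)]
    sds = [math.sqrt(sum((row[j] - means[j]) ** 2 for row in X))
           for j in range(p)]
    return [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in X)
             / (sds[i] * sds[j]) for j in range(p)] for i in range(p)]

def first_component(R, iters=200):
    """Leading eigenvector of a symmetric matrix R via power iteration."""
    v = [1.0] * len(R)
    for _ in range(iters):
        w = [sum(R[i][j] * v[j] for j in range(len(R))) for i in range(len(R))]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return v

# Illustrative importance ratings (rows = jobs; columns = hypothetical
# descriptors: equipment repair, mechanical knowledge, clerical skill).
ratings = [
    [5, 5, 1],
    [4, 5, 2],
    [1, 2, 5],
    [2, 1, 4],
]
loadings = first_component(correlation_matrix(ratings))
```

In this toy data the first two descriptors load together and opposite the third, suggesting a mechanical-vs.-clerical component; real analyses would retain multiple factors and rotate them before interpretation.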
A second method for using existing data to define occupation-specific performance components is to apply emerging data science techniques like natural language processing (NLP) to existing text data on occupations-jobs (e.g., job descriptions, task lists). In brief, NLP techniques, specifically natural language understanding (NLU), enable users to parse or extract, classify, and model important information (features) from structured or unstructured text data using one or more analytic methods (e.g., text classification, vector semantics, word embeddings, sequence labeling). All services have text descriptions of core career fields or occupational specialties, although the richness and currency of the job descriptions may vary. NLU techniques potentially hold promise for defining occupation-specific performance components for military jobs, efficiently and at scale, because they do not require manual grouping or sorting of related tasks or similar text data.
Current research sponsored by the Army illustrates an initial proof of concept. Putka et al. (2021) applied NLU to military and nonmilitary job descriptions to train models that could then be applied to Army job descriptions to profile the knowledges, skills, and behaviors important to Army jobs. NLU techniques could also potentially be applied to critical incident texts to similarly derive occupation-specific performance components or related work descriptors (e.g., work context; cf. Green, 2020).
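The general idea of scoring text job descriptions against candidate performance components can be sketched with a simple bag-of-words model. This is far simpler than the trained language models used in the research above: the component vocabularies, the sample description, and the unstemmed word matching are all illustrative assumptions.

```python
# Toy NLU sketch: represent a job description and candidate performance
# components as word-count vectors and score the description against each
# component by cosine similarity. Vocabularies and text are invented.
import math
import re
from collections import Counter

def vectorize(text):
    """Lowercase bag-of-words vector for a text snippet (no stemming)."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two word-count vectors."""
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical component vocabularies (keyword lists, not real taxonomies).
components = {
    "Equipment Repair & Maintenance":
        "repair maintain inspect troubleshoot equipment engines",
    "Administrative-Clerical":
        "prepare records correspondence schedule file documents",
}
description = ("Inspects, troubleshoots, and repairs aircraft engines "
               "and ground equipment.")

scores = {name: cosine(vectorize(description), vectorize(vocab))
          for name, vocab in components.items()}
best = max(scores, key=scores.get)  # most similar component
```

Even this crude sketch shows why NLU scales: no SME has to hand-sort tasks, and the same scoring runs unchanged over thousands of descriptions. Production systems would add stemming or embeddings so that, for example, "repairs" matches "repair."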
Synthesizing or transporting existing occupational information
Another method that makes use of existing occupational data is to transport or synthesize data from jobs with data to jobs without data. In brief, this method is analogous to validity transportability or synthetic validation (Johnson, 2007). The method typically involves mapping or linking jobs without occupational data to corresponding jobs with data on performance components (e.g., work activities) or other descriptors useful for inferring performance components. Linking occupations-jobs can be done rationally, empirically, or by some combination of the two (hybrid rational-empirical). The chief advantage to this method is that military occupations can be linked to corresponding civilian jobs (e.g., using an existing crosswalk) with data from existing occupational data sources (e.g., O*NET).4
For example, researchers for the Army Study to Assess Risk and Resilience in Servicemembers (STARRS) extended an existing occupational crosswalk linking Army jobs to civilian occupations in O*NET (the O*NET Military Crosswalk), constructing composite linkages (one-to-many) for Army jobs with no clear link to a single civilian occupation (Gadermann et al., 2014). The researchers then used the extended Army-civilian occupational crosswalk to transport O*NET occupational-level data to profile Army jobs on multiple work- (e.g., work activities, tasks, work context) and worker-related descriptors (e.g., knowledges, skills, abilities, education). More recent research made similar occupational linkages (military jobs without data to civilian jobs with data, military jobs without data to military jobs with data) and then used the linkages to transport existing occupational data from multiple sources (including O*NET) to profile Army jobs (Putka et al., 2021). One could also apply this method to link occupations and use the transported data on work-related descriptors (e.g., work activities, tasks) to directly define occupation-specific performance components for one or more military occupations or jobs without data. Similarly, one could use the transported data on worker-related descriptors (e.g., knowledges, skills, abilities) to infer occupation-specific performance components for military occupations-jobs, in line with the work of Putka et al. (2021).
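The transport logic itself is straightforward and can be sketched as follows. The sketch assumes a simple dictionary crosswalk and averages descriptor scores for one-to-many (composite) linkages; the job names, occupation codes, and scores are illustrative, not actual O*NET data.

```python
# Sketch of crosswalk-based data transport: military jobs are linked to
# civilian occupations (one-to-one or one-to-many), and descriptor scores
# from the civilian source are carried over, with one-to-many linkages
# averaged into a composite profile. All data below are illustrative.

def transport_profiles(crosswalk, civilian_data):
    """Build military job profiles from linked civilian occupation data.

    crosswalk: military job -> list of linked civilian occupation codes
    civilian_data: civilian occupation code -> {descriptor: score}
    """
    profiles = {}
    for job, codes in crosswalk.items():
        linked = [civilian_data[c] for c in codes if c in civilian_data]
        if not linked:
            continue  # job stays unprofiled when no linked data exists
        descriptors = set().union(*(d.keys() for d in linked))
        profiles[job] = {d: sum(p.get(d, 0.0) for p in linked) / len(linked)
                         for d in descriptors}
    return profiles

# Hypothetical crosswalk: one-to-one and one-to-many (composite) linkages.
crosswalk = {"Army job X": ["11-1011"], "Army job Y": ["17-2051", "47-4011"]}
civilian = {
    "11-1011": {"Leadership": 4.5},
    "17-2051": {"Mathematics": 4.0, "Leadership": 3.0},
    "47-4011": {"Mathematics": 2.0, "Leadership": 1.0},
}
profiles = transport_profiles(crosswalk, civilian)
```

A production version would weight the composite by the strength or relevance of each linkage rather than averaging uniformly, but the core operation, join then aggregate, is the same.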
Summary
Most occupational analysis methods make significant demands on SME time and other resources. These demands make collecting occupational information or data challenging for the services (and other organizations). Methods that make use of existing data on military occupations or related civilian occupations hold promise for overcoming these challenges, as well as providing linkage(s) to other occupational-relevant data. Further research is needed to assess the validity of these methods. For example, one could compare the results from synthetic or transportability methods to the results from occupational analysis with job incumbents.
One option would be to make use of the Department of Defense’s Careers in the Military (CITM) website to obtain job information to use in creating job components. The website, administered by the ASVAB Career Exploration Program, aggregates occupational information across the services and provides a cross-occupation summary of common tasks. For instance, the website’s page for the occupation “nurse” provides links to relevant occupation information from each service as well as a cross-service list of nurse tasks. The CITM occupations are also cross-walked to O*NET. The CITM and O*NET databases provide a host of links to other information about KSAs, salary information, emerging occupations, and so on, making these databases highly advantageous for purposes beyond describing or profiling occupation-specific performance components.
Discussion
Occupation-specific performance components are work behaviors that are unique to occupations or clusters of occupations. Defining and collecting data on occupation-specific performance components is foundational to U.S. military talent management, guiding and informing functions from initial job qualification to occupational training to promotion. However, the U.S. military faces many challenges in defining and collecting data on occupation-specific performance components, chief among them the resources needed to do so (e.g., people, time, technology). Consequently, the military has developed, experimented with, or implemented multiple and diverse methods, inductive and deductive, to define and capture data on occupation-specific performance components to support different talent management applications (e.g., occupational-jobs training, performance measurement development, criterion-related validation, person-job matching).
The current paper focused on defining occupation-specific components to support performance measurement development, criterion-related validation studies, and other applications related to improving person-job matching. From our experience, supporting one or more of these applications requires occupational-job analytic methods that (a) can be embedded within an existing process, rather than requiring a new, additional process to implement; (b) make use of existing data or enable the efficient use of any newly collected data, within or external to the military; or (c) can scale, across occupations-jobs, across levels within occupations-jobs, across the different services, and so on. However, not all occupational-job analytic methods meet these requirements.
Recommendations
Recommendation 1. Make use of deductive methods using existing work- or worker-related descriptors
Deductive occupational-job analytic methods are advantageous because they enable the efficient collection of occupational data and are scalable. Ideally, the U.S. military could develop a system of cross-service, cross-occupation performance component descriptors that could be used to describe and profile military occupations, deductively (Campbell et al., 2007; National Research Council, 1999). The MD work summarized earlier (Waters et al., 2009) represents an illustration of such a system and could serve as a useful starting point for one that could be implemented, operationally. Alternatively, the military could compile and synthesize relevant existing work- or worker-related descriptor information across the different services, similar to the work reflected in the CITM website.
As illustrated in the current paper, the military has additional options if a cross-service system of cross-occupation descriptors is not feasible. First, the services could make use of O*NET descriptors or other nonmilitary-specific systems relevant to describing and profiling military occupations (e.g., the National Initiative for Cybersecurity Education [NICE] Cybersecurity Workforce Framework, https://www.nist.gov/itl/applied-cybersecurity/nice/nice-framework-resource-center). Second, they could make use of other taxonomies, such as competencies, to describe and profile occupations (or jobs) (e.g., the Air Force’s work on competencies). Third, they could make use of existing occupation-specific descriptors that are embedded in existing processes, like the occupational interest measures in use for person-job matching.
Recommendation 2. Make use of methods, inductive or deductive, that capitalize on existing data
Occupation or job analytic methods that make use of existing occupational data are advantageous because of their efficient use of data and scalability. The U.S. military services have collected a lot of data on occupation-specific performance components or data relevant to performance components over the years and continue to do so as part of one or more existing processes. The same holds true for sources external to the U.S. military, several sources of which are publicly accessible (e.g., O*NET). Indeed, the digital age has made aggregating and linking occupational (or jobs) databases across military and civilian data sources feasible, as illustrated by the work of Putka et al. (2021) and the Army’s STARRS project (Gadermann et al., 2014).
The military has multiple options for using existing occupational data to generate or identify performance components, inductively, or to profile military occupations deductively, from an existing set of performance components. As shown in this paper, these options include but are not limited to (a) analysis or re-analysis of existing occupational (or jobs) data on worker-related descriptors (e.g., KSAs; Putka et al., 2020), (b) using NLP to analyze existing text data on occupations (e.g., job descriptions or task lists; Putka et al., 2021), and (c) synthesizing or transporting existing occupational (or jobs) data to occupations without data (e.g., Gadermann et al., 2014). Overall, accessing existing data (or preparing it for analysis) poses the biggest challenges to implementing one or more of these methods. However, continued advances in data science and machine learning (ML) techniques, such as NLU, make their implementation more viable (e.g., using synthetic data generation to resolve security or privacy issues; using pre-trained language models to complete missing text).
New and emerging methods
New and emerging technologies also hold promise for defining and collecting data on occupation-specific performance components. Many of these technologies are advantageous because they enable the efficient collection of data and can be embedded within existing processes or natural work settings. A further advantage of these technologies is their potential to capture new or richer types of data that are not as reliably or accurately collected using existing data collection methodologies.
One example is the use of wearable biosensors to collect data in situ on physical (or ergonomic) demands of jobs without conducting resource-intensive job observations or using self-report task analysis surveys that may not capture demands as accurately. Similarly, wearable social sensors (e.g., social badges) enable the collection of dynamic social-interpersonal data relevant to occupations (or jobs) not possible using standard task analysis surveys (e.g., teamwork, leadership).
Another example is the use of high-fidelity military simulators (e.g., Virtual or Augmented Reality (VR/AR)). High-fidelity simulators have become common in the military for training individuals or units, collectively. Beyond their primary training application, data collected in or from these simulators could describe and profile occupations in new or richer ways than existing occupational-job analytic methods (e.g., identifying and analyzing performance components important to specific situations). High-fidelity military simulators could also be used to conduct future-oriented work-job analyses to identify performance components important to future jobs.
Conclusion
Defining and collecting data on occupation-specific performance components is essential to matching the right people to the right occupation (or job). Missing or limited data on these performance components means less effective matches. Mismatching people to occupations negatively impacts individual servicemembers (e.g., decreased performance, decreased professional and personal development) and the organizations they serve (e.g., decreased operational readiness and mission success, increased turnover of talented servicemembers). Accordingly, the U.S. military needs occupational-job analytic methods that make use of (a) existing deductive work- or worker-related methods with the potential to scale across occupations, across levels within occupations, or across the services; (b) existing occupational data (or information), internal or external to the military; or (c) new technologies and tools that can be embedded within an existing process to enable the efficient collection of data as work is performed.
Notes
1. A previous paper in this issue described a taxonomy of enlisted job and training performance components that are common across enlisted military occupations (Russell et al., 2022). The current paper focuses on defining performance components that are specific to occupations or clusters of occupations that share a common function (e.g., cyber occupations across services).
2. See Cunningham et al. (1990) for a 268-item inventory that includes differentiating physical and interpersonal activities. See Earles et al. (1996) regarding a job classification taxonomy using the action verbs included on AETC Occupational Analysis task lists (e.g., “Sterilize,” “Load,” “Transport”).
3. SME endorsement is based on mean job marker applicability ratings across SMEs (Romay et al., 2019).
4. Another potentially useful resource is Careers in the Military (https://www.careersinthemilitary.com/), developed and maintained for the ASVAB Career Exploration Program (CEP) by the Office of People Analytics (OPA), U.S. Department of Defense.
Data availability statement
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
Disclosure statement
No potential conflict of interest was reported by the author(s).
References
- Alliger, G. M., Beard, R., Bennett, W., Symons, S., & Colegrove, C. M. (2013). A psychometric examination of Mission Essential Competency (MEC) measures used in Air Force distributed missions operations training needs analysis. Military Psychology, 25(3), 218–233. https://doi.org/10.1037/h0094964
- Alliger, G. M., Bennett, W., Colegrove, C. M., & Garrity, M. (2007). Understanding mission essential competencies as a work analysis methodology (AFRL-HE-AZ-TR-2007-0034). Air Force Research Laboratory. https://apps.dtic.mil/dtic/tr/fulltext/u2/a474546.pdf
- Barelka, A., Barron, L. G., Kulpa, P., Hernandez, S., & Coggins, M. (2019). Development and validation of air force foundational competency model. Air Education & Training Command. https://apps.dtic.mil/sti/citations/AD1083781
- Barron, L. G., & Rolwes, P. (2020a). Development of air force foundational competency assessments. Air Education & Training Command. https://apps.dtic.mil/sti/pdfs/AD1120209.pdf
- Barron, L. G., & Rolwes, P. (2020b). Unpublished data.
- Brannick, M. T., Levine, E. L., & Morgeson, F. P. (2007). Job and work analysis: Methods, research, and applications for human resource management (2nd ed.). Sage Publications, Inc.
- Campbell, J. C., McCloy, R. A., McPhail, S. M., Pearlman, K., Peterson, N. G., Rounds, J., & Ingerick, M. (2007). U.S. Army classification research panel (Study Report 2007-05). U.S. Army Research Institute for the Behavioral and Social Sciences.
- Campbell, J. P., McCloy, R. A., Oppler, S. H., & Sager, C. E. (1993). A theory of performance. In N. Schmitt & W. C. Borman (Eds.), Personnel selection in organizations (pp. 35–70). Jossey-Bass Publishers.
- Campion, M. A., Fink, A. A., Ruggeberg, B. J., Carr, L., Phillips, G. M., & Odman, R. B. (2011). Doing competencies well: Best practices in competency modeling. Personnel Psychology, 64(1), 225–262. https://doi.org/10.1111/j.1744-6570.2010.01207.x
- Cranny, C. J., & Doherty, M. (1988). Importance ratings in job analysis: Note on the misinterpretation of factor analyses. Journal of Applied Psychology, 73(2), 320–322. https://doi.org/10.1037/0021-9010.73.2.320
- Cunningham, J. W., Wimpee, W. E., & Ballentine, R. D. (1990). Some general dimensions of work among U.S. Air Force enlisted occupations. Military Psychology, 2(1), 33–45. https://doi.org/10.1207/s15327876mp0201_3
- Earles, J. A., Driskill, W. E., & Dittmar, M. J. (1996). Methodology for identifying abilities for job classification: An application of job analysis. Military Psychology, 8(3), 179–193. https://doi.org/10.1207/s15327876mp0803_4
- Flanagan, J. C. (1954). The critical incident technique. Psychological Bulletin, 51(4), 327–358. https://doi.org/10.1037/h0061470
- Gadermann, A., Heeringa, S., Stein, M., Ursano, R., Colpe, L., Fullerton, C., Gilman, S., Gruber, M. J., Nock, M. K., Rosellini, A. J., Sampson, N. A., Schoenbaum, M., Zaslavsky, A. M., & Kessler, R. C. (2014). Classifying U.S. Army military occupational specialties using the occupational information network. Military Medicine, 179(7), 752–761. https://doi.org/10.7205/MILMED-D-13-00446
- Geimer, J. L., Shaw, M. N., & Sellman, W. S. (2007). Application of military occupational analysis information to the development of ASVAB specifications (FR-07-48). Human Resources Research Organization.
- Green, J. P. (2020). A taxonomy of leadership situations: Development, validation, and implications for the science and practice of leadership [Unpublished doctoral dissertation]. George Mason University.
- Harvey, R. J. (1991). Job analysis. In M. D. Dunnette & L. M. Hough (Eds.), Handbook of industrial and organizational psychology (2nd ed., pp. 71–163). Consulting Psychologists Press.
- Headquarters, U.S. Marine Corps. (2012, March). Marine Corps order 1200.13G.
- Hoffman, C. C., Rashkovsky, B., & D’Egidio, E. (2007). Job component validity: Background, current research, and applications. In S. M. McPhail (Ed.), Alternative validation strategies: Developing new and leveraging existing validity evidence (pp. 82–121). Jossey-Bass.
- Ingerick, M., & Rumsey, M. G. (2014). Taking the measure of work interests: Past, present, and future. Military Psychology, 26(3), 165–181. https://doi.org/10.1037/mil0000045
- Johnson, J. W. (2000). Factor analysis of importance ratings in job analysis: Note on the misinterpretation of Cranny and Doherty (1988). Organizational Research Methods, 3(3), 267–284. https://doi.org/10.1177/109442810033004
- Johnson, J. W. (2007). Synthetic validity: A technique of use (finally). In S. M. McPhail (Ed.), Alternative validation strategies: Developing new and leveraging existing validity evidence (pp. 122–158). Jossey-Bass.
- Johnson, J. F., Romay, S., & Barron, L. G. (2020). Air Force Work Interest Navigator (AF-WIN) to improve person-job match: Development, validation, and initial implementation. Military Psychology, 32(1), 111–126. https://doi.org/10.1080/08995605.2019.1652483
- Levy, D. G., Thie, H. J., Robbert, A. A., Naftel, S., Cannon, C., Ehrenberg, R., & Gershwen, M. (2001). Characterizing the future defense workforce. RAND.
- McClelland, D. C. (1998). Identifying competencies with behavioral event interviews. Psychological Science, 9(5), 331–339. https://doi.org/10.1111/1467-9280.00065
- McCormick, E. J. (1976). Job and task analysis. In M. D. Dunnette (Ed.), Handbook of industrial and organizational psychology (pp. 651–696). John Wiley & Sons.
- National Academies of Sciences, Engineering, and Medicine. (2020). Strengthening U.S. Air Force human capital management: A flight plan for 2020–2030. The National Academies Press. https://www.nap.edu/catalog/25828/strengthening-us-air-force-human-capital-management-a-flight-plan
- National Research Council. (1999). The changing nature of work: Implications for occupational analysis. National Academy of Sciences.
- Naval Education and Training Command. (2013). Job duty task analysis management manual (NAVEDTRA 137A).
- Navy Manpower Analysis Center. (2021). Manual of Navy enlisted manpower and personnel classification and occupational standards: Volume I Occupational standards (NAVPERS 18068F).
- Nye, C. D., Rounds, J., Kirkendall, C. D., Drasgow, F., Chernyshenko, O. S., & Stark, S. (2019). Adaptive vocational interest diagnostic: Development and initial validation (Technical Report 1378). U.S. Army Research Institute for the Behavioral and Social Sciences.
- Nye, C. D., Su, R., Rounds, J., & Drasgow, F. (2012). Vocational interests and performance: A quantitative summary of over 60 years of research. Perspectives on Psychological Science, 7(4), 384–403. https://doi.org/10.1177/1745691612449021
- Nye, C. D., Su, R., Rounds, J., & Drasgow, F. (2017). Interest congruence and performance: Revisiting recent meta-analytic findings. Journal of Vocational Behavior, 98, 138–151. https://doi.org/10.1016/j.jvb.2016.11.002
- Pearlman, K. (2002, July). Competency modeling: Mirror into the 21st century workplace – Or just smoke? [Paper presentation]. 26th Annual International Public Management Association-Human Resources Assessment Council (IPMAAC) Conference, New Orleans, LA.
- Peterson, N. G., & Jeanneret, P. R. (2007). Job analysis: Overview and description of deductive methods. In D. L. Whetzel & G. R. Wheaton (Eds.), Applied measurement: Industrial psychology in human resources management (pp. 13–50). Erlbaum.
- Peterson, N. G., Mumford, M. D., Borman, W. C., Jeanneret, P. R., & Fleishman, E. A. (1999). An occupational information system for the 21st century: The development of O*NET. American Psychological Association.
- Peterson, N. G., Wise, L. L., Arabian, J., & Hoffman, R. G. (2001). Synthetic validation and validity generalization: When empirical validation is not possible. In J. P. Campbell & D. J. Knapp (Eds.), Exploring the limits in personnel selection and classification (pp. 411–452). Erlbaum.
- Putka, D. J., Ingerick, M., Ness, A., & O’Brien, E. (Eds.). (2020). Improving Soldier MOS fit using cognitive, non-cognitive, and physical measures [Technical report]. U.S. Army Research Institute for the Behavioral and Social Sciences.
- Putka, D. J., Ingerick, M. J., Yu, M., Wiley, C. R. H., & Kim, K. (2021). Improving Army officer job matching with improved job profiling [Technical report]. U.S. Army Research Institute for the Behavioral and Social Sciences.
- Rodriguez, D., Patel, R., Bright, A., Gregory, D., & Gowing, M. K. (2002). Developing competency models to promote integrated human resources practices. Human Resource Management, 41(3), 309–324. https://doi.org/10.1002/hrm.10043
- Romay, S., Johnson, J. F., Rose, M. R., & Barron, L. G. (2019, April). Operationalizing occupational “fit” for optimal scoring of vocational interest assessments [Poster presentation]. Society for Industrial and Organizational Psychology Annual Conference, National Harbor, MD.
- Russell, T. L., Allen, M., Ford, L., Carretta, T., & Kirkendall, C. (2022). Development of a performance taxonomy for entry-level military occupations. Military Psychology. https://doi.org/10.1080/08995605.2022.2050163
- Russell, T. L., Sinclair, A., Erdheim, J., Ingerick, M., Owens, K., Peterson, N., & Pearlman, K. (2008). Evaluating the O*NET occupational analysis system for Army competency development (Technical Report 1237). U.S. Army Research Institute for the Behavioral and Social Sciences.
- Schippmann, J. S., Ash, R. A., Battista, M., Carr, L., Eyde, L. D., Hesketh, B., & Sanchez, I. (2000). The practice of competency modeling. Personnel Psychology, 53(3), 703–740. https://doi.org/10.1111/j.1744-6570.2000.tb00220.x
- Smith, P. C., & Kendall, L. M. (1963). Retranslation of expectations: An approach to the construction of unambiguous anchors for rating scales. Journal of Applied Psychology, 47(2), 149–155. https://doi.org/10.1037/h0047060
- Spencer, L. M., & Spencer, S. M. (1993). Competence at work: Models for superior performance. Wiley.
- United States Air Force. (2013, March). United States Air Force occupational survey report: Surgical Services AFSC 4N1X1/B/C/D (OSSN: 2913). Occupational Analysis Program, Air Education and Training Command. (Restricted access.) http://oa.aetc.af.mil/Enlisted_OARs_Index.html
- United States Air Force. (2014, March). Air Force manual 36-2647: Institutional competency development and management (OPR: AF/A1DI; certified by AF/A1, Robert Corsi). Rescinded publication.
- United States Air Force. (2018, February). United States Air Force occupational survey report: Dental assistant and hygienist AFSC 4Y0X1/H (OSSN: 3085). Occupational Analysis Program, Air Education and Training Command. (Restricted access.) http://oa.aetc.af.mil/Enlisted_OARs_Index.html
- United States Air Force. (2019a, April). Air Force handbook 36-2647: Competency modeling (OPR: AETC/A3K; certified by SAF/MR, Jeffrey R. Mayo). (Restricted access.) http://www.e-publishing.af.mil
- United States Air Force. (2019b, May). Air Force manual 36-2664: Personnel assessment program (OPR: AF/A1P; certified by SAF/MR, Jeffrey R. Mayo). (Restricted access.) http://www.e-publishing.af.mil
- United States Coast Guard. (2018). Standard operating procedures (SOP) for the Coast Guard’s training system: Volume 2, Analysis.
- Van Iddekinge, C. H., Roth, P. L., Putka, D. J., & Lanivich, S. E. (2011). Are you interested? A meta-analysis of relations between vocational interests and employee performance and turnover. Journal of Applied Psychology, 96(6), 1167. https://doi.org/10.1037/a0024343
- Wasko, L., Owens, K. S., Campbell, R., & Russell, T. (2012). Development of the combat/deployment performance rating scales. In D. J. Knapp, K. S. Owens, & M. T. Allen (Eds.), Validating future force performance measures (Army Class): In-unit performance longitudinal validation (Technical Report 1314, pp. A-1–A-11). U.S. Army Research Institute for the Behavioral and Social Sciences.
- Waters, S. D., Shaw, M. N., Russell, T. L., Allen, M. T., Sellman, W. S., & Geimer, J. L. (2009). Development of a methodology for linking ASVAB content to military job information and training curricula (FR-09-64). Human Resources Research Organization.
- Watson, S. E. (2020). Job Opportunities in the Navy (JOIN). Military Psychology, 32(1), 101–110. https://doi.org/10.1080/08995605.2019.1652485
- Wenger, J. B., Pint, E. M., Piquado, T., Shanley, M. G., Beleche, T., Bradley, M. A., Welch, J., Werber, L., Yoon, C., Duckworth, E. J., & Curtis, N. H. (2017). Helping soldiers leverage Army knowledge, skills, and abilities in civilian jobs. RAND Corporation.