Abstract
Objectives
The goal of this article is to identify some of the major trends and findings in expertise research and their connections to human factors.
Background
Progress in the study of superior human performance has come from improved methods of measuring expertise and the development of better tools for revealing the mechanisms that support expert performance, such as protocol analysis and eye tracking.
Methods
We review some of the challenges of capturing superior human performance in the laboratory and the means by which the expert performance approach may overcome such challenges. We then discuss applications of the expert performance approach to a handful of domains that have long been of interest to human factors researchers.
Results
Experts depend heavily on domain-specific knowledge for superior performance, and such knowledge enables the expert to anticipate and prepare for future actions more efficiently. Training programs designed to focus learners’ attention on task-related knowledge and skills critical to expert performance have shown promise in facilitating skill acquisition among nonexperts and in reducing errors by experts on representative tasks.
Conclusions
Although significant challenges remain, there is encouraging progress in domains such as sports, aviation, and medicine in understanding some of the mechanisms underlying human expertise and in structuring training and tools to improve skilled performance.
Applications
Knowledge engineering techniques can capture expert knowledge and preserve it for organizations and for the development of expert systems. Understanding the mechanisms that underlie expert performance may provide insights into structuring better training programs for improving skill and into designing systems to support professional expertise.
Experts are sometimes defined as outliers from the general population: those more than two standard deviations above the mean, showing consistently superior performance on representative tasks from their domain (Ericsson & Charness, 1994). Given that guidelines for design typically try to encompass the middle 95% of the population, it seems strange for the field of human factors to be concerned with expertise. However, experts may contribute disproportionately to the welfare of their societies (Charness & Krampe, 2008). Consider an expert fighter pilot navigating a $100 million aircraft trying to prevent an opponent from destroying a $5 billion aircraft carrier. A design change that provides even a minor advantage to this skilled pilot could prove highly cost-effective.
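The 2-SD criterion can be made concrete with a little arithmetic. Assuming performance is normally distributed (an assumption for illustration, not a claim from the article), the sketch below shows why an expert defined this way falls outside the middle 95% of the population that design guidelines typically target:

```python
from math import erf, sqrt

def normal_sf(z: float) -> float:
    """P(Z > z) for a standard normal variable."""
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

# Fraction of a normally distributed population more than 2 SD above the
# mean: roughly 2.3%, the "outlier" region where experts are said to live.
experts = normal_sf(2.0)

# Fraction inside the central band (about +/-1.96 SD) that design
# guidelines aiming for the middle 95% implicitly target.
middle_95 = 1.0 - 2.0 * normal_sf(1.96)

print(f"above +2 SD: {experts:.4f}")   # ~0.0228
print(f"middle band: {middle_95:.4f}")  # ~0.9500
```

Under these assumptions, only about 1 person in 44 clears the 2-SD bar, which is precisely why designing for experts sits uneasily beside designing for the middle of the population.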
Some applied issues addressed by contemporary expert performance researchers were raised by the progenitors of human factors. Fitts (1947) attempted to capture pilots’ eye movements during landings and to collect retrospective reports of critical aviation incidents, and Gagne (1954) noted some critical hazards of using poor criterion measures in the evaluation of simulation fidelity.
Common to human factors and expertise research is the admonition to “know thy user.” Progress in expertise research (see Ericsson, Charness, Hoffman, & Feltovich, 2006) has resulted mainly from improved methods for measuring expertise and the development of tools that reveal the mechanisms that support expert performance.
There are many approaches to the study of expertise (for an overview, see Ericsson et al., 2006; Salthouse, 1990). We restrict ourselves here to an influential one known as the expert performance approach. Its general premise is that any claim regarding expertise requires empirical observations of superior, reproducible task performance. (Relying on someone’s title or years of experience can be inadequate.) Thus, modern researchers begin by constructing representative tasks on which superior performance can be directly observed. In some cases, the representative task can be sampled directly from real-world situations (e.g., best move selection from a high-level, unfamiliar chess position). However, alterations may be necessary to re-create the task in a laboratory setting by constructing simulations or scenarios depicting realistic critical incidents (Ericsson & Williams, 2007).
Once superior performance has been captured under controlled conditions, experiments can be designed to identify its underpinnings. Process-tracing techniques such as the elicitation of think-aloud reports (Ericsson, 2006) or the collection of eye movement data (e.g., Reingold, Charness, Pomplun, & Stampe, 2001) can reveal the mechanisms supporting expertise.
An important caveat for the expert performance approach – and, indeed, for any approach that uses research designs where people are not randomly assigned to individual difference levels (e.g., age, gender, education level) – is that studies are quasi-experimental and hence cannot unambiguously determine causation for mechanisms identified as supporting expert performance. Ultimately, if the expert performance approach has validity, it should be demonstrable through the development of skill-sensitive training procedures that bring novices up to high levels of performance more quickly. Some successful applications can be seen in training apprentices (Kalyuga, Chandler, & Sweller, 1998) and training vehicle threat detection (Kirlik, Walker, Fisk, & Nagel, 1996). Also, by identifying the mechanisms of expert performance, such research may pinpoint ways to design better tools to support the expert. Examples include information search tasks (Hollands & Merikle, 1987), the selection of input devices (Sutter & Ziefle, 2005), and the use of sound to improve patient monitoring by anesthesiologists (Watson & Sanderson, 2004).
A classic finding from the expertise area, dating back to the earliest work on telegraphy (Bryan & Harter, 1899) and through work on chess (Chase & Simon, 1973), is that the highly efficient performance of the expert depends on acquiring domain-specific knowledge rather than on a general ability advantage. Evidence for this proposition is the skill-by-task interaction observed in many domains, usually with memory and perception tasks. Experts show far superior perception and recall performance on briefly presented domain-related materials (e.g., structured chess positions) but differ minimally or not at all from novices in performance on recall of domain-unrelated materials (e.g., quasi-random chess positions). This perceptual and knowledge advantage enables the expert to anticipate and prepare for future actions in sports (Williams, Ward, Knowles, & Smeeton, 2002), games (Jastrzembski, Charness, & Vasyukova, 2006), piloting aircraft (Sohn & Doane, 2004), and optimizing industrial production processes (Prietula, Feltovich, & Marchak, 2000).
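The skill-by-task interaction described above amounts to a difference of differences. The sketch below uses hypothetical recall scores, invented only to mirror the qualitative chess pattern (large expert advantage on structured positions, near-zero advantage on quasi-random ones); the numbers are not data from any cited study:

```python
# Hypothetical pieces-recalled scores after a brief presentation.
# The values are illustrative, not taken from the literature.
recall = {
    ("expert", "structured"): 16,
    ("expert", "random"): 4,
    ("novice", "structured"): 4,
    ("novice", "random"): 3,
}

# Expert advantage on each material type.
advantage_structured = (recall[("expert", "structured")]
                        - recall[("novice", "structured")])
advantage_random = (recall[("expert", "random")]
                    - recall[("novice", "random")])

# The interaction term: how much larger the expert advantage is on
# domain-structured material than on random material.
interaction = advantage_structured - advantage_random

print(advantage_structured, advantage_random, interaction)  # 12 1 11
```

A large positive interaction is the signature of domain-specific knowledge: if experts enjoyed a general memory or perceptual advantage, the two advantage terms would be roughly equal and the interaction would be near zero.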
Given that domain knowledge is regarded as critical to expert performance, knowledge engineering techniques have been developed to mine the knowledge of experts (Hoffman & Lintern, 2006; Hoffman, Crandall, & Shadbolt, 1998), with the aim of preserving it for organizations when experts depart and to create computer-based expert systems (e.g., Shortliffe, 1976).
Researchers employing the expert performance approach have reported a number of consistent findings that have implications for applied human performance research. Foremost among the findings is that many experts report extended training and/or apprenticeship (a decade or longer) prior to their attainment of superior performance (Ericsson & Lehmann, 1996). One explanatory mechanism proposed to account for the development of skill during this period is the notion of deliberate practice – challenging activities designed to target and correct specific weaknesses in performance skills (Ericsson, Krampe, & Tesch-Römer, 1993). For instance, skilled musicians frequently practice technical études focusing on particular dimensions of performance such as articulation or dynamics (Gruson, 1988). In domains that are less motor skill oriented, such as chess, players may improve their tactical skills by studying experts’ previous games, following one move at a time while attempting to predict the subsequent moves (Charness, Tuffiash, Krampe, Reingold, & Vasyukova, 2005).
It is often difficult to differentiate deliberate practice activities from other forms of practice, so in some applied domains, experiences linked with deliberate practice are used as surrogates. For instance, flying hours on a given military aircraft (vs. hours on all aircraft) were taken as a proxy for deliberate practice when examining how effectively pilots coped with mechanical failures in flight (McKinney & Davis, 2003).
The consequences of extended engagement in deliberate practice are quantitative and qualitative changes in cognitive representations of task-relevant knowledge. For instance, expert soccer goalkeepers learn to attend to particular postural characteristics and body angles to anticipate the direction of an oncoming kick (Williams et al., 2002). In other motor-intensive performance domains, such as piano playing and typing, experts demonstrate a greater eye-hand span that enables them to program overlapping keystrokes (Furneaux & Land, 1999; Salthouse, 1986).
In predominantly cognitive domains, such as competitive memory performance, memorists learn to chunk incoming stimuli into meaningful units and employ hierarchical retrieval mechanisms during test performance (Chase & Ericsson, 1982; Ericsson, Delaney, Weaver, & Mahadevan, 2004). Experts in other domains with high problem-solving and reasoning demands, such as physics and medicine, are more likely to identify the essential principles underlying particular cases than domain novices, whose representations tend to gravitate toward surface features. The superior problem representation of experts permits more effective search strategies, such as forward search through the problem space (Chi, Feltovich, & Glaser, 1981; de Groot, 1978; Patel & Groen, 1986). We now examine in more detail a few representative domains characterized by robust measurement of expertise and substantial progress in identifying underlying mechanisms.
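The chunking mechanism can be sketched in miniature. The digit string and the fixed three-digit grouping below are invented for illustration (skilled memorists recode groups into meaningful units such as running times, which a toy example cannot capture), but the arithmetic shows why chunking shrinks the retrieval load:

```python
# Toy illustration of chunking: a novice stores the string digit by digit,
# while a memorist recodes it into fixed-size meaningful groups. The digit
# string and chunk size are arbitrary choices for this sketch.
digits = "357194284619"

def chunk(seq: str, size: int) -> list[str]:
    """Split a sequence into consecutive fixed-size groups."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

novice_units = list(digits)       # one retrieval unit per digit: 12 units
expert_units = chunk(digits, 3)   # one unit per recoded triple: 4 units

print(len(novice_units), "vs", len(expert_units))  # 12 vs 4
```

Layering such groupings (chunks of chunks) yields the hierarchical retrieval structures reported by Chase and Ericsson (1982), allowing recall far beyond the usual span for unrelated items.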
SPORTS
Recent studies have demonstrated that key psychological and physical elements of sport performance can be observed reliably under controlled conditions, and further work has shown how the information gained from such observations can be used to improve and refine critical performance skills (Ward, Williams, & Hancock, 2006).
Williams et al. (2002) videotaped expert and novice tennis players during serve receive, noting differences in their time to react and accuracy of returns. Interventions using expert modeling and feedback produced significant increases in serve receive performance (Farrow, Chivers, Hardingham, & Sachse, 1998; Smeeton, Williams, Hodges, & Ward, 2005). Analogous programs for field hockey and soccer goalkeepers have also demonstrated positive results (for a review, see Ward et al., 2006).
Fadde (2007) described a number of studies on the recognition of pitches by batters in baseball. Citing prior laboratory observations of eye movements by expert and novice baseball players (Paull & Glencross, 1997), Fadde noted that the locus of superior pitch-tracking skills in expert baseball batters could be traced to their superior recognition of cues extracted from perception of the pitcher’s arm motion and release, as well as the first third of the trajectory of the ball during flight. Consequently, Fadde (2006) designed a video observation system to guide players’ attention to these critical cues, followed by a controlled study of its effects on a sample of collegiate baseball players. Fadde demonstrated real-world transfer of the video training program, reporting a significant increase in batting performance of the trained players during competitive play relative to a nontrained control group equated for pretraining hitting ability.
AVIATION
During World War II, Fitts (1947) suggested that many crash landings among military pilots were caused by poorly designed instrument panels, and subsequent observational work using crude measures of eye movements suggested significant variability in the frequency of fixations on cockpit equipment during landings (Fitts, Jones, & Milton, 1950). Using modern eye-tracking equipment capable of producing more reliable measures of scan paths, contemporary researchers in a number of labs (Bellenkes, Wickens, & Kramer, 1997) have observed systematic differences between more and less skilled pilots in the direction of gaze between instruments and the external view during landings. Subsequent trials of computer-based training programs have demonstrated the utility of training new pilots to follow the instrument viewing patterns demonstrated by experts in these laboratory studies (Gopher, Weill, & Bareket, 1994), with a significant drop in errors reported during both simulated and actual flights.
Another area of considerable concern in aviation is weather-related decision making. Longitudinal records over the past half-century indicate adverse weather as the third leading cause of aviation incidents after pilot error and equipment malfunctions, as well as a contributing factor in roughly 20% of all accidents (National Transportation Safety Board, 2005). Researchers have found individual differences in pilots’ abilities to recognize adverse weather conditions, with more experienced pilots generally making better judgments. Until recently, however, it was not known how such judgments were being made.
By employing think-aloud reports, Wiggins and O’Hare (1995) demonstrated that highly experienced pilots focused on a relatively small set of weather parameters and weighted them in a different manner than less experienced pilots (Rockwell & McCoy, 1988). Wiggins and O’Hare (2003) subsequently designed and tested a cue-based training approach for recognition of poor weather conditions. The researchers demonstrated limited transfer, inasmuch as the pilots receiving the cue training made better weather decisions during a simulated flight. Whether such training results in a significant decrease in accidents among those exposed to it has proven difficult to assess. Nevertheless, the application created by these authors is now recommended by the Federal Aviation Administration (FAA) as a part of pilot training (Jensen & Hunter, 2002).
MEDICINE
Like pilots, medical professionals have also benefited from the synergy of expert performance and human factors research. One area in which this is evident is in the design of health information and decision support systems. Medical practitioners, particularly in intensive care settings, are required to make rapid and accurate decisions about the health of their patients. However, medical technologists have introduced a dizzying array of computerized recordkeeping systems and monitoring devices into patient care, and failures to properly use or interpret the data from such devices may be a significant factor in medical treatment errors (Koppel et al., 2005; Walsh & Beatty, 2002).
Seeking to minimize these types of problems, Alberdi et al. (2001) observed senior- and junior-level physicians and nurses in a neonatal intensive care unit and found that senior doctors and nurses used computerized physiological monitoring equipment both more frequently and in a different manner than the junior-level staff. These observations, as well as informal feedback interviews with the same staff, suggested modifications to the settings of monitoring equipment displays, along with a number of training interventions to improve signal interpretations. Subsequent trials by a companion research group (Hunter et al., 2003) demonstrated the utility of designing neonatal monitoring software according to actual observations of staff usage.
Simulators have become an integral part of both the training and assessment of medical personnel, particularly in high-risk areas such as surgery and anesthesia (Kneebone, 2003). In addition to allowing repeated, rapid exposure to critical events that might otherwise occur very infrequently in the actual environment, medical simulators have also provided health care administrators and researchers with the tools to assess professional skills without endangering the lives or well-being of actual patients. A number of studies have been conducted in which highly experienced medical staff have demonstrated superior performance using such simulators, with eye-tracking data and verbal reports elucidating some of the critical mechanisms that may underlie such performance (Ericsson, 2004).
However, some recent work has also pointed out the potential drawbacks of medical simulators. Moorthy, Munz, Adams, Pandey, and Darzi (2005) found significant differentiation between junior and intermediate trainees on many technical elements of a simulated vein operation but not between junior and senior trainees. Moorthy et al. also noted failures of simulators to clearly identify and isolate errors in team communication between the primary surgeon, the surgeon’s assistant, and the anesthesiologist. This, as well as other work (Maran & Glavin, 2003), suggests that different kinds of simulators may be optimal for measuring particular surgical skills at different overall levels of surgical expertise and also serves to highlight the broader importance of adapting simulator design to the skills and needs of the targeted user and the constraints placed on that user by the particular situation being simulated.
THE COSTS AND BENEFITS OF EXPERTISE APPROACHES
We have highlighted the positive impact of integrating the methods and principles of expert performance research with those of human factors. However, as with any empirical approach, there are drawbacks. Because experts are by definition a rarity within the population, it can be costly to identify and recruit them for laboratory research. In domains such as military aviation, emergency medicine, or professional sports, the limited availability of top-level personnel to participate in lengthy laboratory studies may present a significant obstacle. Also, given that there will always be variability in performance, as well as in the contents of process-tracing data (e.g., think-aloud reports), one cannot always be confident in generalizing from observations collected from a few experts.
Furthermore, it can be costly to make the necessary links between cross-sectional studies that identify skill differences (via novice-expert comparisons) and training studies that can be used to verify that the observed differences are essential to expert performance and that the appropriate skills and knowledge structures can be imparted efficiently. Longitudinal training studies may take more time and money than most engineering or usability labs can afford.
Such problems pose significant challenges. The path to progress may require a great deal more collaboration and discourse between researchers following the expert performance approach and those following complementary approaches found elsewhere in the human factors community.
Looking forward, we see a continuing need for integration of the observational tools employed by expert performance researchers (e.g., verbal reports/protocol analysis, eye tracking) into the human factors laboratory. Detailed observations of skilled performers engaged in representative tasks may lead to further refinements of training technologies and hardware/software interfaces that better reflect the abilities brought to bear in complex task environments. In turn, new measurement technologies developed by both hardware and software engineers may permit the development of reliable performance metrics in domains that presently lack objective measures of skill (e.g., Weiss & Shanteau, 2003). Despite the difficulty in pursuing research with experts, the studies undertaken in domains such as sports, medicine, and aviation during the past decade show the promise of the expert performance approach.
ACKNOWLEDGMENTS
This research was supported by National Institutes of Health/National Institute on Aging (NIH/NIA), R01 AG13969 “Life-Span Expertise” and NIH/NIA P01 AG17211 “CREATE.”
Biography
Neil Charness is the William G. Chase Professor of Psychology and an associate of the Pepper Institute on Aging and Public Policy at Florida State University. He obtained his Ph.D. in psychology from Carnegie Mellon University in 1974.
Michael Tuffiash is a doctoral student in the Psychology Department at Florida State University, where he received his M.Sc. in psychology in 2002.
REFERENCES
- Alberdi E, Becher J, Gilhooly K, Hunter J, Logie R, Lyon A, et al. Expertise and the interpretation of computerized physiological data: Implications for the design of computerized monitoring in neonatal intensive care. International Journal of Human-Computer Studies. 2001;55:191–216.
- Bellenkes AH, Wickens CD, Kramer AF. Visual scanning and pilot expertise: The role of attentional flexibility and mental model development. Aviation, Space, and Environmental Medicine. 1997;68:569–579.
- Bryan WL, Harter N. Studies in the telegraphic language: The acquisition of a hierarchy of habits. Psychological Review. 1899;6:345–375.
- Charness N, Krampe R. Th. Expertise and knowledge. In: Alwin DF, Hofer SM, editors. Handbook on cognitive aging: Interdisciplinary perspectives. Sage; Thousand Oaks, CA: 2008. pp. 244–258.
- Charness N, Tuffiash M, Krampe R, Reingold EM, Vasyukova E. The role of deliberate practice in chess expertise. Applied Cognitive Psychology. 2005;19:151–165.
- Chase WG, Ericsson KA. Exceptional memory. American Scientist. 1982;70:607–615.
- Chase WG, Simon HA. Perception in chess. Cognitive Psychology. 1973;4:55–81.
- Chi MTH, Feltovich P, Glaser R. Categorization and representations of physics problems by experts and novices. Cognitive Science. 1981;5:121–152.
- de Groot AD. Thought and choice in chess. 2nd ed. Mouton; The Hague, the Netherlands: 1978. (Revised translation; original work published 1946.)
- Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Academic Medicine. 2004;79:S70–S81. doi: 10.1097/00001888-200410001-00022.
- Ericsson KA. Protocol analysis and expert thought: Concurrent verbalizations of thinking during experts’ performance on representative tasks. In: Ericsson KA, Charness N, Feltovich P, Hoffman R, editors. Cambridge handbook of expertise and expert performance. Cambridge University Press; Cambridge, UK: 2006. pp. 223–242.
- Ericsson KA, Charness N. Expert performance: Its structure and acquisition. American Psychologist. 1994;49:725–747.
- Ericsson KA, Charness N, Hoffman R, Feltovich P, editors. Cambridge handbook of expertise and expert performance. Cambridge University Press; Cambridge, UK: 2006.
- Ericsson KA, Delaney PF, Weaver G, Mahadevan R. Uncovering the structure of a memorist’s superior “basic” memory capacity. Cognitive Psychology. 2004;49:191–237. doi: 10.1016/j.cogpsych.2004.02.001.
- Ericsson KA, Krampe R. Th., Tesch-Römer C. The role of deliberate practice in the acquisition of expert performance. Psychological Review. 1993;100:363–406.
- Ericsson KA, Lehmann AC. Expert and exceptional performance: Evidence of maximal adaptation to task constraints. Annual Review of Psychology. 1996;47:273–305. doi: 10.1146/annurev.psych.47.1.273.
- Ericsson KA, Williams M. Capturing naturally occurring superior performance in the laboratory: Translational research on expert performance. Journal of Experimental Psychology: Applied. 2007;13:115–123. doi: 10.1037/1076-898X.13.3.115.
- Fadde PJ. Interactive video training of perceptual decision-making in the sport of baseball. Technology, Instruction, Cognition and Learning. 2006;4:265–285.
- Fadde PJ. Instructional design for advanced learners: Training recognition skills to hasten expertise. Educational Technology Research and Development. 2007. Available from http://www.springerlink.com/content/q8n175627t7500pr/
- Farrow D, Chivers P, Hardingham C, Sachse S. The effect of video-based perceptual training on the tennis return of serve. International Journal of Sport Psychology. 1998;29:231–242.
- Fitts P. Psychological research on equipment designs in the AAF. American Psychologist. 1947;2:93–98. doi: 10.1037/h0053785.
- Fitts P, Jones RE, Milton JL. Eye movements of aircraft pilots during instrument-landing approaches. Aeronautical Engineering Review. 1950;9:1–6.
- Furneaux S, Land MF. The effects of skill on the eye-hand span during musical sight-reading. Proceedings of the Royal Society of London: Biological Sciences. 1999;266:2435–2440. doi: 10.1098/rspb.1999.0943.
- Gagne R. Training devices and simulators: Some research issues. American Psychologist. 1954;9:95–107.
- Gopher D, Weill M, Bareket T. Transfer of skill from a computer trainer to flight. Human Factors. 1994;36:387–405.
- Gruson L. Rehearsal skill and musical competence: Does practice make perfect? In: Sloboda JA, editor. Generative processes in music: The psychology of performance, improvisation, and composition. Oxford University Press; Oxford, UK: 1988. pp. 91–112.
- Hoffman RR, Crandall B, Shadbolt NR. Use of the critical decision method to elicit expert knowledge: A case study in the methodology of cognitive task analysis. Human Factors. 1998;40:254–276.
- Hoffman RR, Lintern G. Eliciting and representing the knowledge of experts. In: Ericsson KA, Charness N, Feltovich P, Hoffman R, editors. Cambridge handbook of expertise and expert performance. Cambridge University Press; Cambridge, UK: 2006. pp. 203–222.
- Hollands JG, Merikle PM. Menu organization and user expertise in information search tasks. Human Factors. 1987;5:577–586.
- Hunter J, Ewing G, Freer Y, Logie R, McCue P, McIntosh N. NEONATE: Effective decision support in the neonatal intensive care unit – A preliminary report. In: Dojat M, Keravnou ET, Barahona P, editors. Artificial intelligence in medicine: 9th Conference on Artificial Intelligence in Medicine in Europe, AIME 2003; Berlin: Springer Verlag; 2003. pp. 41–45.
- Jastrzembski T, Charness N, Vasyukova C. Expertise and age effects on knowledge activation in chess. Psychology and Aging. 2006;21:401–405. doi: 10.1037/0882-7974.21.2.401.
- Jensen R, Hunter D. General aviation aeronautical decision-making. FAA; Washington, DC: 2002.
- Kalyuga S, Chandler P, Sweller J. Levels of expertise and instructional design. Human Factors. 1998;40:1–17.
- Kirlik A, Walker N, Fisk AD, Nagel K. Supporting perception in the service of dynamic decision making. Human Factors. 1996;38:288–299. doi: 10.1177/001872089606380209.
- Kneebone R. Simulation in surgical training: Educational issues and practical implications. Medical Education. 2003;37:267–277. doi: 10.1046/j.1365-2923.2003.01440.x.
- Koppel R, Metlay JP, Cohen A, Abaluck B, Localio AR, Kimmel SE, et al. Role of computerized physician order entry systems in facilitating medication errors. Journal of the American Medical Association. 2005;293:1197–1203. doi: 10.1001/jama.293.10.1197.
- Maran NJ, Glavin RJ. Low- to high-fidelity simulation: A continuum of medical education. Medical Education. 2003;37:22–28. doi: 10.1046/j.1365-2923.37.s1.9.x.
- McKinney EH, Jr., Davis KJ. Effects of deliberate practice on crisis decision performance. Human Factors. 2003;45:436–444. doi: 10.1518/hfes.45.3.436.27251.
- Moorthy K, Munz Y, Adams S, Pandey V, Darzi A. A human factors analysis of technical and team skills among surgical trainees during procedural simulations in a simulated operating theatre. Annals of Surgery. 2005;242:631–639. doi: 10.1097/01.sla.0000186298.79308.a8.
- National Transportation Safety Board. Risk factors associated with weather-related general aviation accidents (NTSB/SS-05/01). Author; Washington, DC: 2005.
- Patel VL, Groen GJ. Knowledge based solution strategies in medical reasoning. Cognitive Science. 1986;10:91–116.
- Paull G, Glencross D. Expert perception and decision making in baseball. International Journal of Sport Psychology. 1997;28:35–56.
- Prietula MJ, Feltovich PJ, Marchak F. Factors influencing analysis of complex cognitive tasks: A framework and example from industrial process control. Human Factors. 2000;42:56–74. doi: 10.1518/001872000779656589.
- Reingold EM, Charness N, Pomplun M, Stampe DM. Visual span in expert chess players: Evidence from eye movements. Psychological Science. 2001;12:48–55. doi: 10.1111/1467-9280.00309.
- Rockwell TH, McCoy CE. General aviation pilot error: A study of pilot strategies in computer simulated adverse weather scenarios. U.S. Department of Transportation; Cambridge, MA: 1988.
- Salthouse TA. Perceptual, cognitive, and motoric aspects of transcription typing. Psychological Bulletin. 1986;99:303–319.
- Salthouse TA. Influence of experience on age differences in cognitive functioning. Human Factors. 1990;32:551–569. doi: 10.1177/001872089003200505.
- Shortliffe EH. Computer-based medical consultations: MYCIN. Elsevier; New York: 1976.
- Smeeton NJ, Williams AM, Hodges NJ, Ward P. The relative effectiveness of various instructional approaches in developing anticipation skill. Journal of Experimental Psychology: Applied. 2005;11:98–110. doi: 10.1037/1076-898X.11.2.98.
- Sohn YW, Doane SM. Memory processes of flight situation awareness: Interactive roles of working memory capacity, long-term working memory, and expertise. Human Factors. 2004;46:461–475. doi: 10.1518/hfes.46.3.461.50392.
- Sutter C, Ziefle M. Interacting with notebook input devices: An analysis of motor performance and users’ expertise. Human Factors. 2005;47:169–187. doi: 10.1518/0018720053653893.
- Walsh T, Beatty PCW. Human factors error and patient monitoring. Physiological Measurement. 2002;23:111–132. doi: 10.1088/0967-3334/23/3/201.
- Ward P, Williams AM, Hancock P. Simulation for performance and training. In: Ericsson KA, Charness N, Feltovich P, Hoffman RR, editors. Cambridge handbook of expertise and expert performance. Cambridge University Press; Cambridge, UK: 2006. pp. 243–262.
- Watson M, Sanderson P. Sonification supports eyes-free respiratory monitoring and task time-sharing. Human Factors. 2004;46:497–517. doi: 10.1518/hfes.46.3.497.50401.
- Weiss DJ, Shanteau J. Empirical assessment of expertise. Human Factors. 2003;45:104–116. doi: 10.1518/hfes.45.1.104.27233.
- Wiggins M, O’Hare D. Expertise in aeronautical weather-related decision making: A cross-sectional analysis of general aviation pilots. Journal of Experimental Psychology: Applied. 1995;1:305–320.
- Wiggins M, O’Hare D. Weatherwise: Evaluation of a cue-based training approach for the recognition of deteriorating weather conditions during flight. Human Factors. 2003;45:337–345. doi: 10.1518/hfes.45.2.337.27246.
- Williams AM, Ward P, Knowles J, Smeeton N. Anticipation skill in ‘real-world’ tasks: Measurement, training and transfer. Journal of Experimental Psychology: Applied. 2002;8:259–270. doi: 10.1037//1076-898x.8.4.259.