Identifying the limits of training system effectiveness through taxonomies of human performance
Related papers
The Use of Work Domain Analysis for the Design of Training Systems
Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 2000
In this paper we argue that specifications for training equipment must be based on statements of mission-system functionality. Developing a good description of functionality is a difficult technical challenge, and the methodology of Work Domain Analysis has been developed for that purpose. However, a Work Domain Analysis does not fully specify the devices that are needed for training. Other forms of analysis and inference are needed to resolve issues of criticality, instructional functions, implementation of functions in a training device, and fidelity of training device features. In this paper we explain the means of moving from functional requirements as developed by a Work Domain Analysis to specifications for training equipment.
Specifying Skill-Based Training Strategies and Devices: A Model Description
This report describes the background and specifications of a model that identifies the skills required for competent performance of a job, specifies strategies for training those skills, defines training devices, and evaluates efficient use of skill-based training devices. The framework addresses three benefits of skill training: (1) skill training provides more practice on critical skills in a given amount of time, (2) critical skills generalize to many tasks, and (3) training the critical skills involved in complex and difficult tasks decreases the mental workload required to learn or perform the tasks. Both a formal model and a concrete example of the steps derived from that theoretical approach in the context of the Air Traffic Control job domain are described. The specification of training strategies and devices is broken into four steps: identifying skills, selecting instructional strategies, designing devices, and allocating training. First, tasks are reduced to their elements...
ADAPT IT: Tools for training design and evaluation
Educational Technology Research and Development, 2002
This article describes a set of computerized tools that support the design and evaluation of competency-based training programs. The training of complex skills such as air traffic control and process control requires a competency-based approach that focuses on the integration and coordination of constituent skills and transfer of learning. At the heart of the training are authentic whole-task practice situations. The instructional design tools are based on van Merriënboer's 4C/ID* methodology (1997). The article describes a training design tool (Core) that supports the analysis and design for competency-based training programs and an evaluation tool (Eval) that supports the subsequent revision of this training design.
Incorporating Training Effects in Modeling and Simulation Software
This paper describes a research program that is developing more robust algorithms for The Improved Performance Research Integration Tool (IMPRINT) that better reflect the complexity of the effects of training on human performance. The Air Force Research Laboratory's Warfighter Readiness Research Division is sponsoring a series of empirical studies to assess the effects of various training strategies on skill acquisition and retention in the performance of complex military tasks. New algorithms will be developed and populated based on the results of these studies. Ongoing research efforts are described. The relevance of this research for system designers is discussed.
Reimagining Workload Task Analysis: Applications to Training System Design
2011
Today's warfighter performs more complex, cognitively demanding tasks than ever before. Despite the need for more extensive training to perform these tasks, acquisition professionals are often tasked to reduce training budgets and identify optimal tradeoffs. Tools that provide empirical evidence of how performance and mission requirements will be affected by design decisions are available to help them make these choices. This article offers insights into the utility of implementing a Workload Task Analysis (WLTA) early in weapon systems acquisition for the purpose of focusing on training system decisions, and provides a description of where WLTA occurs within the top-down functional analysis process. It concludes with several examples of how the WLTA results can be used to guide training development.
A machine learning approach to modeling and predicting training effectiveness
2015
Developments in online and computer-based training (CBT) technologies have enabled improvements in the efficiency, efficacy, and scalability of modern training programs. The use of computer-based methods in training programs allows for the collection of trainee assessment metrics at much higher levels of detail, providing new opportunities for training evaluation in these programs. The resulting datasets may offer increased opportunities for training evaluation and trainee intervention through the use of descriptive and predictive modeling. In particular, there is the potential for descriptive approaches to provide greater understanding of trainee behavior and indicate similarities between trainees, while accurate prediction models of future performance available early in a training program could help inform trainee intervention methods. However, traditional analysis techniques and human intuition are of limited use in so-called "big-data" environments, and one of the most...
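The abstract above does not specify a particular model, but the prediction task it describes can be sketched in miniature. The following toy example (metric names, data, and the nearest-neighbour choice are all invented for illustration, not taken from the paper) predicts a trainee's final pass/fail outcome from two early-assessment metrics:

```python
import math

# Hypothetical historical records (invented for illustration):
# each trainee is (accuracy_week1, mean_response_time_s), label = passed final assessment?
history = [
    ((0.92, 2.1), True),
    ((0.88, 2.4), True),
    ((0.61, 4.8), False),
    ((0.55, 5.2), False),
    ((0.79, 3.0), True),
    ((0.58, 4.1), False),
]

def predict_pass(features, k=3):
    """k-nearest-neighbour majority vote on Euclidean distance over the metric vector."""
    dists = sorted((math.dist(features, f), label) for f, label in history)
    top = [label for _, label in dists[:k]]
    return top.count(True) > k // 2

print(predict_pass((0.90, 2.2)))  # strong early profile -> True
print(predict_pass((0.57, 5.0)))  # weak early profile  -> False
```

In practice such a model would be trained on far richer CBT telemetry, but the shape of the problem is the same: early, detailed assessment metrics in, a prediction that can trigger trainee intervention out.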
A Strategy for the Development of Training Devices
Human Factors, 1978
This paper discusses the complex issues involved in the design of aircrew simulation training devices. It addresses methods for defining training requirements, fidelity, performance measurement, instructional features, and crew coordination. A research evaluation of a device using these methods is presented. Requests for reprints should be sent to Mr. Bertram W. Cream, Department of the Air Force, AFHRL Advanced Systems Division, Wright-Patterson Air Force Base, Ohio 45433, U.S.A. The methodology that is described assumes that a decision has already been made to obtain a training device. Media selection is a complex problem in itself and will not be discussed here. Analyses of media selection decisions have appeared elsewhere (e.g., Braby, Henry, Parrish, and Swope, 1975; Parker and Downs, 1961). This article will focus on the methodology for designing the configuration of a training device. The basic strategy for simulator design consists of defining precise training requirements, identifying supporting features, and preparing a utilization plan as early as possible in the development cycle. The impact of early identification of these factors will be discussed for the areas of simulator fidelity, performance measures, instructional features, and crew coordination. These factors represent the major issues involved in the development of a simulator design. There are three groups which can provide essential inputs to simulator design. First are the users who will eventually train students with the device. Second are the training psychologists who serve as data collectors, refiners, and integrators. The final element consists of simulation engineers who actually de...
Testing and analyzing different training methods for industrial operators: an experimental approach
Computer Aided Chemical Engineering, 2013
The process industry is known for its complexity and sensitivity, with critical procedures saturated with demanding human-machine interfaces that may induce human errors and thus result in abnormal situations. Abnormal situations may lead to near misses and even to severe accidents, which can result in loss of production and even in casualties and fatalities. This paper aims at bridging the gap between these highly demanding human-machine interfaces and the training methods employed in the process industry by experimentally analyzing the effectiveness of distinct training methods in a virtually simulated abnormal situation. The performance of operators is measured by means of suitable Key Performance Indicators (KPIs) applied to the specific case study. In particular, we analyze experimentally two distinct training methods, based respectively on a PowerPoint presentation and a 3D virtual environment. The positive outcomes of this approach consist in increasing the reliability, cost effectiveness, environmental friendliness, and safety of the process. This work is the result of the interaction between chemical engineers and experimental psychologists, which may open new horizons to scientific research.
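The KPI comparison described in this abstract can be sketched as follows. The KPI definitions (time to resolve the abnormal situation, procedural error count) and all numbers below are invented for illustration; they are not the study's data:

```python
from statistics import mean

# Invented trial data: (seconds_to_resolve, procedural_errors) per operator,
# grouped by the training method the operator received.
results = {
    "slides":     [(310, 4), (295, 3), (340, 5), (305, 4)],
    "virtual_3d": [(240, 2), (255, 1), (230, 2), (250, 3)],
}

def kpis(trials):
    """Aggregate two simple KPIs: mean time-to-resolution and mean error count."""
    times, errors = zip(*trials)
    return {"mean_time_s": mean(times), "mean_errors": mean(errors)}

for method, trials in results.items():
    print(method, kpis(trials))
```

A real study would add statistical tests and case-specific KPIs, but the core of the approach is exactly this: run both groups through the same simulated scenario and compare aggregate indicators.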
Perceptual-Cognitive & Physiological Assessment of Training Effectiveness
Several trends within the simulation and training industry are emphasizing the need for measurable proof that training solutions meet or exceed the requirements for delivering effective training. Cognitive state is a key component of learning, meaning that classification of cognitive state and capacity can provide a measure of training effectiveness. However, accurate classification of trainee state is an extremely challenging task. The more traditional subjective assessment methods have several limitations, while objective assessment methods can be difficult to implement. We conducted an exploratory study that evaluated the cognitive and physiological load engaged during flight simulation and live flight during maneuvers of three levels of difficulty. The study represents the work performed to date in the first year of a multi-year effort to design a method for assessing the efficacy of training content and devices, including live platforms, that is based on objective cognitive state assessment techniques coupled with control input and mission/platform performance measures. The method employs NeuroTracker, a validated tool for evaluating or training perceptual-cognitive skills, to measure spare cognitive capacity, and physiological measures of workload based on analysis of eye tracking and electrocardiogram data. This paper briefly summarizes the design, implementation, and initial results of this study. It summarizes the next steps required to further refine the proposed method for assessing training efficacy and describes the planned follow-on effort. Finally, it discusses additional applications of this method in military and commercial training markets, such as the real-time adaptation of training content to trainee skill level and state.