HUMAN ERROR Research Papers - Academia.edu

High Reliability Organisation (HRO) and Resilience Engineering (RE) are two research traditions which have attracted a wide and diverse readership in the past decade. Both have reached the status of central contributions to the field of safety while sharing a similar orientation. This is not without creating tensions or questions, as expressed in the call of this special issue. The contention of this article is that these two schools introduce ways of approaching safety which need to be reflected upon in order to avoid simplifications and hasty judgments about their relative strengths, weaknesses or degree of overlap. HRO has gained strength and legitimacy from (1) studying high-risk systems ethnographically, with an organisational angle, (2) debating the principles that produce organisational reliability in the face of high complexity and (3) conceptualising some of these principles into a successful generic model of "collective mindfulness", with both practical and theoretical success. RE has gained strength and legitimacy from (1) harnessing and then deconstructing, empirically and theoretically, the notion of 'human error', (2) arguing for a system (and complexity) view and discourse about safety/accidents, and (3) supporting this view with the help of (graphical) actionable models and methods (i.e. the engineering orientation). To show this, one has to go beyond the past 10 years of RE and include a longer time frame reaching back to the 1980s and the early days of Cognitive Engineering (CE). The approach followed here therefore has a strong historical orientation, as a way to better understand the present situation, profile each school, and promote complementarities while maintaining nuances.

Over the past two decades, the 'new view' has become a popular term in safety theory and practice. It has, however, also been criticised, provoking division and controversy. The aim of this article is to clarify the current situation. It describes the origins, ambiguities and successes of the 'new view' as well as the critiques formulated against it. The article begins by outlining the origins of this concept, in the 1980s and 1990s, in the cognitive (system) engineering (CSE) school initiated by Rasmussen, Hollnagel and Woods, which differed from Reason's approach to human error in this period. The article explains how Dekker, in the early 2000s, translated ideas from the CSE school to coin the term 'new view', while also developing, shortly afterwards, an argument against Reason's legacy that was more radical and critical than his predecessors'. Secondly, the article describes the ambiguities associated with the term 'new view' because of the different programs that have derived from CSE (Resilience Engineering (RE), then Safety II, Safety Differently and the Theory of Graceful Extensibility). The text then identifies three programs by different thinkers (methodological, formal and critical) and Dekker's three eclectic versions of the 'new view'. Thirdly, the article discusses the success of the CSE and RE schools, showing how they have strongly resonated with many practitioners outside the academic world. Fourthly, the objections raised within the field of human factors and system safety, but also from different traditions (e.g., system safety engineering with Leveson, sociology of safety with Hopkins), are introduced and discussed.

This paper proposes a novel man-machine interface which takes into account the mutual cooperation between the operator and an expert system in order to realize precise decision making and operation. This interface is composed of three modules: a diagnosis module, a supporting module for the operator's decision making and an indication module. The knowledge base of the system is represented by a fault tree diagram, which serves as the medium of communication between the operator and the computer. The system constructed enables the diagnosis not only of human errors but also of malfunctions of the process, and increases precision, reliability, and safety during bioprocess operation. © 1997 Elsevier Science B.V.

Human error in configuring routers undermines attempts to provide reliable, predictable end-to-end performance on IP networks. Manual configuration, while expensive and error-prone, is the dominant mode of operation, especially for large enterprise networks. These networks often lack the basic building blocks (an accurate equipment inventory, a debugged initial configuration, and a specification of local configuration policies) to support the holy grail of automation. We argue that migrating an existing network to automated configuration is a rich and challenging research problem rooted in data analysis and in the modeling of network protocols and operational practices. We propose a novel, bottom-up approach that proceeds in three phases: (i) analysis of configuration data to summarize the existing network state and uncover configuration problems; (ii) data mining to identify the network's local configuration policies and violations of these policies; and ultimately (iii) bootstrapping of a database to drive future configuration changes. The first stage reduces the number of errors, the second normalizes the local policies, and the third prevents new errors and reduces the manpower needed to configure the network. We describe the architecture of our EDGE tool for steps (i) and (ii), and present some examples from our experiences applying the tool to several large enterprise networks.
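
To make the policy-mining idea of step (ii) concrete, here is a minimal sketch of majority-vote policy inference over parsed router configurations. The data layout and function names are hypothetical illustrations of the general technique, not the EDGE tool's actual interface.

```python
# Minimal sketch: treat the dominant setting across routers as the local
# policy and flag routers that deviate from it. Hypothetical data/names.
from collections import Counter

configs = {
    "router1": {"ntp_server": "10.0.0.1", "snmp_community": "ops"},
    "router2": {"ntp_server": "10.0.0.1", "snmp_community": "ops"},
    "router3": {"ntp_server": "10.9.9.9", "snmp_community": "ops"},
}

def infer_policies(configs, threshold=0.6):
    """Treat a value as the local policy when it dominates; flag deviants."""
    policies, violations = {}, []
    attributes = {a for conf in configs.values() for a in conf}
    for attr in attributes:
        counts = Counter(conf[attr] for conf in configs.values() if attr in conf)
        value, n = counts.most_common(1)[0]
        if n / sum(counts.values()) >= threshold:
            policies[attr] = value
            violations += [(name, attr, conf[attr])
                           for name, conf in configs.items()
                           if attr in conf and conf[attr] != value]
    return policies, violations

policies, violations = infer_policies(configs)
print(policies)    # e.g. {'ntp_server': '10.0.0.1', 'snmp_community': 'ops'}
print(violations)  # [('router3', 'ntp_server', '10.9.9.9')]
```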

The correct functioning of interactive computer systems depends on both the faultless operation of the device and correct human actions. In this paper, we focus on system malfunctions due to human actions. We present abstract principles that generate cognitively plausible human behaviour. These principles are then formalised in a higher-order logic as a generic, and so retargetable, cognitive architecture, based on results from cognitive psychology. We instantiate the generic cognitive architecture to obtain specific user models. These are then used in a series of case studies on the formal verification of simple interactive systems. By doing this, we demonstrate that our verification methodology can detect a variety of realistic, potentially erroneous actions, which emerge from the combination of a poorly designed device and cognitively plausible human behaviour.
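
As a toy illustration of this verification style, the sketch below composes a simple device model with one cognitively plausible user rule (the user may terminate once the main goal is achieved) and enumerates all traces to surface a classic post-completion error. It is a much-simplified stand-in for the paper's higher-order-logic cognitive architecture, with an invented example device.

```python
# The device's intended action sequence: a cash machine that returns the
# card only AFTER dispensing cash, a design known to invite this error.
SEQUENCE = ["insert_card", "enter_pin", "take_cash", "take_card"]

def traces(i=0, goal_done=False, card_inside=False, path=()):
    """Enumerate plausible behaviours: follow the device's sequence, but
    allow the user to walk away any time after the main goal (cash) is met."""
    if goal_done:
        # Walking away now is erroneous iff the card is still inside.
        yield path + ("walk_away",), card_inside
    if i == len(SEQUENCE):
        return
    act = SEQUENCE[i]
    yield from traces(
        i + 1,
        goal_done or act == "take_cash",
        (card_inside or act == "insert_card") and act != "take_card",
        path + (act,),
    )

for path, error in traces():
    print("POST-COMPLETION ERROR" if error else "ok", path)
# Flags ('insert_card', 'enter_pin', 'take_cash', 'walk_away') as erroneous.
```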

While a significant effort is currently being undertaken by the CHI community to apply and extend current usability evaluation techniques to new kinds of interaction techniques, very little has been done to improve the reliability of the software offering these interaction techniques. As these new interaction techniques are increasingly used in command-and-control safety-critical systems, the potential for incidents or accidents increases. Similarly, unreliable interactive software can jeopardize usability evaluation by showing unexpected or undesired behaviours. Lastly, iterative design processes promote multiple designs through evolvable prototypes in order to accommodate requirements changes and results from usability evaluations, thus reducing the reliability of the final system through the lack of a global and structured design. The aim of this SIG is to provide a forum for both researchers and practitioners interested in safety-critical interactive systems. Our goal is to define a roadmap of activities to cross-fertilise usability, reliability and safety for these kinds of systems, to minimise duplicated effort and to reuse knowledge across all the communities involved.

In the spreadsheet error community, both academics and practitioners generally have ignored the rich findings produced by a century of human error research. These findings can suggest ways to reduce errors; we can then test these suggestions empirically. In addition, research on human error seems to suggest that several common prescriptions and expectations for reducing errors are likely to be incorrect. Among the key conclusions from human error research are that thinking is bad, that spreadsheets are not the cause of spreadsheet errors, and that reducing errors is extremely difficult.

Risk is defined as the probability of a specific adverse event occurring within a specific period, while quantitative risk assessment (QRA) is the development of a quantitative estimate of risk, based on engineering evaluation and mathematical techniques, by combining estimates of incident consequences and frequencies. In view of the increasing use of railways as the mode of transportation for hazardous materials throughout the world, the associated risks deserve careful analysis. In this study, the failure frequency of the transportation of ammonia from the Petronas Fertilizers Kedah (PFK) plant in Gurun (northern Peninsular Malaysia) to the Chemical Company Malaysia (CCM) fertilizer facilities in Port Klang (south-western Peninsular Malaysia) was evaluated by incorporating human error assessment. The study highlighted the importance of human error contributions in the failure frequency analysis and their impact on the selected failure scenarios. It also shows that the human error assessment and reduction technique (HEART), a useful human reliability analysis tool, should be used in parallel with a fuzzy arithmetic approach to reduce the uncertainties involved in the estimation of human error probabilities, and hence to reduce the likelihood of incorrect risk estimates. The results suggest that the commonly applied approach in quantitative risk assessments, which considers only equipment failures in the failure frequency estimations, clearly underestimates the potential causes of failures leading to hazardous material releases, and hence the calculated risks do not reflect the actual risks. © 2008 American Institute of Chemical Engineers Process Saf Prog, 2009
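
For readers unfamiliar with HEART, the sketch below shows the standard calculation it is built on, plus a simple fuzzy extension using triangular (low, mode, high) numbers of the kind the abstract alludes to. All numeric values are invented for illustration, not taken from the study.

```python
def heart_hep(nominal, epcs):
    """HEART: HEP = nominal * prod((EPC_i - 1) * APOA_i + 1), where each
    error-producing condition (EPC) has a multiplier and an assessed
    proportion of affect (APOA) between 0 and 1."""
    hep = nominal
    for multiplier, apoa in epcs:
        hep *= (multiplier - 1.0) * apoa + 1.0
    return hep

# Crisp estimate: nominal HEP 0.003, two EPCs (multipliers 11 and 3).
print(heart_hep(0.003, [(11, 0.4), (3, 0.2)]))   # 0.021

# Fuzzy variant: carry triangular (low, mode, high) APOAs through the
# same formula to obtain a triangular HEP instead of a point value.
fuzzy_epcs = [(11, (0.3, 0.4, 0.5)), (3, (0.1, 0.2, 0.3))]
low, mode, high = (heart_hep(0.003, [(m, a[i]) for m, a in fuzzy_epcs])
                   for i in range(3))
print(low, mode, high)   # 0.0144 0.021 0.0288
```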

Current predictors of speech intelligibility are inadequate for understanding and predicting speech confusions caused by acoustic interference. We develop a model of auditory speech processing that includes a phenomenological representation of the action of the Medial Olivocochlear efferent pathway and that is capable of predicting consonant confusions made by normal hearing listeners in speech-shaped Gaussian noise. We then use this model to predict human error patterns of initial consonants in consonant-vowel-consonant words in the context of a Dynamic Rhyme Test. In the process we demonstrate its potential for speech discrimination in noise. Our results produced performance that was robust to varying levels of stationary additive speech-shaped noise and which mimicked human performance in discrimination of synthetic speech as measured by the Chi-squared test.

Background: The importance of theory in underpinning interventions to promote effective professional practice is gaining recognition. The Medical Research Council framework for complex interventions has assisted in promoting awareness and adoption of theory into study design. Human error theory has previously been used by high-risk industries, but its relevance to healthcare settings and patient safety requires further investigation. This study used this theory as a framework to explore non-prescription medicine supply from community pharmacies. The relevance to other healthcare settings and behaviours is discussed. Method: A 25% random sample was drawn from 364 observed consultations for non-prescription medicines. Each of the 91 consultations was assessed by two groups: a consensus group (stage 1) to identify common problems with the consultation process, and an expert group (stages 2 and 3) to apply human error theory to these consultations. Paired assessors (most of whom were pharmacists) categorised the perceived problems occurring in each consultation (stage 1). During stage 2, paired assessors from an expert group (comprising patient safety experts, community pharmacists and psychologists) considered whether each consultation was compliant with professional guidelines for the supply of pharmacy medicines. Each non-compliant consultation identified during stage 2 was then categorised as a slip/lapse, mistake, or violation using human error theory (stage 3). Results: During stage 1 most consultations (n = 75, 83%) were deemed deficient in information exchange. At stage 2, paired assessors varied in attributing non-compliance to specific error types. Where agreement was achieved, the error type most often selected was "violation" (n = 27, 51.9%, stage 3). Consultations involving product requests were less likely to be guideline compliant than symptom presentations (OR 0.30, 95% CI 0.10 to 0.95, p = 0.05). Conclusions: The large proportion of consultations classified as violations suggests either that pharmacy staff are unaware of professional guidelines and thus do not follow them (in which case these acts would not be violations), or that they knowingly violate the guidelines for reasons that need further research. The methods presented here could be used in other healthcare settings to explore healthcare professional behaviour and to develop strategies to promote patient safety and effective professional practice.

Due to the large number of steps and the number of people involved in the preparation of a radiation treatment, the transfer of information from one step to the next is a very critical point. Errors due to inadequate transfer of information will be repeated in every subsequent step and can seriously affect the final result of the treatment. We studied the frequency and the sources of these transfer errors. A total of 464 new treatments was checked over a period of 9 months (January to October 1990). Erroneous data transfer was detected in 139 of the more than 24,000 transferred parameters (<1%); these errors affected 26% (119/464) of the checked treatments. Twenty-five of these deviations could have led to large geographical misses or important over- or underdosage (much more than 5%) of the organs in the irradiated volume, thus increasing the complication probability or decreasing the tumour control probability if not corrected. Such major deviations occurred in only 5% (25/464) of the new treatments. The sources of the large deviations were nearly always human mistakes, whereas a considerable number of the smaller deviations were, in fact, deliberate decisions to deviate from the intended treatment. Nearly half of the major deviations were introduced during input of the data into the record-and-verify system, demonstrating that a system intended to prevent accidental errors can itself lead to a considerable number of systematic errors when used as an unconfirmed set-up system. The results of this study show that human errors can seriously affect the outcome of patient treatments. Therefore, patient-related quality assurance procedures are highly recommended at the first treatment sessions to eliminate, at least, the systematic errors.

Rationale, aims and objectives: The accuracy of health care professionals in reporting safety events determines their usefulness for both system improvement and research. The study objectives were to: (1) validate (assess the accuracy of) the reasons recorded by doctors and pharmacists for discontinuing medication orders at discharge in a hospital's electronic patient records (EPR); (2) investigate the causes of any detected recording inaccuracy; and (3) collect preliminary data on the frequency and types of medication discontinuation. Methods: This was a validation study in one English hospital. The study comprised two steps: extraction of discontinued medication orders from the EPR, followed by short structured interviews with the doctors and pharmacists who made the discontinuations. A total of 104 discontinued orders were discussed during 15 face-to-face and 6 telephone interviews. The software package SPSS was used for data analysis. Results: Duplication of therapy (27, 25.2%), omission of a drug (23, 21.5%) and dosage regimen change (19, 17.8%) were the three most frequent reasons given for discontinuing medications. The majority of recorded discontinuation reasons were correct (100, 96.2%) and complete (101, 97.1%), and hence were judged accurate (97, 93.3%). The difference in accurate recording between doctors (15, 88.2%) and pharmacists (82, 94.3%) was not statistically significant. Potential causes of recording inaccuracy included: slip or lapse, lack of training, carelessness and electronic system rigidity. Conclusion: This study showed that doctors and pharmacists recorded accurate reasons for the majority of the discontinued medication orders. It also showed that utilizing pharmacists' recorded reasons during clinical interventions using the EPR was beneficial in understanding and characterizing prescribing errors. Although they require further research, the reasons identified provide preliminary data about the most prevalent types of pharmacists' interventions during hospital discharge.

The paper offers a social science perspective on some of the assumptions and abstractions implicit in the semantic web vision in one bio-medical domain. The focus here is on the more persistent semantic and systemic alignment issues in the context of neuro-psychiatric disease, as experienced by Grid projects working in this domain in the UK, EU and US, where the opportunities and the challenges of integration have been particularly evident.

Construction accident investigation techniques and reporting systems identify what type of accidents occur and how they occurred. Unfortunately, they do not properly address why the accident occurred by identifying possible root causes, which is only possible by complementing these techniques with theories of accident causation and theories of human error. The uniqueness of the construction industry dictates the need to tailor many of the contemporary accident causation models and human error theories. This paper presents ...

This work describes a novel system for device development that automates and fully integrates the test chip construction workflow, from placement and routing to electrical test program generation. In addition to accelerating test chip and test program development, this system facilitates parameterized data analysis, thereby providing a framework that finally allows the user to realize the full benefits of complex and elegant experimental device designs. By utilizing a centralized database and eliminating parameter re-entry, the automation provided by this integrated approach eliminates many of the sources of human error while maximizing reuse between technologies.

Disasters can be defined as catastrophic events that challenge the normal range of human coping ability. The technological/human-caused disaster, a classification of interest in this article, is attributable to human error or misjudgment. Lower socioeconomic status and race intersect in the heightened risk for technological/human-caused disasters among people of color. The experience of the Navajo with the uranium industry is argued to be precisely this type of disaster, with associated long-standing psychological impacts. The history of the Navajo with uranium mining and milling is reviewed with a discussion of the arduous efforts for compensation. The psychological impacts of this long-standing disaster among the Navajo are organized around major themes of: (a) human losses and bereavement, (b) environmental losses and contamination, (c) feelings of betrayal by government and mining and milling companies, (d) fears about current and future effects, (e) prolonged duration of psychological effects, (f) anxiety and depression, and (g) complicating factors of poverty and racism. The paper concludes with suggestions for culturally appropriate education and intervention.

This paper introduces the idea of predicting 'designer error' by evaluating devices using Human Error Identification (HEI) techniques. This is demonstrated using the Systematic Human Error Reduction and Prediction Approach (SHERPA) and Task Analysis for Error Identification (TAFEI) to evaluate a vending machine. Appraisal criteria which rely upon user opinion, face validity and utilisation are questioned. Instead, a quantitative approach, based upon signal detection theory, is recommended. The performance of people using SHERPA and TAFEI is compared with heuristic judgement and with each other. The results of these studies show that both SHERPA and TAFEI are better at predicting errors than the heuristic technique. The performance of SHERPA and TAFEI is comparable, giving some confidence in the use of these approaches. It is suggested that using HEI techniques as part of the design and evaluation process could help to make devices easier to use.
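
A sketch of the signal-detection style of appraisal the authors recommend: predicted errors are scored against observed errors, and hit and false-alarm rates are summarised in a single index. The index below, SI = (HR + (1 - FAR)) / 2, is one commonly used variant, and the counts are invented for illustration.

```python
def sensitivity_index(hits, misses, false_alarms, correct_rejections):
    """Combine hit rate and false-alarm rate into one validity score."""
    hit_rate = hits / (hits + misses)
    fa_rate = false_alarms / (false_alarms + correct_rejections)
    return (hit_rate + (1.0 - fa_rate)) / 2.0

# e.g. scoring one analyst's SHERPA predictions against errors observed
# on the vending machine (all counts hypothetical):
print(sensitivity_index(hits=40, misses=10,
                        false_alarms=15, correct_rejections=35))  # 0.75
```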

Historically, mining has been viewed as an inherently high-risk industry. Nevertheless, the introduction of new technology and a heightened concern for safety have yielded marked reductions in accident and injury rates over the last several decades. In an effort to further reduce these rates, the human factors associated with incidents/accidents need to be addressed. A modified version of the Human Factors Analysis and Classification System was used to analyze incident and accident cases from across the state of Queensland to identify human factor trends and system deficiencies within mining. An analysis of the data revealed that skill-based errors were the most common unsafe act and showed no significant differences across mine types. However, decision errors did vary across mine types. Findings for unsafe acts were consistent across the time period examined. By illuminating human causal factors in a systematic fashion, this study has provided mine safety professionals with the information necessary to further reduce mine incidents/accidents.
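
The kind of cross-mine-type comparison reported above can be reproduced with a contingency-table test of independence, as in the sketch below. The counts are invented for illustration, not the Queensland data.

```python
# Tabulate unsafe-act categories by mine type and test for independence.
from scipy.stats import chi2_contingency

# rows: mine types; columns: skill-based errors, decision errors, violations
table = [
    [120, 30, 10],   # e.g. underground coal
    [115, 55, 12],   # e.g. open cut
    [118, 28, 11],   # e.g. quarry
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.3f}")  # small p: category mix varies
```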

Abstract: Use of information technology has become commonplace in healthcare. In an ideal world a patient always gets first-class treatment and everything goes smoothly and as planned. Applications of information technology are created to help the hospital staff ...

The dependability assessment of electrical power distribution schemes is left to the experience and knowledge of the individual designer. For this reason, all too often the schemes of infrastructures such as hospitals, airports, stadiums and theatres, skyscrapers and motorway tunnels are designed without even an awareness of concepts that are fundamental to design in the most advanced industrial contexts. Moreover, the exchange of information between these contexts is rather rare: the railway, naval, aerospace and nuclear sectors, energy distribution companies, university research, the oil industry and the world of information technology constitute centres of excellence that are, in practice, inaccessible to those who do not belong to them. A classification of electrical schemes based on shared criteria, with specific reference to dependability, could help to share and spread this culture, as well as to reward the research and development of the best manufacturers. The article outlines a classification scheme and proposes hypotheses for the validation criteria.

The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based on Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. The HFACS framework was used to analyze human error data associated with aircrew-related commercial aviation accidents that occurred between January 1990 and December 1996 using database records maintained by the NTSB and the FAA. Investigators were able to reliably accommodate all the human causal factors associated with the commercial aviation accidents examined in this study using the HFACS system. In addition, the class...

The demand to travel by rail is ever increasing because rail benefits both passengers and freight; it is therefore of utmost importance for railway administrators to carry passengers and freight safely to their destinations. Improving safety procedures and developing safety systems require awareness of what is causing unsafe conditions, which can be gained by learning from the past. This research analyzes data from past accidents on the Iranian Railway (RAI) by applying association rule data mining techniques in order to discover and reveal unknown relationships and patterns among the data. By applying CRISP-DM as the data mining methodology and utilizing Clementine 12.0 as the software tool, the stated objectives of this paper were fulfilled. For this research, some 6,500 accident records were selected from the accidents database covering 1996 to 2005. The relationships and patterns extracted can be utilized to develop regulations and rules. This research considers accident conditions and the relationships discovered among the most common accident factors (human error, wagon and track) and other fields of the database, in order to prevent such accidents from occurring in the future.
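
The core of the association-rule step is computing support and confidence for candidate rules over accident records, as in the minimal sketch below. The records and field names are invented stand-ins for the RAI database; a full analysis would run Apriori over all frequent itemsets.

```python
records = [
    {"human_error", "night_shift", "derailment"},
    {"human_error", "night_shift", "collision"},
    {"track_fault", "day_shift", "derailment"},
    {"human_error", "night_shift", "derailment"},
]

def support(itemset):
    """Fraction of records containing every item in the itemset."""
    return sum(itemset <= record for record in records) / len(records)

def confidence(antecedent, consequent):
    """P(consequent | antecedent), estimated over the records."""
    return support(antecedent | consequent) / support(antecedent)

antecedent, consequent = {"human_error", "night_shift"}, {"derailment"}
print("support:", support(antecedent | consequent))       # 0.5
print("confidence:", confidence(antecedent, consequent))  # ~0.67
```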

Safe surgical practice requires a combination of technical and nontechnical abilities. Both sets of skills can be impaired by intra-operative stress, compromising performance and patient safety. This systematic review aims to assess the effects of intra-operative stress on surgical performance. A systematic search strategy was implemented to obtain relevant articles. MEDLINE, EMBASE, and PsycINFO databases were searched, and 3,547 abstracts were identified. After application of limits, 660 abstracts were retrieved for subsequent evaluation. Studies were included on the basis of predetermined inclusion criteria and independent assessment by 2 reviewers. In all, 22 articles formed the evidence base for this review. Key stressors included laparoscopic surgery (7 studies), bleeding (4 studies), distractions (4 studies), time pressure (3 studies), procedural complexity (3 studies), and equipment problems (2 studies). The methods for assessing stress and performance varied greatly across studies, rendering cross-study comparisons difficult. With only 7 studies assessing stress and surgical performance concurrently, establishing a direct link was challenging. Despite this shortfall, the direction of the evidence suggested that excessive stress impairs performance. Specifically, laparoscopic procedures trigger greater stress levels and poorer technical performance (3 studies), and expert surgeons experience less stress and less impaired performance compared with juniors (2 studies). Finally, 3 studies suggest that stressful crises impair surgeons' nontechnical skills (eg, communication and decision making). Surgeons are subject to many intra-operative stressors that can impair their performance. Current evidence is characterized by marked heterogeneity of research designs and variable study quality. Further research on stress and performance is required so that surgical training and clinical excellence can flourish.

This paper investigates the contribution of human modelling and simulation software in two different applications: a workstation in a medical products industry and a customer service counter.
This study examines its application in sectors in which these techniques had not yet been well developed; in the same way, it can contribute to applications in other areas as well.
The results achieved were: evaluation of visual fields, determination of workplaces' physical characteristics, visualization of reach envelopes, improved communication between the different people involved in the process, reduced time spent on the process and, especially, preliminary consideration of human factors.

When an accident happens, it is obviously important to understand what caused it in order to take effective preventive measures. Accident analysis always implies an accident model, i.e., a set of assumptions of what the underlying "mechanisms" are. Over the last 50-75 years there have been significant changes in accident models, leading to changes in the methods and goals of accident analysis. In parallel to this development the understanding of the role of humans in accidents, and of the nature of "human error", has also changed. This paper provides an overview of the developments, and outlines the consequences for contemporary accident analysis and prevention.

Making India a global healthcare hub is not only about introducing new technology; existing technology must also be properly managed. The healthcare sector is a leading factor in India's current economic growth, and Human Factors Engineering (HFE) plays a vital role in this field; in medical or healthcare settings the field is called Medical Human Factors Engineering (MHFE). This paper discusses how MHFE can strengthen hospital technology management, the hazards arising from device failure and device use, human factors considerations in medical device use, and a case study of use errors committed with infusion pumps in each clinical area. A key challenge for HFE is to design workspaces that prevent human error, and the four workspace design principles of Sanders & McCormick (1993) are also discussed. The paper deals with computer-aided design (CAD) systems and a failure mode and effects analysis (FMEA) technique, with a simple organizational structure for HFE in workspace design.
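
Since the paper leans on FMEA, the sketch below shows the standard risk priority number (RPN) ranking that the technique uses: each failure mode is scored for severity (S), occurrence (O) and detectability (D) on 1-10 scales and ranked by RPN = S * O * D. The infusion-pump failure modes and scores are illustrative only.

```python
failure_modes = [
    # (failure mode,               S, O, D)  -- hypothetical scores
    ("wrong infusion rate keyed",  9, 6, 5),
    ("occluded line undetected",   8, 4, 6),
    ("battery depleted mid-use",   7, 3, 2),
]

# Rank failure modes by risk priority number, highest risk first.
for name, s, o, d in sorted(failure_modes,
                            key=lambda m: m[1] * m[2] * m[3],
                            reverse=True):
    print(f"{name:28s} RPN = {s * o * d}")   # 270, 192, 42
```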

This article attempts to investigate the various types of threats that exist in healthcare information systems (HIS). A study has been carried out in one of the government-supported hospitals in Malaysia. The hospital has been equipped with a Total Hospital Information System (THIS). The data collected were from three different departments, namely the Information Technology Department (ITD), the Medical Record Department (MRD), and the X-Ray Department, using in-depth structured interviews. The study identified 22 types of threats according to major threat categories based on ISO/IEC 27002 (ISO 27799:2008). The results show that the most critical threat for the THIS is power failure, followed by acts of human error or failure and other technological factors. This research holds significant value in providing a complete taxonomy of threat categories in HIS, and forms an important component of the risk analysis stage.

Human error has been the subject of much debate over the past two decades. Alongside this debate, a number of methods have been developed to predict human error. In this paper a systems approach to the topic is proposed. A methodology based upon the approach has been developed and is reported. The methodology is illustrated by way of a case study. Future directions are indicated.

The Human Factors Analysis and Classification System (HFACS) is a general human error framework originally developed and tested within the U.S. military as a tool for investigating and analyzing the human causes of aviation accidents. Based upon Reason's (1990) model of latent and active failures, HFACS addresses human error at all levels of the system, including the condition of aircrew and organizational factors. The purpose of the present study was to assess the utility of the HFACS framework as an error analysis and classification tool outside the military. Specifically, HFACS was applied to commercial aviation accident records maintained by the National Transportation Safety Board (NTSB). Using accidents that occurred between January 1990 and December 1996, it was demonstrated that HFACS reliably accommodated all human causal factors associated with the commercial accidents examined. In addition, the classification of data using HFACS highlighted several critical safety issues in need of intervention research. These results demonstrate that the HFACS framework can be a viable tool for use within the civil aviation arena.

Human data entry can result in errors that ruin statistical results and conclusions. A single data entry error can make a moderate correlation turn to zero and a significant t-test non-significant. Therefore, researchers should design and use human computer interactions that minimize data entry errors. In this paper, 195 undergraduates were randomly assigned to three data entry methods: double entry, visual checking, and single entry. After training in their assigned method, participants entered 30 data sheets, each containing six types of data. Visual checking resulted in 2958% more errors than double entry, and was not significantly better than single entry. These data entry errors sometimes had terrible effects on coefficient alphas, correlations, and t-tests. For example, 66% of the visual checking participants produced incorrect values for coefficient alpha, which was sometimes wrong by more than .40. Moreover, these data entry errors would be hard to detect: Only 0.06% of the errors were blank or outside of the allowable range for the variables. Thus, researchers cannot rely upon histograms and frequency tables to detect data entry errors. Single entry and visual checking should be replaced with more effective data entry methods, such as double entry.
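
The mechanics of double entry are easy to show: the two independently keyed copies are reconciled field by field, so a slip survives only if it is made identically twice. A minimal sketch with invented data:

```python
first_pass  = [{"id": 1, "score": 23}, {"id": 2, "score": 41}]
second_pass = [{"id": 1, "score": 23}, {"id": 2, "score": 14}]  # transposition

def reconcile(a, b):
    """Return (record index, field, value_a, value_b) for every mismatch."""
    return [(i, k, ra[k], rb[k])
            for i, (ra, rb) in enumerate(zip(a, b))
            for k in ra
            if ra[k] != rb[k]]

print(reconcile(first_pass, second_pass))  # [(1, 'score', 41, 14)]
```

Note how the transposition error (41 keyed as 14) is caught even though both values lie inside the plausible range, which is exactly the class of error the histogram and range checks discussed above cannot detect.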

Equations for the mix design of structural lightweight concrete are presented. Conventionally, mix design of concrete is conducted using the tabular data and charts in standards. This requires extra effort to understand the data in the code, and interpolation is often required when intermediate values are needed. The process is also liable to human error, as data may be read incorrectly by the mix designer. The tabular data and graphs in ACI 211.2-98 are converted to equations. Various models were tried, and the best model that adequately represents the data was chosen based on the regression coefficient and its predictive capability. The equations were used to solve some mix design problems from reputable textual sources. The developed equations are capable of giving the material constituents for the first trial batch of structural lightweight concrete. These equations can be used in place of the data in the code and would reduce the effort, time and energy expended in the manual process of mix design of structural lightweight concrete. The equations are also useful for mixture proportioning adjustment.
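
The fitting-and-selection procedure the abstract describes can be sketched as follows, with placeholder data standing in for the ACI 211.2-98 tables; model choice here uses adjusted R^2 as a simple proxy for predictive capability.

```python
import numpy as np

# Placeholder (slump, water demand) pairs standing in for a code table.
x = np.array([25.0, 50.0, 75.0, 100.0, 150.0, 175.0])     # slump, mm
y = np.array([181.0, 190.0, 196.0, 202.0, 211.0, 216.0])  # water, kg/m^3

def adj_r_squared(y, y_hat, n_predictors):
    """Adjusted R^2, penalising extra polynomial terms."""
    n = len(y)
    r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
    return 1.0 - (1.0 - r2) * (n - 1) / (n - n_predictors - 1)

# Try polynomial degrees 1-3 and keep the best by adjusted R^2.
degree, coeffs = max(
    ((d, np.polyfit(x, y, d)) for d in (1, 2, 3)),
    key=lambda m: adj_r_squared(y, np.polyval(m[1], x), m[0]),
)
print("chosen degree:", degree)
print("water demand at 120 mm slump:", np.polyval(coeffs, 120.0))
```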

Back cover text: Real Social Science presents a new, hands-on approach to social inquiry. The theoretical and methodological ideas behind the book, inspired by Aristotelian phronesis, represent an original perspective within the social sciences, and this volume gives readers for the first time a set of studies exemplifying what applied phronesis looks like in practice. The reflexive analysis of values and power gives new meaning to the impact of research on policy and practice. Real Social Science is a major step forward in a novel and thriving field of research. This book will benefit scholars, researchers, and students who want to make a difference in practice, not just in the academy. Its message will make it essential reading for students and academics across the social sciences.

The 2003 invasion of Iraq by the United States yielded no evidence of Saddam's possession of weapons of mass destruction. This paper aims to analyze the intelligence used by the U.S. Bush Administration to justify the invasion of Iraq, and to ask whether the intelligence collection process was biased.

This technical note details the preliminary stage in the development of a postural analysis tool, Rapid Entire Body Assessment (REBA). REBA has been developed to fill a perceived need for a practitioner's field tool, specifically designed to be sensitive to the type of unpredictable working postures found in health care and other service industries. A team of ergonomists, physiotherapists, occupational therapists and nurses collected and individually coded over 600 postural examples to produce a new tool incorporating dynamic and static postural loading factors, the human-load interface (coupling), and a new concept of a gravity-assisted upper limb position. Initial reliability for inter-observer coding shows promise, but further work is needed to establish the validity of the tool.

As aircraft have become more reliable, humans have played a progressively more important causal role in aviation accidents, resulting in the proliferation of human error frameworks and accident investigation schemes. To date, however, few efforts have been made to systematically organize these different approaches based on underlying theoretical similarities, and formalized methods for evaluating the utility of these multiple approaches have not been clearly defined. Many safety professionals, therefore, have been at a loss when choosing which error analysis and prevention approach to use within their organizations. As a result, those tasked with instituting human-centered safety programs often rely primarily on personal experience and intuition to address their needs. The purpose of this article is to help remedy this situation by providing safety practitioners with an overview of the prominent human error perspectives in aviation, as well as a set of objective criteria for evaluating human error frameworks.

Outlier detection has been used for centuries to detect and, where appropriate, remove anomalous observations from data. Outliers arise due to mechanical faults, changes in system behaviour, fraudulent behaviour, human error, instrument error or simply through natural deviations in populations. Their detection can identify system faults and fraud before they escalate with potentially catastrophic consequences. It can also identify errors, removing their contaminating effect on the data set and thereby purifying the data for processing. The original outlier detection methods were arbitrary, but principled and systematic techniques are now used, drawn from the full gamut of computer science and statistics. In this paper, we introduce a survey of contemporary techniques for outlier detection. We identify their respective motivations and distinguish their advantages and disadvantages in a comparative review.
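
Two of the simplest detectors in the statistical family such surveys cover are the z-score rule and Tukey's IQR fences, sketched below with invented data; the thresholds are conventional defaults, not prescriptions from the paper.

```python
import numpy as np

data = np.array([9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 17.5])  # one gross error

# z-score rule (threshold 2 here because the sample is tiny; 3 is the
# conventional default for larger samples)
z = np.abs(data - data.mean()) / data.std()
print("z-score outliers:", data[z > 2])          # [17.5]

# Tukey's fences: anything beyond 1.5 * IQR from the quartiles
q1, q3 = np.percentile(data, [25, 75])
fence = 1.5 * (q3 - q1)
print("IQR outliers:", data[(data < q1 - fence) | (data > q3 + fence)])
```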

In recent years, the topic of car-following has become of increased importance in traffic engineering and safety research. Models of this phenomenon, which describe the interaction between (typically) adjacent vehicles in the same lane, now form the cornerstone for many important areas of research, including (a) simulation modelling, where the car-following model (amongst others) controls the motion of the vehicles in the network, and (b) the functional definition of advanced vehicle control and safety systems (AVCSS), which are being introduced as a driver safety aid in an effort to mimic driver behaviour while removing human error. Despite the importance of this area, however, no overview of the models' availability and validity exists. It is therefore the intent of this paper to briefly assess the range of options available in the choice of car-following model, and to assess just how far work has proceeded in our understanding of what, at times, would appear to be a simple process.
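
As one concrete example of the model class being surveyed, the sketch below implements a single acceleration update of the Intelligent Driver Model (Treiber et al.), in which the follower balances a free-flow term against a desired-gap term; the parameter values are typical textbook choices, not values from this paper.

```python
import math

def idm_acceleration(v, gap, dv, v0=33.3, T=1.5, a=1.0, b=2.0, s0=2.0):
    """One IDM step. v: follower speed, gap: bumper-to-bumper distance,
    dv: closing speed (follower minus leader); SI units throughout."""
    s_star = s0 + v * T + v * dv / (2.0 * math.sqrt(a * b))   # desired gap
    return a * (1.0 - (v / v0) ** 4 - (s_star / gap) ** 2)

# Follower at 30 m/s closing at 2 m/s on a 40 m gap: strong braking (~-2.6).
print(idm_acceleration(v=30.0, gap=40.0, dv=2.0))
```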

Background: We aimed to improve the quality and safety of handover of patients from surgery to intensive care using the analogy of a Formula 1 pit stop and expertise from aviation. Methods: A prospective intervention study measured the change in performance before and after the implementation of a new handover protocol that was developed through detailed discussions with a Formula 1 racing team and aviation training captains. Fifty (23 before and 27 after) post-surgery patient handovers were observed. Technical errors and information omissions were measured using checklists, and teamwork was scored using a Likert scale. Duration of the handover was also measured. Results: The mean number of technical errors was reduced from 5.42 (95% CI ±1.24) to 3.15 (95% CI ±0.71), the mean number of information handover omissions was reduced from 2.09 (95% CI ±1.14) to 1.07 (95% CI ±0.55), and the duration of handover was reduced from 10.8 min (95% CI ±1.6) to 9.4 min (95% CI ±1.29). Nine of twenty-three (39%) handovers before the intervention had more than one error in both technical and information handover, compared with three of twenty-seven (11.5%) with the new handover. Regression analysis showed that the number of technical errors was significantly reduced with the new handover (t = -3.63, P < 0.001), and an interaction suggested that teamwork (t = 3.04, P = 0.004) had a different effect with the new handover protocol. Conclusions: The introduction of the new handover protocol led to improvements in all aspects of the handover. Expertise from other industries can be extrapolated to improve patient safety and, in particular, areas of medicine involving the handover of patients or information.

The purpose of this paper is to identify the threats that exist in Healthcare Information Systems (HIS). The study was carried out in three different departments, namely the Information Technology Department (ITD), the Medical Record Department (MRD) and the X-Ray Department, in one of the leading government-supported hospitals in Malaysia. The hospital was equipped with a Total Hospital Information System (THIS) environment. The data were collected using in-depth structured interviews. The study identified 22 types of threats according to major threat categories based on ISO/IEC 27002 (ISO 27799:2008). The results show that the most critical threat to the THIS is power failure. In addition, the threat of acts of human error or failure also shows a high frequency of occurrence. The paper's contribution is a categorization of threats in HIS, which can be used to design and implement effective security systems and policies in healthcare settings.

Background: Teamwork in surgical teams is at the forefront of good practice guidelines and empirical research as an important aspect of safe surgery. We have developed a comprehensive assessment for teamwork in surgery, the Observational Teamwork Assessment for Surgery (OTAS), and we have tested it for general surgical procedures. The aim of the research reported here was to extend the assessment to urology procedures. Methods: After refining the original assessment, we used it to observe 50 urology procedures. The OTAS comprises a procedural task checklist that assesses patient, equipment/provisions, and communication tasks, as well as ratings on five team behavior constructs (communication, cooperation, coordination, leadership, and monitoring). Teamwork was assessed separately in the surgical, anesthesia, and nursing subteams in the operating theater. We also assessed the reliability of the behavioral scoring. Results: Regarding task completion, a number of communication and equipment/provisions tasks were not routinely performed during the operations we observed. Regarding teamwork-related behaviors, adequate reliability was obtained in the scoring of behaviors. Anesthetists and nurses obtained their lowest scores on communication.