Hollnagel Human Error
Copyright © Erik Hollnagel 2016. All Rights Reserved.
Erik Hollnagel, Ph.D., Professor Emeritus

The NO view of 'human error'

'Human error' has been the focus of much debate and many arguments for nearly 50 years. The term came to the fore during the human factors surge that followed the accident at Three Mile Island in 1979. In the rush to use 'human error' to explain accidents and incidents, it was generally overlooked th
Human performance can be affected by many factors, such as age, state of mind, physical health, attitude, emotions, propensity for certain common mistakes, and cognitive biases. Human reliability is very important because of the contributions of humans to the resilience of systems and to the possible adverse consequences of human errors or oversights, especially when the human is a crucial part of a large socio-technical system, as is common today. User-centered design and error-tolerant design are just two of many terms used to describe efforts to make technology better suited to operation by humans.

Analysis techniques

A variety of methods exist for human reliability analysis (HRA).[1][2] Two general classes of methods are those based on probabilistic risk assessment (PRA) and those based on a cognitive theory of control.

PRA-based techniques

One method for analyzing human reliability is a straightforward extension of probabilistic risk assessment (PRA): in the same way that equipment can fail in a power plant, so can a human operator commit errors. In both cases, an analysis (functional decomposition for equipment, task analysis for humans) articulates a level of detail at which failure or error probabilities can be assigned.
This basic idea lies behind the Technique for Human Error Rate Prediction (THERP).[3] THERP is intended to generate human error probabilities that can be incorporated into a PRA. The Accident Sequence Evaluation Program (ASEP) human reliability procedure is a simplified form of THERP; an associated computational tool is the Simplified Human Error Analysis Code (SHEAN).[4] More recently, the US Nuclear Regulatory Commission has published the Standardized Plant Analysis Risk - Human Reliability Analysis (SPAR-H) method to take account of the potential for human error.
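The PRA-style treatment of human error described above can be sketched numerically: a nominal human error probability (HEP) for a task is adjusted by multipliers representing performance shaping factors (PSFs), roughly in the spirit of SPAR-H. The sketch below is illustrative only; the nominal HEP and PSF multipliers are made-up placeholders, not values from THERP or SPAR-H tables.

```python
# Illustrative sketch of a PRA-style human error probability (HEP)
# calculation, loosely in the spirit of SPAR-H: a nominal HEP is
# multiplied by performance shaping factor (PSF) multipliers.
# All numeric values here are assumed placeholders, not figures
# taken from the actual method.

NOMINAL_HEP = 0.001  # assumed nominal error probability for a routine task

# Hypothetical PSF multipliers: > 1.0 degrades performance,
# 1.0 is nominal, < 1.0 improves performance.
psf_multipliers = {
    "available_time": 1.0,  # adequate time for the task
    "stress": 2.0,          # high stress
    "experience": 0.5,      # well-trained, experienced operator
    "ergonomics": 1.0,      # nominal human-machine interface
}

def adjusted_hep(nominal, psfs):
    """Multiply the nominal HEP by each PSF multiplier, capping at 1.0."""
    hep = nominal
    for factor in psfs.values():
        hep *= factor
    return min(hep, 1.0)  # a probability cannot exceed 1.0

hep = adjusted_hep(NOMINAL_HEP, psf_multipliers)
print(f"Adjusted HEP: {hep:.4g}")  # 0.001 * 1.0 * 2.0 * 0.5 * 1.0 = 0.001
```

The adjusted probability would then feed into the overall PRA event tree alongside equipment failure probabilities; the cap at 1.0 mirrors how quantification schemes bound the result when several degraded PSFs compound.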
guys at safety that it was dangerous and one day we would lose concentration and pay for it. I already told those guys at safety that it was very dangerous! We are human and this can happen to us. This curve is inhuman!" These are the distressed words of the injured train driver moments after the train derailment in Santiago de Compostela, north-western Spain, on 25 July 2013. The driver can be heard pleading in sorrow, hoping for the safety of the passengers: "I have turned over. My God, my God, the poor passengers. I hope no-one is dead. I hope. I hope." Seventy-nine people died. In the aftermath of the accident, initial investigations ruled out mechanical or technical failure, sabotage and terrorism. That appeared to leave only two possible explanations – 'human error' or 'recklessness', or both. When society demands someone to blame, the difference – whatever it might be – can seem trivial. What followed was a display of our instinct to find a simple explanation and someone to blame. Soon, the explanation and the blame pointed to the driver. The Galicia regional government president, Alberto Nunez Feijoo, stated that "The driver has acknowledged his mistake". Meanwhile, Jorge Fernandez Diaz, Spain's Interior Minister, said that there "were reasonable grounds to think he may have a potential liability" and confirmed that he could face multiple charges of reckless manslaughter. While safety investigations are ongoing, the driver faces preliminary charges of 79 counts of homicide by professional recklessness and numerous counts of bodily harm. Several claims appeared about the driver in the media, often without relevant context. It was reported that the driver "admitted speeding" on the occasion of the crash [1]. It is known that the train was travelling at more than twice the speed limit on the curve just before the crash.
The train's black boxes showed that the train was travelling at 192 kph moments before the crash; the speed limit on the curve was 80 kph. The implication was that the speeding was reckless. The media pounced on an old Facebook post, reportedly made by the driver over a year earlier, about the speeds at which his trains would travel. One post, reported by Spanish media and attributed to the driver, stated: "It would be amazing to go alongside police and overtake them and trigger off t