Human Error
Human error has been cited as a primary cause of or contributing factor in disasters and accidents in industries as diverse as nuclear power (e.g., the Three Mile Island accident), aviation (see pilot error), space exploration (e.g., the Space Shuttle Challenger and Space Shuttle Columbia disasters), and medicine (see medical error). Prevention of human error is generally seen as a major contributor to the reliability and safety of (complex) systems.

Definition

Human error means that something has been done that was "not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits".[1] In short, it is a deviation from intention, expectation or desirability.[1] Logically, human actions can fail to achieve their goal in two different ways: the actions can go as planned, but the plan can be inadequate (leading to mistakes); or the plan can be satisfactory, but the performance can be deficient (leading to slips and lapses).[2][3] However, a mere failure is not an error if there had been no plan to accomplish something in particular.[1]

Performance

Human error and performance are two sides of the same coin: "human error" mechanisms are the same as "human performance" mechanisms, and performance later categorized as "error" is done so in hindsight.[4][5] Actions later termed "human error" are therefore actually part of the ordinary spectrum of human behaviour. The study of absent-mindedness in everyday life provides ample documentation and categorization of such aspects of behavior. While human error is firmly entrenched in the classical approaches to accident investigation and risk assessment, it has no role in newer approaches such as resilience engineering.[6]

Categories

There are many ways to categorize human error,[7][8] for example:

exogenous versus endogenous (i.e., originating outside versus inside the individual)[9]

situation assessment versus response planning,[10] and related distinctions in errors of problem detection
Human error (slips and mistakes)

James Reason (1990) has extensively analysed human errors and distinguishes between mistakes and slips. Mistakes are errors in choosing an objective or specifying a method of achieving it, whereas slips are errors in carrying out an intended method for reaching an objective (Sternberg 1996). As Norman (1986: p. 414) explains: "The division occurs at the level of the intention: A person establishes an intention to act. If the intention is not appropriate, this is a mistake. If the action is not what was intended, this is a slip."
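Norman's division at the level of the intention can be rendered as a minimal decision rule. This is an illustrative sketch only; the `Action` and `classify` names are hypothetical, not drawn from the literature:

```python
from dataclasses import dataclass

@dataclass
class Action:
    intention_appropriate: bool  # was the chosen objective/method right?
    executed_as_intended: bool   # was the method carried out as planned?

def classify(action: Action) -> str:
    """Apply Norman's division at the level of the intention."""
    if not action.intention_appropriate:
        return "mistake"  # wrong plan, however well executed
    if not action.executed_as_intended:
        return "slip"     # right plan, faulty execution
    return "correct"

# Choosing an unsuitable tool for the goal is a mistake;
# forgetting a step of an otherwise correct method is a slip.
print(classify(Action(False, True)))  # mistake
print(classify(Action(True, False)))  # slip
```

The key design point the rule captures is that the mistake test comes first: an inappropriate intention makes the action a mistake regardless of how faithfully it was executed.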
For example, a mistake would be to buy a Microsoft Excel licence because you want to store data that should be made accessible to web clients through SQL queries: Microsoft Excel is not designed for that purpose. In other words, you choose a wrong method for achieving your objective. However, if you installed a PostgreSQL server for the same reason but in your haste forgot to give the program privileges to go through your firewall, that would be a slip. You chose the right method for achieving your objective, but you made an error in carrying out the method. Both Reason (1990) and Norman (1988) have described several kinds of slips (see "related terms" below). According to Sternberg (1996), "slips are most likely to occur (a) when we must deviate from a routine, and automatic processes inappropriately override intentional, controlled processes; or (b) when automatic processes are interrupted - usually as a result of external events or data, but sometimes as a result of internal events, such as highly distracting thoughts." See the glossary term Capture Error for an example. Overall, it should be noted that "The designer shouldn't think of a simple dichotomy between errors and correct behavior: rather, the entire interaction should be treated as a cooperative endeavor between person and machine, one in which misconceptions can arise on either side." (Norman, 1988: p. 140)

References

Lewis, Clayton H. and Norman, Donald A. (1986): Designing for Error. In: Norman, Donald A. and Draper, Stephen W. (eds.), "User Centered System Design: New Perspectives on Human-Computer Interaction".

Norman, Donald A. (1988): The Design of Everyday Things. Doubleday.

Reason, James (1990): Human Error. Cambridge University Press.

Sternberg, Robert J. (1996): Cognitive Psychology. 2nd ed. Harcourt.

A generalized process model of human action selection and error and its application to error prediction

Publication type: Conference paper
Authors: Tamborello, F. and Trafton, J. G.
Conference: 36th Annual Conference of the Cognitive Science Society
Date published: 07/2014
Location: Quebec City, Canada

Abstract: Our model of action selection and postcompletion error in two form-filling tasks extends to skip errors in a story-telling task. We also discuss how it explains perseverations in one of the aforementioned form-filling tasks. Finally, we discuss a predictive classifier application we built from the model's data. The classifier could allow an autonomous agent to know when it is a bad time to interrupt a human, when a human is about to err, and how to help.
Full text: http://www.nrl.navy.mil/itd/aic/sites/www.nrl.navy.mil.itd.aic/files/pdfs/A%20Generalized%20Process%20Model%20of%20Human%20Action%20Select.pdf (NRL publication release number 14-1231-1358)
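The postcompletion errors mentioned in the abstract can be illustrated with a toy sketch. This is not the authors' model, and `postcompletion_risk` is a hypothetical name; the sketch only encodes the standard observation that steps due after the main goal is already satisfied are the classic omission sites:

```python
def postcompletion_risk(steps: list[str], goal_step: str) -> dict[str, str]:
    """Toy illustration (not the NRL model): steps that come after the
    main goal is achieved are classic postcompletion-error sites, because
    the motivating goal is already satisfied when they fall due."""
    goal_index = steps.index(goal_step)
    return {step: ("high" if i > goal_index else "low")
            for i, step in enumerate(steps)}

# An ATM withdrawal: once the cash (the goal) is in hand, retrieving
# the card is the step people tend to omit.
risk = postcompletion_risk(
    ["insert card", "enter PIN", "take cash", "retrieve card"],
    goal_step="take cash")
print(risk["retrieve card"])  # high
```

A real predictive classifier, as the abstract describes, would learn such risk estimates from behavioral data rather than from step ordering alone.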