Human Error And The Design Of Computer Systems
In 1988, the Soviet Union's Phobos 1 satellite was lost on its way to Mars. Why? According to Science magazine, "not long after the launch, a ground controller omitted a single letter in a series of digital commands sent to the spacecraft. And by malignant bad luck, that omission caused the code to be mistranslated in such a way as to trigger the test sequence" (the test sequence was stored in ROM, but was intended to be used only during checkout of the spacecraft while on the ground) [7]. Phobos went into a tumble from which it never recovered.

What a strange report. "Malignant bad luck"? Why bad luck: why not bad design? Wasn't the problem the design of the command language that allowed such a simple deviant event to have such serious consequences? The effects of electrical noise on signal detectability, identification, and reliability are well known, and designers are expected to use error-detecting and error-correcting codes. Suppose interference from known sources of electromagnetic noise had corrupted the signal to Phobos. We would not blame the ground controllers: we would say that the system designers did not follow standard engineering practice, and we would reconsider the design of the system so as to protect against this problem in the future.

People err. That is a fact of life. People are not precision machinery designed for accuracy. In fact, we humans are a different kind of device entirely. Creativity, adaptability, and flexibility are our strengths. Continual alertness and precision in action or memory are our weaknesses. We are amazingly error tolerant, even when physically damaged. We are extremely flexible, robust, and creative, superb at finding explanations and meanings from partial and noisy evidence. The same properties that lead to such robustness and creativity also produce errors. The natural tendency to interpret partial information -- although often our prime virtue -- can cause operators to misinterpret system behavior in such a plausible way that the misinterpretation can be difficult to discover.

Quite a lot is known about human performance and the way it applies to system interaction [1]. Several classes of human error have been identified and studied, and conditions that increase the likelihood of error can be specified in advance [3, 4, 5]. Communication systems can be designed to be error-tolerant and error-detecting or error-correcting. In a similar way, we could devise a science of error-tolerant, error-detecting, and error-minimizing interactions with human operators [2].
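The point about error-detecting codes applies as much to a command uplink as to a noisy radio channel: a receiver should be able to reject a mistyped or truncated command outright rather than execute a "mistranslation" of it. A minimal sketch of the idea in Python, using a CRC-32 check (the framing format and the example command are illustrative assumptions, not the actual Phobos protocol):

```python
import zlib

def frame_command(cmd: str) -> str:
    """Append a CRC-32 checksum so the receiver can detect corruption."""
    return f"{cmd}*{zlib.crc32(cmd.encode()):08x}"

def accept(frame: str) -> str:
    """Verify the checksum before handing the command to the executor.
    A single omitted letter changes the CRC, so the frame is rejected
    instead of being executed as a different, valid-looking command."""
    cmd, _, crc = frame.rpartition("*")
    if f"{zlib.crc32(cmd.encode()):08x}" != crc:
        raise ValueError("checksum mismatch: command rejected, not executed")
    return cmd
```

With this framing, dropping one character from the uplinked frame produces a rejection and a retransmission request, not a silently different command.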
From HCI International 2011 – Posters' Extended Abstracts (Communications in Computer and Information Science, vol. 173, pp. 270-274): "A Study on Human Error in the Interaction with the Computer Systems," by Luiz Carlos Begosso (Department of Computer Science, Fundação Educacional do Município de Assis - FEMA; Faculdade de Tecnologia de Ourinhos - FATEC), Maria Alice Siqueira Mendes Silva (Department of Psychology, Universidade Estadual Paulista - UNESP), and Thiago Henrique Cortez (Department of Computer Science, Fundação Educacional do Município de Assis - FEMA). http://link.springer.com/chapter/10.1007%2F978-3-642-22098-2_54

Abstract: The term human factor is used by professionals in various fields to understand the behavior of human beings at work. A human being carrying out a cooperative activity with a computer system may cause an undesirable situation in his or her task. This paper starts from the principle that human errors may be considered a cause of, or a factor contributing to, a series of accidents and incidents in the many diverse fields in which human beings interact with automated systems. We propose a simulator of performance in error with the potential to assist the Human-Computer Interaction (HCI) project manager in the construction of critical systems.
Keywords: Computer Systems; Human-Computer Interaction; Human Error; Simulator of Performance in Error.
Many failures are attributed to a poorly designed human-computer interface (HCI). However, human beings are often needed to be the fail-safe in an otherwise automated system. Even the most highly trained and alert operators are prone to boredom when they are not needed for normal operation, and to panic when an unusual situation occurs, stress levels rise, and lives are at stake. The HCI must give appropriate feedback to the operator to allow him or her to make well-informed decisions based on the most up-to-date information on the state of the system. A high false-alarm rate will teach the operator to ignore a real alarm condition. Methods for determining the effectiveness of an HCI exist, such as heuristic evaluation, cognitive walkthroughs, and empirical evaluations like protocol analysis, but they are often cumbersome and do not provide conclusive data on the safety and usability of an HCI. System designers must ensure that the HCI is easy and intuitive for human operators to use, but not so simple that it lulls the operator into a state of complacency and lowers his or her responsiveness to emergency situations. (Source: https://users.ece.cmu.edu/~koopman/des_s99/human/)

Contents: Introduction; Key Concepts; Sources of Human Error; HCI Problems; Available Tools, Techniques, and Metrics; HCI Design; Heuristic Evaluation; Cognitive Walkthrough; Protocol Analysis; MetriStation; Relationship to Other Topics; Conclusions; Annotated Reference List & Further Reading.

Introduction

In any complex system, most errors and failures can be traced to a human source. Incomplete specifications, design defects, and implementation errors such as software bugs and manufacturing defects are all caused by human beings making mistakes. However, when looking at human errors in the context of embedded systems, we tend to focus on operator errors and errors caused by a poor human-computer interface (HCI).
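The false-alarm problem mentioned above is partly a design parameter: an alarm that fires on every transient teaches operators to ignore it. One common mitigation is to require persistence, alarming only when a threshold is exceeded in k of the last n samples. A minimal sketch (the threshold, window size, and vote count are illustrative assumptions, not values from the text):

```python
from collections import deque

class DebouncedAlarm:
    """Alarm only when the threshold is exceeded in k of the last n
    samples, trading a short detection delay for fewer false alarms."""

    def __init__(self, threshold: float, n: int = 5, k: int = 4):
        self.threshold = threshold
        self.k = k
        self.window = deque(maxlen=n)  # rolling record of exceedances

    def sample(self, value: float) -> bool:
        self.window.append(value > self.threshold)
        return sum(self.window) >= self.k
```

A single-sample glitch never trips the alarm, while a sustained excursion still does, a few samples late. Whether that latency is acceptable is itself a safety judgment.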
Human beings have common failure modes, and certain conditions make it more likely for a human operator to make a mistake. A good HCI design can encourage the operator to perform correctly and protect the system.
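One concrete way an HCI can encourage correct performance is a forcing function on hazardous commands: routine commands execute immediately, but destructive ones require the operator to retype the command name, which interrupts slips without blocking deliberate action. A hypothetical sketch (the command names and confirmation rule are invented for illustration):

```python
# Hypothetical command gate: hazardous commands need explicit confirmation.
HAZARDOUS = {"ENTER_TEST_MODE", "DISABLE_ALARMS"}

def dispatch(command: str, confirm: str = "") -> str:
    """Execute routine commands at once; gate hazardous ones behind a
    retyped confirmation, a forcing function against operator slips."""
    if command in HAZARDOUS and confirm != command:
        return f"refused {command}: retype the command name to confirm"
    return f"executed {command}"
```

Retyping the name is deliberately costlier than clicking "yes": a confirmation dialog that is always answered the same way quickly becomes part of the automatic action sequence it was meant to interrupt.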