
Systemgestaltung und Automatisierung

Chapter in: Human Factors


References

  • Aeronautica Civil of the Republic of Colombia (1996). AA965 Cali accident report. Prepared for the WWW by Peter Ladkin, University of Bielefeld, Germany. [available at http://www.rvs.uni-bielefeld.de/publications/Incidents/DOCS/ComAndRep/Cali/calirep.html; accessed 12 June 2011]

  • Annett, J. & Stanton, N. (Eds.) (2000). Task analysis. London: Taylor & Francis.

  • Bahner, J. E., Hüper, A.-D. & Manzey, D. (2008). Misuse of automated decision aids: Complacency, automation bias and the impact of training experience. International Journal of Human-Computer Studies, 66, 688–699.

  • Bainbridge, L. (1983). Ironies of automation. Automatica, 19, 775–779.

  • Billings, C. E. (1997). Aviation automation: The search for a human-centered approach. Mahwah: Lawrence Erlbaum.

  • Bliss, J. P. & Fallon, C. K. (2006). Active warnings: False alarms. In M. S. Wogalter (Ed.), Handbook of warnings (pp. 231–242). Mahwah: Lawrence Erlbaum.

  • Boeing Commercial Airplane Group (2005). Statistical summary of commercial jet aircraft accidents: Worldwide operations 1959–2005. [available at http://www.boeing.com/news/techissues/pdf/statsum.pdf; accessed 22 September 2006]

  • Breznitz, S. (1983). Cry wolf: The psychology of false alarms. Hillsdale: Lawrence Erlbaum.

  • Byrne, E. A. & Parasuraman, R. (1996). Psychophysiology and adaptive automation. Biological Psychology, 42, 249–268.

  • Christoffersen, K. & Woods, D. D. (2002). How to make automated systems team players. In E. Salas (Ed.), Advances in human performance and cognitive engineering research, Vol. 2: Automation (pp. 1–12). Burlington: Elsevier.

  • Dearden, A., Harrison, M. & Wright, P. (2000). Allocation of function: Scenarios, context and the economics of effort. International Journal of Human-Computer Studies, 52, 289–318.

  • Dzindolet, M. T., Peterson, S. A., Pomranky, R. A., Pierce, L. G. & Beck, H. P. (2003). The role of trust in automation reliance. International Journal of Human-Computer Studies, 58, 697–718.

  • Endsley, M. R. (1995). Toward a theory of situation awareness in dynamic systems. Human Factors, 37, 85–104.

  • Endsley, M. R., Bolté, B. & Jones, D. G. (2003). Designing for situation awareness: An approach to user-centered design. London: Taylor & Francis.

  • Endsley, M. R. & Kiris, E. O. (1995). The out-of-the-loop performance problem and level of control in automation. Human Factors, 37, 381–394.

  • Fitts, P. M. (Ed.) (1951). Human engineering for an effective air navigation and traffic-control system. Columbus, OH: Ohio State University Research Foundation.

  • Gérard, N. & Manzey, D. (2010). Are false alarms not as bad as supposed after all? A study investigating operators’ responses to imperfect alarms. In D. de Waard, A. Axelsson, M. Berglund, B. Peters & C. Weikert (Eds.), Human factors: A system view of human, technology and organisation (pp. 55–69). Maastricht: Shaker.

  • Grote, G., Ryser, C., Wäfler, T., Windischer, A. & Weik, S. (2000). KOMPASS: A method for complementary function allocation in automated work systems. International Journal of Human-Computer Studies, 52, 267–287.

  • Hacker, W. (1989). Vollständige vs. unvollständige Arbeitstätigkeiten. In S. Greif, H. Holling & N. Nicholson (Eds.), Arbeits- und Organisationspsychologie: Internationales Handbuch in Schlüsselbegriffen (pp. 463–466). München: Psychologie Verlags Union.

  • Hackman, J. R. & Oldham, G. R. (1980). Work redesign. Reading: Addison-Wesley.

  • Hauß, Y. & Timpe, K.-P. (2000). Automatisierung und Unterstützung im Mensch-Maschine-System. In K.-P. Timpe, T. Jürgensohn & H. Kolrep (Eds.), Mensch-Maschine-Systemtechnik: Konzepte, Modellierung, Gestaltung, Evaluation (pp. 41–62). Düsseldorf: Symposion.

  • Kaber, D. B. & Riley, J. M. (1999). Adaptive automation of a dynamic control task based on secondary-task workload measurement. International Journal of Cognitive Ergonomics, 3, 169–187.

  • Kaber, D. B., Riley, J. M., Tan, K.-W. & Endsley, M. R. (2001). On the design of adaptive automation for complex systems. International Journal of Cognitive Ergonomics, 5, 37–57.

  • Kessel, C. J. & Wickens, C. D. (1982). The transfer of failure-detection skills between monitoring and controlling dynamic systems. Human Factors, 24, 49–60.

  • Lee, J. D. & See, K. A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors, 46, 50–80.

  • Lorenz, B., Di Nocera, F., Röttger, S. & Parasuraman, R. (2002). Automated fault-management in a simulated spaceflight micro-world. Aviation, Space, and Environmental Medicine, 73, 886–897.

  • Madhavan, P., Wiegmann, D. A. & Lacson, F. C. (2006). Automation failures on tasks easily performed by operators undermine trust in automated aids. Human Factors, 48, 241–256.

  • Manzey, D. & Bahner, J. E. (2005). Vertrauen in Automation als Aspekt der Verlässlichkeit von Mensch-Maschine-Systemen. In K. Karrer, B. Gauss & C. Steffens (Eds.), Beiträge zur Mensch-Maschine-Systemtechnik aus Forschung und Praxis: Festschrift für Klaus-Peter Timpe (pp. 93–109). Düsseldorf: Symposion.

  • Meyer, J. (2004). Conceptual issues in the study of hazard warnings. Human Factors, 46, 196–204.

  • Moray, N. & Inagaki, T. (2000). Attention and complacency. Theoretical Issues in Ergonomics Science, 1, 354–365.

  • Mosier, K. L. & Skitka, L. J. (1996). Human decision makers and automated decision aids: Made for each other? In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance: Theory and applications (pp. 201–220). Mahwah: Lawrence Erlbaum.

  • Parasuraman, R. & Manzey, D. (2010). Complacency and bias in human use of automation: An attentional integration. Human Factors, 52, 381–410.

  • Parasuraman, R., Molloy, R. & Singh, I. L. (1993). Performance consequences of automation-induced »complacency«. The International Journal of Aviation Psychology, 3, 1–23.

  • Parasuraman, R., Mouloua, M. & Molloy, R. (1996). Effects of adaptive task allocation on monitoring of automated systems. Human Factors, 38, 665–679.

  • Parasuraman, R. & Riley, V. (1997). Humans and automation: Use, misuse, disuse, abuse. Human Factors, 39, 230–253.

  • Parasuraman, R., Sheridan, T. B. & Wickens, C. D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 30, 286–297.

  • Patrick, J. (1992). Training: Research and practice. London: Academic Press.

  • Reichenbach, J., Onnasch, L. & Manzey, D. (2010). Misuse of automation: The impact of system experience on complacency and automation bias in interaction with automated aids. Proceedings of the Human Factors and Ergonomics Society 54th Annual Meeting (pp. 374–378). Santa Monica: Human Factors and Ergonomics Society.

  • Scallen, S. F. & Hancock, P. A. (2001). Implementing adaptive functional allocation. The International Journal of Aviation Psychology, 11, 197–221.

  • Scerbo, M. W. (1996). Theoretical perspectives on adaptive automation. In R. Parasuraman & M. Mouloua (Eds.), Automation and human performance: Theory and applications (pp. 37–63). Mahwah: Lawrence Erlbaum.

  • Sheridan, T. B. (1997). Supervisory control. In G. Salvendy (Ed.), Handbook of human factors (pp. 1295–1327). New York: Wiley.

  • Singh, I. L., Molloy, R. & Parasuraman, R. (1993). Automation-induced »complacency«: Development of the complacency rating scale. The International Journal of Aviation Psychology, 3, 111–122.

  • Sorkin, R. D., Kantowitz, B. H. & Kantowitz, S. C. (1988). Likelihood alarm displays. Human Factors, 30, 445–459.

  • Vicente, K. (1999). Cognitive work analysis. Mahwah: Lawrence Erlbaum.

  • Wiczorek, R. & Manzey, D. (2011). Evaluating likelihood alarm systems as an alternative to binary alarm systems. In D. de Waard, N. Gérard, L. Onnasch, R. Wiczorek & D. Manzey (Eds.), Human centred automation. Maastricht: Shaker. (in press)



Copyright information

© 2012 Springer-Verlag Berlin Heidelberg


Cite this chapter

Manzey, D. (2012). Systemgestaltung und Automatisierung. In: Badke-Schaub, P., Hofinger, G., Lauche, K. (eds) Human Factors. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-19886-1_19
