Automation und Unternehmensverwaltung
In: Veröffentlichungen der Schmalenbach-Gesellschaft 27
In: Human factors: the journal of the Human Factors Society, Vol. 64, Issue 2, pp. 269-277
ISSN: 1547-8181
Objective: Identify a critical research gap for the human factors community with implications for successful human–automation teaming.
Background: There are a variety of approaches for applying automation in systems. Flexible application of automation, such that its level and/or type changes during system operation, has been shown to enhance human–automation system performance.
Method: This mini-review describes flexible automation, in which the level of automated support varies across tasks during system operation rather than remaining fixed. Two types are distinguished by the locus of authority to change the automation level: adaptable automation (the human operator assigns how automation is applied), which has been found to aid the human's situation awareness and provide a greater sense of control, and adaptive automation (the system assigns the automation level), which may impose lower workload and attentional demands by automatically adjusting levels in response to changes in one or more states of the human, task, environment, and so on.
Results: In contrast to the vast investment in adaptive automation approaches, limited research has been devoted to adaptable automation. Experiments directly comparing adaptable and adaptive automation are particularly scant. These few studies show that adaptable automation was not only preferred over adaptive automation but also yielded better task performance and, notably, lower perceived workload.
Conclusion: Systematic research examining adaptable automation is overdue, including hybrid approaches that combine it with adaptive automation. Specific recommendations for further research are provided.
Application: Adaptable automation, together with effective human-factored interface designs for establishing working agreements, is key to enabling human–automation teaming in future complex systems.
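The distinction drawn in this abstract, namely who holds the authority to change the automation level, can be sketched as a toy control-loop step. The `Mode` names, the 0–3 level scale, and the workload thresholds below are illustrative assumptions, not taken from the paper:

```python
from enum import Enum


class Mode(Enum):
    ADAPTABLE = "adaptable"  # the human operator assigns the automation level
    ADAPTIVE = "adaptive"    # the system assigns the automation level


def next_level(mode: Mode, operator_request: int, workload: float) -> int:
    """Return the automation level (0 = manual .. 3 = full) for the next cycle.

    In adaptable mode the locus of authority is the operator, so their
    request is used directly (clamped to the valid range).  In adaptive
    mode the system raises the level as measured workload grows.
    """
    if mode is Mode.ADAPTABLE:
        return max(0, min(3, operator_request))
    # Adaptive: map a workload estimate in [0, 1] onto levels 0..3.
    if workload < 0.25:
        return 0
    if workload < 0.5:
        return 1
    if workload < 0.75:
        return 2
    return 3
```

A hybrid approach, as the conclusion suggests, would combine both paths, e.g. letting the adaptive rule propose a level that the operator may override.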
In: Human factors: the journal of the Human Factors Society, Vol. 55, Issue 6, pp. 1130-1141
ISSN: 1547-8181
Objective: The aim of this study was to evaluate whether communicating automation uncertainty improves the driver–automation interaction.
Background: A false understanding of the system as infallible may provoke automation misuse and can lead to severe consequences in case of automation failure. Presenting automation uncertainty may prevent this false understanding and, as previous studies have shown, may have numerous benefits. Few studies, however, have clearly shown the potential of communicating uncertainty information in driving. The current study fills this gap.
Method: We conducted a driving simulator experiment, varying the presented uncertainty information between participants (no uncertainty information vs. uncertainty information) and the automation reliability (high vs. low) within participants. Participants interacted with a highly automated driving system while engaging in secondary tasks and were required to cooperate with the automation to drive safely.
Results: Quantile regressions and multilevel modeling showed that presenting uncertainty information increases the time to collision in the case of automation failure. The data further indicated improved situation awareness and better knowledge of the system's fallibility in the experimental group. Consequently, the automation with the uncertainty symbol received higher trust ratings and greater acceptance.
Conclusion: Presenting automation uncertainty through a symbol improves overall driver–automation cooperation.
Application: Most automated driving systems could benefit from displaying reliability information. Such a display might improve the acceptance of fallible systems and further enhance driver–automation cooperation.
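The display idea in this abstract, mapping a reliability estimate onto an uncertainty symbol shown to the driver, can be sketched as a simple bucketing function. The bucket boundaries and labels below are invented for illustration; the study's actual symbol design is not described here:

```python
def uncertainty_symbol(estimated_reliability: float) -> str:
    """Bucket an automation reliability estimate in [0, 1] into a display label.

    The idea is to make the automation's fallibility visible so the driver
    neither over-trusts nor ignores it.  Thresholds are assumptions.
    """
    if not 0.0 <= estimated_reliability <= 1.0:
        raise ValueError("reliability must lie in [0, 1]")
    if estimated_reliability >= 0.9:
        return "HIGH"    # automation very likely correct
    if estimated_reliability >= 0.6:
        return "MEDIUM"  # driver should monitor closely
    return "LOW"         # driver should prepare to take over
```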
In: Rutgers, the State University, Stonier Graduate School of Banking, Rutgers Banking Series
In: Human factors: the journal of the Human Factors Society, Vol. 57, Issue 5, pp. 728-739
ISSN: 1547-8181
Objective: We examine the effects of two different kinds of decision-aiding automation errors on human–automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, which triggers the automation bias, and missing advice, which reflects complacency.
Background: Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic for HAI than alerting misses. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy.
Method: Participants performed an environmental process-control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance; an unexpected automation error was then imposed, in which the automation was either gone (one group) or wrong (a second group). A control group received no automation support.
Results: The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," which reflects the impact of complacency. Some complacency was nonetheless manifested for "automation gone," through a longer latency and a more modest reduction in accuracy.
Conclusions: "Automation wrong," which creates the automation bias, appears to be a more problematic form of automation error than "automation gone," which reflects complacency.
Implications: Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias.
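The implication of this abstract, that a decision aid should expose its confidence rather than issue advice it is unsure of, can be sketched as follows. The threshold value and function shape are assumptions for illustration, not the study's design:

```python
from typing import Optional, Tuple


def aid_advice(diagnosis: str, confidence: float,
               display_threshold: float = 0.5) -> Tuple[Optional[str], float]:
    """Return (advice, confidence); advice is None when confidence is too low.

    Rationale from the abstract: wrong advice triggers the automation bias
    and degrades accuracy far more than absent advice does, so suppressing
    low-confidence advice ("automation gone") is the lesser harm compared
    with issuing it anyway ("automation wrong").
    """
    if confidence < display_threshold:
        return None, confidence  # withhold advice, but still show confidence
    return diagnosis, confidence
```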
In: Current history: a journal of contemporary world affairs, Vol. 36, pp. 333-337
ISSN: 0011-3530
In: New world review, Vol. 25, pp. 25-32
ISSN: 0028-7067
In: International Journal of Research Publication and Reviews, Vol. 3, Issue 5, pp. 1408-1410
SSRN
In: The Freeman: ideas on liberty, Vol. 14, pp. 31-38
ISSN: 0016-0652, 0445-2259
In: Rowohlts deutsche Enzyklopädie 124
In: Subject area: Economics (Wirtschaftswissenschaften)