Situation awareness: Valid or fallacious?
by Oliver Carsten, Frédéric Vanderhaegen

Cognition, Technology & Work

About

Year: 2015
DOI: 10.1007/s10111-015-0319-1
Subject: Applied Psychology / Philosophy

Similar

Individual Differences in Situation Awareness: Validation of the Situationism Scale

Authors:
Megan E. Roberts, Frederick X. Gibbons, Meg Gerrard, William M. P. Klein
2015

Situation awareness in obsessive-compulsive disorder

Authors:
Selim Tumkaya, Filiz Karadag, Shane T. Mueller, Tugce T. Ugurlu, Nalan K. Oguzhanoglu, Osman Ozdel, Figen C. Atesci, Mustafa Bayraktutan
2013

Situation Awareness: Icons vs. Alphanumerics

Authors:
B. A. Steiner, M. J. Camacho
1989

Adaptive Information Fusion for Situation Awareness in the Cockpit

Authors:
S. M. Waldron, G. B. Duggan, J. Patrick, S. Banbury, A. Howes
2005

Text

EDITORIAL

Situation awareness: Valid or fallacious?

Oliver Carsten • Frédéric Vanderhaegen

Published online: 22 January 2015 / © Springer-Verlag London 2015

This issue of CTW opens with a special discussion section on situation awareness (SA) which follows on from the earlier such section on workload [CTW 16(3)]. The concept of SA has had massive purchase both in academic literature and in accident investigations. Indeed, it has entered into everyday parlance. That, however, does not guarantee its scientific validity. There are a number of pertinent questions. Does it describe process or product? Does it add anything to standard models of information processing? What is its relationship to the somewhat similar concept of being "out-of-the-loop"? And can it be reliably measured? Is it equivalent to mindfulness as used in social and medical sciences or to sensemaking as used in information science and organisational studies?

Ten years ago in this journal, Dekker and Hollnagel (2004) depicted SA as one of a series of "folk models" used in human factors and particularly in studies of human error and accident causation. The term "folk model" came from the philosopher Stich (1985), and they charged that such folk models as lack of situation awareness or automation-induced complacency were commonsense explanations of human performance and human error that failed to provide any useful explanation of how failure had arisen. Such concepts lacked substance and were often circular in terms of cause and effect: if error had occurred, it was because of lack of situation awareness, and if there was lack of situation awareness, that was the result of error.

Other critics have been willing to accept the term situation awareness but have admitted the difficulty in defining it. Thus, Sarter and Woods (1995) state: "a long tradition of research has not brought us much closer to being able to understand and support the phenomenon." They further concede that "it appears to be futile to try to determine the most important contents of situation awareness, because the significance and meaning of any data are dependent on the context in which they appear." One could perhaps question the utility of a concept that cannot be understood and which defies precise definition.

In this issue, Sidney Dekker returns to the fray. He associates the term situation awareness with a blame-the-individual culture in which the deficiencies in the system are ignored. Norman (1991) has called this approach "blame and train," and Dekker states that he does not wish to be associated with it.

Mica Endsley provides a spirited rejoinder, arguing that situation awareness is a valid, useful and distinct construct, backed by theory, that provides useful diagnosis of how to improve system design. She rejects the suggestion that the use of the construct implies blaming the human operator.

Patrick Millot is a "glass-half-full" man. He acknowledges that the concept has some imperfections, but argues that it should be augmented and improved rather than discarded. He also discusses the concept of collective situation awareness and associates it with a proposed framework on human–machine cooperation, where there is a dimension of "knowing how to cooperate." He proposes that such cooperation within teams can be supported by a common work space.

In their contribution, Paul Salmon, Guy Walker and Neville Stanton also focus on the theme of shared or distributed situation awareness (DSA). They argue that DSA should replace the more traditional focus on individual situation awareness—here they accept much of Dekker's argument—with a new focus on the operation of the system, looking at all the human operators involved as well as relevant system components. They argue that, with such a systems focus, there is great value in the situation awareness concept.

Finally, this issue also includes a piece by Sidney Dekker and James Nyce on the measurement of workload. This contribution was inadvertently omitted from the earlier special section on workload, which was framed around the commentary of de Winter (2014). We hope that readers will appreciate it in spite of the error of omission by the editors that has caused the delay.

O. Carsten (✉)
Institute for Transport Studies (ITS), University of Leeds, 34-40 University Road, Leeds LS2 9JT, UK
e-mail: O.M.J.Carsten@its.leeds.ac.uk

F. Vanderhaegen
LAMIH, Université de Valenciennes et du Hainaut-Cambrésis, Le Mont Houy, 59313 Valenciennes Cedex 9, France
e-mail: frederic.vanderhaegen@univ-valenciennes.fr

References

De Winter JCF (2014) Controversy in human factors constructs and the explosive use of the NASA TLX: a measurement perspective. Cogn Tech Work 16(3):289–297

Dekker S, Hollnagel E (2004) Human factors and folk models. Cogn Tech Work 6:79–86

Norman DA (1991) Cognitive science in the cockpit. CSERIAC Gatew 2(2):1–6

Sarter NB, Woods DD (1995) How in the world did we ever get into that mode? Mode error and awareness in supervisory control. Hum Factors 37(1):5–19

Stich S (1985) From folk psychology to cognitive science: a case against belief. MIT Press, Cambridge
