Human and organizational biases affecting the management of safety

Teemu Reiman (Corresponding Author), Carl Rollenhagen

Research output: Contribution to journal › Article › Scientific › peer-review

37 Citations (Scopus)

Abstract

Management of safety is always based on underlying models or theories of organization, human behavior and system safety. The aim of this article is to review and describe a set of potential biases in these models and theories. We outline human and organizational biases that affect the management of safety in four thematic areas: beliefs about human behavior, beliefs about organizations, beliefs about information, and safety models. At worst, biases in these areas can lead to an approach in which people are treated as isolated and independent actors who make (bad) decisions in a social vacuum and who pose a threat to safety. Such an approach aims at building barriers and constraints to human behavior and neglects measures that would provide the prerequisites and organizational conditions for people to work effectively. This reductionist view of safety management can also lead to too strong a separation of so-called human factors from technical issues, undermining the holistic view of system safety. Human behavior needs to be understood in the context of people attempting (together) to make sense of themselves and their environment, acting on perpetually incomplete information while relying on social conventions, the affordances provided by the environment and the available cognitive heuristics. In addition, a move toward a positive view of the human contribution to safety is needed. Systemic safety management requires an increased understanding of various normal organizational phenomena, discussed in this paper from the point of view of biases, coupled with a systemic safety culture that encourages and endorses a holistic view of the workings and challenges of the socio-technical system in question.
Original language: English
Pages (from-to): 1263-1274
Journal: Reliability Engineering and System Safety
Volume: 96
Issue number: 10
ISSN: 0951-8320
DOIs: https://doi.org/10.1016/j.ress.2011.05.010
Publication status: Published - 2011
MoE publication type: A1 Journal article-refereed

Keywords

  • Safety management
  • bias
  • safety science
  • organizational factors
  • human factors

Cite this

Reiman, Teemu; Rollenhagen, Carl. Human and organizational biases affecting the management of safety. In: Reliability Engineering and System Safety, 2011, Vol. 96, No. 10, pp. 1263-1274.