Issues and Opportunities for Human Error-Based Requirements Inspections: An Exploratory Study

Vaibhav Anu, Gursimran Walia, Wenhua Hu, Jeffrey C. Carver, Gary Bradshaw

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

[Background] Software inspections are extensively used for requirements verification. Our research uses the perspective of human cognitive failures (i.e., human errors) to improve the fault detection effectiveness of traditional fault-checklist-based inspections. Our previous evaluations of a formal human error-based inspection technique called Error Abstraction and Inspection (EAI) have shown encouraging results, but have also highlighted a real need for improvement. [Aims and Method] The goal of the controlled study presented in this paper was to identify the specific tasks of EAI that inspectors find most difficult to perform and the strategies that successful inspectors use when performing those tasks. [Results] The results highlighted specific pain points of EAI that can be addressed by improving the training and instrumentation.

Original language: English
Title of host publication: Proceedings - 11th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, ESEM 2017
Publisher: IEEE Computer Society
Pages: 460-465
Number of pages: 6
ISBN (Electronic): 9781509040391
DOI: 10.1109/ESEM.2017.62
State: Published - 7 Dec 2017
Event: 11th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, ESEM 2017 - Toronto, Canada
Duration: 9 Nov 2017 - 10 Nov 2017

Publication series

Name: International Symposium on Empirical Software Engineering and Measurement
Volume: 2017-November
ISSN (Print): 1949-3770
ISSN (Electronic): 1949-3789

Conference

Conference: 11th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, ESEM 2017
Country: Canada
City: Toronto
Period: 9/11/17 - 10/11/17

Keywords

  • human error
  • inspection
  • software requirements

Cite this

Anu, V., Walia, G., Hu, W., Carver, J. C., & Bradshaw, G. (2017). Issues and Opportunities for Human Error-Based Requirements Inspections: An Exploratory Study. In Proceedings - 11th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, ESEM 2017 (pp. 460-465). (International Symposium on Empirical Software Engineering and Measurement; Vol. 2017-November). IEEE Computer Society. https://doi.org/10.1109/ESEM.2017.62