Usefulness of a human error identification tool for requirements inspection

An experience report

Vaibhav Anu, Gursimran Walia, Gary Bradshaw, Wenhua Hu, Jeffrey C. Carver

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Research › peer-review

Abstract

Context and Motivation: Our recent work leverages Cognitive Psychology research on human errors to improve the standard fault-based requirements inspections. Question: The empirical study presented in this paper investigates the effectiveness of a newly developed Human Error Abstraction Assist (HEAA) tool in helping inspectors identify human errors to guide the fault detection during the requirements inspection. Results: The results showed that the HEAA tool, though effective, presented challenges during the error abstraction process. Contribution: In this experience report, we present major challenges during the study execution and lessons learned for future replications.

Original language: English
Title of host publication: Requirements Engineering
Subtitle of host publication: Foundation for Software Quality - 23rd International Working Conference, REFSQ 2017, Proceedings
Editors: Anna Perini, Paul Grünbacher
Publisher: Springer Verlag
Pages: 370-377
Number of pages: 8
ISBN (Print): 9783319540443
DOI: 10.1007/978-3-319-54045-0_26
State: Published - 1 Jan 2017
Event: 23rd International Working Conference on Requirements Engineering – Foundation for Software Quality, REFSQ 2017 - Essen, Germany
Duration: 27 Feb 2017 – 2 Mar 2017

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10153 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 23rd International Working Conference on Requirements Engineering – Foundation for Software Quality, REFSQ 2017
Country: Germany
City: Essen
Period: 27/02/17 – 2/03/17

Cite this

Anu, V., Walia, G., Bradshaw, G., Hu, W., & Carver, J. C. (2017). Usefulness of a human error identification tool for requirements inspection: An experience report. In A. Perini, & P. Grünbacher (Eds.), Requirements Engineering: Foundation for Software Quality - 23rd International Working Conference, REFSQ 2017, Proceedings (pp. 370-377). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 10153 LNCS). Springer Verlag. https://doi.org/10.1007/978-3-319-54045-0_26
@inproceedings{631ac45a870a434885c169e8cf6965f5,
title = "Usefulness of a human error identification tool for requirements inspection: An experience report",
abstract = "Context and Motivation: Our recent work leverages Cognitive Psychology research on human errors to improve the standard fault-based requirements inspections. Question: The empirical study presented in this paper investigates the effectiveness of a newly developed Human Error Abstraction Assist (HEAA) tool in helping inspectors identify human errors to guide the fault detection during the requirements inspection. Results: The results showed that the HEAA tool, though effective, presented challenges during the error abstraction process. Contribution: In this experience report, we present major challenges during the study execution and lessons learned for future replications.",
author = "Vaibhav Anu and Gursimran Walia and Gary Bradshaw and Wenhua Hu and Carver, {Jeffrey C.}",
year = "2017",
month = "1",
day = "1",
doi = "10.1007/978-3-319-54045-0_26",
language = "English",
isbn = "9783319540443",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Verlag",
pages = "370--377",
editor = "Anna Perini and Paul Gr{\"u}nbacher",
booktitle = "Requirements Engineering: Foundation for Software Quality - 23rd International Working Conference, REFSQ 2017, Proceedings",

}

