Do System Test Cases Grow Old?
Blekinge Institute of Technology, Faculty of Computing, Department of Software Engineering.
2014 (English). Conference paper, Published paper (Refereed)
Abstract [en]

Companies increasingly use either manual or automated system testing to ensure the quality of their software products. As a system evolves and is extended with new features, the test suite typically grows as new test cases are added. To ensure software quality throughout this process, the test suite is continuously executed, often on a daily basis. It seems likely that newly added tests would be more likely to fail than older tests, but this has not been investigated in any detail on large-scale, industrial software systems. It is also not clear which methods should be used to conduct such an analysis. This paper proposes three main concepts that can be used to investigate aging effects in the use and failure behavior of system test cases: test case activation curves, test case hazard curves, and test case half-life. To evaluate these concepts and the type of analysis they enable, we apply them to an industrial software system containing more than one million lines of code. The data sets come from a total of 1,620 system test cases executed more than half a million times over a period of two and a half years. For the investigated system we find that system test cases stay active as they age but really do grow old; they go through an infant mortality phase with higher failure rates, which then decline over time. The test case half-life is between 5 and 12 months for the two studied data sets.
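The hazard-curve and half-life concepts named in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the input shape (one `(age_months, failed)` tuple per test run) and the half-life proxy used here (median test-case age at the last recorded failure) are assumptions made purely for illustration.

```python
from collections import defaultdict
from statistics import median

def hazard_by_age(executions):
    """Empirical failure rate bucketed by test-case age in months.

    executions: list of (age_months, failed) tuples, one per test run.
    Returns {age_months: failures / runs} for each observed age bucket.
    """
    runs = defaultdict(int)
    fails = defaultdict(int)
    for age, failed in executions:
        runs[age] += 1
        if failed:
            fails[age] += 1
    return {age: fails[age] / runs[age] for age in sorted(runs)}

def half_life(last_failure_ages):
    """Illustrative proxy for test-case half-life: the median age
    (in months) at which test cases recorded their last failure."""
    return median(last_failure_ages)

# An "infant mortality" pattern would show up as a hazard that is
# highest in the earliest age buckets and declines with age.
curve = hazard_by_age([(0, True), (0, False), (1, False)])
print(curve)            # failure rate per age bucket
print(half_life([2, 12, 5]))
```

Plotting `curve` over many months of execution logs would give a test case hazard curve in the sense the abstract describes; the paper's actual definitions should be taken from the full text.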

Place, publisher, year, edition, pages
IEEE, 2014.
Keyword [en]
Software testing, System testing, Empirical study, Statistical analysis
National Category
Software Engineering
Identifiers
URN: urn:nbn:se:bth-6524
DOI: 10.1109/ICST.2014.47
ISI: 000355985000037
ISBN: 978-0-7695-5185-2 (print)
OAI: oai:DiVA.org:bth-6524
DiVA: diva2:834042
Conference
International Conference on Software Testing, Verification, and Validation (ICST), Cleveland
Available from: 2014-11-27 Created: 2014-11-26 Last updated: 2016-02-01 Bibliographically approved

Open Access in DiVA

fulltext (568 kB), 97 downloads
File information
File name: FULLTEXT01.pdf
File size: 568 kB
Checksum (SHA-512): 70997ccf7eb302fa08344028ddce555584ac19b79010ab0f4cc1c9957ec2e79a2425696a3f52ab63d3cac86ac510350ac5b27d327bd47078c5e432b3dbf7a262
Type: fulltext
Mimetype: application/pdf

Other links

Publisher's full text

By author/editor
Feldt, Robert
By organisation
Department of Software Engineering

