
Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Description:
Paper may be downloaded at https://pure.fundp.ac.be/ws/files/7911785/VAMOS2014_FTS_statistical_prioritization.pdf
Transcript
Page 1: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)


Towards Statistical Prioritization for Software Product Lines Testing

Xavier Devroey*, Gilles Perrouin, Maxime Cordy, Pierre-Yves Schobbens, Axel Legay, Patrick Heymans

8th International Workshop on Variability Modelling of Software-intensive Systems, VaMoS ’14

Nice, France

Page 2: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Plan

• Introduction

• Background

– Featured Transition Systems

– Product-Based Test Derivation

• Family-Based Test Prioritization

• Feasibility Assessment (Claroline case study)

• Conclusion and Future Work

Page 3: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

TESTING

… in a Product Line Context

Page 4: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Testing Process

Specification

Page 5: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Testing Process

Specification

SUT

1. Implemented

Page 6: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Testing Process

Specification

SUT

{(pay, change, soda, serveSoda, open, take, close), (pay, change, tea, serveTea, open, take, close)}

Test-Cases

1. Implemented

2. Derived

Operationalization

3. Executed

4. Pass / Fail

Page 7: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Testing a Product Line

{(pay, change, soda, serveSoda, open, take, close), (pay, change, tea, serveTea, open, take, close)}

Pass Fail

{(pay, change, tea, serveTea, open, take, close), (pay, change, cancel, return)}

Pass Fail

{(free, tea, serveTea, take), (free, soda, serveSoda, take), (free, cancel, return)}

Pass Fail

{(free, soda, serveSoda, take)}

Page 8: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Testing a Product Line

{(pay, change, soda, serveSoda, open, take, close), (pay, change, tea, serveTea, open, take, close)}

Pass Fail

{(pay, change, tea, serveTea, open, take, close), (pay, change, cancel, return)}

Pass Fail

{(free, tea, serveTea, take), (free, soda, serveSoda, take), (free, cancel, return)}

Pass Fail

{(free, soda, serveSoda, take)}

Which product first?

Page 9: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

SPECIFYING A PRODUCT LINE

Featured Transition Systems (FTSs) [Classen et al. 2011]

Page 10: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Specifying a Product Line

Page 11: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Specifying a Product Line

Page 12: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Specifying a Product Line

Page 13: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Featured Transition System [Classen et al. 2011]

[Figure: Feature Diagram and Featured Transition System]
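
To make the formalism concrete, here is a minimal Python sketch of an FTS for the soda/tea vending-machine example used throughout the slides. The state numbering, the feature letters (f = FreeDrinks, s = Soda, t = Tea, c = CancelPurchase) and the encoding of feature expressions as Python predicates are illustrative assumptions, not the paper's exact formalization.

```python
# Minimal FTS sketch: each transition carries an action, a target state and a
# feature expression (here a predicate over the set of selected features).
# Feature letters and state numbering are assumptions for illustration.
FTS = {
    0: [("pay",    1, lambda sel: "f" not in sel),
        ("free",   2, lambda sel: "f" in sel)],
    1: [("change", 2, lambda sel: "f" not in sel)],
    2: [("cancel", 7, lambda sel: "c" in sel),
        ("soda",   3, lambda sel: "s" in sel),
        ("tea",    4, lambda sel: "t" in sel)],
    3: [("serveSoda", 5, lambda sel: True)],
    4: [("serveTea",  5, lambda sel: True)],
    5: [("open", 6, lambda sel: "f" not in sel),
        ("take", 0, lambda sel: "f" in sel)],
    6: [("take", 8, lambda sel: True)],
    8: [("close", 0, lambda sel: True)],
    7: [("return", 0, lambda sel: True)],
}

def project(fts, selection):
    """Keep only the transitions whose feature expression holds for `selection`,
    i.e. the plain transition system of one product."""
    return {state: [(a, tgt) for (a, tgt, fexp) in trans if fexp(selection)]
            for state, trans in fts.items()}

# Product with Soda, Tea and Cancel but without FreeDrinks:
print(project(FTS, {"s", "t", "c"})[0])   # -> [('pay', 1)]
```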

Page 14: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

TESTING LOTS OF SYSTEMS

Which product first?

Page 15: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Which product first?

• Particularly useful during regression testing

• Using weights on features [Henard et al. 2013, Johansen et al. 2012]

– Does not consider behaviour

• Using weights (i.e., probabilities) on transitions

Page 16: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Usage Model

• Statistical testing [Whittaker 1994]

• Discrete-Time Markov Chain (DTMC)

• Independent from the FTS → allows usage of existing tools

• Extraction method is agnostic of features → DTMC may be incomplete

• Allows invalid paths → DTMC + FTS detects inconsistencies
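
As a companion to the FTS sketch above, the usage model can be kept as a plain transition-probability table over the same actions. The probabilities below are made up for illustration; only the structure reflects the slide: a DTMC built independently of the FTS, whose rows each sum to 1 and which may contain paths no product can execute.

```python
# Usage model sketch: a DTMC mapping each state to (action, target, probability)
# triples; every row sums to 1. Probability values are illustrative assumptions.
DTMC = {
    0: [("pay", 1, 0.1), ("free", 2, 0.9)],
    1: [("change", 2, 1.0)],
    2: [("cancel", 7, 0.1), ("soda", 3, 0.45), ("tea", 4, 0.45)],
    3: [("serveSoda", 5, 1.0)],
    4: [("serveTea", 5, 1.0)],
    5: [("open", 6, 0.2), ("take", 0, 0.8)],
    6: [("take", 8, 1.0)],
    8: [("close", 0, 1.0)],
    7: [("return", 0, 1.0)],
}

def trace_probability(dtmc, trace, start=0):
    """Probability of observing `trace` (a sequence of actions) from `start`;
    0.0 if the DTMC cannot follow the trace."""
    prob, state = 1.0, start
    for action in trace:
        step = next(((tgt, p) for (a, tgt, p) in dtmc[state] if a == action), None)
        if step is None:
            return 0.0
        state, prob = step[0], prob * step[1]
    return prob

print(trace_probability(DTMC, ("pay", "change", "cancel", "return")))  # ~0.01
```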

Page 17: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Product-Based Test Derivation [Samih and Baudry 2012] Family-Based Test Prioritization

Page 18: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Product-Based Test Derivation [Samih and Baudry 2012] Family-Based Test Prioritization

1. Product Selection

Page 19: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Product-Based Test Derivation [Samih and Baudry 2012] Family-Based Test Prioritization

2. FTS Projection

1. Product Selection

Page 20: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Product-Based Test Derivation [Samih and Baudry 2012] Family-Based Test Prioritization

2. FTS Projection

1. Product Selection

3. DTMC Pruning

Page 21: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Product-Based Test Derivation [Samih and Baudry 2012] Family-Based Test Prioritization

2. FTS Projection

1. Product Selection

3. DTMC Pruning

Page 22: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Product-Based Test Derivation [Samih and Baudry 2012] Family-Based Test Prioritization

2. FTS Projection

1. Product Selection

3. DTMC Pruning

1. Trace Selection

DFS(lmax = 7; Pr min = 0; Pr max = 0.1; DTMC) = {(pay, change, cancel, return; Pr = 0.01); (free, cancel, return; Pr = 0.09); (pay, change, tea, serveTea, open, take, close; Pr = 0.009); (pay, change, tea, serveTea, take; Pr = 0.081); (free, tea, serveTea, open, take, close; Pr = 0.081)}
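
One possible reading of the DFS(lmax, Pr min, Pr max, DTMC) call above, reusing the DTMC sketch from the usage-model slide: a bounded depth-first search keeps the traces whose length stays below lmax and whose probability falls within the given bounds. Treating a trace as complete once it returns to the initial state (a full user session) is an assumption made so the sketch runs end to end; the paper's exact stopping condition may differ, and so do the numbers it produces.

```python
# Bounded DFS over the DTMC sketched earlier: collect traces of length <= lmax
# whose probability lies in [pr_min, pr_max]. A trace is reported when it comes
# back to the initial state (assumed to mark a complete user session).
def dfs_traces(dtmc, lmax, pr_min, pr_max, start=0):
    selected = []

    def explore(state, trace, prob):
        if prob < pr_min:              # probabilities only shrink along a path,
            return                     # so the whole branch can be pruned here
        if trace and state == start:   # back to the initial state: trace complete
            if prob <= pr_max:
                selected.append((tuple(trace), prob))
            return
        if len(trace) == lmax:
            return
        for action, target, p in dtmc[state]:
            explore(target, trace + [action], prob * p)

    explore(start, [], 1.0)
    return selected

for trace, pr in dfs_traces(DTMC, lmax=7, pr_min=0.0, pr_max=0.1):
    print(trace, round(pr, 4))
```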

Page 23: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Product-Based Test Derivation [Samih and Baudry 2012] Family-Based Test Prioritization

2. FTS Projection

1. Product Selection

3. DTMC Pruning

1. Trace Selection

2. Trace Filtering and FTS Pruning

DFS(lmax = 7; Pr min = 0; Pr max = 0.1; DTMC) = {(pay, change, cancel, return; Pr = 0.01); (free, cancel, return; Pr = 0.09); (pay, change, tea, serveTea, open, take, close; Pr = 0.009); (pay, change, tea, serveTea, take; Pr = 0.081); (free, tea, serveTea, open, take, close; Pr = 0.081)}
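
The trace filtering step can be sketched by replaying each selected DTMC trace on the FTS from the earlier sketch: a trace is kept only if at least one product can execute it. Representing the product line extensionally as an explicit list of feature selections (the hypothetical ALL_PRODUCTS below) is a simplification; the paper reasons symbolically over the feature diagram instead.

```python
# Trace filtering sketch: which products (feature selections) can replay a trace
# on the FTS defined earlier? A trace with no such product is discarded.
def products_executing(fts, trace, products, start=0):
    result = []
    for selection in products:
        state, ok = start, True
        for action in trace:
            step = next(((tgt, fexp) for (a, tgt, fexp) in fts[state]
                         if a == action and fexp(selection)), None)
            if step is None:
                ok = False
                break
            state = step[0]
        if ok:
            result.append(selection)
    return result

# Hypothetical set of valid products of the vending-machine feature diagram.
ALL_PRODUCTS = [frozenset(p) for p in
                ({"s"}, {"t"}, {"s", "t"}, {"s", "t", "c"},
                 {"f", "s"}, {"f", "t"}, {"f", "s", "t", "c"})]

trace = ("pay", "change", "cancel", "return")
print([sorted(p) for p in products_executing(FTS, trace, ALL_PRODUCTS)])
# -> [['c', 's', 't']]: the only non-free product offering Cancel in this list
```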

Page 24: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Product-Based Test Derivation [Samih and Baudry 2012] Family-Based Test Prioritization

2. FTS Projection

1. Product Selection

3. DTMC Pruning

1. Trace Selection

2. Trace Filtering and FTS Pruning

3. Product Prioritization

DFS(lmax = 7; Pr min = 0; Pr max = 0.1; DTMC) = {(pay, change, cancel, return; Pr = 0.01); (free, cancel, return; Pr = 0.09); (pay, change, tea, serveTea, open, take, close; Pr = 0.009); (pay, change, tea, serveTea, take; Pr = 0.081); (free, tea, serveTea, open, take, close; Pr = 0.081)}

(¬f ∧ t) ∧ …
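
The prioritization step itself can then be sketched by combining the previous pieces: each product accumulates the probability mass of the selected valid traces it can execute, and products are tested in decreasing order of that mass. This ranking rule is an interpretation of the slide for illustration, not a verbatim restatement of the paper's algorithm.

```python
from collections import defaultdict

def prioritize(fts, scored_traces, products):
    """Rank products by the total probability of the selected traces they can run.
    `scored_traces` is a list of (trace, probability) pairs, e.g. the DFS output."""
    score = defaultdict(float)
    for trace, pr in scored_traces:
        for product in products_executing(fts, trace, products):
            score[product] += pr
    return sorted(score.items(), key=lambda kv: kv[1], reverse=True)

# Reuse the sketches above: the products covering the most probable behaviour come first.
ranking = prioritize(FTS, dfs_traces(DTMC, lmax=7, pr_min=0.0, pr_max=0.1), ALL_PRODUCTS)
for product, total in ranking:
    print(sorted(product), round(total, 4))
```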

Page 25: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Product-Based Test Derivation [Samih and Baudry 2012] Family-Based Test Prioritization

2. FTS Projection

1. Product Selection

3. DTMC Pruning

1. Trace Selection

2. Trace Filtering and FTS Pruning

3. Product Prioritization

DFS(lmax = 7; Pr min = 0; Pr max = 0.1; DTMC) = {(pay, change, cancel, return; Pr = 0.01); (free, cancel, return; Pr = 0.09); (pay, change, tea, serveTea, open, take, close; Pr = 0.009); (pay, change, tea, serveTea, take; Pr = 0.081); (free, tea, serveTea, open, take, close; Pr = 0.081)}

Page 26: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

FAMILY-BASED PRODUCT PRIORITIZATION

Feasibility assessment

Page 27: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Case study: Claroline@UNamur (Webcampus)

• Open source online course management system (http://www.claroline.net/)

• ± 7000 users

• Upload/download documents, online exercises, forum, agenda, announcements, etc.

http://webcampus.fundp.ac.be

Page 28: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Claroline: DTMC

• Derived from an anonymized Apache access log (5.26 GB)
• From January 1st to October 1st, 2013
• 12,689,033 HTTP requests for PHP pages

• (1 PHP page → 1 state) + initial state
• 1 request → 1 transition
• User session = sequence of requests (timeout = 45 min)
• 2-gram without smoothing [Verwer et al. 2013]
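
A rough sketch of the 2-gram construction described above: requests are grouped into user sessions using a 45-minute inactivity timeout, page-to-page transitions are counted, and each row of counts is normalised into probabilities (no smoothing). The tuple format of the log entries and the "<init>" pseudo-state are assumptions for illustration; parsing the actual anonymised Apache access log is left out.

```python
from collections import defaultdict

SESSION_TIMEOUT = 45 * 60  # seconds of inactivity before a new session starts

def build_dtmc(requests):
    """Estimate a usage model from (user_id, timestamp, page) requests, sorted by time."""
    counts = defaultdict(lambda: defaultdict(int))
    last = {}  # user_id -> (timestamp, page) of the previous request
    for user, ts, page in requests:
        prev = last.get(user)
        if prev is None or ts - prev[0] > SESSION_TIMEOUT:
            counts["<init>"][page] += 1      # session starts from the initial state
        else:
            counts[prev[1]][page] += 1       # 2-gram: previous page -> current page
        last[user] = (ts, page)
    # Normalise each row of counts into transition probabilities (no smoothing).
    return {state: {page: n / sum(row.values()) for page, n in row.items()}
            for state, row in counts.items()}

log = [("u1", 0, "index.php"), ("u1", 30, "course.php"), ("u1", 60, "forum.php"),
       ("u2", 10, "index.php"), ("u2", 40, "document.php"),
       ("u1", 10000, "index.php")]          # gap > 45 min: a new session for u1
print(build_dtmc(log))
```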

Page 29: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Claroline: models (http://info.fundp.ac.be/~xde/fts-testing/)

• Usage Model (DTMC)
  – 96 states and 2149 transitions
  – 2 hours of computation (Ubuntu Linux, Intel Core i3, 3.10 GHz, 4 GB mem.)

• Feature Diagram (FD)
  – Built manually by inspecting a local Claroline instance
  – 44 features (lots of optional features)

• Featured Transition System (FTS)
  – Web crawler on the local instance to get the pages
  – (1 page → 1 state) + initial state
  – Every state accessible from anywhere
  – Transitions tagged with feature expressions based on knowledge of the system
  – 107 states and 11236 transitions

Page 30: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Claroline: Setup and Results

                         Run 1     Run 2     Run 3     Run 4
Lmax                     98        98        98        98
Pr min                   1E-4      1E-5      1E-6      1E-7
Pr max                   1         1         1         1
#DTMC traces             211       1389      9287      62112
#Valid traces            211       1389      9287      62112
Traces avg. size         4.82      5.51      6.35      7.17
Traces avg. proba.       2.06E-3   3.36E-4   5.26E-5   8.10E-6
#Pruned FTS states       16        36        50        69
#Pruned FTS transitions  66        224       442       844

Page 31: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Claroline: Discussion

• Observations:
  – Even with a "simple" algorithm, computation time is reasonable
  – Independence of the features and small size of the valid traces
    • The number of products associated with each trace is too large

• Generate longer traces by coupling the probabilistic approach with state/transition coverage criteria

• Select the minimal feature set needed to execute a trace
  – Use knowledge of the application domain
  – Select features according to their frequency in the feature expressions of valid traces

Page 32: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Claroline: Discussion

• Multiple usage models: one per role (i.e., student, teacher, admin, visitor)

• Use other selection criteria on the usage model
  – Least/most probable traces

• Main threat: the Web nature of the application considered

Page 33: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

CONCLUSION…

… and Future Work

Page 34: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

Conclusion

• Contribution:
  – A first approach prioritizing behaviours statistically for testing SPLs in a family-based manner

• Future work:
  – Improve the exploration algorithm to support other "statistical selection" criteria on the usage model
    • Least/most probable behaviours
  – Combine structural selection criteria with statistical testing in an SPL context
    • State coverage, transition coverage, transition-pair coverage, path coverage, etc.

Page 35: Towards Statistical Prioritization for Software Product Lines Testing (VaMos '14)

THANK YOU FOR YOUR ATTENTION!

Models and tools available on http://info.fundp.ac.be/~xde/fts-testing/

E-mail: [email protected]

