Date posted: 06-Aug-2015
Category: Software
Uploaded by: eurostar-software-testing-conference
Slide 1
Lessons learnt integrating test into the agile lifecycle
Fran O'Hara – Inspire Quality Services
© 2015 Inspire Quality Services
Slide 2

Roles:
• Product Owner
• ScrumMaster
• Development Team

Events:
• Sprint planning
• Sprint review/demo
• Sprint retrospective
• Daily scrum meeting
• (Backlog refinement/grooming)

Artifacts:
• Product backlog
• Sprint backlog
• Burndown Charts
• Definition of Done

Rules:
"Each component within the framework serves a specific purpose and is essential to Scrum's success and usage." – Scrum Guide @ Scrum.org
Slide 7

[Figure: a two-week sprint timeline showing the timeboxed Scrum events and which roles attend each]
• Sprint Planning: <=8h
• Backlog Refinement: <=10% of capacity
• Daily StandUp: 15m
• Sprint Review: <=4h
• Retrospective: <=3h

PO: Product Owner – SM: ScrumMaster – TM: Development Team – CU: Customer

Each event is timeboxed. The times shown are the maximums from the Scrum Guide at scrum.org, based on a one-month sprint. Each event is an opportunity to inspect and adapt.
Slide 8
Quality & Test

• Quality is not equal to test. Quality is achieved by putting development and testing into a blender and mixing them until one is indistinguishable from the other.
• Testing must be an unavoidable aspect of development, and the marriage of development and testing is where quality is achieved.

– from 'How Google Tests Software', James Whittaker et al.
Slide 9
Is testing fully integrated?

[Figure: three two-sprint patterns, labelled A, B and C]
• A: code for most of the sprint, then a separate "code & bug fix" and test phase at the end of each sprint – a mini-waterfall inside the sprint
• B: testing lags a sprint behind – the code written in Sprint 1 is tested, with bug fixing, during Sprint 2
• C: coding, bug fixing and testing run together throughout each sprint
Slide 10
Achieving Scenario C

Prerequisites:
• Test driven at 'acceptance' level (story level)
• Small stories – ½ to 6 person-days of effort as a guide
• Prioritised and implemented in sequence (e.g. 2-3 at a time), including test execution

[Figure: sprint backlog taskboard – note there is no 'verified' column!]
Slide 11
User Story Example – Hotel Reservation: Reservation Cancellation

As a user I want to cancel a reservation so that I avoid being charged full rate.

CONFIRMATION:
• Verify a premium member can cancel the same day without a fee
• Verify a non-premium member is charged 10% for same-day cancellation but otherwise not charged
• Verify an email confirmation is sent to the user with appropriate information
• Verify that the hotel is notified within 10 minutes of a cancellation

CONVERSATION:
• What if I am a premium member – do I have charges?
• When is a non-premium member charged and how much?
• How do these vary depending on when the cancellation occurs?
• Do we need to send the user confirmation by email?
• When does the hotel need to be notified?
• What if the user has paid a deposit?
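The confirmation items above can be turned directly into story-level acceptance tests. A minimal sketch in Python, assuming a hypothetical `cancellation_fee` function that implements the fee rules exactly as the story states (the function name, signature and rates used here are illustrative, not from the slides):

```python
# Hypothetical fee rules taken from the story's confirmation criteria:
# premium members cancel free; non-premium members pay 10% of the room
# rate for a same-day cancellation, and nothing otherwise.
def cancellation_fee(rate: float, premium: bool, same_day: bool) -> float:
    """Return the cancellation fee for a reservation."""
    if premium:
        return 0.0
    if same_day:
        return round(rate * 0.10, 2)
    return 0.0

# Story-level acceptance tests: one check per confirmation item.
def test_premium_same_day_is_free():
    assert cancellation_fee(200.0, premium=True, same_day=True) == 0.0

def test_non_premium_same_day_pays_10_percent():
    assert cancellation_fee(200.0, premium=False, same_day=True) == 20.0

def test_non_premium_advance_cancellation_is_free():
    assert cancellation_fee(200.0, premium=False, same_day=False) == 0.0

if __name__ == "__main__":
    test_premium_same_day_is_free()
    test_non_premium_same_day_pays_10_percent()
    test_non_premium_advance_cancellation_is_free()
    print("all acceptance checks passed")
```

Writing these checks before (or alongside) the code is what "test driven at 'acceptance' level" on the previous slide means in practice: each bullet in the confirmation list becomes an executable, binary pass/fail check.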
Slide 12
Release/feature planning level – a testing perspective

Add value in release (re-)planning by:
• Supporting the Product Owner in writing User Stories/Epics and making sure they are testable
• Participating in the high-level risk analysis of those User Stories/Epics
• Ensuring estimation includes the testing perspective
• Planning the testing for the release/feature level – that is, creating a test strategy/approach for it (resources, tools, test levels, static testing, test environments, test automation targets), based on the scope and risks identified for that release/feature and on an evolving product backlog
• Playing a key role in defining the Definition of Done of the release, and later of the iteration/Sprint

Adapted from the ISTQB Agile Tester Extension Syllabus
Slide 13
'Acceptance' Testing in Agile

An acceptance test is a formal description of the behaviour of a software product, generally expressed as an example or a usage scenario.
• In many cases the aim is that it should be possible to automate the execution of such tests by a software tool, either ad hoc to the development team or off the shelf.
• Similarly to a unit test, an acceptance test is generally understood to have a binary result, pass or fail.
• For many Agile teams acceptance tests are the main form of functional specification; sometimes the only formal expression of business requirements.

Also known as:
• The terms "functional test", "acceptance test" and "customer test" are used more or less interchangeably.
• A more specific term, "story test" (referring to user stories), is also used, as in the phrase "story test driven development".

(Agile Alliance)
Slide 14
'Acceptance' Testing – is it enough?

• May not be… a context/risk/strategy issue…
  – Expand to fuller 'system' tests:
    • Functional testing
    • Non-functional testing – performance, usability, etc.
  – May still need more user story interaction tests, epic/feature-level testing, workflows, end-to-end business-scenario-focused User Acceptance Testing, etc.
  – System integration testing issues
  – Etc.
• Strategy and scheduling issue
  – Risk-driven, adaptive
Slide 17
Is testing fully integrated?

[Figure: Sprints 1, 2, 3… each combining 'Code & Bug Fix' with 'Test', each producing a Potentially Releasable increment; the increments build towards an actual release (MMF). The initial backlog and release/test planning precede Sprint 1.]

• Per sprint, functional: unit, component integration, story acceptance, story interaction, exploratory, etc.
• Plus feature/system/system integration… and non-functional testing.
Slide 18
Definition of 'Done'

• An agreement between the PO and the Team, evolving over time to increase quality and 'doneness'
• Used to guide the team in estimating and doing
• 'Done' may apply to a Product Backlog Item (PBI) and to an Increment
• Used by the PO to increase predictability and to accept Done PBIs
• A single DoD may apply across an organisation or a product; multiple teams on a product share the DoD
Slide 19
DoD example

Story level:
• Unit tests passed
• Unit tests achieving 80% decision coverage
• Integration tests passed
• Acceptance tests passed, with traceability to story acceptance criteria
• Code and unit tests reviewed
• Static analysis has no important warnings
• Coding-standard compliant
• Published to Dev server

Sprint level:
• Reviewed and accepted by the PO
• End-to-end functional and feature tests passed
• All regression tests passing
• Exploratory testing completed
• Performance profiling/benchmarking complete
• Bugs committed in the sprint resolved
• Deployment/release docs updated and reviewed
• User manual updated

Release level:
• Released to Stage server
• Deployment tests passed
• Deployment/release docs delivered
• Large-scale integration performance/stress testing passed
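DoD items like "80% decision coverage" are only effective if they are checked automatically rather than by agreement alone. A minimal sketch of such a gate in Python, assuming hypothetical branch counts that in a real pipeline would come from the coverage tool's report (the numbers, function name and threshold handling here are illustrative):

```python
import sys

# Hypothetical DoD gate: fail the build when decision (branch) coverage
# drops below the threshold agreed in the Definition of Done.
def coverage_gate(covered_branches: int, total_branches: int,
                  threshold: float = 0.80) -> bool:
    """Return True when coverage meets the DoD threshold."""
    if total_branches == 0:
        return True  # nothing to cover; treat as passing
    return covered_branches / total_branches >= threshold

if __name__ == "__main__":
    # In CI these figures would be read from the coverage tool's output.
    covered, total = 164, 200  # illustrative: 82% decision coverage
    if not coverage_gate(covered, total):
        sys.exit("DoD gate failed: decision coverage below 80%")
    print("DoD gate passed: decision coverage meets 80%")
```

Wiring a check like this into the build keeps the DoD honest: a story cannot be called 'Done' while the gate is red, which supports the evolving, team-owned DoD described on the previous slide.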
Slide 20
Conclusions on lessons learnt

• Prevention as well as detection
• Avoiding the mini-waterfall
• Activity versus artefact
• Test competence in the team
• Role of (test) management