How to Optimize Automated Testing with Everyone's Favorite Butler


Viktor Clerc, Product Manager & Jenkins Fan, XebiaLabs


Agenda

• The World of Testing is Changing
• Testing = Automation
• Test Automation and CD: Execution and Analysis
• Focus on the Basics
• Best Practices for Test Execution using Jenkins
• Supporting Test Analysis


But first…a bit about me

• Product Manager for XL TestView at XebiaLabs
• Traversed all phases of the software development lifecycle
• Supported major organizations in setting up a test strategy and test automation strategy
• Eager to flip the way (most) organizations do testing


…and about XebiaLabs

• We build tools to solve problems around DevOps and Continuous Delivery at scale


The World of Testing is Changing


Introducing Test Automation For Real

[Diagram: the classic phases SPECIFY → DESIGN → BUILD → TEST → INTEGRATE → REGRESSION → USER ACCEPTANCE → RELEASE. With acceptance-driven testing (Development = Test, Test = Development), ALL test effort up to and including User Acceptance is automated, shrinking the INTEGRATE, REGRESSION and USER ACCEPTANCE phases.]


Testing = Automation


Testing = Automation: Implications

• Developers are becoming testers
  – Maintain test code as source code
• Need to set up on-demand pipelines and environments
• Infrastructure as code (see the sketch below)
  – X-browser tests, Selenium grids, dedicated performance environments, mobile, etc.
• Hosted services
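As an illustration of "infrastructure as code" for testing, here is a minimal sketch of a Jenkins scripted Pipeline that stands up a throw-away browser environment per run. It assumes the Docker Pipeline plugin and a node labeled "docker"; the run-ui-tests.sh script and its arguments are hypothetical.

```groovy
// A hedged sketch: throw-away Selenium environment as code
// (assumes the Docker Pipeline plugin; node label and script are hypothetical)
node('docker') {
    checkout scm
    // selenium/standalone-chrome is a public image; -p exposes the WebDriver port
    docker.image('selenium/standalone-chrome:latest').withRun('-p 4444:4444') { c ->
        // hypothetical wrapper that points the tests at the fresh browser node
        sh './run-ui-tests.sh http://localhost:4444/wd/hub chrome'
    }
    junit 'target/surefire-reports/*.xml'   // publish results to Jenkins
}
```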


Testing = Automation: Challenges

• Many test tools for each of the test levels, but no single place to answer “Good enough to go live?”

• Requirements coverage is not available
  – “Did we test enough?”
  – Minimize the mean time to repair
  – Support for failure analysis

JUnit, FitNesse, JMeter, YSlow, Vanity Check, WireShark, SOAP-UI, Jasmine, Karma, Speedtrace, Selenium, WebScarab, TTA, DynaTrace, HP Diagnostics, ALM stack, AppDynamics, Code Tester for Oracle, Arachnid, Fortify, Sonar, …


Testing = Automation: Challenges

• Thousands of tests make test sets hard to manage:
  – “Where is my subset?”
  – “What tests add most value, what tests are superfluous?”
  – “When to run what tests?”

• Running all tests all the time takes too long, and feedback arrives too late

• Quality control of the tests themselves and maintenance of testware


Testing = Automation: Challenges

• Tooling overstretch
• Poor butler!


Test Automation and CD: Execution and Analysis


The Two Faces of CD

• A lot of focus right now is on pipeline execution
• …but there’s no point delivering at light speed if everything starts breaking
• Testing (= quality/risk) needs to be a first-class citizen of your CD initiative!

The Two Faces of CD

• CD = Execution + Analysis
• = Speed + Quality
• = Pipeline orchestration + ..?

Focus on the Basics


Quick Review

1. Cohn’s pyramid
   – Unit tests
   – Service tests (under the GUI)
   – (Graphical) user interface tests

2. And even further downstream
   – Integration tests
   – Performance tests


“Modern Testing” 101

1. Testers are developers

2. Test code equals production code
   – Conway’s Law
   – Measure quality

3. Linking tests to use cases

4. Slice and dice
   – Labeling

5. Radical parallelization

Fail FASTer! “Kill the nightlies”


Dealing With Growing Tests

• Conway’s Law for test code
  – Let the test code mimic the production code
  – Organize tests under the project/system under test
• Suite.App.UseCase.TestCase
• Cut the suite at UseCase: now you have independent chunks which you can run massively in parallel (sketch below)
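A minimal sketch of that cut, using the scripted Pipeline `parallel` step. The suite names echo the Suite.App.UseCase convention above; run-suite.sh is a hypothetical wrapper around the test tool.

```groovy
// One parallel branch per independent use-case chunk (names illustrative)
def useCases = ['WebshopSuite.BusinessAccountSuite.UseCase1500',
                'WebshopSuite.BusinessAccountSuite.UseCase1501',
                'WebshopSuite.ConsumerSuite.UseCase2000']
def branches = [:]
useCases.each { uc ->
    branches[uc] = {
        node {                                // each chunk gets its own executor
            checkout scm
            sh "./run-suite.sh ${uc}"         // hypothetical wrapper script
            junit "results/${uc}/*.xml"       // publish per-chunk results
        }
    }
}
parallel branches
```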


Dealing With Growing Tests

• Tests should not depend on other tests (example below)
  – Setup and tear-down of test data done within each test
  – Share test components (as you would with ‘real’ production code)
  – Trade-off between:
    • No code duplication, but somewhat more complex fixtures
    • Easy-to-grab simple fixtures, but a lot of them (and duplication)
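A small sketch of such a self-contained test, written in Groovy with JUnit 4. The repository class is a toy in-memory stand-in so the example runs on its own; in practice it would be a shared test component.

```groovy
import org.junit.After
import org.junit.Before
import org.junit.Test

// toy in-memory stand-in for a shared test component
class AccountRepository {
    Map accounts = [:]
    Map createAccount(String name) { accounts[name] = [name: name, business: false] }
    void deleteAccount(Map acc) { accounts.remove(acc.name) }
}

class BusinessAccountTest {
    AccountRepository repo = new AccountRepository()
    Map account

    @Before
    void setUp() {
        // every test creates its own data: no dependence on other tests
        account = repo.createAccount('acme-test')
    }

    @After
    void tearDown() {
        // ...and cleans it up again, so tests can run in any order or in parallel
        repo.deleteAccount(account)
    }

    @Test
    void canUpgradeToBusinessAccount() {
        account.business = true
        assert account.business
    }
}
```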


Keep It Manageable

• Focus on functional coverage, not technical coverage
• Say 40 user stories, 400 tests:
  – Do I have relatively more tests for the more important user stories?
  – How do I link tests to user stories/features/fixes?
• Metrics (collection sketch below):
  – Number of tests
  – Number of tests that have not passed in <time>
  – Flaky tests
  – Duration
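One hedged way to gather such numbers is Jenkins’ own JSON API for aggregated test results. This standalone Groovy sketch assumes an existing job with published results; the host and job name are placeholders.

```groovy
import groovy.json.JsonSlurper

// Jenkins exposes aggregated test results at .../testReport/api/json
def url = 'http://jenkins.example.com/job/webshop-tests/lastCompletedBuild/testReport/api/json'
def report = new JsonSlurper().parse(new URL(url))

// passCount/failCount/skipCount/duration are standard fields of this endpoint
def total = report.passCount + report.failCount + report.skipCount
println "Total tests   : ${total}"
println "Failing       : ${report.failCount}"
println "Skipped       : ${report.skipCount}"
println "Duration (sec): ${report.duration}"
```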


Slice and Dice

• Use appropriate labels in your test code (selection sketch below)
  – Responsible team
  – Topic
  – Functional area
  – Flaky
  – Known issue
  – etc.
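A hedged sketch of acting on such labels at execution time. The tag names are illustrative; `cucumber.filter.tags` and Surefire’s `groups` property are the standard selection switches for Cucumber tags and JUnit 5 @Tag annotations respectively.

```groovy
// Run only the tests carrying a given label (tag names illustrative)
node {
    checkout scm
    // Cucumber: labels are scenario tags
    sh 'mvn -B test -Dcucumber.filter.tags=@smoke'
    // JUnit 5: labels are @Tag annotations, selected via Surefire's groups property
    sh 'mvn -B test -Dgroups=flaky'
}
```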


Best Practices for Test Execution in Jenkins


Jenkins Testing Basics

• Tilt the pyramid …

• … and use this as the guiding principle to set up your Jenkins test jobs “left to right”
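Read “left to right” as: the cheapest, fastest feedback first. A minimal scripted Pipeline sketch of that ordering; the shell scripts are hypothetical stand-ins for the real tool invocations.

```groovy
// Stages ordered along the (tilted) test pyramid: fast feedback first
node {
    checkout scm
    stage('Unit tests')    { sh './gradlew test' }              // broad, fast base
    stage('Service tests') { sh './run-service-tests.sh' }      // under the GUI
    stage('UI tests')      { sh './run-ui-tests.sh' }           // few, slower
    stage('Integration')   { sh './run-integration-tests.sh' }
    stage('Performance')   { sh './run-perf-tests.sh' }
}
```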


Organizing Test Jobs in Jenkins

1. Create unique artifacts and fingerprints to monitor what you are pushing across your pipeline (see the sketch after this list)

2. Treat different platforms (e.g. browsers) as different tests, handled by different jobs

3. Well-known plugins:
   – Multi-job
   – Copy Artifact
   – Workflow
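A hedged sketch of item 1, combining the archive/fingerprint step with the Copy Artifact plugin so a downstream job can prove exactly which build it tested; artifact paths, job names and the test script are placeholders.

```groovy
// Build job: archive the artifact and record its fingerprint (a checksum
// Jenkins uses to trace the artifact across jobs)
node {
    checkout scm
    sh 'mvn -B package'
    archiveArtifacts artifacts: 'target/*.war', fingerprint: true
}

// Downstream test job: fetch exactly that artifact and fingerprint it again
node {
    copyArtifacts projectName: 'webshop-build', selector: lastSuccessful()
    fingerprint 'target/*.war'
    sh './run-tests.sh target/webshop.war'   // hypothetical test script
}
```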


Organizing Test Jobs in Jenkins

4. Keep Jenkins jobs sane and simple
   – Ergo: execute shell scripts from your Jenkins jobs (sketch after this list)

5. Shell scripts are parameterized

6. Parameters are fed to the individual test tools
   – FitNesse labels, Cucumber labels, etc.

7. Shell scripts are placed under version control
   – Managed by the team like any other source code
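Items 4–7 combined in one minimal sketch: the Jenkins job is a thin wrapper, the script lives in version control next to the tests, and the parameters travel down to the tool. TEST_LABEL and TARGET_ENV are assumed to be defined as build parameters; the script path is hypothetical.

```groovy
// The job only wires build parameters into a version-controlled script
node {
    checkout scm   // brings ci/run-tests.sh along with the test code
    // hypothetical parameters: which label to run, against which environment
    sh "./ci/run-tests.sh --label ${params.TEST_LABEL} --env ${params.TARGET_ENV}"
    junit 'results/*.xml'
}
```

The script itself maps --label onto the tool-specific switch (a Cucumber tag, a FitNesse suiteFilter, and so on), keeping the Jenkins job free of tool details.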


Example Job Distribution

[Diagram: a chain of Jenkins jobs, Build → Deploy → Int. Tests → Test → Test → Test → Perf. Tests, with each test job qualifying its own results.]

Beware of scattered result qualification


Distributing Tests Across Jobs

• Radical parallelization using cheap and cheerful throw-away environments
  – Especially when environments (e.g. containers) lie at your fingertips (sketch below)
• Jobs should not depend on other jobs
• Test jobs are your “eyes and ears” – optimize for them!
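A hedged sketch of such a throw-away environment, assuming the Docker Pipeline plugin; the image and build command are illustrative.

```groovy
// Each run gets a fresh container and leaves no state behind
node('docker') {
    checkout scm
    docker.image('maven:3-eclipse-temurin-17').inside {
        sh 'mvn -B test'       // the environment exists only for this block...
    }                          // ...and is discarded when the block ends
    junit 'target/surefire-reports/*.xml'
}
```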


Example Job Distribution

[Diagram: Build → Deploy, then Int. Tests, three Test jobs and Perf. Tests fanned out in parallel, all feeding into a single “?” box: where does the combined verdict come from?]


Challenge: Scattered Results


Supporting Test Analysis


Making Sense of Test Results

• Real go/no-go decisions are non-trivial (policy sketch below); candidate criteria:
  – No failing tests
  – At most 5% failing tests
  – No regression (tests that currently fail but passed previously)
  – A list of tests-that-should-not-fail
• Need historical context
• One integrated view
• Data to guide improvement
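Such a policy can itself be code. A plain-Groovy sketch with placeholder thresholds; the rule set simply mirrors the bullets above.

```groovy
// Hypothetical go/no-go policy over a map of testName -> passed?
def goNoGo(Map results, List previouslyPassing, List mustNotFail) {
    def failed = results.findAll { name, passed -> !passed }.keySet()

    if (failed.intersect(mustNotFail))
        return 'NO GO: a test-that-should-not-fail failed'
    if (failed.intersect(previouslyPassing))
        return 'NO GO: regression (passed previously, fails now)'
    if (failed.size() / results.size() > 0.05)
        return 'NO GO: more than 5% of tests failing'
    return 'GO'
}

// toy usage
def results = [login: true, checkout: false, search: true, pay: true]
println goNoGo(results, ['login', 'search'], ['pay'])
```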


Making Sense of Test Results

Executing tests from Jenkins is great, but…
• Different testing jobs each bring their own set of Jenkins plugins
• A historic view is only available per job, not across jobs
• Pass/Unstable/Fail is too coarse
  – How to express “Passed, but with known failures”?


Making Sense of Test Results

• The ultimate analysis question (“are we good to go live?”) is difficult to answer

• No obvious solution for now, unless all your tests are running through one service


Example Case Study


FitNesse Implementation

• Started with one project containing all tests
  – Sharing knowledge
• Structured the same as our use cases, i.e. WebshopSuite.BusinessAccountSuite.UseCase1500
• Nightly runs from the beginning
  – Indicated by labels (“nightly”)
• First sequential, per application (WebshopSuite)
• Later parallel, split by functional area (WebshopSuite.BusinessAccountSuite.*; invocation sketch below)
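A hedged sketch of how such runs can be wired up: FitNesse’s suiteFilter selects the pages tagged “nightly”, and each functional area becomes a parallel branch. Paths and suite names are placeholders.

```groovy
// Nightly FitNesse run: selected by label, split by functional area
def suites = ['WebshopSuite.BusinessAccountSuite', 'WebshopSuite.ConsumerSuite']
def branches = [:]
suites.each { s ->
    branches[s] = {
        node {
            checkout scm
            sh 'mkdir -p results'
            // -c runs one suite headlessly; suiteFilter picks pages tagged "nightly"
            sh "java -jar fitnesse-standalone.jar -c '${s}?suite&suiteFilter=nightly&format=xml' > results/${s}.xml"
        }
    }
}
parallel branches
```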


Example Pipeline

[Diagram: pipeline steps with their tools and environments: Check-in (code review; git server) → Build (unit tests, build EAR, deploy, smoke test; Jenkins server and dedicated team server) → Source Code Quality Test (Jenkins server and Sonar server) → System Test (dedicated team server) → Deploy to Chain / Chain Test (chain environments {1-5}, end-to-end testing, smoke test) → Security Test → Production Acceptance Test.]


Test Analysis: Homebrew


Test Analysis: Custom Reporting


Summary

• Testing = Automation
  – Testers are developers
• Structure and annotate tests
  – Conway’s Law for tests
  – Link to functions/features/use cases
• Radical parallelization
  – Throwaway environments


Summary

• Keep Jenkins jobs simple
• Keep Jenkins jobs independent
• Track the SUT with fingerprints
• Invoke test tools via plugins or version-controlled scripts
• Parameterization!
• Parallelize & optimize


Summary

• CD = Speed + Quality = Execution + Analysis
• Making sense of scattered test results is still a challenge
• We need to figure out how to address real-world go/no-go decisions


What’s Next?

• Visit http://tiny.cc/webinar-xebialabs for a webinar by CloudBees and XebiaLabs demonstrating the key value of CD and go-live decisions

• Read more on the testing challenges in CD
  – http://tiny.cc/ta-and-cd

• Try XebiaLabs’ XL TestView solution to bring quality into the heart of your CD initiative
  – http://tiny.cc/xl-testview


Please Share Your Feedback

• Did you find this session valuable?
• Please share your thoughts in the Jenkins User Conference Mobile App.
• Find the session in the app and click on the feedback area.