Page 1

Value Add: Single User Performance Testing

http://managingperformancetesting.blogspot.co.uk/

Page 2

Agenda

• Context

• Challenges

• Single User Performance Testing

• “Shift Left”

• High Value “Just in Time” Reporting

• Network Analysis

• Are we good enough?

• Benefits

• Lessons Learned

Page 3

Context

• Well Known International Company with a strong reputation in a “niche” business.

• Multi Year Program: ERP and CRM Modernisation, Application Consolidation, New Application Development and Data Centre Consolidation; managed by one of the top 5 Consultancy Firms.

• We were engaged to deliver the Performance, Security and Accessibility Testing.

• The budget was fixed before we were engaged. New Software/Licenses were not to be purchased; short-term leases would be possible if essential for the program.

• The functionality for the ERP and CRM Modernisation was delivered in functional batches, with each batch dependent on the functionality of the previous one.

• The degree of data dependency between business processes and integrated systems was high.

• The existing ERP and CRM systems had a bad reputation with regard to Performance. The level of dissatisfaction varied throughout the world.

• The remaining illustrations relate to the ERP implementation only.

Page 4

Challenges

• “Back Loaded” Schedule (see the timeline below):

• Test Environments:

  • The fully integrated pre-production test environment was scheduled to be available only 2 months before go-live, leaving little time to fix any performance issues found.

  • All other integrated test environments were on a shared platform with only one server per layer of the architecture. “Traditional” Performance Testing would be of limited value in such an environment.

  • Test Environments were generally delivered late to the Testing Teams and in “bare metal” condition. It would take an average of one week for an Integrated Test Environment to be ready for Test Execution.

• Releases:

  • The Timeline for Releases, and eventually the Go-Live, was based on a tight schedule and an assumption about the amount of rework (defects) required. From early on, the timelines were not met and the amount of rework on functionality was higher than expected.

  • This resulted in more parallel testing activities than originally planned, which put even more strain on the test environment resources.

[Timeline chart: Test Levels ST, SIT, UAT, E2E and PT plotted across Jan–Aug, with PT concentrated at the end of the schedule]

Page 5

Single User Performance Testing

Manual SUPT

◦ Increase the performance test coverage.

Automated SUPT

◦ Assess the difference in the relative Business Process performance.

◦ More realistic data volumes than Manual SUPT.

◦ The same HP LoadRunner Test Scripts were used for both SUPT and MUPT.

◦ Load Generators were deployed by the local application support teams on users’ PCs in all major locations around the world (~14).

◦ Includes Network Measurements (tracert, ping); a sketch follows below.

◦ Multiple Iterations against multiple Environments.

◦ Employs Data Visualisation tools to help interpret the results quickly.

Research has shown that the average person perceives page load time as being about 15% slower than the actual page load time; when recounting their experience to others, they will recall that the page was 35% slower than it actually was (strangeloop.com). For example, a page that actually loads in 10 seconds is perceived as taking about 11.5 seconds and is later recalled as taking about 13.5 seconds.
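The deck does not show how the tracert/ping measurements were captured on the Load Generator PCs. Purely as an illustration of the idea, here is a minimal Python sketch; the hostnames, port and CSV layout are invented for the example, and a TCP handshake timing stands in for ping because parsing ping output is platform-specific:

```python
# Hypothetical measurement script for a load generator PC.
# Hostnames and output format are assumptions, not the project's setup.
import csv
import socket
import subprocess
import time
from datetime import datetime, timezone

TARGETS = ["erp.example.com", "crm.example.com"]  # invented hosts

def tcp_connect_time_ms(host: str, port: int = 443, timeout: float = 5.0) -> float:
    """Round-trip estimate via the TCP handshake (a ping stand-in)."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=timeout):
        pass
    return (time.perf_counter() - start) * 1000.0

def traceroute_raw(host: str) -> str:
    """Keep raw route output for later network analysis."""
    # "tracert" on the Windows user PCs; "traceroute" on Unix.
    return subprocess.run(["tracert", host], capture_output=True,
                          text=True, timeout=120).stdout

if __name__ == "__main__":
    location = socket.gethostname()  # stand-in for the site identifier
    with open("network_measurements.csv", "a", newline="") as f:
        writer = csv.writer(f)
        for host in TARGETS:
            writer.writerow([datetime.now(timezone.utc).isoformat(), location,
                             host, round(tcp_connect_time_ms(host), 1)])
            print(traceroute_raw(host))
```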

Page 6

“Shift Left”

[Timeline chart: the same Jan–Aug schedule with a SUPT row added, starting well before PT (“shifted left”)]

◦ Early View on Location Specific Performance, leaving enough time to address infrastructure related issues

◦ Early View on Data Volume Related Performance issues (Business Process)

◦ Reduced Dependency on the Integrated Pre-Production Environment

◦ Performance Test Script and Data Script development as new functionality becomes available, reducing the risk of MUPT being delayed

◦ Performance Testing in Pre-Production can concentrate on concurrency

Page 7

High Value “Just in Time” Reporting

• One of the challenges for the SUPT was the reporting: we had response time data for hundreds of measurements from various locations around the world.

• To speed up the Analysis, Interpretation and Communication of the test results, we used freely available data visualisation tools originating from Genomic Data Visualisation and Social Network Analysis.

• Turnaround of Test Results was reduced to within hours of the test execution, with an emphasis on “Face-To-Face” presentations rather than lengthy written reports.

• Providing condensed graphical representations of the results supported the understanding of the System Performance both inside the Performance Test team and in the wider Project Team.
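The deck names only the origin of the visualisation tools (Genomic Data Visualisation and Social Network Analysis), not the tools themselves. As a stand-in for the idea, here is a heat map of median response time per transaction and location, similar to the views on the following pages, sketched with pandas and matplotlib; the input file and column names are assumptions:

```python
# Hypothetical "just in time" report: heat map of SUPT response times.
import pandas as pd
import matplotlib.pyplot as plt

# Assumed columns: location, transaction, response_time_s
df = pd.read_csv("supt_results.csv")
pivot = df.pivot_table(index="transaction", columns="location",
                       values="response_time_s", aggfunc="median")

fig, ax = plt.subplots(figsize=(10, 8))
im = ax.imshow(pivot.values, cmap="RdYlGn_r", aspect="auto")  # red = slow
ax.set_xticks(range(len(pivot.columns)), pivot.columns, rotation=90)
ax.set_yticks(range(len(pivot.index)), pivot.index)
fig.colorbar(im, label="median response time (s)")
fig.tight_layout()
fig.savefig("supt_heatmap.png", dpi=150)  # ready within hours of execution
```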

Page 8

High Value Reporting in 10 Minutes

[Heat map: response times plotted by Location and Transaction]

Page 9

[Heat map: response times plotted by Location and Transaction]

A9 is the worst performing location. “fa”, “ea”, “dz” and “da” are the worst performing transactions; the best performing locations are “A1”, “A2” and “A8”. The “A1” location is the data centre.

Page 11

[Network diagrams: Entire WAN | Zoom Into Problem Area]

Page 12

Are we good enough?

Business Process Transaction Name A2 A3 A4 A5 A6 A7 A8 A9 A10 A11 A12 A13 A14 A15 A16 A17 A1 SLA (response times in seconds; A1 is the data centre)

BP01 BP01_001 2.739 6.292 6.86 6.365 7.271 8.316 7.9 5.168 5.737 4.134 4.277 11.807 6.059 5.385 4.904 5.38 2.924 10

BP02 BP01_002 0.428 0.788 0.912 0.969 1.253 1.022 1.297 0.727 0.78 0.749 0.715 1.619 0.83 0.882 0.777 0.787 0.432 1

BP03 BP01_003 0.063 0.43 0.537 0.546 0.61 0.761 0.619 0.326 0.455 0.259 0.393 0.848 0.422 0.288 0.298 0.41 0.103 1

BP04 BP01_004 1.39 1.546 1.536 1.482 1.566 1.612 1.983 1.586 1.792 1.627 1.846 1.828 1.534 1.637 1.727 1.51 1.319 4

BP05 BP01_005 0.16 1.009 1.313 1.061 1.327 1.43 1.438 1.049 1.122 1.52 1.352 2.351 0.92 1.619 1.498 0.89 0.288 1

BP06 BP01_006 0.314 0.681 0.822 0.546 0.868 0.779 0.996 0.642 0.666 1.013 1.009 1.368 0.681 1.005 0.831 0.7 0.308 1

BP07 BP01_007 0.318 0.686 0.872 0.801 0.766 0.769 0.985 0.592 0.696 0.806 0.653 1.093 0.656 0.728 0.708 0.689 0.356 4

BP08 BP01_008 0.308 1.122 1.255 1.201 1.483 1.571 1.522 1.289 0.985 1.726 1.702 2.345 1.073 1.821 1.837 1.03 0.304 1

BP09 BP01_009 0.288 0.764 0.757 0.796 1.135 1.017 0.982 0.648 0.66 0.439 0.504 1.488 0.68 0.956 1.05 0.691 0.347 5

BP01 Total 6.0 13.3 14.9 13.8 16.3 17.3 17.7 12.0 12.9 12.3 12.5 24.7 12.9 14.3 13.6 12.1 6.4 28.0

BP02 Total 9.5 19.1 22.9 24.1 25.6 25.5 26.8 17.9 20.0 16.5 16.2 38.0 18.6 19.9 20.1 18.8 11.1 34.0

BP03 Total 11.8 21.5 24.8 25.8 28.1 27.8 30.0 21.5 21.6 21.2 21.5 39.8 20.9 23.7 24.8 21.6 12.9 35.0

BP07 Total 15.5 35.3 41.0 47.3 50.0 49.5 53.0 34.5 37.3 35.7 32.3 76.6 34.8 37.1 38.1 34.9 16.5 54.0

BP08 Total 33.8 48.3 54.6 52.9 60.3 56.3 61.8 45.8 48.7 45.5 45.4 78.9 47.7 50.1 50.2 47.3 32.5 57.0

BP09 Total 8.9 15.7 17.2 21.4 20.8 19.3 20.4 13.6 19.9 15.3 17.0 29.9 16.9 16.7 15.7 14.7 9.4 24.0

BP10 Total 8.8 21.6 23.2 23.5 27.0 28.3 33.7 23.8 23.5 23.2 23.2 40.8 19.4 29.1 29.0 22.7 9.1 51.0

BP11 Total 9.6 18.9 19.9 19.3 29.0 28.6 29.1 17.5 18.0 14.2 15.0 0.0 18.6 0.0 0.0 18.6 11.0 38.0

BP14 Total 10.0 16.7 19.6 19.0 20.6 20.1 21.2 18.4 16.9 15.7 16.3 34.8 16.2 18.1 18.9 16.8 9.2 18.0

Grand Total 113.8 210.3 238.1 247.1 277.5 272.7 293.6 205.1 218.9 199.7 199.4 363.5 206.0 209.0 210.4 207.4 118.1 339.0

Absolute Difference to NFR 225.2 128.7 100.9 91.9 61.5 66.3 45.4 133.9 120.1 139.3 139.6 -24.5 133.0 130.0 128.6 131.6 220.9

Percentage Difference to NFR 34% 62% 70% 73% 82% 80% 87% 60% 65% 59% 59% 107% 61% 62% 62% 61% 35%
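The bottom two rows follow directly from the Grand Total row: the absolute difference is the aggregate NFR (339.0 s, the SLA grand total) minus the location’s grand total, and the percentage is the grand total as a share of the NFR, so A2 gives 113.8 / 339.0 ≈ 34%, and A13, the only breach, 107%. A minimal sketch of that arithmetic, with just two locations copied in for brevity:

```python
# Per-location comparison against the aggregate NFR from the table above.
NFR_TOTAL_S = 339.0  # SLA grand total (seconds)

# Grand totals in seconds; only two of the 17 locations shown for brevity.
grand_totals = {"A2": 113.8, "A13": 363.5}

for location, total in grand_totals.items():
    headroom = NFR_TOTAL_S - total           # "Absolute Difference to NFR"
    pct_of_nfr = total / NFR_TOTAL_S * 100   # "Percentage Difference to NFR"
    status = "FAIL" if total > NFR_TOTAL_S else "pass"
    print(f"{location}: {pct_of_nfr:.0f}% of NFR, headroom {headroom:+.1f} s -> {status}")
```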

Page 13

Benefits

SUPT and MUPT were completed within the same budget originally allocated only to MUPT.

Automated SUPT

• Early view of the Process Performance using “real” volume data.

• Objective view of the Performance from different locations around the world.

• Eight data volume related performance issues were identified which would not have been discovered until the E2E Testing, providing enough time for them to be fixed and re-tested before go-live.

• One Network Routing issue and four Location Specific performance issues (bandwidth, compression) were identified, allowing the infrastructure enhancements to complete in time for go-live.

Manual SUPT

• Increased Test Coverage for Performance Testing.

• Created a more holistic “Ownership” within the team.

• A number of performance related defects were raised through this channel, one of them a critical showstopper identified within one subsystem.

• This allowed enough time to fix the defects and re-test them during the planned ST/SIT/PT cycles.

• No additional cost to the client for this “Performance Testing”.

Page 14

Lessons Learned

• One of the key challenges was the provisioning of Integrated Test Environments; System Test Environments, by contrast, were generally available and provisioned in time.

• The Integrated Test Environments also brought their own challenges with them, especially around the data integrity between these 20+ systems.

• A Service Virtualisation tool like CA LISA could have brought benefits to this environment (a conceptual sketch follows below):

  • Reduce the time and effort for provisioning Integrated Test Environments.

  • Improve the overall Testing Schedule for System Integration and Performance Testing.

  • More efficient use of the limited hardware and software available for building test environments.

  • Reduction of data management effort and data storage capacity.
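CA LISA was a commercial product, so the points above stay conceptual. Purely to illustrate what virtualising one downstream system looks like, here is a hypothetical stub in Python; the endpoint, canned payload and simulated latency are all invented:

```python
# Hypothetical service-virtualisation stub: fakes one downstream system so
# integration/performance tests can run before the real environment exists.
import json
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED = {"orderId": "ORD-0001", "status": "CONFIRMED"}  # synthetic response
SIMULATED_LATENCY_S = 0.25  # model the real system's typical response time

class StubHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(SIMULATED_LATENCY_S)  # virtualised performance profile
        body = json.dumps(CANNED).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), StubHandler).serve_forever()
```

Pointing test environments at stubs like this removes the dependency on the real downstream systems and their data, which is where the provisioning and data management savings would come from.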

Page 15

Thank You!

