Post on 31-Oct-2014
Copyright © 2012 Aware Corporation Ltd.
The Audacity of Quality Requirements: Non Functional Testing
Testing as Managed Services
“Think Communicate Implement”
By - Sudeepta Guchhait (Deep)
“Quality is the link to Success”
How to DEFINE the business problem?
How to DERIVE high-quality Non-Functional test requirements?
How to VERIFY high-quality Non-Functional test requirements?
How to TEST Non-Functional test requirements?
How to DISCOVER the bottlenecks?
Key BEST Practices – Take Away
INTENT
How to DEFINE the business problem caused by
low-quality Non-Functional test requirements?
Voice of Customer
Proposition – Testing Upfront
Voice of Customer
Proposition – Test Upfront
(Chart: "When to Test" – % of original defects removed by phase: roughly 15% at requirement, 20% at design, 20% at infrastructure, 35% at testing, leaving ~10% as production defects.)

Requirement (15%): Performance modeling, think-time modeling

Test design (20%):
• Identification of critical business transactions; defining the test SLA, user load distribution, transactional volume, and peak usage

Infrastructure (20%):
– Investment needs to be made to procure a production-like NFT test environment (physical as well as logical design)
– Appropriate test tool procurement

Testing (35%):
– Test execution for performance, load, stress, scalability (capacity), and availability (reliability) tests

Qualitative benefits:
– Consistency: on-time, on-budget delivery to the business, increased client satisfaction, decreased implementation risk
– Identify system and application bottlenecks and fine-tune them
√ Improved software quality with varied data sets
√ Fully tested instance in line with the expectations of business users

Quantitative benefits:
– % reduction in original defects reaching production (chart compares original defects removed in the Requirement, Infrastructure & test lab, Design, and Testing phases against production defects)
Challenges of Performance testing
Creation of tests
Determine what to simulate
Determine which performance factors are critical
Analyze desired test results
Fine-tune the bottlenecks
Infrastructure
Build a realistic test environment (Infrastructure usage)
Select the appropriate test tool
Usability of tests
Software build should be functionally stable
Tests are resource-intensive
How to DERIVE high-quality Non-Functional
test requirements?
Non Functional Objectives
Practical Conversation
Performance [ Modeling / Testing / Engineering ]
Non Functional Objectives

Performance Testing
Application Response Time – How long does it take to complete a task?
Reliability – How stable is the system under a heavy load?
Acceptance – Is the system stable enough to go into production?
Capacity Planning – At what point does degradation in performance occur?
Bottleneck Identification – What is the cause of degradation in performance?
Product Evaluation – What is the best server configuration for 100/500/1000 users?
Practical conversation on Performance of Application
"I think we need Performance Testing, but what is it exactly?"
"I know you want it to be fast, but how fast?"
"I think around 300 users will use the system, and they will do all kinds of activity, so can we determine performance?"
"What will you do with production data in performance testing?"
"Post-test charts look nice, but let me know whether the performance is good or bad"
"Does that mean we're done? Can we release?"
Identification of Performance Transactions
80/20 rule
- 20% of the critical transactions will be performed by 80% of real business
users.
- high visibility activities
• Creating initial user profile
• Updating payment information
- high importance activities
• Withdrawals
• Transfer funds
• Paying bill on-line
- performance intensive activities
• Importing the monthly report from “the other system”
• Requesting payment history
• Heavy weight transactions (Stock trade, IPO subscription etc.)
- Identify the “heavy hitters” even though they may be used less often
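The 80/20 identification above can be sketched as a simple Pareto pass over logged transaction volumes, picking the smallest set of transaction types that covers most of the observed load. The transaction names and counts below are hypothetical, purely for illustration:

```python
from collections import Counter

def pareto_transactions(log, coverage=0.80):
    """Return the smallest set of transaction types that covers
    `coverage` of total observed volume (the 80/20 heavy hitters)."""
    counts = Counter(log)
    total = sum(counts.values())
    selected, running = [], 0
    for name, n in counts.most_common():
        selected.append(name)
        running += n
        if running / total >= coverage:
            break
    return selected

# Hypothetical sample of logged transaction names
log = (["withdrawal"] * 50 + ["transfer_funds"] * 30 +
       ["pay_bill"] * 12 + ["update_profile"] * 5 + ["monthly_report"] * 3)
print(pareto_transactions(log))  # ['withdrawal', 'transfer_funds']
```

Note that separately flagged "heavy hitters" (e.g., a monthly report import) should still be added by hand even when they fall outside the volume cut.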
Performance Modeling (Validation)
“Performance validation is the process by which software is tested with the
intent of determining if the software meets pre-existing performance
requirements. This process aims to evaluate compliance.”
Primarily used for…
Determining SLA compliance
Determining user-behavior load
Determining volume usage
Performance Testing
“Performance testing is the process by which software is tested to determine the
current system performance. This process aims to gather information about
current performance, but places no value judgments on the findings.”
Primarily used for…
Determining capacity of existing systems
Creating benchmarks for future systems
Evaluating degradation with various loads and/or configurations
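The distinction between testing (gathering numbers) and validation (judging them against an SLA) can be sketched in a few lines: compute a nearest-rank 95th-percentile response time, then turn it into a pass/fail verdict. The sample times and SLA values are hypothetical:

```python
def percentile(samples, pct):
    """Nearest-rank percentile of a list of response times (seconds)."""
    ordered = sorted(samples)
    k = max(0, int(round(pct / 100 * len(ordered))) - 1)
    return ordered[k]

def meets_sla(samples, sla_seconds, pct=95):
    """Validation verdict: does the pct-th percentile stay within the SLA?"""
    return percentile(samples, pct) <= sla_seconds

# Hypothetical measured response times from a test run
times = [0.8, 1.1, 0.9, 1.4, 2.9, 1.0, 1.2, 0.7, 1.3, 1.1]
print(percentile(times, 95))        # 2.9 – testing: just a number
print(meets_sla(times, 2.0))        # False – validation: p95 exceeds the 2 s SLA
```

Percentiles are preferred over averages here because a single slow outlier (the 2.9 s sample) is exactly what an average would hide.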
Performance Engineering
“Performance engineering is the process by which software is tested and tuned
with the intent of realizing the required performance. This process aims to
optimize the most important application performance trait, user experience.”
Primarily used for…
Finding application bottlenecks
Extending the capacity of old systems
Fixing application problems that do not meet requirements/SLAs
How to VERIFY high-quality Non-Functional
test requirements?
Defect Root Cause
Issue in Requirement & Mitigation
Defect Root Cause Identification
Identify Key Scenarios: Identify scenarios where performance is important
and scenarios that pose the most risk to performance objectives
Identify Workload: Identify how many users and how many concurrent users the
system needs to support
Identify Performance Objectives: Identify performance objectives for each of
the key scenarios.
Baseline is the process of running a set of tests to capture performance
metric data for the purpose of evaluating the effectiveness of subsequent
performance improving changes to the system or application.
Benchmarking is the process of comparing your system performance against
an industry standard that is endorsed by some other organization.
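Baselining in this sense pays off when a later run is compared against the captured baseline metrics. A minimal sketch, assuming per-transaction mean response times and a 10% tolerance (the transaction names and numbers are illustrative):

```python
def regressions(baseline_ms, current_ms, tolerance=0.10):
    """Flag transactions whose current response time regressed more than
    `tolerance` versus the captured baseline run; values are ms."""
    flagged = {}
    for txn, base in baseline_ms.items():
        cur = current_ms.get(txn)
        if cur is not None and cur > base * (1 + tolerance):
            flagged[txn] = round((cur - base) / base, 2)  # fractional slowdown
    return flagged

# Hypothetical baseline run vs. run after a code change
baseline = {"withdrawal": 900, "transfer": 1200, "pay_bill": 700}
current = {"withdrawal": 950, "transfer": 1500, "pay_bill": 690}
print(regressions(baseline, current))  # {'transfer': 0.25}
```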
Defect Root Cause Identification - Example
F1 A
P
I
Fn A
P
I
App1 App2
1
2
3
4
5
6
8 10
9 11
12
12 times * number of sec
Step 1: Funds Transfer Menu (App1 –App2)
Point-to-Point – Component Level View 1. Captures the internal processing time between steps (low-level system processing)
2. No ability to measure this with testing tool alone
3. Requires monitors to be installed to collect internal processing time at each step
7
Network Monitoring Server Monitoring
Server Monitoring API Processing Monitoring
API Processing Monitoring
Requirement Design
Issue 1: Illustrative Example

High-level products/functions/modules specified in the documents (per the BA/SME and SolArc/SME): ESANDA – Deposit, Withdrawal, Payment, Cheque, EOD Subscription.

Step 1: Determine user load distribution across products/functions/modules
Issue in Requirement & Mitigation

Information given in the document:
Total user base: 1444

Details       No. of Users   Time
Concurrency   1441           8:30 am – 10:30 am
Peak load     1441           12:00 pm – 3:30 pm

Distribution missing in the document:

Business Process    % of Users
Deposit             ??
Withdrawal          ??
Payment             ??
Cheque              ??
EOD Subscription    ??

Without this distribution, real-time user simulation may not be realistic. Mitigation: the user load distribution is discussed and agreed during the requirement workshops.
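Once the workshop agrees the percentages, splitting the 1441 concurrent users across business processes is a small exercise. The weights below are hypothetical placeholders for the "??" values; largest-remainder rounding keeps the total exact:

```python
def distribute_users(total_users, weights):
    """Split a concurrent-user target across business processes.
    `weights` maps process name -> fraction of users (must sum to 1.0)."""
    raw = {name: total_users * w for name, w in weights.items()}
    counts = {name: int(v) for name, v in raw.items()}
    leftover = total_users - sum(counts.values())
    # hand leftover users to the largest fractional remainders
    for name in sorted(raw, key=lambda n: raw[n] - counts[n], reverse=True)[:leftover]:
        counts[name] += 1
    return counts

# Hypothetical split agreed in a requirement workshop
weights = {"Deposit": 0.30, "Withdrawal": 0.25, "Payment": 0.20,
           "Cheque": 0.15, "EOD Subscription": 0.10}
print(distribute_users(1441, weights))
```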
Issue in Requirement & Mitigation

Step 2: Decompose high-level scenarios (products/functions/modules) into low-level scenarios or test steps and assign a response time to each step.

High-level scenarios (per the BA/SME and SolArc/SME): ESANDA – Deposit, Withdrawal, Payments, Cheque, EOD Subscription.

Example: the Withdrawal scenario decomposes into Scenario 1 (Step 1, Step 2, … Step 28, Step 30) and Scenario 2 (Step 1, Step 2, … Step N), produced by the test team in an internal workshop.

Transaction response times for the steps would be discussed and agreed in the requirement workshop. Mitigation: hold workshops to decompose the high-level scenarios into low-level test cases/test steps.
Model Real Users / Think Time
Why must they be accurately modeled?
- Results from inaccurately modeled tests are nearly always inaccurate, and often lead to incorrect decisions.
- The only way to predict actual user experience (end-to-end response time) is to execute tests using realistic user community model(s).
- Extrapolating expected performance from incomplete models doesn't work.
You know what, end users will move at different speeds in your application.
And…
It's your job to design how to model the think time and execute scripts at varying speeds.
"The one thing that matters the most is not how your application behaves under theoretical or simulated conditions, but how well it works when you plug it into the wall and let everyone come hit your box from all across the world" – Serdar Yegulalp
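One common way to model think time (an assumption here, not the author's prescribed method) is to randomize each virtual user's pause around a mean, so users move at varying speeds instead of in lockstep:

```python
import random

def think_time(mean_seconds, spread=0.5, rng=random):
    """Sample a user 'think time' uniformly in mean ± spread*mean,
    so virtual users pace requests at realistic, varying speeds."""
    low = mean_seconds * (1 - spread)
    high = mean_seconds * (1 + spread)
    return rng.uniform(low, high)

random.seed(7)  # fixed seed only so the sketch is reproducible
samples = [think_time(8.0) for _ in range(1000)]
avg = sum(samples) / len(samples)
print(round(avg, 1))  # close to the 8 s mean; individual waits vary 4-12 s
```

Fixed zero think time inflates apparent concurrency far beyond what real users generate, which is one way inaccurately modeled tests lead to incorrect decisions.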
How to TEST Non Functional
Requirements?
Methodology
Approach
Execution & Monitoring
Diagnose & Tune
Analysis
Test Methodology
Test Approach

Performance Test Approach: Design → Plan → Execution

Design
- Understand customer environment
- Understand application
- Identify business-critical processes
- Identify load model & volumetrics
- Identify performance test scenarios
- Determine acceptable test conditions
- Understand tuning methodology

Plan
- Create performance test plan
- Estimate load generators
- Set up controller & load generators
- Set up monitors and logs

Execution
- Initial benchmark execution
- Tuning scenarios (repetitive)
- Generate test report & log defects
- Analyze & identify bottlenecks
- Tune application
- Final benchmark execution
- Final report

Performance Testing Team – Collaborative Support
- Understand project goals and timelines

Script Development & Validation
- Create performance test cases
- Generate automated test scripts
- Customize automated test scripts
- Create performance test scenarios
- Test environment setup

Test Data Preparation
- Analyze production data
- Identify required data
- Generate test data
- Verify test data
Execution & Monitoring

[Diagram: an automation-tool controller drives virtual users across the Internet/WAN against the web server, application server, and database, with performance monitors on every tier.]

Replaces real users with thousands of virtual users
Generates consistent, measurable, and repeatable load, managed from a single point of control
Efficiently isolates performance bottlenecks across all tiers/layers with automated reporting and analyses
Post Test Analysis
• Analyse the most difficult and important data
• Check whether performance criteria are met
• Plan for the next level of performance improvements
• Provide input to the next phase of application design, development, and tuning
Diagnose & Tune
How to DISCOVER Non Functional
Bottlenecks?
What are bottlenecks?
Analysis
Example
What are the Bottlenecks?
A term used to describe a limiting resource in the system under test. Bottlenecks
are typically identified as the cause of slow or unacceptable performance. They
directly affect both the performance and scalability of the system under test.
Some common areas where bottlenecks can appear in any system are:
CPU burst
Memory Leakage
Disk threshold
Network utilization
Operating System limitations
Bottlenecks exist within systems for a number of reasons. These include:
Inadequate server hardware to support the projected system load
Inadequate network capacity, both internal and external
Poor system and architecture design decisions
Databases that are incorrectly implemented or tuned
Developers not being mindful of performance considerations during development
Bottleneck Analysis
Data collection to be done during test execution for bottleneck analysis
Profilers – Record the time spent in different parts of a program. Profilers like SQL
profiler can be used for recording the time spent on SQL queries, stored procedures, etc.
Traces – Record occurrence of various specified events. These traces can be on the
client side ( server response, communication between server and client, done by the load
controller) or at the server side (e.g. event viewer on the servers). Calls to web and app
servers can be accounted for by looking into the trace logs.
Counters – Record the health of the servers during test execution. Load-generating
tools or utilities such as perfmon on Windows, vmstat and iostat on Unix, and netstat for
network monitoring can be used.
Examples
The CPU utilization of the application server reached unacceptable levels shortly
before the response times increased.
Monitoring the CPU queue length produced the chart in the figure, which showed a
direct correlation between the queue length and the poor performance.
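The "direct correlation" in that example can be quantified from the counter data collected during the run. A minimal sketch using Pearson correlation between a monitored counter series and the measured response times (the sample values below are hypothetical):

```python
def correlation(xs, ys):
    """Pearson correlation between two series sampled at the same intervals,
    e.g., CPU run-queue length vs. transaction response time."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical counter samples collected while the load ramped up
queue_len = [1, 1, 2, 4, 8, 12, 15, 18]          # CPU queue length
resp_time = [0.4, 0.5, 0.6, 1.1, 2.3, 3.8, 4.9, 6.0]  # seconds
print(correlation(queue_len, resp_time) > 0.9)    # True: strong correlation
```

A high correlation does not prove causation by itself, but it tells you which counter to investigate first.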
Inefficient Database Calls

Symptom
• Slowdown in many transactions even under light load
• Running out of DB connections

Cause
• Inefficient database calls
– Is the database fully tuned?
– Is a connection pool in use and optimized?
– Are too many calls being made?
– Are stored procedures in use?

Diagnosis
– Profile all SQL calls
– Isolate long-running queries with JDBC instrumentation
Key Best Practices – Take Away
Best Practices & Take Away

Performance requirements gathering is the key
- Workload to be driven by current and future business volume patterns; impact analysis of one operation on the other

Data population
- Sufficient data volumes avoid misleadingly optimistic results; 50% – 70% – 100% of production data: good, better, best

Gap between test and production environments
- Performance can look great on the test environment but bomb in production

Usage of a WAN emulator
- Obtain real-user perception of end-to-end performance (simulating WAN & network)

Run duration
- Test run duration should cover the steady state; transient-state readings alone give a wrong picture of performance

Test early
- Performance-target-driven development
- Fine-tune the database and application server
- Packaged solutions: benchmark the vanilla product for typical workloads
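The run-duration point is easy to operationalize: discard the ramp-up portion of the measurement window before reporting, so transient readings cannot skew the result. A minimal sketch, assuming the first 20% of samples is warm-up (the throughput figures are hypothetical):

```python
def steady_state_window(samples, warmup_fraction=0.2):
    """Drop the transient ramp-up portion of a test run so reported
    metrics reflect steady state only."""
    start = int(len(samples) * warmup_fraction)
    return samples[start:]

# Hypothetical per-interval throughput: ramp-up, then steady state
throughput = [10, 40, 150, 151, 149, 150, 152, 150, 148, 150]
steady = steady_state_window(throughput)
avg = sum(steady) / len(steady)
print(round(avg))  # 150: steady-state average, unaffected by ramp-up
```

Averaging over the whole run instead would drag the reported throughput well below the true steady-state figure.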