International Conference On Software Test Automation
March 5-8, 2001, San Jose, CA, USA

PRESENTATION
Thursday, March 8, 2001, 11:30 AM

HOW TO EVALUATE AND SELECT A HIGH-END LOAD TESTING TOOL

Marquis Harding, Reality Test
E6 Presentation / Bio
Transcript


A Methodology for Evaluating Alternative Load Testing Tools

Marquis Harding, Reality Test

This is a Reality Test 2

The Selection Problem

• Tool selection is a difficult choice
  - Many alternatives
  - Costly
  - Long evaluation period
• No standard evaluation method
• No standard evaluation criteria


Agenda

• Tool evaluation methodology
  - The experiment
  - The results
  - Technical environment
  - Technical skill set
• Customer evaluation
  - Environment details
  - Evaluation methodology
  - Results


What Is the Objective?

• Predict, diagnose, and correct problems in the system under test (SUT) before deployment.

[Chart: response time vs. number of users (0-50), with regions marked "Unacceptable Performance" and "Incorrect Behavior"]


[Charts: response time vs. number of users (0-50), comparing current SUT performance with reconfigured performance]

What Tool Characteristics Matter?

• Must scale on production-equivalent hardware
• Must accurately represent the real workload
• Must be maintainable and repeatable when SUT changes are tested
• Must be cost effective


Tool Evaluation - The Experiment

• Tool evaluation is an experiment
• You need to:
  - Gather information
  - Identify materials
  - Identify methodology
  - Identify metrics
  - Execute
  - Analyze
• The experiment must be repeatable
  - Refresh database
  - Refresh logs
  - Reset
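The refresh-and-reset cycle above can be sketched as a small harness. This is a minimal illustration, not anything from the presentation; the step names are hypothetical placeholders for restoring the staged database dump, truncating logs, and restarting services:

```python
# Sketch of a repeatable test-reset harness: each reset step is a
# callable, run in order; a run may not start unless every step succeeds.

def reset_environment(steps):
    """Run every reset step in order; return True only if all succeed."""
    for name, step in steps:
        ok = step()
        print(f"{name}: {'ok' if ok else 'FAILED'}")
        if not ok:
            return False
    return True

# Hypothetical placeholder steps -- in a real evaluation these would
# restore the database, clear server logs, and reset the environment.
steps = [
    ("refresh database", lambda: True),
    ("refresh logs", lambda: True),
    ("reset services", lambda: True),
]

ok = reset_environment(steps)  # True when every step succeeded
```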


Information Gathering

• Vendor web sites
• Vendor literature packs
• Local user groups
• Internet resources
• Customer references


Identify Materials

• Materials required
  - Tool
  - Target system
  - Refresh mechanism
  - Monitoring tools
  - Analysis tools
  - Time
• And most importantly
  - Technical support
  - Management support


Determine Methodology

• Determine functions to test
  - 1 to 3 or more representative scenarios
  - Start with a read-only scenario, then add insert & update scenarios
  - Vary complexity
  - Create input data
  - Consider security
  - You can't test everything


Determine Metrics

• Quantitative metrics
  - Memory usage
  - CPU usage
• Qualitative metrics
  - Ease of use
  - Recording process
  - Scripting
  - Reporting
  - Protocol support


Executing the Test

• Some things to consider
  - Network load: day vs. night
  - System load
  - Stress of measurement tools
• The test must be repeatable
  - Refresh database
  - Refresh logs
  - Reset


Analyze Results

• Validate the run
  - Invalid return results
  - Dropped connections
• Examine timing data
  - Tool data
  - External reporting data
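Validating a run before trusting its timing data can be sketched in a few lines. The records and field names below are invented for illustration; the point is that a sample counts only if it returned a valid result over a connection that was not dropped:

```python
# Sketch: validate a load-test run, then compute timings over valid
# samples only. Invalid results and dropped connections are excluded.

def analyze_run(samples):
    valid = [s for s in samples
             if s["status"] == "ok" and not s["dropped"]]
    errors = len(samples) - len(valid)
    times = [s["resp_time"] for s in valid]
    avg = sum(times) / len(times) if times else None
    return {"valid": len(valid), "errors": errors, "avg_resp_time": avg}

# Invented example records: one dropped connection, one invalid result.
samples = [
    {"status": "ok",    "dropped": False, "resp_time": 1.2},
    {"status": "ok",    "dropped": True,  "resp_time": 9.9},
    {"status": "error", "dropped": False, "resp_time": 0.1},
    {"status": "ok",    "dropped": False, "resp_time": 1.8},
]
report = analyze_run(samples)  # 2 valid, 2 errors, average 1.5
```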


Technical Environment

• Ample supply of driver machines
  - As much hard drive storage space as possible
  - Ample memory: budget 3 MB per VU
• Keep staged database backups/dump files
• Keep all result files
• Double your worst-case time estimate
  - Every error, omission, and oversight costs one hour
  - Server response times slow with additional users
  - User logon time grows exponentially
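The 3 MB-per-VU budget translates directly into driver-machine counts. A quick sketch of that sizing arithmetic; the 64 MB per-machine overhead and the 512 MB example machine are assumptions for illustration, not figures from the presentation:

```python
import math

# Sketch: size the driver pool from the 3 MB-per-VU memory budget.
MB_PER_VU = 3        # memory budget per virtual user, from the slide
RESERVED_MB = 64     # hypothetical OS/tool overhead per driver machine

def vus_per_machine(ram_mb):
    """Virtual users one driver machine can host within the budget."""
    return (ram_mb - RESERVED_MB) // MB_PER_VU

def machines_needed(target_vus, ram_mb):
    """Driver machines required to reach the target user load."""
    return math.ceil(target_vus / vus_per_machine(ram_mb))

# A 512 MB driver hosts (512 - 64) // 3 = 149 VUs under this budget,
# so a 300-VU test needs at least 3 such machines.
per_machine = vus_per_machine(512)      # 149
pool = machines_needed(300, 512)        # 3
```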


Tool Implementation

• Technical skill set

[Diagram: skills surrounding the tool (PerformanceStudio): system under test architecture, business processes, tool knowledge, networking, database management, Windows NT, Unix, HTTP, SQL, project management, statistics]


Customer Evaluation

• After information gathering, the decision came down to evaluating performance testing tools on a real production system!
• Good management support
• Fair technical support
• Other measurement aids
  - Windows NT: MS Perfmon
  - SQL Server: SQL Trace
  - WinDiff


The Experiment

• Project X
• SQL Server-driven application for a customer
• Application to track user maintenance
• Evaluate performance testing tools on a real production system
  - All were shipping versions
• Qualitative and quantitative analysis


Project Time

• Preparation time
  - Total time elapsed: 2 months
  - Active time spent on project: 2 weeks
• Execution time
  - Total time elapsed: 6 days
  - Active time spent on project: 5 days
• Analysis time
  - Total time elapsed: 5 days
  - Active time spent on project: 5 days


Technical Environment

• Recording environment
  - Application: customer service
  - Client: Gateway Pentium 200, Windows NT Server
  - Server: SQL Server 6.5 on a Dell Pentium II 450, 512 MB RAM
  - Tools: current shipping versions


The Recording Process

For a fair evaluation, scripts had to be IDENTICAL.

• Three scenarios identified
  - 2 focused on specific areas of concern
  - 1 complex business process
• Complex business process scenario dropped
  - Proved redundant: the first two yielded sufficient coverage
  - The script was complex, and the additional effort was not justified


The Recording Process, Cont.

• Recorded original scripts with one tool
• Used tool-specific recording to capture the same transactions
  - Play back 1 instance of the original script
  - Capture the transactions
• Both scripts were edited for data correlation
• Tool output and SQL Trace outputs were analyzed with WinDiff to ensure they were exactly the same
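The WinDiff check (confirming each tool drove the server with exactly the same SQL) can be approximated with Python's difflib. The trace lines below are invented examples, not the project's actual captures:

```python
import difflib

# Sketch: compare two SQL Trace captures to verify that both tools sent
# identical workloads to the server -- the role WinDiff played here.

trace_tool1 = ["SELECT * FROM users WHERE id = @p1",
               "UPDATE users SET name = @p2 WHERE id = @p1"]
trace_tool2 = ["SELECT * FROM users WHERE id = @p1",
               "UPDATE users SET name = @p2 WHERE id = @p1"]

diff = list(difflib.unified_diff(trace_tool1, trace_tool2, lineterm=""))
identical = not diff  # an empty diff means the captures match exactly
```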


The Execution Process

• Executed four tests
  - User load: 1, 50, 150, and 300 virtual users
• Scheduling difficulties
  - Tool 1 scheduling features available
    - Random events
    - Complex logon patterns
    - User profiling
• 5 days to execute
  - Generally off hours


Execution Hardware

Driver machines:
• Gateway Pentium 166 MHz, 128 MB
• Gateway Pentium II 233 MHz, 256 MB
• Dell Pentium II 450 MHz, 512 MB
• Dell Pentium II 450 MHz, 512 MB
• Dell Pentium II 450 MHz, 512 MB

Controller:
• Gateway Pentium II 233 MHz, 256 MB


Analysis - Quantitative Results

• Used NT Performance Monitor
  - Memory metric: Available Bytes
  - Processor metric: % Processor Time
• Used SQL Trace to analyze the database
  - Verify that all tools performed the same transactions
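The per-VU memory footprints reported on the following slides can be derived from the Available Bytes counter by simple arithmetic. A sketch of that calculation; the byte counts below are illustrative, chosen only to reproduce the 1.60 MB/VU figure:

```python
# Sketch: derive per-VU memory footprint from Perfmon's Available Bytes.
# Footprint = (available memory before the run - available during the run)
#             / number of virtual users.

MB = 1024 * 1024

def footprint_mb_per_vu(avail_before, avail_during, n_vus):
    return (avail_before - avail_during) / n_vus / MB

# Illustrative numbers: 60 VUs consuming 96 MB gives 1.6 MB/VU.
avail_before = 400 * MB   # Available Bytes at idle (assumed)
avail_during = 304 * MB   # Available Bytes under 60-VU load (assumed)
fp = footprint_mb_per_vu(avail_before, avail_during, 60)  # 1.6
```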


Tool 1 Processor & Memory Stats
Dell Pentium II 450 MHz, 512 MB, 60 virtual users, Script 1

Average footprint: 1.60 MB/VU

[Charts: memory and average processor utilization during the run]


Tool 1 Processor & Memory Stats
Gateway Pentium 166 MHz, 128 MB, 60 virtual users, Script 2

Average footprint: 0.52 MB/VU

[Charts: memory and average processor utilization during the run]


Tool 1 and SQL Server Statistics
Dell Pentium II 450 MHz, 512 MB

[Chart: SQL Server statistics during logon]


Different Log-on Emulation

• Tool 2 may not accurately emulate connections for the SUT
• Tool 1 emulates connections as they were recorded


Surprising Differences!

• Tool 1 found database locking
  - Verified as a problem by real user testing
• Accurate connection modeling
• Accurate pacing


Analysis - Qualitative Results

• Ease of use
  - Script length using Tool 1:
      Script 1: 2,715 lines
      Script 2: 2,032 lines
      Average: 2,374 lines
  - Script development time, Tool 1: 2 days per script

Note: Knowledge gained by scripting in other tools saved scripting time.
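For completeness, the slide's average can be recomputed from its two figures; a one-line sketch:

```python
# Sketch: recompute the average script length reported on the slide.
script_lines = {"Script 1": 2715, "Script 2": 2032}

average = round(sum(script_lines.values()) / len(script_lines))  # 2374
```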


Analysis - Qualitative Results

• Features worth mentioning
  - DataSmart recording
  - Script splitting
  - Timing of individual commands
  - Complex scheduling
  - Server error handling
  - Shared memory (ability to pass information between virtual users)
  - Network recording
  - Accurate script pacing
  - Accurate connection emulation
  - On-line monitoring
  - Detailed reporting
  - Support mechanism


Lessons Learned

• Tool choice matters!
• Performance testing works!
  - Revealed application architecture deficiencies
  - Found deadlocks
  - Found redundant database code
  - Determined optimization points
• Be prepared
  - Time estimates
  - Double your hard drive space
  - Off-hours availability

Marquis Harding

Marquis Harding has over twenty-five years of Information Technologies and Software Quality Assurance experience. His background includes development and QA of large and mid-range mainframe, client/server, and Internet systems, and senior management of QA and testing for large companies spanning the financial, telecommunications, and software industries. Mark has presented at international conferences on software development and testing. Marquis is a disabled Vietnam Veteran.

While at Microsoft Corporation, he held the positions of Group Quality Assurance/Test Manager for Windows.com, WindowsUpdate.com, and Microsoft.com, and Test Manager for IT Sales & Marketing/Product Support Services. In six years at Charles Schwab & Co., Inc., he held the positions of Senior Test Manager, ITG, as well as Development Manager for Schwab's Financial Advisor Services division. Prior to this, he spent seventeen years at Pacific Telesis, where he was employed as Manager of Information Technology Support for the CFO and Executive Vice President of Operations.
