Software Performance And The Development Life Cycle
Carey Schwaber, Analyst, Forrester Research
February 14, 2005. Call in at 12:55 p.m. Eastern Time.
Entire contents © 2006 Forrester Research, Inc. All rights reserved.
Software performance
► The speed at which software functions (sub-second response time versus three-second response time)
► How often software is available (99% uptime versus scheduled weekly downtime)
► How these two factors change as usage levels increase (from dozens of concurrent users to thousands)
Load testing is necessary but not sufficient
[Diagram: load testing covers only one stage of the life cycle, leaving open questions at every other stage]
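As a purely illustrative sketch of what the load-testing step measures, the following Python stands in for a commercial load tool: it fires concurrent calls at a stand-in operation and reports latency statistics. The target function, user counts, and timings are all assumptions for illustration, not any vendor's API.

```python
# Minimal load-generator sketch (illustrative): fires concurrent calls at a
# target function and reports response-time statistics.
import time
import statistics
from concurrent.futures import ThreadPoolExecutor

def target_operation():
    """Stand-in for the system under test (e.g., an HTTP request)."""
    time.sleep(0.01)  # simulate 10 ms of service time

def timed_call():
    start = time.perf_counter()
    target_operation()
    return time.perf_counter() - start

def run_load_test(concurrent_users=50, requests_per_user=4):
    # Issue all requests through a pool sized to the simulated user count.
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        latencies = list(pool.map(
            lambda _: timed_call(),
            range(concurrent_users * requests_per_user)))
    latencies.sort()
    return {
        "median_s": statistics.median(latencies),
        "p95_s": latencies[int(len(latencies) * 0.95) - 1],
        "max_s": latencies[-1],
    }

if __name__ == "__main__":
    print(run_load_test())
```

Even a sketch like this makes the slide's point visible: it only answers questions about the one operation it exercises, at the load levels it simulates.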
Theme
A performance-driven development life cycle helps IT shops more effectively and efficiently meet business performance requirements
Agenda
• Software performance matters
• Performance verification ensures software performance meets business needs
• But performance-driven development drives down the cost of meeting these needs
• How to move to performance-driven development
Software performance matters to the business
• Software performance is a limiting factor of business performance
» A financial services company can’t provide a life insurance quote to a customer any faster than its quote engine can compute the appropriate rate
» An automotive supplier can’t start building parts if the software that delivers orders from its OEM customers has been taken down by unanticipated usage levels
» A retailer can’t process orders any faster than it can process credit card validations
But development shops neglect performance, and firefighting wastes time, money, and goodwill
• IT loses credibility when performance problems disrupt business operations
» Problems with in-store app performance earn a retail CIO the ire of his fellow executives
• Postponing problem resolution until late in the life cycle is less cost-effective by at least an order of magnitude
» A US health insurance company calculates several million dollars of avoidable costs every year
• Millions of dollars are wasted on unnecessary hardware
» A global pharmaceutical purchases several times the hardware it needs
• Funds are diverted from strategic initiatives to ongoing maintenance and operations
» A telecom spends $29 million on support calls in six months when its EAI infrastructure experiences 15-second timeouts
The answer? A performance-driven development life cycle
[Chart: maturity increases from firefighting through performance verification to performance-driven development, pushing performance work into earlier stages of the software development life cycle: requirements, design, development, testing, production]
Why performance verification? The view from ops

Lost revenue = % of irrecoverable business × duration of outage × average revenue per hour

IT costs = problem identification + analysis and resolution + testing + cost of external support + cost of data recovery

Productivity costs = (number of people affected × average % of lost productivity × average cost per employee × duration of outage) + average overtime costs

Lost revenue + IT costs + productivity costs = Real costs

Costs if resolved in development are a fraction of the costs if resolved in production.
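With illustrative figures plugged in (every input below is assumed for the sake of the example, not Forrester data), the cost model above computes directly:

```python
# Worked example of the ops-side outage cost model. All input figures are
# illustrative assumptions.

def lost_revenue(pct_irrecoverable, outage_hours, revenue_per_hour):
    return pct_irrecoverable * outage_hours * revenue_per_hour

def it_costs(identification, analysis_and_resolution, testing,
             external_support, data_recovery):
    return (identification + analysis_and_resolution + testing +
            external_support + data_recovery)

def productivity_costs(people_affected, pct_lost_productivity,
                       cost_per_employee_hour, outage_hours, overtime):
    return (people_affected * pct_lost_productivity *
            cost_per_employee_hour * outage_hours) + overtime

real_cost = (
    lost_revenue(0.30, 4, 50_000)                    # 30% irrecoverable, 4 h
    + it_costs(5_000, 20_000, 8_000, 10_000, 3_000)  # diagnosis through recovery
    + productivity_costs(200, 0.5, 40, 4, 6_000)     # 200 staff at half speed
)
print(f"${real_cost:,.0f}")  # $128,000
```

A single four-hour outage under these assumptions costs roughly $128,000, which is the comparison the slide invites: the same defect fixed in development costs a fraction of that.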
Business and IT partner to set performance requirements
• Look first at high-level business needs
» New functionality on an office supply eCommerce site can’t degrade page load time by more than 10%
• Right-size expectations by talking tradeoffs and chargebacks
» A UK life insurance company has found that building service levels into chargebacks results in more accurate performance requirements
» Pitney Bowes helps business stakeholders understand necessary tradeoffs by asking questions like, “Are you willing to pay an additional $10,000 for a third failover site?”
• The production environment is itself a set of requirements
» A US grocery chain traces the majority of its performance problems back to improperly understood production constraints
Get more bang for your performance testing buck
• Cut the cost of replication by centralizing the performance test lab
» One $30B+ firm spent $11 million on testing and staging environments
» Adding virtualization to the test lab’s tool kit drives this cost down
» A team at CSFB spent approximately $300,000 on performance modeling tools instead
• Improve and share performance testing services with a test center of excellence
» Experienced performance testers command salaries ranging from $75,000 to $150,000
» Software necessary for testing 3,000 concurrent users costs $300,000
• Offshore the creation of performance test scripts to keep staff focused on optimization
» Performance testing is too iterative to offshore entirely
» A Midwest retailer has TCS conduct capacity planning, design for performance, performance optimization, and tuning on-site but creates and executes test scripts offshore
• Enable data and asset sharing by using testing tools that integrate with other life-cycle tools
» For example, integrate testing tools with monitoring tools, or testing and monitoring tools with development tools
Performance verification isn’t enough
• Resolving performance problems during testing still takes more time and money than preventing them from being introduced in the first place
» Resolving problems earlier in the life cycle is less expensive by several orders of magnitude, but your mileage will vary
• For most companies, the majority of performance problems are rooted in design decisions
» Once performance testing gets underway, it’s too late in the game to make any significant changes to an application’s architecture
• The amount of time and money performance testing takes varies widely and is difficult to predict
» Problems in performance testing are a common cause of missed release dates and budget overruns
» Some outsourcers avoid performance testing engagements because their indeterminate length leads to tension with clients
• Shops that stop treating design and development as a black box, and instead adopt practices that prevent performance defects from ever being introduced, increase both the efficiency and the predictability of their development efforts
Design maps out a strategy for meeting performance requirements
• Define performance objectives for application components the team owns
» Component-level performance objectives serve as pass/fail criteria for developers’ early performance testing efforts
• Secure performance contracts for components maintained by other groups
» A major grocery chain has an “ITIL guy” to help IT better define performance contracts for internal services to get more “givens”
• Model application performance to validate design decisions
» A financial services firm estimates that using HyPerformix to model app performance and identify potential bottlenecks during design has saved it $15 million — far more than its $300,000 investment
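One way to make a component-level performance objective executable is to express it as a pass/fail assertion developers can run on every build. A minimal sketch, in which the component, the latency objective, and the sample size are all hypothetical:

```python
# Sketch: a component-level performance objective as an automated pass/fail
# check. The component, objective (100 ms), and sample size are illustrative.
import time

LATENCY_OBJECTIVE_S = 0.100  # design-stage objective for this component

def quote_component(amount):
    """Stand-in for a component the team owns (e.g., a rate calculator)."""
    time.sleep(0.001)  # simulate 1 ms of computation
    return round(amount * 1.07, 2)

def meets_objective(component, samples=20):
    # Pass only if the worst observed latency stays under the objective.
    worst = 0.0
    for _ in range(samples):
        start = time.perf_counter()
        component(100.0)
        worst = max(worst, time.perf_counter() - start)
    return worst <= LATENCY_OBJECTIVE_S

assert meets_objective(quote_component), "component misses its latency objective"
```

Wired into a build, a check like this turns the design-stage objective into exactly the pass/fail criterion the slide describes.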
Developers anticipate test and monitoring
• Check their work by testing early, often, and automatically
» A $15 billion retailer finds that when developers test component-level performance, there are fewer problems during final performance testing — mostly integration issues
» ThoughtWorks, a systems integrator, performs continuous performance testing
• Minimize the time to problem resolution by instrumenting code for test and monitoring
» A multi-channel retailer expects that adding JMX instrumentation to its corporate applications will dramatically decrease the time it takes to resolve performance problems
» Instrumentation plugs into management consoles from vendors like HP
» IDE plug-ins make it easier for developers to instrument their code
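The JMX example above is Java-specific, but the underlying idea — recording call counts and latencies inside the code so a monitoring console can read them — can be sketched in a language-neutral way. Everything here (function names, metric shape) is an illustrative assumption, not any vendor's API:

```python
# Sketch of code-level instrumentation: a decorator records per-function call
# counts and latencies that a monitoring agent could export (conceptually
# what JMX MBeans provide in Java). All names are illustrative.
import time
import functools
from collections import defaultdict

METRICS = defaultdict(lambda: {"calls": 0, "total_s": 0.0, "max_s": 0.0})

def instrumented(func):
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            # Record timing even when the call raises, so failures show up too.
            elapsed = time.perf_counter() - start
            m = METRICS[func.__qualname__]
            m["calls"] += 1
            m["total_s"] += elapsed
            m["max_s"] = max(m["max_s"], elapsed)
    return wrapper

@instrumented
def validate_order(order_id):
    time.sleep(0.002)  # stand-in for real work
    return order_id > 0

for i in range(1, 4):
    validate_order(i)
print(METRICS["validate_order"]["calls"])  # 3
```

When a problem surfaces, metrics like these point straight at the slow call, which is what shortens time to resolution.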
Sample economics of a move to performance-driven development
Cost of resolution for 100 defects at x = $100. Each cell shows the % of defects resolved at that stage and the resulting cost (100 defects × % resolved × stage multiplier × $100).

Life-cycle stage   Cost multiplier   Firefighting        Performance verification   Performance-driven development
Requirements       1x                0%   ($0)           10%  ($1,000)              10%  ($1,000)
Design             2x                0%   ($0)           0%   ($0)                  40%  ($8,000)
Development        10x               0%   ($0)           0%   ($0)                  25%  ($25,000)
Test               50x               0%   ($0)           60%  ($300,000)            20%  ($100,000)
Production         100x              100% ($1,000,000)   30%  ($300,000)            5%   ($50,000)
Total                                $1,000,000          $601,000                   $184,000
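The arithmetic behind the table can be checked directly: each cell is 100 defects × the share resolved at that stage × the stage's cost multiplier × $100.

```python
# Reproduces the arithmetic of the sample economics above: 100 defects,
# base resolution cost x = $100, per-stage cost multipliers from the slide.
DEFECTS, X = 100, 100
MULTIPLIER = {"requirements": 1, "design": 2, "development": 10,
              "test": 50, "production": 100}

def total_cost(pct_resolved_by_stage):
    # pct values are whole-number percentages of the 100 defects.
    return sum(DEFECTS * pct // 100 * MULTIPLIER[stage] * X
               for stage, pct in pct_resolved_by_stage.items())

firefighting = {"production": 100}
verification = {"requirements": 10, "test": 60, "production": 30}
performance_driven = {"requirements": 10, "design": 40,
                      "development": 25, "test": 20, "production": 5}

print(total_cost(firefighting))         # 1000000
print(total_cost(verification))         # 601000
print(total_cost(performance_driven))   # 184000
```

Shifting resolution earlier in the life cycle, not reducing the defect count, is what drives the total from $1,000,000 down to $184,000.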
Three ways to get to performance-driven development
1. Management decree
• Motivated by pressure from the business or pressure on the bottom line
2. IT operations imposition of release criteria
• Production refuses to accept any more apps that haven’t been subjected to rigorous performance testing
• The development organization suffers from cost overruns and missed release dates until it begins to adopt performance practices earlier in the life cycle
3. Shared service model
• A team with expertise in some or all of the activities in performance-driven development grows in proportion to its ability to help internal customers cut costs, cut time-to-market, and improve their own customer satisfaction levels
These paths are not mutually exclusive
The move to performance-driven development is incremental
Where should performance live on the org chart?
Pros and cons to every arrangement
• For performance verification:
» Quality assurance
» IT operations
• For performance-driven development:
» Architecture
» Application-specific, cross-functional team
Vendor offerings
Tool integration that enables data or asset sharing across the life cycle (requirements, design*, development*, testing, monitoring):

Avicode, Borland†, Compuware, Empirix, HP, HyPerformix, IBM, Mercury‡, Microsoft, OpNet

*Vendors with support for performance-driven development activities within their design and development products.
†Through its pending purchase of Segue Software, Borland gains performance testing and monitoring tools.
‡Mercury OEMs HyPerformix technology for sale as part of Mercury Performance Center.
Carey Schwaber
+1 617/613-6260
cschwaber@forrester.com
www.forrester.com
Thank you
Selected bibliography
• In production: March 2006, Best Practices, “Performance-Driven Development”
• July 12, 2005, Trends, “Applying Process Control Principles To Application Performance Management”
• May 16, 2005, Trends, “Software Quality Is Everybody’s Business”
• February 11, 2005, Best Practices, “Performance Management And The Application Life Cycle”