T1
Test Management
5/8/2014 9:45:00 AM
A Funny Thing Happened on the
Way to User Acceptance Testing
Presented by:
Randy Rice
Rice Consulting Services, Inc.
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ [email protected] ∙ www.sqe.com
Randy Rice
Rice Consulting Services, Inc.
A leading author, speaker, and consultant with more than thirty years of experience in the field of software testing and software quality, Randy Rice has worked with organizations worldwide to improve the quality of their information systems and optimize their testing processes. He is coauthor (with William E. Perry) of Surviving the Top Ten Challenges of Software Testing and Testing Dirty Systems. Randy is an officer of the American Software Testing Qualifications Board (ASTQB). Founder, principal consultant, and trainer at Rice Consulting Services, Randy can be contacted at riceconsulting.com where he publishes articles, newsletters, and other content about software testing and software quality. Visit Randy’s blog.
4/26/2014
1
A FUNNY THING
HAPPENED ON THE
WAY TO THE
ACCEPTANCE TEST
RANDALL W. RICE, CTAL
RICE CONSULTING SERVICES, INC.
WWW.RICECONSULTING.COM
2
THIS PRESENTATION
• The account of four different acceptance tests in three organizations.
• The names have been withheld and the data generalized to protect privacy.
• One project was developed in-house; the other three were vendor-developed systems, so a more traditional UAT approach was taken.
3
A COMMON
PERCEPTION OF UAT
• UAT is often seen as that last golden moment or phase of
testing, where
• Users give feedback/acceptance
• Minor problems are identified and fixed
• The project is implemented on time
• High fives, all around
4
IN REALITY…
• UAT is one of the riskiest and most explosive levels of testing.
• UAT is greatly needed, but happens at the worst time to find major defects – at the end of the project.
• Users may be unfriendly to the new system
• They like the current one just fine, thank you.
• Much of your UAT planning may be ignored.
• People tend to underestimate how many cycles of regression testing are needed.
5
THERE ARE MANY
QUESTIONS ABOUT UAT
• Who plans it?
• Who performs it?
• Should it only be manual in nature?
• What is the basis for test design and
evaluation?
• When should it be performed?
• Where should it be performed?
• Who leads it?
• How much weight should be given to it?
6
PROJECT #1
• Medical laboratory testing business that closely resembles a manufacturing environment.
• New technology for the company and for the industry.
• The previous project had failed
• The company almost went out of business because of it!
• Very high growth in terms of both business and employees.
• Company at risk of failure.
• This project was truly mission-critical.
7
PROJECT #1 – CONT’D.
• Few functional requirements.
• 8 pages for over 400 modules
• Test team had little knowledge of:
• subject matter,
• test design,
• or testing at all.
• Very little unit or integration testing
being done by developers.
• Some system testing was done.
• UAT was the focus.
8
DEFECT DISCOVERY
AND BACKLOG
[Chart: defect discovery and backlog across System Test (4 weeks), UAT (3 weeks), and the 1st, 2nd, and 3rd deployments]
9
PROJECT #1 RESULTS
• Very high defect levels in testing.
• Many were resolved before implementation.
• Severe performance problems.
• Old system could process 8,000 units/day
• New system could process 400 units/day
• Many problems due to the new technology being used
• “bleeding edge” issues
• “Deadline or else” attitude
• The business was under extreme pressure to deploy due to increased processing volumes.
• System was de-installed/re-installed 3 times before performance was acceptable to deploy.
10
WHAT WE LEARNED
• Requirements are important.
• Even if you have to create some form of them after the software has been written.
• Early testing is important.
• That would have caught early performance bottlenecks.
• Teamwork is critical.
• Things got so bad we had to have a “do over.”
• The deadline is a bad criterion for implementation.
• Always have a “Plan B”.
11
UAT LESSONS
• Build good relationships with subject matter experts.
• They often determine acceptance
• Listen to the end-users.
• Understand what’s important
• Don’t rely on UAT for defect detection.
• Interesting factoid
• A similar project with the exact same technology failed two years later at a city water utility due to performance errors; the vendor lost a $1.8 million lawsuit.
12
PROJECT #2
• Same company as before, but two
years later
• Integration of a vendor-developed
and customized accounting
system
• Lots of defects in the vendor system
• Implemented two months late with
practically zero defects.
13
WHAT MADE THE
DIFFERENCE?
• Same people – testers, IT manager, developer
• Different project manager who was a big supporter of testing
• More experience with the technology
• Better understanding of testing and test design
• A repeatable process
• Less pressure to implement
• Having a contingency plan
• Having the courage to delay deployment in favor
of quality.
• The financials had to be right.
14
PROJECT #3
• New government-sponsored entity.
• Everything was new – building, people, systems
• System was a vendor-developed
workers compensation system.
• Some customization
• Little documentation
• Designed all tests based on business
scenarios.
• We had no idea of the UI design.
15
KEY FACTORS
• No end-users in place at first to help with
any UAT planning.
• In fact, we had to train the end-users in the system and the business.
• Lots of test planning was involved
• 50% or more effort in planning and optimizing tests.
• This paid off big in test execution and training
16
RESULTS
• Tested 700 modules with 250 business scenario tests.
• We had designed over 300 tests
• The management and test team felt confident after 250 tests we had covered enough of the system.
• Found many defects in a system that had been in use in other companies for years.
• Reused a lot of the testware as training aids.
• Successful launch of the organization and system.
17
HARD LESSONS
LEARNED
• “You don’t know what you don’t know”
AND “You sometimes don’t know what you
think you know.”
• A newly hired SME with over 30 years of workers comp experience provided information that differed from (and corrected) what we had been told during test design.
• We had to assign two people for two weeks to create new tests.
• These were complex financial functions – we couldn't make it up on the fly.
18
HARD LESSONS
LEARNED (2)
• Real users are needed for UAT.
• Sometimes the heavy lifting of test design may be done by other testers, but users need heavy involvement.
19
PROJECT #4
• State government, Legal application
• Vendor-developed and customized
• Highly complex system purchased to replace two co-existing systems.
• Half of the counties in the state used one system, the other half used another.
• Usability factors were low on the new system
• Data conversion correctness was critical
20
THE GOOD SIDE
• Well-defined project processes
• Highly engaged management and
stakeholders
• Good project planning and tracking
• Incremental implementation strategy
• The entire system was implemented, but only one county at a time.
• Heavy system testing
• Good team attitude
21
THE CHALLENGES
• The system’s learning curve was very high.
• The key stakeholders set a high bar for acceptance.
• The actual users were few in number and were only able to
perform a few of the planned tests.
• Very high defect levels.
22
LEADING UP TO
VENDOR SELECTION
• Over 2 years of meeting with users and
stakeholders to determine business
needs.
• Included:
• JAD sessions
• Creation of “as-is” and “to-be” use cases
• A set of approximately 350 acceptance criteria items
23
THE STRATEGY
• Create test scenarios that described
the trail to follow in testing a task,
but not to the level of keystrokes.
• Based on use cases.
• The problem turned out to be that even the BAs and trainers had difficulty in performing the scenarios.
• System complexity was high.
• Training had not been conducted.
• Usability was low
24
DEFECT DISCOVERY
AND BACKLOG
[Chart: defect discovery and backlog across System Test (10 weeks) and UAT/1st Deploy (4 weeks); defect counts shown: 750 and 250]
25
WHAT WAS
VALIDATED
• The precise “point and click” scripts provided
by the vendor were long and difficult to
perform.
• Each one took days.
• Plus, there were errors in the scripts and differences between what the script indicated and what the system did.
26
THE BIG SURPRISES
• We planned the system test to be a practice run for UAT.
• It turned out to be the most productive phase of testing in terms of finding defects.
• We planned for a 10-week UAT effort with 10 users.
• It turned out to be a 2-week effort with 4 users.
• First sense of trouble: initial users were exhausted after 3 days of a pre-test effort.
27
THE BIG SURPRISES (2)
• We used none of the planned tests (around 350 scenarios)
in UAT.
• Instead, it was a guided “happy path” walkthrough, noting problems along the way.
• Defects were found, but the majority of defects had been found in system test.
28
LESSONS LEARNED
• The early system test was invaluable in
finding defects.
• Users of a new system must learn it before they are able to test it.
• The test documentation is not enough to provide context of how the system works.
• It took a lot of flexibility on the part of
everyone (client, vendor, testers, users,
stakeholders) to make it to the first
implementation.
• Sometimes actual users just aren’t able to
perform a rigorous test.
29
WHAT CAN WE LEARN FROM
ALL THESE PROJECTS?
• UAT is a much-needed test, but happens at the worst
possible time – just before implementation.
• You can take some of the late defect impact away with system testing and reviews.
• You can lessen the risk of deployment by implementing to
a smaller and lower risk user base first.
• Actual end-users are good for performing UAT, but much
depends on what you are testing and the capabilities of
the users.
• The reality is the users are going to have to use the system in real-life anyway.
• However, not all users are good testers!
30
WHAT CAN WE
LEARN? (2)
• Be careful how much time and effort
you invest in planning for UAT before
the capabilities are truly known.
• That is, senior management may want actual users to test for 8 weeks, but if the people aren’t available or can’t handle the load, then it probably isn’t going to happen.
• Don’t place all the weight of testing on
UAT.
• In project #4 our system testing found a majority of the defects.
31
WHAT CAN WE
LEARN? (3)
• UAT test planning isn’t bad, just expect
changes.
• People, software, business, timelines – they all change.
• Try to optimize and prioritize.
• Example: If you have 500 points of acceptance criteria, can they be validated with 200 tests?
• Which of the acceptance criteria are critical, needed and “nice to have”?
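The optimization question above (can 500 acceptance criteria be validated with 200 tests?) is essentially a set-cover problem. A minimal sketch of a greedy approach, with made-up test and criteria IDs for illustration:

```python
# Hedged sketch: greedily pick tests so that every acceptance criterion is
# covered at least once. Test names and criteria IDs below are hypothetical.

def select_tests(coverage, criteria):
    """Pick tests until every criterion in `criteria` is covered."""
    remaining = set(criteria)
    selected = []
    while remaining:
        # Choose the test that covers the most still-uncovered criteria.
        best = max(coverage, key=lambda t: len(coverage[t] & remaining))
        gained = coverage[best] & remaining
        if not gained:
            break  # some criteria are not covered by any candidate test
        selected.append(best)
        remaining -= gained
    return selected, remaining

# Hypothetical mapping: each candidate test validates several criteria.
coverage = {
    "T1": {"AC1", "AC2", "AC3"},
    "T2": {"AC3", "AC4"},
    "T3": {"AC4", "AC5", "AC6"},
    "T4": {"AC6"},
}
tests, uncovered = select_tests(coverage, {f"AC{i}" for i in range(1, 7)})
print(tests, uncovered)  # two tests cover all six criteria here
```

The same idea scales to hundreds of criteria: map each candidate scenario to the criteria it exercises, then let the greedy pass show how few scenarios are truly needed and which criteria no scenario touches.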
33
BIO - RANDALL W. RICE
• Over 35 years of experience in building and testing information systems in a variety of industries and technical environments
• ASTQB Certified Tester – Foundation level, Advanced level (Full)
• Director, American Software Testing Qualification Board (ASTQB)
• Chairperson, 1995 - 2000, QAI's annual software testing conference
• Co-author with William E. Perry, Surviving the Top Ten Challenges of Software Testing and Testing Dirty Systems
• Principal Consultant and Trainer, Rice Consulting Services, Inc.
34
CONTACT INFORMATION
Randall W. Rice, CTAL
Rice Consulting Services, Inc.
P.O. Box 892003
Oklahoma City, OK 73170
Ph: 405-691-8075
Fax: 405-691-1441
Web site: www.riceconsulting.com
e-mail: [email protected]