Post on 05-Mar-2015
SOFTWARE TESTING
Agenda
1. Introduction to Testing
2. SDLC Models
3. Types of Testing
4. Testing Flow
5. Test Design Techniques
6. STLC
7. Defect Tracking
8. Automation Testing
9. Configuration Management
What is Software Testing?
Software testing is the process of testing the software/application with the intent of checking whether it works as per the requirements.

Why is software testing required?
To achieve the desired quality of the application.

When does the testing phase start?
Testing starts in the early phases itself, from the requirement collection stage.
Software Development Life Cycle (SDLC)
It is the step-by-step process which explains how to develop a software/application.
SDLC Models
• Waterfall Model
• "V" Model
• etc.
Waterfall Model

It is the basic sequential model, with the phases running in order:
Requirement Collection → Design → Development → Testing → Maintenance
Applications:
• Used in small applications
• Used in low-complexity projects (applications which do not often tend to change)
Advantages:
• Low cost (since developers are involved in testing most of the time)
• Suitable for small applications with low complexity
Disadvantages:
• Time consuming
• Fixing bugs costs comparatively more
“ V ” Model (Verification & Validation)
[V-model diagram: the Verification arm (Requirement Collection → Design → Coding, with Unit Testing) is carried out by the developers; the Validation arm (Functional Testing → Integration Testing → System Testing → UAT) is carried out by the testers.]
Applications:
• Used in large applications
• Used in high-complexity projects (applications which often tend to change)
Advantages:
• Cost of fixing bugs is less
Disadvantages:
• Total investment is more
Verification & Validation
Verification: Are we building the product right?
Validation: Are we building the right product?
TYPES OF TESTING
White Box (Unit) Testing
• Knowledge of internal program design and code required.
• Tests are based on coverage of code statements, branches, and paths.
Black Box Testing
• Knowledge of internal program design and code not required.
• Tests are based on requirements and functionality.
TYPES OF BLACK BOX TESTING
• Smoke Testing
• Functional Testing
• Integration Testing
• Regression Testing
• System Testing
• Acceptance Testing
• Exploratory Testing
Smoke Testing
Testing the basic or critical features of an application before doing rigorous testing.
• To ensure that product is stable
Functional Testing

Testing each and every component rigorously according to the requirement specification.
Integration Testing
Testing the data flow or interface between modules.
• Should know the product very well
• Should identify all the possible scenarios
• Prioritize which scenarios should be tested first
Regression Testing
Re-execution of the same test cases in different builds or releases to ensure that changes (i.e. adding/modifying/deleting modules, or fixing defects) do not introduce defects in unchanged features.
System Testing
It is end-to-end testing, where the testing environment is just like the customer environment (where the real business runs in the live environment).
Acceptance Testing
Acceptance testing is designed to determine whether the software is fit for use.
• UAT helps determine whether a software system satisfies its acceptance criteria, and enables the buyer to decide whether or not to accept the system.
Alpha Testing
Alpha testing is done before the release of a product, to check whether it functions properly.
Beta Testing
Beta testing is done when the product is given to end users. They use it, and if they find any defects they report them back to the developers. This is done before the final release of the product.
Compatibility Testing
Testing the application in different software and hardware environments.
Hardware Compatibility
• Processor: make (Intel, AMD), speed (32-bit, 64-bit)
• RAM: make (Samsung, Transcend), size (1 GB, 2 GB, 4 GB)
• Motherboard (Mercury)
• Visual/graphics cards
Software Compatibility
• Different OS (Windows XP, Windows 7, Vista, Linux, Mac OS)
• Different browsers (IE, Firefox, Safari, with different versions)
Performance Testing
Testing the stability and response time of an application by applying load.
• Stability: the ability to withstand the desired number of users
• Response time: the time taken to receive the response
• Load: the number of users
Types of Performance Testing
• Load testing
• Stress testing
• Volume testing
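As an illustration only, the load/response-time idea above can be sketched in Python; `request()` here is a stand-in for a real HTTP call to the application under test, and all names are assumptions:

```python
# A minimal load-testing sketch: fire a number of concurrent "users" at a
# request function and measure each response time. request() is a placeholder
# for a real call to the application under test.
import time
from concurrent.futures import ThreadPoolExecutor

def request():
    """Simulate one user request and return its response time in seconds."""
    start = time.perf_counter()
    time.sleep(0.01)                      # simulated server processing time
    return time.perf_counter() - start

users = 20                                # load = number of concurrent users
with ThreadPoolExecutor(max_workers=users) as pool:
    times = list(pool.map(lambda _: request(), range(users)))

print(f"max response time: {max(times):.3f}s")
print(f"avg response time: {sum(times)/len(times):.3f}s")
```

In the same sketch, stress testing would raise `users` beyond the desired capacity, and volume testing would grow the amount of data each request carries.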
Exploratory Testing
Testing the application without requirements/test cases
Flow of Testing

[Flow diagram: once the code has been developed, the build is deployed to the testing environment and smoke tested. If any issues are found, the build is sent back to the developers to fix (one cycle); when there are no issues in smoke testing, black box testing is carried out. The product is then released, undergoing alpha testing before release and beta testing at the customer site after release. Any later modifications or extra features become a new version; small changes go out as a patch.]
Build: An executable code given by the developers, to test the stability of an application in the testing environment.
• Once a new build comes, a smoke test is done in which we check the major functionalities; if they work fine, black box testing is carried out, or else a new build is given by the developers after fixing the issues found in the old build.
Cycle: The developers deploy the build to the testing environment; if issues exist, the test engineers escalate them to the developers, who fix the issues and deploy a new build. That complete rotation is known as one cycle.
Release: From the RC stage to the product-ready stage, the complete cycle is known as a release.
• Before release, the application/product undergoes alpha testing
• During release, the product is delivered to the customer/client
• After release, the application/product undergoes beta testing at the customer/client site, and then it goes live
• One release has a number of builds/cycles
Version: After release, if any modifications are needed or extra features must be added, those changes are made in the next release, which is called a version.
• One version has a number of releases
• Completion of one project consists of a number of versions
TEST DESIGN TECHNIQUES
• Error Guessing
• Equivalence Partitioning
• Boundary Value Analysis
Equivalence Partitioning

Input data is divided into different equivalence data classes.
• A range can typically be classified into one valid and two invalid classes, giving one valid and two invalid test cases.
• This method is typically used to reduce the total number of test cases to a finite set of testable test cases, while still covering maximum requirements.
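The one-valid/two-invalid classification can be sketched in Python for a hypothetical input field that accepts ages 18 to 60 (the rule and all names are illustrative assumptions, not from the slides):

```python
def is_valid_age(age):
    """Hypothetical validation rule: accept ages 18 through 60 inclusive."""
    return 18 <= age <= 60

# One representative value per equivalence class: every other value in the
# same class is assumed to behave identically, so one test covers the class.
partitions = {
    "invalid_below": 10,   # class: age < 18
    "valid":         35,   # class: 18 <= age <= 60
    "invalid_above": 75,   # class: age > 60
}
expected = {"invalid_below": False, "valid": True, "invalid_above": False}

for name, value in partitions.items():
    assert is_valid_age(value) == expected[name], name
print("one test case per class covers the whole input domain")
```

Three test cases here stand in for the entire input domain, which is exactly how the technique reduces the test-case count.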
BOUNDARY VALUE ANALYSIS

Used to identify errors at the boundaries, rather than those existing in the center of the input domain.
• It is widely recognized that input values at the extreme ends of the input domain cause more errors in a system; more application errors occur at the boundaries of the input domain.
• Boundary value analysis is often done as part of stress and negative testing.
• For a boundary value n, one can use the values n-1, n, and n+1 to derive the boundary test cases.
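A sketch of the n-1, n, n+1 rule in Python, again using a hypothetical field that accepts ages 18 to 60 (names and the rule are assumptions for illustration):

```python
def boundary_values(lower, upper):
    """Apply the n-1, n, n+1 rule to both boundaries of a range."""
    return sorted({lower - 1, lower, lower + 1, upper - 1, upper, upper + 1})

def is_valid_age(age):
    """Hypothetical rule under test: accept ages 18 through 60 inclusive."""
    return 18 <= age <= 60

# Boundary test values for the 18-60 range, where most defects cluster.
for v in boundary_values(18, 60):
    print(v, "accepted" if is_valid_age(v) else "rejected")
```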
Software Testing Life Cycle (STLC)
It is the step-by-step process which explains how the testing process goes on while developing the application.

Test Strategy
Test Planning
Test Case Development
Test Execution
Test Summary
Defect Tracking
Test Plan Format
• Test Plan ID
• Reference
• Introduction
• Resource Requirements
• Scope
• Approach
• Test Deliverables
• Entry & Exit Criteria
• Dependencies/Risks
• Responsibilities
Test Case Template
• Test Case ID
• Test Case Title
• Test Case Description
• Preconditions
• Expected Result
• Actual Result
• Status (Pass/Fail)
Smoke Test Template
No | Req ID | Project ID | URL | Login Details | Description | Environment | Reproducible | Detected By
Test Case Template
Title:
Test Case Id/Requirement No:
Details: Author, Reviewed By, Approved By (each with name and last-modified date)

Test case ID | Test case name | Test case description | Test steps (step / expected / actual) | Test status (P/F)
Sample Test Case: HOME PAGE
Test URL: www.qatest.co.in/rail
Preconditions: Open a web browser and enter the given URL in the address bar. The home page must be displayed. All test cases must be executed from this page.
Test case ID | Test case name | Test case description | Test steps (step / expected / actual) | Test status (P/F)

Login01 | Validate Login | To verify that the login name on the login page must be at least 3 characters
  Step 1: Enter a login name of fewer than 3 chars (say "a") and a password, and click the Submit button
    Expected: An error message "Login not less than 3 characters" must be displayed
  Step 2: Enter a login name of fewer than 3 chars (say "ab") and a password, and click the Submit button
    Expected: An error message "Login not less than 3 characters" must be displayed
  Step 3: Enter a login name of 3 chars (say "abc") and a password, and click the Submit button
    Expected: Login successful, or an error message "Invalid Login or Password" must be displayed

Login02 | Validate Login | To verify that the login name on the login page must not be greater than 10 characters
  Step 1: Enter a login name of more than 10 chars (say "abcdefghijk") and a password, and click the Submit button
    Expected: An error message "Login not greater than 10 characters" must be displayed
  Step 2: Enter a login name of fewer than 10 chars (say "abcdef") and a password, and click the Submit button
    Expected: Login successful, or an error message "Invalid Login or Password" must be displayed

Login03 | Validate Login | To verify that the login name on the login page does not take special characters
  Step 1: Enter a login name starting with a special char ("!hello") and a password, and click the Submit button
    Expected: An error message "Special chars not allowed in login" must be displayed
  Step 2: Enter a login name ending with a special char ("hello$") and a password, and click the Submit button
    Expected: An error message "Special chars not allowed in login" must be displayed
  Step 3: Enter a login name with special chars in the middle ("he&^llo") and a password, and click the Submit button
    Expected: An error message "Special chars not allowed in login" must be displayed

Pwd01 | Validate Password | To verify that the password on the login page must be at least 6 characters
  Step 1: Enter a password of fewer than 6 chars (say "a") and a login name, and click the Submit button
    Expected: An error message "Password not less than 6 characters" must be displayed
  Step 2: Enter a password of 6 chars (say "abcdef") and a login name, and click the Submit button
    Expected: Login successful, or an error message "Invalid Login or Password" must be displayed

Pwd02 | Validate Password | To verify that the password on the login page must not be greater than 10 characters
  Step 1: Enter a password of more than 10 chars and a login name, and click the Submit button
    Expected: An error message "Password not greater than 10 characters" must be displayed
  Step 2: Enter a password of fewer than 10 chars (say "abcdefghi") and a login name, and click the Submit button
    Expected: Login successful, or an error message "Invalid Login or Password" must be displayed
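The sample test cases above can be sketched as automated checks in Python against a hypothetical `validate_login()` that implements the stated rules (login name 3-10 alphanumeric characters, password 6-10 characters); the function and message strings are assumptions modelled on the expected results above:

```python
def validate_login(login, password):
    """Hypothetical implementation of the login-page rules under test."""
    if len(login) < 3:
        return "Login not less than 3 characters"
    if len(login) > 10:
        return "Login not greater than 10 characters"
    if not login.isalnum():
        return "Special chars not allowed in login"
    if len(password) < 6:
        return "Password not less than 6 characters"
    if len(password) > 10:
        return "Password not greater than 10 characters"
    return "OK"

# Login01: login name must be at least 3 characters
assert validate_login("a", "secret99") == "Login not less than 3 characters"
assert validate_login("ab", "secret99") == "Login not less than 3 characters"
assert validate_login("abc", "secret99") == "OK"
# Login02: login name must not be greater than 10 characters
assert validate_login("abcdefghijk", "secret99") == "Login not greater than 10 characters"
# Login03: no special characters in the login name
assert validate_login("!hello", "secret99") == "Special chars not allowed in login"
# Pwd01 / Pwd02: password must be 6-10 characters
assert validate_login("abcdef", "a") == "Password not less than 6 characters"
assert validate_login("abcdef", "abcdefghijkl") == "Password not greater than 10 characters"
print("all sample test cases pass")
```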
Traceability Matrix
Traceability Matrix: It is used to map between the customer requirements and the prepared test cases.
• It ensures that there is at least one test case for each requirement
• It keeps track of passed/failed test cases
Traceability Matrix

Requirement IDs: RC1, RC2, RC3, … RC9 (number of requirements), each mapped to its number of test cases.

Req   | TC    | Status
RC1   | TC1   | Pass
RC1   | TC2   |
RC1   | TC(n) |
RC2   | TC1   |
RC2   | TC2   | Fail
RC2   | TC(n) |
RC(n) | TC1   |
RC(n) | TC2   | Not able to Test
RC(n) | TC(n) |
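The matrix above can be sketched as a small Python mapping (requirement and test-case IDs are illustrative) that checks both guarantees: every requirement has at least one test case, and pass/fail status is trackable per requirement:

```python
# A minimal traceability-matrix sketch: map each requirement ID to its test
# cases and their statuses (IDs and statuses are illustrative).
matrix = {
    "RC1": {"TC1": "Pass", "TC2": "Pass"},
    "RC2": {"TC1": "Pass", "TC2": "Fail"},
    "RC3": {"TC1": "Not able to Test"},
}

# Guarantee 1: every requirement has at least one test case.
uncovered = [req for req, tcs in matrix.items() if not tcs]
assert not uncovered, f"requirements without test cases: {uncovered}"

# Guarantee 2: passed/failed test cases can be tracked per requirement.
failed = {req: [tc for tc, status in tcs.items() if status == "Fail"]
          for req, tcs in matrix.items()}
print(failed)
```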
Challenges for Testers
• Domain knowledge
• Time
• Test data setup
• Test environment setup
• Unavailability of the right tools
• Team at multiple locations
• Developing a good relationship with developers
DEFECT TRACKING
Bug: "A bug is an error in the software program."
Different Stages of Defect Tracking
1. New
2. Open
3. Assign
4. Test
5. Verified
6. Deferred
7. Reopened
8. Duplicate
9. Rejected
10. Closed
Defect Life Cycle

[Defect life cycle diagram: New → Open → Assign → Fixed → Verified → Closed. From Open, a defect may instead be marked Rejected, Duplicate, or Deferred; a fix that fails verification is Reopened and goes through the cycle again.]
Severity & Priority
Severity: Defines how severely the bug impacts the functionality of the application/product.
• Based on the severity/priority, the developers fix the issue.
Severity Types
• Critical
• Major
• Minor
Priority: Defines how urgently the bug must be fixed, based on its impact on the business.
• Developers fix issues priority-wise.
Priority types
• High
• Medium
• Low
Showstopper/Blocker: Does not allow the user to carry out any further testing.
• Showstoppers are raised as high priority
• No workaround is available
Critical/High: Major functionality is missing.
• No workaround can be provided
Major/Medium: A particular feature does not function, or malfunctions.
Minor/Low: Cosmetic defects which do not affect the functionality of the system.
Defect Report Template

Title: Application crashes on clicking the SAVE button while creating a new user
Req ID: mention in which module/feature the issue exists
Version Number: 5.0.1
Build Number: 5.1.10
Severity: HIGH (High/Medium/Low) or 1
Priority: HIGH (High/Medium/Low) or 1
Assigned to: Developer-X
Reported By: Your Name
Reported On: Date
Status: New/Open
Environment: O.S / Browser / Language (version), e.g. Windows 2003 / SQL Server 2005 / Java 1.6
Login Details: Login/Password
Description: The application crashes on clicking the SAVE button while creating a new user; hence it is not possible to create a new user in the application.
Steps To Reproduce:
1) Log on to the application
2) Navigate to the Users menu > New User
3) Fill in all the user information fields
4) Click the 'Save' button
5) An error page "ORA1090 Exception: Insert values Error…" is displayed
Issue Exists in production: Yes/No
Expected Result: On clicking the SAVE button, a success message "New User has been created successfully" should be displayed.
Actual Result: On clicking the SAVE button, the application crashed.
Screenshot Attached: Yes (attach the screenshot)
AUTOMATION TESTING
Automation Testing
Test automation is the use of software to control the execution of tests, the comparison of actual outcomes to predicted outcomes, the setting up of test preconditions, and other test control and test reporting functions.
• Test automation tools can be expensive, so automation is usually employed in combination with manual testing. It can be cost-effective in the longer term, especially when used repeatedly in regression testing.
There are two general approaches to test automation:
• Code-driven testing (data-driven testing): The public (usually) interfaces to classes, modules, or libraries are tested with a variety of input arguments, to validate that the results returned are correct.
• Graphical user interface (GUI) testing: A testing framework generates user-interface events such as keystrokes and mouse clicks, and observes the resulting changes in the user interface, to validate that the observable behaviour of the program is correct.
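The code-driven approach can be sketched with Python's standard unittest framework: a public function (here a hypothetical `discount()`, chosen purely for illustration) is exercised with a table of input arguments, and the framework compares actual outcomes against predicted outcomes:

```python
import unittest

def discount(price, percent):
    """Hypothetical module under test: apply a percentage discount."""
    return round(price * (1 - percent / 100), 2)

class DiscountTest(unittest.TestCase):
    # Table of (price, percent, predicted outcome) rows.
    cases = [
        (100.0, 10, 90.0),
        (50.0, 0, 50.0),
        (80.0, 25, 60.0),
    ]

    def test_discount_table(self):
        for price, percent, expected in self.cases:
            with self.subTest(price=price, percent=percent):
                self.assertEqual(discount(price, percent), expected)

# Run the suite programmatically and report whether every actual outcome
# matched its predicted outcome.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all outcomes matched:", result.wasSuccessful())
```

Extending the `cases` table is how the same test is reused with new input arguments, which is the data-driven part of the approach.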
Automation Framework
An automation framework is not a tool to perform some specific task, but an infrastructure that provides a solution into which different tools can plug themselves and do their jobs in a unified manner.
• A framework is an integrated system that sets the rules of Automation of a specific product.
• The framework provides the basis of test automation and simplifies the automation effort.
• This system integrates the function libraries, test data sources, object details and various reusable modules.
There are various types of frameworks.
• Data driven Testing
• Modularity driven Testing
• Keyword driven Testing
• Hybrid Testing
• Model based Testing
Popular Test Automation Tools

Tool Name | Company Name | Latest Version
HP QuickTest Professional | HP | 11.0
IBM Rational Functional Tester | IBM Rational | 8.1.0.3
Parasoft SOAtest | Parasoft | 9.0
Rational Robot | IBM Rational | 2003
Selenium | Open-source tool | 1.0.6
SilkTest | Micro Focus | 2010
TestComplete | SmartBear Software | 8.0
TestPartner | Micro Focus | 6.3
Visual Studio Test Professional | Microsoft | 2010
Watir | Open-source tool | 1.6.5
Software configuration management (SCM)
It is the task of tracking and controlling changes in the software. Configuration management practices include revision control and the establishment of baselines.
• SCM concerns itself with answering the question "Somebody did something; how can one reproduce it?" Often the problem involves not reproducing "it" identically, but with controlled, incremental changes.
Source configuration management is a related practice, often used to indicate that a variety of artefacts may be managed and versioned, including software code, hardware, documents, design models, and even the directory structure itself.
The goals of SCM are generally:
• Configuration identification - Identifying configurations, configuration items and baselines.
• Configuration control - Implementing a controlled change process. This is usually achieved by setting up a change control board whose primary function is to approve or reject all change requests that are sent against any baseline.
• Configuration status accounting - Recording and reporting all the necessary information on the status of the development process.
• Teamwork - Facilitate team interactions related to the process.
• Configuration auditing - Ensuring that configurations contain all their intended parts and are sound with respect to their specifying documents, including requirements, architectural specifications and user manuals.
• Process management - Ensuring adherence to the organization's development process.
• Environment management - Managing the software and hardware that host the system.
• Build management - Managing the process and tools used for builds.
• Defect tracking - Making sure every defect has traceability back to the source.