Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software life cycle instructor: Razvan BOGDAN
Transcript
Page 1: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

Politehnica University of TimisoaraMobile Computing, Sensors Network and Embedded Systems

Laboratory

Embedded systems testing

Testing throughout the software life cycle

instructor: Razvan BOGDAN

Page 2: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

Outlines 1.Software Development Models

2.Test Levels

3.Testing Types

4.Maintenance Testing

Page 3: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • The development process adopted for a project will depend on the project aims and goals.

• Numerous development life cycles have been developed in order to achieve different objectives.

• These life cycles range from lightweight and fast methodologies, where time to market is of the essence, through to fully controlled and documented methodologies where quality and reliability are key drivers.

Page 4: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • The life cycle model that is adopted for a project will have a big impact on the testing that is carried out

• Test activities are highly related to software development activities

• The life cycle model will define the what, where, and when of our planned testing, influence regression testing, and largely determine which test techniques to use.

Page 5: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • In every development life cycle, a part of testing is focused on verification testing and a part is focused on validation testing.

• Verification focuses on the question 'Is the deliverable built according to the specification?'.

• Validation focuses on the question 'Is the deliverable fit for purpose, e.g. does it provide a solution to the problem?'.

Page 6: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Waterfall• It has a natural timeline where tasks are executed in a sequential fashion.

• We start at the top of the waterfall with a feasibility study and flow down through the various project tasks finishing with implementation into the live environment

• Testing tends to happen towards the end of the project life cycle so defects are detected close to the live implementation date.

Page 7: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS

Page 8: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Waterfall • With this model it is difficult to pass feedback back up the waterfall

• There are difficulties if we need to carry out numerous iterations for a particular phase

Page 9: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • V-model •The V-model was developed to address some of the problems experienced using the traditional waterfall approach

• The V-model provides guidance that testing needs to begin as early as possible in the life cycle

• There are a variety of activities that need to be performed before the end of the coding phase. These activities should be carried out in parallel with development activities

Page 10: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

V-model

Page 11: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Testing along the V-model• Development and test are two equal branches

• Each development level has a corresponding test level• Tests (right hand side) are designed in parallel with software development (left hand side)• Testing activities take place throughout the complete software life cycle

Page 12: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Testing along the V-model• Software development branch

• Requirements Definition• specification documents

• Functional System Design• design functional program flow

• Technical System Design• define architecture/interfaces, their interactions

• Component specification• structure of component

• Programming• create executable code

Page 13: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Testing along the V-model• Software test branch

• 4. Acceptance Testing• formal test of customer requirements

• 3. System Testing• integrated system, specifications

• 2. Integration Testing• component interfaces• software does what it should do from a functional point of view

• 1. Component Testing• component’s functionality

Page 14: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Verification vs. Validation• Verification

• Confirmation by examination and through provision of objective evidence that specified requirements have been fulfilled. [ISO 9000]• Main issue: Did we proceed correctly when building the system? Did we use this component correctly?

• Validation• Confirmation by examination and through provision of objective evidence that the requirements for a specific intended use or application have been fulfilled. [ISO 9000]• Main issue: Did we build the right software system? Was it right to use that particular component, or should we have used another one?

Page 15: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Verification within the general V-Model• Each development level is verified against the contents of the level above it

• to verify: to give proof of evidence, to substantiate

• to verify means to check whether the requirements and definitions of the previous level were implemented correctly


Page 16: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Validation within the general V-Model• Validation refers to the correctness of each development level

• to validate: to give proof of having value

• to validate means to check the appropriateness of the results of one development level

• e.g., is the component offering the expected behavior? Is the system offering that particular behavior which is mentioned/agreed in the System Design Document?


Page 17: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

V-model • Although variants of the V-model exist, a common

type of V-model uses four test levels.

1. component testing: searches for defects in and verifies the functioning of software components (e.g. modules, programs, objects, classes etc.) that are separately testable;

2. integration testing: tests interfaces between components, interactions with different parts of a system such as the operating system, file system and hardware, or interfaces between systems;

Page 18: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

V-model • Although variants of the V-model exist, a common

type of V-model uses four test levels.

3. system testing: concerned with the behavior of the whole system/product as defined by the scope of a development project or product. The main focus of system testing is verification against specified requirements;

4. acceptance testing: validation testing with respect to user needs, requirements, and business processes conducted to determine whether or not to accept the system.

Page 19: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Iterative life cycles• Not all life cycles are sequential. The V-model takes a large amount of time and is applied in those cases where the requirements are not changing (which, in practice, is not that often)

• There are also iterative or incremental life cycles where, instead of one large development time line from beginning to end, we cycle through a number of smaller self-contained life cycle phases for the same project.

Page 20: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Iterative life cycles• A common feature of iterative approaches is that the delivery is divided

into increments or builds with each increment adding new functionality

• The increment produced by an iteration may be tested at several levels as part of its development.

• Subsequent increments will need testing for the new functionality, regression testing of the existing functionality, and integration testing of both new and existing parts

Page 21: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Iterative life cycles

• Regression testing is increasingly important on all iterations after the first one. This means that more testing will be required at each subsequent delivery phase which must be allowed for in the project plans

• This life cycle can give early market presence with critical functionality, can be simpler to manage because the workload is divided into smaller pieces, and can reduce initial investment although it may cost more in the long run.

Page 22: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Examples of iterative or incremental development models are:

• Prototyping• Building quickly a usable representation of the system, followed by successive

modification until the system is ready

• Rapid Application Development (RAD)• The user interface is implemented using out-of-the-box functionality that simulates the functionality which will be developed later

• Rational Unified Process (RUP) • Object oriented model and a product of the company Rational/IBM. It mainly

provides the modelling language UML and support for the Unified Process

• Agile development • Development and testing take place without formalized requirements

specification

Page 23: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Rapid Application Development (RAD) is formally a parallel development of functions and subsequent integration.

• Components/functions are developed in parallel as if they were mini projects, the developments are time-boxed, delivered, and then assembled into a working prototype

• This can very quickly give the customer something to see and use and to provide feedback regarding the delivery and their requirements

• This methodology allows early validation of technology risks and a rapid response to changing customer requirements.

• Dynamic Systems Development Method (DSDM) is a refined RAD process that allows controls to be put in place in order to stop the process from getting out of control

Page 24: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

Rapid Application Development

Page 25: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • The RAD development process encourages active customer feedback.

• An early business-focused solution in the market place gives an early return on investment (ROI) and can provide valuable marketing information for the business

Page 26: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Extreme Programming (XP) is currently one of the most well-known Agile development life cycle models.

• Some characteristics of XP are: • It promotes the generation of business stories to define the functionality.

• It demands an on-site customer for continual feedback and to define and carry out functional acceptance testing.

• It promotes pair programming and shared code ownership amongst the developers.

Page 27: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Agile development • Some characteristics of XP are:

• It states that component test scripts shall be written before the code is written and that those tests should be automated.

• It states that integration and testing of the code shall happen several times a day.

• It states that we always implement the simplest solution to meet today's problems.

Page 28: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • XP developers write every test case they can think of and automate them.

• Every time a change is made in the code it is component tested and then integrated with the existing code, which is then fully integration-tested using the full set of test cases.

• It gives continuous integration, which means that changes are incorporated continuously into the software build.

• At the same time, all test cases must be running at 100% meaning that all the test cases that have been identified and automated are executed and pass.

Page 29: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Iteration models: Test Driven Development (TDD)• Based on: test case suites

• Prepare test cycles• Automated testing using test tools

• Development according to test cases• Prepare early versions of the component for testing• Automatic execution of tests• Correct defects on further versions• Repeat test suites until no errors are found

• First the tests are designed, then the software is programmed; • test first, code after
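
To make the "test first, code after" idea concrete, here is a minimal sketch in C using assert; the add() function and its tests are hypothetical and are not taken from the slides:

#include <assert.h>
#include <stdio.h>

int add(int a, int b);              /* component under test; implemented after the test */

/* The test is written (and automated) first */
static void test_add(void) {
    assert(add(2, 3) == 5);
    assert(add(-1, 1) == 0);
}

/* The simplest implementation that makes the test pass is written afterwards */
int add(int a, int b) {
    return a + b;
}

int main(void) {
    test_add();
    printf("all component tests passed\n");
    return 0;
}

Because the tests are automated, the whole suite can be repeated after every change until no errors are found, as the slide describes.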


Page 31: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS • Principles of all models• each development activity must be tested

• no piece of software may be left untested, whether it was developed “in one procedure” or iteratively

• each test level should be tested specifically• each test level has its own test objectives• the test performed at each level must reflect these objectives

• testing begins long before test execution• as soon as development begins, the preparation of the corresponding tests can start• this is also the case for document reviews starting with concepts specification and overall design

Page 32: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

SOFTWARE DEVELOPMENT MODELS - Summary

• whichever life cycle model is being used, there are several characteristics of good testing:

• for every development activity there is a corresponding testing activity;

• each test level has test objectives specific to that level;

• the analysis and design of tests for a given test level should begin during the corresponding development activity;

• testers should be involved in reviewing documents as soon as drafts are available in the development cycle.

Page 33: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

Outlines 1.Software Development Models

2.Test Levels

3.Testing Types

4.Maintenance Testing

Page 34: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• Testing levels:

• Component testing • Integration testing • System testing • Acceptance testing

Page 35: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component (unit) testing

• also known as unit, module or program testing• test of each software component after its realization• searches for defects in, and verifies the functioning of, software components (e.g. modules, programs, objects, classes, etc.) that are separately testable.

• Due to the naming of components in different programming languages, the component test may be referred to as:

• module test (e.g. in C) • class test (e.g. in Java or C++)• unit test (e.g. in Pascal)

• The components are referred to as modules, classes or units. Because of the possible involvement of developers in the test execution, they are also called developer’s test

Page 36: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing

• Test cases may be derived from (= Test basis ):• Component requirements• (Detailed) software design• Code• Data models

• Typical test objects:• Components/classes/units/modules• Programs• Data conversion/migration programs• Database modules

Page 37: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Scope

• Only single components are tested• components may consist of several smaller units• test object often cannot be tested stand alone

• Every component is tested on its own• finding failures caused by internal defects• cross effects between components are not within the scope of this test

Page 38: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Functional/non functional testing• Testing functionality

• Every function must be tested with at least one test case• are the functions working correctly, are all specifications met?

• Defects found commonly are:• defects in processing data, often near boundary values• missing functions

• Testing robustness (resistance to invalid input data)• How reliable is the software?

• Test cases representing invalid inputs are called negative tests• A robust system provides an appropriate handling of wrong inputs

• Wrong inputs accepted in the system may produce failure in further processing (wrong output, system crash)

• Other non-functional attributes may be tested• e.g., performance and stress testing, reliability
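
As an illustration of negative testing, the following is a minimal sketch in C; parse_age() and its convention of returning -1 for invalid input are hypothetical and only serve the example:

#include <assert.h>
#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical component: returns the parsed age, or -1 for invalid input */
int parse_age(const char *text) {
    if (*text == '\0')
        return -1;                              /* reject empty input */
    for (const char *p = text; *p; ++p)
        if (!isdigit((unsigned char)*p))
            return -1;                          /* reject non-numeric input */
    long value = strtol(text, NULL, 10);
    return (value <= 130) ? (int)value : -1;    /* reject out-of-range values */
}

/* Negative tests: invalid inputs must be handled, not passed on to further processing */
int main(void) {
    assert(parse_age("abc") == -1);             /* invalid characters */
    assert(parse_age("-5")  == -1);             /* sign rejected as non-digit */
    assert(parse_age("200") == -1);             /* boundary: unrealistically large */
    assert(parse_age("42")  == 42);             /* valid input still accepted */
    printf("robustness tests passed\n");
    return 0;
}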

Page 39: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Test harness /1•Test execution of components often requires drivers and stubs

• Drivers handle the interface to the component

• call the component to be tested • simulate inputs, record outputs and provide a test harness• they are created using programming tools

• Stubs replace or simulate components not yet available or not part of the test object

• called from the software component to be tested

• To program drivers and/or stubs one• must have programming skills• need to have the source code available• may need some special tools

Page 40: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Test harness /2• Example 1: Stubs..• Q: How to test module L?

Module L calls Module K (not developed) and Module F (not developed)

Page 41: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Test harness /3• Example 2: Stubs..• A: Stubs: called from the software component to be tested

Module L calls a Module K stub and a Module F stub: dummy code that SIMULATES the functionality of the undeveloped modules

Page 42: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Test harness /4• Example 1: •A: Stubs: called from the software component to be tested

/* Stub: replaces the real price() so functionToBeTested() can be tested */
int price(int param) {
    return 10;   /* we don't care what the price is; we just need a value so we can test the other function */
}

void functionToBeTested(int param1) {
    /* ... */
    int p = price(param1);   /* calls the stub instead of the real component */
    /* ... */
}

Page 43: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Test harness /5• Example 2: What about drivers?..• Q: How to test module L?

Module L, the component to be tested, is called by Module K and Module F (not yet developed)

Page 44: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Test harness /6• Example 2: What about drivers?..• Q: How to test module L?• A: A driver calls a component to be tested

The Driver is dummy code that takes the place of the calling module (e.g. Module K): it calls Module L and receives the values Module L returns

Page 45: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Test harness /7• Example 2: What about drivers?..• Q: How to test module L?• A: A driver calls a component to be tested

#include <stdio.h>

/* The real component under test (implementation elided on the slide) */
int price(int param) {
    /* complex equations and DB interrogations that determine the real price */
    return 0;   /* placeholder; the real computation is elided */
}

/* This is the driver: it calls price() and shows its result */
void functionThatCallsPrice(int param1) {
    int p = price(param1);
    printf("Price is: %d\n", p);
}

Page 46: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Test harness /8• Example 3: A driver can also be a UI in which the tester enters data to simulate input for the component; this application is connected to the component under test

(Diagram: a Driver application connected to the Unit Under Test (F))
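
A minimal sketch of such an interactive driver in C is shown below; unit_under_test() is a hypothetical stand-in for component F, and the console stands in for the tester's UI:

#include <stdio.h>

/* Hypothetical stand-in for the unit under test (component F) */
static int unit_under_test(int x) {
    return x * x;
}

int main(void) {
    int x;
    printf("Enter input values for the unit under test (Ctrl-D to stop):\n");
    while (scanf("%d", &x) == 1)                       /* the tester types the input data */
        printf("unit_under_test(%d) = %d\n", x, unit_under_test(x));
    return 0;
}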

Page 47: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing: Methods• The program code is available to the tester

• In case “tester = developer”: testing takes place with a strong development focus• Knowledge about functionality, component structure and variables may be applied to design test cases• Often functional testing will apply• Additionally, the use of debuggers and other development tools (e.g. unit test frameworks) will allow direct access to program variables

• Source code knowledge allows the use of white-box methods for component testing

Page 48: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 1. Component testing - Summary • A component is the smallest system unit specified

• Module, unit, class and developer’s test are used as synonyms • Drivers execute the component’s functions; adjacent functions are replaced by stubs

• Component tests may check functional and non-functional system properties

Page 49: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration testing • Test basis

• Software and system design• Architectural design• Workflows• Use cases• Interface specifications• Data models

• Typical test objects• Subsystems• Database implementation• Infrastructure• Interfaces• System configuration

• Configuration data

Page 50: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration (interface) testing• tests interfaces between components, interactions to different parts of a system such as an operating system, file system and hardware or interfaces between systems.

• each component has already been tested for its internal functionality (component test) => integration tests examine the external functions after component testing• examines the interaction of software elements (yes, components!) between different systems or hardware and software • Integration is the activity of combining individual software components into a larger subsystem or into a series of systems• Further integration of subsystems is also part of the system integration process

• Integration testing is often carried out by the integrator, but preferably by a specific integration tester or test team

Page 51: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration testing: Scope /1• Integration tests assume that the components have already been tested • Integration tests examine the interaction of software components (subsystems) with each other:

• interfaces with other components• interfaces among GUIs/MMIs

• Integration tests examine the interfaces with the system environment• In most cases, the interaction tested is that of the component and simulated environment behavior• Under real conditions, additional environmental factors may influence the component’s behavior

Page 52: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration testing: Scope /2• A subsystem, composed of individual components, will be tested

• each component has an interface that is either external and/or interacts with another component within the subsystem

• Test drivers (which provide the process environment of the system or subsystem) are required

• to allow for or to produce input and output of the subsystem• to log data

• Test drivers of the component tests may be reused here

Page 53: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration testing: Scope /3• Monitoring tools logging data and controlling tests can support testing activities• Stubs replace missing components

• data or functionality of a component that have not yet been integrated will be replaced by programming stubs • stubs take over the elementary tasks of the missing components

Page 54: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration testing: Approach• Integration tests aim at finding defects in the interfaces. They check the correct interaction of components

• among other reasons, in order to check performance and security aspects, requiring additional non-functional tests

• Replacing test drivers and stubs with real components may produce new defects, such as

• losing data, wrong handling of data or wrong inputs• the components involved interpret the input data in a different manner• the point in time where data is handed over is not correct: too early, too late, at a wrong frequency
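
To illustrate what an interface-focused integration test checks, here is a minimal sketch in C; the encode_reading()/decode_reading() components and their "T:&lt;value&gt;" message format are hypothetical examples, not part of the slides:

#include <assert.h>
#include <stdio.h>

/* Component A (producer): encodes a temperature reading as "T:<value>" */
static void encode_reading(int celsius, char *buf, size_t size) {
    snprintf(buf, size, "T:%d", celsius);
}

/* Component B (consumer): decodes the message; returns 0 on success */
static int decode_reading(const char *buf, int *celsius) {
    return (sscanf(buf, "T:%d", celsius) == 1) ? 0 : -1;
}

int main(void) {
    char buf[32];
    int value = 0;
    encode_reading(-7, buf, sizeof buf);               /* producer side */
    assert(decode_reading(buf, &value) == 0);          /* consumer accepts the format */
    assert(value == -7);                               /* both sides interpret the data the same way */
    printf("interface test passed\n");
    return 0;
}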

Page 55: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration testing: Strategies /1• There are different strategies for integration testing

• ‘Big-bang' integration testing - one extreme - all components or systems are integrated simultaneously, after which everything is tested as a whole.

• Big-bang testing has the advantage that everything is finished before integration testing starts. There is no need to simulate (as yet unfinished) parts

• The major disadvantage of the big-bang is that in general it is time-consuming and difficult to trace the cause of failures with this late integration.

Page 56: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration testing: Strategies /1• There are different strategies for integration testing

• Incremental testing - another extreme - all programs are integrated one by one, and a test is carried out after each step.

• The incremental approach has the advantage that the defects are found early in a smaller assembly when it is relatively easy to detect the cause.

• A disadvantage is that it can be time-consuming since stubs and drivers have to be developed and used in the test

Page 57: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration testing: Strategies /2• There are different strategies for integration testing

• Within incremental integration testing a range of possibilities exist, partly depending on the system architecture:

• Top-down: testing takes place from top to bottom, following the control flow or architectural structure (e.g. starting from the GUI or main menu). Components or systems are substituted by stubs.

• Bottom-up: testing takes place from the bottom of the control flow upwards. Components or systems are substituted by drivers.

• Functional incremental: integration and testing takes place on the basis of the functions or functionality, as documented in the functional specification.

Page 58: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration testing: Strategies /3• There are different strategies for integration testing

• The preferred integration sequence and the number of integration steps required depend on the location in the architecture of the high-risk interfaces.

•The best choice is to start integration with those interfaces that are expected to cause most problems

• Ad-hoc integration• components will be tested, if possible, directly after programming and component tests have been completed• implies early start of testing activities, possibly allowing for a shorter software development process as a whole

Page 59: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 2. Integration testing: Summary• Integration means building up groups of components

• Integration tests examine component interactions against the specification of interfaces

• Integrating sub-systems (they consist of integrated components) is also a form of integration

Page 60: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 3. System testing• Test basis (may include tests based on)

• System and software requirement specification• Use cases, business process• Functional specification• Risk analysis reports

• Typical test objects• System, user and operation manuals• System configuration and configuration data

Page 61: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 3. System testing• The process of testing an integrated system to verify that it meets specified requirements. [Hetzel]

• System testing addresses the behavior of the whole system• The scope is defined in the Master Test Plan (or Level Test Plan)• Software quality is looked at from the user’s point of view

• we want to use a user’s point of view

• System tests refer to (as per ISO 9126):• functional and non-functional requirements (functionality, reliability, usability, efficiency, maintainability, portability)• The characteristics of the data quality are the base for testing

Page 62: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 3. System testing: Scope• Test of the integrated system from the user’s point of view

• Complete and correct implementation of requirements• Deployment in the real system environment with real life data

• The test environment should match the true environment• No test drivers or stubs are needed!• All external interfaces are tested under true conditions• close representation of the later true environment

• No tests in the real life environment!• Induced defects could damage the real life environment• Software under deployment is constantly changing. Most tests will not be reproducible

Page 63: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 3. System testing: functional requirements /1• Goal: to prove that the implemented functionality exposes the required characteristics

• According to ISO9126, characteristics to be tested include:• Suitability

• are the implemented functions suitable for their expected use• the software should not be more complex than necessary

• Accuracy• Do the functions produce correct (agreed upon) results?

• Interoperability• Does interaction with the system environment show any problems/

• Compliance• Does the system comply with applicable norms and regulations?

• Security• Are data/programs protected against unwanted access or loss?• e.g. unauthorized access to programs & data not available

Page 64: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 3. System testing: functional requirements /2• Three approaches for testing functional requirements:

• Business process based test• each business process serves as basis for deriving tests• the ranking order of the business process can be applied for prioritizing test cases

• Use case based test• test cases are derived from sequences of expected or reasonable use• sequences used more frequently receive a higher priority

• Requirements based test (building blocks)• test cases are derived from the requirement specification• the number of test cases will vary according to the type/depth of the specification

Page 65: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 3. System testing: Non-functional requirements /2• Compliance with non-functional requirements is difficult to achieve:

• Their definition is often very vague (e.g. easy to operate, well structured user interface, etc.)• They are not stated explicitly. They are an implicit part of the system description, yet they are expected to be fulfilled • Quantifying them is difficult, often non-objective metrics must be used, e.g. looks pretty, quite safe, easy to learn.

• Examples: Testing/inspecting documentation• Is documentation of programs in line with the actual system, is it concise, complete and easy to understand?

• Example: Testing maintainability• Do all programmers comply with the respective coding standards?• Is the system designed in a structured, modular fashion?


Page 67: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 3. System testing: Summary• System testing is performed using functional and non-functional test cases

• Functional system testing confirms that the requirements for a specific intended use have been fulfilled (validation)• Non-functional system testing verifies non-functional quality attributes

• e.g. usability, efficiency, portability etc.

• Non-functional quality attributes are often an implicit part of the requirements, which makes it difficult to validate them

• An independent test team often carries out system testing

Page 68: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS - Class work•Which of these is a functional test?

• a. Measuring response time on an on-line booking system. • b. Checking the effect of high volumes of traffic in a call-center system. • c. Checking the on-line bookings screen information and the database contents

against the information on the letter to the customers. • d. Checking how easy the system is to use.

Page 69: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS - Class work•Which of these is a functional test?

• a. Measuring response time on an on-line booking system. • b. Checking the effect of high volumes of traffic in a call-center system. • c. Checking the on-line bookings screen information and the database contents

against the information on the letter to the customers. • d. Checking how easy the system is to use.

Page 70: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 4. Acceptance testing

• Test basis (from where can the test cases be generated)• User requirements• System requirements • Use cases• Business processes • Risk analysis reports

• Typical test objects• Business processes on fully integrated system • Operational and maintenance processes• User procedures• Forms• Reports• Configuration data

Page 71: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 4. Acceptance testing: contract and regulation acceptance• Does the software fulfill all the contractual requirements?

• Within the acceptance test for a business-supporting system, two main test types can be distinguished; they are usually prepared and executed separately • A. Acceptance testing: user acceptance testing

• Often the customer selects test cases for acceptance testing• Focuses mainly on the functionality and thereby typically verifies the fitness for use of the system by business users

• “The customer knows best”

• Testing is done using the customer environment• Customer environment may cause new failures

Page 72: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 4. B. Acceptance testing: operational acceptance testing• Requires that the software is fit for use in a productive environment

• Integration of software into the customer IT-infrastructure (Backup/Restore-System, restart, install and de-install ability, disaster recovery etc.)• User management, interfacing to file and directory structures in use• Compatibility with other systems (other computers, database servers etc.)• Maintenance tasks• Data load and migration tasks• Periodic checks of security vulnerabilities

• Operational acceptance testing is often done by the customer’s system administrator

Page 73: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 4. Acceptance testing: alpha testing and beta testing• A stable preliminary version of the software is needed (preliminary acceptance testing)

• Mostly done for market software (also called Commercial off the shelf COTS software)

•Alpha testing: This test takes place at the developer's site. •A cross-section of potential users and members of the developer's organization are invited to use the system. •Developers observe the users and note problems. •Alpha testing may also be carried out by an independent test team.

• Beta testing, or field testing, sends the system to a cross-section of users who install it and use it under real-world working conditions.

•The users send records of incidents with the system to the development organization where the defects are repaired.

•Advantages of alpha and beta tests:•Reduce the cost of acceptance testing•Use different user environment•Involve a high number of users

Page 74: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TEST LEVELS• 4. Acceptance testing: Summary

• Acceptance testing is the customer’s systems test

• Acceptance testing is a contractual activity; the software is verified to comply with the customer’s requirements

• Alpha and beta tests are tests performed by potential or existing customers either at the developer’s site (alpha) or at the customer’s site (beta)

Page 75: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

Outlines 1.Software Development Models

2.Test Levels

3.Testing Types

4.Maintenance Testing

Page 76: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TESTING TYPES• Test types and test levels• Test levels

• The previous section explained the various testing levels, i.e. component test, integration test etc.• At every test level the test objectives have a different focus!• Therefore different test types are applied during different test levels

• Test types• Functional testing ( Goal: Testing of function )• Non-functional testing ( Goal: Testing product characteristics )• Structural Testing ( Goal: Testing of SW structure / architecture )• Change related testing ( Goal: Testing after changes )

Page 77: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

Test types: the targets of testing • A test type is focused on a particular test objective, which could be:

• the testing of a function to be performed by the component or system; • The testing of a non-functional quality characteristic, such as reliability or

usability; • The testing of the structure or architecture of the component or system; • Testing related to changes, i.e. confirming that defects have been fixed

(confirmation testing, or re-testing) and looking for unintended changes (regression testing).

Page 78: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TESTING TYPES• Testing of Function ( Functional testing )• Goal: the function of the test object

• Functionally can be linked to input and output data of the test object

• Black box methods are applied to design the relevant test cases• Testing is to verify functional requirements ( As stated in specifications,

concepts, case studies, business rules or related documents )• Area of use• Functional testing may be performed at all test levels• Execution• The test object is executed using text data derived from test cases• The results of the test execution are compared to the expected results• Security testing• Type of functional testing dealing with external threads• Malicious attacks could damage programs or data

Page 79: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TESTING TYPES• Testing of Non-functional Software Characteristics• Goal: software product characteristics

• How well does the software perform its functions?• The non-functional quality characteristics ( ISO 9126: reliability, usability, efficiency,

maintainability, portability ) are often vague, incomplete or missing altogether, making testing difficult

• Area of use• Non-functional testing may be performed at all test levels• Typical non-functional testing:

• Load testing / performance testing / volume testing / stress testing • Testing of safety features• Reliability and robustness testing / compatibility testing• Usability testing / configuration testing

• Execution• Compliance with the non-functional requirements is measured using selected

functional requirements

Page 80: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TESTING TYPES• Non-functional Testing (system test )

• Load test• System under load ( minimum load, more users / transactions )• Performance test• How fast does the system perform a certain function• Volume test• Processing huge volumes of data / files• Stress test• Reaction to overload / recovery after return to normal• Reliability test• Performance while in “continuous operation mode”• Test of robustness• Reaction to input of wrong or unspecified data• Reaction to hardware failures / disaster recovery
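
As an illustration of a performance test, the following minimal C sketch measures the execution time of a function and checks it against a response-time budget; function_under_test(), its workload and the 100 ms budget are assumptions made only for this example:

#include <assert.h>
#include <stdio.h>
#include <time.h>

/* Hypothetical function under test with a stand-in workload */
static void function_under_test(void) {
    volatile long sum = 0;
    for (long i = 0; i < 1000000; ++i)
        sum += i;
}

int main(void) {
    clock_t start = clock();
    function_under_test();
    double elapsed_ms = 1000.0 * (double)(clock() - start) / CLOCKS_PER_SEC;
    printf("elapsed: %.2f ms\n", elapsed_ms);
    assert(elapsed_ms < 100.0);        /* assumed response-time budget of 100 ms */
    return 0;
}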

Page 81: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TESTING TYPES• Non-functional Testing (system test )

• Compliance testing• Meeting rules and regulations (internal/ external )

• Test for usability• Structured, understandable, easy to learn for user

• Other non-functional quality aspects:• portability: replaceability, installability, conformance/compliance, adaptability

• maintainability: verifiability, stability, analyzability, changeability• reliability: maturity, robustness, recoverability

Page 82: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TESTING TYPES• Testing of Software Structure/Architecture (Structural

Testing)• Goal: Coverage

• Analyses the structure of the test object ( white box approach )• Testing aims at measuring how well the structure of the test object is

covered by the test cases• Area of use

• Structural testing possible on all test levels, code coverage testing using tools mainly done during component and integration testing

• Structural test design is finalized after functional tests have been designed, aiming at producing a high degree of coverage

• Execution• Will test the internal structure of a test object (e.g. control flow within

components, flow through a menu structure)• Goal: all identified structural elements should be covered by test cases

Page 83: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TESTING TYPES• Testing related to Changes / 1 • Goal: test object after changes

• After a test object or its system environment has been changed, previous test results may no longer be valid: tests have to be repeated• Two main reasons for changing software

• Error correction• Functional extension

• Because of undesired side effects of extended or new functionality, it is necessary to also retest adjacent areas!

(Diagram: error correction leads to confirmation testing, functional extension leads to new test cases; both are followed by regression testing before the software release)

Page 84: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TESTING TYPES• Testing related to Changes /2 ( Re-testing and Regression Testing )• Areas of use

• Repeating a test of functionality that has already been verified is called a regression test.

• The scope of the regression test depends on the risk that the newly implemented functionality (extension or error fix) poses to the system.

• Analyzing this risk can be done with an impact analysis• Change related testing may be performed at all test levels.• Typical tests after changes are:

• Re-testing ( = Testing after correction of errors )• Regression testing ( = Testing to uncover newly introduced defects)

Page 85: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TESTING TYPES• Testing related to Changes / 3 • Execution

• Basically, execution takes place as in previously executed test iterations

• In most cases, a complete regression test is not feasible, because it is too expensive and takes too much time

• A high degree of modularity in the software allows for more appropriate reduced regression tests

• Criteria for the selection of regression test cases:• Test cases with high priority• Only test standard functionality, skip special cases and

variations• Only test configuration that is used most often• Only test subsystems / selected areas at the test object

• If during early project phases, it becomes obvious that certain tests are suitable for regression testing, test automation should be considered
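
A minimal sketch of such an automated regression suite in C is shown below; the registered test cases (test_login, test_report) are hypothetical placeholders:

#include <stdio.h>

typedef int (*test_fn)(void);                       /* a test returns 1 on pass, 0 on fail */

/* Placeholder high-priority regression tests (hypothetical) */
static int test_login(void)  { return 1; }
static int test_report(void) { return 1; }

int main(void) {
    const struct { const char *name; test_fn fn; } suite[] = {
        { "login",  test_login  },
        { "report", test_report },
    };
    int failed = 0;
    for (size_t i = 0; i < sizeof suite / sizeof suite[0]; ++i) {
        int ok = suite[i].fn();
        printf("%-10s %s\n", suite[i].name, ok ? "PASS" : "FAIL");
        failed += !ok;
    }
    return failed;                                   /* non-zero exit signals a regression */
}

Re-running the whole suite after every change and using the exit code as the pass/fail signal makes the regression check easy to hook into an automated build.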

Page 86: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

TESTING TYPES• Summary

• On different test levels different types of tests are used• Test types are: functional, non-functional, structural and change

related testing• Functional testing examines the input / output behavior of a test

object• Non-functional testing checks product characteristics• Non-functional testing includes, but is not limited to, load testing,

stress testing, performance testing, robustness testing• Common structural tests are tests that check data and control flow

within the test object, measuring the degree of coverage• Important tests after changes are: re-tests and regression tests

Page 87: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

Outlines 1.Software Development Models

2.Test Levels

3.Testing Types

4.Maintenance Testing

Page 88: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

MAINTENANCE TESTING• Testing after Product Acceptance / 1

• Customer has approved the product and sets it into production• The initial development cycle, including its related tests, has been

completed• The software itself is at the beginning of its life cycle:

• It will be used for many years to come, it will be extended• It will most likely still have defects, hence it will be further modified

and corrected• It needs to adapt to new conditions and to be integrated into new

environments• Its configuration data needs to be changed or extended• It will one day be retired, put out of operation

• Any new version of the product, any new update and any other change in the software requires additional testing!

Page 89: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

MAINTENANCE TESTING• Testing after Product Acceptance / 2• Configuration

• The composition of a component or system as defined by the number, nature, and interconnections of its constituent parts.

• Impact analysis• The assessment of change to the layers of development

documentation, test documentation and components, to implement a given change to specified requirements.

• Maintenance testing• Testing the changes to an operational system or the impact of a

changed environment to an operational system.

Page 90: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

MAINTENANCE TESTING• Testing after Product Acceptance / 3

• Software maintenance covers two different fields:

• Maintenance as such: correction of defects or implementation of hot-fixes for problems that were already part of the initial version of the software• Planned software releases: adaptations as a result of a changed environment or new customer requirements

• Test scope of maintenance testing • Hot fixes and defect correction require retests• Extended functionality requires new test cases• Migration to another platform requires operational tests• In addition, intensive regression testing is needed

(Diagram: error correction leads to confirmation testing, functional extension leads to new test cases; both are followed by regression testing before the software release)

Page 91: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

MAINTENANCE TESTING• Testing after Product Acceptance / 4• Scope of testing is affected by the impact of the change

• Impact analysis is used to determine the affected areas to decide the amount of regression testing

• Problems might occur, if documentation of the old software is missing or incomplete

• Software retirement• Tests at software retirement may include

• Data migration tests• Verifying the archiving of data and programs• Parallel testing of old and new systems

Page 92: Politehnica University of Timisoara Mobile Computing, Sensors Network and Embedded Systems Laboratory Embedded systems testing Testing throughout the software.

MAINTENANCE TESTING

• Summary• Fully developed software needs to be adapted to new conditions,

errors have to be corrected• An impact analysis can help to judge the change related risks• Maintenance tests make sure that• New functions are implemented correctly ( new test cases )• Errors have been fixed successfully ( old test cases )• Functionality that has already been verified is not affected

( regression test )• If software gets retired, migration tests or parallel tests may be

necessary

