SOFTWARE ENGINEERING
SECTION A
1. What is Software Engineering?
Software Engineering is a discipline in which theories, methods and tools are applied
to develop professional software.
2. What is Software?
Software is a collection of computer programs and related documents intended to
provide the desired features, functionality and performance.
3. What are the characteristics of Software?
Software is engineered, but not manufactured.
Software does not wear out.
Most software is custom built rather than assembled from existing
components.
4. What are the various types of Software?
System Software
Real Time Software
Engineering/Scientific Software
Embedded Software
Personal Software
Web based Software
Artificial Intelligence Software
5. Define Software process.
Software process is defined as the structured set of activities which are
required to develop the software system.
6. What are the Fundamental activities of the Software process?
Specification
Design and Implementation
Validation
Evolution
7. What are the Umbrella activities of the Software process?
Software project tracking and control.
Risk management.
Software Quality Assurance.
Formal Technical Reviews.
Software Configuration Management.
Work product preparation and production.
Reusability Management.
Measurement.
8. What are the merits of the Incremental Model?
The Incremental Model can be adopted when only a small number of people are
involved in the project.
Technical risks can be managed within each increment.
A core product is delivered to the customer early, and further features are added in
each subsequent increment.
9. List the task regions in the Spiral Model.
o Customer communication – Tasks required to establish effective communication
between the developer and the customer.
o Planning – All the planning activities are carried out to
define resources, timelines and other project related activities.
o Risk analysis – The tasks required to assess the technical and the
management risks.
o Engineering – Task required to build one or more representations of the
applications.
o Construct and release –Tasks required to construct, test, install and provide user
supports.
o Customer evaluation – Tasks required to obtain customer feedback based on
evaluation of the software representations created during the engineering stage.
10. What are the drawbacks of the Spiral Model?
a. It is based on customer communication. If the communication fails, then the
software product in progress will not be up to the mark.
b. The model demands considerable risk assessment expertise. Unless the risks are
assessed and handled properly, a successful product cannot be obtained.
11. What is System Engineering?
System Engineering means specifying, designing, implementing, validating,
deploying and maintaining whole systems, including the hardware, the software and the
people who use and operate them.
12. List out the process maturity levels in the Software Engineering Institute's (SEI's)
Capability Maturity Model (CMM).
Level 1: Initial – Few processes are defined and success depends on
individual effort.
Level 2: Repeatable – Basic project management processes are established to
track cost, schedule and functionality.
Level 3: Defined – The processes are standardized, documented and followed.
Level 4: Managed – Both the software processes and products are
quantitatively understood and controlled using detailed
measures.
Level 5: Optimizing – Establish mechanisms to plan and to implement
change.
13. What does Verification represent?
Verification refers to the set of activities that ensure that the software correctly
implements a specific functionality.
14. What does Validation represent?
Validation refers to a different set of activities that ensure that the software that has
been built is traceable to customer requirements.
15. What are the steps followed in testing?
a. Unit testing – In this type of testing, the individual components
are tested.
b. Module testing – Related collections of components (modules) are
tested.
c. Sub-system testing – This is a kind of integration testing. Various
modules are integrated into a subsystem and the
whole subsystem is tested.
d. System testing – The whole system is tested in this phase.
e. Acceptance testing – This is the final phase, in which the system is tested with
real customer data to check whether it meets the customer's needs.
16. What is the use of CMM?
The Capability Maturity Model is used to assess and improve the maturity of an
organization's software process, which helps the organization to plan and manage new
software projects.
17. Name the Evolutionary process Models.
i. Incremental model
ii. Spiral model
iii. WIN-WIN spiral model
iv. Concurrent Development
18. What are the various types of traceability in the software engineering?
i. Feature traceability table - Shows how requirements relate to important
customer-observable system/product features.
ii. Source traceability table - Identifies the source of each requirement.
iii. Dependency traceability table - Indicates how requirements are related to one
another.
iv. Subsystem traceability table - Categorizes requirements by the subsystems
that they govern.
v. Interface traceability table - Shows how requirements relate to both internal
and external system interfaces.
19. Define software prototyping.
Software prototyping is defined as a rapid software development for
validating the requirements.
20. What are the benefits of prototyping?
i. Prototype serves as a basis for deriving system specification.
ii. Design quality can be improved.
iii. System can be maintained easily.
iv. Development efforts may get reduced.
v. System usability can be improved.
21. What are the prototyping approaches in software process?
i. Evolutionary prototyping – In this approach, an initial prototype is prepared and
then refined through a number of stages into the final system.
ii. Throw-away prototyping – A rough practical implementation of the system is
produced to help clarify the requirements and is then discarded; the system is then
developed afresh using other engineering methods.
22. What are the advantages of evolutionary prototyping?
i. Fast delivery of the working system.
ii. User is involved while developing the system.
iii. More useful system can be delivered.
iv. Specification, design and implementation proceed in a co-ordinated manner.
23. What is the use of User Interface prototyping?
This prototyping is used to pre-specify the look and feel of user interface in an
effective way.
24. What is requirement?
A requirement can range from a high-level abstract statement of a service or of a
system constraint to a detailed mathematical specification.
This is inevitable as requirements may serve a dual function
• May be the basis for a bid for a contract – therefore must be open to interpretation;
• May be the basis for the contract itself – therefore must be defined in detail;
• Both these statements may be called requirements.
25. What are the types of requirements?
The types of requirements are
1. User Requirements
2. System Requirements
3. Software Specification.
26. What are the user requirements?
User requirements are statements, in natural language, of the services the system is
expected to provide and the constraints under which it must operate.
27. What are the system requirements?
A structured document setting out a detailed description of the system's functions,
services and operational constraints. It may serve as a contract between the client and
the contractor.
28. What are the Functional Requirements?
These are statements of the services the system should provide, how the system
should react to particular inputs, and how the system should behave in particular situations.
29. Define Non-functional requirements
These are constraints on the services (or) functions offered by the system. They
include timing constraints, constraints on the development process, standards etc.
30. Define Domain requirements
These are requirements that come from the application domain of the system and that
reflect characteristics of that domain. They may be functional (or) non-functional requirements.
31. What are the operational principles by which all analysis modeling methods are
related?
i. The information domain of a problem must be represented and understood.
ii. The functions that the software is to perform must be defined.
iii. The behavior of the software must be represented.
iv. The models that depict information, function, and behavior must be partitioned
in a manner that uncovers detail in a layered fashion.
v. The analysis process should move from essential information toward implementation
detail.
32. What is data modeling?
Data modeling is the basic step in the analysis modeling. In data modeling, the
data objects are examined independently of processing. The data model represents how data
are related with one another.
33. What is data object?
A data object is a representation of a composite information item. It is described by a
set of attributes, each of which is an aspect, characteristic, quality, or descriptor of the object.
34. What are attributes?
Attributes define the properties of data object.
35. What is cardinality in data modeling?
Cardinality in data modeling, specifies how the number of occurrences of one object
is related to the number of occurrences of another object.
36. What does modality in data modeling indicate?
Modality indicates whether or not a particular data object must participate in the
relationship.
37. What is ERD?
Entity Relationship Diagram is the graphical representation of the object
relationship pair. It is mainly used in database applications.
38. What does Level0 DFD represents?
Level 0 DFD is called the 'fundamental system model' or 'context model'. In the
context model the entire software system is represented by a single bubble with input and
output indicated by incoming and outgoing arrows.
39. What is a state transition diagram?
State transition diagram is a collection of states and events. The events cause the
system to change its state. It also represents what actions are to be taken on the occurrence of
particular event.
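The state/event/action idea can be sketched as a transition table in Python. The order-processing states and events below are hypothetical, chosen only to illustrate the concept:

```python
# A minimal sketch of a state transition model: each (state, event) pair
# maps to a next state and an action. All names here are illustrative.

TRANSITIONS = {
    # (current state, event): (next state, action taken on the event)
    ("idle", "submit"):   ("placed",  "record order"),
    ("placed", "pay"):    ("paid",    "charge customer"),
    ("paid", "ship"):     ("shipped", "dispatch parcel"),
    ("placed", "cancel"): ("idle",    "discard order"),
}

def fire(state, event):
    """Return (next_state, action) for an event, or raise on an illegal one."""
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"event '{event}' not allowed in state '{state}'")

state, action = fire("idle", "submit")
print(state, "-", action)  # placed - record order
```

A state transition diagram is simply this table drawn as labelled arrows between state bubbles.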
40. Define Data Dictionary.
Data Dictionary is defined as an organized collection of all the data elements of the
system with precise and rigorous definitions. Hence the user and the system analyst should
have a common understanding of inputs, outputs, components of stores and intermediate
calculations.
41. What are the elements of Analysis model?
i. Data Dictionary
ii. Entity Relationship Diagram
iii. Data Flow Diagram
iv. State Transition Diagram
v. Control Specification
vi. Process specification
42. What are the elements of design model?
i. Data design
ii. Architectural design
iii. Interface design
iv. Component-level design
43. List the principles of a software design.
i. The design process should not suffer from “tunnel vision”.
ii. The design should be traceable to the analysis model.
iii. The design should exhibit uniformity and integration.
iv. Design is not coding.
v. The design should not reinvent the wheel.
44. Define QFD.
Quality Function Deployment (QFD) is a “method to transform user demands into
design quality, to deploy the functions forming quality, and to deploy methods for achieving
the design quality into subsystems and component parts, and ultimately to specific elements
of the manufacturing process.”
45. What are the different levels of abstraction?
Procedural Abstraction, Data Abstraction, Control Abstraction are the different
levels of abstraction.
46. What are the criteria for an effective modular system?
Modular Decomposition, Modular Composition, Modular Understandability,
Modular continuity and Modular Protection are the criteria for an effective modular system.
47. What is Architectural Design?
The Architectural Design defines the relationship between major structural elements
of the software, the “design patterns” that can be used to achieve the requirements that have
been defined for the system, and the constraints that affect the way in which architectural
design pattern can be applied.
48. How can the Architectural design be represented?
It is the overall structure of the software and the ways in which that structure provides
conceptual integrity of a system.
49. Define design process.
It is an iterative process through which requirements are translated into a “blueprint”
for constructing the software.
50. List the principles of software design.
The design process should not suffer from “tunnel vision”.
The design should be traceable to the analysis model.
The design should not reinvent the wheel.
The design should exhibit uniformity and integration.
The design should be structured to accommodate change.
The design is not coding, coding is not design.
51. What is the benefit of a modular design?
A large problem is subdivided into smaller modules. The software is created
according to the needs of the particular module.
52. What is cohesive module?
A cohesive module performs a single task within a software procedure, requiring
little interaction with procedures being performed in other parts of a program.
53. What are the various types of Cohesions?
Coincidental Cohesion, Logical Cohesion, Temporal Cohesion, Procedural Cohesion,
Communicational Cohesion, Sequential Cohesion and Functional Cohesion are the various
types of Cohesion (from weakest to strongest).
54. What is coupling?
Coupling is a measure of the relative interdependence among modules.
55. What are the various types of coupling?
Data Coupling, Stamp Coupling, Control Coupling, External Coupling, Common
Coupling and Content Coupling are the various types of coupling (from loosest to tightest).
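The difference between a cohesive, loosely coupled module and modules tied together by common coupling can be sketched in Python. The function names and the shared global below are hypothetical examples:

```python
# Data coupling (good): the module communicates only through its parameters,
# and it performs a single task, so it is functionally cohesive.
def net_price(gross, tax_rate):
    """Compute the price after tax from explicit inputs only."""
    return round(gross * (1 + tax_rate), 2)

# Common coupling (poor): this module reads a shared global, so any change
# to TAX_RATE's meaning ripples into every module that depends on it.
TAX_RATE = 0.2

def net_price_common(gross):
    return round(gross * (1 + TAX_RATE), 2)

print(net_price(100.0, 0.2))  # 120.0
```

Both functions compute the same value, but only the first can be tested and reused without knowing about hidden shared state, which is why low coupling and high cohesion are pursued together.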
56. What are the benefits of horizontal partitioning?
Software that is easier to test.
Software that is easier to maintain.
Propagation of fewer side effects.
Software that is easier to extend.
57. What is Interface Design?
The interface design describes how the software communicates within itself, with
systems that interoperate with it, and with humans who use it.
58. What is Data Design?
It transforms the information domain model, created during analysis into the data
structures that will be required to implement the software.
59. Define Control Hierarchy
Also called the program structure, the control hierarchy represents the organization of
program components (modules) and implies a hierarchy of control.
60. What is Transformation mapping?
It is a set of design steps that allows a Data Flow Diagram (DFD) with
transform flow characteristics to be mapped into a specific architectural style.
61. What is meant by Fan –In, Fan–Out?
Fan – In : Indicates how many modules directly control a given module.
Fan – Out : It is a measure of the number of modules that are directly
controlled by another module.
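Given a table of which modules call which, both measures are simple counts. The call graph below is a hypothetical example:

```python
# Sketch: computing fan-out and fan-in from a module "calls" table.
# The module names and call graph are made up for illustration.

calls = {
    "main":    ["input", "compute", "report"],
    "compute": ["input"],
    "report":  [],
    "input":   [],
}

def fan_out(module):
    """Number of modules this module directly controls (calls)."""
    return len(calls[module])

def fan_in(module):
    """Number of modules that directly control (call) this module."""
    return sum(module in callees for callees in calls.values())

print(fan_out("main"), fan_in("input"))  # 3 2
```

High fan-in usually indicates good reuse of a utility module, while very high fan-out suggests a module is doing too much coordination.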
62. How is Functional Independence achieved?
It is achieved by developing modules with “single – minded” function and an
“aversion” to excessive interaction with other modules.
63. What do Software Procedures focus on?
The Software procedures focus on the processing details of each module
individually.
Procedure must provide a precise specification of processing, including sequence
of events, exact decision points, repetitive operations, and data organization and structure.
64. Define Software Testing?
Testing is the process of executing the software product in predefined ways to check
whether its behaviour is the same as the expected behaviour.
65. What is Regression testing?
Regression testing is the re-running of existing tests after defects are fixed or the
software is enhanced, to verify that the changes have not affected the existing functionality.
66. Define White Box Testing.
White box testing means testing the internal structure of the software: the program
code that realizes the external functionality is examined and tested.
67. Why do we need Black Box Testing?
Black box testing helps in the overall functionality verification of the system under
testing.
68. What is the use of Boundary value analysis?
Boundary value analysis is useful to generate test cases when the input or output data
is made up of clearly identifiable boundaries or ranges.
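For a ranged input, classic boundary value analysis picks values at, just inside, and just outside each boundary. The range [1, 100] and the function under test are assumed examples:

```python
# Sketch of boundary value analysis for an integer field with a known range.

def boundary_values(lo, hi):
    """Values at, just inside and just outside each boundary."""
    return [lo - 1, lo, lo + 1, hi - 1, hi, hi + 1]

def accept(n, lo=1, hi=100):
    """Hypothetical implementation under test: is n in range?"""
    return lo <= n <= hi

for v in boundary_values(1, 100):
    print(v, accept(v))
```

The six generated values (0, 1, 2, 99, 100, 101) catch the common off-by-one errors at both ends of the range with very few test cases.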
69. What is the importance of equivalence partitioning?
It reduces the number of permutations and combinations of input, output values used
for testing and so testing efforts are reduced.
70. What is equivalence partitioning?
Equivalence partitioning is a software testing technique that involves identifying a
small set of representative input values that produce as many different output conditions as
possible.
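For the same kind of ranged input, equivalence partitioning picks one representative per partition instead of many values. The partitions and representatives below are assumed for illustration:

```python
# Sketch of equivalence partitioning: one representative value per class.

PARTITIONS = {
    "invalid_low":  -5,    # any value below the valid range behaves the same
    "valid":        50,    # any value inside the range behaves the same
    "invalid_high": 500,   # any value above the range behaves the same
}

def accept(n, lo=1, hi=100):
    """Hypothetical implementation under test: is n in range?"""
    return lo <= n <= hi

results = {name: accept(v) for name, v in PARTITIONS.items()}
print(results)
```

Three test cases stand in for the whole input space, which is exactly the reduction in testing effort that question 69 describes.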
71. What is beta testing?
It is the process of getting feedback from the customer for software which is
under the test phase. It is done by the customer at the client's site.
72. What is alpha testing?
Testing done by the customer at the developer's site is alpha testing.
73. What are the reasons behind to perform white box testing?
To find the defects that arise from incorrect translation of requirements and design
into program code, and the defects created by programming errors and programming
language constructs.
74. What are the types of code coverage testing?
Statement Coverage, Path Coverage, Condition Coverage, Function Coverage are the
types of Code Coverage Testing.
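The difference between covering statements on one path and covering every branch can be sketched with a tiny function. The function and test inputs are hypothetical:

```python
# Sketch: why one input can execute many statements yet miss a branch.

def classify(n):
    if n < 0:
        return "negative"
    return "non-negative"

# Suite A exercises only the fall-through branch:
covered = {classify(n) for n in [5]}
# Suite B adds an input for the other branch, achieving full branch coverage:
covered_full = {classify(n) for n in [5, -3]}

print(covered_full)
```

A coverage tool (such as Python's coverage.py) automates this bookkeeping, reporting which statements, branches or conditions each test suite actually executed.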
75. What are the requirements of the Requirement Traceability Matrix (RTM)?
It traces all the requirements from design, development and testing.
76. What do you mean by test case?
A test case is a set of predefined inputs, execution steps and expected outputs used
to verify a particular behaviour of the software.
77. What are the objectives of testing?
The main objective of testing is to find and fix the bugs before the product release,
so as to deliver a bug-free product to the customer.
78. What are the various testing activities?
Test engineers prepare the test plan, prepare the test cases, execute the test cases and
report the bugs found while doing functional testing, integration testing, system testing and
performance testing.
79. What are the types of testing?
1. White box testing
2. Black box testing
3. Gray box testing.
80. What is integration testing?
Integration is defined as the set of interactions among components. Integration
testing tests the interaction between the modules and the interaction with other external
systems.
81. Define system testing?
System testing is defined as a testing phase conducted on the complete integrated
system to evaluate the system compliance with its specified requirements.
82. Define unit testing?
Coding produces several program units, each of these smaller program units have to
be tested independently before trying to combine them together to form components.
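A minimal unit test can be sketched with Python's built-in unittest module. The function under test is a hypothetical example:

```python
import unittest

def word_count(text):
    """Unit under test: count whitespace-separated words."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    def test_simple_sentence(self):
        self.assertEqual(word_count("testing one two three"), 4)

    def test_empty_string(self):
        self.assertEqual(word_count(""), 0)

# Run the test case programmatically and collect the result.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(WordCountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Each program unit gets its own small, independent tests like these before the units are combined into components.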
83. Why is system testing important?
System testing is done to identify as many defects as possible before the customer
finds them after deployment, and it is the last phase of testing before the release.
84. Write about the types of project plan.
Quality plan – This plan describes the quality procedures and standards that will be
used in a project.
Validation plan – This plan describes the approach, resources and schedule required
for system validation.
Configuration management plan – This plan focuses on the configuration
management procedures and structures to be used.
Maintenance plan – The purpose of maintenance plan is to predict the maintenance
requirements of the system, maintenance cost and efforts required.
Staff development plan – This plan describes how to develop the skills and
experience of the project team members.
85. Define measure.
Measure is defined as a quantitative indication of the extent, amount,
dimension, or size of some attribute of a product or process.
86. Define metrics.
A metric is defined as a quantitative measure of the degree to which a system,
component, or process possesses a given attribute.
87. What are the types of metrics?
Direct metrics – It refers to immediately measurable attributes.
Example – Lines of code, execution speed.
Indirect metrics – It refers to the aspects that are not immediately
quantifiable or measurable. Example – functionality of a program.
88. What are the advantages and disadvantages of size measure?
Advantages:
Artifact of software development which is easily counted.
Many existing estimation methods use Lines of Code (LOC) as a key input.
A large body of literature and data based on LOC already exists.
Disadvantages:
This method is dependent upon the programming language.
Well-designed but shorter programs are penalized by this
measure.
It does not accommodate non procedural languages.
In early stage of development it is difficult to estimate LOC.
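A simple physical-LOC counter makes the language-dependence concrete: what counts as a comment or a blank line differs per language. The counting rules below (skip blanks and Python `#` comment lines) are one common convention, not a standard:

```python
# Sketch of a LOC counter that ignores blank lines and '#' comment lines.

def count_loc(source):
    loc = 0
    for line in source.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("#"):
            loc += 1
    return loc

sample = """\
# a comment
x = 1

y = x + 1
"""
print(count_loc(sample))  # 2
```

The same four physical lines would count differently under other conventions (e.g. counting comments, or counting logical statements), which is one of the disadvantages listed above.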
89. Write short note on the various estimation techniques.
Algorithmic cost modeling – The cost estimation is based on the
size of the software.
Expert judgment – The experts from software development and
the application domain use their experience to predict software costs.
Estimation by analogy – The cost of a project is computed by
comparing the project to a similar project in the same application
domain and then cost can be computed.
Parkinson's law – The cost is determined by available resources
rather than by objective assessment.
Pricing to win – The project costs whatever the customer is ready
to spend.
90. Give the procedure of the Delphi method.
1. The co-ordinator presents a specification and estimation form to
each expert.
2. Co-ordinator calls a group meeting in which the experts discuss
estimation issues with the coordinator and each other.
3. Experts fill out forms anonymously.
4. Co-ordinator prepares and distributes a summary of the estimates.
5. The Co-ordinator then calls a group meeting. In this meeting the
experts mainly discuss the points where their estimates vary widely.
6. The experts again fill out forms anonymously.
7. Again co-ordinator edits and summarizes the forms, repeating steps
5 and 6 until the co-ordinator is satisfied with the overall prediction
synthesized from experts.
91. What is EVA?
“Earned Value Analysis” is a technique of performing quantitative analysis of the
software project. It provides a common value scale for every task of software project. It acts
as a measure for software project progress.
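The standard earned value quantities and indices can be sketched directly. The numbers below are made up for illustration:

```python
# Sketch of earned value analysis using the conventional quantities:
# BCWS (planned value), BCWP (earned value), ACWP (actual cost).

bcws = 100.0   # budgeted cost of work scheduled to date
bcwp = 80.0    # budgeted cost of work actually performed to date
acwp = 90.0    # actual cost of the work performed to date

spi = bcwp / bcws   # schedule performance index (< 1.0: behind schedule)
cpi = bcwp / acwp   # cost performance index (< 1.0: over budget)
sv = bcwp - bcws    # schedule variance
cv = bcwp - acwp    # cost variance

print(f"SPI={spi:.2f} CPI={cpi:.2f} SV={sv:.1f} CV={cv:.1f}")
```

Here SPI = 0.80 and CPI below 1.0 together say the project has earned less value than planned and paid more than budgeted for it, which is exactly the progress signal EVA provides.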
92. What are the metrics computed during error tracking activity?
Errors per requirement specification page, Ereq.
Errors per component-design level, Edesign.
Errors per component-code level, Ecode.
Defect Removal Efficiency (DRE) – requirement analysis.
DRE – architectural analysis.
DRE – component-level design.
DRE – coding.
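Defect Removal Efficiency for an activity is commonly computed as E / (E + D), where E is the number of errors found before the work product leaves the activity and D is the number of defects found afterwards. The counts below are illustrative:

```python
# Sketch of the Defect Removal Efficiency metric: DRE = E / (E + D).

def dre(errors_before, defects_after):
    """E = errors found before delivery of the work product,
    D = defects found after delivery (downstream)."""
    return errors_before / (errors_before + defects_after)

print(round(dre(45, 5), 2))  # 0.9
```

A DRE approaching 1.0 means the activity's reviews and tests are catching almost all of its errors before they escape downstream.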
93. Write about software change strategies.
The software change strategies that could be applied separately or together are:
Software maintenance – Changes are made in the software in response
to changed requirements.
Architectural transformation – The process of changing one architecture
into another form.
Software re-engineering – The existing system is restructured and new
features are added so that it is better suited
for future use.
94. What are the types of software maintenance?
Corrective maintenance – Means the maintenance for correcting the
software faults.
Adaptive maintenance – Means maintenance for adapting the
change in environment.
Perfective maintenance – Means modifying or enhancing the
system to meet the new requirements.
Preventive maintenance – Means changes made to improve
future maintainability.
95. What are the types of static testing tools?
There are three types of static testing tools.
Code based testing tools – These tools take source code as input and generate
test cases.
Specialized testing tools – These provide a test specification language in
which a detailed test specification can be written
for each test case.
Requirement-based testing tools – These tools help in designing the test cases as
per user requirements.
96. What is meant by software project management?
Software project management focuses on the following four P's:
The people
Deals with the cultivation of motivated, highly skilled people
Consists of the stakeholders, the team leaders, and the software team
The product
Product objectives and scope should be established before a project can be
planned
The process
The software process provides the framework from which a
comprehensive plan for software development can be established
The project
Planning and controlling a software project is done for one primary
reason. It is the only known way to manage complexity
In a 1998 survey, 26% of software projects failed outright, 46%
experienced cost and schedule overruns
97. Write note on people as stakeholders.
Five categories of stakeholders
Senior managers – define business issues that often have significant influence on
the project
Project (technical) managers – plan, motivate, organize, and control the
practitioners who do the work
Practitioners – deliver the technical skills that are necessary to engineer a
product or application
Customers – specify the requirements for the software to be engineered and other
stakeholders who have a peripheral interest in the outcome
End users – interact with the software once it is released for production use
98. List out the qualities of team leaders.
• Qualities to look for in a team leader
o Motivation – the ability to encourage technical people to produce to their best
ability
o Organization – the ability to mould existing processes (or invent new ones)
that will enable the initial concept to be translated into a final product
o Ideas or innovation – the ability to encourage people to create and feel
creative even when they must work within bounds established for a particular
software product.
99. Write down the seven project factors.
Seven project factors to consider when structuring a software development team:
o The difficulty of the problem to be solved.
o The size of the resultant program(s) in source lines of code.
o The time that the team will stay together.
o The degree to which the problem can be modularized.
o The required quality and reliability of the system to be built.
o The rigidity of the delivery date.
o The degree of sociability (communication) required for the project.
SECTION-B
1. What are the differences between system engineering and software engineering?
Software engineering is part of the broader process of system engineering.
System engineering is concerned with all aspects of computer-based systems
development including
hardware,
software and
process engineering
System engineering is involved in
System specification,
Architectural design,
Integration and deployment
2. Explain about rapid prototyping techniques.
o Executable specification languages.
It is used to animate the system specification.
It is expressed in a formal, mathematical language to provide a system
prototype.
o Very high level languages.
These are the powerful programming languages with data management facilities.
They simplify program development.
o Application generators and fourth-generation languages.
These are successful languages because there is a great deal of commonality
across data processing applications.
3. Explain the structured evolutionary prototype model.
The developers build a prototype during the requirements phase.
The prototype is evaluated by end users
The users give corrective feedback
The developers further refine the prototype
When the user is satisfied, the prototype code is brought up to the standards needed for a
final product.
Steps
A preliminary project plan is developed
A partial high-level paper model is created
The model is a source for a partial requirements specification
A prototype is built with basic and critical attributes
The designer builds
o the database
o user interface
o algorithmic functions
The designer demonstrates the prototype, the user evaluates the problems and suggests
the improvements.
This loop continues until the user is satisfied
4. List out the advantages and disadvantages of the structured evolutionary prototype
model.
Advantages
The customers can “see” the system requirements as they are being gathered
The developers can learn from customers
A more accurate end product can be obtained.
The unexpected requirements are accommodated
It allows for flexible design and development
Steady, visible signs of progress are produced
Interaction with the prototype stimulates awareness of additional needed functionality
Disadvantages
A tendency to abandon structured program development for “code-and-fix”
development
A bad reputation for “quick-and-dirty” methods
Overall maintainability may be overlooked
The customer may want the prototype to be delivered.
Process may continue forever.
5. Write short notes on Spiral Model.
SPIRAL MODEL:
The spiral model is divided into a number of framework activities, denoted by
task regions.
Usually there are six task regions. An entry-point axis defines where different
types of projects start on the spiral.
The task regions are:
o Customer communication
o Planning
o Risk analysis.
o Engineering.
o Construct and release.
o Customer evaluation.
Drawbacks
It is based on customer communication.
It demands considerable risk assessment.
6. Discuss the important issues that a Software Requirement Specification (SRS) must
address.
The SRS is produced at the culmination of the analysis task. It is difficult for the
software engineer to understand the nature of the problem and the constraints under which the
software must operate, so the software engineer takes the help of a step called requirements
capture and analysis. The SRS is the first formal document to be produced in the software
development process, and it serves as the basis for the contract between a procurer and a
software developer/supplier.
The important issues that a SRS must address are:
(a) The system goals and the requirements are different: A goal is a more general
characteristic, e.g. “the whole system should be designed in a user-friendly manner” or “the
system should be more robust”.
Requirements are more testable in nature, e.g. “all user command selection should be done
only using pop-up menus”.
(b) Requirements Definition: A statement in natural language of the expected
services of the system. It should be understandable by the clients, the contractors, the
management and the users.
(c) Requirements Specification: A structured document that describes the services in
more detail than the definition, precisely enough to act as a contract. It should be
understandable by technical staff at both the developer's and the procurer's sites.
(d) Software Specification (Design Specification): It is an abstract design description of
the software which is the basis for the design and the implementation. There should be a clear
relationship between this document and the software requirements specification. The readers of
this document are software engineers, system analysts and project leaders.
(e) Requirements Capture and Analysis: It is the process of deriving the system
requirements through:
(i) Observation of existing systems.
(ii) Discussions with the potential users and procurers.
(iii) Personal interviews and task analysis.
(iv) Standard published documents/reports from the user.
(f) Feasibility Study:
(i) It is to estimate whether the identified user needs can be satisfied using the
current technology.
(ii) Estimate whether the proposed system is cost effective.
(iii) Estimate whether it is possible to develop the system within the budgetary
and time constraints.
(g) Suggestions for preparing an SRS Document:
(i) Only specify external system behavior.
(ii) Specify the constraints on the implementation.
(iii) It should record the life cycle of the system.
(iv) It should characterize acceptable responses to undesired events.
7. What is Requirement Engineering?
Requirement Engineering provides the appropriate mechanism for understanding what
the customer wants, analyzing need, assessing feasibility, negotiating a reasonable solution,
specifying the solution unambiguously, validating the specification, and managing the
requirements as they are transformed into an operational system. The requirements engineering
process can be described in five distinct steps.
Requirements Elicitation
Requirements Analysis and Negotiation
Requirements Specification
System Modeling
Requirements Validation
Requirements Management
Requirements Elicitation - Requirements elicitation involves asking the customer, the
users, and others what the objectives for the system or product are, what is to be
accomplished, how the system or product fits into the needs of the business, and finally,
how the system or product is to be used on a day-to-day basis. But it isn't simple; it's very
hard.
Problems of scope – users specify unnecessary technical detail that may
confuse.
Problems of understanding-users are not completely sure of what is
needed, have a poor understanding of the capabilities and limitations of
their computing environment.
Problems of volatility-The requirements change over time.
To help overcome these problems, system engineers must approach the
requirements gathering activity in an organized manner.
Guidelines for requirements elicitation:
Assess the business and technical feasibility of the proposed system.
Identify the people who will help specify requirements and understand their
organizational bias.
Define the technical environment into which the system will be placed.
Identify domain constraints.
Define one or more requirements elicitation methods.
Requirements Analysis and Negotiation-Once requirements have been gathered the
work products noted earlier form the basis for requirements analysis. Analysis
categorizes requirements and organizes them into related subsets.
As the requirements analysis activity commences, the following questions are asked and
answered:
Is each requirement consistent with the overall objective for the
system/product?
Have all requirements been specified at the proper level of
abstraction?
Is the requirement really necessary or does it represent an add-on
feature that may not be essential to the objective of the system?
Is each requirement bounded and unambiguous?
Customers, users and other stakeholders may have conflicting requirements; the
system engineer must reconcile these conflicts through a process of negotiation.
Requirements Specification - Specification means different things to
different people. A specification can be a written document, a graphical
model, or a formal mathematical model.
8. Explain throw-away prototyping and evolutionary prototyping.
Throw-Away Prototyping:
It is also called close ended prototyping. Throwaway or Rapid Prototyping refers to the
creation of a model that will eventually be discarded rather than becoming part of the final
delivered software. Rapid Prototyping involves creating a working model of various parts of the
system at a very early stage, after a relatively short investigation. The method used in building it
is usually quite informal, the most important factor being the speed with which the model is provided. The
model then becomes the starting point from which users can re-examine their expectations and
clarify their requirements. When this has been achieved, the prototype model is 'thrown away',
and the system is formally developed based on the identified requirements.
The most obvious reason for using Throwaway Prototyping is that it can be done quickly.
If the users can get quick feedback on their requirements, they may be able to refine them early
in the development of the software. Speed is crucial in implementing a throwaway prototype,
since with a limited budget of time and money little can be expended on a prototype that will be
discarded. A strength of throwaway prototyping is its ability to construct interfaces that the users
can test. The user interface is what the user sees as the system, and by seeing it in front of them,
it is much easier to grasp how the system will work.
Evolutionary prototyping:
Evolutionary Prototyping (also known as breadboard prototyping) is quite different from
Throwaway Prototyping. The main goal of Evolutionary Prototyping is to build a very robust
prototype in a structured manner and constantly refine it. The reason for this is that the
Evolutionary prototype, when built, forms the heart of the new system, which is then improved
to meet further requirements.
When developing a system using Evolutionary Prototyping, the system is continually
refined and rebuilt. “Evolutionary prototyping acknowledges that we do not understand all the
requirements and builds only those that are well understood.”
Evolutionary Prototypes have an advantage over Throwaway Prototypes: they are
functional systems. Although they may not have all the features the users have
planned, they may be used on an interim basis until the final system is delivered. In Evolutionary
Prototyping, developers can focus themselves to develop the parts of the system that they
understand instead of working on developing a whole system.
9. Explain any two requirement elicitation methods.
Interviews:
After receiving the problem statement from the customer, the first step is to arrange a
meeting with the customer. During the meeting or interview, both parties try to
understand each other. Specialized developers, often called 'requirements
engineers', interact with the customer. The objective of conducting an interview is to
understand the customer's expectations from the software. Both parties have different feelings,
goals, opinions, vocabularies and understandings, which must be reconciled for the project to succeed.
For this, requirements engineers normally arrange interviews. Requirements engineers
must be open minded and should not approach the interview with pre-conceived notions about
what the requirements should be.
Interview may be open-ended or structured. In open ended interview there is no pre-set
agenda. Context free questions may be asked to understand the problem and to have an overview
of the situation.
In a structured interview, an agenda of fairly open questions is prepared. Sometimes a proper
questionnaire is designed for the interview. The interview may be started with simple questions to
set people at ease. After making the atmosphere comfortable and calm, specific questions may be
asked to understand the requirements, the customer may be allowed to voice his or her
perception about a possible solution.
Facilitated Application Specification Technique
A team oriented approach to requirements gathering is called
Facilitated Application Specification Techniques (FAST). This approach encourages the creation
of a joint team of the customers and the developers who work together to understand the
expectations and to propose a set of requirements.
The basic guidelines of FAST are given below:
• Arrange a meeting at a neutral site for the developers and the customers.
• Establish rules for preparation and participation.
• Prepare an informal agenda that encourages a free flow of ideas.
• Appoint a facilitator to control the meeting. A facilitator may be a developer, a
customer, or an outside expert.
• Prepare a definition mechanism - boards, flip charts, worksheets, wall stickies, etc.
• Participants should not criticize or debate.
10. What is a Data Dictionary? Explain each component.
Data dictionary is a storehouse of data giving information about the data. It is a list of
terms and their definition for all data items and the data files of a system. A data dictionary
contains the description and the definition concerning the data structure, data elements, their
interrelationships and other characteristics of a system.
Objectives of Data dictionaries:-
1) A standard definition of all terms in a system, that is, each item of data is uniquely
identified and defined.
2) Easy cross-referencing between sub-systems, programs and modules.
3) Simpler program maintenance.
Data Items:
There are three classes of data items in a data dictionary:-
1) Data element- It is the smallest unit of data which cannot be meaningfully decomposed
further. E.g. Employee number etc.
2) Data structure- A group of data elements forms a data structure.
3) Data Flows and Data Stores- Data Flows are data structures in motion whereas Data
stores are data structures at rest.
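The three classes of data items above can be sketched as simple Python types; the employee fields are made-up examples:

```python
from dataclasses import dataclass, field

@dataclass
class DataElement:
    """Smallest unit of data, e.g. an employee number."""
    name: str
    description: str

@dataclass
class DataStructure:
    """A group of data elements forms a data structure."""
    name: str
    elements: list = field(default_factory=list)

# Build a small data dictionary entry from two elements.
emp_no = DataElement("employee_number", "Unique id assigned to each employee")
emp_name = DataElement("employee_name", "Full name of the employee")
employee = DataStructure("employee_record", [emp_no, emp_name])

print(employee.name, [e.name for e in employee.elements])
```

A data flow or data store would reuse the same structures, tagged as being "in motion" or "at rest".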
11. What specific languages can be used in SRS? What are the advantages of using these
specific languages of SRS?
The Requirement specification necessitates the use of some specification language. The
language should support the desired qualities of the SRS - modifiability, understandability,
unambiguity, and so forth. The language should be easy to learn and to use.
For ease of understanding, a natural language might be preferable. Though formal
notations exist for specifying specific properties of the system, natural languages are now most
often used for specifying requirements. The overall SRS is generally written in a natural language, and
when feasible and desirable, some specifications in the SRS may use formal languages.
The major advantage of using a natural language is that both the client and the supplier
understand the language. However, natural language is imprecise and ambiguous. To reduce this
drawback, natural language is most often used in a structured fashion.
In structured English, the requirements are broken into sections and paragraphs. Each
paragraph is then broken into sub paragraphs. In an SRS, some parts can be specified better using
some formal notation, example- to specify formats of inputs or outputs, regular expression can be
very useful.
Similarly when discussing system like communication protocols, finite state automata
can be used. Decision tables are useful to formally specify the behavior of a system on different
combination of input or settings.
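As noted above, a regular expression can pin down an input format precisely where prose would be ambiguous. A small illustrative sketch (the date format is a made-up example):

```python
import re

# Suppose an SRS requires dates in DD-MM-YYYY format; a regular
# expression states this format precisely.
DATE_PATTERN = re.compile(r"^(0[1-9]|[12][0-9]|3[01])-(0[1-9]|1[0-2])-\d{4}$")

print(bool(DATE_PATTERN.match("07-03-2024")))  # matches the format
print(bool(DATE_PATTERN.match("2024-03-07")))  # wrong ordering, rejected
```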
12. What is Software requirement Specification (SRS)? Why is it important? List the
characteristic of a good quality SRS.
The Software Requirement Specification document is the output of the requirement analysis
stage of the software development life cycle. It documents all types of requirements and constraints
imposed on the end product. This document is important because it is used in all the successive
stages of the SDLC. Any error introduced will result in an incomplete and poor quality product.
The characteristics of a good quality SRS are:
(i) Correctness
(ii) Completeness
(iii) Consistency
(iv) Unambiguousness
(v) Ranking for importance and/ or stability
(vi) Modifiability
(vii) Verifiability
(viii) Traceability
(ix) Design Independent
(x) Understandable by customer
13. What is the use of a data flow diagram? Explain the important concepts of data flow
diagram.
A data flow diagram is used to show the functional view of an application domain. It shows all
the important business processes and the flow of data between those processes.
The main concepts used are:
(i) Process: A process represents some kind of transformation on data. It takes
one or more inputs and, after doing the necessary processing, generates the output. It is represented by
a circle with the name written inside.
(ii) Data Flow: A data flow represents data in motion and is represented by an arrow.
The data flows represent the flow of data among processes, stores and external agents.
(iii) Data Store: A data store represents the data at rest. At the time of implementation it
is represented by data base or files.
(iv) External Agent: An external agent represents a person, a system or any other
software which interacts with the system by providing necessary inputs and outputs. Graphically
it is represented by a rectangle.
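The four DFD concepts above can be sketched as plain records, with flows connecting an external agent, a process and a data store (the order-handling names are made-up examples):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Process:        # transformation on data (drawn as a circle)
    name: str

@dataclass(frozen=True)
class DataStore:      # data at rest (a database or file)
    name: str

@dataclass(frozen=True)
class ExternalAgent:  # person or system interacting with the system
    name: str

@dataclass(frozen=True)
class DataFlow:       # data in motion (drawn as an arrow)
    label: str
    source: object
    target: object

customer = ExternalAgent("Customer")
validate = Process("Validate Order")
orders = DataStore("Orders")

flows = [
    DataFlow("order details", customer, validate),
    DataFlow("valid order", validate, orders),
]
for f in flows:
    print(f"{f.source.name} --{f.label}--> {f.target.name}")
```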
14. Discuss about the concept of Non-functional classifications.
These define system properties and constraints e.g. reliability, response time and storage
requirements. Constraints are I/O device capability, system representations, etc.
Process requirements may be specified by mandating a particular CASE system,
programming language or development method.
Non-functional requirements may be more critical than functional requirements. If these
are not met, the system is useless.
The classifications are
● Product requirements
• Requirements which specify that the delivered product must behave in a particular way,
e.g. execution speed, reliability, etc.
● Organizational requirements
• Requirements which are a consequence of organizational policies and procedures, e.g.
process standards used, implementation requirements, etc.
● External requirements
• Requirements which arise from the factors which are external to the system and its
development process. e.g. interoperability requirements, legislative requirements, etc.
15. Explain Data dictionary.
A data dictionary is a collection of data about data. It maintains information about the
definition, structure, and use of each data element that an organization uses.
There are many attributes that may be stored about a data element. Typical attributes used
in CASE (Computer-Aided Software Engineering) tools are:
Name
Aliases or Synonyms
Default label
Description
Source(s)
Date of origin
Users
Programs
Change authorizations
Access authorization
Data type
Length
Units (cm, degrees C, etc.)
Range of values
Frequency of use
Input/output/local
Conditional values
Parent structure
Subsidiary structures
Repetitive structures
Physical location: record, file, data base
A data dictionary is invaluable for documentation purposes, for keeping control information on
corporate data, for ensuring consistency of elements between organizational systems, and for the
use in developing databases.
Data dictionary software packages are commercially available, often as part of a CASE package
or a DBMS. Data dictionary software allows consistency checks and code generation. It is also used
in DBMSs to generate reports.
16. Discuss briefly the Design model.
The design model consists of the following:
Data Design: Transforms the information domain model which is created during the
analysis into data structures that will be required to implement the software. The data
objects and relationships defined in the entity relationship diagram and the detailed
data content depicted in the data dictionary provide the basis for the data design
activity. Part of data design may occur in conjunction with the design of software
architecture. More detailed design occurs as each software component is designed.
Architectural Design: It defines the relationship between major structural elements
of the software, the “design patterns” that can be used to achieve the requirements
that have been defined for the particular system, and the constraints that affect the
way in which architectural design patterns can be applied. The architectural design
representation – the frame of computer based system – can be derived from the
specified system, the analysis model, and the interaction of subsystems defined
within the analysis model.
Interface Design: It describes how the software communicates within itself, and
interoperates with it, and who uses it. An interface implies a flow of information and
a specific type of behavior. Therefore a data and control flow diagram provides much
of the information required for an interface design.
Component level Design: Transforms structural elements of the software architecture
into a procedural description of software components. Information
obtained from the Process Specification (PSPEC), Control Specification (CSPEC) and
State Transition Diagram (STD) serves as the basis for component design.
During design, decisions ultimately affect the success of software construction and
the ease with which the software can be maintained.
The importance of software design can be stated with a single word: "quality".
17. Briefly discuss on Abstraction.
At the highest level of abstraction, a solution is stated in broad terms using the language of
the problem environment. At lower levels of abstraction, a more procedural orientation is
taken. Problem oriented terminology is coupled with implementation oriented
terminology in an effort to state a solution. Finally, at the lowest level of abstraction, the
solution is stated in a manner that can be directly implemented.
Different levels of Abstraction:
Procedural Abstraction: It is a named sequence of instructions that has a specific
and limited function. For example, the word OPEN for a door implies a long
sequence of procedural steps.
Data Abstraction: It is a named collection of data that describes a data object. For
example, a data abstraction called DOOR can be defined.
Control Abstraction: It is the third form of abstraction used in software design.
Like procedural and data abstraction, control abstraction implies a program control
mechanism without specifying internal details.
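The OPEN and DOOR examples above can be sketched in code; the door attributes are assumptions for illustration:

```python
class Door:
    """Data abstraction: a named collection of data describing a door."""
    def __init__(self, door_type, swing_direction, weight):
        self.door_type = door_type
        self.swing_direction = swing_direction
        self.weight = weight
        self.is_open = False

    def open(self):
        """Procedural abstraction: OPEN names a limited function whose
        long sequence of steps (walk to door, reach for knob, turn knob,
        pull door, ...) is hidden behind the name."""
        self.is_open = True

front = Door("panel", "inward", 25)
front.open()
print(front.is_open)
```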
18. Write a note on Refinement.
Refinement is a process of elaboration.
It is a top – down design strategy originally proposed by Niklaus Wirth.
A program is developed by successively refining levels of procedural details.
A hierarchy is developed by decomposing a macroscopic statement of function in a
step wise fashion until programming language statement is reached.
In each step of the refinement one or several instructions of the given program are
decomposed into detailed instructions.
This successive decomposition or refinement of specifications terminates when all
instructions are expressed in terms of any underlying computer or programming
language.
As tasks are refined, so the data must also be refined, decomposed, or structured,
and it is natural to refine the program and the data specifications in parallel.
Refinement is actually a process of elaboration. We begin with a statement of
function that is defined at a high level of abstraction. Refinement causes the
designer to elaborate on the original statement, providing more and more detail as
each successive refinement occurs.
Abstraction and refinement are complementary concepts. Abstraction enables a
designer to specify procedure and data and yet suppress low-level details.
Refinement helps the designer to reveal low-level details as design progresses. Both
concepts aid the designer in creating a complete design model as the design evolves.
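Stepwise refinement can be illustrated by decomposing one macroscopic statement of function into successively more detailed statements (the class-average example is made up):

```python
# Level 1 (highest abstraction): "report the class average".
def report_class_average(marks):
    total = sum_marks(marks)       # level 2: refined into sub-steps
    count = count_marks(marks)
    return divide(total, count)

# Level 3: each sub-step refined down to language-level statements.
def sum_marks(marks):
    total = 0
    for m in marks:
        total += m
    return total

def count_marks(marks):
    return len(marks)

def divide(total, count):
    return total / count if count else 0

print(report_class_average([70, 80, 90]))  # 80.0
```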
19. Discuss the criteria of an effective modular system.
Modular decomposition: If a design method provides a systematic mechanism for
decomposing the problem into sub problems, it will reduce the complexity of the
overall problem, thereby achieving an effective modular solution.
Modular Composition: If a design method enables existing design components
to be assembled into a new system, it will yield a modular solution that does not
reinvent the wheel.
Modular Understandability: if a module can be understood as a standalone unit, it
will be easier to build and easier to change.
Modular continuity: if small changes to the system requirements result in changes
to individual modules, rather than system wide changes, the impact of change-
induced side effects will be minimized.
Modular Protection: if an aberrant condition occurs within a module and its effects
are constrained within that module, the impact of error-induced side effects will be
minimized.
20. Write a note on Software Architecture.
It is the overall structure of the software and the ways in which that structure provides
conceptual integrity for a system. In its simplest form, architecture is the hierarchical
structure of program components, the manner in which these components interact, and the
structure of the data that are used by the components.
One goal of software design is to derive an architectural rendering of a system. This
rendering serves as a framework from which more detailed design activities are
conducted.
Properties of Architectural Design:
Structural Properties: This aspect of the architectural design representation defines
the components of a system (e.g., modules, objects, filters) and the manner in which
those components are packaged and interact with one another. For example, objects
are packaged to encapsulate both data and the processing that manipulates the data
and interact via the invocation of methods.
Extra-functional properties: The architectural design description should address
how the design architecture achieves requirements for performance, capacity,
reliability, security, adaptability, and other system characteristics.
Families of related systems: The architectural design should draw upon repeatable
patterns that are commonly encountered in the design of families of similar systems.
In essence, the design should have the ability to reuse architectural building blocks.
Different Types of Models:
Structural model: It represents architecture as an organized collection of
program components.
Frame model: It increases the level of design abstraction by attempting to identify
repeatable architectural design frameworks that are encountered in similar types of
applications.
Dynamic model: It addresses the behavioral aspects of the program architecture,
indicating how the structure or system configuration may change as a function of
external events.
Process model: Focus on the design of the business or technical process that the
system must accommodate.
Functional model: It can be used to represent the functional hierarchy of a system.
21. Write a short note on Information Hiding.
The principle of information hiding suggests that modules be "characterized by design
decisions that (each) hides from all others". In other words:
Modules should be specified and designed so that the information contained within a
module is inaccessible to other modules that have no need for it.
Hiding implies that effective modularity can be achieved by defining a set of
independent modules that communicate with one another only that information
necessary to achieve software function.
Abstraction helps to define the procedural entities that makeup the software.
Hiding defines and enforces the access constraints to both procedural detail within a
module and any local data structure used by the module.
The use of information hiding as a design criterion for modular systems provides the
greatest benefits when modifications are required during testing and later during
software maintenance.
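A minimal sketch of information hiding, assuming a stack module whose internal data structure is hidden behind a small interface:

```python
class Stack:
    def __init__(self):
        self._items = []       # hidden local data structure

    def push(self, item):      # the only information other modules
        self._items.append(item)  # need is this interface

    def pop(self):
        return self._items.pop()

    def is_empty(self):
        return not self._items

s = Stack()
s.push(1)
s.push(2)
print(s.pop())  # callers never touch _items directly, so its
                # representation can change without affecting them
```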
22. Discuss briefly on System Design.
System Design:
System Design process involves deciding which system capabilities are to be
implemented in software and which in hardware. Timing constraints and system
functions such as signal processing have to be implemented using hardware as hardware
components deliver much better performance than the equivalent software.
System processing bottlenecks can be identified and replaced by hardware, thus
avoiding expensive software optimization. A good system design process should
therefore result in a system which can be implemented in either hardware or software. The
design process for real-time systems differs from that of other systems in that response times must be considered.
There are several stages in the design process.
1. Identify the stimulus that the system must process and the associated response.
2. Identify the timing constraint for each stimulus and associated response.
3. Aggregate the stimulus and response processing into a number of concurrent
processes.
4. For each stimulus and response design algorithms to carryout the required
computations which gives an indication of the amount of processing required and the
time required to complete that processing.
5. Design a scheduling system to ensure that all processes are started in time to meet
their deadlines.
6. Integrate the system under the control of a real-time executive.
The above design steps form an iterative process. The process architecture, the
scheduling policy, the executive, or all of these may have to be redesigned to
improve the performance of the system.
Analyzing the timing of a real-time system is difficult. Processes in a real-time
system must be coordinated. Process coordination mechanisms ensure mutual exclusion
on shared resources. When one process is modifying a shared resource, other processes
should not be able to change that resource. Mechanisms for ensuring mutual exclusion
include semaphores, monitors and critical regions. As real-time systems must meet
their timing constraints, it may be inappropriate to use design strategies for hard real-time
systems which involve additional implementation overhead.
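The mutual exclusion mechanism described above can be sketched with a lock (a binary semaphore); the shared counter is a made-up example:

```python
import threading

counter = 0
lock = threading.Lock()

def worker():
    global counter
    for _ in range(10_000):
        with lock:          # only one thread may modify the shared
            counter += 1    # resource at a time

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter)  # 40000: no updates lost under mutual exclusion
```

Without the lock, concurrent read-modify-write cycles could interleave and lose updates.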
23. Write a short note on SCM?
SCM – Software Configuration Management.
SCM is the art of coordinating software development to minimize confusion.
It is the art of identifying, organizing and controlling the modifications to the
software being built by a programming team.
The goal is to maximize the productivity by minimizing the mistakes
It is an umbrella activity that is applied throughout the software process.
Because change can occur at any time SCM activities are developed to identify the
change, control change, ensure change is being properly implemented and report the
changes to others who may have an interest.
There are four fundamental sources of change.
New business or market conditions dictate changes in product requirements or
business rules.
New customer needs demand modification of data produced by information system,
functionality delivered by products, or services delivered by a computer-based
system.
Reorganization or business growth/downsizing causes changes in project priorities or
software engineering team structure.
Budgetary or scheduling constraints cause a redefinition of the system or product.
24. Explain about testing objectives.
Testing is a process of executing a program with the intent of finding an error.
A good test case is one that has a high probability of finding an undiscovered error.
A successful test is one that uncovers an undiscovered error.
The objective is to design tests systematically to uncover different classes of errors with a
minimum amount of time and effort. Testing can show the presence of software errors and
defects, but not their absence.
25. What is the necessity of unit testing and list out all unit test considerations?
Unit testing is done using white box testing to ensure complete coverage of modules; it
checks the specific paths within the modules and ensures maximum error detection. Unit test
considerations are:
a. To ensure that information properly flows into and out of the program unit.
b. Local data structure is examined to ensure that the data is maintained, during the
execution of the algorithm.
c. Boundary conditions are tested to ensure that whether the module operates at
boundaries.
d. All independent paths through control structure to ensure all statements in a
module been executed and all error handling paths are tested.
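The unit test considerations (a)-(d) can be sketched against a small made-up module with an obvious boundary and an error-handling path:

```python
# Hypothetical module under test.
def classify_mark(mark):
    """Return 'pass' for marks of 50 and above, else 'fail'."""
    if not 0 <= mark <= 100:
        raise ValueError("mark out of range")  # error-handling path
    return "pass" if mark >= 50 else "fail"

def test_boundaries():
    # (c) check that the module operates correctly at its boundaries
    assert classify_mark(49) == "fail"
    assert classify_mark(50) == "pass"

def test_error_handling():
    # (d) exercise the error-handling path
    try:
        classify_mark(101)
    except ValueError:
        return
    raise AssertionError("out-of-range mark was not rejected")

test_boundaries()
test_error_handling()
print("all unit tests passed")
```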
26. Explain about system testing.
Testing is done on the entire system to make sure that all the software requirements
are satisfied by the product.
System testing is the phase of testing which tests both the functional and non-functional
aspects of the product.
It simulates customer deployments.
Different non-functional system tests are performance testing, load testing,
scalability testing, reliability testing and localization testing.
System testing is performed on the basis of written test cases, according to
information from the detailed architecture, design documents, module specifications
and the system requirements specification.
It identifies as many defects as possible before the customer finds them in deployment.
This is the final testing of the product before release.
System testing is closer to product release than component testing.
Test both functional and non-functional aspects of the product.
Build confidence in the product
Analyze and reduce the risk of releasing the product.
Ensure all requirements are met.
27. Explain the importance of boundary values in black box test data.
Boundary value analysis is a method useful for arriving at tests that are effective
in catching the defects that happen at boundaries.
Boundary value analysis is based on the observation that the density of
defects is higher towards the boundaries.
Boundary value analysis is useful to generate test cases when the input or output
data is made up of identifiable boundaries or ranges.
It checks the behaviour of the product at the limits of the values of the various
variables.
Testing makes it possible to identify the defects at input boundaries or ranges.
The behaviour of a list at its beginning and end has to be tested thoroughly.
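A minimal sketch of boundary value analysis, assuming an input with a valid range of 1 to 100 (the range is a made-up example):

```python
def boundary_values(low, high):
    """Classic BVA picks values at, just inside and just outside
    each boundary of a valid input range."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

print(boundary_values(1, 100))  # [0, 1, 2, 99, 100, 101]
```

The two out-of-range values (0 and 101) probe the error-handling behaviour, while the others confirm correct operation at and near the limits.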
28. Discuss the differences between black box and white box testing.
Black box testing is functional testing and does not examine the code of a
program.
It is based on the customer's viewpoint.
Testers test with a set of inputs and compare the expected results with the
actual results.
Testers require domain and functional knowledge of the product to be
tested.
White box testing is program-based testing that examines the program code, code
structure and internal design flow.
It is classified into static and structural testing.
It examines whether the code works according to the functional requirements, is
written according to the design, whether any functionality is missed out, and
whether errors are handled properly.
Black box testing is done based on requirements.
It tests from the end user's perspective.
It tests how the product handles valid and invalid inputs.
29. Write note on Requirements based testing.
Deals with validating the requirements given in the Software Requirement
Specification (SRS) of the software system.
Explicit requirements are stated and documented as part of the requirements
specification.
Requirements review ensures that they are consistent, correct, complete and
testable.
The process ensures that implied requirements are converted and documented.
Requirements are tracked by a Requirements Traceability Matrix (RTM).
The RTM traces all the requirements through design, development and testing.
It validates the requirements in the SRS.
Both explicit and implicit requirements are to be tested.
Requirements are tracked by the RTM.
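An RTM can be sketched as a simple mapping from requirement ids to design, code and test artifacts; all identifiers here are made up:

```python
rtm = {
    "REQ-001": {"design": "DD-4.2", "code": "login.py", "tests": ["TC-11", "TC-12"]},
    "REQ-002": {"design": "DD-5.1", "code": "report.py", "tests": []},
}

# An empty test list reveals a requirement with no test coverage,
# which is exactly the gap an RTM is meant to expose.
untested = [req for req, trace in rtm.items() if not trace["tests"]]
print(untested)  # ['REQ-002']
```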
30. Write notes on positive and negative testing.
Positive testing tries to prove that the given product does what it is supposed to do.
It checks the product's behaviour for the conditions stated in the
requirements.
Negative testing is done to show that the product does not fail when an unexpected
input is given.
Negative testing covers scenarios for which the product is not designed and coded.
In short, positive testing checks the product's behaviour for the conditions stated
in the requirements, while negative testing checks that the product does not fail
when an unexpected input is given.
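A minimal sketch of one positive and one negative test, assuming a made-up age-validation function:

```python
def validate_age(value):
    age = int(value)               # raises ValueError on bad input
    if not 0 <= age <= 130:
        raise ValueError("age out of range")
    return age

# Positive test: the product does what it is supposed to do.
assert validate_age("42") == 42

# Negative test: the product does not fail (crash) on unexpected
# input; it rejects it in a controlled way instead.
try:
    validate_age("forty-two")
except ValueError:
    print("unexpected input rejected gracefully")
```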
31. Explain functional versus non-functional testing.
Functional testing involves testing the product's functionality and features.
Non-functional testing involves testing the product's quality factors.
System testing comprises both functional and non-functional test verification.
Functional testing helps in verifying what the system is supposed to do.
Functional testing is performed in all the phases of testing: unit testing,
component testing, integration testing and system testing.
Non-functional testing is performed to verify quality factors such as reliability
and scalability.
It requires the expected results to be documented in qualitative and quantifiable
terms.
Functional testing focuses on the results the product produces.
Non-functional testing requires understanding the product's behaviour, design and
architecture.
32. Explain the different testing types of non-functional testing.
Non-functional testing differs from functional testing in aspects such as complexity,
the knowledge required, the effort needed, the number of times the test cases are
repeated, setting up the configuration, arriving at entry/exit criteria and
balancing key resources. The different types of non-functional testing are:
Scalability testing
Reliability testing
Stress testing
Interoperability testing
Acceptance testing
Performance testing
33. Specify the period and time at which Regression Testing can be done.
Each time a new module is added as part of integration testing, the software changes.
New data flow paths are established, new I/O may occur, and new control logic is invoked.
Regression testing is the re-execution of some subset of tests that have already been conducted
to ensure that changes have not propagated unintended side effects.
In a broader context, successful tests result in the discovery of errors and errors must be
corrected. Whenever software is corrected, some aspects of the software configuration are
changed. Regression testing is the activity that helps to ensure that changes do not introduce
unintended behaviour or additional errors.
Regression testing may be conducted manually, by re-executing a subset of all test cases,
or by using automated capture/playback tools. Capture/playback tools enable the software
engineer to capture test cases and results for subsequent playback and comparison.
The regression test suite contains three different classes of test cases
o A representative sample of tests that will exercise all software functions.
o Additional tests that focus on software function that are likely to be affected by the
change.
o Tests that focus on software components that have been changed.
As integration testing proceeds, the number of regression tests can grow quite
large. Therefore, the regression test suite should be designed to include only those tests that
address one or more classes of errors in each of the major program functions. It is
impractical and inefficient to re-execute every test for every program function once a
change has occurred.
34. Write notes on the types of regression testing.
When internal or external test teams or customers begin using a product, they report
defects.
Defects are analyzed by each developer, who makes the individual defect fixes.
There are two types: regular regression testing and final regression testing.
Regular regression testing is done between test cycles to ensure that the defect fixes work
and do not break existing functionality.
Final regression testing is done to validate the final build before release.
35. Discuss briefly about measures and measurement of software.
A software measure is a mapping from a set of objects in the software engineering world
into a set of mathematical constructs such as numbers or vectors of numbers.
A software measurement is a technique or method that applies software measures to a
class of software engineering objects to achieve a predefined goal. Five characteristics of
software measurement can be identified.
Object of measurement ranging from products (e.g., source code, software
designs, software requirements, and software test cases) to processes (e.g.,
architectural design process, coding and unit test processes, system test process)
and projects.
Purpose of measurement such as characterization, assessment, evaluation,
prediction.
Source of measurement such as software designers, software testers, and software
managers.
Measured property such as cost of software, reliability, maintainability, size,
portability.
Context of measurement where software artifacts are measured in different
environments (including people, technology, resources available), which are
specified in advance before applying some software measures.
36. Discuss the different scales in measurement theory.
Scale: Nominal
Description: Denotes membership in a class (supports equivalence relations).
Examples: Labeling, classifying entities (e.g., slow, fast).
Scale: Ordinal
Description: Measurement expresses comparative judgement and imposes an ordering of terms (supports equivalence relations, e.g., "≥").
Example: Preference.
Scale: Interval
Description: Measurement expresses distance between pairs of items (supports equivalence relations and known ratio of intervals).
Examples: Time (calendar), temperature (Centigrade, Fahrenheit).
Scale: Ratio
Description: Measurement denotes a degree in relation to a standard where a software entity manifests a chosen property (supports equivalence relations, known ratio of intervals, and known ratio of any two scalar values).
Examples: Time (interval), temperature (absolute), length.
37. Explain the COCOMO model in short.
The COCOMO (COnstructive COst MOdel) model is the most complete and
thoroughly documented model used in effort estimation. COCOMO is based on Boehm's
analysis of a database of 63 software projects (Boehm, 1981). The model provides
detailed formulas for determining the development time schedule, overall development
effort, effort breakdown by phase and activity, as well as maintenance effort. COCOMO
estimates the effort in person-months of direct labor. The primary effort factor is the
number of source lines of code (SLOC) expressed in thousands of delivered source
instructions (KDSI). These instructions include all program instructions, format
statements, and job control language statements. They exclude comments and
unmodified utility software. The COCOMO model relies on two assumptions. First, it is
linked to the classic waterfall model of software development. Second, good management
practices with no slack time are assumed. The model is developed in three versions of
different levels of detail: basic, intermediate, and detailed. We will discuss the first two of
them. Furthermore, the overall modeling process takes into account three classes of
systems:
1. Embedded: This class of systems is characterized by tight constraints, a changing
environment, and unfamiliar surroundings. Projects of the embedded type are novel
to the company and usually exhibit temporal constraints. Good examples of
embedded systems are real-time software systems (say, in avionics, aerospace,
medicine).
2. Organic: This category encompasses all systems that are small relative to project
size and team size, and have a stable environment, familiar surroundings, and
relaxed interfaces. These are simple business systems, data processing systems,
small software libraries.
3. Semidetached: The software systems falling under this category are a mix of those
of organic and embedded nature. Some examples of software of this class are
operating systems, database management systems, and inventory management
systems.
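The basic COCOMO effort and schedule formulas can be sketched as follows. The coefficients are Boehm's published values for the basic model; the 32-KDSI project size in the example is hypothetical:

```python
# Basic COCOMO (Boehm, 1981):
#   effort E = a * (KDSI)^b  person-months
#   time   T = c * E^d       months
COEFFICIENTS = {
    # mode: (a, b, c, d)
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kdsi, mode="organic"):
    """Return (effort in person-months, development time in months)."""
    a, b, c, d = COEFFICIENTS[mode]
    effort = a * kdsi ** b
    time = c * effort ** d
    return effort, time

# Example: a hypothetical 32-KDSI organic project.
effort, time = basic_cocomo(32, "organic")
print(f"Effort: {effort:.1f} PM, Schedule: {time:.1f} months")
```

Note how the exponent b grows from organic to embedded mode: for the same size, an embedded project is estimated to need considerably more effort.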
38. Give a short note on Delphi Cost Estimation model.
It helps to coordinate a process of gaining information and generating reliable estimates.
A series of steps is followed in the Delphi Cost estimation:
The coordinator presents each expert with a specification of the proposed project and
other relevant information.
The coordinator calls a group meeting where the experts discuss estimation issues.
The experts fill out estimation forms indicating their personal estimates of total project
effort and total development effort. The estimates are given in an interval format: each
expert provides the most likely value along with an upper and a lower bound.
The coordinator prepares and circulates a summary report indicating the group estimate
and the individual estimates.
The coordinator calls a meeting during which the experts discuss the current estimates.
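The interval estimates described above (lower bound, most likely value, upper bound) are commonly combined with a beta (PERT) weighting. A minimal sketch, assuming simple averaging across experts as the aggregation rule (the intervals below are hypothetical):

```python
def pert_estimate(low, likely, high):
    """Beta-distribution (PERT) expected value for one expert's interval."""
    return (low + 4 * likely + high) / 6

def group_estimate(expert_intervals):
    """Average the PERT estimates of all experts (illustrative aggregation)."""
    estimates = [pert_estimate(*interval) for interval in expert_intervals]
    return sum(estimates) / len(estimates)

# Three hypothetical experts' effort estimates, in person-months.
intervals = [(10, 14, 24), (12, 16, 20), (8, 15, 28)]
print(f"Group estimate: {group_estimate(intervals):.1f} person-months")
```

In a full Delphi round, estimates that deviate widely from the group value would be discussed in the next meeting and the forms filled out again.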
39.Explain in detail about the software process.
Software engineering is the establishment and use of sound engineering principles in
order to obtain, economically, software that is reliable and works efficiently on real
machines. This definition highlights:
The technical aspects of software quality.
The need for customer satisfaction and timely product delivery.
The importance of measurement and metrics.
The IEEE definition: the application of a systematic, disciplined, quantifiable approach to the
development, operation, and maintenance of software; that is, the application of
engineering to software, together with the study of such approaches.
Software engineering is a layered technology.
A Generic View of Software Engineering:
Engineering is the analysis, design, construction, verification, and management of
technical entities. Regardless of the entity to be engineered, the following questions must be
asked and answered:
What is the problem to be solved?
What characteristics of the entity are used to solve the problem?
How will the entity be realized?
How will the entity be constructed?
What approach will be used to uncover errors that were made in the design and
construction of the entity?
How will the entity be supported over the long term, when corrections, adaptations, and
enhancements are requested by users of the entity?
These questions apply equally to a single entity: computer software. To engineer software
adequately, a software engineering process must be defined. The following generic
characteristics of the software process are considered:
Formal technical reviews
Software configuration management
Document preparation and production
Reusability management
Measurement
Risk management
The software process:
A common process framework is established by defining a small number of framework
activities that are applicable to all software projects, regardless of their size or complexity. A
number of task sets (each a collection of software engineering work tasks,
project milestones, work products and quality assurance points) enable the framework
activities to be adapted to the characteristics of the software project and the
requirements of the project team.
The SEI Capability Maturity Model (CMM) defines five levels of process maturity:
Level 1 - Initial: The software process is characterized as ad hoc and occasionally even chaotic.
Few processes are defined, and success depends on individual effort.
Level 2 - Repeatable: Basic project management processes are established to track cost,
schedule, and functionality. The necessary process discipline is in place to repeat earlier
successes on projects with similar applications.
Level 3 - Defined: The software process for both management and engineering activities is
documented, standardized, and integrated into an organization-wide software process. All
projects use a documented and approved version of the organization's process for developing
and supporting software.
Level 4 - Managed: Detailed measures of the software process and product quality are
collected. Both the software process and products are quantitatively understood and controlled
using detailed measures.
Level 5 - Optimizing: Continuous process improvement is enabled by quantitative feedback from
the process and from testing innovative ideas and technologies.
The five levels defined by the SEI were derived as a consequence of evaluating responses to the
SEI assessment questionnaire that is based on the CMM. The results of the questionnaire are
distilled to a single numerical grade that provides an indication of an organization's process
maturity.
Each maturity level is characterized by key process areas (KPAs). Each KPA is described by:
Goals: the overall objectives that the KPA must achieve.
Commitments: requirements that must be met to achieve the goals or provide proof of intent to
comply with the goals.
Abilities: those things that must be in place to enable the organization to meet the commitments.
Activities: the specific tasks required to achieve the KPA function.
Methods for monitoring implementation: the manner in which proper practice for the KPA can
be verified.
The following KPAs should be achieved at each process maturity level:
Process maturity level 2
Software configuration management
Software quality assurance
Software subcontract management
Software project tracking and oversight
Software project planning
Requirements management
Process maturity level 3
Peer reviews
Intergroup coordination
Software product engineering
Integrated software management
Training program
Organization process definition
Organization process focus
Process maturity level 4
Software quality management
Quantitative process management
Process maturity level 5
Process change management
Technology change management
Defect prevention
40. Explain in detail about the life cycle process.
System engineering process follows a waterfall model for the parallel development of the
different parts of the system.
System requirements definition
Three types of requirements
i) Abstract functional requirements.
ii) System properties.
iii) Undesirable Characteristics.
System objectives
System requirements problems.
The system design process
Process steps
Partition requirements
Identify sub-systems.
Assign requirements to sub-systems.
Specify sub-system functionality.
Define sub-system interfaces.
Sub-system development process
Starts after system design.
May involve the use of COTS (Commercial Off-The-Shelf) systems.
System Integration
It is the process of putting hardware, software and people together to make a system.
System Installation
Issues are
Environmental assumptions may be incorrect.
There may be human resistance to the introduction of a new system.
System may have to coexist with alternative systems for some period.
There may arise some physical installation problems (e.g. cabling problem).
Operator training needs have to be identified.
System evolution
Large systems have long lifetimes. They must evolve to meet changed requirements.
The evolution may be costly.
Existing systems that must be maintained are sometimes called legacy systems.
System Decommissioning
Taking the system out of service after its useful lifetime is called system decommissioning.
41. Explain Software Prototyping. What are the various prototyping methods and tools?
Prototyping is effective when the customer resources are committed to the evaluation and
refinement of the prototype, and the customer is capable of making requirements decisions in a
timely fashion. Finally, the nature of the development project will have a strong bearing on the
efficacy of prototyping.
Prototyping Methods and Tools:
For software prototyping to be effective, a prototype must be developed rapidly so that
the customer may assess results and recommend changes. To conduct rapid prototyping, three
generic classes of methods and tools are available:
Fourth generation techniques:
Fourth generation techniques encompass a broad array of database query and reporting
languages, program and application generators, and other very high-level nonprocedural
languages. Because 4GTs enable the software engineer to generate executable code quickly, they
are ideal for rapid prototyping.
Reusable software components:
Another approach to rapid prototyping is to assemble, rather than build, the prototype by
using a set of existing software components. Melding prototyping and program component reuse
will work only if a library system is developed so that components that do exist can be cataloged
and then retrieved. It should be noted that an existing software product can be used as a
prototype for a new, improved competitive product. In a way, this is a form of reusability for
software prototyping.
Formal specification and prototyping environment:
Over the past two decades, a number of formal specification languages and tools have
been developed as a replacement for natural language specification techniques. Today,
developers of these formal languages are developing interactive environments that:
Enable an analyst to interactively create language-based specifications of a
system or software.
Invoke automated tools that translate the language-based specifications into
executable code.
Enable the customer to use the prototype executable code to refine formal
requirements.
42. Explain Functional and Behavioural Modelling .
Behavioural modelling is an operational principle for all requirements analysis methods.
The state transition diagram represents the behaviour of a system by depicting its states and the
events that cause the system to change state. STD indicates what actions are taken as a
consequence of a particular event.
A state is any observable mode of behaviour. States for a monitoring and control system for
pressure vessels might be the monitoring state, alarm state, pressure-release state, and so on.
Each of these states represents a mode of behaviour of the system. A state transition diagram
indicates how the system moves from state to state.
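The monitoring/alarm behaviour described above can be sketched as a small table-driven state machine. The state and event names here are illustrative, not taken from any specific system:

```python
# State transition table: (current_state, event) -> next_state.
# States and events are illustrative for a pressure-vessel monitor.
TRANSITIONS = {
    ("monitoring", "pressure_high"): "alarm",
    ("alarm", "operator_ack"): "monitoring",
    ("monitoring", "shutdown_cmd"): "off",
    ("alarm", "shutdown_cmd"): "off",
}

class PressureMonitor:
    def __init__(self):
        self.state = "monitoring"

    def handle(self, event):
        """Move to the next state; unknown events leave the state unchanged."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

monitor = PressureMonitor()
monitor.handle("pressure_high")   # monitoring -> alarm
monitor.handle("operator_ack")    # alarm -> monitoring
print(monitor.state)
```

Each entry in the table corresponds to one arrow in the state transition diagram: the event labels the arrow, and the action taken is the move to the next state.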
43. What are Entity Relationship Diagrams?
Entity Relationship Diagrams (ERDs) illustrate the logical structure of databases.
Entity Relationship Diagram Notations
Peter Chen developed ERDs in 1976. Since then Charles Bachman and James Martin
have added some slight refinements to the basic ERD principles.
Entity
An entity is an object or concept about which you want to store information.
Weak Entity
A weak entity is an entity that must be defined by a foreign key relationship with another
entity, as it cannot be uniquely identified by its own attributes alone.
Key attribute
A key attribute is the unique, distinguishing characteristic of the entity. For example, an
employee's social security number might be the employee's key attribute.
Multivalued attribute
A multivalued attribute can have more than one value. For example, an employee entity can have
multiple skill values.
Derived attribute
A derived attribute is based on another attribute. For example, an employee's monthly salary
is based on the employee's annual salary.
Relationships
Relationships illustrate the sharing of information between two entities in the database structure.
Cardinality
Cardinality specifies how many instances of an entity relate to one instance of another entity.
Ordinality is closely linked to cardinality. While cardinality specifies the occurrences of a
relationship, ordinality describes the relationship as either mandatory or optional. In other words,
cardinality specifies the maximum number of relationships and ordinality specifies the absolute
minimum number of relationships.
Recursive relationship
In some cases, entities can be self-linked. For example, employees can supervise other
employees.
44. Explain Structural Partitioning.
If the architectural style of a system is hierarchical, the program structure can be
partitioned both horizontally and vertically.
Horizontal Partitioning: Defines separate branches of the modular hierarchy for
each major program function. The simplest approach to horizontal partitioning
defines three partitions – input, data transformation (processing) and output.
Partitioning of the architecture horizontally provides a number of distinct benefits.
o Software that is easier to test.
o Software that is easier to maintain.
o Propagation of fewer side effects.
o Software that is easier to extend.
Control Modules: They are represented in a darker shade and are used to coordinate
communication between and execution of the functions.
Vertical Partitioning: It is often called factoring and suggests that control and work should be
distributed top-down in the program structure. Top-level modules should perform
control functions and do little actual processing work. Modules that reside low in the
structure should be the workers, performing all input, computation, and output tasks.
It can be seen that a change in a control module will have a higher probability of
propagating side effects to the modules that are subordinate to it. A change to a
worker module, given its low level in the structure, is less likely to cause the
propagation of side effects. Vertically partitioned structures are therefore less
susceptible to side effects when changes are made and will be more
maintainable.
45. Write in detail on Effective Modular Design.
Modularity has become an acceptable approach in all engineering disciplines.
Modular design reduces complexity, facilitates changes and results in easier
implementation by encouraging parallel development of different parts of a system.
Functional Independence: The concept of function independence is a direct
outgrowth of modularity, abstraction and information hiding. Functional
independence is achieved by developing modules with “single – minded” function
and an “aversion” to excessive interaction with other modules. Software with
independent modules is easier to develop because function may be
compartmentalized and the interfaces are simplified. Independent module is easier to
maintain and to test, because secondary effects caused by design or code
modification are limited, error propagation is reduced and reusable modules are
possible. Independence is measured using two qualities cohesion and coupling.
Cohesion: It is a measure of the relative functional strength of a module. It is a
natural extension of the information hiding concept. A cohesive module performs a
single task within a software procedure, requiring little interaction with procedures
being performed in other parts of a program. Cohesion may be represented as a
"spectrum". We always strive for high cohesion, although the mid-range of the
spectrum is acceptable. The scale of cohesion is nonlinear. At the low end of the
spectrum we encounter a module that performs a set of tasks that relate to each other
loosely. Such modules are termed coincidentally cohesive. A module that performs tasks
that are related logically is logically cohesive. When a module contains tasks that are
related by the fact that all must be executed within the same span of time, the module
exhibits temporal cohesion. When processing elements of a module are related and must
be executed in a specific order, procedural cohesion exists. When all processing elements
concentrate on one area of a data structure, communicational cohesion is present.
Coupling: It is a measure of interconnection among modules in a software structure.
Coupling depends on the interface complexity between modules, the point at which
entry or reference is made to a module, and what data pass across the interface. Data
coupling is the passing of simple data; a variation of it, in which a portion of a data
structure is passed, is called stamp coupling. Control coupling, in which a control flag is
passed between modules, is very common in most software designs. External coupling is
essential but should be limited to a small number of modules within a structure.
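The difference between data coupling and control coupling can be illustrated with a small, hypothetical example (the function names and the payroll scenario are invented for illustration):

```python
# Data coupling: only simple data items cross the interface.
def net_pay(gross, tax_rate):
    return gross * (1 - tax_rate)

# Control coupling: a flag passed by the caller selects which logic runs
# inside the callee, tying the caller to the callee's internal decisions.
def pay(gross, tax_rate, mode):
    if mode == "net":
        return gross * (1 - tax_rate)
    elif mode == "gross":
        return gross
    raise ValueError(f"unknown mode: {mode}")
```

Preferring the data-coupled form keeps the modules functionally independent: the caller of net_pay needs no knowledge of any internal decision made inside it.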
46. Write a note on Real – Time Software Design.
A real-time system is a software system whose correct functioning depends both on the
results produced by the system and on the time at which these results are produced.
Soft real-time system: A soft real-time system is a system whose operation is
degraded if results are not produced according to the specified timing requirements.
Hard real-time system: A hard real-time system is a system whose operation is
incorrect if results are not produced according to the timing specification.
Periodic stimuli: They occur at predictable time intervals. In real-time systems they are
usually generated by sensors associated with the system, which provide information
about the state of the system's environment. The responses are directed to a set of
actuators that control some equipment that in turn influences the system's environment.
Aperiodic stimuli: They occur irregularly. They are usually signalled using the computer's
interrupt mechanism. They may be generated either by the actuators or by the sensors.
They often indicate some exceptional condition, such as a hardware failure, that must
be handled by the system.
47. Discuss in detail the Transform Mapping.
Transform mapping is a set of design steps that allow a Data Flow Diagram (DFD)
with transform flow characteristics to be mapped into a specific architectural style.
Example - Safe Home security system: The product monitors the real world and
reacts to changes that it encounters. It also interacts with a user through a series of
typed inputs and alphanumeric displays.
Design Steps: The steps begin with a re-evaluation of work done during
requirements analysis and then move to the design of the software architecture.
o Step 1: Review the fundamental system model: It encompasses the level 0 DFD
and supporting information. The design steps begin with an evaluation of both
the System Specification and the Software Requirements Specification. Both
documents describe information flow and structure at the software interface.
o Step 2: Review and refine data flow diagrams for the software: Information
obtained from analysis model contained in the Software Requirements
Specification is refined to produce greater detail.
o Step 3: Determine whether the DFD has transform or transaction flow
characteristics: Information flow within a system can always be represented as
transform flow.
o Step 4: Isolate the transform center by specifying incoming and outgoing flow
boundaries: Incoming flow was described as a path in which information is
converted from external to internal form; outgoing flow converts data from internal to
external form. Incoming and outgoing flow boundaries are open to interpretation.
o Step 5: Perform "first-level factoring": Program structure represents a top-down
distribution of control. Factoring results in a program structure in which
top-level modules perform decision making and low-level modules perform most
input, computation, and output work. Middle-level modules perform some control
and do a moderate amount of work.
o Step 6: Perform "second-level factoring": It is accomplished by mapping
individual transforms (bubbles) of a DFD into appropriate modules within the
architecture. Beginning at the transform center boundary and moving outward
along incoming and outgoing paths, transforms are mapped into subordinate
levels of the software structure.
o Step 7: Refine the first-iteration architecture using design heuristics for
improved software quality.
48. Give detailed Notes of Transaction Mapping.
In many software applications, a single data item triggers one or a number of
information flows that affect a function implied by the triggering data item.
The data item is called a transaction.
Example: Safe Home software.
Design steps: The design steps for transaction mapping are similar and in some cases
identical to the steps for transform mapping.
o Step 1: Review the fundamental system model.
o Step 2: Review and refine data flow diagrams for the software.
o Step 3: Determine whether the DFD has transform or transaction flow
characteristics.
o Step 4: Identify the transaction center and the flow characteristics along each of
the action paths: The incoming path and all action paths must be isolated.
Boundaries that define the reception path and the action paths must be shown.
Each action path must be evaluated for its individual flow characteristics.
o Step 5: Map the DFD onto a program structure amenable to transaction processing:
Transaction flow is mapped into an architecture that contains an incoming branch
and a dispatch branch. The structure of the incoming branch is developed in
much the same way as in transform mapping. Starting at the transaction center,
bubbles along the incoming path are mapped into modules.
o Step 6: Factor and refine the transaction structure and the structure of each action
path.
o Step 7: Refine the first – iteration architecture using design heuristics for
improving software quality.
49. Explain the concept of SCM in detail.
SCM- Software Configuration Management.
SCM is an umbrella activity that is applied throughout the software process.
Configuration management is the art of coordinating software development to
minimize confusion.
It is the art of identifying, organizing and controlling modifications to the software
being built by a programming team.
SCM process: The goal is to maximize productivity by minimizing mistakes.
50. Explain the software testing strategies?
o Software testing is an important activity in the software development process.
Testing is the process of exercising the software product under pre-defined
conditions to check whether its behavior matches the expected behavior.
o The organization identifies and removes the defects before the product gets
released.
o There are different levels of testing: unit testing, integration testing, and
system testing.
o During unit testing, each component is tested by the developer to verify
that all its requirements are satisfied.
o Next, integration testing is done to combine the components, to verify the
data flow, and to test that all the functionality of each component is satisfied.
o Finally, system testing is done to confirm the flow and the behavior of the
system before the release of the product.
o Testing is done at the component level and by an independent test group.
o Though debugging is a different activity, it must be accommodated in any
testing strategy.
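A unit test at the component level, as described above, can be sketched with Python's built-in unittest module. The discount function and its expected values are hypothetical, chosen only to illustrate a normal case and an error case:

```python
import unittest

def discount(price, percent):
    """Component under test: apply a percentage discount to a price."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (100 - percent) / 100

class DiscountTest(unittest.TestCase):
    def test_normal_case(self):
        # The developer verifies the expected behavior for valid input.
        self.assertEqual(discount(200.0, 25), 150.0)

    def test_invalid_percent_rejected(self):
        # Invalid input must be rejected, not silently accepted.
        with self.assertRaises(ValueError):
            discount(200.0, 150)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTest)
unittest.TextTestRunner(verbosity=0).run(suite)
```

At the integration level, the same style of test would exercise several components together and verify the data flow between them.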
51. Discuss the validation testing?
o Validation is the process of evaluating a system or component during or at the end
of the development process to determine whether it satisfies specified
requirements.
o Validation is the quality control process.
o Defects are found and debugged during the testing process.
o Testing is done by a set of people within the software organization whose goal is to
uncover the defects in the product before it reaches the customer.
o So the purpose of validation is to find defects in a software product before its
release.
o Validation checks whether certain activities are carried out during the various phases
to confirm that the product is built as per specification.
o Validation comprises a set of activities.
o It is the process of producing a viable software product.
52. Discuss in detail about debugging process?
Debugging is not testing but always occurs as a consequence of testing. The debugging
process begins with the execution of a test case. Results are assessed and a lack of
correspondence between expected and actual performance is encountered. In many cases, the
non-corresponding data are a symptom of an underlying cause that is still hidden. The
debugging process attempts to match symptom with cause, thereby leading to error correction.
A few characteristics of bugs provide some clues:
The symptom and the cause may be geographically remote. That is the
symptom may appear in one part of a program, while the cause may actually
be located at a site that is far removed. Highly coupled program structures
exacerbate this situation.
The symptom may disappear when another error is corrected.
The symptom may actually be caused by non-errors (e.g., round-off inaccuracies).
The symptom may be caused by human error that is not easily traced.
The symptom may be a result of timing problems, rather than processing
problems.
It may be difficult to accurately reproduce input conditions
The symptom may be intermittent. This is particularly common in embedded
systems that couple hardware and software inextricably.
The symptom may be due to causes that are distributed across a number of
tasks running on different processors.
53. Give a detailed account of Regression Testing.
Each time a new module is added as part of integration testing, the software changes.
New data flow paths are established, new I/O may occur, and new control logic is invoked.
These changes may cause problems with functions that previously worked flawlessly. In the
context of an integration test strategy, regression testing is the re-execution of some subset of
tests that have already been conducted to ensure that changes have not propagated unintended
side effects.
In a broader context, successful tests (of any kind) result in the discovery of errors, and
errors must be corrected. Whenever software is corrected, some aspect of software configuration
(the program, its documentation, or the data that support it) is changed. Regression testing is the
activity that helps to ensure that changes (due to testing or for other reasons) do not introduce
unintended behavior or additional errors.
Regression testing may be conducted manually, by re-executing a subset of all test cases
or using automated capture/playback tools. Capture/playback tools enable the software engineer
to capture test cases and results for subsequent playback and comparison.
The regression test suite (the subset of tests to be executed) contains three different
classes of test cases:
A representative sample of tests that will exercise all software functions.
Additional tests that focus on software functions that are likely to be affected
by the change.
Tests that focus on the software components that have been changed.
As integration testing proceeds, the number of regression tests can grow quite large.
Therefore, the regression test suite should be designed to include only those tests that address
one or more classes of errors in each of the major program functions. It is impractical and
inefficient to re-execute every test for every program function once a change has occurred.
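The selection of an affected subset described above can be sketched as follows. The test names, the functions they exercise, and the tagging scheme are hypothetical; real suites often achieve the same effect with test markers or traceability matrices:

```python
# A hypothetical regression suite: each test case records which software
# functions it exercises, so a subset can be selected after a change.
SUITE = [
    {"name": "test_login_ok",      "functions": {"login"}},
    {"name": "test_report_totals", "functions": {"report", "billing"}},
    {"name": "test_billing_round", "functions": {"billing"}},
]

def select_regression_tests(changed_functions):
    """Return the names of tests exercising any changed function."""
    return [case["name"] for case in SUITE
            if case["functions"] & set(changed_functions)]

# After a change to the billing component, re-run only the affected tests.
print(select_regression_tests({"billing"}))
```

Only two of the three tests are selected here, illustrating why the regression suite stays manageable even as integration testing proceeds.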
54. Explain about a Taxonomy of CASE Tools.
CASE (Computer-Aided Software Engineering) tools can be classified
by function, by their role as instruments for managers or technical people, by their use in the
various steps of the software engineering process, by the environment architecture that supports
them, or even by their origin or cost.
Risk Analysis Tools:
Risk analysis tools enable a project manager to build a risk table by providing detailed
guidance in the identification and analysis of risks.
Project Management Tools:
A manager should use tools to collect metrics that will ultimately provide an
indication of software product quality.
Requirements Tracing Tools:
The typical requirements tracing tool combines human-interactive text evaluation,
with a database management system that stores and categorizes each system requirement that is
“passed” from the original RFP or specification.
Metrics and Management Tools:
Metrics and measurement tools focus on process, project and product
characteristics. Management-oriented tools capture project-specific metrics that provide an
overall indication of productivity or quality.
Documentation Tools:
Documentation production and desk-top publishing tools support nearly every
aspect of software engineering and represent a substantial “leverage” opportunity for all software
developers.
System Software Tools:
The CASE environment must accommodate high-quality network system
software, electronic mail, bulletin boards and other communication capabilities.
Quality Assurance Tools:
The majority of CASE tools that claim to focus on quality assurance are actually
metrics tools that audit source code to determine compliance with language standards. Other
tools extract technical metrics in an effort to project the quality of the software that is being built.
Prototyping Tools:
Screen painters enable a software engineer to define screen layout rapidly for interactive
applications. More sophisticated CASE prototyping tools enable the creation of a data design,
coupled with both screen and report layouts.