
Q1. Define quality.
Answer:
Quality means consistently meeting customer needs in terms of requirements, cost, service and delivery schedule.

Q2. Distinguish between software product quality and software process quality.
Answer:
Software product quality describes the attributes of the products of the software development process, whereas software process quality describes the attributes of the software development process itself.

Q3. Define software quality.
Answer:
Conformance to explicitly stated functional and performance requirements, explicitly documented development standards, and implicit characteristics that are expected of all professionally developed software.

Q4. Differentiate between error & fault.
(or)
Differentiate between software failure and software fault.
Answer:
Errors are the quality problems which are discovered before the software has been released to the customer.
Defects (faults) refer to the quality problems which are discovered after the software has been released to the customer.

Q5. What are the factors of software quality?
Answer:
Portability, efficiency, maintainability, functionality, usability, reliability.

Q6. Define quality of design.
Answer:
It refers to the characteristics that designers specify for an item.

Q7. Define quality of conformance.
Answer:
It is the degree to which the design specifications are followed during manufacturing.

Q8. What is a quality assurance plan?
Answer:
The SQA plan provides a road map for instituting software quality assurance. It serves as a template for the SQA activities that are instituted for each software project.

Q9. Discuss the features of software safety.
Answer:
- Identification of hazards
- Analysis of hazards
- Safety-related requirements.
Q10. List the goals of SQA.
Answer:
- Requirement quality
- Design quality
- Code quality
- Quality control effectiveness

Q11. How to measure software reliability?
Answer:
A measure of reliability is Mean-Time-Between-Failure (MTBF), where
MTBF = MTTF + MTTR
(Mean Time To Failure plus Mean Time To Repair).

Q12. How to measure availability?
Answer:
Availability is the probability that a program is operating according to requirements at a given point in time and is defined as,
Availability = [MTTF / (MTTF + MTTR)] x 100%

Q13. What are the different levels of testing?
Answer:
- Unit testing
- Integration testing
- System testing

Q14. Define verification and validation.
Answer:
Verification refers to the process of determining whether the coding phase has been correctly carried out.
Validation is the process of evaluating software at the end of software development to ensure compliance with the software requirements.

Q15. What is regression testing?
Answer:
The subsets of tests which are performed to nullify the effects of a software change.

Q16. What is alpha testing?
Answer:
It is a testing in which the complete version of the software is tested by the customer under the supervision of the developer.

Q17. Define debugging.
Answer:
Debugging is the process of removing the errors that cause failures. It happens as a result of testing.

Q18. Define system testing.
Answer:
The tests conducted to ensure that the software works correctly as part of the complete computer-based system.

Q19. Define integration testing.
Answer:
It concentrates on the collaboration of all units to form one single architecture and then implementing test cases to uncover bugs associated with the interfaces.

Q20. What is beta testing?
Answer:
Beta testing is a testing in which a version of the software is tested by the customer without the developer being present.
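As a small illustration of the reliability and availability formulas in Q11 and Q12, the following Python sketch computes MTBF and availability; the MTTF and MTTR figures are hypothetical, not taken from the text.

    def mtbf(mttf_hours, mttr_hours):
        # MTBF = MTTF + MTTR (Q11)
        return mttf_hours + mttr_hours

    def availability(mttf_hours, mttr_hours):
        # Availability = [MTTF / (MTTF + MTTR)] x 100% (Q12)
        return mttf_hours / (mttf_hours + mttr_hours) * 100.0

    # Assumed figures: 490 hours of operation between failures, 10 hours to repair.
    print(mtbf(490, 10))          # 500.0 hours
    print(availability(490, 10))  # 98.0 %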

Essay Questions with Answers


4.1 TESTING STRATEGIES
Q1. Discuss the need for testing a developed software.
Answer:
Software testing is a critical element of software quality assurance and represents the ultimate review of specification, design and coding.
Testing Objectives
According to Glen Myers the testing objectives are:
1. Testing is a process of executing a program with the intent of finding an error.
2. A good test case is one that has a high probability of finding an undiscovered error.
3. A successful test is one that uncovers an as-yet undiscovered error.
Testing Principles
Every software engineer must apply the following testing principles while performing software testing.
1. All tests should be traceable to customer requirements.
2. Tests should be planned long before testing begins.
3. The Pareto principle can be applied to software testing: 80% of all errors uncovered during testing will likely be traceable to 20% of all program modules.
4. Testing should begin "in the small" and progress toward testing "in the large".
5. Exhaustive testing is not possible.
6. To be most effective, testing should be conducted by an independent third party.
Why Testing is Important?
1. Generally, testing is a process that requires more effort than any other software engineering activity.
2. Testing is a set of activities that can be planned in advance and conducted systematically.
3. If it is conducted haphazardly, then only time will be wasted and, even worse, more errors may get introduced.
4. This may lead to many undetected errors in the system being developed. Hence performing testing by adopting systematic strategies is very much essential during the development of software.
Phases of Testing
The various testing activities are:
1. Test planning
The test plan or test script is prepared. These are generated from the requirements analysis document (for black box testing) and from the program code (for white box testing).
2. Test case design
The goal of test case design is to create a set of tests that are effective in testing.
3. Test execution
The test data is derived through various test cases in order to obtain the test results.
4. Data collection
The test results are collected and verified.
5. Effective evaluation
All the above test activities are performed on the software model and the maximum number of errors are uncovered.
Q2. Explain the strategic approach to software testing.
Answer:
Strategic Approach to Software Testing
A testing strategy provides a process that describes, for the developer, the quality analysts and the customer, the steps conducted as part of testing. It covers:
a) Test planning
b) Test case design
c) Test execution
d) Data collection
e) Effectiveness evaluation.
The strategic approach for software testing can be described as follows:
1. Just before starting the testing process, formal technical reviews must be conducted.
2. At the beginning, the various components of the system are tested; then gradually each interface is tested and thus the entire computer-based system is tested.
3. Different testing techniques can be applied at different points of time.
4. The developer of the software conducts testing.
5. Testing and debugging are different activities that must both be carried out in software testing.
6. Debugging also lies within any testing strategy.
Verification and Validation
i) Verification refers to the set of activities that ensure that software correctly implements a specific function.
ii) Validation refers to a different set of activities that ensure that the software that has been built is traceable to customer requirements.
iii) According to Boehm,
Verification: "Are we building the product right?"
Validation: "Are we building the right product?"
iv) Software testing is only one element of Software Quality Assurance (SQA). Verification and validation encompass activities such as:
- Formal technical reviews
- Quality and configuration audits
- Performance monitoring
- Feasibility study
- Documentation review
- Database review
- Algorithmic analysis
- Development testing
- Installation testing
Q3. Explain about the test strategies for conventional software.
Answer:
Test Strategies for Conventional Software
We begin by "testing-in-the-small" and move toward "testing-in-the-large". The various testing strategies for conventional software are:
1. Unit testing
2. Integration testing
3. Validation testing
4. System testing.
1. Unit Testing
In this type of testing, techniques are applied to detect the errors in each software component individually.
2. Integration Testing
It focuses on issues associated with verification and program construction as components begin interacting with one another.
3. Validation Testing
It provides assurance that the software meets all functional, behavioural and performance requirements (the validation criteria established during requirements analysis).
4. System Testing
In system testing, all system elements forming the system are tested as a whole.

Figure: Testing Strategy - the testing strategies map onto the development stages: unit testing corresponds to code, integration testing to design, validation testing to requirements, and system testing to system engineering.

Unit Testing
1. In unit testing the individual components are tested independently to ensure their quality.
2. The focus is to uncover errors in design and implementation.
3. The various tests that are conducted during the unit test are described below.
i) Module interfaces are tested for proper information flow into and out of the program.
ii) Local data are examined to ensure that integrity is maintained.
iii) Boundary conditions are tested to ensure that the module operates properly at the boundaries established to limit or restrict processing.
iv) All the basis (independent) paths are tested to ensure that all statements in the module have been executed at least once.
v) All error handling paths should be tested.

Figure: Unit Testing - for each module of the source program, the things to be tested are the module interface, local data structures, boundary conditions, independent paths and error handling paths.

4. Driver and stub software need to be developed to test incomplete software.
- The "driver" is a program that accepts the test data and prints the relevant results.
- The "stub" is a subprogram that uses the module interfaces and performs the minimal data manipulation if required.
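As an illustration of drivers and stubs, here is a minimal Python sketch; the module and function names are hypothetical, not taken from the text. The stub stands in for a component that is not yet implemented, while the driver feeds test data to the unit and prints the results.

    def tax_stub(amount):
        # Stub: replaces the real tax-calculation module; it only performs
        # the minimal data manipulation needed to let the unit run.
        return amount * 0.10

    def compute_invoice_total(amount, tax_fn=tax_stub):
        # Unit under test: depends on the tax module through its interface.
        return amount + tax_fn(amount)

    def driver():
        # Driver: accepts the test data and prints the relevant results.
        for amount, expected in [(100.0, 110.0), (0.0, 0.0)]:
            result = compute_invoice_total(amount)
            status = "PASS" if abs(result - expected) < 1e-9 else "FAIL"
            print("input =", amount, "expected =", expected, "got =", result, status)

    if __name__ == "__main__":
        driver()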
Integration Testing
A group of dependent components is tested together to ensure the quality of their integration unit.
The objective is to take unit-tested components and build a program structure that has been dictated by the software design.
The focus of integration testing is to uncover errors in:
a) Design and construction of the software architecture.
b) Integrated functions or operations at subsystem level.
c) Interfaces and interactions between components.
d) Resource integration and/or environment integration.
Integration testing can be carried out using two approaches:
1. Non-incremental integration
- Big bang (all parts are combined at once)
2. Incremental integration
- Top-down integration
- Bottom-up integration
- Regression testing
- Smoke testing
Validation Testing
The integrated software is tested based on the requirements to ensure that the desired product is obtained.
In validation testing the main focus is to uncover errors in:
- System input/output data
- System functions and information
- System interfaces with external parts
- User interfaces
- System behaviour and performance
Software validation can be performed through a series of black box tests.
System Testing
The system test is a series of tests conducted to fully exercise the computer-based system.
Various types of system tests are:
1. Recovery testing
2. Security testing
3. Stress testing
4. Performance testing

Q4. What is meant by black box testing? Explain in detail.
Answer:
Black Box Testing
1. It is also called functional or behavioural testing.
2. Black box testing methods focus on the functional requirements of the software.
3. It is not an alternative to white box testing; it uncovers a different class of errors than white box testing.
Black box testing uncovers the following types of errors:
1. Incorrect or missing functions
2. Interface errors
3. Errors in data structures
4. Performance errors
5. Initialization or termination errors
1. Equivalence Partitioning
It is a black box technique that divides the input domain into classes of data from which test cases can be derived. An ideal test case uncovers a class of errors that might otherwise require many arbitrary test cases to be executed before a general error is observed.
In equivalence partitioning, input conditions are evaluated for given equivalence classes. An equivalence class represents a set of valid or invalid states for input conditions.
Figure: Equivalence partitioning - the input set is divided into valid and invalid inputs which the system maps to the generated output.
Equivalence class guidelines can be given as below:
a) If an input condition specifies a range, one valid and two invalid equivalence classes are defined.
b) If an input condition requires a specific value, one valid and two invalid equivalence classes are defined.
c) If an input condition specifies a member of a set, one valid and one invalid equivalence class are defined.
2. Boundary Value Analysis (BVA)
It is done to check boundary conditions.
i) Boundary value analysis is a technique in which the elements at the edge of the domain are selected and tested.
ii) Using boundary value analysis, test cases are also derived from the output domain instead of focusing on input conditions only.
iii) It is a test case design technique that complements the equivalence partitioning technique.
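The guidelines above can be turned into a small test-value generator. The following sketch is only an illustration: it derives equivalence-class representatives and boundary values for a hypothetical integer field that must lie in the range 1..100.

    def equivalence_values(low, high):
        # One valid class (inside the range) and two invalid classes
        # (below and above the range), per guideline (a).
        return {
            "valid": (low + high) // 2,
            "invalid_below": low - 1,
            "invalid_above": high + 1,
        }

    def boundary_values(low, high):
        # BVA: exercise the edges of the domain and their immediate neighbours.
        return [low - 1, low, low + 1, high - 1, high, high + 1]

    # Hypothetical input condition: an integer field valid in the range 1..100.
    print(equivalence_values(1, 100))  # {'valid': 50, 'invalid_below': 0, 'invalid_above': 101}
    print(boundary_values(1, 100))     # [0, 1, 2, 99, 100, 101]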
3. Graph Based Testing
Figure: Graph Notations - objects are drawn as nodes; the links between them may be directed, bidirected, undirected or parallel, and both nodes and links can carry weights.
i) In graph based testing, a graph of the objects present in the system is created.
ii) The graph is basically a collection of nodes and links. Each node represents an object that participates in the software system, and the links represent the relationships among these objects.
iii) After creating the graph, the important objects and their relationships are tested.
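A minimal way to represent such an object graph in code and enumerate the relationships that deserve test cases is sketched below; the objects and link names are hypothetical, not from the text.

    # Object graph: node -> list of (relationship, target) links.
    object_graph = {
        "NewFileMenuSelect": [("generates", "DocumentWindow"),
                              ("allows editing of", "DocumentText")],
        "DocumentWindow":    [("contains", "DocumentText")],
        "DocumentText":      [],
    }

    # Each link is a relationship for which at least one test case is written.
    for source, links in object_graph.items():
        for relationship, target in links:
            print("test: verify that", source, relationship, target)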
Orthogonal Array Testing
There are many applications for which only a very small number of inputs is needed and the values each input can take are bounded. In such a situation the number of test cases is relatively small and manageable.
Orthogonal array testing is a testing method which can be applied to applications in which the input domain is relatively small.
Following are some important terminologies used in orthogonal testing methods.
1. Runs
This denotes the number of rows in the array. These can be directly translated to the test cases.
2. Factors
This denotes the number of columns in the array. These can be directly translated to the maximum number of variables that can be handled by the array.
3. Levels
This number denotes the maximum number of values that a single factor (column) can take.
4. L9 orthogonal array
This array is used to generate the test cases.
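For reference, a commonly used form of the L9(3^4) orthogonal array (9 runs, 4 factors, 3 levels each) can be written down directly and mapped to concrete test cases. The factor names and values below are hypothetical placeholders.

    # Standard L9(3^4) orthogonal array: 9 runs x 4 factors, levels 1..3.
    L9 = [
        (1, 1, 1, 1), (1, 2, 2, 2), (1, 3, 3, 3),
        (2, 1, 2, 3), (2, 2, 3, 1), (2, 3, 1, 2),
        (3, 1, 3, 2), (3, 2, 1, 3), (3, 3, 2, 1),
    ]

    # Hypothetical factors of an application, each with three candidate values.
    factors = {
        "browser": ["Chrome", "Firefox", "Edge"],
        "os":      ["Windows", "Linux", "macOS"],
        "locale":  ["en", "de", "fr"],
        "network": ["wifi", "4g", "wired"],
    }

    names = list(factors)
    for run, levels in enumerate(L9, start=1):
        case = {names[i]: factors[names[i]][level - 1] for i, level in enumerate(levels)}
        print("test case", run, ":", case)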
Q5. What is meant by white box testing? What are the aspects to be considered while generating white box test cases?
Answer:
White Box Testing
The procedural design is tested using a basis set of execution paths. This basis set ensures that every execution path will be tested at least once.
Flow Graph Notation
1. Basis path testing is a structural testing strategy. This method is intended to exercise every independent execution path of a program at least once.
Following are the steps that are carried out while performing path testing.
Step 1: Design the flow graph for the program or component.
Step 2: Calculate the cyclomatic complexity.
Step 3: Select a basis set of paths.
Step 4: Generate test cases for these paths.
Graph Matrices
Definition
A graph matrix is a square matrix whose size is equal to the number of nodes of the flow graph.
Example: consider the flow graph of a program that returns the largest of three values. The nodes are numbered (decision nodes are shown shaded in the figure):
1: if a > b; 2: temp = b; 3: temp = a; 4: if c > temp; 5: temp = c; 6: return temp.
The graph matrix is used for computing the cyclomatic complexity. The following steps are adopted:
Step 1: Create the graph matrix. Mark the corresponding entry as 1 if node A is connected to node B.
Step 2: Count the total number of 1's in each row and subtract 1 from each row total.

    Node   1  2  3  4  5  6    (1's in row) - 1
    1      -  1  1  -  -  -    2 - 1 = 1
    2      -  -  -  1  -  -    1 - 1 = 0
    3      -  -  -  1  -  -    1 - 1 = 0
    4      -  -  -  -  1  1    2 - 1 = 1
    5      -  -  -  -  -  1    1 - 1 = 0

Step 3: Add 1 to the sum of these values to obtain the cyclomatic complexity:
(1 + 0 + 0 + 1 + 0) + 1 = 2 + 1 = 3
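The same computation can be automated. The sketch below (illustrative only) encodes the flow graph above as an adjacency list and derives the cyclomatic complexity by the row-count rule just described.

    # Flow graph as an adjacency list: node -> nodes it connects to.
    flow_graph = {
        1: [2, 3],   # if a > b
        2: [4],      # temp = b
        3: [4],      # temp = a
        4: [5, 6],   # if c > temp
        5: [6],      # temp = c
        6: [],       # return temp
    }

    def cyclomatic_complexity(graph):
        # For each row of the graph matrix: (number of 1's) - 1, skipping
        # rows with no outgoing connections; then add 1 to the sum.
        row_terms = [len(targets) - 1 for targets in graph.values() if targets]
        return sum(row_terms) + 1

    print(cyclomatic_complexity(flow_graph))  # 3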

Summing the results of each row and adding 1 to the total gives the resultant value, the cyclomatic complexity.
3. Control Structure Testing
Structural testing is sometimes called white box testing. In structural testing the derivation of test cases is according to the program structure; hence knowledge of the program is used to identify additional test cases. The objective of structural testing is to exercise all program statements.
4. Condition Testing
Condition testing is used to test the logical conditions in a program module. The condition can be a Boolean condition or a relational expression. The condition is incorrect in the following situations:
i) A Boolean operator is incorrect, missing or extra.
ii) A Boolean variable is incorrect.
iii) A Boolean parenthesis is missing, incorrect or extra.
iv) There is an error in a relational operator.
v) There is an error in an arithmetic expression.

Q6. Describe the following terms with respect to validation testing:
a) Validation test criteria
b) Configuration review
c) Alpha and beta testing.
Answer:
At the end of software development, the product is usually delivered to the customer. The customer now adopts the product and checks that the software has been developed to his expectation. This scenario is referred to as validation testing, i.e., in this context, the customer is satisfied with the developed software.
a) Validation Test Criteria
We consider a validation test to be complete only after analysing that the given software satisfies all the requirements suggested by the customer prior to the initiation of the software development. The testing is carried out using two things:
i) Test plan
ii) Test procedure
The test plan and procedure are designed keeping the performance requirements, correct documentation, behavioural characteristics, usability, compatibility and functional requirements in focus.
b) Configuration Review
It ensures that the software configuration elements are appropriately developed and recorded; moreover, these elements should also contain sufficient information to support the later phases of the software life cycle. The configuration review is a most important element of software validation. It is also known as an "audit".
c) Alpha and Beta Testing
When the software is developed completely, suitably tested and thoroughly reviewed, it has then to be tested by the users, since the user may not be able to understand the instructions written or may not be comfortable with the inputs, even though those outputs comfort the developers. This form of testing is called acceptance testing.
(i) Alpha testing: During alpha testing, several users are called to the developer's site and are exposed to the product. Under the guidance of the software engineer, the users use the product; meanwhile, the software engineer notes all the possibilities, like the problems faced by the users during normal operations, the instructions that are difficult to understand, the operations generating bugs, etc.
(ii) Beta testing: During beta testing the product is delivered to the users. Here, in the absence of the software engineers, the product is tested and a report is prepared by the user.

Q7. Explain about system testing.
Answer:
It is a series of tests conducted to fully exercise the computer-based system.
Various types of system tests are:
1. Recovery testing
2. Security testing
3. Stress testing
4. Performance testing
The main focus of such testing is to test:
- System functions and performance.
- System reliability and recoverability (recovery test).
- System installation (installation test).
- System behaviour in special conditions (stress test).
- System user operations (acceptance test / alpha test).

- Hardware and software collaboration.
- Integration of external software and the system.
1. Recovery Testing
It is intended to check the system's ability to recover from failures. In this type of testing the software is forced to fail and then it is verified whether the system recovers properly or not.
For automated recovery, the data recovery, checkpoint mechanisms, reinitialization and restart are verified.
2. Security Testing
It verifies that the system's protection mechanisms prevent improper penetration or data alteration.
It also verifies that the protection mechanisms built into the system prevent intrusion such as unauthorized internal or external access or willful damage.
The system design goal is to make the penetration attempt more costly than the value of the information that would be obtained.
3. Stress Testing
It determines the break point of a system to establish the maximum service level.
In stress testing the system is executed in a manner that demands resources in abnormal quantity, frequency or volume.
A variation of stress testing is a technique called sensitivity testing.
4. Performance Testing
Performance testing evaluates the run time performance of the software, especially real time software.
In performance testing, resource utilization such as CPU load, throughput, response time and memory usage can be measured.

Q8. Explain in detail about the art of debugging.
Answer:
Debugging is a process of removal of a defect. It occurs as a consequence of successful testing.
The actual test results are compared with the expected results. The debugging process attempts to find the lack of correspondence between the actual and expected results. The suspected causes are identified, corrections are made, and additional tests or regression tests are performed to make the system work as per the requirements.
Figure: Debugging Process - execution of test cases produces results; when the actual results differ from the expected results, debugging starts, suspected causes are identified, corrections are performed, and additional tests and regression tests are conducted, generating further test cases.
Common approaches in debugging are:
1. Brute Force Method
The memory dumps and run-time traces are examined, and the program is loaded with write (print) statements to obtain clues to the error causes.
In this method the "let the computer find the error" approach is used.
This is the least efficient method of debugging.
2. Backtracking Method
This method is applicable to small programs. The source code is examined by looking backwards from the symptom to the potential causes of the error.
3. Cause Elimination Method
This method uses binary partitioning to reduce the number of locations where errors can exist.
Why is debugging so difficult?
Following are some reasons that reveal why debugging is so difficult.
1. The symptom of a bug may be present in one part (module) of the program while its cause lies in some other module of the program. Hence tracing out the location of the symptom becomes difficult.
2. Symptoms may be caused by errors made by software developers during the development process. Such symptoms are difficult to trace out.
3. The symptom may appear due to timing problems instead of processing problems.
4. If the developer corrects some error, then the symptom may disappear temporarily.
5. The symptom can appear if some inaccuracies in the program are simply rounded off.

Q9. Explain about regression testing.
Answer:
A well designed and well coded software, when subjected to some changes, may not work properly and requires testing to ensure its proper working without any error. This retesting of software is known as "regression testing".
Example
In the case of integration testing, new modules are added or old modules are removed from the software, which may change the I/O, control logic and data flow paths.
Regression testing is then necessary to ensure that these changes are acceptable and do not result in any errors.
The most important task to be considered in regression testing is to select the tests in the test suite. The tests selected should be optimal and must uncover the maximum number of errors from the major parts of the program.
Moreover, the regression test suite must have test cases for:
a) Modified software components.
b) Software functions that may get affected due to the modifications.
c) All software functions.
Regression testing is not limited to integration testing, as it is also carried out while the software maintenance process is in progress.
Advantages
1. It maintains both the quality and reliability of software.
2. Errors due to modification are captured.
3. The working of the software according to the requirements is ensured.
Disadvantages
1. Time consuming and expensive.
2. The test suite becomes huge with the progress of integration.

4.2 PRODUCT METRICS

Q10. Define software quality. Write short notes on direct metrics and indirect metrics.
Answer:
Software Quality
It is defined as "conformance to explicitly stated functional and performance requirements, explicitly documented development standards, and implicit characteristics that are expected of all professionally developed software".
Software quality fails when the following three conditions are not met:
1. Software requirements must be well understood before the software development process begins.
2. Similar to the explicit requirements, it is also essential to understand the implicit requirements of the software.
3. The set of development criteria has to be decided in order to specify the standards of the product.
McCall's Quality Factors
Generally, there are two classes of quality factors that affect the software quality:
1. Directly measured
2. Indirectly measured
Direct metrics include the measurement of cost and effort applied. They include lines of code produced, execution speed, memory size and the defects reported over some period of time.
Indirect metrics include the measurement of functionality, quality, complexity, efficiency, reliability, maintainability, usability, correctness and other abilities.
McCall, Richards and Walters have proposed that software quality factors can be classified into three categories, which can be illustrated by McCall's triangle model for quality factors.
Figure: McCall's quality factors triangle - product revision (maintainability, flexibility, testability), product transition (reusability, portability, interoperability) and product operation (correctness, reliability, usability, integrity, efficiency).
1. Correctness - the ability to fulfil the specification and customer requirements.
2. Reliability - the degree to which the software works as per the requirements.
3. Usability - the ability to accept valid input and interpret the output correctly.
4. Efficiency - the measure of the computing resources and time required by the program to perform its function.
5. Integrity - the controlling ability by which unauthorized use of the system can be prevented.
6. Maintainability - the ability to locate and fix the bugs in the software.
7. Flexibility - the extent to which it is allowed to make certain modifications in the program.
8. Testability - the ability to check that a function is working as per the requirement.
9. Reusability - the ability by which a particular component of the software can be reused by some other program in that software.
10. Portability - the ability of the software to work properly even if the environment gets changed (i.e. a change in hardware or software).
11. Interoperability - the ability of the system to work with other systems.
(Factors 1-5 relate to product operation, factors 6-8 to the ability to undergo change, and factors 9-11 to adaptability to new environments.)

Q11. Explain the framework for product metrics.
Answer:
Measures, Metrics and Indicators
Software measurement means deriving a numeric value for an attribute of a software product or process.
Measure
It is a quantitative indication of the extent, amount, dimension or size of some attribute of a product or process.
Metric
It is the degree to which a system, component or process possesses a given attribute. A software metric relates several measures.
Example: average number of errors found per review.
Indicator
An indicator is a metric or a combination of metrics that provides insight into the software process, the project or the product.
Direct Metrics
These refer to immediately measurable attributes. For example: lines of code, execution speed.
Indirect Metrics
These refer to aspects that are not immediately quantifiable or measurable. For example: the functionality of the program.
Faults
Faults are of two types, and those are errors and defects.
i) Errors
These are the faults that are found by the practitioners during software development.
ii) Defects
Defects are the faults found by the customers after release.
Measurement Principles
Roche suggested the following principles for the measurement process:
1. Formulation
2. Collection
3. Analysis
4. Interpretation
5. Feedback
Goal-Oriented Software Measurement
Goal/Question/Metric (GQM) oriented software measurement is a technique for identifying meaningful metrics for any part of the software process.
For applying this technique the following are the requirements:
1. Explicit measurement goals must be established, based on process activities or product characteristics.
2. Prepare a set of questions which will help to find out whether the measurement goals are met.
3. Identify well formulated metrics that will help to answer the prepared set of questions.
Attributes of Effective Software Metrics
Effective software metrics should have the following attributes:
1. Simple and computable
2. Empirically and intuitively persuasive
3. Consistent and objective
4. Consistent in the use of units and dimensions
5. Programming language independent
6. The metric should be an effective mechanism for high quality feedback.

Q12. Discuss about the metrics for the analysis model.
Answer:
Metrics for the analysis model are useful in estimating the project. In order to determine the metrics for the analysis model, the "size" of the software is used as a measure.
Function Point Model
1. The function point model is based on the functionality of the delivered application.
2. Function points are generally independent of the programming language used.
3. This method was developed by Albrecht in 1979.
4. Function points are derived using
a) countable measures of the software's information domain, and
b) assessments of the software complexity.
Data for the following information domain characteristics are collected:
1. Number of user inputs
Each user input which provides distinct application data to the software is counted.
2. Number of user outputs
Each user output that provides application data to the user is counted, e.g. screens, reports, error messages.
3. Number of user inquiries
An on-line input that results in the generation of some immediate software response in the form of an output.
4. Number of files
Each logical master file, i.e. a logical grouping of data that may be part of a database or a separate file, is counted.
5. Number of external interfaces
All machine-readable interfaces that are used to transmit information to another system are counted.
The organization needs to develop criteria which determine whether a particular entry is simple, average or complex. The weighting factors should be determined by observations or by experiments.

    Domain Characteristic           Count       Weighting factor            Weighted count
                                              Simple  Average  Complex
    Number of user inputs            __   x      3       4        6         = __
    Number of user outputs           __   x      4       5        7         = __
    Number of user inquiries         __   x      3       4        6         = __
    Number of files                  __   x      7      10       15         = __
    Number of external interfaces    __   x      5       7       10         = __
    Count total                                                             = __
The formula for calculating function points is
FP = count total x (0.65 + 0.01 x sum(Fi))
The count total is obtained from the table above. The Fi (i = 1 to 14) are the value adjustment factors (VAF), which depend on the answers provided to a set of questions such as:
1. Does the system need reliable backup and recovery?
2. Are data communications required?
3. Are there distributed processing functions?
4. Is performance of the system critical?
5. Will the system run in an existing, heavily utilized operational environment?
6. Does the system require on-line data entry?
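A minimal sketch of this computation follows; the domain counts and Fi ratings are hypothetical, and all five characteristics are assumed to be of average complexity.

    # Average-complexity weights taken from the table above.
    weights = {"inputs": 4, "outputs": 5, "inquiries": 4, "files": 10, "interfaces": 7}

    # Hypothetical counts for a small application.
    counts = {"inputs": 12, "outputs": 8, "inquiries": 5, "files": 4, "interfaces": 2}

    count_total = sum(counts[k] * weights[k] for k in counts)

    # Hypothetical answers to the 14 value adjustment questions, each rated 0..5.
    fi = [3, 4, 2, 5, 3, 4, 0, 1, 2, 3, 2, 4, 1, 3]

    fp = count_total * (0.65 + 0.01 * sum(fi))
    print(count_total, fp)   # 162 -> 162 * (0.65 + 0.37) = 165.24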

Advantages
1. This method is independent of programming languages.
2. It is based on data which can be obtained in the early stages of a project.
Disadvantages
1. This method is more suitable for business systems and can be developed for that domain.
2. Many aspects of this method are not validated.
3. The function point has no significant meaning in itself; it is just a numerical value.

Q13. Discuss about metrics for the design model.
Answer:
We need metrics for the design model for determining the measurement of design quality. These metrics guide the software design activity as the design evolves.
There are three design models:
1. Architectural design
2. Object oriented design
3. User interface design
1. Architectural Design
While determining architectural design metrics, the primary characteristics of the program architecture are considered. They do not focus on the inner working of the system.
Metrics by Card and Glass
The two scientists Card and Glass have suggested three design complexity measures:
a) Structural Complexity
It depends upon the fan-out of the modules. It can be defined as
S(k) = fout(k)^2
where fout(k) represents the fan-out of module k.
b) Data Complexity
It is the complexity within the interface of an internal module:
D(k) = v(k) / [fout(k) + 1]
where v(k) is the total number of input and output variables of module k.
c) System Complexity
It is the combination of the structural and data complexity:
C(k) = S(k) + D(k)
Metrics by Fenton
These are simple morphology metrics that are used to compare different architectures with the help of size, depth, width and the edge-to-node ratio:
Size = n + e, where n = number of nodes and e = number of edges
Depth = the longest path from the root to a leaf node
Width = the maximum number of nodes at any particular level
Edge-to-node ratio r = e/n
2. Object Oriented Design
Whitmire has suggested nine measurable characteristics of an object oriented design, and those are:
i) Size
It can be measured using several factors.
ii) Complexity
It is a measure representing how the classes of the design are interrelated with each other.
iii) Coupling
It is a measure stating the collaborations between classes, or the number of messages that can be passed between the objects.
iv) Completeness
It is a measure representing whether the design component covers all the requirements.
v) Cohesion
It is the degree to which the set of properties of a component works together to solve a particular problem.
vi) Sufficiency
It is a measure representing whether the design component covers the necessary requirements.
vii) Primitiveness
The degree to which the operations are simple; in other words, a measure of whether the operations are independent of one another.
viii) Similarity
The degree to which two or more classes are similar with respect to their functionality and behaviour.
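As an illustration of the Card and Glass measures and the Fenton morphology metrics, here is a sketch with a hypothetical module structure (the module names and variable counts are not from the text):

    # Hypothetical call structure: module -> modules it calls (fan-out),
    # plus the number of input/output variables v(k) of each module.
    fan_out = {"main": ["parse", "report"], "parse": ["validate"], "validate": [], "report": []}
    io_vars = {"main": 4, "parse": 6, "validate": 3, "report": 5}

    def structural(k):   # S(k) = fout(k)^2
        return len(fan_out[k]) ** 2

    def data(k):         # D(k) = v(k) / (fout(k) + 1)
        return io_vars[k] / (len(fan_out[k]) + 1)

    def system(k):       # C(k) = S(k) + D(k)
        return structural(k) + data(k)

    for k in fan_out:
        print(k, structural(k), round(data(k), 2), round(system(k), 2))

    # Fenton morphology metrics for the same structure.
    n = len(fan_out)                                        # number of nodes
    e = sum(len(callees) for callees in fan_out.values())   # number of edges
    print("size =", n + e, "edge-to-node ratio =", e / n)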
ix) Volatility
Due to changes in requirements or some other reasons, modifications in the design of an application may occur. Volatility is a measure that represents the probability of the changes that will occur.
3. User Interface Design Metrics
There are numerous methods for evaluating the user interface.
Layout Appropriateness
Andrew Sears has suggested the layout appropriateness metric for UI design. It requires a description of the sequence of actions performed on layout entities such as icons, menus, windows and so on.
Layout appropriateness is a metric used for computing the "cost" of all the transitions made by the user:
Cost = sum over all transitions in a layout of (frequency of the transition x cost of the transition)
Cohesion Metrics
UI cohesion can be defined as the relative connection of a piece of on-screen content to the other on-screen content. If the content on a screen relates well to the other content on that screen, UI cohesion for the screen is high. Kokol has proposed an empirical model of cohesion metrics for UI design.

Q14. Define the metrics for source code and also explain the metrics for testing.
Answer:
Halstead has proposed software science metrics in "Software Science". These metrics are based on:
1. Common sense
2. Information theory
3. Psychology.
In the proposed metrics the measures used are:
n1 = the number of distinct operators
n2 = the number of distinct operands
Paired block operators such as { ... }, begin ... end, or repeat ... until are treated as single operators.
The program length N can be defined as
N = n1 log2 n1 + n2 log2 n2
The program volume can be defined as
V = N log2(n1 + n2)
The program volume varies with the programming language used.
Let,
N1 = total count of all the operators
N2 = total count of all the operands
The program volume ratio L is
L = (2/n1) x (n2/N2)
Metrics for Testing
Halstead's metrics for estimating the testing effort are as given below.
The Halstead effort can be defined as
e = V / PL
where V is the program volume and PL is the program level.
The program level can be computed as
PL = 1 / [(n1/2) x (N2/n2)]
The percentage of overall testing effort allocated to a module = (testing effort of the specific module) / (testing effort of all the modules).

Q15. What are the metrics for maintenance? Explain.
Answer:
The stability of a software product is given by the software maturity index (SMI), a metric suggested by an IEEE standard.
It is given as follows:
SMI = (M - (A + C + D)) / M
Where,
M = number of modules in the current version
A = number of modules added in the current version
C = number of modules changed in the current version
D = number of modules deleted from the current version compared to the previous version
As the SMI approaches the value 1.0 the product becomes more and more stable. The SMI metric is used for planning the software maintenance activities.
Q16. What are the metrics for software quality? Explain.
Answer:
The goal of software engineering is to produce high quality software. To achieve this goal, software engineers must use effective methods along with modern tools while developing the software. Simply producing the software in this manner is not sufficient; it is also necessary to measure the quality of the software being developed. Basically the quality of software depends upon:
1. The requirements that describe the problem.
2. The design method used to produce the software.
3. The code that leads to an executable program.
4. The tests that are carried out in order to uncover the errors in the software.
The project manager evaluates the quality of the software project using the following factors:
1. Errors and defects in the software.
2. Quality metrics collected by each software engineer who is involved in the software development process.
Such an evaluation of software quality helps in improving quality assurance and control activities. Typically the following metrics are used for assessing software quality:
1. Work product errors per function point
2. Errors found per review hour
3. Errors found during testing
This error data is useful in computing the defect removal efficiency.
Q17. Define the quality measures.
Answer:
Following are the measures of software quality.
1. Correctness
It is the degree to which the software produces the desired functionality. Correctness can be measured as
Correctness = defects per KLOC
where a defect means a lack of conformance to requirements. Such defects are generally reported by the users of the program.
2. Integrity
It is basically the ability of the system to withstand attacks. Typically attacks are on programs, data and documents. There are two attributes associated with integrity:
a) Threat
b) Security
Integrity = (1 - threat) x (1 - security)
3. Usability
It means the user friendliness of the system, i.e. the ability of the system that indicates its usefulness.
Following are the characteristics that are useful for measuring usability:
a) The time required to become efficient with the system.
b) The skill required to learn the system.
c) The net increase in productivity after regular use of the system.
d) The user attitude towards the system.
4. Maintainability
It is the ability of the system to accommodate corrections made after encountering errors, changes in the environment, and adaptations of the system, in order to satisfy the user.
The metric used for maintainability is MTTC, i.e. mean time to change.
The MTTC can be defined as the time required to analyse the change request, design an appropriate modification, implement the change, test it and distribute the change to all users.
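A minimal sketch of the correctness and integrity measures, using the formulas exactly as stated above (all figures are hypothetical):

    # Correctness: defects per KLOC.
    defects_reported = 18
    kloc = 12.5                          # thousands of lines of code
    correctness = defects_reported / kloc
    print(round(correctness, 2), "defects/KLOC")

    # Integrity per the formula given above: (1 - threat) x (1 - security),
    # where threat and security are probabilities assumed for illustration.
    threat, security = 0.25, 0.95
    integrity = (1 - threat) * (1 - security)
    print(round(integrity, 4))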
Q18. Differentiate between function oriented and size oriented metrics.
Answer:
Size Oriented Metrics
1. Size-oriented metrics are direct measures of software.
2. They attempt to measure the size of the software.
3. They use lines of code as a normalization value.
4. They include effort (time), money spent, KLOC (thousands of lines of code), pages of documentation, errors and the people on the project.
5. They depend on the programming language used.
Function Oriented Metrics
1. Function-oriented metrics are indirect measures of software.
2. They focus on the functionality of the software, expressed as function points.
3. They are independent of the programming language used.

4.3 METRICS FOR PROCESS AND PRODUCT

Q19. Explain the metrics for software process and product.
Answer:
Process metrics are the set of process indicators that are used to improve software processes. Process metrics are collected over the complete software life cycle. The software process can be improved with the help of process metrics as follows:
Measure specific attributes of the process -> develop a set of meaningful metrics from these attributes -> using these metrics, build a strategy for process improvement.
In making improvements to any software system, there are three basic factors to consider: product, people and technology. These three are the major determinants of software cost, schedule, productivity and quality.
Figure: The process sits at the centre of a triangle connecting the product, the people and the technology, all embedded in the development environment and business conditions.
The people factor includes hiring the best people you can find, motivating them to do the best job, and training them in the skills needed to perform their jobs effectively.
The technology factor includes acquiring and installing tools that help automate development (e.g. Java, C++, Oracle).
The complexity of the product factor has a great impact on quality and team performance.
Q20. What is meant by DRE? How can it be assessed?
Answer:
While developing a software project, many work products such as the SRS and design documents are created. Along with these work products many errors may get generated. The project manager has to identify all these errors to deliver quality software.
Error tracking is a process of assessing the status of the software project.
The software team performs formal technical reviews to test the software being developed. In these reviews various errors are identified and corrected. Any errors that remain uncovered and are found in later tasks are called defects.
The defect removal efficiency can be defined as
DRE = E / (E + D)
where DRE is the defect removal efficiency, E is the number of errors and D is the number of defects.
The DRE represents the effectiveness of the quality assurance activities. The DRE also helps the project manager to assess the progress of the software project as it gets developed through its scheduled work tasks.
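A minimal sketch of the DRE computation (the error and defect counts are hypothetical):

    def dre(errors_before_release, defects_after_release):
        # DRE = E / (E + D)
        e, d = errors_before_release, defects_after_release
        return e / (e + d)

    # Assumed figures: 90 errors caught by reviews and testing, 10 defects found later.
    print(dre(90, 10))  # 0.9 -> 90% of the quality problems were removed before release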
During the error tracking activity the following metrics are computed:
1. Errors per requirements specification page, denoted E_req.
2. Errors per component at design level, denoted E_design.
3. Errors per component at code level, denoted E_code.
4. DRE - requirements analysis
5. DRE - architectural design
6. DRE - component level design
7. DRE - coding
These error tracking metrics can also be used to better target review and testing resources.
