Software Quality and Reliability Digital Assignment - 2
Software quality metrics fall into four categories:
Quality
Timetable
Effectiveness
Productivity
Process metrics
Process metrics are measures of the software development process, such as error density and timetable observance. Their intent is to provide indicators that lead to long-term software process improvement.
Types:
Software volume measures: some density metrics use the number of lines of code, while
others apply function points.
Error-count measures: some relate to the plain number of errors, others to a weighted
number of errors.
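The distinction above can be sketched in a few lines. This is a minimal illustration, not from the assignment text: the severity weights and record fields are invented for the example, and KLOC (thousands of lines of code) is assumed as the volume measure.

```python
# Illustrative sketch: plain vs. weighted error density, normalized by KLOC.
# The severity scale (low=1, medium=3, high=9) is an assumed example.

SEVERITY_WEIGHTS = {"low": 1, "medium": 3, "high": 9}

def error_density(errors, kloc):
    """Plain error density: errors per thousand lines of code."""
    return len(errors) / kloc

def weighted_error_density(errors, kloc):
    """Weighted error density: severity-weighted error count per KLOC."""
    weighted = sum(SEVERITY_WEIGHTS[e["severity"]] for e in errors)
    return weighted / kloc

errors = [
    {"id": 1, "severity": "low"},
    {"id": 2, "severity": "high"},
    {"id": 3, "severity": "medium"},
]
print(error_density(errors, kloc=12.0))           # 3 errors / 12 KLOC = 0.25
print(weighted_error_density(errors, kloc=12.0))  # (1+9+3) / 12 ≈ 1.083
```

The same functions could divide by function points instead of KLOC; only the denominator changes.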
The TTO (Time Table Observance) and ADMC (Average Delay of Milestone Completion) metrics are based on data for all relevant milestones scheduled in the project plan. In other words, only milestones designated for completion in the project plan stage are considered in the metrics' computation. These metrics can therefore be applied throughout development and need not wait for the project's completion.
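A hedged sketch of the two timetable metrics, assuming the common textbook definitions: TTO is the fraction of planned milestones completed on time, and ADMC is the total delay divided by the number of milestones (treating early completion as zero delay, which is an assumption here). Only milestones already scheduled in the project plan appear in the input, which is why the metrics work mid-project.

```python
# delays[i] is the completion delay in days for planned milestone i;
# zero or negative means on time or early.

def tto(delays):
    """Time Table Observance: fraction of milestones completed with no delay."""
    return sum(1 for d in delays if d <= 0) / len(delays)

def admc(delays):
    """Average Delay of Milestone Completion; early finishes count as zero."""
    return sum(max(d, 0) for d in delays) / len(delays)

# Four planned milestones: on time, 5 days late, 2 days early, 3 days late.
delays = [0, 5, -2, 3]
print(tto(delays))   # 2 of 4 on time -> 0.5
print(admc(delays))  # (0 + 5 + 0 + 3) / 4 = 2.0
```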
Product metrics
Product metrics refer to the software system's operational phase: years of regular use of the software system by customers, whether "internal" or "external", who either purchased the software system or contracted for its development. In most cases, the software developer is required to provide customer service during the software's operational phase.
HD (Help Desk) quality metrics
HD calls density metrics: the extent of customer requests for HD services, as measured by the
number of calls.
HD success metrics: the level of success in responding to these calls. A success is achieved by
completing the required service within the time determined in the service contract.
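The two HD metric families above can be sketched as follows. This is an illustrative example, not the assignment's own formulas: normalizing call density by KLOC and measuring success against a contracted response time in hours are assumptions made for the sketch.

```python
# Illustrative Help Desk metrics; size measure (KLOC) and time unit assumed.

def hd_calls_density(num_calls, kloc):
    """HD calls per thousand lines of maintained code."""
    return num_calls / kloc

def hd_success_rate(service_hours, contracted_hours):
    """Fraction of calls serviced within the contracted time."""
    on_time = sum(1 for hours in service_hours if hours <= contracted_hours)
    return on_time / len(service_hours)

service_hours = [2.0, 5.5, 1.0, 8.0]   # time taken to close each call
print(hd_calls_density(4, kloc=40.0))  # 4 calls / 40 KLOC = 0.1
print(hd_success_rate(service_hours, contracted_hours=6.0))  # 3 of 4 -> 0.75
```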
Classifications:
Error-count measures: some relate to the number of errors, others to a weighted number of errors.
Software volume measures: some use the number of lines of code, while others apply function points.
Explain why statistical analysis methods are required for software quality metric
results.
Statistics, as a branch of science, incorporates data acquisition, data interpretation, and data validation. Statistical data analysis is the practice of applying statistical operations to data: quantitative research that attempts to quantify data and draw conclusions from it. Here, quantitative data typically includes descriptive data such as survey data and observational data. In the context of business applications, it is a crucial technique for business intelligence organizations that need to operate on large data volumes.
The basic goal of statistical data analysis is to identify trends. In the retail business, for example, it can be used to uncover patterns in unstructured and semi-structured consumer data, which in turn support better decisions for enhancing the customer experience and increasing sales. Beyond that, statistical data analysis has applications in market research, business intelligence (BI), big-data analytics, machine learning and deep learning, and financial and economic analysis.
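"Identifying a trend" can be made concrete with a least-squares slope over a series of observations. The monthly sales figures below are invented purely for the example.

```python
# Minimal sketch: the sign of the ordinary-least-squares slope tells us
# whether a series is trending up or down.

def trend_slope(ys):
    """OLS slope of ys regressed against 0, 1, 2, ..."""
    n = len(ys)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    return cov / var

monthly_sales = [100, 104, 103, 110, 112, 118]  # invented example data
print(trend_slope(monthly_sales) > 0)  # True: sales trend upward
```

In practice a statistical package (SAS, SPSS, etc.) would also report confidence intervals for such a slope, which is part of why dedicated tools are preferred over hand computation.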
Data is of two types: continuous and discrete. Continuous data cannot be counted and changes over time, e.g. the intensity of light or the temperature of a room. Discrete data can be counted and takes a countable set of values, e.g. the number of bulbs or the number of people in a group.
In statistical data analysis, continuous data is described by a continuous distribution function, also known as a probability density function, and discrete data by a discrete distribution function, also termed a probability mass function.
Data can also be either qualitative or quantitative. Qualitative data are labels or names used to identify a characteristic of each element, whereas quantitative data are numbers that indicate either how much or how many.
Cross-sectional and time-series data are also important in statistical data analysis. Cross-sectional data are collected at the same, or approximately the same, point in time, whereas time-series data are gathered across several time periods.
Statistical data analysis generally relies on statistical analysis tools that a layperson cannot use effectively without statistical knowledge. Various software packages are available for this purpose, including the Statistical Analysis System (SAS), the Statistical Package for the Social Sciences (SPSS), StatSoft, and many more.
These tools offer extensive data-handling capabilities and several statistical analysis methods, able to examine anything from a small data sample to very comprehensive statistics. Although computers play an important role in statistical data analysis, chiefly by summarizing data, the discipline ultimately concentrates on interpreting the results in order to draw inferences and make predictions.