Under the guidance of Mr. M. Jagadeesh, Assistant Professor, CSE Department, by M. Praveen Kumar (1221010121), M.Tech SE, IV Sem



Introduction
One approach is to determine a single compromise solution for a multi-objective problem using traditional methods.

Traditional methods require the decision-maker to have broad knowledge of the underlying problem.

The other general approach is to determine a set of solutions, i.e., a Pareto-optimal set.

Post-Pareto optimality analysis
Obtains a small subset of preferred solutions from the large Pareto-optimal set produced by any optimization method.
Helps the decision-maker identify the most preferred solution in multi-objective optimization problems.

Methods to reduce the Pareto-optimal solution set:
- Pruning by using non-numerical objective function ranking preferences.
- Pruning by using data clustering.

Proposed System
Combined ranking and Pareto analysis methods.
The solution of a multi-objective optimization problem involves three stages: (1) formulation, (2) search, and (3) decision-making.
Two methods for analysis:
- Pruning by using non-numerical objective function ranking.
- Pruning by using data clustering.

Required characteristics for reliability

- An SRGM mean value function can be viewed as the product of a positive constant (the expected total number of faults) and a fault-detection time distribution, H(t) = a · G(t).
- The fault detection rate must be finite.
- An SRGM can have limitations.

Some SRGMs:
- Weibull SRGMs
- Gamma SRGMs
- Lognormal and Inverse Weibull SRGMs
- An SRGM with decreasing fault detection rate

Weibull SRGMs
The fault detection rate per fault at any testing time is a constant, i.e., d(t) = b, so that H(t) = a[1 - exp(-bt)], (10) with G(t) = 1 - exp(-bt).
The Weibull distribution is given by G(t) = 1 - exp[-(t/θ)^κ].
It includes the exponential distribution as its special case (i.e., when κ = 1).
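The two distributions above can be evaluated directly as a quick numeric check (a minimal sketch; the function names and parameter values for a, b, theta, and kappa are illustrative, not taken from any fitted model):

```python
import math

def mean_value_exponential(t, a, b):
    # Expected number of faults detected by time t when the per-fault
    # detection rate is constant, d(t) = b: H(t) = a * (1 - exp(-b*t)).
    return a * (1.0 - math.exp(-b * t))

def weibull_cdf(t, theta, kappa):
    # Weibull fault-detection time distribution G(t) = 1 - exp(-(t/theta)**kappa);
    # kappa = 1 reduces it to the exponential distribution.
    return 1.0 - math.exp(-((t / theta) ** kappa))
```

With kappa = 1 and theta = 1/b the two expressions agree up to the constant a, which matches the special case noted above.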

Inverse Weibull distribution

The inverse Weibull distribution is given by G(t) = exp[-(θ/t)^κ].
It is also close to the exponential distribution when 1.3854 < κ < 1.9379.

Standard Approach: Weighted Sum of Objective Functions

Conventional approach: combine the objectives into a single weighted formula, e.g., Fitness = W1 * f1(U) + W2 * f2(U), normalizing the weights so that W1 + W2 = 1.
Limitations:
- The result depends on the chosen weights.
- Some Pareto-optimal solutions cannot be reached (e.g., on non-convex Pareto fronts).
- Multiple runs of the algorithm are required to get the whole picture.
- Weights are difficult to select because the cost functions may be on different scales.
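The scale-sensitivity limitation can be seen in a small sketch (the objectives and weights here are hypothetical, chosen only for illustration):

```python
# Weighted-sum scalarization: Fitness = W1*f1(U) + W2*f2(U), with W1 + W2 = 1.
def weighted_fitness(u, w1, w2, f1, f2):
    return w1 * f1(u) + w2 * f2(u)

# Two objectives on very different scales (e.g., cost vs. failure rate).
f1 = lambda u: 1000.0 / (1.0 + u)   # large-scale objective
f2 = lambda u: 0.01 * u             # small-scale objective

# With "equal" weights, the large-scale objective dominates the choice:
candidates = [1.0, 2.0, 5.0, 10.0]
best = min(candidates, key=lambda u: weighted_fitness(u, 0.5, 0.5, f1, f2))
```

Because f1 is several orders of magnitude larger than f2, the equal-weight result tracks f1 alone, which is why objectives should be scaled before weighting.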

The algorithm used to prune Pareto-optimal solutions:
1. Rank the objectives.
2. Scale the objectives.
3. Randomly generate weights based on the ranks using the weight function.
4. Sum the weighted objectives to form a single function.
5. Find the solution that yields the maximum (optimal) value.
6. Increment the counter corresponding to that solution by one.
7. Repeat Steps 3 to 6 numerous (several thousand) times.
8. Determine the pruned Pareto-optimal set, i.e., the solutions that have non-zero counter values (counter > 0).
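The steps above can be sketched as follows. This is an illustrative reading of the algorithm, not the original implementation: solutions are assumed pre-scaled to [0, 1] with larger values better, and the uniform weight draw (w1 in [0.5, 1] when objective 1 is ranked higher) is a simple stand-in for the method's rank-based weight function:

```python
import random

def prune_pareto(solutions, iterations=5000, seed=0):
    # `solutions`: list of (f1, f2) tuples, scaled to [0, 1], larger is better,
    # with objective 1 ranked more important than objective 2.
    random.seed(seed)
    counters = [0] * len(solutions)
    for _ in range(iterations):
        # Step 3: random weights consistent with the ranking (w1 >= w2).
        w1 = random.uniform(0.5, 1.0)
        w2 = 1.0 - w1
        # Step 4: sum the weighted objectives into a single function.
        scores = [w1 * f1 + w2 * f2 for f1, f2 in solutions]
        # Steps 5-6: find the maximizing solution and increment its counter.
        counters[scores.index(max(scores))] += 1
    # Step 8: keep only solutions with non-zero counters.
    return [s for s, c in zip(solutions, counters) if c > 0]
```

A solution that never wins under any sampled weight vector keeps a zero counter and is pruned from the set.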

Pruning by Using Data Clustering


In this method the decision-maker is not required to specify any objective function preferences.

The k-means clustering algorithm is used to group the solutions into clusters.

Determine the optimal number of clusters k, then partition the data into k clusters so that the members within a cluster are similar to one another.

The k-means algorithm:
1. Place k points into the space represented by the objects that are being clustered. These points represent the initial group centroids.
2. Assign each object to the group that has the closest centroid.
3. When all objects have been assigned, recalculate the positions of the k centroids.
4. Repeat Steps 2 and 3 until the centroids stabilize (no longer move).
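The four steps can be sketched as a plain implementation (a minimal illustration; in practice a library routine such as scikit-learn's KMeans would be used):

```python
import math
import random

def kmeans(points, k, max_iterations=100, seed=0):
    random.seed(seed)
    # Step 1: place k initial centroids (here: k random data points).
    centroids = random.sample(points, k)
    for _ in range(max_iterations):
        # Step 2: assign each object to the group with the closest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: math.dist(p, centroids[j]))
            clusters[nearest].append(p)
        # Step 3: recalculate the position of each centroid (mean of its
        # members); an empty cluster keeps its previous centroid.
        new_centroids = [
            tuple(sum(coord) / len(cl) for coord in zip(*cl)) if cl else centroids[j]
            for j, cl in enumerate(clusters)
        ]
        # Step 4: stop when the centroids no longer move.
        if new_centroids == centroids:
            break
        centroids = new_centroids
    return centroids
```

Applied to a pruned Pareto-optimal set, each resulting cluster groups solutions with similar objective trade-offs, so the decision-maker need only inspect one representative per cluster.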
