
Life-Cycle Costing

• System Development Life Cycle

• Life Cycle Cost Elements

• Parametric Cost Estimating

• Regression Concepts
– Simple Regression
– Multiple Regression
Designing for the Life Cycle

[Figure: life-cycle phases from Needs Analysis through Conceptual, Preliminary, and Detailed Design, Production, Product Use, and Phase-out/Disposal, with the cost activities in each. Early phases: market research, intelligence, threat analysis, cost targets, LCC targets, Design-to-Cost (DTC). Design phases: LCC analysis & modeling, LCC/DTC evaluations, predictions, analyses, reporting, LCC & DTC assessment. Production and use: cost data collection, field cost data, analyses & reporting, LCC assessment, documentation.]
Life-Cycle Cost Elements
• R&D
  – Planning
  – Needs/Market Analysis
  – Feasibility Studies
  – Product Research
  – Engineering Design
  – Documentation
  – Software
  – T&E
  – Management Functions
• Production/Construction
  – Fabrication, Assembly, Test
  – Facilities
  – Process Development
  – Production Operations
  – Quality Control
  – Initial Logistics Support
• O&M and Support
  – Field Operations
  – Product Distribution
  – Logistical Support
  – Maintenance
  – Customer Service
  – Data Collection
  – Facilities
  – System Modifications
  – Management
• Retirement and Disposal
  – Disposal of non-repairables
  – System/Product Retirement
  – Material Recycling
Life-Cycle Cost

• LCC = R&D + Production + O&M + Disposal


[Figure: over the life-cycle phases (Conceptual, Preliminary Design, Detail Design, Production, O&M/Support/Disposal), the percentage of life-cycle cost committed rises steeply in the early phases, the percentage of cost actually incurred rises mainly in the later phases, and the ease of change declines as the program progresses.]

Life Cycle Costing

[Figure: cumulative life-cycle cost plotted against years, built up from System R&D, Production, and Operation and Support cost streams.]
Parametric Cost Estimating
• Used for cost estimating in the early stages of a program

• Uses historical data to develop general relationships to key cost drivers (a top-level approach)
  – Called CERs (Cost Estimating Relationships)
  – Technical performance (e.g. speed)
  – Physical features (e.g. size, weight)

• Y = b0 + b1X1 + b2X2 + . . . . . + bkXk
  – b0 = constant term (fixed cost)
  – Xi = cost factors (speed, weight, mpg, thrust/weight ratio, etc.)
  – bi = coefficients of cost factors ($/mph, $/mile, $/ton, etc.)

• Requires experience and good databases/warehouses
  – Proprietary to the organization
  – Industry or trade association shared data
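As a concrete illustration, a linear CER of this form can be evaluated directly. This is a minimal sketch; the coefficients and cost-driver values are hypothetical, not calibrated values.

```python
# Minimal sketch of evaluating a linear CER of the form
# Y = b0 + b1*X1 + ... + bk*Xk.
# All coefficients and driver values below are hypothetical.

def cer_estimate(b0, coeffs, drivers):
    """Evaluate a linear cost estimating relationship."""
    return b0 + sum(b * x for b, x in zip(coeffs, drivers))

# Hypothetical CER: cost ($K) = 250 + 1.2*speed(mph) + 0.8*weight(tons)
cost = cer_estimate(250.0, [1.2, 0.8], [600.0, 40.0])
# 250 + 1.2*600 + 0.8*40 = 1002 ($K)
```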
Cost/Hour Estimates

Estimating Method         | Generic Type | WBS Relationship | Accuracy     | Time to Prepare
Parametric                | ROM*         | Top Down         | -25% to +75% | Days
Analogy                   | Budget       | Top Down         | -10% to +25% | Weeks
Engineering (Grass Roots) | Definitive   | Bottom Up        | -5% to +10%  | Months

* Rough Order of Magnitude


Regression Concepts
• Simple Regression
  – Y = β0 + β1X + ε
    • Y = dependent variable
    • X = independent variable
    • β0 = y-intercept (constant term)
    • β1 = slope

• Multiple Regression
  – Y = β0 + β1X1 + β2X2 + . . . . . + βkXk + ε
    • Y = dependent variable (e.g. cost)
    • Xi = independent variables (e.g. cost factors such as speed, weight, mpg, etc.)
    • β0 = constant term (e.g. fixed cost)
    • βi = coefficients of independent variables (e.g. $/mph, $/mile, $/ton, etc.)

• Requires experience and good databases/warehouses
  – Proprietary to the organization
  – Industry or trade association shared data
Regression Assumptions

• Y is normal for a given X, with mean E(Y|X) = β0 + β1X

• Variance of Y (σ²) is constant for all values of X

• For a given X, the Y values are independent


Least Squares Estimators
• Objective is to minimize the sum of squared errors (SSE) about the regression line, where SSE = Σ(yi – ŷi)²
• Sums of Squares
  – SSxx = Σ(xi – x̄)²
  – SSyy = Σ(yi – ȳ)²
  – SSxy = Σ(xi – x̄)(yi – ȳ)
  – To minimize SSE, take partial derivatives with respect to β0 and β1
  – Solving the two normal equations for β0 and β1 yields the estimates:
    • b0 = ȳ – b1x̄ and b1 = SSxy / SSxx
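The closed-form estimators above can be computed directly. A minimal pure-Python sketch, using small made-up data:

```python
# Sketch of the least-squares estimators b1 = SSxy/SSxx and
# b0 = y-bar - b1*x-bar. The data points are made up.

def least_squares(xs, ys):
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    ss_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    ss_xx = sum((x - x_bar) ** 2 for x in xs)
    b1 = ss_xy / ss_xx          # slope
    b0 = y_bar - b1 * x_bar     # intercept
    return b0, b1

b0, b1 = least_squares([1, 2, 3, 4], [2, 4, 6, 8])  # exactly linear: b0 = 0, b1 = 2
```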

Partitioning the Total Sum of Squares (SST)
• SST represents the sum of squared deviations of each y-value around the mean of the y-values (ȳ). Thus SSyy = SST.

• The fundamental equation of regression is:
  SST = SSR + SSE

• SSE is the deviation between the actual y-values and their corresponding estimated values from the regression line (ŷi):
  SSE = Σ(yi – ŷi)²

• SSR is the deviation between the estimated values from the regression model (ŷi) and the mean of the y-values (ȳ):
  SSR = Σ(ŷi – ȳ)²
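The partition SST = SSR + SSE can be verified numerically for any least-squares fit. A sketch with made-up data:

```python
# Sketch verifying SST = SSR + SSE for a least-squares fit.
# The five data points are made up for illustration.

def partition(xs, ys):
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    b1 = (sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
          / sum((x - x_bar) ** 2 for x in xs))
    b0 = y_bar - b1 * x_bar
    fitted = [b0 + b1 * x for x in xs]
    sst = sum((y - y_bar) ** 2 for y in ys)          # total
    sse = sum((y - f) ** 2 for y, f in zip(ys, fitted))  # unexplained
    ssr = sum((f - y_bar) ** 2 for f in fitted)      # explained
    return sst, ssr, sse

sst, ssr, sse = partition([1, 2, 3, 4, 5], [2, 3, 5, 4, 6])
# sst = 10.0, ssr = 8.1, sse = 1.9, and sst == ssr + sse
```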
Graphical Illustration of
Sum of Squares Partitioning

[Figure: for an individual y-value (yi), the total deviation (yi – ȳ) about the mean ȳ splits into the deviation explained by the regression model (ŷi – ȳ) and the unexplained random error (yi – ŷi), shown relative to the fitted line ŷ = b0 + b1x.]
ANOVA Table
Source of Variation | Sum of Squares | Degrees of Freedom | Mean Square         | F-Ratio
Regression          | SSR            | k                  | MSR = SSR / k       | F = MSR / MSE
Error               | SSE            | n – k – 1          | MSE = SSE / (n–k–1) |
Total               | SST            | n – 1              |                     |

k = # of independent variables
n = # of observations
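The F-ratio in the table follows directly from the mean squares. A minimal sketch; the SSR, SSE, k, and n values below are hypothetical:

```python
# Sketch of the ANOVA F-ratio: F = MSR / MSE.
# Input sums of squares and sample sizes are hypothetical.

def anova_f(ssr, sse, k, n):
    msr = ssr / k               # MSR = SSR / k
    mse = sse / (n - k - 1)     # MSE = SSE / (n - k - 1)
    return msr / mse

f = anova_f(8.1, 1.9, 1, 5)     # = 8.1 / (1.9 / 3), roughly 12.8
```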
Basics of Statistical Inference
Hypothesis Testing

• Null Hypothesis
– The “no difference” case (nothing has changed)
– The hypothesis we are trying to disprove
– Stated as
  • H0: μ = μ0, where μ0 is the current condition
• Alternative Hypothesis
  – The “difference” case (things have changed)
  – The condition we believe to be true if we can disprove H0
  – Stated as
    • Ha: μ ≠ μ0
Basics of Statistical Inference
Test Statistics
• There are four basic test statistics
– Z (Normal Distribution)
• Used for means of large samples

– t (t-distribution)
• Used for means of small samples
• t approaches Z as n gets large (around 30)

– χ² (Chi-square distribution)
• Used for a single variance

– F (F-distribution, named for R. A. Fisher)


• Used for the ratio of two variances
• This is the F-test in the ANOVA Table
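For instance, the large-sample Z statistic for a mean takes the standard form (x̄ − μ0)/(σ/√n); a minimal sketch with hypothetical numbers:

```python
import math

# Sketch of a one-sample z statistic for a mean with known sigma
# (large-sample case). The sample values below are hypothetical.

def z_statistic(x_bar, mu0, sigma, n):
    return (x_bar - mu0) / (sigma / math.sqrt(n))

z = z_statistic(52.0, 50.0, 5.0, 100)   # (52 - 50) / (5 / 10) = 4.0
```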
Basics of Statistical Inference
Possible Errors

• There are two types of errors one could make in a hypothesis test

• Type I Error
  – Rejecting a true null hypothesis
  – P(Type I Error) = α

• Type II Error
  – Not rejecting a false null hypothesis
  – P(Type II Error) = β
Basics of Statistical Inference
An Example – Trial by Jury

• Assume there is a criminal trial and you have


been selected to serve on the jury.

• The hypotheses are:


– Ho : Defendant is Innocent
– Ha : Defendant is Guilty

• The decision is always about Ho, not Ha
  – We either “Reject” or “Do Not Reject” Ho
Basics of Statistical Inference
An Example – Trial by Jury
• Verdicts and possible errors

                           True Condition
                     Ho: Innocent       Ha: Guilty
Jury     Guilty      Type I Error       Correct Decision
Verdict  Not Guilty  Correct Decision   Type II Error

Note that the Jury never gives a verdict of “Innocent”, rather it is either
“Guilty” or “Not Guilty”. The “Not Guilty” verdict means simply
“Do Not Reject Ho” as there was not sufficient evidence to the contrary.
Basics of Statistical Inference
Statistical Tables and Critical Values

• The significance level of a test is simply the probability of making a Type I Error, i.e. α

• Generally, α is set to 10%, 5%, or 1%, with 5% being the most common. It depends on how much risk one is willing to take of committing a Type I Error.

• As α decreases, β increases

• The “critical value” of the test statistic is the value which leads to rejection of Ho. These values are found in the appropriate statistical table and are dependent on the sample size for the t, chi-square, and F.
Assessing Model Adequacy

• Simple Linear Regression


– Hypotheses
  • H0: β1 = 0 (no relationship)
  • Ha: β1 ≠ 0 (relationship exists)
  • Test statistic (t or F)
• Multiple Linear Regression
  – Hypotheses
    • H0: β1 = β2 = …. = βk = 0 (no relationship)
    • Ha: At least one βi ≠ 0 (relationship exists)
    • Test statistic (F)
• The ANOVA table carries out this F test
Coefficient of Determination
• Measures the explanatory power of the regression model
• The percentage of the variability in the dependent
variable (y) that can be explained by the independent
variables (xi)
• The higher the percentage of variation that is explained,
the smaller the error, the closer the data points are to the
line
• High r2 values do not imply cause-and-effect

r² = (Explained Variation) / (Total Variation) = SSR / SST
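For simple regression, r² can be computed from the sums of squares directly, using the standard identity SSR = SSxy²/SSxx. A sketch with made-up data:

```python
# Sketch of r^2 = SSR / SST for simple regression, where
# SSR = SSxy^2 / SSxx (standard identity). Data are made up.

def r_squared(xs, ys):
    n = len(xs)
    x_bar = sum(xs) / n
    y_bar = sum(ys) / n
    ss_xx = sum((x - x_bar) ** 2 for x in xs)
    ss_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    sst = sum((y - y_bar) ** 2 for y in ys)
    return (ss_xy ** 2 / ss_xx) / sst

r2 = r_squared([1, 2, 3, 4, 5], [2, 3, 5, 4, 6])   # = 8.1 / 10 = 0.81
```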
Estimation and Prediction
• Confidence Intervals
– Used to estimate the mean of y for a given x, i.e.
E(Y/x) with upper and lower confidence limits for each
value of x. See text on regression for formulas.

• Prediction Intervals
– Used to predict an individual value of the random
variable y for a given x value with upper and lower
prediction limits. This prediction interval will be wider
than the confidence interval. See text on regression
for formulas.
Residual Analysis

• Create a series of residual (error) plots to


test the regression assumptions
– Normal plot of residuals to test the normality
assumption
– Histogram of residuals (if sufficient data) to
check normality assumption
– Ordered chart of residuals to test for
independence
– Plot of residuals vs fitted values to test for
constant variance
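Each of those plots starts from the same raw material: the residuals ei = yi − (b0 + b1xi). A minimal sketch; the data and fitted coefficients are made-up values:

```python
# Sketch: compute the residuals e_i = y_i - (b0 + b1*x_i) from a
# fitted line, the raw input to the residual plots listed above.
# Data points and the coefficients b0 = 1.3, b1 = 0.9 are made up.

def residuals(xs, ys, b0, b1):
    return [y - (b0 + b1 * x) for x, y in zip(xs, ys)]

res = residuals([1, 2, 3, 4, 5], [2, 3, 5, 4, 6], 1.3, 0.9)
# for a least-squares fit, the residuals sum to (essentially) zero
```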
Qualitative Variables
• Sometimes called “dummy” variables
• Can be included in a regression model as
  – Xi = 1 if included, Xi = 0 if not included

• If a qualitative variable has k categories, then k – 1 dummy variables are required, as the category not included is the default case
• Creates parallel response lines which differ by a constant
  – The constant represents the inclusion/exclusion of the qualitative variable
Multicollinearity
• Exists when two or more of the independent variables
are highly correlated
• Can result in significant rounding errors
• Can result in incorrect signs of the regression
coefficients
• Dealing with multicollinearity
– Examine the correlation matrix
– Use stepwise regression
– Look at the Variance Inflation Factor (VIF) for the independent variables
VIF

• VIFj = 1 / (1 – Rj²), where Rj² is the coefficient of determination from regressing Xj on the other independent variables

• If VIFj > 10, there is too much correlation between Xj and the other Xi's, and Xj can be removed from the model.

• If VIFj > 5, some authors believe the correlation between Xj and the other Xi's may cause multicollinearity problems, and the variable should be considered for removal.
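The formula and the two rules of thumb can be sketched together; the Rj² values below are hypothetical (in practice they come from regressing Xj on the remaining predictors):

```python
# Sketch of the VIF formula and thresholds. r2_j would come from
# regressing X_j on the remaining independent variables; the values
# used below are hypothetical.

def vif(r2_j):
    return 1.0 / (1.0 - r2_j)

def vif_flag(v):
    if v > 10:
        return "remove"            # severe multicollinearity
    if v > 5:
        return "consider removal"  # possible multicollinearity
    return "ok"

vif_flag(vif(0.95))   # VIF = 20 -> "remove"
vif_flag(vif(0.50))   # VIF = 2  -> "ok"
```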
Influence Analysis and Cook’s Di

• Is a particular observation unduly affecting the model?

• Calculate the Di's using Minitab

• Cook suggests that if Di > F with (k + 1) and (n – k – 1) degrees of freedom at the 0.50 significance level, the observation is highly influential and is a candidate for removal
