
AMM 2009

OMDEC copyright

CARBONES DEL CERREJÓN LIMITED

Advanced Maintenance Masterclass

Assessments and Benchmarking

Ben Stevens - OMDEC Inc


www.omdec.com
Ben@omdec.com
1
AMM 2009
OMDEC copyright

BP Tools – how to assess your current practices
• #1 Maintenance Excellence Cube
– A framework for assessing…
• where we are
• where we want to get to
• and making sure we miss nothing vital

2
AMM 2009
OMDEC copyright

Introducing … The Excellence Cube
[Diagram: the Excellence Cube. Face: Strategy, Management, Leadership; Data, Materials, Tactics, Measures, Work Control; Reliability, Autonomous Maintenance, Continuous Improvement. Axes: Technology and Processes; People and Resources. Depth: Mine, Process, Profit, Growth, etc.]


3
AMM 2009
OMDEC copyright

The Excellence Cube
[Diagram: the same cube, here with the depth axis showing CMMS, Cost Analysis, CBM, Data Collection, etc.]

4
AMM 2009
OMDEC copyright

Umbrella Analysis
[Cube header: People and Resources / Processes / Technology – Strategy, Management, Data, Work, Materials, Measures, Tactics, RCM, TPM]

Covers all key areas
Helps to identify priorities and opportunities
Acts as the basis for focusing on specific issue areas
Frequently uncovers hidden issues
Provides baseline for long term improvement planning

5
AMM 2009
OMDEC copyright

Focused Analysis
[Cube header: People and Resources / Processes / Technology – Strategy, Management, Data, Work, Materials, Measures, Tactics, RCM, TPM]

Focus areas: CMMS, CBM, Data Collection, Cost Analysis, etc.
Example – CMMS:
1. Reason for assessment – new, replace, upgrade, improve
2. Current status – users, system description, support
3. System functionality implemented – work request, work orders, inventory etc
4. Use of system – collect data, process data, analyse problems, drive reliability improvements etc
5. CMMS extensions – CBM integration, RCM integration, failure prediction, mobile operations etc
6. Value realization – current sources of value, value opportunities

6
AMM 2009
OMDEC copyright

The Excellence Cube
[Excellence Cube diagram, as above]
Develop asset strategy → maintenance vision → integrate into the business plan. Management of human resources and finances, organization and change.

7
AMM 2009
OMDEC copyright

The Excellence Cube
[Excellence Cube diagram, as above]
• Asset life-cycle productivity, including:
– work identification and planning – work scheduling and coordination
– purchasing and inventory – selection of maintenance tactics
– technical and information support – performance measurement
8
AMM 2009
OMDEC copyright

The Excellence Cube
[Excellence Cube diagram, as above]
– Performance improvement via failure management.
– Improved productivity via proactive and planned maintenance.
– Integration of all resources – maintenance, materials, operations, technical, and admin.

9
AMM 2009
OMDEC copyright

The Excellence Cube
[Excellence Cube diagram, as above]
Capable and supportive technology; processes that drive productivity.
Training and skills development; financial and cost management.

10
AMM 2009
OMDEC copyright

BP Tools – how to assess your current practices
• #1 Maintenance Excellence Cube
• #2 Maintenance Assessment
– The process of measuring “as is” against where
we want to be - (Gap Analysis)
– Development of recommendations
– Determining Priorities

11
AMM 2009
OMDEC copyright

Objectives:
• 1. To provide a high level, objective review of
the Maintenance function.
• 2. To identify strengths and weaknesses
• 3. To provide recommendations and benefits,
and to propose priorities.
• 4. To provide real value in terms of needs,
priorities, directions and next steps
• 5. To create a road map for change

12
AMM 2009
OMDEC copyright

Maintenance Organizations doing formal Assessments typically…
1 want to move towards World Class Maintenance

2 want to improve their Maintenance operations and are looking for the best place to start.
3 think they need to change strategies
4 get reasonable value from their Maintenance IT Systems, but
want to increase the ROI
5 recognise they need help straightening out an internal problem
6 have an appetite for change?
7 Know they can get their boss to approve the process?
8 …and agree to the changes recommended?
Do you……………..

13
AMM 2009
OMDEC copyright

Assessment Options:
1. Full Assessment covering all
Maintenance aspects in depth
or
2. Accelerated-assessment - quick
overview, focusing on easy wins
or
3. Target a specific problem area

14
AMM 2009
OMDEC copyright

What’s the difference?


• Breadth and Scope of the analysis and the results
• Time and Cost

What’s the same?


• The depth and level of detail in the questionnaires, the analysis and the
feedback
• The planning and execution - all require careful preparation
• All require an open mind and good communications
• All require management to be prepared to change the way things are done

15
AMM 2009
OMDEC copyright

Assessments have a common Methodology:

Self-Assessment Questionnaire, Data Gathering, Analysis and Assessment → On-site visit & Interviews → Report back to Management

Sample Teams
1. Internal -- Maintenance Engineer, Maintenance Manager,
Maintenance Supervisor, Selected Trades PLUS a
facilitator (usually external)
2. External – Experienced consultant or consulting team
16
AMM 2009
OMDEC copyright

Self-Assessment Questionnaire, Data Gathering, Analysis and Assessment
• Must be completed right after the kick-off
• Confidentiality may be required
• Involve lots of maintenance-related people

• Current asset/maintenance strategy and degree of acceptance
• Asset management/PAM organisation
• Use of maintenance tactics + their effectiveness in PM program
• Use of reliability engineering to monitor equipment performance
• Use of performance monitoring, measures and benchmarking
• Use of information technology and EAM/CMMS systems
• Use and effectiveness of planning and scheduling
• Materials management in support of maintenance operations
• Use of process analysis to optimize PAM effectiveness

• Collect and analyze responses


17
AMM 2009
OMDEC copyright

On-site visit & Interviews
• To verify and further explore the findings from the self-assessment
• To seek better ideas
• To get people committed to the process of change

Interviews – some or all of:
• Corporate Executives
• The Plant Manager
• Operations/production managers
• Finance Manager
• Purchasing manager
• Stores manager/supervisors
• Maintenance manager
• Maintenance superintendent
• Maintenance engineer
• Maintenance supervisor
• IT Manager
• PAM technicians
• EAM users

Need to balance internal and external interviewers
18
AMM 2009
OMDEC copyright

Report back to Management
Written report summarizing the findings and recommendations
• Basis: Cube of Excellence
• Strengths
• Weaknesses
• Recommendations and Opportunities
• Benefits & ROI Paybacks
• Implementation Priorities

19
AMM 2009
OMDEC copyright

Part of a detailed questionnaire…
• How do you decide what PMs to do?
• Do you experience frequent equipment failures?
• How much maintenance rework do you do?
• Do you understand what has to be done, but can't get the practices adopted?
• What sorts of predictive maintenance do you perform?
• Which machines do you select for Condition Based Monitoring?

Lots of questions – they will vary according to the outside facilitator


20
AMM 2009
OMDEC copyright

Sample: Maintenance Tactics


• Strengths
• PM program exists for all machines and schedules are published.
• PM program has succeeded in past when it was rigidly followed.

• Weaknesses
• PM program compliance is low. Many PMs are skipped when machines are not available from production. PM shift timing often does not match production downtime “windows”.
• PM program checklists are not reviewed for frequency and for value added by tasks. In some cases equipment start up is shaky and “PM Corrective” work is required to fix the mistakes! This is a sign that the wrong PM is being done in those cases.
• PM work is widely misunderstood – much of what gets done is actually corrective (repair) maintenance.
• Extensive shutdowns often result in startups that are not smooth – much of the work that is done during the shutdown may be the wrong sort of work for the failure modes actually being experienced.
• While contractors are used sparingly to augment short term work force requirements they are not well controlled. Contracting relationships are not consistently managed as contracts – relationships are sometimes personal.

21
AMM 2009
OMDEC copyright

Sample: Maintenance Tactics


Recommendations
• T1: When area focused organization is implemented assign PM checksheets to areas for execution. PMs should not be assigned to dedicated PM crews, but enough time to do PMs must be allocated within the area maintenance schedules.
• T2: Monitor PM compliance and enforce schedule discipline (do the PMs within the planning cycle).
• T3: PM work priority should be second only to safety work orders.
• T4: Area maintenance crews to review PM checksheets for validity and frequency and revise them as appropriate.
• T5: Explore possible uses of condition monitoring for random failure modes (e.g.: vibrations, thermal imaging, oil analysis, NDT).
• T6: Formalize the contractor management process, introduce and enforce standards for work performed and manage the relationships as formal contractor relationships.
• T7: Use TPM Pilot areas as models of improved working environment.
• T8: Train on the use of checklists and their importance.
• T9: Look at outsourcing as an alternative for some work (e.g.: rebuilds, vibration analysis, thermographic analysis).

22
AMM 2009
OMDEC copyright

Sample Benefits

• Breakdown and emergency maintenance costs 3x planned corrective maintenance; moving from 70:30 to 30:70 will save $1m in labor costs and $0.2m in materials
• Planned corrective costs 3x preventive & predictive. Moving to 70% predictive and preventive will save a further $1m in labor
• Effective planning and scheduling typically improves labor efficiency by 15 to 20%, and reduces materials cost by 10 to 20%
• Analysis of the causes of equipment failure typically increases uptime by at least 5%. For your main product line, this can generate an additional $10m in revenues.
• Standardization of parts and consolidation of suppliers can decrease material costs by 10-15%. This translates into $0.5m in savings.
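To make the first claim concrete, here is a minimal sketch of the arithmetic, assuming a hypothetical plant with 100 units of maintenance work per year at $10,000 per planned unit and the 3x cost premium for breakdown work stated above; the volumes and unit cost are illustrative, not taken from the slide.

def labor_cost(work_units, reactive_fraction, planned_unit_cost, premium=3.0):
    # Reactive (breakdown/emergency) work costs `premium` times planned work.
    reactive = work_units * reactive_fraction
    planned = work_units * (1 - reactive_fraction)
    return reactive * planned_unit_cost * premium + planned * planned_unit_cost

before = labor_cost(100, 0.70, 10_000)   # 70:30 reactive:planned mix
after = labor_cost(100, 0.30, 10_000)    # 30:70 mix
print(f"labor saving: ${before - after:,.0f}")   # $800,000 on these assumed numbers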

23
AMM 2009
OMDEC copyright

Showing the Results…


• Make them attractive
• Make them easy to understand
• Pictures are better than words

24
AMM 2009
OMDEC copyright

Know how well you are doing … Recognize the gaps
[Spider chart, scored 0–9 on each of: Maintenance Strategy, Organization / Human Resources, Employee Empowerment, Maintenance Tactics, Reliability Analysis, Performance Measures / Benchmarking, Information Technology, Planning and Scheduling, Materials Management, Maintenance Process Reengineering]
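A minimal sketch of turning such an assessment into a gap list, assuming each category has been scored 0–9 for the current state and for the target; the category names come from the chart, the scores are illustrative.

# current and target scores per assessment category (0-9 scale, illustrative values)
scores = {
    "Maintenance Strategy": (4.0, 8.0),
    "Planning and Scheduling": (3.5, 7.0),
    "Materials Management": (5.0, 7.5),
    "Reliability Analysis": (2.5, 6.5),
}
# largest gaps first, to feed the prioritisation step
for category, (current, target) in sorted(scores.items(), key=lambda kv: kv[1][0] - kv[1][1]):
    print(f"{category}: current {current}, target {target}, gap {target - current:.1f}")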


25
AMM 2009
OMDEC copyright

There are many variations of spider charts
1 Successfully led change
2 Model new values/behaviour
3 Challenge data/people constructively
4 Receptive to new ideas
5 Understand, trust and respect
6 Maintain focus when faced with other priorities
7 Demonstrate personal commitment
8 Teams well supported by leaders
9 Compelling need clear
10 Change in practice made clear
11 Understand relationship with corporate strategy
12 Approaches varied appropriately
13 Authority and responsibility clear
14 Change kept on track
15 Linked projects well co-ordinated
16 Programme/business-as-usual decisions well linked
17 Decision making doesn't slow change
18 Problems solved quickly
19 Sufficient time allowed
20 Change successfully achieved in the past
21 People expect change to succeed
22 People learn new skills
23 Encouraged to be constructively critical
24 Managers have change skills
25 Managers respected/trusted
26 Rewards/punishments applied
27 People are well informed
28 Clear about effect of change
29 Communications significantly help to achieve change
30 Grapevine well managed
31 Commitment built not forced
32 Middle managers not pressured
33 Managers are disciplined
34 Processes don't block change
35 People receptive to new ideas
36 Processes change regularly
37 Co-operation high
38 Working for the good of the organisation
39 Walk the talk
40 Change teams work effectively
41 Project management well embedded
42 Projects completed not forgotten

Topic Categories
1-8: Develop Leadership 35%
9-11: Create Change Vision 44%
12-19: Define Change Strategy 37%
20-33: Build Commitment 39%
34-40: Manage People Performance 46%
41-42: Develop Culture 34%
26
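A minimal sketch of how the category percentages on this slide can be rolled up from the 42 individual statements, assuming each statement is scored as a percentage of respondents agreeing; the item ranges come from the slide, the individual scores are placeholders.

# item ranges per topic category, as listed on the slide
categories = {
    "Develop Leadership": range(1, 9),           # items 1-8
    "Create Change Vision": range(9, 12),        # items 9-11
    "Define Change Strategy": range(12, 20),     # items 12-19
    "Build Commitment": range(20, 34),           # items 20-33
    "Manage People Performance": range(34, 41),  # items 34-40
    "Develop Culture": range(41, 43),            # items 41-42
}
item_score = {i: 40.0 for i in range(1, 43)}     # % agreement per statement (placeholder data)
for name, items in categories.items():
    avg = sum(item_score[i] for i in items) / len(items)
    print(f"{name}: {avg:.0f}%")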
AMM 2009
OMDEC copyright

Start by Prioritising
• … according to the amount of pain they cause

• … according to their consumption of resources

• … according to the extent to which they cause


bottlenecks

• … according to the ease of fixing

• … and their benefits

27
AMM 2009
OMDEC copyright

So where do we start???

[Priority matrix: the recommendations plotted by Benefit (High / Med / Low) against Difficulty to Implement (Low / Med / High). Key: S Strategy; O Organization & Change Management; T Tactics; P Planning & Scheduling; M Measures; IN Materials Management; C Systems (CMMS); R Reliability; A TPM & Autonomous Maintenance; F Processes.]
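A minimal sketch of ordering such a chart into an action list, assuming each recommendation has been given simple 1–3 scores for benefit and for difficulty; the T-coded items come from the earlier Maintenance Tactics sample, the scores are illustrative.

# (benefit, difficulty) on a 1-3 scale; higher benefit and lower difficulty = earlier action
recs = {
    "T2 Monitor PM compliance, enforce schedule discipline": (3, 1),
    "T4 Review PM checksheets for validity and frequency": (3, 2),
    "T6 Formalize the contractor management process": (2, 2),
    "T9 Outsource rebuilds / vibration analysis": (2, 3),
}
for name, (benefit, difficulty) in sorted(recs.items(), key=lambda kv: (-kv[1][0], kv[1][1])):
    print(f"{name}: benefit={benefit}, difficulty={difficulty}")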
28
AMM 2009
OMDEC copyright

Best Practices - Where do you stand?

???

29
AMM 2009

OMDEC copyright

Where do you stand – 1

Adapted from JC Campbell: Uptime
(columns: Strategy | Organization Management | Planning & Scheduling | Maintenance Tactics | Performance Measures | Information Technology | Reliability Engineering | Process Analysis)

Excellence: Set corporate mtc. strategy / asset strategy | Multi-skilled independent trades | Long term & major project planning & engineering | All tactics based on analysis | OEE; benchmarking, full cost database | Fully integrated, common database | Full value, risk analysis, RCM and root cause analysis | Regular review of Process Cost, Time, Quality

Competence: Long term improvement plan | Some multi-skilling | Good job planning, scheduling & eng'g support | Some CBM, some PM, few surprises | MTBF/MTTR, availability, separate mtc. costs | Fully functional; linked to financials & materials | Some FMECA used | Some review of Admin, Eng and Trades procedures

Understanding: Annual improvement plan | Decentralized, mixed trade teams | Planning group established; ad hoc engineering | Time and use based inspections; some NDT | Downtime by cause; mtc. costs available | Fully functional; stand alone | Good failure database; well used | Some review of Trades Processes and Tactics

Awareness: PM improvement program for trades | Partly centralized | Troubleshooting support; some inspection scheduling | Time based inspections | Some downtime records; mtc. costs not segregated | Basic mtc. scheduling, some parts records | Collect data but make little use of it | One time review of Maintenance Process

Innocence: Mostly reactive breakdown mtc. | Highly centralized | No planning, little scheduling & no engineering | Shutdown inspections only | No systematic approach; mtc. cost unavailable | Manual or ad-hoc specialty systems | No failure records | Never reviewed
30
AMM 2009
OMDEC copyright

Where do you stand?
(Repeats the maturity matrix from slide 30, for use in the workshop.)
31
AMM 2009
OMDEC copyright

Where do you stand – 2
Let's look at the numbers (quantitative)
• Record your own metrics (estimate them if you don’t know exactly)
– Maintenance cost as % of operating costs
• How much of this is for planned work?
• How much of this is for un-planned work?
• How much of this is for emergencies?
• What’s the split among labour, material and contractor costs?
• Overtime % (average)?
• Wrench time estimate as % of paid time?
– Stores inventory value as % of asset replacement value
• What are your annual material costs?
• Inventory turns?
– Results
• Availability of plant or major “bottleneck” process?
• Output quality rate?
• Production rate as % of sustainable proven maximum rate?
• OEE?
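The last item, OEE, is the product of three of the other numbers on this slide; a minimal sketch of the standard calculation (the sample values are illustrative, not benchmarks):

# OEE = availability x performance rate x quality rate
availability = 0.90   # availability of the plant or bottleneck process
performance = 0.85    # production rate as a fraction of the sustainable proven maximum
quality = 0.98        # output quality rate
oee = availability * performance * quality
print(f"OEE = {oee:.1%}")   # 75.0% on these sample numbers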

32
AMM 2009
OMDEC copyright

Workshop
• Back to slide 30….
• 1. Circle where you are today
• 2. Circle where you think you should be

Back to slide 32
1. Put in your best guess for your own numbers
2. Put a “?” where you don’t know.
3. Should you know?

33
AMM 2009
OMDEC copyright

BP Tools - how to assess your current practices
• #1 Maintenance Excellence Cube
• #2 Maintenance Assessment
• #3 Benchmarking
– Measuring how well you do compared to others

34
AMM 2009
OMDEC copyright

Why benchmark?
• To compare your performance levels and
underlying processes to those in comparable, high
performing organisations
• To adapt the best practices from your comparative
study to close your process and performance gaps
• To know where you are in relation to the reference
and have a defined target to achieve

35
AMM 2009
OMDEC copyright

Four Phases
1. Plan – Train internally; Identify What; Identify Who; Collect data
2. Analyse – Draw comparisons; Determine the Gap; Determine the Future Vision
3. Integrate – Ensure Ownership; Establish the goals; Develop the Plan
4. Act – Implement; Monitor; Verify; Feedback; Re-calibrate; Repeat
36
AMM 2009
OMDEC copyright

1. Plan---What to Benchmark

Maintenance Excellence Cube


– A framework for benchmarking

37
AMM 2009
OMDEC copyright

Benchmarking…
– the process of comparing…

Benchmarking the Results:
• Internally between similar Divisions
• Internally between similar Plants
• Externally with similar or competitive Companies

Benchmarking the Processes:
• With Best Practice Companies
• With standards such as ISO, PAS 55, etc.

38
AMM 2009
OMDEC copyright

1. Plan--- Benchmarking Data


• Good data is critical to good benchmarking results… Poor data can cause wrong actions
– “similar” businesses
– “similar” processes
– “similar” data definitions
– “similar” ways of measuring, collecting,
analysing data
– “similar” ways of reporting results
39
AMM 2009
OMDEC copyright

2. Analyse - Comparisons
• Make sure the comparisons make sense

• Check outliers – why?

• Remember that business cultures differ

40
AMM 2009
OMDEC copyright

2. Analyse---the Gap
Quantify the gap

Graph it

Understand why

Prioritise it

41
AMM 2009
OMDEC copyright

2. Analyse---Future Vision
Select which gaps to close

Set time line

Understand what it will take to close

Understand what the other companies will be doing while you are closing the gap

Should you be looking for a “leapfrog” strategy?

42
AMM 2009
OMDEC copyright

Choose your Benchmarks…
(columns: Us | Best Practice | Gap)
Mtce Cost / Total operating Cost: 10 to 15 %
Mtce Supervisors / total trades: 1:15 or more
Mtce Overtime / total mtce mhrs: 5 to 10 %
Equipment effectiveness: > 85 %
Planned mtce / total mtce: > 80 %
Emergency mtce / total mtce: < 10 %
Stores inventory turnover: 2 to 10 times

Will they vary?…… if so how and why?
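A minimal sketch of filling in the Us / Best Practice / Gap columns, assuming you already have your own figures; the best-practice bands are taken from this slide, the "us" values are placeholders.

# best-practice bands from the slide as (low, high); None means unbounded on that side
best_practice = {
    "Mtce overtime / total mtce mhrs (%)": (5, 10),
    "Equipment effectiveness (%)": (85, None),
    "Planned mtce / total mtce (%)": (80, None),
    "Emergency mtce / total mtce (%)": (None, 10),
}
us = {  # placeholder values for your own operation
    "Mtce overtime / total mtce mhrs (%)": 14,
    "Equipment effectiveness (%)": 78,
    "Planned mtce / total mtce (%)": 55,
    "Emergency mtce / total mtce (%)": 22,
}
for metric, (low, high) in best_practice.items():
    value = us[metric]
    gap = 0
    if low is not None and value < low:
        gap = low - value        # below the best-practice floor
    elif high is not None and value > high:
        gap = value - high       # above the best-practice ceiling
    print(f"{metric}: us = {value}, gap to best practice = {gap}")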


43
AMM 2009
OMDEC copyright

Best-in-Class Benchmarks (example)
(columns: Bottom quartile | Third quartile | Second quartile | Top quartile)

RESULTS METRICS
Stores/Replacement Value Percentage: >1.3 | 1.3-.8 | .7-.3 | <.3
Maintenance Cost/Total Sales Percentage: >8 | 8-7 | 6-3 | <3
Maintenance Availability – Discrete: <78 | 78-84 | 85-91 | >91
Maintenance Availability – Batch Process: <72 | 72-80 | 81-90 | >90
Maintenance Availability – Chemical, Refining, Power: <85 | 85-90 | 91-95 | >95
Maintenance Availability – Paper: <83 | 83-86 | 87-94 | >94
Overall Equipment Effectiveness: Not Measurable | <48 | 48-78 | >78

PROCESS METRICS
Mechanic Wrench Time: <31 | 31-41 | 42-52 | >52
Percentage Planned Work: <65 | 66-78 | 79-94 | >95
Request Compliance Percentage: <68 | 68-77 | 78-90 | >90
Schedule Compliance Percentage: <15 | 15-35 | 36-70 | >70
Work Order Discipline Percentage: <54 | 55-83 | 84-95 | >95
PM Percentage by Operations: 0 | 0-9 | 10-24 | >25
Replacement Value ($MM) per Mechanic: <3.2 | 3.2-5.0 | 5.0-7.5 | >7.5
Suggestions per Mechanic per Year: Not Measurable | <.5 | .5-4 | >4
Stores Turnover: <.5 | .5-.7 | .7-1.2 | >1.2
Stores Service Level: <93 | 93-96 | 97-99 | >99
Contractor Cost Percentage: <8 | 8-19 | 20-40 | >40
Stores Issues/Total Material Percentage: >82 | 82-68 | 67-20 | <19

TRAINING AND STAFF RATIOS
Span of Control: <9 | 9-17 | 18-40 | >40
Mechanics per Effective Planner: <25 | 25-59 | 60-80 | >80
Replacement Value ($MM) per Maintenance and Reliability Engineer: <50 | 50-200 | 200-250 | >250
Mechanics per Plant Worker Percentage: <32 | 32-21 | 20-10 | <10
Total Craft Designations: >7 | 7-6 | 5-3 | 2
Training Hours per Mechanic: >80 | 80-70 | 69-40 | <40
Training Cost per Mechanic: >3000 | 3000-1800 | 1800-500 | <500

Source: Fluor Daniel of Greenville, S.C., gathered these benchmarks from 148 global companies that are considered top quartile in their industry in terms of earnings and/or market share. This data matches closely to benchmark data PWC have access to but can't release for reasons of client confidentiality.
Bottom quartile column: the performance of the worst 25% of the companies benchmarked. Third quartile column: the performance of the 25% of companies who just exceeded the bottom quartile. Top quartile column: the benchmark measurements for the best of the best.
44
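A minimal sketch of placing one of your own metrics against this table's quartile bands, using the Mechanic Wrench Time row as the example; the thresholds come from the table, the sample value is illustrative.

def wrench_time_quartile(percent):
    # bands from the table: <31 bottom, 31-41 third, 42-52 second, >52 top
    if percent < 31:
        return "bottom quartile"
    if percent <= 41:
        return "third quartile"
    if percent <= 52:
        return "second quartile"
    return "top quartile"

print(wrench_time_quartile(35))   # -> third quartile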
AMM 2009
OMDEC copyright

Profile of Benchmarking Company


1 Have access to good comparative and competitive data?
2 Can identify companies with which they can compare
your operations?
3 Can get good data about their own performance?
4 Can objectively accept they are behind?
5 …and can respond by improving?
6 Know they are not in the lead?
7 Have the time and resources to do a good job of
benchmarking?
8 Have an appetite for improvement?

45
AMM 2009
OMDEC copyright

• The value of the Benchmark lies solely in the actions that it prompts.

• If your organization does not have the will to change……
…don't waste your time

46
AMM 2009
OMDEC copyright

Benchmarking vs. Assessments


• Benchmark
– Usually only conducted by external body
– Need to make sure the answers are calibrated properly across multiple companies
– Is a comparison against the industry best
– Always playing catch up
– What if the best don't play

• Assessments
– Normally external, but facilitator can be internal
– Very good for stimulating cross divisional swap of ideas
– Can be much more detailed and targeted
– Less expensive
– Faster

47
AMM 2009
OMDEC copyright

Workshop: Benchmarking and Assessments

1: For your organization, what are the three biggest pluses and minuses for benchmarking?

2: What are the three biggest pluses and minuses for assessments?

3: How do you capitalize on the pluses and minimise the minuses?

4: Would you recommend your organization proceed with an assessment or a benchmarking?

48
AMM 2009
OMDEC copyright

Power Co – Benchmarking Case Study - 1

• Background research on Company operations
– Objectives
– Business unit information
– Maintenance organization
– Contractor/Outsourcing status

• Looking to compare current status with BP Companies
• Want to find areas and strategy for catch-up
• Maintenance only
• No outsourcing

49
AMM 2009
OMDEC copyright

Power Co – Benchmarking Case Study - 2

Activities:
• Review of internal documents
• Review of options for the Benchmarking areas (Wireman identifies well over 100 KPIs)
• Propose KPI selection in advance for review
• Review with Management

Outcomes:
• Maintenance Organization, Strategy, Policy, Procedures, Budget
• 10 KPIs proposed & agreed:
1. Mtce $ per unit output
2. MTBF
3. MTTR
4. Wrench Time
5. Failure Costs
6. PM %
7. PM Compliance
8. Stores Turnover
9. Stores Service levels
10. OEE
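Two of the agreed KPIs, MTBF and MTTR, are simple ratios; a minimal sketch of the standard definitions, with an illustrative failure log:

# MTBF = operating time / number of failures; MTTR = total repair time / number of repairs
period_hours = 8_000                       # observation period (illustrative)
repair_hours = [4.0, 2.5, 6.0, 3.5]        # downtime per failure event (illustrative)
failures = len(repair_hours)
mtbf = (period_hours - sum(repair_hours)) / failures
mttr = sum(repair_hours) / failures
print(f"MTBF = {mtbf:.0f} h, MTTR = {mttr:.1f} h")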
50
AMM 2009
OMDEC copyright

Power Co – Benchmarking Case Study - 3

Activities:
• In-house training
• Examine the data availability for the proposed KPIs
• Final confirmation of the KPIs
• Select Target BM Partner companies
• Research publicly available databases, industry associations

Outcomes:
• One day training for maintenance organization on process and expectations
• Some data definitions needed to be refined and clarified
• KPIs confirmed
• 20 BM partners selected – expect 5 to join
• EPRI and Solomon have good information

51
AMM 2009
OMDEC copyright

Power Co – Benchmarking Case Study - 4


Activities:
• Preliminary Agreement on terms of disclosure
• Discussion and finalization of project plan, process, timing
• Agreement on "what if's" in the event of issues
• Agreement on outputs

Outcomes:
• All data to be available to all participants
• Only their own score to be disclosed
• Average and min-max also to be disclosed
• List of participants available to all
• No trace-back from data to companies
• 12 weeks to complete
• Formal report + workshop
52
AMM 2009
OMDEC copyright

Power Co – Benchmarking Case Study - 5

Activities:
• Approach BM partners – objectives, background, questionnaire, follow up validation and/or interviews
• Finalize/issue requests for data – internal and external

Outcomes:
• 20 approached, 7 committed to participate
• Data documents identified: the process, the data definitions, the outputs, the time frame
• Confidentiality and impartiality emphasized

53
AMM 2009
OMDEC copyright

Power Co – Benchmarking Case Study - 6

Activities:
• Await responses from BM Partners
• Follow-ups with BM Partners
• Follow-ups with Power Co

Outcomes:
• 6 weeks elapsed time
• 3 "how's it going" follow-ups
• 1 dropped out – too busy
• Many questions about the definitions (showed the lack of standard practice!!)
• 3 were on-time, 3 were late
• Power Co was also late!!

54
AMM 2009
OMDEC copyright

Power Co – Benchmarking Case Study - 7

Activities:
• Data Analysis
• Identify/resolve gaps and misunderstandings
• Prepare initial comparisons
• Revert to BM Partners to validate data, clarify misunderstandings, fill data gaps
• Integrate revised data

Outcomes:
• Much more difficult than expected as the data variations were large
• Review and consolidation issues were mainly around data definitions and methods of data capture
• Good co-operation from BM Partners

55
AMM 2009
OMDEC copyright

Power Co – Benchmarking Case Study - 8

Activities:
• Prepare and discuss preliminary findings
• Prepare and deliver final report and feedback workshops to Power Co
• Wind-up discussion with Management

Outcomes – results showed:
1. Overall Power Co was slightly above average (3rd of 7)
2. Stores results were worst (6th of 7)
3. Failure costs and PM Compliance were best (2nd of 7)
• Feedback workshop focused on setting priorities and goals
• Management response: "a very worthwhile exercise"

56
AMM 2009
OMDEC copyright

The Final Warning….


Yesterday’s Excellence

is

Today’s Standard

and

Tomorrow’s Mediocrity

Terry Wireman

57
AMM 2009
OMDEC copyright

Workshop:

• Review what we have talked about today

• Select at least three ideas that you would like to implement or would like to know more about

• Plot them on the next slide

58
AMM 2009
OMDEC copyright

Using the Priority Chart to sort out your good ideas

[Priority chart: Benefit (Low / Medium / High) on the vertical axis against Cost/Difficulty (Low / Medium / High) on the horizontal axis]
59
AMM 2009
OMDEC copyright

Ben@omdec.com
OMDEC Inc,
560 Burns Road,
Godfrey, Ontario,
Canada, K0H 1T0

Tel (001) 613-273-4366


Fax (001) 613-273-4367
www.omdec.com

60
