Profile #4 — Lucent Technologies’ Strategic Organization and Professional Development Balanced Scorecard 13
Summary: Lucent Technologies’ Strategic Organization and Professional Development (SOPD) Group maintains a balanced scorecard
that aligns directly with scorecards used by all functions across the organization. The scorecard’s organization enables SOPD to
demonstrate visually how the SOPD group contributes to the organization’s primary objectives.
L&D Objectives Supported
• Financial Management
• Portfolio Management
• Internal-Customer Relationship Management
• Leadership Pipeline Management

Noteworthy Features or Metrics
• L&D objectives mapped to corporate strategic objectives
• Metrics to Assess:
  – Full cost recovery for programs supporting business partners
  – Leadership competency improvements
  – HR business partner satisfaction
Profile #14 — W.W. Grainger’s “Voice of the Customer” Annual Survey Results 55
Summary: In an effort to obtain internal customer feedback on the performance of the L&D function, the Grainger Learning Center (GLC)
commissions a third-party vendor to conduct an annual survey of 120 managers and senior executives. In turn, GLC summarizes
its understanding of the survey results, specifically identifying key areas of effectiveness and priority improvement, and sharing
its initial strategies for addressing outstanding customer needs. While GLC primarily uses the survey results to effectively allocate
and prioritize its resource investments, it also leverages the data to demonstrate the value it has already delivered to managers and
senior executives.
L&D Objectives Supported
• Portfolio Management
• Internal-Customer Relationship Management

Noteworthy Features or Metrics
• Metrics to Assess:
  – Partnering and relationship-management efforts of GLC staff and leadership team
  – Communications and transparency of L&D objectives and initiatives
  – Proactive customization and anticipation of line needs
  – GLC’s sensitivity to individual learner preferences
  – Quality and frequency of L&D feedback/evaluation systems
Special Thanks
The Learning and Development Roundtable would like to express its gratitude to the following individuals, who were
especially giving of their time and insight in the development of measurement tools profiled in this study:
American Standard Companies Inc.
Ernst & Young, LLP
Reuters Group PLC
Letter From the Learning and Development Roundtable
Across the past decade, few issues have commanded more attention on learning executives’ agendas than
the challenge of L&D measurement. Even more remarkable than the staying power of this topic, though, has
been the intensity of debate surrounding it. Is ROI measurement feasible? Is it possible to isolate the value
of a learning intervention? Do the benefits of L&D measurement outweigh the costs? The list of questions
goes on. Still, while L&D practitioners may be divided on the answers to these measurement questions, they
have been (notably) united by a single measurement objective: to develop measurement approaches that
look beyond traditional program evaluation to enable the creation of comprehensive dashboards for guiding
strategy and optimizing operational performance.
Decidedly less clear than this objective, however, is the path required to achieve it. The irony here, of
course, is that the paucity of available guidance on L&D dashboard creation contrasts sharply with the
overwhelming volume of literature on L&D program evaluation. Given this general lack of coverage, the
Roundtable has found that the objective of dashboard creation, while clear in theory, tends to lack edges in
practice. Indeed, our conversations with more than 50 learning executives have revealed notable demand for
research that might make this objective more tangible, with a specific emphasis on the actual dashboards
employed by progressive L&D practitioners.
In response, the Roundtable’s research into this terrain has initially focused on two fundamental questions
articulated by our membership: (1) How do progressive L&D functions measure and communicate overall
L&D performance? and (2) Which metrics do my peers find most useful? Guided by these questions, the
Roundtable’s early research has sought to catalog the tools that L&D functions use to demonstrate their
value as well as to inventory the metrics that are most commonly used to track progress against key L&D
objectives.
With this study, the Roundtable is pleased to present the first product of this work. At its core, this study is
designed to provide a menu of templates that L&D practitioners might use to accelerate the creation of their
own L&D dashboards. Based on detailed profiles of the live L&D dashboards in use by 13 L&D functions,
this study is grounded firmly in the practical; the material herein is based exclusively on the tangible
practices of real organizations. Our sincere hope is that these profiles serve as powerful tools for L&D
functions seeking to boost the rigor and efficacy of their measurement efforts.
We would be remiss if we did not express our deep gratitude to the organizations that participated in this
study. It is our modest hope that this study serves as a useful guide to members as they examine and refine
their own L&D measurement strategies. We encourage and look forward to your feedback.
With our continued appreciation,
Learning and Development Roundtable
Washington, D.C. and London
Summer 2004
Study in Context
• What metrics do my peers find most useful for measuring and demonstrating their performance?
• What visualization tools do my peers employ to communicate L&D performance?
• What conceptual frameworks help my peers articulate their measurement approaches?
• Which metrics map to specific L&D objectives?

• How do I align L&D initiatives with corporate objectives and customer priorities?
• How do I prioritize my investments and allocate resources to meet my customers’ most urgent needs?
• How do I understand the value that customers derive from my existing L&D portfolio?
• How do I prioritize my measurement efforts on “metrics that matter”?
Profiles of L&D Dashboards: A Compendium of Tools for Measuring and Communicating L&D Performance
• Compendium of 14 “Live” L&D Dashboards and Scorecards
• Visual Overviews of Metrics Tracked by Peer Organizations
• Detailed Frameworks for Demonstrating L&D Value Creation
Profiles of L&D value demonstration tools from names you recognize

Leveraging Measurement to Inflect L&D Performance: Best Practices in Designing Strategy-Focused Measurement Frameworks (Fall 2004)
Detailed inventory of L&D performance metrics culled from peer measurement approaches
Roundtable Compilation of Nonprogram Measures

During its initial foray into the measurement terrain, the Roundtable collected a series of nonprogram metrics from both the trade press and the “live” scorecards of our member institutions. Certainly, most readers will be well acquainted with a significant number of these measures. While this list is certainly neither exhaustive nor universally applicable, our hope is that the breadth of this inventory might provide a useful starting point for organizations seeking to identify key metrics, both leading and lagging, for their own dashboards. This L&D performance metrics inventory is found in the Roundtable publication: Reframing the Measurement Debate: Moving Beyond Program Analysis in the Learning Function.
An Abundant Menu of Measures
Core Operations and Processes

Aggregate Volume, Cost, and Productivity Metrics
1. Aggregate investment: L&D investment per employee
2. Cost to serve: Cost per employee trained
3. L&D investment ratio: L&D cost as percentage of total operating expense
4. Cost per time unit: Cost per training hour
5. Employee time investment: Training hours per employee
6. Percentage of payroll: Total L&D costs as a percentage of payroll
7. L&D staff levels: Ratio of L&D FTEs to total FTEs
8. Cost per content unit: Cost per module delivered
9. Tuition reimbursement spend: Aggregate tuition reimbursement costs
10. Content portfolio: Number of distinct learning offerings
11. Overhead ratio: Cost of facilities and equipment as a percentage of total
12. Cost flexibility: Variable costs as a percentage of overall L&D budget
13. Technology intensity: Learning technologies costs as a percentage of total spend
14. Activity-based costing of learning processes (e.g., strategy formulation, analysis, design, development, delivery, relationship management)

Supplier Relationships
15. Outsourcing ratio: Outsourcing expenditures as a percentage of total
16. Outsourcing spend fragmentation/concentration: Percentage of outsourcing costs allocated to specific vendors, providers, and suppliers
17. Delivery outsourcing mix: Percentage of delivery outsourced
18. Development outsourcing mix: Percentage of development outsourced
19. Analysis and planning outsourcing mix: Percentage of analysis and planning outsourced

Efficiency of Backroom Processes
20. Process digitization ratio: Percentage of L&D processes automated (partially or completely)
21. Portal/intranet accessibility: Percentage of employees with access to learning intranet/portal
22. Self-service availability: Percentage of transactions available via self-service
23. Self-service penetration: Self-service transactions as a percentage of total transactions executed
24. Strategic focus: Percentage of L&D staff time allocated to “transactional/administrative” versus “strategic” activities
25. Development cycle time: Lag between identification of need and deployment of solution
26. Development costs: Cost to develop learning solutions, per learning hour
27. Administrative intensity: Administrative costs as a percentage of total expenditures
28. On-budget ratio: Percent of L&D projects tracking against time and budget goals

Effort and Investment Allocation
29. Channel delivery mix: Percentage of delivery, by learning channel (e.g., classroom, intranet, CD-ROM)
30. Content delivery mix: Percentage of delivery, by content type (e.g., sales training, IT training, leadership, and management development)
31. Learner segment delivery mix: Percentage of delivery, by learner segment (e.g., executives, first-line supervisors, frontline staff, high-potential employees)
32. Learner segment investment mix: Percentage of investment, by learner segment (e.g., executives, first-line supervisors, frontline staff, high-potential employees)
33. Channel investment mix: Percentage of investment, by learning channel (e.g., classroom, intranet, CD-ROM)
34. Content investment mix: Percentage of investment, by content type (e.g., sales training, IT training, leadership, and management development)
35. “Required” investment allocation: Percentage of investment devoted to government-mandated skills, certifications, and compliance requirements
36. Business-mandated investment allocation: Percentage of investment devoted to business-mandated skills
37. Discretionary skill-building allocation: Percentage of investment devoted to “discretionary” skills
38. Basic/advanced investment mix: Percentage of investment devoted to “basic” versus “advanced” learning

Utilization
39. Workforce penetration: Percentage of employees participating in formal L&D offerings
40. Absence costs: Capacity costs attributable to absences
41. Classroom yield: Capacity utilization, per course
42. Cancellation costs: Capacity costs attributable to last-minute cancellations
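Most of the ratio-style measures in this inventory reduce to simple arithmetic on figures an L&D function already tracks. As a minimal illustrative sketch (all figures hypothetical, not drawn from any profiled organization), a few of the metrics above might be computed as:

```python
# Illustrative calculations for three of the ratio metrics in the inventory.
# All figures below are hypothetical, for demonstration only.

def pct(part, whole):
    """Express part as a percentage of whole."""
    return 100.0 * part / whole

ld_spend = 4_500_000           # total L&D investment ($)
employees = 9_000              # headcount
operating_expense = 300_000_000  # total operating expense ($)
self_service_txns = 1_800      # transactions completed via self-service
total_txns = 2_400             # all L&D transactions executed

aggregate_investment = ld_spend / employees          # metric 1: $ per employee
investment_ratio = pct(ld_spend, operating_expense)  # metric 3: % of operating expense
self_service_penetration = pct(self_service_txns, total_txns)  # metric 23

print(aggregate_investment)        # 500.0
print(round(investment_ratio, 2))  # 1.5
print(self_service_penetration)    # 75.0
```

The same percentage-of-total pattern covers most of the other ratio measures (overhead ratio, outsourcing ratio, the channel and content mixes, and so on).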
L&D Objective | Representative L&D Questions | Relevant L&D Value Demonstration Tool

1. Financial Management
• How Can I Understand Cost Drivers and Sources of Revenue?
• How Can I Meet My Financial Commitments?
• How Can I Monitor Internal and External Training Spend Patterns?
• How Can I Examine Business Unit-Level Investments to Facilitate Planning?
Relevant tools: All L&D Value Demonstration Tools

2. Portfolio Management
• How Can I Measure Content Quality and Relevance?
• How Can I Prioritize and Rationalize Training Content and Delivery-Channel Portfolios?
• How Can I Better Understand Utilization Patterns?
• How Can I Integrate Customer Feedback into Learning Design and Delivery?
Relevant tools: All L&D Value Demonstration Tools

Applied Global University’s Training and Certification Dashboard 1
Caterpillar University’s College and Support-Service Dashboards 5
Grant Thornton University’s Learning Vision and Strategy 9
Nationwide Building Society’s Training Management Information Pack 17

6. Cultivating Learning Culture
• How Can I Build a Learning and Development Culture?
• How Can I Promote Collaboration and Peer-to-Peer Learning?
Relevant tools:
Applied Global University’s Training and Certification Dashboard 1
Grant Thornton University’s Learning Vision and Strategy 9
Nationwide Building Society’s Training Management Information Pack 17
Profile #1
Applied Global University’s Training and Certification Dashboard
…to facilitate continuous improvement and inflect workforce performance over time
What Gets Measured, Gets Done (and Improved)
“Measurement enables us to continuously align and improve our L&D offerings as well as better
manage our business to meet customer needs. While it is often difficult to be at the receiving end of
criticism, here at AGU, we believe that ‘bad news is good news.’ It would be extremely difficult for
us to improve workforce capabilities if we didn’t measure how customers respond to our offerings,
whether they applied what they learned to their job, whether they changed their behavior, and what
impact our offerings have on their business.”
Mr. Neil Underwood
Senior Director, Global Operations
Applied Materials, Inc.
Source: Applied Materials, Inc.; Learning and Development Roundtable research.
Capturing Customer Feedback to Drive Continuous Improvement (Continued)

AGU employs a blended measurement approach to drive continuous improvement, specifically using a combination of quantitative and qualitative indicators to assess the efficiency and effectiveness of the L&D function, respectively. While tracking operational, cost, and process metrics helps AGU run the L&D function with the same fiscal discipline as other business units, it recognizes that qualitative measures, such as program-level evaluations, customer satisfaction surveys, quality audits, and benchmarking results, provide more valuable insight into how learning solutions enable continual learning and inflect workforce performance. Understandably, AGU leverages the data culled from its qualitative measurement approach to communicate the value it has created to senior line partners across the organization.

Overview of AGU’s Blended Measurement Approach

Objective: Use a blended measurement approach to facilitate continuous L&D improvement and inflect workforce performance.

1. Quantitative indicators focus on operations, cost, and process efficiency…

Quantitative Approach

Operations
• Completion rates by line, region, and class
• Number of student training hours/days completed
• Percentage of training delivered via Web versus ILT¹
• Compliance with 40 training hours per employee per year requirement
• Training participation rates of “influential” and “emerging talent”
• Certification rates by roles, tasks, and competencies
• Capacity analytics (demand met, fill rates)

Financials
• Breakeven analysis (over/under)
• Overall training spend versus “input” revenue²
• Total training dollars spent
• Accuracy of forecasted training spend
• Ratio of variable versus fixed costs

Curriculum/Learning Services
• Completeness of curriculum
• Percentage of non-classroom L&D approaches (e.g., COP³)
• Ratio of internally-developed versus externally-provided training

2. …while qualitative measures demonstrate the effectiveness of L&D solutions

Qualitative Approach (Program Evaluation and Customer Satisfaction)

Target Measure | Data Source | Methodology | Process | Time
Quality of Delivery, Content, and Facilities | Course participants | Kirkpatrick, Level 1 | Survey; Comments | Immediate
Learning Comprehension | Course participants | Kirkpatrick, Level 2 | Pre- and Post-Test | Immediate
On-the-Job Application | Course participants; Direct managers | Kirkpatrick, Level 3 | Surveys | 30–60 days
Business Results | Managers; Project Sponsors | Kirkpatrick, Level 4 | Metrics Review | As required
Customer Satisfaction | Key Customers | VOC⁴ Survey | One-on-one interviews | Every six months
Return on Investment | Key Stakeholders | All methodologies | All of the above | As required
Quality Audit Findings | ISO 9000; SSQA; Internal Quality & Reliability Group | Various methodologies | One-on-one interviews | As required
Benchmarking | Learning and Development Roundtable; Training Magazine’s Top 100; Organization-specific data | Various methodologies | Survey; One-on-one interviews | Every twelve months

¹ ILT refers to instructor-led training.
² “Input” revenue refers to the total amount of chargeback revenue.
³ COP refers to communities of practice.
⁴ VOC refers to “Voice of the Customer.”

Source: Applied Materials, Inc.; Learning and Development Roundtable research.
[Figure: Grant Thornton’s learning vision and strategy is organized around five themes: Show the Value; Focus on the Strategic Impact; Create a Continuous Learning Culture; Channels; and Content. Representative questions under the themes:]
• “Are we delivering the right business results for the right business initiatives to the right business leaders?”
• “Are we aligning learning resources to performance management and career development needs?”
• “How have we demonstrated that learning is a differentiator in the recruiting process?”
• “How do we help recruit ‘continuous’ learners and future leaders?”
• “What is our level of readiness for delivering client services?”
• “How have we demonstrated our readiness to deliver client services?”
• “How can we provide learning to our clients?”
• “How has learning helped improve awareness of Grant Thornton?”
• “How are our learning vision and strategy business-relevant?”
Profile #4
Lucent Technologies’ Strategic Organization
and Professional Development Balanced Scorecard
Customer*
• Provide unparalleled OD support for sales force and customer-facing functions
Metrics: Under Construction

People*
Objectives and associated metrics:
• Support organizational development: Post-completion client feedback
• Improve competencies of leaders: Competency assessment improvement
• Build key skills of employees: Learning occurs (test scores); Participation levels
• Support HR business partners: HR business partner satisfaction
• Be a competitive advantage: Under Construction

* Lucent Technologies uses balanced scorecard categories specific to its corporate strategy; however, these categories are considered proprietary. For purposes of this profile, the categories provide a general description of SOPD’s balanced scorecard categories.

Source: Lucent Technologies Inc.; Learning and Development Roundtable research.
Supporting Strategic Priorities (Continued)

Lucent Technologies translates its strategic framework into a detailed scorecard that effectively operationalizes the firm’s measurement strategy. This scorecard first links specific metrics to specific objectives. In turn, each metric maintains a defined target and a documented tracking method. In sum, the SOPD scorecard ensures that the group’s measurement strategy is operational as opposed to aspirational.

Lucent Technologies’ SOPD scorecard links objectives to specific metrics, targets, and tracking methods.

Lucent Technologies’ SOPD Metrics

Financial and Business Model (Strategic impact: Meet our financial commitments and achieve best-in-class cost efficiencies)
• Fully recover costs for business unit-specific programs | Target: Expense – Recovery = $0 | Tracking: Monthly Report
• Stay within budget for Lucent-wide programs; meet or beat benchmark costs | Target: For instructor-led programs, spend less than $xxx, less than $yyy per student-day; for e-learning subscriptions, spend less than $aaa, less than $bbb per student-hour by year-end | Tracking: Monthly Reports
• Stay within budget for SOPD operating expenses | Target: No more than $ccc for compensation and other SOPD expenses | Tracking: Monthly Report

Execution (Strategic impact: Achieve best-in-class quality and participant and client satisfaction)
• Participant satisfaction (average) | Target: Greater than x on “objectives met” (instructor-led) and “expectations met” (e-learning) on Level 1 program participation evaluations | Tracking: Monthly Report
• Client satisfaction | Target: Greater than x on Client Satisfaction Surveys | Tracking: Upon completion of specific projects

People
• Support organizational development: Post-completion client feedback | Target: Client-assessed impact per project (for Lucent-wide and business unit-specific learning and organizational development initiatives) | Tracking: Upon completion of projects
• Improve competencies of leaders: Competency assessment improvement | Targets: Document skill attainment for instructor-led programs (Tracking: Mid-year and end-of-year summary report); Baseline for year-over-year competency improvement on 360-degree feedback (Tracking: Link with 360 performance feedback process)
• Build key skills of employees: Learning occurs (test scores) | Target: For key skills areas (TBD), documented skill attainment | Tracking: Mid-year and end-of-year summary of learning assessments
• Build key skills of employees: Participation levels | Target: yy hours of training per employee | Tracking: Mid-year and end-of-year summary report
• Support HR business partners as an organizational development center of excellence: HR business partner satisfaction | Target: Greater than x on Partner Satisfaction Survey | Tracking: Mid-year and end-of-year summary report
• Be a competitive advantage | Under Construction

HR business partner satisfaction gauges the health of SOPD’s relationships with HR partners, who map line priorities to learning and organizational development initiatives.

Source: Lucent Technologies Inc.; Learning and Development Roundtable research.
Profile #5
Nationwide Building Society’s Training Management Information Pack
[Chart residue: actual versus target business days spent delivering training, with percentage variance; a course schedule listing Course Title, Location, Date, Delegates, and # Completed/# Registered (%); and percentage of external training spend by business unit (BU A $ %, BU B $ %, Total $ 100.00%), reported for Training & Development operations by business unit, year-to-date.]

…are compared against internal resource allocation targets that the function has set as thresholds for operational efficiency. Inclusion of actual business days of delivering training, and variance from the function’s target, fosters a focus on functional productivity.

Total spend on external training is the newest addition to NBS’s suite of operational metrics. NBS uses this data to assess the value that its customers get from external suppliers vis-à-vis solutions and products available internally. In other words, the learning function can better pinpoint—in specific instances—whether its customers go outside due to a lack of supply or misperceptions regarding the learning function’s capabilities.

Customer-facing scorecard measures will be included to gauge organizational satisfaction with training and development activities and resources.

Balanced Scorecard Perspectives*

Financial
• Group Training & Development delivers value to the organization.
• Nationwide organization has the right knowledge and skills.
• Training programs provided by Group Training & Development are the best.

Customer
• Organizational passion about learning opportunities and service at Nationwide.
• Employees feel they can make the most of their talent at Nationwide.
• Employees feel encouraged and supported to develop at Nationwide.
• Nationwide employees have a common focus on learning & development.

Internal Processes
• Group Training & Development delivers what the business needs.
• Group Training & Development supplies products & services clearly aligned with organizational objectives.
• Group Training & Development does few things exceptionally.

Team
• Group Training & Development promotes a culture of learning and growth.
• Group Training and Development supports sharing of best practices with each other.

Bubbles are color-coded according to performance against each metric.

* Scores are derived from a series of scores on specific activities and measures associated with each of these items.

Source: Nationwide Building Society; Learning and Development Roundtable research.
Tracking Total Training Activity (Continued)

NBS maintains a close watch on the operational performance of the training and development group to optimize delivery processes and maximize utilization of trainers and course programs. Together, these measures provide a comprehensive picture of formal training activity and attendee population characteristics across the organization.

As ensuring optimal penetration among internal customer markets is a key objective, the information pack also shows variance in the utilization of classroom training among different employee population segments.

Program efficiency measures enable the function to manage costs carefully and determine the appropriate level of revenues and spend. In particular, NBS places an emphasis on comparing its cost–income ratio to that of its competitors.

NBS utilizes discrete, detailed metrics to examine training resource allocation.

NBS’s Training Management Information Pack

Category: Program Delivery (Reporting: Monthly and Financial Year-to-Date)
• Content Category: # programs per category of training; # by category as % of total programs delivered
• Training Center Location: # programs per training center location; # by location as % of total programs delivered
• Learning Resource* Utilization: # learning resource hours used per type of resource; # hours used per resource as % of total hours used
Baseline measures for capturing training courses delivered through non-classroom channels.

Category: Program Attendance (Reporting: Monthly and Financial Year-to-Date)
Analysis by Employee Demographics (full-time/part-time; under 24/over 51; male/female; non-white), Business Division, Worker Group, Job Family, Work Location, and Training Center Location:
• # attendees in each category
• # attendees in each category as a % of all attendees
• Variance of training attendees in each demographic to total number of employees in population group (% attendees – % of each demographic group)
Comprehensive analysis of employees who attend training courses indicates gaps in utilization of formal training.

Category: Program Efficiency (Reporting: Monthly and Financial Year-to-Date)
• Cost: Total training cost per FTE ($); total training costs ($)
• Hours: # training hours per FTE
• Training FTE:FTE Ratio: # FTE per trainer FTE
• Delegate (Attendee) Costs: average travel + subsistence cost ($); average cost of training per delegate ($)
• Trainer Days: trainer days per trainer FTE
• Delegate Attendance: number of attendees per course category
• Delegate Cancellation Rates: # courses cancelled; total # courses scheduled; % cancelled; # employees dropped; total # employees registered; % dropped
• Key Performance Indicators: % “objectives met” for course-based training, by type of training; # delegate training days
Select efficiency measures are benchmarked to industry standards.

* Nationwide Building Society defines learning resources as an inclusive measure of non-classroom-based training and development products, including books, videos, and CBT modules lent out by the learning resource center.

Source: Nationwide Building Society; Learning and Development Roundtable research.
19
20
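The demographic variance metric above (% of attendees in a segment minus that segment's % of the overall employee population) reduces to simple percentage arithmetic. A minimal sketch follows; the segment names and counts are illustrative assumptions, not NBS data:

```python
# Sketch of an NBS-style demographic variance measure: % of training
# attendees in a segment minus that segment's % of the workforce.
# Segment names and counts below are illustrative, not NBS data.

def attendance_variance(attendees_by_segment, employees_by_segment):
    total_attendees = sum(attendees_by_segment.values())
    total_employees = sum(employees_by_segment.values())
    variance = {}
    for segment, employees in employees_by_segment.items():
        pct_attendees = 100 * attendees_by_segment.get(segment, 0) / total_attendees
        pct_employees = 100 * employees / total_employees
        # Negative values flag segments under-represented in formal training.
        variance[segment] = round(pct_attendees - pct_employees, 1)
    return variance

if __name__ == "__main__":
    attendees = {"full-time": 450, "part-time": 50}
    employees = {"full-time": 4000, "part-time": 1000}
    # full-time: 90% of attendees vs. 80% of workforce -> +10.0
    # part-time: 10% of attendees vs. 20% of workforce -> -10.0
    print(attendance_variance(attendees, employees))
```

A segment with a large negative variance is the kind of utilization gap the attendance analysis is meant to surface.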
Profile #6
Owens Corning University's Quarterly HR Update

[Exhibit: quarterly HR update charts tracking monthly figures (Apr.–Mar.) for program spend (Sales training $____; Six Sigma Program $____; SBD ____) and for delivery to hourly employees (# and %) versus classroom delivery (%).]
Profile #7
Putnam Investments' Balanced Scorecard L&D Metrics

Putnam Investments leverages a balanced-scorecard framework to communicate and measure L&D performance. Putnam Investments' learning function selected metrics to communicate performance to business leaders and customers with varying demands for value demonstration, preferences for measurement rigor, and an understanding of L&D measurement, vetting selected first-round metrics with senior stakeholders.

Balanced Performance Assessment

Putnam Investments organizes metrics through a structured balanced-scorecard framework.

Putnam Investments' Balanced Scorecard

Financial Perspective
• Training and Development ROI (select courses only): application to job requirements

Function Operational Perspective
• Training Penetration: students trained (# by course type; # by course; # by business unit; # by team); workforce trained (% by business unit; % by team)
Profile #8
Schwan's University's Measurement and Evaluation Strategy

Business-focused measurement is a key component of Schwan's University's (SU) strategy. With the goal of “developing and utilizing meaningful metrics to demonstrate the business value of its products and services,” Schwan devised a strategic measurement framework that is rooted in the philosophy of “purposeful measurement”—measuring only to make informed decisions about training and development.

SU's measurement philosophy is grounded in guiding principles that set clear parameters about the “what” and “why” of L&D measurement. To execute on these guiding principles, Schwan's measurement framework also articulates the specific business decisions and objectives that L&D metrics are designed to support. In turn, Schwan's metrics are organized into three distinct categories: operational metrics, customer satisfaction metrics, and program evaluation metrics.

Driving to Business Value

Schwan's measurement philosophy articulates clear guidelines for the use of L&D metrics.

Schwan's University's (SU) Measurement Framework

Schwan's University articulates important business decisions that require metrics to help make the right choices.

GUIDING PRINCIPLES OF MEASUREMENT
What: Only things Schwan's University and customers need to make informed business decisions
Why: To enable Schwan's University to make the right choices
• Only for making business decisions
• With proven methods
• Do well or not at all
• Acknowledge organizational context
• Keep customers in mind
• Use stakeholders' data when possible

KEY BUSINESS DECISIONS AND OBJECTIVES SUPPORTED BY METRICS
• Aligning with Customer Needs
• Increasing value and effectiveness of programs
• Promoting organizational and Schwan's University values
• Allocating Resources
• Managing Schwan's University like a business
• Supporting corporate objectives
• Improving relationships with internal customers
• Optimizing content and delivery channel mix
• Supporting people-development goals

WHAT SCHWAN'S UNIVERSITY MEASURES TO MAKE BUSINESS DECISIONS
• Operational Metrics (Question One)
• Customer Satisfaction Metrics (Question Two)
• Program Evaluation Metrics (Question Three)

* SU offers L&D solutions to internal and external business clients.

Source: The Schwan Food Company; Learning and Development Roundtable research.
Measurement categories and associated indicators enable credible and thorough evaluations.

Ethics Check
• How well did we live our stated values and principles as we provided this product or service?
• Were there any ethical trouble spots?
• If so, what should we do to repair damage?
• What should we do to avoid similar problems in the future? What would that take?

Learning Produced
• What learning or change in individual, group, or organizational capacity did this solution produce for Schwan's U?
• What would it take to make this learning permanent?

Source: The Schwan Food Company; Learning and Development Roundtable research.
Profile #9
TD Bank Financial Group's Annual Global Training Report

[Exhibit: training dollars ($) and training days (#) for 2002 versus 2003, broken out by Retail Distribution, Wealth Management, and Wholesale Banking.]

* All Self-Study Electronic and Self-Study Paper-Based training days are based on estimated completion time.

Source: TD Bank Financial Group; Learning and Development Roundtable research.
Profile #10
Texas Instruments' Training and Organization Effectiveness Balanced Scorecard

Texas Instruments' Training and Organizational Effectiveness (T&OE) group emphasizes performance measures that provide crisp data about the value of its products and services to internal customers, who are not required to use the T&OE group for training and development solutions. Measurement is also critical in the context of the group's business model—T&OE employs a 100 percent charge-back model, effectively exposing L&D to the rigors of the market and creating a critical mechanism for ensuring responsiveness to internal customer needs.

To capture and communicate value to its customers, T&OE maintains a balanced scorecard that is designed to foster accountability for meeting the needs of internal customers. Scorecard metrics are designed to ensure focus on the most critical drivers to executing T&OE's business strategy. Ultimately, the metrics in each scorecard category provide T&OE with information it needs to understand its competitive value and promote T&OE as the business partner of choice for line managers across the organization.

Supporting the Business Model

Texas Instruments' balanced scorecard promotes responsiveness to internal customer needs.

Texas Instruments' Training and Organizational Effectiveness (T&OE) Balanced Scorecard

Financial (critical financial metrics track progress in breaking even by year's end)
• Objective: Stay at or under annual budget (hard cap on spending). Measure: Gross cost (total T&OE spend), $ monthly.
• Objective: Hit zero net cost target (break even on expenditures by year's end). Measures: Net cost (total cost of services rendered – revenue intake), $ monthly; forecasted cost, $ monthly; actual cost, $ monthly; variance between forecasts and actuals, $ monthly.

Customer
• Objective: Achieve general TI population satisfaction in enabling employees to perform at a higher level. Measure: Customer satisfaction for each catalog class.
• Objective: Ensure relevance of training catalogue programs. Measure: Annual catalogue rationalization: keep % _; drop % _.
• Objective: Enable business leaders to visualize value by demonstrating returns on major learning events. Measure: Determine ROI ($, behavior change, or other performance outcomes) for high-priority, customized learning events.
The learning function measures its proficiency at brokering high-quality resources, and meaningful ROI metrics for major training and development initiatives communicate competitive value.

Business Process
• Objective: Manage vendors to an exacting standard. Measure: Supplier ratings, “top 10” and “bottom 10” rankings.
• Objective: Promote trainer effectiveness and quality. Measure: Training instructor ratings, “top 10” and “bottom 10” rankings.
• Objective: Track progress in resolving key training and development service issues. Measure: Average time required to resolve customer issues.
• Objective: Promote T&OE knowledge sharing on key action issues. Measure: Effectiveness of quarterly meetings to update T&OE function on action item status and results.

Innovation and Learning
• Objective: Develop and execute plans and services aligned with organizational and line customer priorities. Measure: Compile quarterly results on new products introduced to TI customers (# _) and results on action plans completed (% _).
• Objectives: Retain top T&OE talent; raise overall performance of T&OE staff; ensure T&OE has right people in right roles. Measure: Effectiveness of quarterly T&OE talent reviews to examine performance, map out talent needs, and devise development and deployment strategies.

Source: Texas Instruments Incorporated; Learning and Development Roundtable research.
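Under a 100 percent charge-back model, the financial category of such a scorecard reduces to monthly arithmetic on spend, charge-back revenue, and forecasts. A minimal sketch follows, using hypothetical figures rather than TI's actual accounting:

```python
# Sketch of T&OE-style monthly financial metrics under a 100 percent
# charge-back model. All figures here are hypothetical illustrations.

def monthly_financials(total_spend, revenue_intake, forecasted_cost):
    return {
        "gross_cost": total_spend,                  # total spend for the month
        "net_cost": total_spend - revenue_intake,   # break-even target: zero
        "variance": total_spend - forecasted_cost,  # actual vs. forecast
    }

result = monthly_financials(total_spend=1_200_000,
                            revenue_intake=1_150_000,
                            forecasted_cost=1_250_000)
# A positive net_cost (here 50,000) means the group has not yet recovered
# its full spend through charge-backs.
print(result)
```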
Profile #11
Textron's Balanced Scorecard

Textron's enterprise-wide balanced scorecard highlights three learning and development-oriented metrics that support the organization's talent objectives. As the L&D function is in the early stages regarding some areas of performance measurement, it initially opted against the inclusion of complex metrics; instead, the function selected simple success measures that would prove meaningful to internal business partners.

For both education and development and inclusive workforce metrics, a benchmark value is identified from best-in-class companies across the industry. For talent mobility, targets are set to achieve a desired level of internal moves based on the organization's business needs.

Measuring Against Organizational Performance Targets

Textron communicates L&D performance against targets for critical corporate objectives.

Textron's Balanced Scorecard

L&D roll-up metrics: L&D metrics are rolled up to the organization-wide level from business unit reviews. Inclusive Workforce and Talent Mobility metrics map to strategic focus areas, and the Premier benchmark enables comparison of the organization's L&D measure against a world-class standard.

Scorecard Category | Key Performance Measures | Current Year Goal | Future Target | Premier Benchmark

Successful Customers
• Customer Satisfaction Process: X% | X%
• Organic Revenue Growth: X% | X%
• New Products and Services: X% | X%

Talented People
• Education and Development Hours per Employee: XX hours | XX hours | XX hours
• Inclusive Workforce: X% | X% | X%
• Talent Mobility: X% | X%

World Class Procedures
• Integrated Supply Chain Benefit: $M | $M
• Recordable Injury Rate: X | X
• Lost Time Injury Rate: X | X
• Textron Six Sigma Benefit: $M | $M
• Price–Cost Ratio: X% | X%

Industry Leading Performance
• Portfolio Performance: X% | X%
• Textron ROIC: X% | X%
• NOP Margin (Before Restr.): X% | X%
• Free Cash Flow (% of Net Inc.): X% | >X%
• Earnings per Share: $ | $
• P/E vs. Peer: Quartile | Quartile

Business targets for average hours spent in training vary, but a baseline standard ensures compliance against the target goal. The inclusive workforce measure captures the extent of diversity among the organization's global employee base, and the talent mobility measure captures different types of internal moves among senior-manager ranks to fully track cross-pollination of talent.
Profile #12
Vanguard University–HR–Corporate Dashboard Linkage

[Exhibit: dashboard audiences. Corporate dashboard: CEO, board directors and officers, and senior management team. Group Services (HR) dashboard: MD of HR and Managing Director-level metrics owners. Vanguard University (VU) dashboard: Principal and Principal-level metrics owners.]

* The Vanguard Group refers to its employees as crew members.

Source: The Vanguard Group, Inc.; Learning and Development Roundtable research.
A critical feature of VU's dashboard is its underlying links to HR priorities and organizational objectives. As demonstrated in this graphic, VU has clearly articulated the perceived relationships between training, HR, and corporate drivers and outcomes. In this example, VU shows that the design of its training programs partly impacts learning results such as the extent of learner satisfaction, the amount of skills or knowledge acquired, and on-the-job learning application. In turn, learning results partly influence the overall performance of training and other HR drivers which, in turn, link to the level of employee engagement and effectiveness.

Linking L&D Metrics to HR and Corporate Indicators (Continued)

VU's dashboard clearly links with key metrics within the HR dashboard, which, in turn, roll up to select measures within the corporate dashboard.

[Exhibit: three linked driver–outcome maps. The Vanguard University dashboard (drivers include Course Design, Course Delivery, University-Wide Reputation, Operational Excellence, Training Cost Effectiveness, and Business Environment; outcomes include Learning Results and Strategic Progress) feeds the HR dashboard (drivers include Recruiting, Training Effectiveness, Total Rewards Expense, Operational Excellence, Reputation as Employer, and Business Environment; outcomes include Crew* Loyalty and Crew* Effectiveness (Total Organization)), which in turn feeds the corporate dashboard (drivers include Crew* Satisfaction and Effectiveness, Core Processes, Leadership, Client Loyalty (Total Organization), and Operational Excellence; outcomes include Relative Fund Performance, Operating Profitability, Product, Services & Market Development, Sales and Marketing Effectiveness, Net Cash Flow, and Workforce Results (Total Organization)).]

1. Detailed measures compose each of the driver categories, which relate to discrete metrics within the outcomes section of the VU dashboard.
2. Key indicators from the Learning Results category roll into the Training section of the HR dashboard along with other driver categories, which, in turn, relate to the Crew* Loyalty and Crew* Effectiveness categories.
3. Key metrics from the HR dashboard appear in the Crew* Satisfaction & Effectiveness section of the corporate dashboard.

* The Vanguard Group refers to its employees as crew members.

Source: The Vanguard Group, Inc.; Learning and Development Roundtable research.
Underlying the VU dashboard categories related to training drivers and outcomes are a rich set of operational measures and program evaluation results that are maintained by select process owners within the L&D function and supported by HR partners with Six Sigma expertise. These process owners are responsible for monitoring performance, updating the underlying dashboard metrics with current information, and launching and implementing Six Sigma projects where deemed necessary by the L&D function.

This structure enables senior executives and HR and L&D staff to drill down into each metrics category and subcategory to understand the root causes of existing or potential problems designated by red and yellow stoplight colors.

Linking L&D Metrics to HR and Corporate Indicators (Continued)

Beneath the key measures of the dashboard lies a detailed set of metrics that can be broken down by specific L&D teams to facilitate further root cause analysis.

VU Training Drivers Measures Matrix (Illustrative)
Columns: Measure | Stoplight | Nov 2003 | Goal | YTD | 2002

Course Design
• Design Project Status (Green/Yellow/Red): X/X/X | FYI | Rolling | Rolling
• Design Applied Time—Billable %: X% | X% | X% | X%
• Design Capacity—Projected (YTD – 3 mo. Proj.): X% | X% | X% | Predictive
• Design Client Satisfaction Scores: X | X | N/A | X

Course Delivery
• Trainer Quality—Level 1: X | X | X
• e-Learning Percentage of Courses: X% | X% | X% | X%
• Trainer Utilization: X% | X% | X% | X%
• Trainer Capacity—Projected: X% | X% | X% | Predictive
• Overall Hours per FTE (YTD Annualized): X | X/Yr | X | X
• Hours per Tenured FTE (YTD Annualized): X | X | X
• Hours per Tenured FTE—CW/Dept: X/X | FYI | X/X | X/X

Operational Excellence
• No Show/Late Cancel/Incomplete %: X% | X% | X%
• Canceled Class %: X% | <X% | X% | X%
• Cost per Training Hour: $X | $X | $X | $X
• Publications Cycle Time (Days/Project): X Days | <X Days | X Days | X Days
• Wait List Analysis: X Sessions Needed | Sessions Mixed | Rolling | N/A

Source: The Vanguard Group, Inc.; Learning and Development Roundtable research.
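The red/yellow/green stoplight designations described above amount to threshold checks of actuals against goals. A minimal sketch follows; the 90%/80% tolerance bands are assumptions for illustration, not Vanguard's actual rules:

```python
# Sketch of a red/yellow/green stoplight check for a dashboard measure.
# The 90%/80% tolerance bands are illustrative assumptions, not VU's rules.

def stoplight(actual, goal):
    ratio = actual / goal          # assumes higher actuals are better
    if ratio >= 0.9:
        return "green"             # at or near goal
    if ratio >= 0.8:
        return "yellow"            # watch; may warrant investigation
    return "red"                   # drill down for root causes

print(stoplight(actual=72, goal=80))   # 90% of goal
print(stoplight(actual=55, goal=80))   # roughly 69% of goal
```

Red and yellow results are the entry points for the drill-down and root-cause analysis the profile describes.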
Profile #13
W.W. Grainger's Operations and Training Delivery “Cockpit Charts”

… funding model, GLC reviews the ratio of GLC- and non-GLC-provided training (e.g., external vendors) to ensure that it effectively brands and markets its products and services, a particularly significant challenge given its geographically dispersed workforce.

[Exhibit: GLC's operations “cockpit chart,” with monthly (J–D) panels tracking EDC* % occupied and training $**; GraingerLearningCenter.com registrations (# of registrants); GLC class participation (instructor-led and non-ILT learners)***; average cost per learner (ILT and non-ILT average cost); customer service requests (calls resolved by GLC employees and e-mails resolved by SSC****); trouble ticket allocation (# of trouble tickets); and unique learners per work area (# and % of employees in training, by business area).]

* EDC refers to Employee Development Center.
** In thousands.
*** Doesn't include GLC chargeback participants.
**** SSC refers to Support Services Center.

Source: W.W. Grainger, Inc.; Learning and Development Roundtable research.
In addition to its operations “cockpit chart,” GLC captures more detailed metrics related to the cost and quality of training delivery. Aside from tracking the amount of spend and number of instructor hours dedicated to training delivery, GLC examines the number of class management process errors, specifically breaking them down by registration, scheduling, and roster issues. This granular-level reporting enables the L&D function to immediately address administrative errors and ultimately ensure the process quality of its offerings. The training delivery dashboard also captures learner satisfaction with instructors and training facilities for its most popular courses.

While operational metrics certainly do not reflect GLC's core value proposition of providing workforce capabilities and organizational performance, measuring throughput provides the L&D function with reliable indicators of resource productivity. For example, by tracking the time and money its staff devotes to training delivery, GLC is better able to discern if it is effectively allocating adequate resources to such activities. In addition, the L&D function is also able to leverage these measures to demonstrate additional value to the organization through cost and process efficiency.

Contributing to Organizational Performance Through Operational Excellence (Continued)

GLC maintains a dedicated training delivery dashboard that shows its amount of spend, number of instructor hours, process quality, and satisfaction ratings for instructors and facilities.

GLC's Training Delivery “Cockpit Chart”

[Exhibit: panels showing travel and entertainment spend and program delivery spend (planned vs. actual, $ in thousands, December and YTD); RTM1/BST2 instruction hours (# of hours, monthly J–D); instruction days for sales RTMs1, service RTMs1, and BSTs2 (actual vs. goal, % of business days, monthly and YTD); class management process errors (# of errors by cutoff, scheduling, and roster issues, monthly J–D); and instructor and facility approval ratings (%) for the Successful Service Skills and Dimensions of Professional Selling courses (December and YTD).]

1 RTM refers to regional training manager.
2 BST refers to branch services trainer.

Source: W.W. Grainger, Inc.; Learning and Development Roundtable research.
Profile #14
W.W. Grainger's “Voice of the Customer” Annual Survey Results

In turn, GLC summarizes its understanding of the survey results to communicate back to managers and senior executives, specifically identifying key areas of effectiveness and priority improvement and sharing its initial strategies for addressing outstanding customer needs. While GLC primarily uses the survey results to effectively allocate and prioritize its resource investments, it also leverages the data to demonstrate the value it has already delivered to managers and senior executives.

[Exhibit: each survey item is rated on 0–5 (Low–High) importance and satisfaction scales, and results are plotted on a quadrant chart (Low/High Importance by Low/High Satisfaction). Sample item, Question #11 (Alignment with Internal Customer Priorities): “How would you describe the importance and your satisfaction with GLC's ability to integrate your business unit's needs into the content of training?”]

Survey questions emphasize GLC's working relationships with managers and senior executives and its ability to align training courses to business needs. GLC uses its survey results to identify priority-improvement areas.

Source: W.W. Grainger, Inc.; Learning and Development Roundtable research.
GLC's survey results create an especially powerful resource for communicating the contribution of the L&D function. Using the feedback obtained from managers and senior executives, GLC can effectively articulate its value in the voice of its customers.

Demonstrating Value Through the Voice of the Customer (Continued)

Survey results enable GLC to illustrate its contributions to business-unit priorities in the voice of its internal customers.

Each item below is reported with three ratings. Key: I = Importance (0–5); S = Satisfaction (0–5); G = Gap between importance and satisfaction ratings (Importance – Satisfaction).

Aggregate Results
• Overall Performance of GLC

Working Relationship with GLC Team
• Partnering Efforts to Meet Your Needs
• Effective Management of Department by GLC Leadership Team
• GLC's Relationship-Building Skills with You
• Communication of GLC's Current Year Objectives
• VP's Relationship-Building Skills with You

Alignment with Internal Customer Priorities
• Integration of Your Business Unit's Needs into the Content of Training
• Development of Course Offerings to Proactively Address Your Business Needs
• Design of Course Content Anticipates the Future Needs of Your Business Function
• Created Plans for Learning Transfer and Reinforcement
• Customization of Courses to Meet Your Specific Business Needs
• Sensitivity of Courses to the Needs of the Individual Student

Type and Quality of Training Courses
• Expertise of Course Instructors
• Overall Quality of Courses
• Course Offerings Appropriate to My Job
• Availability of Training Methods (e.g., Online, Instructor-Led, etc.)
• Classes Offered in Appropriate Locations
• Diversity/Variety of Course Offerings
• Courses Offered to Support Performance Excellence Planning
• Quality of GLC Communications

Administrative Capabilities
• Informing Participants of Scheduling Changes and/or Class Cancellations
• Convenience of Course Enrollment
• Publicizing the Availability of Classes Well Enough in Advance
• Creating Added Value for a Reasonable Cost
• Usability of Learning Management System (Including Online Course Catalog and Registration)
• Responsiveness to Your Feedback/Complaints

Quality of Feedback/Evaluation
• Measure Effectiveness of Strategic Courses
• Implementation of Changes and Improvements Based on Feedback
• Seeking Feedback on GLC Course Offerings
• Post-Course Evaluation Process

Source: W.W. Grainger, Inc.; Learning and Development Roundtable research.
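The gap score used in GLC's survey reporting (G = Importance − Satisfaction) and the importance/satisfaction quadrants can be sketched as follows; the 2.5 midpoint and the ratings below are invented for illustration, not actual GLC survey data or method:

```python
# Sketch of GLC-style survey scoring: gap = importance - satisfaction on
# 0-5 scales, plus a quadrant label. The 2.5 midpoint and the ratings
# below are illustrative assumptions, not GLC's published method.

def score_item(importance, satisfaction, midpoint=2.5):
    gap = round(importance - satisfaction, 1)
    quadrant = ("High" if importance > midpoint else "Low") + " Importance, " \
             + ("High" if satisfaction > midpoint else "Low") + " Satisfaction"
    return gap, quadrant

gap, quadrant = score_item(importance=4.6, satisfaction=2.1)
# Items with large positive gaps in the "High Importance, Low Satisfaction"
# quadrant surface as priority-improvement areas.
print(gap, "|", quadrant)
```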
L&D Non-Program Metrics Inventory
Effort and Investment Allocation
47. Channel Delivery Mix: Percentage breakdown of L&D solutions offered by delivery channel (e.g., classroom, e-learning)
48. Channel Investment Mix: Percentage breakdown of L&D spend by delivery channel (e.g., classroom, e-learning)
49. Formal Learning Intensity: Number of instructor-led classes and e-learning courses versus total number of L&D solutions (including informal learning)
50. Blended Learning Intensity: Percentage of blended learning (e.g., instructor-led classroom, e-learning, and action learning) L&D solutions
51. Experience-Based Learning Intensity: Percentage of experience-based (e.g., action learning, lateral job rotations, stretch assignments, participation in “shadow” cabinets) L&D solutions
52. Relationship-Based Learning Intensity: Percentage of relationship-based (e.g., mentoring, reverse mentoring, executive coaching) L&D solutions
53. Content Delivery Mix: Percentage breakdown of L&D solutions delivered by content area (e.g., sales training, IT training, leadership development, etc.)
54. Content Source Mix: Percentage breakdown of L&D solutions by content source (e.g., off-the-shelf, internally developed, externally developed by vendor/consultants)
55. Content Investment Mix: Percentage breakdown of L&D spend by content area (e.g., sales training, IT training, leadership development)
56. Learner Segment Delivery Mix: Percentage breakdown of L&D solutions delivered by learner segment (e.g., first-line supervisors, middle managers, senior executives)
57. Learner Segment Investment Mix: Percentage breakdown of L&D spend by learner segment (e.g., all employees, first-line supervisors, senior executives)
58. Line Partner Delivery Mix: Percentage breakdown of L&D solutions offered by business unit or functional area
59. Line Partner Investment Mix: Percentage breakdown of L&D spend by business unit or functional area
60. “Required” Investment Allocation: Percentage of L&D spend dedicated to government-mandated skills, certifications, and other compliance requirements
61. Business-Mandated Investment Allocation: Percentage of L&D spend dedicated to business-mandated skills
62. Discretionary Skill-Building Investment Allocation: Percentage of L&D spend dedicated to “discretionary” skills
63. Basic/Advanced Investment Mix: Percentage of L&D spend dedicated to “basic” and “advanced” training

Penetration and Utilization
64. Workforce Penetration: Percentage of employees who have accessed L&D solutions
65. Mandatory Training Compliance: Percentage of employees who have complied with mandatory training requirements
66. Average Course Utilization: Average number of class and course enrollments/registrations per employee
67. Absence Costs: Total capacity costs attributable to absences
68. Classroom Yield: Average capacity utilization per course
69. Course Density: Average number of learners accessing a course
70. Learning Portal Accessibility: Percentage of employees with access to the L&D portal (or HR portal with L&D-specific information)
71. Learning Portal Penetration: Percentage of unique users accessing the L&D portal (or HR portal with L&D-specific information)
72. Unique Portal Users: Number of distinct and registered users who accessed the learning portal within a given time period
73. Third-Party Courseware Utilization: Percentage of employees who have accessed vendor/consultant-provided courseware
74. E-Learning Learner Penetration: Percentage of employees who have used e-learning courses
75. E-Learning Course Completion: Percentage of employees who have fully completed e-learning courses
76. E-Learning Cost Avoidance: Total amount of L&D costs avoided through digitization of course content
77. LMS Workforce Penetration: Percentage of employees who have accessed L&D solutions through the LMS
78. LMS-Accessed L&D Solutions: Total number of L&D solutions accessed via the LMS
79. LMS-Based Registrations: Percentage of course enrollments/registrations conducted via LMS
80. LMS-Hosted Solution Utilization: Average number of L&D offerings accessed via the LMS per employee
81. LMS Training Hours: Total training hours tracked using the LMS
82. LMS Training Hours/Total Training Hours: Ratio of LMS training hours versus total training hours within a given time period
83. LMS Content Area Focus: Breakdown of LMS training hours by content area (e.g., sales training, IT training, leadership development)
84. New-to-Role Penetration Rate: Percentage of employees new-to-roles (e.g., from individual contributor to manager) who have completed required training
85. New-Hire Orientation Ramp Time: Percentage of new hires who have participated in orientation training within the desired time period
86. Instructor Utilization: Percentage of instructors actively teaching training courses within a given time period
87. Instructor Training Days: Average number of instructor days within a given time period
88. Top 10/Bottom 10 Instructors: Ranked list of top ten and bottom ten instructors based on aggregate satisfaction surveys of L&D staff, line partners, and internal customers
89. Instructor Expertise: Overall satisfaction rates with instructor's level of subject-matter expertise
90. Facilities Utilization: Percentage of facilities occupied for training purposes within a given time period
91. Course Optimization: Percentage of courses below the required registration threshold
92. Class Cancellation: Number of classes cancelled as a percentage of total classes offered within a given time period
93. Cancellation Costs: Capacity costs attributable to last-minute cancellations
94. Class Rescheduling: Number of classes rescheduled as a percentage of total classes offered within a given time period
95. Absence Rate: Percentage of employees who failed to participate in a registered course
96. Emerging Talent Participation: Percentage of HIPO population who have accessed L&D offerings within a given time period
97. Top 10/Bottom 10 Course Attendance: Ranked list of top ten and bottom ten courses attended
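Many of the penetration and utilization metrics above are simple ratios over enrollment or access records. A minimal sketch computing #64 (Workforce Penetration) and #92 (Class Cancellation) from hypothetical data:

```python
# Sketch computing two metrics from the inventory above over hypothetical
# records: #64 Workforce Penetration and #92 Class Cancellation.

def workforce_penetration(employee_ids_with_access, headcount):
    """% of employees who have accessed at least one L&D solution."""
    return round(100 * len(set(employee_ids_with_access)) / headcount, 1)

def class_cancellation_rate(classes_cancelled, classes_offered):
    """Cancelled classes as a % of total classes offered in the period."""
    return round(100 * classes_cancelled / classes_offered, 1)

# Hypothetical data: employee IDs appearing in LMS access logs.
access_log = ["e1", "e2", "e2", "e7", "e9"]
print(workforce_penetration(access_log, headcount=10))                   # 4 unique of 10
print(class_cancellation_rate(classes_cancelled=6, classes_offered=48))  # 6 of 48
```

Deduplicating employee IDs before dividing matters here: penetration counts distinct learners, not repeat visits.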
Retrieve this study and our other strategic research online. You may also order an unlimited number of hard copies of this and other studies.
Learning and Development Roundtable
ORDER FORM
The study titled Profiles of L&D Dashboards is intended for broad dissemination among L&D executives and staff within your organization. Members are welcome to order unlimited copies without charge. Online ordering is available at http://www.ldronline.com. Alternatively, you can call the Publications Department at +1-202-777-5921, e-mail your order to orders@executiveboard.com, or fax in the order form on this page. Additionally, members interested in reviewing any of the Roundtable's past strategic research are encouraged to request a listing of completed work.
Institution _______________________________________
Address _______________________________________
_______________________________________
_______________________________________
Telephone ___________________________________________