More Praise for Leading the Learning Function

“This is a wonderful guide to the elements that make a great culture of learning. Born from a true collaboration among
learning leaders—at organizations like Accenture, American Airlines, Deloitte, Grainger, IBM, and UPS to name a
few—Leading the Learning Function shares important new perspectives and effective strategies.”
—Britt Andreatta, PhD, CEO, Britt Andreatta Training Solutions

“The ATD Forum has done a masterful job compiling best practices, innovative tools and techniques, and strategic processes
from over 40 senior practitioner members. If you are looking for how-to examples to enhance performance and elevate the
value and impact of your learning function, look no further than this must-have resource for all learning leaders!”
—Rita Bailey, Owner, Up To Something

“The future of work may be uncertain, but one thing is for sure—there will always be a need for continual upskilling
of the workforce and talent leaders who can achieve it with a sound strategy and tried and tested tools and techniques.
This book showcases innovations for today and provides fuel for tomorrow.”
—Elaine Biech, Author, The New Business of Consulting and ATD’s Foundations of Talent Development

“Leading the Learning Function is full of real-life examples that demonstrate how to turn leadership theory into practice.
You’ll find tips, tools, and techniques that will help you excel at any level of leadership. Read this book and be a better
learning leader!”
—Ken Blanchard, Co-Author, The New One Minute Manager and Leading at a Higher Level

“This extensive, but practical, collection of processes, practices, stories, tools, and techniques from successful
practitioners in a variety of industries provides the fundamentals for enabling all learning leaders to build organizational
capability that produces impactful results. Leading the Learning Function is the new go-to source for all talent professionals.”
—Marshall Goldsmith, New York Times #1 Bestselling Author, Triggers, Mojo, and What Got You Here Won’t Get You There

“Compulsory reading for any future-focused leader wanting to put talent development at the heart of their organization’s
success. Leading the Learning Function draws into one volume the distilled wisdom of seasoned learning leaders who prove that
intentional learning transforms organizations.”
—Jonathan Halls, Author, Confessions of a Corporate Trainer

“This book is a smorgasbord of learning and development ideas, content, and inspiration that will provide anyone leading
the L&D function with targeted, focused, and actionable information to take their organization to the next level and beyond.”
—Karl M. Kapp, EdD, Professor of Instructional Design and Technology, Bloomsburg University

“Experience is the best teacher as long as we learn from it. And that’s what the contributors to this book have done.
Kudos to MJ Hall and Laleh Patel for bringing together a phenomenal group of on-the-ground experts who share their
strategies, techniques, dashboards, examples, best practices, and lessons learned. It’s a treasure trove of essentials for
the 21st-century talent developer who needs relevant, experience-based, and practical information to help guide them
through the challenges of preparing today’s workforce. If you want to thrive in the future, prepare in the present by
putting Leading the Learning Function to use in your organization.”
—Jim Kouzes, Co-Author, The Leadership Challenge, and Executive Fellow, Center for Innovation and Entrepreneurship,
Leavey School of Business, Santa Clara University

“Leading the Learning Function offers insights into the key leverage points for impact, innovation, and engagement. It is
overflowing with creative ideas and best practices from leading talent development practitioner experts who are serving
on the front lines.”
—Manuel London, PhD, Dean, College of Business, State University of New York at Stony Brook

“Leading the Learning Function is a must-read for anyone who needs practical, step-by-step guidance for identifying,
acquiring, and growing talent in any organization. Loaded with time-tested models, results-enabling questions, and
easy-to-implement best practices refined from the experiences of 31 expert TD professionals, this book provides timeless
resources every leader needs to maximize human potential with guided confidence and expertise. If you’re seeking a road
map for achieving real, measurable, and scalable impact, this is it!”
—Sardek P. Love, CEO, Sardek Love International

“Learning can be natural, yet leading learning requires strategy, agility, readiness, divergence, collaboration, and
leadership. Leading the Learning Function provides the tools and strategies essential for strategically driving learning in our
evolving organizations.”
—Elliott Masie, Chair, The Learning CONSORTIUM @ The MASIE Center

“In Leading the Learning Function, you’ll get a unique look into the challenges that experienced learning leaders face as you
read their reflections, practical experience, and advice. Additionally, the editors infuse a layer of value-adding insights to
help you see the larger story unfolding in talent development: organizations are transforming, and learning is the dynamic
that makes this transformation possible. These are creative times for the learning and development field, and the lessons
in this book will catalyze you and your team!”
—Pat McLagan, Author, Unstoppable You

“I found Leading the Learning Function to be highly pragmatic in its approach and thorough in its review of the latest
thinking in learning. I especially liked how it pivoted on creating programs directed at performance outcomes through a
powerful combination of the use of strategy, methodology, and technology.”
—Bob Mosher, CEO and Chief Learning Evangelist, Apply Synergies: THE 5 Moments of Need Company

“How do leaders add value? The Center for Leadership Studies, through the eyes of Situational Leadership®, says:
Leaders accelerate the development of those around them. Hall and Patel deserve thanks for their tireless efforts in
putting together a comprehensive reference that will undoubtedly help the leaders of learning functions around the world
achieve this objective!”
—Sam Shriver, EdD, Executive Vice President, The Center for Leadership Studies

“Congratulations to the more than 50 practitioners from across a variety of industries for this huge contribution to
the learning profession. Leading the Learning Function is full of practical tips, examples, and tools, written in a style that
encourages readers to experiment and take action. It is a must-have handbook for all talent professionals!”
—Brenda Sugrue, Global Chief Learning Officer, EY

“Leading the Learning Function is one of a kind: real learning leaders sharing their experiences and lessons learned. This is
not an academic or formulaic checklist of to-dos, but rather an unplugged collection of tips and resources for you to use
to excel at all phases of leadership—planning, strategy, execution, and measuring impact.”
—Megan Torrance, CEO, TorranceLearning

“What impresses me most about Leading the Learning Function is how it combines theory with practical application. As leaders,
we’ve all read about what should work in L&D; it’s refreshing to read instead about what does work. This book taps into the
expertise of industry leaders, packages that expertise into effective strategies and, most importantly, explores the why behind
applying those strategies to your business. I’d recommend it for any leader in the rapidly changing world of L&D.”
—Andy Trainor, Vice President, Learning, Walmart U.S.

© 2020 ASTD DBA the Association for Talent Development (ATD). All rights reserved. Printed in the United States of America.

23 22 21 20 1 2 3 4 5

No part of this publication may be reproduced, distributed, or transmitted in any form or by any means, including photocopying, recording, infor-
mation storage and retrieval systems, or other electronic or mechanical methods, without the prior written permission of the publisher, except in
the case of brief quotations embodied in critical reviews and certain other noncommercial uses permitted by copyright law. For permission requests,
please go to copyright.com, or contact Copyright Clearance Center (CCC), 222 Rosewood Drive, Danvers, MA 01923 (telephone: 978.750.8400;
fax: 978.646.8600).

ATD Press is an internationally renowned source of insightful and practical information on talent development, training, and professional development.

ATD Press
1640 King Street
Alexandria, VA 22314 USA

Ordering information: Books published by ATD Press can be purchased by visiting ATD’s website at td.org/books or by calling 800.628.2783 or
703.683.8100.

Library of Congress Control Number: 2020936056

ISBN-10: 1-950496-61-9
ISBN-13: 978-1-950496-61-7
e-ISBN: 978-1-950496-62-4

ATD Press Editorial Staff


Director: Sarah Halgas
Manager: Melissa Jones
Community Manager, Learning & Development: Eliza Blanchard
Developmental Editor: Kathryn Stafford
Production Editor: Hannah Sternberg
Text Design: Shirley E.M. Raybuck
Cover Design: Molly von Borstel, Faceout Studios

Printed by Versa Press, East Peoria, IL


Contents
Foreword by Tony Bingham..................................................................................................................................................v

Introduction................................................................................................................................................................. ix

Section 1: Setting Direction...................................................................................................................................... 1


Chapter 1: Syncing Up for Synergy
by Lisa Gary..................................................................................................................................... 3
Chapter 2: A Proactive Approach to Strategic Learning Alignment
by Sandi Maxey............................................................................................................................... 15
Chapter 3: Strategy Revealed: A Personal Experience Journey
by Teri Lowe................................................................................................................................... 27
Chapter 4: So, You Want to Be Strategic?
by John Kelly................................................................................................................................... 41

Section 2: Managing Processes and Projects..................................................................................................... 53


Chapter 5: Organizational Needs: Determining Gaps and Aligning Solutions
by Chris Garton and MJ Hall.......................................................................................................... 55
Chapter 6: Heeding the Call
by Jerry Kaminski With MJ Hall...................................................................................................... 71
Chapter 7: Leveraging Neuroscience in Learning Design
by Leanne Drennan, Casey Garhart, and Joan McKernan............................................................................ 85
Chapter 8: Iteration, Not Perfection
by Suzanne Frawley.......................................................................................................................... 95
Chapter 9: The 70-20-10 Framework: Options for Impact
by Alan Abbott and Rachel Hutchinson.............................................................................................. 109

Section 3: Leading and Developing People....................................................................................................... 121


Chapter 10: Identifying and Onboarding the Best Talent
by Sarah Siegel and Elizabeth Huttner-Loan.......................................................................................... 123
Chapter 11: Enabling Continuous Learning: Tools and Approaches to Make It Happen
by Laura Solomon and Caroline Fernandes............................................................................................. 135
Chapter 12: Develop Your Team to Develop Their Team
by Alissa Weiher With MJ Hall........................................................................................................ 155

Section 4: Making an Impact............................................................................................................................... 165


Chapter 13: Metrics Matter
by Rachel Hutchinson........................................................................................................................167
Chapter 14: A Dashboard Journey
by Ron Dickson........................................................................................................................... 181
Chapter 15: Impact: Making It Happen
by Graham Johnston..................................................................................................................... 197

Section 5: Stakeholder Collaboration................................................................................................................ 207


Chapter 16: Go Slow To Go Fast
by Marie Wehrung.......................................................................................................................... 209
Chapter 17: The Impact of Coalitions
by Bryan McElroy, Rachel Hutchinson, Emily Isensee, and David McGrath With MJ Hall............................... 227
Chapter 18: Structure and Governance
by Kozetta Chapman and Graham Johnston With MJ Hall....................................................................... 241

Section 6: Enabling Learning Using Technology............................................................................................ 253


Chapter 19: Is Tech the Answer?
by Jerry Kaminski........................................................................................................................... 255
Chapter 20: Taking the Fear Out of the Technology Jungle
by Terry Copley.............................................................................................................................. 267
Chapter 21: The Role of L&D in the Digital Age
by Brandon Carson......................................................................................................................... 279

Section 7: Innovation.............................................................................................................................................. 289


Chapter 22: Business Impact Through Learning Research and Innovation
by Dana Alan Koch, Michelle M. Webb, and Tanya Gilson........................................................................ 291
Chapter 23: Innovation in Learning
by Graham Johnston........................................................................................................................ 311
Chapter 24: Changing Times: Innovate for Impact
by Ann Quadagno and Catherine Rickelman........................................................................................... 319

Section 8: Leader Behavior and Practices........................................................................................................ 339


Chapter 25: Leadership: Enabling Others to Move the Needle
by MJ Hall.................................................................................................................................. 341
Chapter 26: Getting Better at Getting Better: Tools and Techniques
by MJ Hall.................................................................................................................................. 355

Acknowledgments................................................................................................................................................... 369

Appendix 1: Suggestions for Professional Organizations............................................................................ 373

Appendix 2: Technology Platforms and Systems........................................................................................... 375

Appendix 3: More Tools and Techniques for Enhancing Personal Leadership Capability................ 377

References................................................................................................................................................................. 381

About the Authors.................................................................................................................................................... 389

About the Editors..................................................................................................................................................... 401

Index............................................................................................................................................................................ 403

Foreword
Tony Bingham

“Learning is a force multiplier,” write MJ Hall and Laleh Patel in the introduction to Leading the Learning
Function: Tools and Techniques for Organizational Impact. It’s a compelling statement because it evokes the power
of what learning can accomplish. When harnessed, focused, and intentional, learning has the power to
shift behaviors, beliefs, outcomes, systems, and societies. We were already in an era of unprecedented
change before the coronavirus pandemic spread across the globe. In early 2020 we were seeing significant
shifts in business, in science, in creativity, in technology, and in jobs. The global shutdown that resulted
from the pandemic redefined almost every facet of society, and it is apparent that we stand at the edge of
what could be called the Era of Mass Disruption. The work of professionals who are committed to the
development of knowledge, skill, and capability in the global workforce is absolutely critical today.
Likewise, the role of leaders and the development of leadership capability is paramount if we are
to truly realize the potential of what talent development efforts can accomplish in organizations around
the world. Learning as a strategic function requires talent professionals to lead. Real leadership is about
growing future leaders while casting a vision for what is possible and how to achieve it.
Learning empowers. Developing talent empowers. Leadership empowers.
In 1991 when the ASTD Benchmarking Forum was created, the idea was simple: Provide a private
consortium in which learning leaders focused on metrics-driven scorecards and shared their data to
benchmark best practices. Over the years, the focus of the group expanded and so did the qualitative
and quantitative metrics and topics. Performance scorecards were incorporated into ATD’s State of the
Industry report, and the benchmarks into the association’s BEST Awards. As the field has developed
and matured, so has the work that Forum members come together to explore. Today, the ATD Forum
continues to nurture collaboration, share best practices, and solve business challenges. Members are
catalyzed and inspired by one another to stretch, adapt, experiment, and make their talent development
efforts future-ready.
Over the years, we’ve been careful to protect the nature of what the Forum does and accomplishes.
And members value the confidential environment in which they collaborate. Now, they are ready to share
some of the insights, techniques, tools, and best practices they’ve built as a multiplier of their own. When
we know better, we do better. The Forum members are sharing what they know in an effort to help other
learning leaders know and do even better work.
When you travel by plane, you know the view from 37,000 feet is different than it is at 200 feet. If
your trip is of any distance, it’s likely you’ll see changes in terrain and topography. You see landscapes
and the components of them. But pilots see things with a completely different perspective, because their
focus is on harnessing the technology, mechanics, and power of the plane to keep it on course, arriving
safely at the destination. I think this can be a powerful analogy for those in the talent development field.
Every TD practitioner needs to understand the pivotal time in history in which they are working. In
January 2020 at Davos, the World Economic Forum announced that within just 10 years, 1 billion people
would need to be reskilled. They coined the term “Reskilling Revolution.” That’s the 37,000-foot view.
The horizon is vast and complex, and the potential for our field to soar is remarkable.
Descend from that altitude though and you see the practical and nuanced work that must be done
in organizations, departments, and individual roles and responsibilities. It is here that talent development
professionals strategize, plan, and do the work of helping others grow knowledge, skills, and capabilities.
There are countless inputs, levers, and gauges to consider. There are plans and objectives to achieve.
And there is the execution that ties it all together, resulting in a learner going from point A to point B in
their development journey, improving their ability to perform more complex work in an environment of
constant change and disruption, and adding value to their organization.
Consider this book an instrument guide of sorts.
In here you will find stories and insights from your colleagues in a variety of industries who are shar-
ing actual examples of work they did, the results they achieved, the lessons they learned, and the tools and
techniques they used to drive results and impact their organizations. The power of the book isn’t found
in one particular example; rather, it’s the collective wisdom of successful practitioners that you will benefit
from. More than 50 members of the Forum contributed to this effort.
I owe a debt of gratitude to the contributors, interviewees, and reviewers who collaborated to make
this book a reality. The writing of the book exemplifies the ATD Forum community and its focus on
connecting, collaborating, and learning from one another. Like in any Forum gathering, the goal of the
book is to leverage practices that help you address business challenges in a way that enables performance
and continually builds capability at the individual, team, group, and organizational levels.

World-renowned leadership expert John C. Maxwell states, “Anyone can steer the ship, but it takes a
leader to chart the course. Leaders who are good navigators are capable of taking their people just about
anywhere.”
The right skilling of the workforce to meet the demands of the future is a significant task—and our
profession has the responsibility to chart the course and help make it happen. I’m grateful for the work that
leaders in ATD Forum organizations are doing and their willingness to share that work because it helps
us navigate a path forward in developing our own efforts and the impact they can achieve for the people
and organizations we serve. As we emerge from this current disruption and construct the “next normal,”
I encourage you to put what you learn here into action.
Together, we are creating a world that works better.

—Tony Bingham
President and CEO, Association for Talent Development
June 2020
Alexandria, Virginia

Introduction

Many say the definition of a leader is someone who can take others to a place they would not go by them-
selves. If this is true—as we believe it is—today’s frequently changing and complex workplace provides
a unique opportunity for leaders to excel. And to excel as leaders, what must they be able to know, do,
and communicate? Moreover, what changing technical areas must these leaders be experts in to build
and influence the individual, team, and organizational performance capabilities of the entire enterprise?
These are the challenges talent development professionals face and the questions we ask at the ATD
Forum; they are also the driving force of this book.
For more than nine years, the two of us have been the hub of the ATD Forum, a vibrant consor-
tium for connecting, collaborating, and sharing knowledge about anything related to talent devel-
opment. The goal of this peer-led group is to leverage lessons learned from future-ready practices
others within the community are using. This helps members stay ahead of the challenge curve to
support competitive advantage and build capabilities within their respective organizations.
There are several venues for sharing practices within the consortium: semiannual community
events, which are two-and-a-half day experiential labs on a topic selected by the host member; virtual
web sessions called ConnectSparks, which are one-hour discussions with subject matter experts;
ATD-sponsored, in-depth reports; benchmarking through short surveys; and informal conversations.
We are constantly amazed at the excellent practices these members demonstrate as they lead
their respective teams. But the most interesting part is seeing these practices expand in real time: One
member would share, another would ask probing questions, still another would make suggestions,
and then months later, in discussions, we would hear how the initial practice had served as a trigger
and was showing up in a slightly different form in another organization. Or, a team from a member
company had gone to another member’s company to benchmark, and from that perspective had
discovered other areas of shared interest, and they were now collaborating on another project.
This force multiplier effect of sharing learning reminds us of a quote from Seth Godin (2019):

Learning . . . is self-directed. Learning isn’t about changing our grade, it’s about changing
the way we see the world. Learning is voluntary. Learning is always available, and it
compounds, because once we’ve acquired it, we can use it again and again.

Our profession, like the workplaces we support, lives in a world the Center for Creative Leadership
recently labeled RUPT—rapid, unpredictable, paradoxical, and tangled. Some of the transformations
in the learning world are multifaceted and others are just new ways of operating, such as automating
a system. But change, either massive or small, is always difficult. Whether the change is implementing
a modern leadership process, scaling curriculum globally to accommodate new employees, upskill-
ing employees because of new products and services, or integrating emerging technologies for more
personalization of and access to learning resources, others may have gone through something similar.
When facing a challenge, having examples from others and learning from their experiences provide
ideas, insights, and motivation. No one is alone. When leaders engage with one another, especially those
from different industries with similar challenges, the possibilities are endless.

About This Book


While sharing learning practices, exchanging case studies, and swapping resources are the hallmarks of
the ATD Forum, we have found that these benefits are manifold—and we’re thrilled to extend our learn-
ing reach in this book, Leading the Learning Function: Tools and Techniques for Organizational Impact.
The Forum relies on volunteer members, and that is how this book was accomplished. After several
iterations to determine the outline of topics to cover, which included a survey to our 60-plus members, we
ended up with eight major areas, each including a variety of subareas. We then hosted a web session to
formally introduce the project, and followed that with an outline for those interested to sign up by topic.
The result was 44 expert content contributors presenting their best practices, innovative tools and tech-
niques, and general problem-solving methods for facing today’s business challenges for learning, either as
authors or interviewees.
These contributors focused on practices that are essential to developing performance behaviors to
achieve desired business results—but there are no magic formulas. Like their practices, their stories are
also different. Some are straightforward case studies with lessons learned. Others look a bit different than
what’s typically found in a business book. Instead of the usual expert-talking-to-novice approach, some
chapters read more like a novel; several even include self-talk about an experience. This mucking around
with ideas and questions is similar to how we acquire knowledge and skill in our daily lives: we struggle,
we talk with others, we search, we solve problems, we iterate, we see associations, we learn, and we do.

As editors, we are excited to share this body of knowledge and expertise with all talent professionals,
especially those leading the learning profession. This book was made possible by the collective contribu-
tions of more than 50 individuals representing more than 50 percent of ATD Forum member companies:
• 31 people contributed content as an author or co-author of a chapter
• 11 were interviewed about their thoughts on leadership
• 4 contributed interviews and case studies on tools
• 12 acted as content reviewers.
Our ultimate goal is to provide guidance on how Forum learning leaders carry out their roles to
assess and build organizational performance capability that supports the business’s competitive position
in their respective market. The actions they take use generic leadership and management skills and
address specific organizational learning pain points and challenges. The book’s 26 chapters are divided
into eight sections:
• Section 1, Setting Direction, looks at methods for proactively prioritizing and making
sense of the complexity of the work. It includes aligning learning solutions with the goals
and objectives of the enterprise to build capabilities. This involves a holistic view of the
organization and deep understanding of what enables competitive advantage.
• Section 2, Managing Processes and Projects, features ideas and practices for assessing,
understanding, and communicating performance needs and gaps within the organization and
ways to build solutions. It zeroes in on processes and constructs unique to the learning arena.
• Section 3, Leading and Developing People, examines ways leaders can personally create
the environments and opportunities for enabling others to excel in their roles as talent
professionals. The ideas begin at hiring and onboarding and extend through continual
coaching and encouraging their professional certification and credentialing. The section
includes a variety of ways to set others up for performance success by serving as a spark to
ignite curiosity, energy, and motivation, which can lead to new capabilities.
• Section 4, Making an Impact, considers how the learning profession builds capability,
enabling the organization to reach its business goals. Metrics, dashboards, and evaluation
processes are covered.
• Section 5, Stakeholder Collaboration, focuses on understanding, communicating with, and
influencing those we serve in our organizations. It recognizes the value of collaborative
partnerships, the ways learning can be structured to be more efficient, and the importance
of a governance board.
• Section 6, Enabling Learning Using Technology, explores everything from the basic
technologies available to track and monitor learning to emerging and disruptive
technologies that are changing the way learning content is designed, developed, delivered,
and consumed.
• Section 7, Innovation, recognizes how the future of work and learning is being fueled by
advances in technologies and neuroscience.
• Section 8, Leader Behaviors and Practices, is jammed with moments of impact when
leaders have had informal and formal opportunities to interact with and support others.
These stories zero in on the many ways leaders show up, move the needle on performance,
and continually reskill, upskill, and new skill themselves.
We hope this portfolio of personal stories, tools, techniques, and examples for solving business prob-
lems and challenges through capability building is both helpful in your current work and inspiring for
building more powerful learning in the future.
If you take action on these ideas by experimenting with new tools and techniques and expanding
your practice portfolio of resources, you and your team will be more able to address the performance
challenges your organization faces. We encourage you to ask thought-provoking questions to trigger
further research. But don’t stop at experimenting with these concepts and building capability in your
own organization. Be a leader who takes others to places they would not go by themselves—share
your own challenges and successes in building performance capability in case studies and through
articles, blogs, videos, and books. These actions will enable you to continuously get better at getting
better with your role: coaching, guiding, and encouraging others to be open to new opportunities.
Your result will be learning professionals who are masters at advising and guiding business leaders on
changes that influence the future of the organization—building performance capability at all levels
and in all areas!

—MJ Hall and Laleh Patel

Section 1
Setting Direction

Section 1, Setting Direction, looks at methods for proactively prioritizing and making sense of the complexity of the work for now and for the future. This involves a holistic view of the organization and a deep understanding of what performance enables competitive advantage. What does the organization want to accomplish in light of internal capability and external constraints and changes? How does the organization set priorities and communicate to all parts of the enterprise?

For the learning leader, this means being visionary at the learning level. But more important, it means being aligned and integrated with organizational goals and objectives, as well as the desired business results. This includes setting direction, deploying plans for execution, and simultaneously managing changes associated with both strategic transformations and daily transactions.

In chapter 1, Lisa Gary shares how Ingersoll Rand uses Lean and Hoshin Kanri to focus on a few vital priorities they call their North Box, and how this philosophy is replicated in every division to create alignment. This includes keeping two questions front and center in all decisions: Where will we play? and How will we win? The approach includes recognizing that while executives set direction, employees execute that direction through their respective actions and performance.

In chapter 2, we learn how a midsize bank aligns and integrates learning with the larger corporate goals and objectives using a four-step approach: know the business, build a business case, engage senior leaders, and communicate results. For each step, Sandi Maxey provides actions, tools, and thought-provoking stop and think questions. She also suggests using the business model canvas to provide a clearer perspective for how the parts fit into an operating system.

In chapter 3, we experience a personal journey of learning being pushed into the organizational strategy. Teri Lowe uses her own conundrum as the starting point, and then uses questions, actions, reflection, and feedback to sort through the information and share her point of view of the big picture and how learning fits into the overall strategic system.

The personal journey described in chapter 4 starts with the desire to be strategic. The first lesson is that having the title does not make one strategic. However, through continuous research and lots of experiences, John Kelly recognizes that the first step is summarized by ACT: the need for access, credibility, and trust. He blends this with a framework from Peter Block and integrates project management techniques to develop his formula for being future focused.

1
Syncing Up for Synergy
Lisa Gary

Open your email, access social media groups, read professional journals, or attend a course or conference.
What is a consistent theme learning and development (L&D) professionals hear? For learning to be effective in
delivering the performance needed for business results, your learning assets must be strategic and aligned with the organization’s
needs and goals both now and for the future.
But how does this really happen? In basic terms, it means that the learning leader is future focused
and has the experience and skills to align their functional work of developing people with the organiza-
tion’s future needs. Another way to put it is strategically developing organizational capability.
Future focus and alignment are especially critical for learning professionals because we have multi-
ple roles along a functional spectrum. On one end of the spectrum, we serve as technical and functional
experts responsible for developing cognitive, performance, and behavioral intelligences using learning and
training best practices. On the other end, we serve as business learning advisors assisting senior executives
to build competitive advantage at the organizational systems level. Our role focuses on assisting senior
executives with clarifying strategic priorities, determining use of resources, and identifying gaps and road-
blocks for productivity and results.
No matter the role, the primary goal is enabling the organization to succeed, however success is
defined, by increasing the overall capability to be competitive. This workforce capability is a force multi-
plier for success, which includes having all employees performing at desired levels, constantly building new
capabilities, and being inspired to excel personally and professionally. But more important, employees
need the knowledge and skills to work collectively as interdependent teams with a common mission and
common goals.

Our success as L&D professionals is defined by creating opportunities for developing workforce capa-
bilities at the individual, team, and organizational levels, even as the pace of change accelerates.

Various Roles Require Numerous Skills


To be proficient at building competitive advantage in the complex and ambiguous environment that
pervades every industry, learning professionals need to be skilled in organization development practices,
such as leading change, catalyzing innovation, and future visioning. They need to understand the business
and know how products or services are produced and delivered to make money for the organization. They
need to have financial acumen. Additionally, they need to understand how the organization is designed
and how it operates—especially how all the parts fit together. They need an in-depth understanding of
how the organization’s people, processes, and work are connected to deliver results. This understanding
requires systems thinking to see the organization as an interdependent ecosystem; critical thinking to
analyze, synthesize, and evaluate information to make decisions; and strategic thinking to have a long-
term perspective to address challenges and solve problems by identifying and closing gaps and building
competitive advantage.
Having the requisite knowledge and skills is the starting point. But learning leaders also need to
act, which means having a portfolio of tools, techniques, and practices to implement as needed. There
are several elements to consider when working with enterprises interested in gaining greater competitive
advantage through organizational alignment. These include having a clear direction or focus, understand-
ing the customers and their changing requirements, knowing where and how you compete (for example,
low cost, customer service, or product innovation), having internal processes to support customers, using
measurements to monitor progress toward the desired state, and constantly evaluating the impact of the
collective work. These elements are anchored in a clear understanding of organizational challenges, goals
and objectives, strategies designed to meet these challenges, awareness of the competition, and the change
needed to continue to excel. Additionally, all employees need to be motivated and incentivized toward this
end, thus creating a winning culture.
While there are many options, frameworks, models, and ideas for getting better at organization devel-
opment, especially in aligning processes and performance, the following four suggestions can jump-start
your journey.

Develop Strategic Insights


Gathering data and asking questions are critical for moving from a focus on developing tactical training to
serving as a business learning advisor with an understanding of the organizational system. Two places to
start are the self-assessment tools offered by Peter Drucker (2010) and the Organization Profile from the
Baldrige Criteria for Performance Excellence, offered online. Both tools focus on general questions you
need to ask to gain strategic insights, such as:
• Mission: Why does the organization exist, or what is our purpose?
• Vision: What is our aspiration, or what do we want to be? (For example, are we laying
individual stones or building a cathedral?)
• Values: What do we believe? (Values are not the statements on the wall. Values are the way
life is lived every day, the way the organization does business. The values are the criteria against
which all decisions are compared before actions are taken.)
• Customers: Who receives our products or services?
• Customer requirements: What are the products or services our customer groups need
from us?
• Critical processes: What key crosscutting operational processes do we need to employ to
support our customers with a high level of satisfaction?
• Measurement: What metrics will show how our processes are doing in support of the
customer requirements and determining satisfaction?
Once the direction is known, we continually review our situation in terms of:
• What are the internal and external challenges keeping us from being competitive within our
market (for example, being the supplier of choice)?
• What are our goals at the highest level?
• What objectives will enable our highest-level goals?
• How will we reach the objectives and goals (that is, what is the game plan—strategy—for
reaching these goals to overcome challenges)?
• What tactics and methods are needed to execute the strategies with a high degree of success?
• How are we developing and incenting our workforce to ensure they are a competitive advantage?
Gathering data and asking questions about the measurement system are critical, but that’s not
complete in and of itself. It is imperative to use analytical methods to dissect and make sense of the data.
Talent professionals typically see the goals and objectives and then focus on building individual capa-
bilities to address them. According to Alec Levenson (2016), however, for individuals to be successful
they need the right combination of three factors: individual competencies, motivation and attitudes, and
aligned job design—including whether a job is aligned with the larger group. This increases the scope of
our work to continually building intellectual capital and capabilities at the group or working-team level. As
Levenson states, “We need to know whether the whole of the group’s output is greater than the sum of the
individual jobs’ tasks. If not, then talent at the group level is not performing as it should.”

Link to the Most Senior-Level Goals


Another way L&D professionals can ensure strategic alignment is by making a direct connection to the top
goals of senior business leaders for the various business units. Put yourself in their position. What are they
being held accountable to deliver? Is their goal cost reduction (which could include process improvements,
decreasing errors, new technologies, or increased employee productivity)? Is the goal revenue generation
through new sales or higher customer satisfaction (which increases loyalty and expanded sales)? Or is their
goal regulation and compliance to prevent errors?
If your organization uses a goal deployment process, aligning the learning function is generally
easier. The goal deployment process, or Hoshin Kanri, is a Lean planning and execution method for
ensuring the strategic goals of a company drive progress and action at every level. A Lean organization
understands customer value and focuses its key processes to continuously increase it with zero waste. To
accomplish this, Lean thinking changes the focus of management from optimizing separate technologies,
assets, and vertical departments to optimizing the flow of products and services through entire value
streams, which flow horizontally across technologies, assets, and departments to customers.
Eliminating waste along entire value streams, instead of at isolated points, creates processes that need
less human effort, space, capital, and time to make products and services at far less costs and with much
fewer defects, compared with traditional business systems. Companies are able to respond to changing
customer desires with high variety, high quality, low cost, and very fast throughput times. Information
management also becomes much simpler and more accurate.
Using the Hoshin Kanri process at the organizational level enables L&D to be aligned and integrated
with strategic priorities—what the process calls the North Box. This helps keep the talent professionals
from becoming order takers from individual managers in siloed departments. It also facilitates a focus on
the projects that are likely to have the greatest impact at the system level. With this focus and direction,
L&D can more easily say no to requests that might be pet projects and not part of the North Box. Addi-
tionally, the structured approach of Hoshin Kanri spills over to the way L&D itself operates; for example,
the way it designs, develops, pilots, and measures projects. This has a compound effect because L&D then
uses more business language and less learning language.

Have an Executive Sponsor


Another way to ensure strategic organizational alignment is having an executive sponsor for your learning
solutions. A sponsor is someone willing to dedicate their time, effort, energy, political capital, and leader-
ship to delivering the business outcomes. If L&D is driving the solution or we, as learning professionals,
care more about the outcome than helping the business leaders address their challenges to meet their
goals, we need to stop and assess what we are doing. The best part of this organizational alignment strat-
egy is that L&D does not move forward with an initiative unless it has an executive sponsor.
Here’s an example of this strategy at work: Ingersoll Rand (IR) set a North Box goal that the whole
organization would understand its newly designed materials playbook and deliver cost reduction to its
operations. As a manufacturing organization, IR has a lot of expense and risk associated with the materials
it takes to manufacture its products. The executive sponsor, who was the head of integrated supply chain,
decided that everyone should go through a custom materials learning path to ensure that they understood

the playbook. The Ingersoll Rand University (IRU) learning team met with this leader on a monthly basis,
co-designing the solution that he then signed off on. The IRU team met with him to provide feedback and
deliver completion reports so he could report to the executive team on the progress. The business impact
was that they were able to cover material inflation and manage it in a strategic way. This helped to create
consistency in operations globally. Because of this result, the alignment and execution for future endeavors
remain well structured and continue to be followed. The head of integrated supply chain delivered incred-
ible lasting business value, and the learning function enabled his and the organization’s success.

Create a Governing Strategy


Another way to ensure alignment and focus is to get help prioritizing your learning function’s initiatives
by creating a governance strategy. Most organizations talk about driving performance. Ingersoll Rand, on
the other hand, focuses and acts on delivering performance. This distinction—which is built on its vision,
mission, purpose, and values—is what enables IR to align its strategy to achieve big goals.
How do you align your learning organization to deliver both the what and the how for ensuring
performance is delivered? Let’s face it, there is never enough time or resources to support everything that
is asked of the learning function, so you need a mechanism to help with this critical process. Said another
way, what requests can you say no to? A key way to achieve this organizational alignment is to form a
governance board. Ingersoll Rand’s enterprise learning development and prioritization strategy is directly
aligned because of the engagement and involvement of its governing board.
This governing board is chaired by the chief learning officer, and the chief executive officer, chief
human resources officer, and vice president of talent are standing members. It also includes key business
and function leaders of the executive leadership team. The governing board meets twice a year to ensure
critical business alignment to the learning strategies and provide approval for investments in key talent and
leadership development programs. This ensures resources are dedicated and prioritized to the initiatives
that are most critical for the company. This powerful relationship between the learning function and the
executives has enabled continued investment in learning and development, even during economic down-
turns, and delivered incredible business results time and time again for the company.
It’s important to note that we never take for granted what makes us unique. While many organiza-
tions may find value in providing learning assets that are much broader in application and general usage,
the training our corporate university provides is directly aligned with the business; that allows us to work
more efficiently to solve business needs in real time. Because of our involvement and investment in the
strategy governing board, we always have our finger on the pulse of the organization’s strategy and objec-
tives, and therefore its learning and change needs.

“We decided early on that we cared about how our people achieved results, not just what the results were.”
—Ingersoll Rand CEO

Moving Forward—Eating Your Own Dog Food!
While knowledge and skills are critical, nothing happens without leadership—that’s who gets people
engaged, committed, and energized to perform with excellence. Leaders harness the momentum
of the organization to drive change. They provide direction and support. Part of being the learn-
ing leader is working with your team to create a strategy that ensures the learning organization is
driving change and prioritizing the right focus areas. But, how do you go about creating a learning
strategy? What steps do you need to take?
Leadership and organizational momentum are critical to driving change and promoting a
continuous evolution of ideas. They are also imperative to drive strategy development. At Inger-
soll Rand, we believe that if we are not continuously improving and innovating, we are becoming
obsolete. Innovation is one of our core values, and our mission of growth excellence, delivering
operational excellence, and building a winning culture is what has delivered premier performance
for our company. The tool used is a dynamic strategy.
In Playing to Win: How Strategy Really Works, Lafley and Martin (2013) state: “Every day we make choices. Some are small, while others have huge implications. The good thing about having choices, even the hardest ones, is it means that you can take action. You can influence the world—not just today—but into the future. Choosing what to do and what not to do—in conditions that can be ambiguous, heavy with consequences, and changing constantly—is the definition of strategy.”
So, with this backdrop, we determined that we needed to take a fresh look at our enterprise
learning strategy. As the CLO, I set out to apply our own standard of what we as the learning func-
tion were teaching our leaders. We had developed robust strategy, standard work, and tools, as well
as a workshop that trained our product managers to create a strategy for their products. This was
eating our own dog food, as the saying goes. Applying business strategy to creating a learning strat-
egy was very challenging; however, it was also very rewarding. What we learned and the process we
undertook are things I recommend for all learning functions.
In 2018, IRU embarked on an ambitious learning strategy refresh. We looked at the whole
enterprise learning function with the goal of determining key focus areas for the function for the
next three to four years. IRU applied Lafley and Martin’s critical strategic decisions process methodology, called “Where will we play and how will we win?” (Figure 1-1).
The process started with a small group from Ingersoll Rand University conducting a full
current-state assessment by:
• Generating a list of hypotheses to prove or disprove. It was important in this step to be
critical of the learning function and allow the team to state any unspoken statements
(such as this learning function should be completely outsourced). This process allows you
to cast a wide net of questions.

Figure 1-1. The Strategy Choice Cascade

[Figure: a cascade of five linked choices: What is our winning aspiration? Where will we play? How will we win? What capabilities must we have? What management systems do we need?]

Source: Lafley and Martin (2013).

• Gathering all key facts and data about the learning organization (including head count, courses
offered, delivery methods, evaluation results, and learner feedback).
• Scanning the external environment to see what other learning organizations were doing. For
this scan we used an on-demand survey through the ATD Forum to survey other best-in-class
learning organizations. We also conducted private interviews with learning leaders from a
variety of industries who were also members of the ATD Forum to gain an understanding of
their best practices.
• Conducting an internal survey of business leaders to determine how the learning function
was viewed. As members of the Bersin by Deloitte group, we were also able to use their high-
impact learning organization diagnostic.
• Interviewing our senior executives and HR leaders to gain their insights. The 21 questions
asked of all executives as part of the strategy refresh are listed at the end of this chapter for you
to use in your organization. The data were synthesized into key organizational insights based
on themes, patterns, and topics.
• Researching the external environment by reading more than 100 research articles—our
external research data-mine process. We assigned and disseminated seven to 10 articles by
topic to small teams, which then worked together to glean the insights, implications, and
recommended actions from the research. Using small teams allowed everyone in the global
learning function to be a part of the strategy refresh process, while also allowing us to quickly
and efficiently develop and prioritize the trends taking place in the learning marketplace. This
process gave everyone in the talent organization a chance to join the change journey and to
dream and brainstorm on what the future state could be.

Once we completed our current state assessment and external environmental scan, we sched-
uled a learning strategy workshop. We used the same strategy workshop that we delivered to our
product managers. To execute this, we worked closely with our corporate vendor, Strategy Gener-
ation Company, to deliver a two-day strategy off-site that would help us recalibrate our approach.
Another innovative part of the process was inviting two learning experts from an external learning
organization to join the workshop to serve as a fresh set of eyes and ask questions. I cannot empha-
size enough how important it is to engage outsiders in your strategy process. Having objective, inde-
pendent minds weigh in and ask insightful questions that both challenge and validate the direction
is a very powerful addition to the process.
Once completed and reviewed by our team, the enterprise learning strategy was presented to
the IRU Strategy Board, our governing board, so they could understand the current state and the
envisioned future state, as well as our recommended pathway there. This allowed the board to give
input to the strategy as well as approve and buy into the direction. To ensure that all members of the
IRU team were well versed in the new strategy, one of our team members, Jennefer Pierce, created
a summary graphic version, called a placemat, for each of them to display at their workstations
(Figure 1-2). The visual allowed us to keep our strategy uppermost in our minds every day.

Figure 1-2. The IRU Learning Strategy, or Placemat

As you can imagine, this was a major project requiring extensive planning, organizing, directing, and
doing—it was a lot of work! You need a core team to manage the project, but you also need leadership
to champion it and constantly emphasize its priority and value for meeting the organization’s needs to
deliver results. IRU was fortunate to be in a position to dedicate time and resources to an externally facil-
itated strategy workshop, but ultimately it was our team’s execution power that enabled the change to
happen. After finalizing our strategy, we produced a 10-minute video for our internal social collaboration
site Yammer, to serve as a further reference for our journey to the future, a reminder of our guaranteed
success, and most important, as recognition of the sweat equity our team had put into the process.
While creating a strategy using a new and innovative refresh process may seem like a huge,
once-in-a-generation effort, it’s actually an ongoing, iterative process. For every organization, creating a
learning strategy is critical for knowing what you want to aspire to and how to set big-picture goals. As
the organization evolves throughout the learning strategy change initiative, it creates the momentum and
sustained energy for the team’s efforts to make the necessary changes and deliver the big-picture goals.
An added benefit of this work is that the entire team is rowing together and learning new skills. To ensure
excellent delivery of this new reality, the team should pause every two months for a full-day meeting to
revisit its progress, discuss the overall strategy within the learning group, celebrate the progress made, and,
most important, build the team’s internal capability.

Summary
Several conditions must be in place for learning to be aligned with and focused on the organizational
goals and objectives needed to produce the desired business results. First and foremost, TD leaders
must understand the strategic direction from a systems perspective and the details of the business:
• how the business makes money
• the business model employed
• how the company tracks profits
• who the stakeholders and strategic business partners are and what is important to them
• the overall competitive position of the organization (for example, competitive marketplace
advantage).
The learning content must be integrated into the business context (that is, integrated with the
business’s language, goals, and values). This context steeps the training in the organizational reality
and culture. Additionally, learning activities must be grounded in the performance needs of the
individual employees and the roles they play within their work groups. What does it take to enable
groups of employees working together to successfully achieve organizational results? This direction
and alignment enhance the overall system, thus enabling leaders to keep the workforce connected,
working together, and focused on the big picture, as seen in this chapter’s examples. This in turn
generates value and impact, thus creating a competitive advantage in the marketplace.

Strategic alignment of all learning programs, both formal and informal, with the organizational
direction is good for the business. It demonstrates the contributions and value of the learning func-
tion in developing people by increasing organizational capabilities and interoperability among and
within business units. But more important, it leverages the ability of the learning leaders to make an
operational impact on senior-level goals and objectives, which in turn positively affects results and
thus competitive advantage.
Periodically stopping and conducting a total refresh of the learning strategy is critical to steer-
ing the learning ship in a new direction. While we benefited from the assistance we received from
memberships in professional consortia and from consultants, the efforts of the learning team in gathering
and sorting data and engaging in building the future were monumental.

Key Takeaways
• While all organizations are unique, there are some must-haves for future focus and alignment. They
generally start with the result—competitive advantage.
• Incorporating a management philosophy like Lean or Hoshin Kanri can help focus the direction,
determine priorities, align processes, and create a common language.
• A governing board includes critical decision makers and stakeholders who can champion initiatives
and keep projects on track.
• A complete refresh of a learning strategy takes time and effort, but has huge benefits for the
organization and the learning team.

Questions for Reflection and Further Action


1. What structures does your organization have in place to ensure that learning is aligned with the goals
and objectives of the business? How do they influence the work your department does?

2. What is your organization’s North Box (or top-level strategic breakthrough priorities)? How is your
learning team ensuring that the capabilities needed for these priorities are in place today or being
built for tomorrow?

3. What aspects of the strategy refresh could work for your team?

QUESTIONS ASKED DURING EXECUTIVE INTERVIEWS
AS PART OF THE STRATEGY REFRESH

1. Regarding external trends and internal capabilities, what external trends are facing Ingersoll
Rand?
2. What skills and capabilities will be required for employees and managers that we may not have
today but will enable us to successfully execute in the future?
3. What business value or value proposition does IRU bring to Ingersoll Rand today?
4. Looking into the next three to five years, what additional value would you like to see IRU
delivering?
5. Conversely, what (if anything) should we stop doing?
6. If your business or team has a learning need that cannot be met by IRU today, where do you go to
fulfill that need?
7. What would you recommend IRU do to gain a more competitive advantage in the internal and
external marketplace?
8. Regarding leadership development, what is working well and what needs to be improved?
9. What (if anything) should we do differently with our high-potential cohort leadership
development programs?
10. Regarding strategic capabilities, what is working well and what needs to be improved?
11. What should IRU’s top three priorities be for the next three to five years?
12. What are your measures of success for IRU’s products and services?
13. What is your perspective on our leaders being in the classroom and sharing their expertise with
our talent?
14. What amount of time would you want your senior leaders to give to this endeavor?
15. What are the upsides or downsides of a pay-per-seat model?
16. Would you allocate budget dollars and pay-per-seat for your employees to attend IRU programs?
17. As for benchmark data, what might IRU learn from you if you were to reflect on your past
experiences with other high-performing enterprise learning functions?
18. What else would you like to share with us that we did not ask?
19. What effective learning solutions and processes do you see from your strategic business unit
technical/product-training teams?
20. What inefficiencies do you see?
21. What overlap in learning content or audiences compared with IRU’s scope do you see?

2
A Proactive Approach to
Strategic Learning Alignment
Sandi Maxey

Does senior leadership view your learning organization as a critical component of the company’s
overall success, a cost to be avoided, or somewhere in between? In a 2015 study by Human Capital
Media, learning leaders were asked to compare how they view the value of learning to the business
with how they believed their organization’s business line leaders perceived it (HCM Advisory Group
2015). Overall, learning leaders viewed their departments as strategic enablers of achieving business
objectives. Conversely, they believed that their organization’s business line leaders were more likely
to view the learning function either as costly but necessary or as a pure cost center. This may not be
a surprise to learning leaders, but what does it mean?
For learning leaders to achieve strategic alignment with the business, we must understand where
the perceptual disconnect described in the study originates. Business line leaders are judged based on
goal achievement, which, for most businesses, equates to numbers such as production, revenue, fees, or
billable hours. Learning, on the other hand, is often measured by the number of classes, participants,
or course ratings on a smile sheet. What’s not obvious to senior leaders is how these learning numbers
affect business results. In a 2019 study conducted by the Association for Talent Development (ATD),
less than half of respondents (40 percent) believed their learning evaluation efforts helped them meet
their organization’s business goals (ATD 2019a).
As the learning leader, you might consider strategic alignment at two levels:

• At the organizational level, you align the learning strategy with the organization’s overall
business goals, objectives, and strategy.
• At the business line level, you align specific learning initiatives to the goals of the unit.
For example, if the organization’s strategy includes organic growth, the learning function might
develop an enterprise-wide sales training strategy with multiple sales training initiatives for specific busi-
ness lines under that umbrella.
This chapter will present a case study describing how the learning team at one midsize community
bank in the mid-Atlantic region of the United States implemented an intentional strategy for achieving
strategic alignment with the bank’s overall business goals and objectives to gain support for an important,
but somewhat risky, learning initiative. The details of the story are presented using the framework of a
four-step model. For each step, we’ll explore the actions taken, the tools and techniques used, the lessons
learned, and practical suggestions for how you might apply the model in your own organization. Finally,
each section will provide a list of stop and think questions to stimulate your thinking (Schellenger 2015).
It is important to note that the tools, models, and methods used by the featured organization in this case
study are not the only tools available. They were simply chosen by the bank’s learning team based on best
practices and how well they fit the organization’s culture.

The Case and the Conundrum


The bank in our case study has operated for more than 150 years in a large metropolitan area known
for its affluence and strong growth. It is categorized as a large community bank with assets of more
than $8 billion and approximately 950 employees. The learning team is made up of five learning
professionals. The competitive environment for banks in this market area is intense. Readers without
a banking background need to understand two simple concepts to grasp the basics of our story:
• Banking, in general, is now a commoditized industry. Banks have reached product parity; they
all offer the same basic products and services. The large commercial banks compete for market
share on the basis of price and technological innovation. Community banks, on the other
hand, must differentiate themselves with service and expertise; in a word: people.
• Bank profits come primarily from commercial lending, or lending to businesses. Community
banks rely on a cadre of skilled commercial lenders to source new loans and establish deep
banking relationships with businesses in order to be profitable.
We now reach our business conundrum in the story. The number of skilled commercial lenders available
for hire is dwindling. In the 1980s and 1990s, large commercial banks recruited thousands of fresh
college graduates from the best business schools and put them through rigorous credit training programs.
However, these programs were eliminated during times of economic downturn. Fast-forward 20 years and
the banking industry, as a whole, is facing a shortage of skilled talent for its most profitable product. New
commercial lenders have stopped entering the pipeline, and the existing supply is aging out of the system.

The bank’s learning team realized that this was an opportunity to partner with the business to solve an
undeniable problem. Despite the criticality of the situation, they knew it would be challenging to persuade
senior leadership to invest in a learning solution because any recommended remediation would involve
significant cost and risk. Thus, the keys to successfully selling the learning solution would be demonstrating
that the learning team understood the problem and had the bandwidth and expertise to solve it internally.
In addition, they needed to engage with business leaders to develop a solution using the same rigorous
business standards as product development and technology investments. In other words, the strategy for
solving the problem needed to align with the business strategy for achieving revenue goals.

As you assess how your organization’s learning function can become more strategically aligned with
the business, you might want to start at the business line or program level, particularly if you work for
a large organization. To help you focus your search for a specific project to align with business needs,
ask these questions:
• Which job roles or skills are most critical to your business success?
• What are industry forecasts saying about future trends and outlooks?
• What new technologies or innovations are expected to affect your business?
• Where is your industry or organization experiencing the greatest pain points?

The Learning Leader’s Point of View


Before digging into the model and the specifics of strategic alignment, let’s pause to consider how the
bank’s learning team views the learning function’s role in the organization—what we do, whom we do it
for, and most important, why we do it. You might think of this as the mindset for alignment. Simply put,
the team believes the learning function’s sole reason to exist is to support business objectives. This belief
was documented with vision and mission statements to clearly articulate their role relative to the larger
purpose of the organization and their commitment to supporting it through strategic learning. They
adopted the organization’s three-year strategic planning model to draw a clear connection between the
bank’s objectives, the business strategy, the learning strategy, and individual learning initiatives. The team
also conducted a SWOT (strengths, weaknesses, opportunities, threats) analysis to incorporate methods
for leveraging strengths and opportunities and to mitigate weaknesses and threats.

The Model
Models provide frameworks to organize your approach for understanding and solving problems.
In her book Strategic Learning Alignment: Make Training a Powerful Business Partner, Rita Mehegan Smith
(2011) presents a simple four-step model for “understanding what is important to your business leader
and how best to communicate learning processes and outcomes in business terms.” Her SLA model
is shown in Figure 2-1.

Figure 2-1. Strategic Learning Alignment (SLA) Model

Source: Smith (2011).

We will break the model down step-by-step and use it as a framework for exploring how the bank’s
learning team applied the model’s concepts to the learning solution described in the case study.

Step 1. Know the Business


A learning leader must understand the business and be able to speak the language, even though their
required skill set can seem separate from the business. Instructional design, for example, follows the same
basic process whether your company is a manufacturer or a hospital. And we’re all familiar with the
glazed-over expressions our business leaders get when we mention “learning objectives.”
The bank’s learning team makes business acumen a priority—each team member’s annual devel-
opment plan includes activities to enrich the learning practice and deepen business acumen. Team
members read industry periodicals and attend training programs and conferences sponsored by the
state and national banking associations. They invite business line leaders to speak at department
meetings, and embed themselves in the business by visiting branch locations and by job shadowing
in other departments.
To develop a deeper understanding of the bank’s skills gap issue, the learning team researched
industry sources for data and information to quantify the problem and validate their solution. They

interviewed business line leaders, job incumbents, and internal recruiters to learn how the problem
was affecting them and what solutions they recommended. This practice had the added benefit of
identifying leaders and subject matter experts (SMEs) who wanted to be involved in the project.

Actions to Take
How do you, as a learning leader, ensure that your team can speak the language of the business? You
can start with your organization’s vision, mission, and values statement and the corporate objectives.
Many of these internal documents can be found on your company’s intranet or posted on the wall in
the break room. They provide the 30,000-foot view of what your company does, why and how it does
what it does, and whom it does it for. If the company is publicly traded, read a copy of its annual report.
In addition to the financials themselves, the annual report provides a summary of the business strategy
and operations for shareholders.
More important, get close to the business. In some industries, you can be a customer of the business,
which allows you to experience the company from a user point of view. Familiarize yourself with the
products and services offered. Use the technology your company’s clients use to interact with the business
or manage their relationship with it.
If it’s not practical to be a customer, try to get a more intimate view of the internal workings of
the business. The human-centered design (HCD) discipline offers a variety of tools for understanding
the client experience. The learning function’s customers are your company’s management and
workforce, and you can adapt many HCD tools to help understand them and how they do their work.
Innovating for People, a handbook of HCD methods published by the Luma Institute (2012), is a rich
source of tools and techniques.
Your research should help you answer questions such as:
• What is your organization’s value proposition?
• What products or services do you provide?
• Who are your customers?
• What customer segments are most important to your business now and in the future?
• What are the business goals of the specific department or team you are working with?
• How does the work of the department or team contribute to the company’s overall mission?

Step 2. Build the Business Case for Learning


How many times have you heard, “We need training,” from a business line leader experiencing perfor-
mance issues? It’s far too common for L&D to react to the request only to later determine that a learning
solution wasn’t the answer. Throwing a costly learning solution at a problem that needs a different fix
threatens your credibility and can damage relationships with your business line partners. For this reason,
a learning leader must be skilled at front-end needs analysis.

Assuming that you have conducted a thorough front-end analysis of the performance issue, step 2 in
Smith’s model involves justifying why a learning program is the right solution. This is where the learning
leader must show the critical links between the learning solution and the business goals and outcomes.
To do this, they must think like a business leader. Using data to support an investment in resources is a
convincing place to start.
The bank’s learning team created links to the business goals and objectives by cascading their learn-
ing strategy from the business strategy using an adaptation of Bersin’s Talent Management Framework.
Figure 2-2 illustrates the components of the learning strategy. By going through this exercise, they
were able to prioritize their learning initiatives based on how directly each initiative supported business
strategies.

Figure 2-2. Components of the Learning Strategy

Building the business case to develop a commercial lender training program included expense data
associated with sourcing candidates from the outside. Human resources provided statistics on turnover
and average days to fill the role. The team also worked with their partners in the finance department to
obtain hard cost data for fees paid to external recruiters as well as estimates for lost productivity while
open positions went unfilled. When coupled with anecdotal evidence from field interviews, they could
compare the costs of implementing a learning solution with the cost of doing nothing.
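To make that comparison concrete, here is a minimal back-of-envelope sketch, in Python, of the kind of cost math described above. The function names and every dollar figure are hypothetical illustrations, not data from the bank in this case study; you would substitute the turnover, recruiter-fee, and lost-productivity figures your HR and finance partners provide.

```python
# Hypothetical cost comparison: sourcing lenders externally vs. building them internally.
# All numbers below are placeholders for illustration only.

def cost_of_external_hiring(open_positions, recruiter_fee_per_hire,
                            days_to_fill, lost_productivity_per_day):
    """Estimated cost of filling roles from the outside while positions sit vacant."""
    per_position = recruiter_fee_per_hire + days_to_fill * lost_productivity_per_day
    return open_positions * per_position

def cost_of_learning_program(design_and_delivery, trainee_salaries, mentor_time):
    """Estimated cost of developing the skills internally through a training program."""
    return design_and_delivery + trainee_salaries + mentor_time

# Hypothetical inputs for illustration
do_nothing = cost_of_external_hiring(open_positions=6, recruiter_fee_per_hire=30_000,
                                     days_to_fill=120, lost_productivity_per_day=1_500)
build_internally = cost_of_learning_program(design_and_delivery=150_000,
                                            trainee_salaries=400_000, mentor_time=60_000)

print(f"Cost of doing nothing (external hiring): ${do_nothing:,.0f}")
print(f"Cost of internal training program:       ${build_internally:,.0f}")
```

Even a rough calculation like this, paired with the anecdotal evidence from field interviews, lets you frame the proposal in the financial language business leaders expect.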

Actions to Take
Building the business case for learning begins long before any training solutions are developed. As
described, alignment with the business requires a well-thought-out learning strategy that cascades from
business objectives. You might consider using Rich Horwath’s Goals, Objectives, Strategies, Tactics (GOST)
model to frame your learning strategy. This model is designed to help managers bring clarity to the plan-
ning process by differentiating among each of those planning terms.
Use these stop and think questions to help prepare your mindset for building a business case for your
learning initiative:

20 | Chapter 2
• Have we clearly articulated our learning strategy with direct connections to business objectives,
goals, and tactics?
• How does our learning strategy contribute to achieving business objectives?
• Which learning initiatives have the highest priority? Why?
• How are we allocating our resources toward our highest-priority initiatives?
• Are we fully leveraging front-end analysis to properly diagnose training-related performance
issues?
• What business problems are we helping to solve?
• How might we obtain data to support our proposal?

Step 3. Engage Leaders in Key Learning Activities


To gain alignment with the business strategy, business unit leaders must be involved in learning
design and development. Your design team can tailor the solution to meet the needs of the partici-
pants and ensure maximum relevance and effectiveness. Involving business leaders provides valu-
able sponsorship for the project, which is essential for communicating the program’s importance.
Because of the complexity of the commercial banker training program, the bank’s learning
team started project planning by creating a framework to help them organize each component they
needed to include. This framework was anchored by a governance structure made up of key stake-
holders who had a vested interest in the project’s success. The governing interests included executive
sponsors, a project owner, a steering committee, a curriculum review team, and mentors. A core
team, made up of the instructional designer and business line SMEs, was added to play a key role in
designing the career progression for the trainees. They also completed a stakeholder analysis map
to identify every leader or team that would have some interest in or influence on the project. This
served as the basis of the project communication plan.
The key pillars of the framework were candidate selection, curriculum development, partici-
pant engagement, and program evaluation. Each pillar outlined strategies for maximizing the final
component of the program: candidate retention.

Actions to Take
Partnering with the business line to design, develop, and deliver key learning activities is a major
success factor for driving alignment. Business line leaders can be engaged in sponsoring, communi-
cating, and, ideally, leading learning initiatives. They can also provide invaluable insights into the
nuances of the business unit’s culture, business model, clients, and so forth. This enables the learn-
ing design team to ensure the context of the learning solution resonates with the target audience.
To maximize alignment by engaging key leaders in learning activities, use these questions to
guide your strategies:

• Which leaders have the most to gain from this initiative?
• Who are the influencers associated with this project?
• Who are the other stakeholders in this project? What is their level of interest or influence?
• How do we need to frame this project to meet the learning objectives?
• What business realities must we design around, such as geography, time zones, and staffing?

Step 4. Communicate Your Business Results


Strategic alignment results from helping your organization achieve business results through learn-
ing. Many business leaders see value in learning. However, unlike learning leaders, who believe
implicitly in the value of learning to the business, not all business leaders are able to make the
connection. It’s not enough to report learning metrics, such as evaluation scores, on your company’s
intranet. To be viewed as a true business partner, you must have a communication plan that ensures
business line leaders are aware of the value your learning function contributes to the organization.
In step 3, you used a stakeholder analysis map to identify all interests in the project and align
a specific learning project to the business needs. You can also extrapolate this concept and apply it
to your learning function as a whole—the map helps identify individuals or groups with influence
on and interest in the learning function. It is important to enroll these entities early in the process
and regularly communicate results.
Your communication plan should match the frequency and delivery method of communication
to the inf luence and interest level of the stakeholder groups. For example, the bank’s executives
had a strategic interest in the project, so they received an annual high-level update. Business line
leaders, on the other hand, had a more vested interest in the project’s success, so they received more
frequent and detailed updates to assist them with budget and staffing plans.
Measuring the business impact of learning is one of the most challenging aspects of leading the
learning function. As mentioned earlier, research conducted by ATD (2019a) indicates that less than
20 percent of organizations measure the business impact of learning to a high extent. Why is this
number so low? Many factors contribute to successful business outcomes—product differentiation,
effective pricing and marketing, and geographic advantages, to name a few. According to the same
study, the top barrier to measurement is the difficulty of isolating the effects of learning from other
factors, followed by a lack of tools for measurement and the inability to extract the right data from
the learning management system.
Even with the right tools and data, measuring the business impact of learning is difficult, time
intensive, and expensive. The bank’s learning team had to learn to be more intentional in how
they focused their measurement efforts. In terms of establishing strategic alignment, they selected a
method and level of measurement on a project-by-project basis. The team tried to match the level of
evaluation to the project objectives and complexity. If training was necessary, for example, to meet
compliance requirements, Kirkpatrick’s Level 1 (Reaction) or 2 (Learning) evaluation may be suffi-
cient. However, if the project was high stakes, high profile, or mission critical, they might employ
more rigorous measurement standards. They also found that by involving business leaders in the
learning design, they could capture their definition of learning success and focus on delivering the
return on expectations the business leaders had for the program.
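As a loose illustration of that project-by-project matching, here is a minimal sketch in Python. The decision rule, function name, and labels are assumptions made for illustration, not the bank’s documented criteria; it simply encodes the idea that routine compliance training may only need reaction and learning checks, while high-stakes programs warrant behavior- and results-level evaluation.

```python
# Hypothetical decision rule for choosing evaluation depth on a given project.

def planned_evaluation_levels(high_stakes: bool) -> list:
    """Return the Kirkpatrick levels to plan for, based on the stakes of the project."""
    levels = ["Level 1: Reaction", "Level 2: Learning"]  # e.g., compliance-driven training
    if high_stakes:
        # High-stakes, high-profile, or mission-critical work warrants more rigor.
        levels += ["Level 3: Behavior", "Level 4: Results"]
    return levels

print(planned_evaluation_levels(high_stakes=False))
print(planned_evaluation_levels(high_stakes=True))
```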
Measuring the impact of the commercial lender training program will take several years. In the
program design, the team engaged job role incumbents and their leadership to identify key learning
objectives and evaluation criteria for each phase of the program. Early in the program, much of the
evaluation was at the learning level. As the participants transitioned from learning to performing,
the evaluation level will move to behavior and results.

Actions to Take
Communicating learning’s impact on business results is the ultimate outcome of strategic align-
ment. The skilled learning leader treats communication as an intentional tactic for achieving stra-
tegic alignment, and weaves it throughout learning design. It can’t be left as an afterthought or you
risk misidentifying the most relevant metrics for measuring success.
Your strategic communication plan might include answers to these questions:
• Who has influence on and interest in advancing the learning function?
• What level of communication do our stakeholder groups need?
• What communication channels should we use?
• How can we demonstrate learning’s impact on the business?
• How can we measure learning impact more intentionally?

One More Tool to Consider


Another tool Smith recommends for understanding how the business unit creates value for the
organization is the business model canvas (Figure 2-3). This is a one-page template for summarizing
nine essential business drivers: key partners, key activities, key resources, value provided, custom-
ers, customer relationships, channels, costs, and revenue. Learning leaders may find it to be a useful
and easy tool for creating links between the business and learning services.

Figure 2-3. The Business Model Canvas

An Organization’s Business Model (Business Model Canvas)

• Key Partners: the network that enables the business model to be effective
• Key Activities: making products (including services), problem solving, sales (including promotions), admin work
• Key Resources: people, physical (e.g., land and buildings), intellectual (e.g., brands and copyrights)
• Value Provided: convenience, price, design, cost reduction, risk reduction
• Customer Relationships: personal, automated, self-service
• Customers: the reason to exist
• Channels: enable awareness, purchase, support
• Costs: related to acquiring resources, performing key activities, and working with partners
• Revenue: sales, leases, rents, usage fees, subscriptions
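If it helps to work with the canvas in a structured way, here is a minimal sketch, in Python, of how a learning team might capture a business unit’s nine canvas blocks for later reference when mapping learning services to business drivers. The class name and the sample entries for a commercial lending unit are hypothetical illustrations drawn loosely from this chapter’s case, not part of Smith’s model or the bank’s actual canvas.

```python
# Hypothetical template: the nine business model canvas blocks as a simple data structure.
from dataclasses import dataclass, field
from typing import List

@dataclass
class BusinessModelCanvas:
    key_partners: List[str] = field(default_factory=list)
    key_activities: List[str] = field(default_factory=list)
    key_resources: List[str] = field(default_factory=list)
    value_provided: List[str] = field(default_factory=list)
    customer_relationships: List[str] = field(default_factory=list)
    customers: List[str] = field(default_factory=list)
    channels: List[str] = field(default_factory=list)
    costs: List[str] = field(default_factory=list)
    revenue: List[str] = field(default_factory=list)

# Illustrative example only: a commercial lending unit, sketched from the case study.
commercial_lending = BusinessModelCanvas(
    key_partners=["credit risk team", "external recruiters"],
    key_activities=["sourcing new loans", "building banking relationships"],
    key_resources=["skilled commercial lenders"],
    value_provided=["service", "expertise"],
    customer_relationships=["personal"],
    customers=["local businesses"],
    channels=["branch network", "relationship managers"],
    costs=["lender compensation", "credit training"],
    revenue=["interest income", "fees"],
)

# Ask where learning services can strengthen a key resource or activity.
print(commercial_lending.key_resources)
```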

Summary
Today’s learning leaders have a lot on their plate. In the vast majority of organizations, our role is
expanding to include other aspects of talent management, such as leadership development, perfor-
mance management, change management, and succession planning (ATD 2017a). As our duties
expand, so does our influence and our ability to affect business results. By undergoing an intentional
process for building strategic alignment, we can prioritize our learning initiatives to support the orga-
nization’s most important objectives.
In this chapter, you were introduced to four steps for achieving strategic learning alignment: know-
ing the business, building the business case for learning, engaging leaders in key learning activities, and
communicating business results. You can apply these steps at the organizational, department, or project
level, depending on your organization’s structure, size, or culture. This gives you the flexibility of deter-
mining the most appropriate opportunity for demonstrating the ability of your learning function to be
a strategic enabler, not merely a cost center.
The first two steps of building alignment—knowing the business and building the business
case for learning—require you to understand the organization’s purpose and value proposition and
how it measures results. You must be able to express, in business terms, how your learning solution
equates to improved business results. Would you buy a car from someone who couldn’t answer
basic questions about gas mileage and safety? Of course you wouldn’t. So why would a business
manager be convinced to engage your learning solution if you can’t demonstrate an understanding
of their business?
Your critical source for designing and developing the best-fit learning solution is the business
leaders themselves. They can fill multiple roles and serve many purposes. As part of the governance
structure, business leaders lend vital legitimacy to and endorsement of the project or function as a
whole. Their support sends a strong message that learning is critical to the organization’s success.
Business leaders can be more directly involved in learning by participating as course facilitators and
providing access to SMEs. Finally, business leaders who are vested in the learning solution will hold
their business units accountable for applying their new skills on the job.
The final step for building learning alignment is to communicate the value that learning contrib-
utes to the organization. One of the rules of sales management is “No anonymous giving.” There is no
shame in making sure the influencers in your organization know how your learning function influences
business results. In fact, it is imperative if you want to earn valuable currency in the competition for
scarce resources.

Key Takeaways
• There is a gap between how learning and business leaders view the value of learning to the
business. Learning leaders are significantly more likely to see learning as a strategic enabler
rather than a cost center.
• To close the gap, learning leaders should think and manage their units like business managers.
In an aligned organization, the main filter for sorting and prioritizing learning initiatives is
business impact.
• Learning leaders should seek to achieve alignment at as many organizational levels as
practical, given the size and complexity of the organization. This requires a structured
approach and intimate knowledge of the business.
• Involving business leaders in learning and communicating business results are two highly
effective ways to ensure that learning initiatives have the support of senior management and
are designed for business impact.

Questions for Reflection and Further Action
1. How might you use these tools in your organization?

2. What adaptations do you need to make for these tools to fit your culture and meet your
alignment needs?

3. What opportunities are there for operationalizing strategic learning alignment in your
learning organization?

4. How might you leverage your networks to learn “best practices” for learning alignment?

5. What internal support or sponsorship for learning alignment is available?

3
Strategy Revealed:
A Personal Experience Journey
Teri Lowe

“Let’s make a decision and get on with it.”


Yep. Those were some of my thoughts in response to being asked to take a step back and develop a
learning strategy. Of course, what I actually said was more like, “That’s a good idea. Let’s take some time
and give ourselves the luxury of developing a learning strategy.” Meanwhile, my thoughts continued along
the lines of: “Strategy. A luxury. Not a necessity. Who has time away from the real work for strategy?”
I had asked what I thought were very simple questions of my director: “What do you think we should
do about the supervisor learning path? Should we take the same approach as when we created the curric-
ulum two years ago and align content to our new leadership capabilities? Or is an offering specific to
supervisors still needed? What if we just align content to the capabilities, not to a specific leadership level,
and users can start where they think they need to begin?”
I wanted her to weigh in on what the next iteration of that packaged offering should look like, given
the recent work of team members who had completed an inventory of our leadership content and aligned
it to our new leadership capabilities. Now it was time to do something with the results of that work effort
and get the existing offerings updated and on their way to users.
But I was confused when I could not get an answer from her about this year’s approach.
As a midlevel training manager within our organization, I had learned through observation and
experience what I believed my role to be. My job was to juggle resources and projects to meet the requests
of senior stakeholders. We had been asked to create a learning path for supervisors after years of internal
discussions and consultations with vendors. And before that program was even rolled out, we were asked
to develop one for middle managers and then division managers. Clearly, our senior leaders were inter-
ested in developing leaders within our organization—at all levels. And so, I knew through observation and
experience that it was important to obtain my director’s buy-in for this year’s offerings.
But wait. My director this year wasn’t the same director as last year. And the director I’d had last year
wasn’t the same as the year before. And all of their bosses had changed as well. As often happens, senior
staff were rotating quickly from one position to another. And they were all requesting specific learning
programs based on what they believed to be the needs of the organization. In fact, program design and
deployment was often dictated by the requesters and not by the learners or the learning professionals
developing the training.
Looking back, I should have been giddy that our director was asking for a learning strategy. But
instead, I was confused. I recall thinking and asking, “Don’t we already have a strategy?” You know, “Iden-
tify high potentials, develop leaders, transition leaders, become a learning organization.” That’s what we’d
come up with two years ago, the last time the “s” word was mentioned. But I was informed that a learning
strategy was different from a leadership or leader development strategy, which was again different from a
talent strategy. Now, for the first time in my career, I was being asked for a learning strategy.
And I was even more confused. We’d had formal training departments within our organization for
decades upon decades, so didn’t we already have a theory of learning driving that work? Maybe all I
needed to do was search the archives for where that theory was kept and share the information with my
director. Besides, I was relatively new to my position. I could say to her, “Look! We have a learning strategy.
It’s already done! All you need to do is read this document from 10 years ago.”
But to be honest, this wasn’t the first time I was being asked to engage in strategy. In the last few years,
we’d had projects under way related to our content strategy, our measurement and evaluation strategy,
our onboarding strategy, our governance strategy, and most recently, our leadership development strategy.
That’s why I was able to rattle off the four strategic imperatives related to that most recent work. However,
this was the first time anyone had asked about a learning strategy.
How many other learning department leaders have suddenly been asked for or about strategy? And
what would I do when I learned my director was not going to answer simple questions about how to
package the next iteration of a supervisor learning program, but instead wanted a strategy? You’re about
to find out.

Looking Around: What Do Those Closest to Me Know?


Instead of first looking inward, I naturally did the easiest thing and looked for an answer from those
around me. Did my peers and team members know something about our learning strategy that I had
somehow missed? I informally polled them, asking:

• What is our company’s vision for learning?
• What is our department’s learning strategy?
• What is your strategy for learning?
• What is the last thing you learned? How did you go about that learning?
The responses made me think I hadn’t missed anything after all. “Not sure” showed up as a response
a number of times for the first two questions. But so did other responses.
When asked, “What is our company’s vision for learning?” my colleagues also replied:
• “I think our company has the desire to become a learning organization, but we don’t have a
plan that I’m aware of on how to get there. For now—maybe it’s to deliver the right training at
the right time as quickly as possible so people can safely perform their jobs.”
• “Self-directed and a mindset that learning is an investment versus a cost to cut.”
• “Although I don’t remember seeing anything articulated, our company develops its employees
with the goal of excelling at their current positions and being prepared to take the next step in
their careers.”
• “Our vision for learning is one in which there are individuals independently learning and
seeking out experience and exposure opportunities to continuously develop.”
• “I believe the company’s vision for learning is to continue to transform and strengthen our
leadership position, from within, while maintaining our core values and enterprise strategy.
We continue to invest heavily in learning in job-specific workshops, instructor-led training, and
web-based training and courses.”
• “The company strives to provide our employees a world-class experience, where learning serves
to advance those that want to grow through development.”
When I more closely investigated the work we do and asked, “What is our department’s learning
strategy?” I got the following responses, in addition to the “not sures”:
• “Our strategy is to create e-learning content and job aids to help promote our learning vision.”
• “Our learning strategy is to create meaningful, worthwhile training.”
• “Find leaders, develop leaders, transition leaders.”
• “We curate when possible and create learning programs (online and instructor-led training)
focused on leadership levels: part-time supervisors, full-time supervisors, middle managers, etc.”
• “We do what we are told to do in response to needs identified in the operations.”
• “We talk about more microlearning (videos, shorter modules, etc.), but are we building those?”
• “Our department’s learning strategy seeks to identify and develop the talent we need by leading
through change using the leadership framework.”
• “We are striving to set ourselves up as a learning organization . . . to be the ambassadors of
learning and guide our leadership toward development content/activities they need to be
successful (to sharpen and grow) in current and future roles.”

The question, “What is your strategy for learning?” generated a couple of “I don’t have a strategy”
replies and the following:
• “Continuous, purposeful learning.”
• “When I see a need to develop my skills, then I seek out additional information.”
• “Read, write, watch, do, sleep, repeat.”
• “Mostly I seek out independent learning offerings outside our company to learn. Sometimes I
use our vendor content as well.”
• “My strategy for learning involves identifying strengths and development opportunities
according to our leadership framework as well as identifying a mentor to job shadow and learn
more about related departments.”
• “I value visual stimulus and the experience of application, then time to reflect and evaluate if
I truly learned, knowing that repetition and meaningful application improves my retention. I
have also come to heavily rely on just-in-time learning for real-time application and discovery.
Researching a topic, reading, and watching a video, and then performing tasks as I pause and
replay a how-to instruction.”
The last questions, “What is the last thing you learned? How did you go about that learning?” yielded
these insights:
• “How to do pivot tables (it’s been a while); I went on YouTube.”
• “I am constantly finding information online that I either seek out intentionally or may come
across unintentionally. If the information piques my interest or a current need, then I will
pursue further. It may be in the form of online training modules, or it may be a blog, a
YouTube video, or even research articles. I also like to learn from others. . . . There is so much
information at our fingertips, it would be an injustice to focus on one way or one method of
learning.”
• “I’ve recently been learning more about photography, editing tips, and techniques on mobile
devices, and recently found a learning path on graphic design on Lynda.com that I am
interested in completing. I’ve also been doing more personal development on subjects such as
values, emotional intelligence, Myers Briggs types, along with the time management styles that
a co-worker shared with us.”
• “I needed to know how to set my homepage in Edge. I googled the “how-to” question, read
through the first instructions that displayed in search results, then followed along in the Edge
settings. I had to repeat the process twice, because the first time my new homepage didn’t save.
I had missed the ‘save’ icon; that info was not in the instructions.”
• “The last thing that I have learned is from using Storyline. I learned by doing, and learned
from the help online.”
• “Learned some French by reading a teach yourself French novel.”

• “I recently changed the clutch plates and springs on a 2005 Honda CRF150 dirt bike. I have
very little motocross experience, had no owners manual, nor am I a mechanic. I didn’t even
own the right tools to complete the job. . . . Starting with a google search I was able to pull
up a free schematic diagram. . . . I then did a YouTube search and watched a video of the
entire procedure. What took them 20 minutes to do in the video took me a week and a half,
but after watching the video multiple times and buying a few additional parts, I successfully
completed the job.”
Responses to these four questions revealed that we weren’t clear on a company or department-wide
learning strategy, but we knew quite well what works for us. As learners, we find ourselves seeking out
information when needed or desired, often applying or having the intent to apply soon after the learning
event, and turning to Google, YouTube, or another location as a ready source for information.

Looking Within: What Do I Know at This Point?


Given my own and my colleagues’ responses to the informal learning strategy questions, I was prompted
to look inward.
As I mentioned, I thought we already had a strategy when first asked to take a step back and consider
our learning strategy. My initial response—“our strategy is to identify our high-potential employees;
develop our high-potential employees; transition leaders; and become a learning organization”—was met
by a resounding no from my director: “That’s not our strategy.”
Yet, I knew that a number of those strategic imperatives were driving work that had taken place
in our department over the last few years. A high-potential assessment had been rolled out when we
transitioned to a new career management, performance management, and learning management
platform. And funds were approved to have an outside organization validate our internal assessment.
My team had developed and implemented three new programs with multiple offerings intended to
help develop part-time and full-time supervisors’ core corporate values and leadership competencies.
And soon after launching those programs, we had developed and launched a transition program for
new midlevel managers. At the same time, we were continuing to meet the needs of leaders in our
operations with simulations and resources for addressing conflict. Likewise, in the realm of executive
development, a senior leadership development program had been developed, piloted, and imple-
mented, and funding secured for the next iteration. Clearly, we were implementing what I believed to
be our leadership development strategy by identifying, developing, and transitioning our employees
in leadership positions.
As for the imperative to “become a learning organization,” I wasn’t privy to the work being done by
others within the organization on that initiative. However, the topic came up frequently during my imme-
diate team’s meetings, and we often shared information and articles with one another. Every now and then
I would catch a glimpse of a slide that captured the leadership development imperatives I was familiar
with, but then moved beyond them to incorporate some of the components of what we now spoke of as
our “new” leadership framework.
But now, I realized that when I thought about “learning” and “learning strategy,” I thought about
learning that is self-directed and applied at the moment of need. For example, within the last couple
months at work I had wanted to know about the difference between coaching and mentoring. So, I went
into our learning management system and searched our training content using those keywords. I easily
located and took the e-learning modules on both topics. In fact, I not only accessed the training in search
of a definition; I completed the training—as evidenced by my “official” transcript and “permanent
record.” No one assigned the training to me, nor had I placed either topic on a career development or
performance plan. In fact, when it came time to apply my newly acquired knowledge, I used the infor-
mation in an email. To whom, I don’t recall. Why, I don’t recall. But at the time, I felt the need to educate
myself about how those two terms were used and differentiated. Yet can I tell you, right this very moment,
what I learned or the difference between those two words? No.
So, did learning take place? Or was I just able to access information in the moment of need? How
important is it that I be able to recall, from memory, how those two e-learning modules defined those
terms? Why did I choose that source of information instead of turning to the dictionary, Wikipedia,
Google, YouTube, or the books and videos within our LMS? What about the person next to me? And
what biases, conscious or unconscious, did my choices reveal?
I didn’t realize it at the time, but in hindsight I must have valued the vendor-provided e-learning
resources within our LMS over my other options, because I invested time into completing the modules—
tests and all. And some “good student” part of me must have decided that getting credit was worth the
time that I invested in learning. An embedded assumption must have also been that I could trust the
information in the modules and that the sources were credible. Because I didn’t look elsewhere to validate
the information.
Looks like a couple of personal biases were revealed here: I want to seek out credible sources, but I
don’t want to spend any more time than necessary doing so. I’m willing to invest my time once a credible
source is located, but I want credit for doing so.
Would other learners have taken the same steps and arrived at the same resources and answers
I did if they wanted or needed to know the difference between coaching and mentoring? Probably
not. Does that make my process in finding answers better or worse than the process someone else
might use? No. Thus, I’ve just uncovered another personal bias about learning: There is no one right
way to learn.
But wait, if I can’t recall the information I sought out months ago, did I even learn anything? And
does the “stickiness” of learning even matter? As a learner, should I be concerned that a tidbit of infor-
mation stayed in my short-term memory only long enough for me to incorporate it into an email, not long
enough to truly apply it, and that the information never made it to my long-term memory? Although,
since taking those modules in the LMS, I can recall several workplace conversations about coaching and
mentoring in which I shared my personal experiences and observations on both topics. Hmmm. Did
those e-learning modules inform any of those thoughts? Or, was what I shared based on my decades of
experience? Again, another personal bias revealed: Experience likely takes precedence over information.
I’m still a bit bothered, though, by the question I asked and didn’t answer earlier: “If I can’t recall the
information I sought out months ago, did I learn anything?”
Like many of us, I have a passion for learning. But if I can’t recall what was in an e-learning
module from two months ago, what happens when I can no longer pass the tests I took in college if I
had to retake them today? Do I have to turn in the degrees? Besides, when is learning demonstrated
through recall essential?
In trying to answer that last question, I think about times when my health and safety are at stake. For
example, it’s important that I know how to lift and lower a heavy package without injuring my back. It’s
important that I remember how to turn a piece of industrial equipment on and off. It’s important that
I recall where to go and what to do if I get chemicals in my eyes. Bias revealed: It’s important that I’ve
learned and can demonstrate behaviors that keep me safe.
Of course, it makes my life easier when I can remember what year a leadership training program
deployed or the name of a vendor we worked with, but I have that information written down—some-
where. And, while I might save myself and others some time if I can remember the details of a simu-
lation within a training offering, that type of recall or demonstration of learning isn’t essential. What is
important is knowing where to look. Suddenly, or not so suddenly, knowledge management has revealed
its importance. And once again, another personal bias about learning has been revealed: I’m OK with
not being able to remember everything, as long as I know and can recall where to look for the needed
information.
In thinking through my search for the difference between coaching and mentoring, I’ve defined learn-
ing—at least for me—as “learning that is self-directed and applied at the moment of need,” and I’ve
revealed the following personal biases:
• I want to seek out credible sources, but I don’t want to spend any more time than necessary
doing so.
• I’m willing to invest my time in learning once a credible source is located. And, I want credit
for doing so, if credit is available.
• There is no one right way to learn. My approach to learning may be different from yours. And
I’m OK with that.
• Experience takes precedence over information.
• It’s important that I’ve learned and can demonstrate behaviors that keep me safe.
• I’m OK with not being able to remember everything, as long as I know and recall where to
look for the needed information.

When I reflected further on my personal learning experiences, biases, and strategy, I arrived at the
following: Perhaps our choices and actions reveal our strategy, regardless of whether that strategy is artic-
ulated or documented.

Looking at Our Practices: What Have We Been Doing?


In considering the approach our department has taken to the design, development, and deployment of
learning offerings over the past few years, it’s possible to determine some organizational—or at least
departmental—biases, approaches, and strategies.
For example, when I first joined our corporate leadership and talent development group several years
ago, one of the first projects I worked on was aligning content in our LMS to subject areas so users could
easily browse for training based on a given topic. Yes, our LMS had a subject area feature, but it displayed
a hodgepodge of offerings that were often unrelated and had no organizational structure. Identifying the
big buckets of content and then the subtopics within those buckets was the first task. Identifying and align-
ing content followed. Strategy revealed: Make similar content easy for the user to locate.
When we moved to a new LMS a couple years later, it had a feature where users could publicly rate
content by assigning up to five stars and leaving a comment, if they desired. Then, when users searched for
content, they could easily see those star ratings and comments, the number of users who’d provided a
rating, and even who had weighed in on the materials. Even though we’d offered elective Level 1 reaction
evaluations for years, those results were not publicly available. The LMS comments and ratings were.
Another strategy revealed: Provide users with the opportunity to publicly evaluate content
offerings. Worried about content curation? Enable users to help curate for you!
During this same period, we were working on projects related to our content strategy, our measure-
ment and evaluation strategy, onboarding, and our governance strategy. Many of those work efforts and
deliverables continue to inform what we do today. However, enterprise-wide strategic initiatives, such as
onboarding and governance, have been difficult to implement or sustain given their far reach and scope.
Governing the use of the LMS and providing guidelines for e-learning design, development, deployment,
and measurement is doable, at least in our immediate realm. Strategy revealed: Start controlling what is
in your realm of control.
Upon closer inspection, I can also see how earlier work on our content strategy and measurement
and evaluation strategy informs our processes and deliverables today. Thus, the following preferences that
relate to a learning strategy are revealed:
• a preference for shorter training bursts, such as videos, rather than the longer e-learning
modules we’ve all been used to
• a preference for shorter e-learning modules; less than 30 minutes in duration when possible
• a preference for a variety of blended learning resources packaged together in
curriculums or libraries

• a preference for simulations and role plays in both e-learning and instructor-led offerings
• the use of standardized Level 1 reaction evaluations and Level 2 pre- and post-tests to inform
content updates or future offerings
• a 70-20-10 approach to learning, with experience accounting roughly for 70 percent of what’s
learned, exposure accounting for 20 percent, and formal training offerings accounting for only
10 percent
• curating content whenever possible rather than creating it.
Looking at and reflecting on what we had been doing over the last few years revealed what looked
to be a learning strategy, but where did these practices originate and did any of them align with industry
best practices? It was time to delve deeper and see what others outside our organization had to say about
the topic.

Looking to the Industry: What Practices Exist Outside Our Organization?


First up was identifying the roots of our current practices.
The preference for just-in-time and self-directed learning was enabled by the advent of e-learning
delivery via an LMS in the early 2000s. A user-centered design approach clearly informed the design
of our programs and the layout of user interface features, such as browsing by subject area, that
we were able to control. The ability to grant credit and track activity appealed to users and content
administrators alike.
Our evaluation practices are historically rooted in the work of Donald Kirkpatrick and informed by
the experience of team members taking part in the return on investment (ROI) certification workshops
offered by Jack and Patti Phillips of the ROI Institute.
The desire for blended learning programs and the use of simulations and role plays in all delivery
modalities certainly tie into our century-plus history of skills and leadership training, both of which have
a focus on hands-on and applied learning.
Yet in more recent years, the preference for shorter training bursts tracks back to the concept of
microlearning. It was also a response to Josh Bersin’s “Meet the Modern Learner” infographic, which
confirmed for the industry what we already knew from personal experience: “Today’s employees are over-
whelmed, distracted, and impatient.”
Our 70-20-10 approach is grounded in the popular and controversial learning model that experience
accounts for 70 percent of learning, exposure and learning from others accounts for 20 percent, and only
10 percent comes from formal learning events such as a course or workshop. We don’t hold to any hard
rules about applying those percentages to our learning programs, but in recent years we have been build-
ing guided experience and exposure activities into our programs.
Curating content when possible is made feasible through our vendor arrangements and makes good
business sense in terms of resource allocations.

Bottom line, our department had a learning strategy driving our deliverables, but we had not yet
consciously revealed that strategy.

Looking to the Experts: What Do They Have to Say?


Next up was learning I was not alone in trying to answer the strategy question.
I found great solace in Rich Horwath’s “The Strategic Thinking Manifesto” (2015), when I
read that a survey conducted by Roger Martin of the Rotman School of Management revealed that
67 percent of managers believe their organization is bad at developing strategy and that 43 percent
of managers can’t articulate their strategy. Likewise, he cites a study from the Harvard Business
School, which reported that in large organizations, such as my own, 95 percent of employees are
either “unaware of or don’t understand their company strategies.” And, to make matters worse, I
learned that 90 percent of directors and vice presidents report receiving no training on business
strategy. Lack of training on how to think strategically was clearly part of my skills gap, and for
others as well. I was not surprised, however, to learn that strategy is often confused with concepts
such as mission, vision, goals, and objectives. Fortunately, Horwath provides a framework one can
use to identify, articulate, and differentiate goals, objectives, strategy, and tasks. This means other
strategic-thinking tools are likely available as well.
Richard Rumelt (2011) has even more to offer on the topic in his book Good Strategy, Bad Strategy. In
the introduction, he writes, “A leader’s most important responsibility is identifying the biggest challenges
to forward progress and devising a coherent approach to overcoming them. . . . Strategy matters.” He
acknowledges that leaders struggle with strategy, and if they say they have a strategy, it is often a bad one.
He then goes on to identify the three elements of a good strategy: “a diagnosis, a guiding policy, and
coherent action” working together to provide an approach for overcoming a challenge. Clearly, I have a
lot more to learn from those outside my training bubble.
And back to the personal, I also learned about a process for experience mapping my own strategy
journey, thanks to a 2017 article by MJ Hall. According to Hall, the process is one of the human-cen-
tered design methods within the LUMA System of Innovation framework. LUMA stands for “looking,
understanding, and making,” which are the key design skills behind innovative thinking and problem
solving. The process itself involves using an experience diagram to map interactions with people, places,
and things along a time continuum of beginning, middle, and end. In the article, Hall quotes American
educator John Dewey: “We do not learn from experience. . . . We learn from reflecting on experience.”
My strategy journey had clearly begun. Documenting my experiences and reflecting upon the journey
promised even more revelations.
The fact that others outside my organization were asking and attempting to answer the strategy ques-
tion, despite the difficulties, gave me hope.
Which brings us to the next step in the process for further revealing our learning strategy.

Looking Back and to the Future: Why Strategy? Why Now?
I needed to step back, once again, and consider the context of why I was being asked for a learning strat-
egy, and why now. Within our organization, we’d been hearing from our CEO and other senior leaders
about the importance of transformation: organizational transformation, human capital transformation,
and other business-unit and function-specific transformations. And the concept of continuous transforma-
tion had also entered our worldview. Enter the need for our department to have a learning strategy that
would help employees respond to and thrive during times of ongoing transformation.
The previous work that went into our content strategy, our measurement and evaluation strategy,
and our leadership development strategy will certainly inform and tie into what will become our learning
strategy. And thinking beyond our immediate training group, our department’s strategies will support our
human capital and enterprise strategies.
And so, several members of my team formalized the learning strategy question into a learning strat-
egy project, and we have embarked on a journey to reveal more.
What more can we learn about the external business environment and the organizational context
that leads to a request for a learning strategy? What changes are taking place in our workforce, and what
needs are surfacing as a result? What more can we learn from others, both within and outside our orga-
nization? And back home within our talent management world, what do we want our future to look like?
These are the questions that drive our work today. These questions also force me to return to my initial
thought—making time to develop a learning strategy is a luxury, not a necessity. I now know I was wrong.
Being able to articulate our learning strategy is necessary for us to determine how to move forward and
shape our future.
One more strategy revealed: I can learn from failure.

Summary
All this learning process work and insight still brings me back to the initial question, which we have yet
to answer: “What do you think we should do about the supervisor learning path?” And the even larger
question, “What’s next for learning design, development, deployment, consumption, and measurement
at our organization?”
The answers to those questions remain to be seen. But we are working on several opportunities that
revealing our learning strategy will help inform.
First up are the ongoing requests for instructor-led training (ILT), coupled with the organization’s
need for cost reduction. Our current ILTs were developed more than a decade ago after the former
resource-intensive programs were dismantled. The decade-old “new” offerings have been minimally
updated since then, and that was only when content was retired or topic labels changed. In addition, the
live virtual classroom sessions developed in conjunction with a vendor a decade ago have not been offi-
cially offered since the new LMS was launched two years ago. The vendor-coded modules no longer func-

tion in the LMS as intended, and we discovered that internally we could not update the vendor coding. In
addition, the technology behind the virtual classroom had changed, and facilitators were unable to justify
continued virtual offerings when participants who had signed up for the classes did not show up, showed
up unprepared, or did not return for multiday sessions. What had once been a well-rounded catalog of live
and virtual ILT offerings dwindled to the essential few and the occasional ad hoc offering in an attempt to
address an immediate need. The need for instructor-led training remains.
At the same time, changes in the workforce due to retirement, retention, and new hires have led
to the need to support employees at all levels as they upskill, reskill, and transition into new or differ-
ent roles and roles of increasing responsibility. The need for transition and performance support is
greater than ever.
Given our team’s current thinking about learning within our organization, we also believe we have
a need for cohort-based social learning, as well as venues for easily accessible user-generated content.
However, the learners within our organization are not asking for more learning opportunities. What
message does that send about our learning strategy? In a recent focus group with newly promoted middle
managers, their message was clear: What they didn’t want was more e-learning.
What we do know at this point along the strategy journey is that we are not alone in the quest.
What began as a simple question about what the next iteration of a learning program should look
like has led us to start questioning the learning strategy behind that and other programs, and what that
strategy might look like. The work continues.

Key Takeaways
• Buy-in from senior stakeholders matters. When they ask the strategy question, it’s because they are interested in the answer and the value that answer brings to the organization.
• Make time for strategy. Strategy will inform what you do, and why you do it. Well-designed and relevant programs can last longer than the people who advocate for or manage them.
• When you’re asked the learning strategy question, start with yourself and those closest to you. From the learner’s perspective, you already have some of the answers. Our personal choices and actions as learners reveal our strategy—regardless of whether that strategy is articulated or documented.
• If you have to search the archives for a strategy documented long ago, then it’s time for a new strategy.
• Departmental and organizational choices further reveal strategic preferences for collective groups of learners. Even if it’s not articulated, if you deliver learning, you have a learning strategy driving your deliverables. Take time to reveal that strategy.
• You are not alone in trying to answer the strategy question, even if it feels like it sometimes. Consider what the question and a focus on strategy reveal about your learning department and organization at this moment in time. Then consider the benefits of an articulated strategy

on your deliverables, your learners, and your organizational goals. Being able to answer the
strategy question is critical in determining how to move forward and in shaping the future for any
learning organization.

Questions for Reflection and Further Action


When you’re asked to answer the strategy question, consider the following questions and action steps:

1. What do those closest to you have to say?


Action Step: Develop and launch an internal survey or conduct a focus group.

2. What do you think you know or believe? What biases are you aware of?
Action Step: Use writing as a tool and process for discovery. Map your journey.

3. What do others know?


 Action Step: Benchmark with other organizations, research the topic, ask many questions, and
document what you learn.

4. What influences within and beyond your organization have led to someone asking for strategy at this
time and in this space? What ideas or concepts keep bubbling up and resonate with your current
environment?
Action Step: Ask questions. Be curious. Take notes.

5. What does an articulated and documented strategy mean to your work and your deliverables? To the
future of learning at your organization? What needs to change? Why? How soon? Who needs to be
involved in and agree with those changes? Do you need to market or evangelize the strategy beyond
your immediate team? Is governance needed?
Action Step: Share what you have learned. Continue the journey.

4
So, You Want to Be Strategic?
John Kelly

I had a dream. I wanted to be a business partner. I wanted to work closely with business leaders, advising
them on ways they could use learning and organizational effectiveness to enable their business. I wanted
to work within a team of internal HR and talent consultants supporting an executive team. In short, I
wanted to be strategic. It was a role I had been encouraged to pursue in several different organizations, and
yet it always seemed just out of my grasp.

What Does It Mean to Be Strategic?


When I finally got the opportunity, I was so excited! I was working for a large global corporation
within an HR business partner team that welcomed me with open arms. I thought I had finally
found what I was looking for. But then, I started to receive requests to work on projects that seemed
transactional and unrelated to the business.
As collaborative and collegial as the team was, I came to realize that simply having a title or
being part of a senior internal consultant team doesn’t make you strategic. Therein lies the first of
the challenges to being strategic. If I had a dime for every time I heard someone say, “We need HR
to be more strategic” or “We need the learning function to be more strategic,” I would be a very
rich person. The reality is, I don’t think the speakers knew what being strategic meant, and I don’t
recall ever being given a specific process or a set of deliverables that would enable me to consult in
a strategic way.

The first step in any transformation is to define the change. So that is where we begin—what
does it mean for a learning professional to be strategic? I propose it means being in a role that works
closely with business leaders to enable business results through learning and organizational effec-
tiveness solutions.
We also need criteria to measure ourselves in this new role. How will we know if we are doing
strategic work? I took a cue from business leaders to answer this question. Typically, the most strategic
roles in an organization have the characteristics listed in the reflection box below. When I started
assessing the projects I worked on against this list, it was enlightening and frankly a little humbling. I
challenge you to do the same.

Reflection: How many of your projects:


• Support customer-facing roles?
• Are closely aligned to the strategic priorities of the organization?
• Have a wide scope of impact on the organization?
• Are directly affecting profit generation?
• Are as involved in the creation of business strategy as the implementation of business strategy?
• Have a clear business impact with measurable results?
• Are aligned with the future growth of the organization?

The goal is to be part of a team that is working with a business leader doing projects that meet
these criteria.

What Do I Need to Be Strategic?


According to Robinson and Robinson (2011), to be successful as a strategic business partner you
need three things: access, credibility, and trust (the ACT model):
• Access is being able to connect with the business leader client on a regular basis. This
doesn’t guarantee the work will be strategic, but we can’t get strategic work if they never
meet with us.
• Credibility is related to our expertise in learning and organizational effectiveness. We
must have deep knowledge and experience in learning theory and practice. If they do not
believe we can provide effective learning solutions, they will not reach out to us for help.
• Trust is essential. Without it, they will not be vulnerable or open up to us. We won’t be
allowed to be part of the needs analysis. We will only be invited when they think we can
provide a solution they have already selected.
This gives us a framework for creating a plan of action. For instance, when I realized that I was
not getting access to the plant manager at a location, I focused on having lunch with any members
of management who happened to be in the cafeteria. It started with supervisors who attended some

classes I taught. Soon, I was having lunch with their managers, followed by their directors. Within
a few months, I was having lunch with the plant manager. Through this I gained access.
The first time I was given a project, I worked tirelessly to learn about the business and its needs.
I produced a product that far exceeded the expectations of the plant’s senior leaders. By doing this
on a consistent basis, I was able to build credibility.
I also intentionally moved away from only “talking shop” and connected with people on a
more personal level. I would ask about their pain points and challenges and share mine with them.
Because I made the first move and was vulnerable with them, they began to open up to me. This
fostered trust.
Over a long period of time, almost anyone can organically gain access, credibility, and trust.
But, if we act intentionally, we can considerably shorten this period. Be aware, though, that if we
are only doing these things to achieve our goals, it will not work. In our heart of hearts, we must be
invested in their success as much as they are. Our behaviors must be rooted in an authentic desire
to help them succeed. Without authenticity, we will not be able to gain their trust.
Even with the ACT model, it is easy to fall into a transactional relationship with our clients.
If we wait for them to give us projects, we can only be as strategic as they let us. We need to be
proactive and put ourselves into a position where we can recommend learning solutions that we are
confident are business enablers.
If we are being proactive, our role begins to look like a sales consultant who is more concerned
about solving their client’s problems than making money. Our responsibility becomes learning their
business, building relationships, identifying needs, and recommending solutions that meet those
needs. We develop more of a collaborative exchange and are less likely to be waiting by the phone
to take an order. Thinking and acting like a sales consultant creates a new paradigm for our role.
We called this proactive process “learning strategy consultation.”

Reflection: How proactive are you?

How to Pursue a Learning Strategy Consultation Process


We have already determined that the word strategy is vague and undefined, and adding the word
learning to the front doesn’t really help. Consider this definition:

A learning strategy is a process that enhances the organization’s ability to accomplish tasks or activities that are mission critical; requires a depth of knowledge, skills, and experience; affects multiple parts of the business; and addresses significant competency and organizational effectiveness gaps.



Reflection: Take a minute to read that definition and note key words or phrases that jump out at you. For example:
• Process. It is not just a document or deliverable—it is a set of steps we take. It starts before we meet
with business leaders, includes a needs analysis and selection and implementation of learning solutions,
and concludes with an evaluation of impact. It continues until we deliver the expected impact to the
business.
• Enhances the organization’s ability. The activities should have a clear link to what will affect
the organization’s ability to perform.
• Mission critical. We need to prioritize our activities around the most important strategies of the business.
• Knowledge, skills, and experience. Our solutions should be a 70-20-10 blended approach
(experience, exposure, and education).
• Multiple parts of the business. The focus of the learning strategy would typically have the broadest
scope possible, rather than the needs of an individual or small group of employees.
• Competency and organizational effectiveness gaps. Not only do we address skill and
training needs, we also take a holistic approach considering all things that influence organizational
performance.

Creating a learning strategy has three benefits:


• It is a disciplined approach to solving capability challenges, ensuring that we are
addressing root causes and key contributing factors.
• It focuses development resources on the activities with the highest influence.
• It is a structured, planned approach that accelerates organizational change.
The learning strategy consultation process identifies the business strategy and aligns learning
solutions to enable the execution of the strategy.
In his book Flawless Consulting, Peter Block (2005) outlines a process for engaging internal or
external clients, ensuring that you are working on things that are either solving the root cause of the
problems they experienced or creating enablers for the opportunities they sought. It is this model,
with some modifications, that the learning strategy consultation process is based on.
To be clear, we will always be asked to do ad hoc, transactional work that may or may not be
enabling business strategy. Also, even if we create a learning strategy, there will be many tactical
and transactional aspects of implementing it. Let’s face it, if we asked a CEO—who is possibly the
most strategic person in your organization—to describe their weekly activities, I think we would
all be surprised by how much tactical and transactional work they really do. When implemented
correctly, the learning strategy consultation process ensures that the bulk of our work serves the
organization’s strategy.
The process is similar to a typical sales entry and account development process and follows the timeline a salesperson might experience (Figure 4-1).

Figure 4-1. Learning Strategy Consultation Process

The figure maps the process phases against an entry and account development timeline of 0–30, 30–60, 60–90, and 90+ days: the Learning Strategy Consultation phases of Preparation, Data Gathering and Needs Analysis, and Link Learning Solution to Business Need, followed by Project Planning, Project Management, and Evaluation.

The phases of this process include:


• Preparation. These are the things we do, gather, and review before meeting with the client.
They may include researching the organization, making plans, and selecting approaches. This
is about gaining insights via existing data from company reports, quarterly business review
presentations, strategy documents, and so forth; engaging the key stakeholders; and making
plans to initiate data gathering and analysis.
• Data gathering and needs analysis. Next, we meet with business leadership to get their
input on the organizational strategies and challenges. This can be conducted in several ways,
but it is best to get input from the entire leadership team in some way. We have found that the
most common and effective method is through structured interviews. This process is about
gathering and analyzing qualitative data.
• Link the learning solution to the business need. In this step, we determine the key issues
that need to be addressed based on the data we collected. This involves selecting priority needs
and relevant solutions and then proposing and gaining agreement on them from the business
leader. This can be a very collaborative approach, involve working more independently, or fall
somewhere in between depending on the relationship with your key stakeholders.
• Project planning. This is a typical project planning approach to executing the agreed-upon
solutions. It involves defining objectives, scope, timelines, budget, and a plan for evaluating
impact. It also involves the change management aspects of our solution.
• Project management. This is creating a regular cadence of revisiting the plan, addressing
slippage, and updating stakeholders on progress.
• Evaluation. Using the information gathered from our evaluation plan, we will create and
present a learning brief for our consultant partner and client.
Each phase of the process has a purpose, steps to do, tools to support implementation, and expected
deliverables to be completed. At the end of each phase is a project gate. Before passing through each gate,
a document or tool summarizes the work done, decisions made, or output of that phase. These documents
ensure consistency in following the phase steps before passing through the gate. Figure 4-2 shows the gates
and corresponding deliverables for the learning strategy consultation process.



Figure 4-2. Project Gates and Deliverables

Project Gates and their Gate Deliverables:
• Learning Strategy Consultation: Strategy Document
• Project Planning: Learning Charter Documents and Project Plan Document
• Project Management: Quarterly Business Review Scorecard
• Evaluation: Learning Briefs

As you can see in the figure, there are a few different types of deliverables:
• Strategy document. This summarizes the process followed, the identified business needs,
agreed-upon solutions, the budget, and high-level timelines. It is the proposal delivered to
clients. Essentially, if the strategy document is complete, all the steps of the preparation and
needs analysis process were followed.
• Learning charter. For each solution or learning initiative, a project charter is completed.
This outlines the purpose, description, people involved, and plan for evaluating the impact.
This is like charters used for project management.
• Project plan. The project plan takes the road map outlined in the proposal and the
information from the charter and converts them into a more detailed list of actions, due dates,
and people responsible. It can be stored in whichever tool you have access to. There should be
enough detail that there are tasks to complete and report on nearly every week.
• Quarterly business review scorecard. Each solution or learning initiative is reported on
using a quarterly business review with the client and consultant partner. A consistent, weekly
project management check-in cadence will ensure progress is maintained.
• Learning brief. Assuming SMART goals were used, and an impact evaluation plan was
created in the project charter, you can easily create a learning brief. This will consist of the
original project charter and the results of the project.

Preparation Phase
This may include researching the organization, making plans, selecting approaches, and so forth.
You will need to gather quantitative data, engage any internal consultant partners, and make plans
to initiate data gathering and analysis. This process occurs during the first 30 days, or as soon as the
client is assigned.
The key players are the business leader, any other internal consultants or partners, and your
team. During the preparation phase you’ll get to know them and their business, as well as share your
process and how you will work with them.
Be sure to gather relevant business, talent, and people data before you meet with them; this will
give you a preliminary view of their organization. You should gather data about the mission and
purpose of the organization: What do they make or do? What are they known for? Explore their
organization, structure, leadership, and culture. How are they currently performing? What are their
sales? What is their current employee engagement?
The time that clients can give you will be limited, no matter how committed they are to learning.
The research you do up front not only allows you to get more quickly to the strategic needs of the
business; it also demonstrates to them that you understand their business and are invested in
them. These actions will help you build credibility and trust.
Hint: If you work within a team of internal consultants, fostering a productive and collaborative
partnership is critical to the success of your consultation. Prior to engaging the client, meet with your
consultant partner to get to know them, agree on how you will work together, plan your approach
to the learning strategy consultation, and come to an agreement regarding access to any relevant
information.

Reflection: What can get in the way of a productive relationship with internal consultant partners?

Data Gathering Approach


A successful preparation phase will result in agreement between the client, the consultant partner,
and your team to conduct some form of data gathering and needs analysis. The more the leadership
team participates in the needs analysis process, the more accurate the outcomes will be. In addi-
tion, they will be more engaged and therefore implementation will be more successful. The two
approaches I have found most effective are a team-facilitated experience or individual structured
interviews.



Team-Facilitated Experience
The team-facilitated experience involves bringing the leadership team together and moving them
through a process of discovery (Figure 4-3).

Figure 4-3. Process of Discovery

For example, my team used four activities to guide our client through the learning strategy consultation. The mission activity explored the organization’s current and future mission and vision. The SWOT (strengths, weaknesses, opportunities, and threats) activity facilitated a discussion around how prepared the organization was to deliver on that mission. From the SWOT, we identified business implications: the changes that had to be put into place to enable the mission. Finally, from the business implications, we identified learning solutions that would enable those changes.
Instructions for developing a mission statement and conducting a SWOT are available on the
Internet. The business and solution implication activities can be more freeform, pulling findings from
the SWOT activity. For instance, the team might agree that one business implication from the SWOT
activity was to increase sales using a more consultative sales approach. The solution implications might include consultative sales training, interpersonal skills training, and having less experienced salespeople ride along with more tenured ones.

Individual Structured Interviews


Structured interviews are one-on-one conversations with clients. They are called “structured”
because we ask the same questions of every business leader. This allows us to easily find key
themes that are common to all or at least most of the leadership team. If we conducted more
open-ended interviews, we could quite easily find ourselves having to compare apples to oranges
to bicycles to fish. The questions we ask in our structured interviews fall into the categories shown
in Figure 4-4.

Figure 4-4. Questions for Structured Interviews

Once all the interviews are completed, we (and often our consultant partner) lock ourselves in a room
and analyze the data. We typically use an affinity sort activity in which we review our notes from all the
interviews and write each bit of data on a separate sticky note. We then stick them to a whiteboard and
begin collating sticky notes with common bits of data on them. Then we review each group and identify
the common theme it represents. These common themes become our business implications. Instructions
for this activity can be found on the Internet.
Figure 4-5 shows the before and after of an affinity sort activity.

Figure 4-5. An Affinity Sort Activity, Before and After

Before Affinity Sort After Affinity Sort

As with the facilitated team experience, you would identify the solution implications based on the
business implications that emerged from the affinity sort.
There are many ways to analyze data, and the affinity sort is only one of them. Whether it is a
facilitated team experience or structured interview approach, using an objective method of analysis
helps the team come to a data-based decision about not only the needs of the business, but also their



root causes. It also helps you avoid treating the symptoms without curing the disease. If your approach
was successful, you should now have a list of business implications linked to learning solutions.
Now you are ready to provide feedback to the client. Your presentation should include a summary
of the business strategy, business needs, roles critical to those needs, and any gaps or opportunities they
shared with you. Once you confirm that they agree with your summary of their current state, you can
share the learning or organizational effectiveness solution that you have determined corresponds to each
of their needs.
It is important that they confirm and agree to what their needs are before you propose any solutions.
This allows them to see the thread that starts with their mission, passes through their current needs, and
connects to your proposed solutions. If you were to simply propose your solution without connecting it to
their stated needs, you are more likely to experience unnecessary objections or resistance.

Project Planning and Management


The project planning and project management phases are simply the application of basic project manage-
ment practices. The most important aspect is the weekly cadence of reviewing the project plan and iden-
tifying the tasks that will move the project forward that week. We are all very busy and juggle multiple
priorities, and these sorts of projects are typically long term (lasting between 18 months and five years).
Putting out the fires of any given day or week can easily push these projects to the bottom of the priority
list. Also, because learning is what Stephen Covey would call “important but not urgent,” you may find
yourself very far behind in your projects by the time the client remembers to ask for a status update.

Evaluation
There are many ways to measure the success of a learning project. I like to use Kirkpatrick’s levels of
evaluation to create an evaluation plan. My group typically measures to Level 3 (application of skills)
or 4 (business impact). If we are evaluating to Level 4, it is standard practice to measure the three levels
that lead up to it. For instance, if we claimed that our training increased sales by 25 percent (Level 4), but
participants said they hated the training (Level 1), could not demonstrate mastery of skills (Level 2), and
never applied them when making sales calls (Level 3), it’s very unlikely that our training was responsible
for the sales increase.
Whatever we are measuring, it should align with the outcomes identified with the client and consul-
tant partner before the project began.
The evaluation plan should list the evaluation level, the metrics you will measure, and the measure-
ment methods used. Figure 4-6 shows an evaluation plan with examples of what the typical metrics and
measurement methods might be.

Figure 4-6. Sample Evaluation Plan Showing the Four Kirkpatrick Levels

Level 1 (Reaction)
• Metrics: Worthwhile use of my time; relevant to my work; recommend to others
• Example Measurement Methods: Survey participants

Level 2 (Learning)
• Metrics: Successful completion and output of process; met the objectives of the process; confident can apply what was learned; learned new knowledge and skills
• Example Measurement Methods: Action plans from strategy planning, new organizational design, and team behavior commitments; survey participants and manager; test or performance assessment

Level 3 (Application)
• Metrics: Apply new knowledge and skills on the job; level of effectiveness of knowledge and skills applied; barriers to applying new knowledge and skills
• Example Measurement Methods: Percent of action plan implemented; survey of participants, managers, customers, etc.; audits and observe behavior; pre- and post-survey of situation

Level 4 (Business Impact)
• Metrics: Percent of key positions with “ready now” successor; employee net promoter score of participants; retention of participants; revenue; productivity; monetary savings; voluntary turnover; net promoter score; improved productivity; reduced costs; improved processes; increased employee engagement; increased innovation; increased customer retention
• Example Measurement Methods: Survey of stakeholders after more than 90 days, requesting a valid report of business impact, an estimation of the contribution of the learning solution to that result, and a pre- and post-comparison of business performance

Summary
Being strategic is a relatively simple concept, but it is very challenging to achieve. It requires sound
technical skills in the learning and organizational effectiveness domain. We must balance that with
relationship-building skills. Finally, we must use an effective, disciplined, and repeatable process
to achieve results that enable business success. This requires a unique combination of hard and
soft skills, as well as the ability to be both flexible and structured. Being strategic is a formidable
challenge indeed, but one that is worth taking on.



Key Takeaways
• Having a title or role that sounds strategic does not guarantee that the projects you receive will be strategic. You need to be proactive in seeking out relationships and projects that influence the business in the most strategic way.
• Being proactive means shifting your mindset from being an order-taking salesperson to being like a sales consultant. Learn about the business, build relationships, understand their pain points, and recommend the most effective solutions.
• Using the ACT model (access, credibility, and trust) will guide where you should focus when building relationships and your own capabilities to support the business leader’s needs.
• Using a disciplined process will increase the likelihood that you are working on root causes, selecting effective solutions targeted at the most important needs of the business, implementing lasting change, and measuring impact.
• Working on strategic projects is no guarantee you will never be doing ad hoc, transactional work. However, if you apply the disciplined, consultative process, you can be confident that you are achieving the highest possible value for that work.

Questions for Reflection and Further Action


1. In what ways can you be more strategic within your environment and context?

2. What skills do you need to improve or build to be more strategic?

3. In what ways would changing how you view your role help make you more strategic?

4. How can you begin your journey to be more strategic? What are the first three steps you need to
take? When and how will you take them?

Section 2
Managing Processes
and Projects

This section is about work that is uniquely the functional domain of learning. It starts with a variety of ways to work with senior business executives to assess the capability of the organization as a system of systems. This view is much larger than learning and includes not only people capabilities but also resources, communications, and culture. It features many tools and techniques to assess the organizational needs to excel in the current market and to be competitive in the future. The ideas are based on learning research but also include techniques from other fields, such as Lean, Agile, project management, and organization development.

How do you design the work systems in your organization to ensure they are meeting current and future requirements? How do these processes enhance organizational alignment? This section demonstrates some of the many ways the learning team can make an impact on organizational goals and business results using a variety of core learning processes to build a more competent domain work system to use in service to business partners.

Chapter 5 focuses on assessing the current state and a clear understanding of a desired future state to determine performance gaps. As experts in the realm of understanding employee needs and designing learning experiences, learning leaders generally have an expansive toolkit. We can use our experience to bring context and purpose together to develop a clear and succinct point of view for when and how to use a high-impact tool or technique. But perhaps the biggest conundrum we face with the plethora of tools is how to decide how to decide. Which is the best and most efficient way to close the gap? In this chapter, Chris Garton provides insights on several ways to assess needs. During the 2019 Fall Lab show-and-tell sessions, Jill Carter, Mark Lemon, Taylor Harlin, and Susana Sipes shared their written case studies and discussion presentations. Based on follow-up activities with participants and interviews with the presenters, these practices and tools are included.

Once organizational needs are determined to be associated with employee capability building, learning leaders can delve deeper into the gap and work with their business partners to determine the most effective learning solutions. Chapter 6 addresses one of the most pressing issues we face—what is commonly referred to as the 1-800-TRAIN problem. The manager calls the learning department and asks for specific training without any assessment or analysis or skill in learning theory. Jerry Kaminski provides some suggestions for working with this type of request, which is usually a triage situation. Once the request is addressed, the experienced and skilled business learning leader can use their tools in instructional systems design to focus on the root cause of the gap.

Chapter 7 highlights the many ways neuroscience research is elevating learning practices, especially to make them stickier and more durable over the long term. In the workplace, learning is about changing performance-based behaviors as a result of new skills and knowledge. It is not so much about what we know but what we do, the actions we take. Leanne Drennan, Casey Garhart, and Joan McKarnan of IBM share research from neuroscience and then provide actions to take. One example they share is using priming questions to help connect new information with what is already known.

In chapter 8, Suzanne Frawley shares details on using a project-based approach to respond to a business partner’s request. She uses the Covey adage “Start with the end in mind” by gaining clarity on the organizational need with this question: What is the business challenge your team is trying to solve? As part of the collaborative work, she incorporates design thinking methods and tools both to work on a learning solution and to build the design team’s problem-solving capability. The big takeaway is the value of iteration as the solution is built.

While many outside of the talent profession still think of learning as a stand-alone event divorced from work itself, learning happens in many different ways and is a continuous process or journey for most workers. There are a variety of ways learning is built into the organizational system, including formal instruction in the classroom, working with colleagues, learning from experiences, and independent learning. In chapter 9, Alan Abbott of UPS and Rachel Hutchinson of Hilti share how they use the 70-20-10 framework to build a more comprehensive learning process. The 70-20-10 framework does not eliminate formal training, and in many cases recognizes that formal training is the starting point. However, it does recognize that learning can take place in a variety of formats and locations and suggests using a variety of learning modalities—formal learning, social learning with others, and learning from a variety of experiences.
5
Organizational Needs:
Determining Gaps and
Aligning Solutions
Chris Garton and MJ Hall

Consider Jacob, a learning and development manager for an outdoor gear company. Jacob leads a team
that helps deliver training in a customer service call center, providing support for a variety of outdoor
products. In his quest to become the ultimate business learning advisor for his partners, he frequently
researches and tries out new strategic tools to improve his efficiency and stay ahead of the game.

Making a Strategic Impact


Since a new CEO took ownership of the company two months ago, Jacob has seen numerous requests
land on his plate. Under this new leadership, the company wants to roll out a comprehensive restructuring
program that will drive better collaboration among siloed departments. Jacob and his team have been
tasked with owning the leadership training program and ensuring the success of the new vision.
Knowing he has a challenging road ahead, Jacob starts by setting up a brainstorming session with
leaders of departments across the company to come up with ideas. The group provides numerous potential plans, and Jacob walks away from the meeting feeling a major sense of accomplishment.
Unfortunately, by the next day, that positive feeling has quickly faded and dread has sunk in. He has no
idea where to start.

Impact Effort Matrix
For anyone who has ever found themselves in a state of paralysis due to information overload,
fear not! Sometimes figuring out how to decide something is an even bigger obstacle than the deci-
sion itself, leaving us to waste valuable time simply identifying what to tackle first. There are many
strategies and tools available to help solve this problem, but perhaps none are simpler to grasp and
implement than the impact effort matrix (also known as an action priority matrix).
So how does it work? The basic premise is two axes—the vertical axis measures how much
potential an idea has to make a big impact. The horizontal axis tracks the level of effort needed to
turn the idea into reality (Figure 5-1). For anyone familiar with the principles of Six Sigma, this
will be a familiar concept; for those familiar with design thinking, it is similar to the importance/
difficulty matrix.

Figure 5-1. Impact Effort Matrix

The figure plots Impact (low to high) on the vertical axis against Effort (low to high) on the horizontal axis, forming four quadrants: Quick Wins, Major Projects, Fill-In Jobs, and Thankless Tasks.

The four quadrants in the matrix can be understood by their key components:
• Quick wins. High-impact and low-effort items that will bring a lot of bang for their buck.
• Major projects. Greater effort is needed to bring these concepts to life, but they provide a
solid payoff in the end.
• Fill-in jobs. The items in this quadrant may be easy to do, but they won’t provide significant
benefits.
• Thankless tasks. Don’t do these time-wasters.
Jacob is no stranger to finding the right tool for the job. So he quickly turns to the impact effort
matrix to start plotting where the ideas from his brainstorming meeting should fall. By categorizing
each idea, what originally seemed like a chaotic mix of concepts quickly becomes an orderly guide

for strategic prioritization. Knowing he needs some quick wins to gain initial trust with the new CEO,
Jacob leads his team and partners through a series of introductory collaboration workshops to get the
ball rolling on the new vision. Now one step closer to becoming a true business learning advisor, Jacob
begins planning the major projects that will take the team to the goal.
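For teams that keep their brainstormed ideas in a spreadsheet or a short script rather than on a whiteboard, the matrix’s sorting logic is simple enough to sketch in code. The snippet below is a minimal illustration only: the idea names, the 1-to-5 scores, and the cutoff of 3 are hypothetical assumptions, not details from Jacob’s case.

    # Illustrative sketch of impact effort matrix sorting.
    # Idea names, scores (1-5), and the cutoff of 3 are hypothetical.
    def quadrant(impact, effort, cutoff=3):
        """Return the impact effort quadrant for a scored idea."""
        if impact >= cutoff:
            return "Quick Win" if effort < cutoff else "Major Project"
        return "Fill-In Job" if effort < cutoff else "Thankless Task"

    ideas = {
        "Introductory collaboration workshops": (4, 2),   # (impact, effort)
        "Cross-department leadership curriculum": (5, 5),
        "Refresh intranet FAQ pages": (2, 1),
        "Rebuild legacy e-learning from scratch": (2, 5),
    }

    for name, (impact, effort) in ideas.items():
        print(f"{quadrant(impact, effort):>14}: {name}")

Re-scoring and re-running as estimates change keeps the prioritization conversation grounded in something everyone can see.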

Eisenhower Matrix

“What is important is seldom urgent and what is urgent is seldom important.”—Dwight D. Eisenhower

A single tool is seldom sufficient to build something great or solve a complex problem. That certainly
explains why toolboxes exist—so we can use a combination of the right ones for each job. A complement
to the impact effort matrix is another tool that uses two axes to create four quadrants: the Eisenhower
matrix (Figure 5-2).

Figure 5-2. Eisenhower Matrix

The figure plots Importance (less to more) on the vertical axis against Urgency (less to more) on the horizontal axis, forming four quadrants: Do, Decide, Delegate, and Delete.

Also known as the urgent/important matrix, this tool was named after President and U.S.
Army Gen. Dwight D. Eisenhower and was a key productivity strategy for him. In this matrix, one
axis is dedicated to level of urgency, and the other focuses on importance. This results in four quad-
rants in which tasks can be grouped to help identify the right action to take for each one. A popular
version titles each box as follows:
• Do. These tasks are both urgent and important—focus on them first.
• Decide. Items in this box are important to do but do not carry high urgency. Decide by
scheduling a future time to accomplish these tasks.

• Delegate. Tasks found here aren’t vital but need to be accomplished quickly. Delegate
them to others.
• Delete. The lower-right box is neither important nor urgent. Whenever possible, try to
eliminate these items altogether.
All too frequently, leaders don’t even realize they’re spending their time focused on tasks in the
wrong quadrant. These four simple boxes can help you make more strategic decisions about how to
prioritize and be more effective.
Now that Jacob has determined the most vital ideas from his discussions with business partners
to implement, he opens up his toolkit and realizes that pairing two tools may be advantageous for
his current position. The impact effort matrix created a foundation that established the right areas
to focus on and when to leverage them, but he won’t get very far with the idea unless he plans his
route to the finish line. Keep in mind, in addition to this project, he still has his normal responsibil-
ities, and the tasks and requests he needs to handle on a daily basis have certainly not slowed down.
So, Jacob begins listing the main deliverables from the collaboration initiative and places them
in the appropriate boxes of the Eisenhower matrix. He finds that many fall into the “decide” cate-
gory, which is an ideal location to focus on. Using proactive measures to keep important items on
track will ensure he doesn’t end up with all tasks becoming urgent.
Next, Jacob pulls out his to-do list of tasks that were not included in the scope of the project and
plots those across the Eisenhower matrix. The resulting view provides a cohesive direction allowing
Jacob to place an appropriate level of focus on each task and ultimately deliver a successful training
product that’s aligned with the company’s new vision.
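If you track your running task list in the same spreadsheet or script, the routing logic of the Eisenhower matrix can be expressed in a few lines as well. Again, this is only a sketch: the tasks and their importance and urgency flags below are invented for illustration and do not come from the chapter.

    # Illustrative sketch of Eisenhower matrix routing.
    # The tasks and their True/False flags are hypothetical examples.
    def eisenhower_action(important, urgent):
        """Map a task's importance and urgency to the recommended action."""
        if important and urgent:
            return "Do"
        if important:
            return "Decide"    # schedule a future time to do it
        if urgent:
            return "Delegate"
        return "Delete"

    tasks = [
        ("Fix workshop registration for tomorrow", True, True),
        ("Draft the collaboration program evaluation plan", True, False),
        ("Book rooms for next month's sessions", False, True),
        ("Reformat old slide templates", False, False),
    ]

    for name, important, urgent in tasks:
        print(f"{eisenhower_action(important, urgent):>8}: {name}")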

Optimal Strategic Zone


As you reflect on how these tools can help increase effectiveness in your own space, there’s one additional model that should always be included in your thought processes and decisions. Steven J. Stowell’s optimal strategic zone concept reflects the idea that there are certain times when there
is true advantage to foresight, thorough planning, and strategic review. But in other situations, a
routine job just needs to get done—simple as that.
Finding yourself over-strategizing day-to-day items is not only exhausting; it’s a waste of time. On
the other hand, under-strategizing a key initiative that can bring real value may leave you in a position
where failures were avoidable, and a great idea ends up in the dustbin due to poor planning. In the space
between these two extremes lies a sweet spot known as the optimal strategic zone (Figure 5-3).
This model presents the same measures we have seen before: importance-impact and energy-effort.
Although they’re used in a different way, it is not difficult to see a common thread between the concepts
presented in this section. If you use these threads with the right tools, you’ll find they create an incredibly
strong fabric to enhance your decision-making process.

Figure 5-3. Optimal Strategic Zone

Assessing Needs
Developing solutions to satisfy organizational needs is one of the primary roles business learning
advisors play. However, while most think these solutions are always focused on learning, the role is
actually about collaboratively working with others in the enterprise to understand the organization.
From that perspective, they then strategically look for the best solution to the root cause of each identified need—and that is not always a learning solution.
According to ATD’s 2018 report Needs Assessments: Design and Execution for Success, “Needs assess-
ments are an important part of organizational efforts to improve the overall business.” ATD defines
and differentiates between needs assessments and training needs assessments in this way:

Needs assessment: the process for determining and addressing gaps between current and desired conditions. This is a broader process than training needs assessments. Training
and nontraining solutions may close the existing gaps.

Training needs assessment: the process of identifying how training can help an
organization reach its goals. Specifically, training needs assessments are used when
training has been identified as the solution necessary to close a gap. Training needs
assessments can be a subset of broader organizational needs assessments.

ATD found that approximately 56 percent of organizations conduct needs assessments—but
when they did, 68 percent thought the assessments were highly or very highly effective: “The top
three challenges associated with needs assessments were stakeholders already believing they know
the needs (70 percent), the extensive time required to conduct the needs assessment (50 percent), and
getting buy-in from others such as business leaders (44 percent).”
Based on the types of needs they identify, the approaches TD professionals can use will range
widely—from transactional to extremely strategic. There are also many different tools and tech-
niques for each of these approaches that come from a variety of fields, including quality management
disciplines (such as Agile, Lean, and Hoshin Kanri) and design thinking and human performance
improvement. In a survey of the members attending a Forum Lab on using tools and techniques to
be more strategic, the following tools were identified:
• 4 Disciplines of Execution
• 5 Moments of Need
• A3 Problem Solving
• Action Mapping With Cathy Moore
• ADKAR Change Model (awareness, desire, knowledge, ability, reinforcement)
• Agile and Minimum Viable Product
• Applied Systems Thinking (from Peter Senge)
• Baldrige Organizational Profile
• Concept Maps
• Design Thinking
• Edgar Schein’s Organizational Culture
• Gilbert’s Grid
• Hedgehog Concept for Organizational Focus
• Hoshin Kanri
• HPI (human performance improvement)
• Iceberg Model for Systems Thinking
• Learning Organization Maturity Model (with Josh Bersin)
• Mager’s Front-End Analysis
• MDMP (military decision-making process)
• OKR (objectives and key results)
• Predictive Analytics
• Situational Leadership
• Strategic Thinking Manifesto
• Strategic Leadership Model (from the Center for Creative Leadership)
• The Six Boxes
• Thinker Toys
• Voice of the Customer.
Regardless of the tool or technique, according to Roger Kaufman and Ingrid Guerra-López (2013), "Differentiating between wants and needs is crucial, and doing so from the beginning saves time, money, and endless amounts of frustration." Another vital suggestion is using a variety of tools to collect data in
different ways. Many experts recommend triangulating the data.
Once you have clarity on the needs, and are sure that they are not wants, it’s imperative to isolate
the factors and separate the gaps you can close with training or learning experiences from the ones that
require environmental changes, additional resources, or other nontraining solutions. From that point, you
can leverage tools and techniques to delve deeper into analyzing the specific training needs.

As part of the Forum Labs and case studies, members of the Forum share tools and techniques
with one another using zing rounds, carousels, and taped recordings. We captured some of the recent
cases on identifying gaps and aligning solutions to business goals to include in this book. We gathered data from members sharing in informal small groups, from discussions and reflections among participants in those groups, and from teach-back stories participants developed.

Elaine Biech’s Tools for Gathering Data


According to ATD research, the top challenge to a successful needs assessment is stakeholders
believing they already know the organization’s needs. Kaufman and Guerra-López (2013) agree,
stating that business learning advisors need to minimize this challenge by focusing on asking the
right questions: “Asking and answering the right questions is what’s going to make you a successful
talent development professional.”
Elaine Biech, an icon in the talent area, has spent her life helping talent development profes-
sionals improve their practice. One of the most generous ways she does this is writing and publish-
ing her ideas, practical tools, and techniques. Many of her tools provide excellent questions to ask
to learn more about the current state and needs of the organization. Most of the tools we discuss
here are from her book Thriving Through Change: A Leader’s Practical Guide to Change Mastery (2007), and
others are available as free downloads on the ATD website. During the Forum Lab, Jill Carter and
Mark Lemon of Intermountain Healthcare shared how they use some of these assessments.
These tools follow Biech’s six-step change process:
1. Challenge the current state.
2. Harmonize and align leadership.
3. Activate commitment.
4. Nurture and formalize a design.
5. Guide implementation.
6. Evaluate and institutionalize change.

Why Is It Important?
Whether you call these tools scans, surveys, assessments, or diagnostics, they serve the same purpose: helping leaders understand the current status and needs of employees (and, in doing so, gain empathy with their perspectives). When an organization sets a goal, it is always about change, and learning is about changing knowledge, skills, and behaviors: what we know and can do as individuals and organizations.
Carter and Lemon provided details for several ready-to-use tools for change readiness, including the
following from Biech’s book (worksheet numbers provided; Figure 5-4).

Figure 5-4. Elaine Biech’s Change Readiness Tool

• Change Management Skills Assessment (worksheet 2-1). Assess your competency as a change agent.
• Organizational Change Readiness Audit (worksheet 4-1). Rate the organization's readiness from a qualitative perspective.
• Change Readiness Predictor (worksheet 4-2). Rate your view and key stakeholders' views of the organization's readiness.
• Change Management Employee Expectations: What Do You Want? Rate employee expectations regarding a specific change.

When Are These Assessments Used?


These tools are designed to help change leaders and change agents efficiently move organizations through
change. They typically get the most attention with organization-level changes, but can also be used for
individual lines of business and departments. The tool you use depends on the change step you are focus-
ing on. The assumption with many of the assessments is that they document important information that
can then be shared with whoever needs it to initiate or implement change.
For example, if an organization were to implement a restructuring effort, the Biech organizational
change readiness audit would be useful for gathering qualitative information from stakeholders regard-
ing actions that have been successful in the past and what actions may be necessary to achieve a success-
ful change in the future. Regarding this same restructuring effort, Biech’s change readiness predictor
tool may be used to gather quantitative data from key stakeholders about what factors are needed to
bring about successful change in the organization. Using these tools creates a holistic database of the
qualitative and quantitative data gathered throughout the entire organization. Finally, it is important to identify employee expectations regarding a change; the change management employee expectations tool can be used for this purpose. Check in with employees on the front end, throughout the transformation process, and again at the end to ensure they have the information they need to support the change and to help others do the same. Figure 5-5 uses examples from healthcare showing why, when, and how some of the tools might be used.

What Is the Process?


In Thriving Through Change, Biech provides a self-explanatory toolkit covering a variety of topics related to change, such as expectations and readiness. The key to success, however, is to use the tools to gain information from your employees and stakeholders. According to Biech, the questions serve as a guide—she never uses the same question twice, always customizing them for the unique audience.
Forum presenters Carter and Lemon suggest that you can compile qualitative and quantitative
data from multiple tools and groups to guide impactful conversations about the change prior to determining the change process. This creates a more holistic approach to gathering data prior to
developing the strategy for the change.

Figure 5-5. Example Uses of Elaine Biech’s Change Tools in Healthcare

Change Management Skills Assessment
• Why? Leading a new initiative in telecommuting and flexible workplace arrangements
• When? Front end of the change process prior to change strategy development
• How? The change agents assess their own capability to lead change

Organizational Change Readiness Audit
• Why? Leading a new initiative on teaming across functional boundaries (such as community-based care, specialty-based care, and Teladoc)
• When? Front end of the change process prior to change strategy development
• How? Qualitative data gathered through interviews across the entire organization and at every level to determine which change efforts have been successful, which have failed, and why

Change Readiness Predictor
• Why? Leading a new change initiative that proposes merging two large departments (patient safety and patient satisfaction)
• When? Front end of the change process prior to change strategy development
• How? Quantitative data gathered from key stakeholders to determine their perspective on whether the organization is ready for such large-scale change

Change Management Employee Expectations: What Do You Want?
• Why? Leading a new initiative going from private offices to open space cubicles
• When? Front end of the change process prior to change strategy development
• How? Electronic survey to all employees to determine what they need to know about the change to drive a successful change effort

Caution! What Do I Need to Watch Out For?


Examine the tools you plan to use and consider those who will be experiencing change. Biech provides
a proactive set of questions you can ask to explore what employees need before the change begins. Use
what is appropriate from these resources, and then use the other questions as needed. Additionally, set
expectations for what will be done with the data and what will be happening as a result. Be intentional
about managing employee expectations regarding the change to ensure they have enough information
to understand it.

Investigative Report Organizational Scan


Another tool for gathering data to gain a deeper understanding of the organization is the investigative
report organizational scan from Needs Assessment Basics, 2nd edition (2016). This was shared as part of a
gallery walk featuring posters of other tools not discussed in the Forum Lab. The investigative report orga-
nizational scan is a fun way to gather data and informally but intentionally keep tabs on the organization.
The goal is to look at the current environment with a fresh set of eyes and a different perspective.

Why Is It Important?
As business learning advisors, our role is much more expansive than simply designing and delivering
training. The talent development profession needs to see and understand the entire organizational system
to effectively address current competencies and future capabilities.

What Is the Process?


This is a framework that imitates the investigative processes and questions used by a news reporter for a
big story. Figure 5-6 lists several actions and how to accomplish them.

Figure 5-6. Investigative Report Organizational Scan Tool

Get out into the organization on a regular basis and go to different places.
• Eat lunch with different people.
• Vary the places you take your break.
• Walk around the production floor and observe (with permission of the manager, of course).
• Ask managers of departments you work with to "ride along" or "sit along" with some of the employees in their departments.

Ask a lot of what and how questions.
• What do you mean by that?
• How would this look if it were different?
• What is happening that should not be happening?
• What is not happening that should be happening?
• How would you solve this if you could?

Think about things in different ways.
• What is the root cause of this?
• How does this factor depend upon other factors?
• What analogy would capture the way this process looks and feels?
• If [insert subject] were facing this problem or challenge, how would they handle it? (Use either a person known for being a creative problem solver—Thomas Edison, Steve Jobs, Albert Einstein—or a metaphor such as a baseball team, army platoon, or race car crew.)

Keep information and arrange for seeing patterns.
• Write notes on an index card or use OneNote or another digital technique.
• Maintain a file of clippings from internal and external sources.
• Use sketchnotes, index cards, or sticky notes to post the collection in an area you can easily see; use arrows to make connections or boxes for repeats.

Allow yourself thinking time.
• It generally takes time and distance for patterns, insights, and big-picture concepts to occur to you.

When and Where Is It Used?


Because this is a technique to take the pulse of the organization, it is recommended that you use it
all the time. However, to give it power, summarize your general findings monthly or at least quarterly
and use them for planning. For a specific project, the process needs to be more regimented and over
a shorter timeframe.

Impact Map
The impact map—adapted from the work of Robert Brinkerhoff—is a tool that directly links a
service, product, or experience to a result, which creates a line of sight and, consequently, alignment
between the two. At the highest levels, this can link a strategy, product, or service design to the results.
It can then demonstrate how the new knowledge improved the desired behaviors and how those
behaviors affected key job results and organizational goals. At the Forum Lab, this technique was
presented by Taylor Harlin, an experienced user from Johnsonville. Figure 5-7 presents the standard
impact map tool.

Figure 5-7. Impact Map

Skills and Knowledge → On-the-Job Application → Key Job Results → Organizational Goals

Why Is It Important?
During the lab, Harlin shared how the tool helped his organization with alignment and commitment
to behavior change. As part of the discussion, he gathered ideas from others on the tool’s pros and
cons. The tool is designed to show the relevance between a course of action and the business goals and
objectives. It is important to create alignment and visibility to show how any event, activity, or strategy
will improve business performance to demonstrate value, but also to ensure customers are getting what
they need and want.
In a learning environment, the impact map helps participants understand how the knowledge
should influence themselves, their teams, and the organization, while also reinforcing why learning is
important. In other words, it provides what Simon Sinek calls “the Big Why.” It also helps the learning
program designers ensure they are providing the right content to address the need. Cory Bouck (2013),
author of Lens of Leadership: Being the Leader Others Want to Follow, refers to the impact map as part of the
talent professional’s “ultimate tool kit.”

When and Where Is It Used?


The impact map can also be used for many things outside L&D, including product and service design.
When used as part of the learning solution, it can be leveraged during the “needs” part of the ADDIE
process. It is useful for visually representing the needs assessment because it links the desired organi-
zational improvements to the learning experience being designed. It can also be used by learners and
their managers as pre-work to clarify why the course is important to their and the organization’s success.
Additionally, it can be used for impact evaluation; for example, after a specified period of time
(such as three to six months), the L&D team could follow up with a learner and their manager to eval-
uate change in behavior and how this has affected the team’s and organization’s results.

An impact map serves several functions, including learning experience design, learning application, evaluation, and course relevance. It can be applied through three lenses: a specific learning experience (course, e-learning, job shadow, or developmental assignment), a specific program or group of courses, or the entire learning and development strategy.
If you were designing a course, program, or L&D strategy, the tool would flow from the right to the
left. Beginning with the goals set by the organization, you could determine what key job results would
be needed to deliver on them. From there you could determine what skills, attitudes, and behaviors are needed to achieve those results. Once you know what skills, attitudes,
and behaviors are needed, you could perform a gap assessment to understand what training is needed
from L&D. This then directly ties the development to the organization’s goals.
For Harlin’s organization, the impact map is used as a discussion tool before a learning experience
between a coach (manager) and a member to clarify why and how what they learn will be applied on
the job. This ensures that both sides are clear on how application will affect performance.

What Is the Process?


There are several approaches that can be used. Employees can fill in the map with their coach, which
enables good conversations about what they’ll get out of the class. This can then be shared with the orga-
nization development and learning team so they know what participants are focused on and can ensure
enough time is spent on the right content. After the class, the coach and the member can have regular
discussions about the impact map and actions they are taking to ensure the intended impact is achieved.
You can also use a completed impact map to evaluate the effectiveness of a learning experience by
assigning levels of evaluation to each column. Skills and knowledge correspond to a Level 2 evaluation; on-the-job application corresponds to a Level 3 evaluation, pending validation of application; and the change in the selected results can achieve a Level 4 and potentially a Level 5 evaluation.
Another recommended use is to determine the relevancy of the current content in alignment
with top business goals. To do this, begin by looking at the organizational goals on the right and
working left toward the skills and knowledge.

Caution! What Do I Need to Watch Out for?


According to Harlin, when implementing a new tool like an impact map, leadership buy-in may be a
challenge because it is a collaborative effort with managers. Another challenge is educating the organi-
zation, and specifically the managers using the tool. While it looks simple on paper, employees and their
managers can get stuck if they don’t know how to use it. Once the impact map process is in place, the
toughest part is ensuring accountability to use it consistently for course and program design.
Accountability for use and follow-up conversations are imperative to sustaining the plan and process. Additionally, process discipline is needed to ensure you are evaluating the programs and courses to confirm that the knowledge and skills still support the organizational goals as those goals change; in other words, that the experience remains relevant.
The most common issue occurs when participants leave class and, despite spending the time to develop
their map, never look at it again, especially if their performance coach or manager doesn’t hold them
accountable. When this happens, the business impact will not be achieved, and neither will the return on investment.
During a discussion session with senior learning leaders at the Forum Lab, some members mentioned
that they changed the column labels. In other words, the tool provides a framework for aligning learning
goals to business outcomes, which organizations can adapt to better fit their needs as long as the tool’s
original intent is not lost.

McKinsey 7-S Model


The McKinsey 7-S framework is used for organizational analysis and design, and was presented by Susana Sipes from Grainger during the Forum Lab. The foundational premise is that organizations are most effective when seven internal elements are aligned and reinforce one another: strategy, structure, systems, style, staff, skills, and shared values.
The framework defines these seven as either hard or soft S’s. The hard S’s are the more tangible
aspects that can easily be identified as the what in an organization:
• Strategy. The organization’s approach for achieving goals and objectives for moving into the
future.
• Structure. How a company is organized to drive accountability, including direct reporting
lines and decision-making authority.
• Systems. The platforms and processes that support the business and how work gets done
from an operational standpoint.
In contrast, the soft S’s are driven more by culture—the who and how of the organization:
• Staff. Employees, their capabilities and compensation, and how they are attracted and hired.
• Skills. Capabilities within the organization to meet current and future goals and objectives.
• Style. The culture of the organization and how people work.
• Shared values. How an organization defines its purpose and what drives meaning.
This framework ensures a comprehensive view of the many interdependencies that are often overlooked when deciding on change initiatives.
The McKinsey 7-S framework provides a structure to analyze the current state of the organization at
the most strategic level. The results include identifying interdependencies, gaps, and the effect of a change.
The addition of the softer parts of the organization—shared values, skills, style, and staff—provides a full
picture of the broader organizational ecosystem. Most important, shared values are at the center of the
framework. However, it does have limitations, one of which is that it describes only the current state. To close gaps, you must turn to other tools and techniques.

When and Where Is It Used?
Completing the 7-S framework is not an easy task. It requires a team with a strategic understanding of
the organization and access to the necessary data.
There are various uses for the framework. First and foremost is to get a clear description of the
current state of the organization to serve as a profile for ensuring that everyone understands the ecosys-
tem. It is also useful for identifying the root of performance issues within an organization, assessing or
monitoring changes when performing role alignment or structure changes, and implementing new strate-
gies that require a focus on people management instead of process and technology.
While the model provides a structured framework for the elements to consider, there are varying approaches for how to gather and document the data and assess the alignment between the elements. Additionally, there are no set tactics for consistently identifying each of the seven S's. However, websites such as mindtools.com provide a list of questions to ask about each S.
There are some cautions when using the framework. While it can be used to outline any level, it must be started at the senior-most organizational level; if it is used at lower levels as an alignment tool, it must "feed into" the top-level (Level 1) version. With some adaptation, it can also be useful to use the McKinsey 7-S framework in conjunction with the structural tension model, which focuses on the tension between the current reality and future goals and vision (Fritz 2011).
The McKinsey 7-S framework is helpful for talent professionals on two levels: First, it produces a clear profile of the organization, which provides an understanding of the business system. Second, because it
identifies current gaps in skills, staffing, and future capabilities, it can inform the learning strategy.
Because it is only internally focused, the group must have another process or tool to gather and use
external data. One simple tool suggested by Sipes is the SWOT analysis (Figure 5-8).

Figure 5-8. SWOT Analysis

Strengths (internal) | Opportunities (external)
Weaknesses (internal) | Threats (external)

Implementing SWOT Analysis


SWOT (strengths, weaknesses, opportunities, and threats) is an analytical technique that enables the user to gain an awareness of the organization's current internal capability and an assessment of the external forces that can positively or negatively affect the organization's progress toward a desired end state.

The SWOT analysis is important because it is an objective analysis of current factors, systems, processes,
products, and services in the organization. The information is used to identify the current competitive state
and includes:
• internal aspects that are working well (strengths) and contributing to successful results
• internal aspects that are not working well and can be improved upon (weaknesses)
• external forces (opportunities) that may have a positive influence and are potential sources
of strengths
• external forces (threats) that may have a negative influence and serve as a catalyst for
performance improvement.
Completing the SWOT analysis is a collaborative process to which employees representing all facets
of the organization can contribute. Large groups can use a digital survey, while small groups can use sticky
notes on a poster. The responses to the following four questions should be backed by data and trackable to
a source:
• What are the current strengths for the organization? (What are the internal successes?)
• What are the current weaknesses for the organization? (What are the major challenges?)
• What current and future external opportunities exist—including trends and changes in
technologies, policies, economics, and demographics—that can affect the organization? Look
at the strengths to see if they open up any opportunities. Alternatively, look at weaknesses and
determine if eliminating them could open new opportunities to be competitive.
• What are the current and future threats that could affect the organization, including competition from others and the trends and changes listed in the previous question?
Because each question will result in a list, the lists may need to be grouped into categories using tools such as affinity clustering, then sorted by priority or impact.
A SWOT can be used at the organizational, department, or team level, or anywhere in between.
However, it is important to be clear on the level for use as well as the desired end state. While the SWOT
is useful in conjunction with the McKinsey 7-S framework at the strategic level, it is also a stand-alone tool
that can be used as part of any analysis. This makes it a go-to tool for the business learning advisor.

Summary
As business learning advisors, we make decisions all the time—some are quick and easy, while others are
difficult and complex. However, understanding our options and how we make decisions is critical. In
our role to help the organization stay competitive, our specialty is creating an environment for building
capabilities across the organization. This starts with understanding the entire organizational system and
assessing needs across the ecosystem (such as organizational needs and gaps). These needs can be relat-
ed to building capabilities, but they could also be related to the design of the job, resources, communi-
cations, and so on. A variety of tools from many disciplines can help you make the best decisions.

Once we are confident that the need is related to training or learning, whether through upskilling or building new skills and behaviors, we can dig even deeper into what specific training is needed, and how and when to deliver it, using our deep understanding of learning theory to provide solutions. Again, there
are many tools.
When you are called on to assess needs or close a gap, what tools are in your portfolio? As we
discovered in this lab, there are many tools and techniques to choose from. What are your criteria for
deciding which tool or technique to use? Is it the time or cost involved? Is the decision based on ease
of use or the number of resources available? Or do you choose the tool that provides the most robust
information? Or is the decision simply based on familiarity?
One of the easiest ways to learn about new tools and techniques is benchmarking other companies. Understanding what tools are available, and how others use them to keep a pulse on internal capability within the organization, provides useful insights.

Key Takeaways
• To be strategic in the business learning advisor role, learning professionals must understand the entire organization and how it operates as a system.
• There are many tools and techniques to choose from for assessing needs, both at the systems level and the capability-building level. Using multiple tools and triangulating the information generally provides a clearer picture.
• Most tools and techniques require discipline to use and accountability to sustain.

Questions for Reflection and Further Action


1. Does your organization have an intentional strategy for continually building and upgrading
capabilities?

2. What tools for assessing the organization and training needs are in your portfolio?

3. How do you decide which tool to use?

4. Which of the tools from this chapter could you use to gain a deeper understanding of the current
capability of your organization?

6
Heeding the Call
Jerry Kaminski With MJ Hall

“An organization’s ability to learn, and translate that learning into action rapidly, is the ultimate competitive
advantage.” —Jack Welch

Ike, an internal performance coach for a large organization, has just received an email from Clara,
one of the company’s senior technical managers. In the email, Clara states that one of her project
teams is having huge performance problems. The team’s members are not meeting the milestone
schedule for Project XYZ and they need training. She wants Ike to conduct a two-and-a-half-day
team problem-solving training within the next month.
This request puts Ike in a bit of a bind. His current schedule is very full, he already has a new project on hold, and he's been waiting all year for his family beach vacation—which he leaves for in two weeks. However, Clara holds a higher rank in the organization and has influence with the C-suite.
Her project is critical to several organizational strategic goals and is also visible to the external clients.
As an experienced performance coach and trainer, Ike has taught many courses on team prob-
lem solving. After quickly reviewing the shelves behind his desk, he realizes he has six different work-
shops of varying lengths on team development skills, like problem solving, from previous sessions.
Ike pulls the notebooks from his shelf and starts reviewing the training modules. He puts a few
sticky notes along the edge of several pages. Then, Ike goes back to the computer, hits reply to Clara’s
email, and starts to write.

What Does Ike Say?
As we know, this is an age-old problem facing the talent profession. Organization development specialists,
consultants, and anyone else who helps with organizational challenges refer to this as the “1-800-TRAIN
effect.” A manager calls the training department or consultant and requests a training program to fix some
hot performance challenge. The manager wants the fix now, and they want it to be quick because they
cannot afford for employees to be offline due to their heavy workload.

Question for Thought: What else do you notice about Ike’s scenario?

The manager also wants it cheap because budgets are being cut, and an internally developed program
will save costs. Thus, the scenario unfolds with the manager telling the trainer what solution they want,
when they want it, and the amount of time the employees can be available for the training.
Fast. Cheap. On-demand. Oh, and high quality. It’s an all-too-familiar story.
This scenario typically plays out like this: Ike will respond to the request quickly and immediately
start offering ideas for implementing the manager’s proposed solution. But here’s the catch: He still doesn’t
know the real problem. This means that the manager, who is generally not trained in learning theory or
instructional design, is ordering a training solution for an undefined problem.
This scenario propagates the terms dipping or spray and pray as derogatory phrases for training. Ironically, trainers may find themselves being OK with this dipping approach because they also have busy schedules, may not be confident in the solution, and generally have a large portfolio of canned training products or modules at their disposal.

Question for Thought: If you were Ike, what would your email to Clara say?

On the flipside, custom design work that creates an environment where work and learning can thrive simultaneously is very effective. But it also takes time to assess, plan, design, and develop materials, including research to gather data on the unique situation.
This scenario and the 1-800-TRAIN effect can be compounded when the manager making the
request uses a “seagull management” approach—a joke Ken Blanchard makes in Leadership and the
One Minute Manager (1985): "Seagull managers fly in, make a lot of noise, dump on everyone, and fly out."
Ike sits up in his chair, takes a deep breath, and proofreads his email, which says:

Dear Clara,

My recommendation is for us to carve out time to meet this week so I can ensure I fully understand your
business need. I’d like to ask you some questions and adequately prepare for your training need. I want

72 | Chapter 6
to listen and fully process everything so I can make sure we are customizing this to get the results you’re
looking for. I know you’re very busy, as am I, but I’ll do what I can to work around your schedule to get
this meeting on the books ASAP.

Thanks,
Ike

He hits send and gets to work.

An Invitation to the Dance


With a seagull management approach, the requesting managers never have a clue that their management style is the issue. This manager type, and others who do not interact collaboratively with their workers, frequently expect the trainer to come in and "fix" their team.
The following story, shared by MJ Hall (2014), is a recollection from a colleague about his
exchange with his professor during graduate school:

As a new OD practitioner who was taught that training isn’t always the answer, I
struggled with how to get people to see a need from a systems perspective. It seemed
like I was always telling them “No!” when they requested training. One day, a professor
of management and organizational change and the President’s Teaching Scholar from
the University of Colorado’s College of Business and Administration was speaking at
my alma mater. Following the lecture, I spoke with him, and his words changed my
perspective forever. He said: “Training is an invitation to the dance.”
He went on to say that training was the only acceptable way managers could ask
for help in most organizations. MBA programs, business schools, and management
development programs all teach that training is one of the few things managers can
request. When managers are struggling and don’t know what to do, they often ask for
training. What they are really asking for is help.
Some may see it as a sign of weakness, failure, or lack of capability for a manager
(that is, a people leader) to ask for help in any area. However, it is even worse to say
there is a huge problem, and that you don’t have a clue what it is or how to go about
solving it. No one ever got in trouble for asking for training.
The colleague talked about methods he used once the invitation had been
extended to “turn the conversation.” He suggested that the trainer/business partner
agree to provide training but to continue the conversation and learn more about the
situation. Then, after a deeper conversation, they should ask the manager whether it would be OK to walk around and talk to a few people to make sure the training they are offering is the most appropriate. Walk with a very open mindset, looking at other factors that could help improve performance.
With the manager’s concurrence, the trainer/business partner is then able to
conduct a simplified performance assessment by observing actions and using questions
to gain further understanding of the issues from the employees’ perspective. Some
questions that are broad but also diagnostic are:
• How do you know when to start your job?
• What are you trying to achieve?
• What are the behaviors you want to demonstrate?
• What is working well and what is not?
• How do you know how things are working? What are you measuring?
• Is this a new skill or a major change from a current one?
• Are you having a good day or a bad day? How can you tell?
• Do you have what you need to do your job (such as tools, computers, software,
and resources)?
As the trainer/business partner walks around the work site, they should look
for visual cues that support or contradict what they are hearing. Then ask clarifying
questions about those cues. They should make sure their assessment is decisive and
quick! Then, they need to go back to the manager who requested the training and
share their findings and suggestions.

The “invitation to the dance” concept includes being open to conversations and discussions. This
openness requires deep listening, not surface listening. It requires open-mindedness and a willingness
to collaborate with the requester on a system of solutions that they will implement. It also requires
using your skill in learning theory and design, as well as your influence as a leader. You’ll need to help
others by providing support as they transition from one state to another.
This leadership component is critical, and it’s where Ike can provide greater direction and clarity
to Clara. Ike is best served to take the lead in this scenario, ask the initial questions, and then dig
deeper by asking probing questions that assess Clara's needs. Once he has gathered feedback from members of the organization, he will be equipped to offer an objective assessment of the best solution for his customer within the time constraints given.
Saying yes is the secret! By saying yes, you accept the invitation to the dance, which then allows
you to engage in a conversation. Once the conversation develops, you can go to the dance floor and
take a look around. Who knows? The manager may have asked you to teach the waltz, but you end
up discovering that the team is actually dancing to tango music. Given enough time and data from
your observations—and new understanding of the situation—you may end up being able to change
the music.

Turning the Conversation
Most organizations have a broad definition of training and, therefore, many initiatives will fit under the
training umbrella. Where individual performance is suspect or there is great variability in the perfor-
mance of individuals, training can be used to provide a level playing field for all. Training also provides
a forum for consistent communication, which makes it easier to respond yes to the invitation, because
training is a safe way for a manager to ask for help. Joe Harless used to say something like this: Whatever
you provide, don’t be afraid to call it training—they asked for training and you provide training. While
not clinically accurate all the time, if you are providing what they ask for, you’ll get much further and
they will begin to trust you.
Other ideas exist for "turning the conversation," but before acting on them, the person providing the services (the learning business partner) needs to have clarity on these basic assumptions:
• The overarching focus should be helping employees achieve the organization’s
business goals.
• The totality of the work/learning ecosystem needs to be considered. This includes the
work itself, the performance needs of the employees, and the influence, constraints, and
limitations of the workplace (for example, environment, infrastructure, culture, and
support mechanisms). Only when working within the system will the real problems
come to light.
• Approach this as a "partnership" with the requester. While you are there as a learning professional, your role is to help the business partner. Another thing Joe Harless used to say was: do the work, give the credit for success to the business, and step away. In the end,
success lies with the business, not the learning function.
Some other questions for turning the conversation from a specific training request to analyzing system
issues include:
• If we were to be successful at helping you here, what changes would you see?
• What do you see as the greatest barriers to success?
• What pressures are you under? What are the sources of these pressures?
• What immediate change would you like to see for your team based on our work?
• Currently, what are the successes for the team?
• What are the employees saying about their problems with meeting the schedule?
• What are competitors doing about this issue? Do solutions already exist for similar issues or problems?
• How does this proposed training support the organization’s strategic goals or initiatives?
• What other changes aside from training do you think would be helpful?
• When was the last time this team had training? Did it make a difference?
• How can we work together to increase the stickiness of the training?



Dear Clara,

Thank you for your time on Thursday afternoon. I have a few follow-up questions that came from our conversation.
We talked a bit about defining success and goals for your group on this project—and future projects. I wanted to
ask more specific questions around culture and fit.

Do you feel like you have the right people in the right positions now?

How much time would you like me to devote in our training toward ways to improve culture and engagement in
an effort to boost performance?

Thanks,
Ike

Once you’ve turned the conversation, it’s time to get your hands dirty and dig into the analysis. You
need to determine the root of the problem so you can best address your customer's needs. The best place to
start is with front-end analysis.

Assessing Requests Using Front-End Analysis


Front-end analysis (FEA) is the starting point in the process where you begin to figure out what the problem
is and what you as a learning and talent professional need to do. This is where the magic begins—especial-
ly for Ike as he gets going on his training request! Credited to Joe Harless and Robert Mager, this concept
focuses on moving from a current state to a future state of excellent performance. FEA is a component of
the human performance technology (HPT) model, which is a systematic approach to improving produc-
tivity and competence (Figure 6-1). It is a series of multiple analyses to investigate various components:
• the current stated problem
• the job or work to be completed
• the tasks to be completed
• the needs of the business
• the training audience and even the environment.
The goal of FEA is to find out all details of the current state, as well as the details of the desired
performance of the future state. The FEA becomes the blueprint for analyzing the current situation to
determine requirements needed for performance. When done properly and early enough, it will help
identify the root cause of the problem, save time and money, and possibly pinpoint what Allison Rossett
(1999) calls solution systems—integrated and cross-functional approaches to solving problems and realiz-
ing opportunities.
It also identifies alternatives to delivering training. Completion of the FEA is normally done by the
instructional designer, subject matter experts, instructional design manager, or in some organizations a client
relationship manager (CRM). The CRM’s role is to help complete the intake process (requests) for training.

Figure 6-1. Instructional Design Front-End Analysis Flow

When you think of training, you are looking predominantly at the lack of skills or knowledge for a
defined performance. However, by taking a training request at face value, you may end up developing
training to solve a problem that really doesn’t exist. Using front-end analysis requires a consistent and stan-
dardized process that focuses on the real cause of the problem you have been asked to solve. From there,
you can provide solutions across the entire spectrum of performance-related issues.
Harless and Mager’s FEA model allows you to check the three major reasons for a perfor-
mance issue: skills and knowledge (the only issue training can resolve), tools and environment, and
motivation and incentive. This process is flexible in nature but helps pinpoint solutions that may
be necessary for feedback, policy and procedures, resources, incentives, capacity, motivation, and
skills and knowledge.
When completing the front-end analysis, there are generally six goals that should be achieved:
1. Isolate performance problems that have potentially high “worth.”
2. Isolate the precise performance deficiencies within the problem area that account for the
greatest loss.
3. Increase the probability that the solution to a given problem is effective by matching the cause
of the problem to the appropriate type of remedy.
4. Increase the probability that the selected solution subclass is the most cost effective.
5. Isolate the root cause of the performance problem rather than the symptoms or effects of the problem.
6. Increase the probability of a match between the precise performance deficiency and the
individuals who have the deficiency.
Using a front-end analysis flow chart or decision tree, group questions into six categories to help
further identify possible solutions:
• What’s the current problem?
• Is it worth solving?
• Can we apply fast fixes?
• Are consequences appropriate?
• Do they already know how?
• Are there more causes?
Use of the flowchart is not overly complicated or hard, but it does require discipline and consistent
use to make sure it works. The project manager should be familiar with the flowchart and use it as the
guide for the intake meeting or front-end analysis kickoff. The intent is not to share the flowchart with the
client to use as a list, but to walk through it verbally during the intake meeting. While it is simple enough
that it seems like the client could complete the flowchart, there may be halo effects because they are
already focused on training as the solution and may not see the other factors. Asking these questions and
using your own experience will help you narrow the root cause of the performance deficiency and any
appropriate solutions. This is invaluable and well worth your time.

Assessing Requests Using Gilbert’s Grid


Many years ago, Stephen Covey coined this truism: "Begin with the end in mind." While it's frequently
used for project planning, it is also appropriate for learning strategists working with business partners to
design solutions for performance enhancements. The starting point is always focused on what the partici-
pants in the learning experience need to do.
As learning professionals, we’re always willing to entertain new ideas that help us support the orga-
nization’s performance. Conrad Goffredson and Bob Mosher’s Five Moments of Need is a powerful
method that helps determine what content to develop, how and when it can be taught, and how to get
started. Forward-thinking learning professionals are now assuming the role of business partners, which
means thinking in terms of collaborative partnerships to co-create a solution based on the assessed needs,
and within the context of the work environment. The first conversation with the customer can set the
tone for the direction of the solution—and, ultimately, the ability for that solution to make an impact and
deliver the required results.
So, what is the end state? Very simply, the end state is performance-related behaviors by employees. Given this, what do participants need to do differently to achieve the desired results? Do they need training, resources, coaching, or structure? While the learning department might have a
large repertoire of training modules available in a variety of formats, training, per se, may not be
the need.
Let’s look back to Clara and Ike. Clara stated that her team had performance issues and wasn’t
meeting its project milestones. The request was for problem-solving training. Should Ike continue on
the path of providing this training given a successful FEA?
How do you, the learning professional, work with your customers as partners for figuring out
the best solution, regardless of the need? One method is starting the conversation using questions
based on Thomas Gilbert’s behavior engineering model, which is generally called Gilbert’s Grid.
This method looks at the total environment—what the organization provides and what the individual brings to the table. This is always the best option when you're looking to hit your mark at the end of a project,
particularly when you have time and resources at your disposal. In Ike’s case, he will need to deter-
mine whether he has the time, given the constraints he’s working with.
The success of using the behavior engineering model lies in classifying the factors behind the requested behavior change into six categories:
• expectations and feedback
• tools and resources
• incentives (consequences and rewards)
• skills and knowledge
• individual capacity
• motivation.
You may need to provide multiple solutions across all areas to make sure performance improves.
Mager’s and Gilbert’s models have many similarities in that they both look at the entire system of
factors and aim to provide solutions across each. While the instructional designer focuses primarily
on the skills and knowledge category (learning only), it is imperative during the intake assessment for
the designer to explore all six categories and identify possible solutions for each. By only focusing on
one category, you may miss the bigger issues surrounding the deficit performance. In most instances,
you have to address all six categories to ensure a successful outcome (Figure 6-2).

Question for Thought: With limited data, what might you see as possible solutions within Gilbert’s six boxes for
Clara’s request?

There are no shortcuts; by skipping steps, you set yourself up for failure. Even if a situation
seems to demand training, it is important to go through the whole analysis. As a result, the biggest
challenge you’ll face is gaining buy-in to do the work. Unfortunately, this is where most organizations
lose out—they’re not willing to put in the time to do it right.



Although you may be tempted to jump-start the training, you want to fully examine the performance issue and review each training need. The key is to be thorough. If you're not, you may end up developing training that doesn't satisfy the organization's performance needs.

Figure 6-2. Gilbert’s Behavioral Engineering Model

Environment (Management System)
1. Expectation and Feedback (Information)
• Does the individual know what is expected of them?
• Do people know how well they're performing?
• Are people given guidance about their performance?
2. Tools and Resources (Instrumentation)
• Do people have the right tools for performance?
• Are tools and materials designed to match the human factors of performance?
3. Incentives: Consequences and Rewards (Motivation)
• Are adequate financial incentives that are contingent upon performance available?
• Are non-monetary incentives available?
• Are career development opportunities available?

Individual
4. Skills and Knowledge (Information)
• Do people have the skills and knowledge needed to perform as expected?
• Is well-designed training that matches the performance requirements available?
5. Individual Capacity (Instrumentation)
• Is performance scheduled for times when people are at their best?
• Do people have the aptitude and physical ability to perform the job?
6. Motivation (Motivation)
• Are people willing to work for the incentives?
• Are people recruited to match the realities of the job?

Gilbert’s Grid can be used as a framework to gather information about performance problems and
analyze them for solutions. The questions can either be given to the customer or they can be asked as part
of the first conversation. Like all collaborative discussions, it should provide information to both parties.
Gilbert-related questions can also be used to guide the thinking around the solution. But more important,
the framework encourages managers to be more engaged and involved with the solution process.

Annual Assessments
In addition to an invitation from organizational managers and incorporating the FEA and Gilbert’s Grid,
many organizations use an annual needs assessment to see what new training programs are required for
the coming year. This can be accomplished through a survey of users, a survey of managers, a review of
performance management data (such as individual development plans), and observations or assessments
of new programs, processes, or systems. Conducting an annual needs assessment is a good method of
taking the pulse of the organization, while also allowing the business to have a say in the work of the
training function. This is also a critical component of involving your business partners in decision making and in the assessment process.

The downside is that you may get a wish list approach to development. When using a survey, partic-
ipants could list their wants rather than their needs for training. While the data collected is valuable, I’ve
learned from experience that it’s imperative to run things through your needs assessment process to ensure
that you are hitting the mark.

Learning Request Forms


For mature organizations, a formal learning request form or system is recommended. This intake process
allows you to track all requests and collect considerable data at the onset of the project. While a learning
request form is similar to a direct request, it formalizes the process and puts responsibility on the requester
to provide additional information the learning function can use to assess the request. It also allows you
to better track your workload, increase your project management capability, and better leverage your
resources with prioritization.
The more complete your request form, the more assessment data you have to start the process. Ques-
tions you might ask include:
• What is the project type (for example, a new project, revision, or project assistance)?
• Provide the requester name and department.
• Give a short description of the request.
• What problem are you trying to solve?
• Is there already a course, or do you know of any existing solutions?
• How many people are affected by this request (target audience numbers)?
• What is the primary work group affected by this request?
• When is the project needed by?
• What are the risks of not delivering the training by the targeted date?
• Will regular revisions be required?
• Does this affect qualifications (skills imparted)?
• Are subject matter experts available to work on this project? Names?
• Is there budget for this request?
Once the learning function receives the request, it can provide data to add to the form to make it
more complete. This might include:
• Was initial contact made with the requester? When?
• Are there any unique needs or considerations for this request?
• What is the desired delivery method?
• Is this a new delivery or new development? Both?
• What are the roles or job titles affected by this request?
• What is the subject area for this request?
• Is there executive approval for this request?
• Is there a vendor solution for this request?
• What is the customer impact for this request? Internal? External? Both?
• Does the project require new technology or technology known in the industry but new
to the company?
• Does the project have a visible influence on a large number of customers, business
partners, or the community (for example, will there be a large number of learners)?
• Will the project lead to a large amount of change in the company?
• Who is assigned to the project?
• What is the estimated project start date?
While you can take on projects without one, the learning request form helps formalize the request,
capture information at the onset of the project, and track the history of your work. It also serves as
the starting point for a detailed analysis report or design document. Often, the learning request form
helps jump-start the FEA process before the intake meeting.
Each organization should review these questions, select the ones that best match its needs, and
then develop its own form. The more complete the form, the more you have to start the project with and
the more detailed information you have for the intake assessment.

Using ADDIE and SAM


Instructional designers who follow a standardized process such as ADDIE (analyze, design, develop, imple-
ment, and evaluate) or SAM (successive approximation model) conduct their intake analysis using either the
performance engineering model or front-end analysis model in both their decision-making and needs anal-
ysis processes. These models are standard tools in the instructional design process that frame the outcome
of the needs analysis and give clear direction on a set of solutions that will best meet the needs of the
requester. Design thinking is used as part of the thought process and helps guide decision making.
ADDIE is best used for more detailed, thorough approaches where time is not as much of a factor.
It is akin in project management parlance to using the Waterfall approach. SAM is a more iterative, Agile
approach that is better suited for situations where time is at a premium. That said, ADDIE is generally used
in some capacity.
A word of caution—your requester will likely believe they have already given you the solution in
their request and even the timing of the delivery. It is incumbent on the designer to show the value of the assessment to the requester and to explain the assessment process. While cost is not the defining
component, it comes into play when selecting a solution and determining whether to proceed.

Summary
Like people, organizations have challenges and want them solved—fast. For years, training has been the
go-to solution for many (and any) problems. It's not uncommon to develop great training, only to realize
too late that it was the wrong solution. With limited time and resources—and more advanced capability
to integrate learning assets—it is imperative that we know the true root cause of the challenges employees
face and address them accordingly. Determining the type of dance will only happen after you accept the
invitation and ask many questions.
Front-end analysis and Gilbert’s Grid are excellent tools for digging deeper—in a manner that is
collaborative and leads to a logical conclusion. Focusing on the real performance need and determining
how to arrive at the best solution while balancing limited time, resources, and people adds value to both the
learning experiences you develop and the organization.
The FEA decision tree is a thorough and specific tool that gives even those new to the field the ability
to discern the real issues. Gilbert's Grid can be used after the decision tree to classify the output. These
tools can be combined and customized to enable the learning and talent staff to make the best choices for
their unique situation.
Be willing to understand your client’s needs and ask tough questions. This will earn you respect
and give you the answers you need to help solve the problem. No matter the time, resource, or cost
constraints, you must try to see things as objectively as you can and be prepared to address
customer needs.
As training and learning professionals, we should always heed the call. Rising up to meet the chal-
lenge is the mark of a great leader. As Ike showed us, once the invitation to the dance is extended, we
should always be willing to say, “Yes!”
So, what did Ike do?
He completed the FEA and found many possible solutions, including training on problem solving.
He also found that there was a lack of solid procedures, missing tools, and a need for project management
training. He helped Clara define the procedures and secure new tools, and then he delivered a day of
problem-solving training and a day of project management training. Clara's team successfully completed
the project ahead of schedule and under budget. She now routinely comes to Ike—well in advance—to have
him conduct an FEA at the start of each new project. It was a major win for all involved.

Key Takeaways
• Your value to the organization as a learning and talent development professional is having the
expertise, tools, and techniques to create environments that enable employees to continually get
better in their evolving work roles. If a manager comes to you and says, "Our performance is
down—we need a training solution!" your response should always be, "Please let us know how we
can help and support you in doing your work more effectively and efficiently."
• The 1-800-TRAIN call is simply an invitation to the dance. Saying yes means you are opening
the door to deeper conversations and opportunities. It allows entry into the requester’s part of the
organization—and possibly, the opportunity to change the dance or even the music.

• Actually observing and talking to employees is time-consuming but makes the next conversation
with the requesting manager much richer and more authentic. And more accurate information
from employees should result in better, more targeted solutions.
• The front-end analysis (FEA) decision tree is a formal tool using detailed questions that flow
from one to another. Once the analysis has been completed, the output can be put into Gilbert’s
six-box grid to confirm whether the need is training—or another issue. It provides clarity on
your suggested solution and offers the requesting manager a bigger perspective to consider.
• Provide both internal and external training on the models; as a manager, get actively involved
in training your teams to do the work. Start with some team-building activities to ensure
they have a solid understanding and are able to confidently deliver the solution for the client.
Understanding the processes and working together helps build stronger teams and prepares
employees to deliver the desired performance.

Questions for Reflection and Further Action


1. What squares with your thinking about “heeding the call” and the “invitation to the dance”
concepts?

2. What is your current practice when you receive a 1-800-TRAIN call?

3. What tools or techniques does your learning team use to determine whether the issue is a performance issue
that can be fixed through training?

4. Once a performance issue is determined to be a knowledge or skills gap, what processes are in place
in your organization to drill down on the specific competency or capability to focus on for designing
training?

7
Leveraging Neuroscience in
Learning Design
Leanne Drennan, Casey Garhart, and Joan McKernan

At its core, corporate learning is about changing behaviors for the purpose of accomplishing organiza-
tional goals. As we sometimes say, we want people to do stuff, not just know stuff. But let’s be clear—
knowing stuff is critical. To do stuff, you need to know stuff. More specifically, before they can take
action on the new information, people need to be able to recall that new information.
To allow for later retrieval, learning content needs to be stickier and more durable—so it can
move from short-term memory to long-term memory via neural networks. And, because our long-
term memory can hold a virtually unlimited amount of information, we also need to be able to find that specific
piece of information again. This is the job of the brain’s hippocampus region.
Let’s look at it another way. If, in our homes, we simply put away items in the first empty space we
could find, it would be almost impossible to find a pot when we needed one. So, we put all the kitchen
things in one room, and then within that room we store pots in one place and dishes in another. Like-
wise, the hippocampus uses pattern recognition to store new information with what it considers to be
similar information. These “storage bins,” or neural schema, need to be connected through neural
pathways. The more a pathway is used, the stronger it becomes. And the more pathways between
and among schema, the easier it is to find information. So, the more you retrieve it, the easier it is to
access later.

“As neural networks become more complex and interconnected, thus providing more options for
interpreting and reinterpreting experience, the brain comes to know in more complex ways” (Taylor
and Marienau 2016). When schema are connected through strong networks or were created at the
same time, they can trigger one another. Additionally, the hippocampus is part of the limbic
system (the brain’s emotional processor), so the connections between memories and emotions can
be very strong. That’s why the smell of baking cookies can trigger memories of your grandmother.
Fortunately, there is much research available on how we can push learning into long-term
memory to improve retrieval. This chapter summarizes this research and provides suggestions for
making your learning content durable and sticky.

Leverage Neural Networks: The Science


Neural networks connect all the bits of information we know and make it possible to find that infor-
mation once it is stored. As with highways, the bigger the pathway and the more frequently it is
used, the easier it will be to get where you're going. A poorly marked dirt road is more difficult to
navigate than a superhighway, and in the same way, the stronger the neural network, the easier it
is to retrieve the information.

What It Means for Instructional Design


To facilitate learning, we need to maximize the use of these neural networks and schema. We also
need to use existing pathways and strengthen pathways that aid retrieval of information.
One way we can do this is by connecting new information to what people already know,
especially if that knowledge already has strong neural networks leading to it. This utilizes strong
networks and helps to continue strengthening existing networks and pathways. At the same time, we
need to intentionally design our learning content so that learners build new pathways among their
schema. The more connections there are, the more likely they will be able to retrieve information.
The hippocampus can more easily determine where to store information if the appropriate
areas are activated at the start of the learning program. It also helps if the relevant structure of the
content is visible, which will ensure it’s stored with related information and patterns. In addition
to the knowledge and information aspects of learning, memory and retrieval increase when we
create emotional connections. These can be new emotional experiences or connections to existing
emotional memories. Finally, the strength and number of neural connections increase when we
create space for individual insights to occur.

Things to Try
Accessing and building neural networks should be done throughout the learning experience. More
than 50 years ago, David Ausubel recommended the use of advance organizers to bridge the gap
between new material and existing knowledge. Today we understand more about how and why
tools such as advance organizers work. At the beginning of a learning experience, it is valuable
to get participants thinking about relevant topics they already know something about. Activating
these neural schema provides places to connect the new information. Here are some ways you can
do this:
• At the beginning of a lesson or module, ask one or two priming questions. While learners
don’t need to know the answers and should not be evaluated, the questions should pique
their interest. They should be general enough that they don’t feel like trick questions, but
specific enough to trigger the appropriate schema. Feedback about the questions can help
focus the learning, as well as provide an interesting nugget of information that makes the
learner curious to know more.
• In a live learning event, create icebreakers that are connected to the upcoming learning
content. For example, instead of just having learners get to know one another, get them to
activate their own neural networks by asking questions or telling stories that are relevant
to the topic at hand.
• Start the learning program with a story or metaphor that participants will be able
to relate to. If the session participants represent a variety of age groups, make sure
the metaphor isn’t specific to one generation and unknown to the others. This is also
important with global audiences. In some cases, it may be worthwhile to use multiple
metaphors or stories to ensure that the learners have the relevant schema to begin with.
For instance, references to a Bollywood movie might be very successful with an Indian
audience, while leaving North Americans at a loss for context. The point is to connect to
schema that are already strong, not to build a new schema.
During the learning experience, continue to help learners build multiple neural pathways by
making the context clear, helping identify how the new information fits with their existing neural
patterns, and activating existing pathways to strengthen connections. The key to learning that
sticks is the ability to easily find the information later. Try these two techniques:
• Use metaphors and analogies to help people make strong connections to schema that are
already well developed and connected.
• Use a range of topics to help maximize opportunities to connect the dots. Give learners a
chance to tell their own stories to ensure the connections are meaningful for them. This
is especially important in a global organization, where learners likely have very different
backgrounds.

Focus on Focus: The Science


Attention or focus on a topic is needed for the hippocampus to start working and begin encoding memories.

In addition, research has shown that the brain (hippocampus) can focus only on a single item at a
time and then for only about 15 to 20 minutes. After that, the brain will begin to tire and shift its
attention to something else.

What It Means for Instructional Design


We need to focus on a single chunk of learning content for no more than 15 to 20 minutes, and make
sure it is relevant to the audience. After the learning chunk, we can do one of two things:
• Allow the brain to rest—such as scheduling a break.
• Recapture the brain’s attention on the topic by using a
different activity to allow for processing.

What to Try
At the start of our learning programs, we often ask learners to be present and focused. We request
that they put away their laptops and phones, so they won’t be tempted to multitask. (This is not new,
but the neuroscience supports its importance. You could even explain the neuroscience behind why
multitasking doesn’t work. Keep it simple and on point.) Perhaps start your programs with mindful-
ness activities to help learners focus.
Try to design learning content in 15- to 20-minute intervals. Make no mistake, this can be a
challenge, especially with overzealous presenters! After presenting a learning chunk, look for ways
to follow it up with an activity that recaptures the brain’s attention and reinforces the informa-
tion—such as individual reflection using a journal, table discussions, or design thinking activities.
So, although the topic might be the same, the next chunk of learning is experienced differently. Be
creative to regain the attention of your learners’ brains!
In one of our leadership programs for up-and-coming executives, we created a novel space for
breaks called the Discovery Zone. It was a separate room or area away from the classroom, where partici-
pants could go play games, work on puzzles, read materials, watch videos, or have discussions with
one another in a coffeehouse setting. To improve the experience, breaks were extended to at least 30
minutes. This was a favorite of the learners.
In another program, we broke a topic into shorter segments so learners could focus, creating
a multistep practice exercise for sellers to build client-centered value statements. In the first five
minutes, the facilitator shared the elements of a good value statement. Then, in small groups, learn-
ers spent five minutes creating a value statement. In the next step, each group read out their value
statement and received feedback from the facilitator and the other learners. The learners then got
another five minutes to update their value statements. Finally, each group shared their new value
statement and explored what changes led to the improved statements. While this exercise lasted a full
hour or more, the learners’ attention was recaptured every five to seven minutes.

Include Space: The Science
Research has shown that retention is better when learning is spaced over time instead of one condensed
period. This gives the brain time to absorb the information and build neural connections through infor-
mation retrieval. This means that, over time, spacing can improve retrieval and long-term memory. In
addition, adding a night’s sleep as part of the spacing process increases retention because memories move
into long-term memory during sleep.

What It Means for Instructional Design


No more consecutive days of jam-packed agendas! We need to include spacing when we're
designing learning programs. Incorporating any kind of spacing is better than having none at all—
even if it's just a few minutes. And incorporating spacing between retrieval activities is even better.
So, design your program's spacing within a session, between sessions, or overnight to make the most
of those zzzzzs!
Why? Because spacing, including sleep, helps maximize long-term memory formation so learners can
recall what they learn today well into the future!

Things to Try
When designing a training program, try incorporating some of these techniques:
• Include any kind of spacing.
• Ensure learners get ample breaks between sessions; consider having more breaks throughout
the day or making existing breaks longer.
• Consider an extended format—instead of a multiday program with a packed agenda, break it
up into a series of half days. For bigger impact, include retrieval activities between the series.
For example, at the start of day 2, give a quick quiz or activity to get learners to retrieve the
information they learned the day before. Breaking up the learning over a period of days also
allows for sleep, which is even better for long-term retention!
• Consider spacing out the learning content to include prework and post-work topics. For bigger
impact, add retrieval activities, such as leading a game on the prework content at the start of
the class.
Here are some more examples of how to add spacing into your learning programs:
• For multiday classes, we usually start each day with a debrief or retrospective. We cover any
questions participants have, what topics resonated with them, and ideas they came up with for
how to apply concepts on the job. Doing this has a greater impact because we are combining
spacing, sleep, and retrieval activities.
• In some leadership programs, we follow up with post-work composed of a series of simple
messages asking the learner a few questions about what they learned and how they are
applying it on the job. The post-work messaging is sent out once a week or every two weeks for
a specific period of time. Again, spacing is combined with sleeping and retrieval activities for
better retention.

Foster Insights: The Science


The good news about strong schema and neural pathways is that they reduce the brain’s workload when
recalling information and executing behaviors. The bad news is that strong neural pathways can also
make learning new information and changing our behavior more difficult. Just think of those paths in
the woods. If you have been using the same path for years, going in a new direction off the beaten path is
going to be difficult, even scary.

What It Means for Instructional Design


When we share information with people without also asking them to practice or reinforce it, or
giving them the opportunity to integrate new information into their personal schemas, we create the
awareness but not the memory required for behavior change. People remember what they discover
for themselves.
Whenever possible, let people discover things on their own. A learner’s previous experience and bias
will shape how they receive the information provided in the learning program and whether they will apply
the information in their work. We cannot design for every single perspective, but we can provide space and
opportunity for them to integrate new knowledge into their own experiences.
Certainly, learners need to gain knowledge about something before they can figure out how to use
or apply that knowledge. To help them generate their own insights, move beyond knowledge transfer.
Intentionally decide where in the learning experience knowledge gain ends and application and inte-
gration starts.

What to Try
Insight can be as simple as an effective debrief question like, “How would you use this concept in your
work?” Or it can be as complex has having learners develop their own model. You can:
• Share the elements of a new programing language, then let learners program something.
• Share a new negotiation model, then have learners share their perspectives of its benefits
instead of having the instructor state them.
• Use the expertise in the room. Learners often have valuable, related experience, so allow them
to share what they've learned or done. Then supplement that with any content you want them
to know that they haven't mentioned.
• Have learners build a sales process in teams first, then share the desired process and highlight
the similarities between the two.

• Do pre- and post-polls so people can see the shift in their own and others’ perspectives.
• Put learners in pairs and have them explain a concept to each other. Make sure they
discuss the impact the new concept would have on their work.
• Provide time and a tool for individual reflection. Include questions or guidance so they
intentionally focus on what the new information means to them or their work.
Here are two real-life examples of design allowing space for insight:
• We have a workshop that introduces recent university graduates to critical selling skills.
The workshop begins with small groups rating the performance objectives of a six-month
program on two factors: difficulty and importance. During the workshop, they actively
practice performance objectives like leading client conversations and handling objections.
Then, at the end of the workshop, they rate the objectives again. The workshop facilitator
debriefs based on the rating changes. In the pre-workshop assessment, participants
often rate “asking effective questions” as important and not difficult. However, once
they practice a few assessed client conversations, their perception of the importance
increases, and the difficulty rating goes way up. The two rating sessions separated by the
opportunity to practice allow new sellers to develop their own insights about what makes
a successful sales professional, and open them up to further learning and development
opportunities.
• Research conducted before the design of a new leadership program revealed that time
management was an issue for the target audience. Because the audience was experienced
professionals, we didn’t feel comfortable providing basic time management content.
Instead, we started the time management model by providing general areas where
managers spend their time. In small groups, the managers developed a pie chart to show
how a high-performing manager would allocate their time. Then, we compared their
models as a large group, discussing the similarities and differences required by different
organizational demands. Managers were then asked to look at their work calendars and
create a pie chart depicting where they actually spent their time in a typical week within
the last month. With their own reality in front of them, they gained immediate insight into the
changes they needed to make to match their definition of how good managers allocate
their time. They were engaged. They were more committed to changing their behavior
because they each reached their own meaningful insight.

Consider Emotions: The Science


The hippocampus and amygdala regions of the brain are both essential for learning. The hippocampus
is primarily involved in learning, navigation, and memory, while the amygdala is the brain's
emotional processing center. Together, the two regions take
in information and move it into long-term memory, storing the emotional context along with the memory.
When the amygdala is aroused, it automatically turns on the hippocampus.

What It Means for Instructional Design


Certain levels of emotional arousal increase attention, help learning content be more memorable, and
enhance retention. Emotional content grabs the attention of the learner and helps focus that attention. It
then signals the brain that the content is salient, ensuring it is stored effectively. This is particularly import-
ant with behavioral change programs.
Not all emotional arousal is equal, however. Extreme emotions, whether positive or negative, can
undermine learning by crowding out our ability to attend to anything else and creating a lack
of focus. Exercises that remove the fear of failure are especially important for facilitating learning
programs. The best emotions for learning are in the middle—not too exciting and not too threatening.
In this range, both positive and negative emotions can aid retention, although using negative emotions
can be dangerous. What seems like a mild negative stimulus to some can be interpreted more strongly
by others. Mild positive emotions are more likely to achieve their goal with a wider range of learners
and have been shown to aid creativity, collaboration, and insight.
Understanding, making meaning, and problem solving are all key components of learning, but
emotion is also essential. The emotional component is what enables us to make evaluative decisions
based on what we know.

What to Try
Connecting to emotions to enhance learning can be twofold. On the one hand, you want to decrease
any negative emotions and stress that participants bring to the learning event; on the other hand, you
want to create positive emotions that help with later retrieval. Try some of these techniques:
• Help reduce negative stress at the beginning of a session by letting people name their
distractions and concerns. People may be bringing concerns from their personal lives, but
they may also be worried about work that isn’t getting done while they are in the class. If
learners are not comfortable sharing their concerns with the group, you can ask them to
write down what they’re worried about and then put it away. Just naming concerns and
fears can help reduce their significance.
• Light competition can stimulate emotions, as long as the pressure to “win” doesn’t become
stressful. In online courses we often employ Jeopardy-like games where learners can test
their knowledge and try to achieve higher scores by answering more difficult questions.
Their fear of failure is minimized because we allow them to retake the knowledge checks.
However, it is important that these games don’t seem cheesy or childish because that will
produce counterproductive emotions.

• During the learning experience, create situations where people can connect with others
and experience emotional resonance. This can be more difficult in online courses than in
face-to-face classes. In a course for technicians that outlined the importance of following
a set of procedures to get servers back online quickly, we used scenarios depicting real
consequences for people who were depending on those servers. This allowed the learners
to relate to the situation and resulted in a greater commitment to solving problems for
other people. The session was no longer just about remembering a set of random steps to
make the hardware work.
• Challenging the learner increases emotion and helps increase retention, but don’t go too
far. To build technical and conversation skills, sales professionals need to practice client
conversations with subject matter experts in a workshop. For early career professionals,
this can be highly stressful. To help balance between participants being under challenged
and overly stressed, the facilitators can focus on the growth opportunities and penalty-free
environment a practice session offers.

Summary
Incorporating neuroscience concepts into learning designs for business builds robust neural networks
that facilitate better retention and retrieval, driving behavior change and business impact.
As we learn more about how the brain works, we learn how to use the brain’s cognitive processes
to enhance learning, rather than fighting against its natural impulses.

Key Takeaways
Here are several ways to leverage the brain’s cognitive processes:
• Focus on the learner first, then the content. Neural networks, focus, space, insight, and emotions
are all about the learner, their experience, and how well that experience promotes the retention
of new knowledge.
• Honor the learners' current schemas as you design the experiences to build new ones. Provide
time and space for learners to integrate these new ideas into their personal schemas.
• Consider neuroscience principles in every delivery method. Neural networks, focus, space,
insight, and emotions influence everyone whether they’re in a standard classroom, live virtual
classroom, social learning, or online self-study.
• Find the right balance as you design. No learning experience can be all about emotion or focus
or insight. Just as leveraging emotion needs to find the right balance between understimulation
and overstimulation, the program designs should be integrated and balanced with other learning
design techniques.

Questions for Reflection and Further Action
1. What potential perspectives are your participants bringing to this learning experience?

2. At what points in the learning experience can you leverage neural networks, focus, space, insight, or
emotions to increase retention?

3. Where does the learning experience call for a change in perception or belief? How can you leverage
activities that address neural networks, focus, space, insight, or emotions to foster behavior change?

4. Which of these learning design suggestions will work in your organization? How might you use these
ideas to learn more about the neuroscience of learning?

8
Iteration, Not Perfection
Suzanne Frawley

Dee recently joined a midsize organization to lead the talent development team. One of the goals her
team committed to delivering this year was a foundational management program for new leaders who
have direct reports for the first time. The need for this program was based on feedback from senior leaders
and department leaders within the organization, observations from HR business partners, and input from a
longtime member of the talent development team.
To better understand the big picture, as well as the organization’s culture, leaders, and employees,
Dee met with business leaders across the organization. The HR business partners told Dee they were
spending too much time investigating employee complaints and coaching leaders on basic management
skills. They asked Dee and her team to immediately begin working with new leaders who were early in
their careers. To help speed things along, the business partners drafted an agenda for a five-day, face-to-
face training program for Dee and her team to implement.
Dee also found out the following:
• An operations executive was considering a third-party vendor to train his leaders in ownership
and accountability.
• New leaders were frustrated in their roles—unsure of what was expected of them and how
they were supposed to do their jobs—and felt they had been “thrown to the wolves.”
• Training for new leaders had never been offered within the organization.
• Exit interviews showed that employees believed they were not being coached or developed,
received little to no feedback, and their work was not appreciated.

From prior experience, Dee knew the benefits of building the skills of new leaders. The orga-
nization was growing, and increasing new leaders’ capabilities would contribute to the growth
of individuals and company performance. With the current competitive landscape, retaining and
developing internal talent was a primary focus for HR and the talent development team. Dee also
knew that training new leaders to feel competent and confident in their roles would take more than
five days. However, in speaking with the business leaders, she learned that four or more consecutive
days in a face-to-face training would not be possible due to business needs, potential downtime,
and travel costs for 60 percent of the participants. Operations stressed they could not afford to miss
growth opportunities or production targets. For the remaining new leaders who occupied various
roles in the corporate office, time away from a heavy work schedule was also a concern.
Dee knew that if new and existing leaders were not involved in developing the program she
proposed, it would not be endorsed, attended, or reinforced. She estimated that to effectively and
efficiently deliver a pilot and launch a sustainable program would take six to eight months.

Questions for reflection:


• What is the best approach for Dee and her team to take?
• How can she involve the business leaders and reduce the temptation for departments to do their own thing?
• Rather than lead a horse to water, how can Dee and her team make people thirsty?
• How can Dee and her team demonstrate the value they bring to the organization?

Where to Begin? Big Project, High Visibility, Critical to Business Success


With all the advice, predetermined solutions, suggestions, timelines, and skepticism from the business
side, it can be overwhelming for talent development professionals to know where and how to start.
While it is tempting to jump right into program design and development, Dee knew that it was more
important to first try to understand the challenges of a new leader. Her TD team needed to walk in the
shoes of a new leader to learn the demands of the business and what senior leaders expected of them.

The Process
Dee understood that the business was not concerned with how her team designed, developed, or delivered
the leadership development programs—nor was it appreciative of the language that talent professionals
use—it simply wanted to meet its business goals. So, she began the project by meeting with her team to
outline the big picture. She summarized the needs she had identified by speaking with the business and listening to
feedback from her team. Then, the team brainstormed next steps, including contacting peers in other orga-
nizations and reviewing research on what new leaders needed to be successful, including key capabilities.
They recognized that there were multiple ways to tackle the project, so they focused on the question,
“What is the business problem we are trying to solve?”

Dee wanted a solution that provided impact to the organization and simultaneously ensured the buy-in
and involvement of all stakeholders. Given this situation, she outlined an iterative approach for the TD
team and drafted a high-level design thinking approach and timeline for the project (Figures 8-1 and 8-2).

“We must design for the way people behave, not for how we would wish them to behave.” —Donald A.
Norman (2010)

Figure 8-1. An Iterative Approach

Figure 8-2. High-Level Project Approach

Dee knew that she and her team needed to be clear on the problem, so she outlined an approach that
included a blend of project management and design thinking tools.

Design Thinking, an Integral Part of the Process

“Design thinking is a human-centered and collaborative approach to problem solving, using a design mindset
to solve complex problems.” —Tim Brown, President, IDEO

Dee suspected that using design thinking tools would accelerate the design, development, and testing
process. To ensure she was selecting the most effective tools, she reached out to Michelle Webb, learning
strategy manager with Accenture, who recommended asking the following questions:
• Why does the business want what they are asking for?
• Why are the leaders and HR business partners asking for particular models or approaches
(such as training on ownership and accountability or five days of face-to-face training)?
• How will you document and fill the current gaps to solve the problem?
• How will you determine that you have the best mix of end users during the design phase?
• How will you persuade participants in the design and development phases to think more
broadly and focus on the needs of new leaders instead of jumping to preferred solutions
(for example, proposing a training agenda for new leaders before understanding what new
leaders need)?
• How will you encourage participants to have empathy for the end user, so they see the new
leader’s problem as their problem?
• Will you be able to trust the design thinking process to determine the best solution instead of
using it to fit a predetermined conclusion?
As Dee began preparing for the design thinking session, she identified the challenge: “How to
help a new leader quickly gain competence and confidence in their role while keeping time away
from their jobs to a minimum.” She listed the needs she’d identified in her research, which included
having “tough conversations,” accurately documenting a counseling session, setting clear expecta-
tions and goals for the team, coaching, and giving feedback to employees.
Michelle told her that she should invite those who would be delivering and receiving the training
to these sessions. She also mentioned that these participants might have different views of
what is needed, but that was OK. To ultimately determine the problem that actually needed to be
solved, they would have to display convergent and divergent thinking.
Convergent thinking is the tendency to move toward one point within groups or organizations.
However, the benefit of working with a group is the chance to leverage different viewpoints—if a group
working together thinks the same way, it may be difficult to arrive at the best approach or solution.
On the other hand, once the group determines the best approach, the benefit of convergent thinking
is that the group has a consistent message it is committed to. Divergent thinking is moving in the
opposite direction from a common point. When groups first begin to work together on a problem,
they often display this kind of thinking because the problem they are trying to solve is not clear or
has not been agreed upon. It is important for the facilitator of the design thinking session to listen to
each member of the group and understand their thinking.
Michelle encouraged Dee to take time to speak with end users (leaders and new members of
management) and recommended that she start with simple problem-framing methods called Rose,
Bud, Thorn (RBT), and affinity clustering. These tools helped Dee and her team understand the
issues involved in a large amount of data. The session began with the team generating all the known
information about a topic using a trigger question; for example, “What are the critical issues associ-
ated with current new leader development practices?” Each team member responded by writing their
ideas on paper and transferring them to different colored sticky notes:
• Roses (pink sticky notes): Issues or ideas that are positives or successes.
• Buds (green sticky notes): Issues or ideas that have potential to be more positive than
negative.
• Thorns (blue sticky notes): Issues or ideas that are negatives or challenges.
Once this step was completed, the team posted their ideas using an affinity process to cluster and
label all like topics. Then, they reviewed the trigger question, and if needed, reframed the problem
question to reflect what was discovered in the exercise.
Rose, Bud, Thorn helped Dee and her team understand the business goals new leaders needed
to achieve, the areas they needed to learn (process, procedures, leading themselves and a team, and
developing people), and the time needed to develop competence and confidence in these areas. Dee’s
team enjoyed the collaborative process and said it helped them move away from quickly labeling
problems and offering solutions, and toward ideation.
After reviewing the roses, buds, and thorns, the team discussed how they could turn the buds into
roses, and how to eliminate thorns (Figure 8-3).

Figure 8-3. Turning Buds Into Roses and Eliminating Thorns

• Buds into roses. Current state: New leaders are familiar with completing assigned, compliance-related
online learning through the LMS. Option: Review platforms for live virtual delivery and tools for sustainability.
• Eliminate thorns. Current state: Training material is a PowerPoint with lecture. Option: Redesign the
program with an emphasis on a learner-centric approach.
At the end of the session, the team realized that when a problem was framed accurately, they had a
better chance of uncovering the root cause and improving the current practice.

Managing the Project

“Project management is the planning, organizing, directing, and controlling of resources for a finite period to
complete specific goals and objectives.” —Biech (2014)

After the problem was well defined, Dee used several project management tools to keep things on track.
First, she drafted a project charter for her team to use as a road map. The charter provided the initial
requirements to meet stakeholder needs and expectations and defined the business needs, the high-level
deliverables, and the overall timeline. The team referred to the charter often, which helped them stay
focused on the project’s critical components:
• business need and impact
• budget
• project scope
• criteria, constraints, and assumptions
• sponsors, project lead, and team members
• out-of-scope items
• deliverables
• key performance indicators
• key milestones.
Once they created the project charter, the next step was to draft a work plan and determine who
would do what by when.
There are numerous options for work plans; the key is to find one that works best for the project
team and the project. Work plans that are easy to use, update, and access work best. For this project,
the team opted to combine a work plan with Bain & Company’s RAPID Decision Making Model,
which works by clarifying who should do what for each decision that needs to be made (Figure 8-4).

Figure 8-4. Example of a Work Plan Combined With a RAPID Model

The work plan table lists each project activity or stage and its tasks, start and due dates, and the change
management vehicle or method, and assigns the RAPID roles for each decision (R: Recommend, A: Agree,
P: Perform, I: Input, D: Decide), along with who is accountable and who is involved.

The team wanted to be clear on who had the final decision, because the project had high visibil-
ity and multiple people were involved in implementing the decision. According to Bain & Company,
a RAPID model is best used when there are two or more functional units or teams involved and big
decisions need to be made.
The RAPID acronym represents five roles:
• Recommend. Create the initial proposals and recommendations.
• Agree. Must agree with the proposals from the recommend group.
• Perform. Execute the work after the decision is made.
• Input. Provide information and facts to the recommend group.
• Decide. The person who has the authority to make the decision.
To complete the work plan, Dee and her team needed to understand what was important to their
stakeholders. To do this, they completed a stakeholder mapping (Figure 8-4).
They began by identifying their main stakeholders and writing those down on a whiteboard. The
stakeholders included functional and department leaders, HR business partners, IT, employees, and the
TD team. Then they wrote down each group's perspectives regarding new leader onboarding. The team
considered the following questions and perspectives:
• What concerns would each group have regarding implementing new leader training?
• What challenges could result from having new leaders not in their role for a few days?
• What are the biggest challenges for new leaders?
• How will we ensure the training is consistent?
• What is the best way to assign and track the training?
• How will we reinforce and sustain this training?
The team then contemplated how each group could influence other groups and used the following
criteria to determine influence:
• Contribution (value). Does the stakeholder have information, counsel, or expertise on the
issue that could be helpful?
• Willingness to engage. How willing is the stakeholder to participate?
• Influence. How much influence does the stakeholder have? Who or what do they influence
(for example, strategic decisions or budget allocation)?
• Necessity of involvement. Is this someone who could derail or hinder the change process?
• Blockers. Is there anyone who could block the development and implementation?
Next, team members drew a circle around any role that was related and drew solid arrows between
roles with tight or significant connections. They also drew dotted lines between roles with slight connections.
This gave them a clear picture of the level of influence one group may have on another.
The team felt it was critical to identify the top influencers and blockers, so they placed a green dot
on the sticky notes for the top three to five influencers and a red dot on the top three to five blockers. Top
influencers were frontline leaders and HR business partners; top blockers were midlevel leaders and IT.

As a result of the stakeholder mapping, the team realized if they were going to minimize travel,
prepare new leaders for learning, and make the learning stick, they would need to use technology that was
not currently in use within the organization. An interactive virtual learning platform and a sustainability
tool would enable remote employees to have easy access from their computers and mobile phones.
To sell these needs to IT and the business, they developed a business case. The purpose of the interac-
tive virtual learning platform and sustainability tool was improved performance and retention of learning,
reduced travel costs, and increased leader engagement. The business case summarized the business reason
and included these components:
• the summary
• background and project description
• strategic alignment (how the needs align to the overall organizational goal)
• impact (benefits) to the business
• alternatives
• environmental analysis
• risk assessment
• cost/benefit analysis
• conclusion, recommendation, and implementation.
After Dee presented the business case, IT thanked the team for making it easy to make the decision
and agreed to implement their request.
At this point in the project, Dee and her team had reviewed exit and engagement survey data and
interviewed senior leaders, HR business partners, and leaders with one to two years of experience. They
also benchmarked against large and midsize organizations and reviewed research from ATD, Bersin, and
Gallup. However, they felt overwhelmed by the amount of information they’d collected and struggled to
see how it all connected. Some team members even suggested they needed to uncover more information
and conduct more interviews.

“I Wish I Had More Data”


Really?
More data is usually available. It takes time or money, but you can get more data.
But you’re probably not using all the data you’ve already got.
I’m guessing what you meant was, “I wish I had more certainty.”
And that, unfortunately, isn’t available.
If it’s worth the work you put into it and the change you seek to make, it’s worth dancing with the uncertainty.
Reassurance isn’t going to come from more data—that’s a stall.
Forward motion is the best way to make things better.
—Seth Godin (2019)

To help the team feel more certain about moving forward, Dee suggested they create a mind map so
they could visualize the data’s frequency, patterns, and relationships (Figure 8-5).

“A mind map is a thinking tool that reflects externally what goes on in your brain.” —Tony Buzan, inventor of
the mind map

Figure 8-5. Leading Teams Mind Map

Mind mapping enabled team members to capture all their thoughts and use color, fonts, and icons to
show the needs of those new to management and prioritize key areas of focus.
Throughout the entire project, Dee reminded her team, “It’s not about us; it’s about them”—in
this case, the end user or new leaders. To keep their focus on the end user, Dee and her team developed
persona profiles based on the different levels of leaders, time in role, and whether they were corporate or field
based (Figure 8-6). The persona profile is part of the design thinking methods toolkit and originated in
work associated with user-centered design. The team outlined three different categories:
• new leaders with less than six months in role
• leaders with six months to two years of experience
• leaders with more than two years’ experience.

Figure 8-6. Persona Profile for a Business Learning Advisor

• Who: Who is this person? What is their situation? Where do they work?
• Tasks or Goals: What do they need to do? What decisions do they need to make? How will they know if
they are successful?
• Skills Needed to Get Results: Based on their goals and role, as well as your understanding of the VUCA
(volatile, uncertain, complex, and ambiguous) workplace, what are the skills most needed in the talent
leader space to serve as a business learning advisor?
• Characteristics: What do they do? What behaviors do we observe? What can we imagine them doing?
• Influencers: Whom do they listen to? What are they hearing from others? What are they watching and
reading?
• Values and Motivators: What are their wants, needs, hopes, or dreams?
• Pain Points: What are their fears, frustrations, and anxieties?
• Questions They Typically Ask a Client: What have we heard them say? What can we imagine them
saying? What is their focus?
• A learning experience to advance skills in a way that helps a current talent development professional
upskill to a BLA is . . .

The team then applied those profiles to new leaders in the corporate office and those who were
in the field. The result was six persona profiles.
The team believed that creating personas for those with two or more years of experience would
make the program attractive to leaders who had never had formal management training. By incor-
porating this exercise into the program's design and development, the team was able to develop
empathy; identify characteristics, values, goals, and skills needed; and gather knowledge of how leaders
like to learn. The personas enabled the TD team to have a deeper understanding of the role and an
increased perspective of a new leader. As a result, the team was able to provide a more tailored design
and development approach.

“A persona is a way to model, summarize, and communicate research about people who have been observed
or researched in some way. A persona is depicted as a specific person but is not a real individual; rather, it is
synthesized from observations of many people. Each persona represents a significant portion of people in the real
world and enables the designer to focus on a manageable and memorable cast of characters, instead of focusing on
thousands of individuals.” —Shlomo Goltz, Software Designer

It was time for Dee and her team to gain a deeper understanding of the new leader’s experience
managing a team and developing people. They pulled together the data and information they had gathered
and used an experience diagram, which illustrates the experience in detail using a series of prompts
or triggers. The team used a new leader’s first six months in their role as the goal for the experience.
In preparing for this session, Dee reviewed the components of an experience diagram to see the best
way to capture a new leader’s first 180 days:
• Formats. An experience diagram can take a variety of formats. It can be a flowchart,
matrix, timeline, or a simple map, similar to a typical journey map or user-experience map.
• Symbols. Within any format, a variety of symbols can be used to describe and illustrate
the experience. These include icons, graphs, pictures, words, sentences, quotes, sticky
notes, or documents.
• Prompts or triggers. There are numerous prompts or triggers that help groups
delve into a common experience to diagram and illustrate deeper details and further
layers of granularity. These prompts include goals for the experience, critical waypoints,
significant touchpoints, stakeholders, interactions with people, and tools, tasks, and
technology.
They chose a timeline. Using a large whiteboard and large sticky notes, the team broke down
the experience by time and category. They wrote timeframes horizontally across the top of the white-
board: before start date, day 1, days 2–7, days 7–30, days 30–60, days 60–90, and days 90–180.
Then, vertically down the side, they wrote the four main categories: systems, policies, and procedures; manag-
ing myself; managing my team; and developing people. The team then referred to their data and
research, mind map, and personas as they wrote steps on the large sticky notes and placed each step
on the timeline corresponding to the appropriate category where they thought new leaders would
need that experience. By doing this, the TD team gained further insights and empathy for new lead-
ers in their first six months in the role.

Summary
The days of taking a year to conduct a needs analysis and then design, develop, and implement a learning
program are gone. Businesses and their needs change quickly, and by the time a program rolls out, it usually
needs to be updated, changed, or modified to keep it from becoming obsolete. Talent development
teams need to be flexible and anticipate the needs of the organization. In addition, they can no longer
lock themselves in a room to design and develop a program and have the pilot be the first time the
organization sees it. When TD professionals take this approach, the business often pushes back—not
because the team missed the mark, but because the business believes it has much to offer and was
not a part of the process.
The project charter and RAPID model are excellent tools for managing the project and outlin-
ing specific work that needs to be done and who is involved at each step. Iteration throughout the
design and development provides the flexibility to update, refine, and confirm the work in real time and
involves members of the organization along the way. The Rose, Bud, and Thorn and affinity clustering
tools give a clear picture of what is working, the current challenges, and where opportunities exist.
It takes only 90 minutes to complete this exercise, and the outputs are extremely valuable. Using the
mind mapping approach helps show patterns and connections in the data and increases confidence in
final recommendations. Adding personas facilitates stepping into the shoes of the end user, resulting in
deeper understanding and perspective.
These tools were combined using the Experience Diagram exercise, which helps show the program
and the journey through the eyes of a new leader. The exercise shows the big picture, helps build empa-
thy for how overwhelming a new leader’s role can be, and results in a realistic timeline for new leaders
to feel competent and confident in their role. It also helps confirm which learning opportunities could
be done virtually, self-paced, or face-to-face.
Involving members of the organization is critical; however, there is a delicate balance between
involving the business enough and involving it too much, which risks slowing down progress. Some
will see a program as risky if they are concerned it will keep leaders away from producing results. They
may also fear criticism if the program is not perfect or doesn't guarantee the performance results it
will deliver.
Talent development leaders understand that organizations want results faster, cheaper, better.
By involving the organization early and often, and guiding it through the process, TD leaders can address
its needs immediately while stakeholders give input, become advocates, and own the outcome.

What Happened?
Dee, her team, and leaders from the business gained realistic expectations of new leaders after a review
and debrief of the experience mapping. The team then proposed the program design and committed
to launching a pilot in three months. Business leaders agreed with the design, the proposed content, and
overall timeline. As the team prepared to pilot the program, a senior vice president asked if he could
kick off the program by sharing his leadership experience, explaining why the company was offering
training for new leaders and how critical their role is, and describing its focus on setting new
leaders up for success. In the first three months, 50 new and seasoned leaders actively participated in
the program, different vice presidents kicked off every session, and senior leaders made the program
mandatory for all members of management.

Key Takeaways
• Talent development professionals can demonstrate to the organization that they are agile and that they
listen and work to understand the needs of the end users. It is critical to involve the business early
and throughout the discovery, design, and development process. Doing so builds trust
and confidence in talent development and reduces the suspicion of rolling out a program that is too
complicated, does not meet the needs, and takes too much time, causing a reduction in performance.
• Talent development and learning professionals need to understand how things get done within
the organization. Call it politics or organizational savvy; key members of the company need to be
involved. Socializing with business leadership and management is one way to involve them; another
is to include them in the design thinking sessions. Asking leaders to review the content or designate a
SME from their area to provide ongoing input helps them feel more comfortable and confident in the
end product.
• Iteration, iteration, iteration—those in the TD profession who were taught ADDIE, ISD, or other
design approaches need to embrace an iterative, Agile approach to designing that is human-centered
and starts with the customer. Make it an expectation: Iteration, not perfection. TD leaders need to
balance iteration and input with the need to move forward and make a decision. Be a champion of
getting better at getting better.
• Combining traditional project management with design thinking tools gives the flexibility to iterate,
empathize with the end user, and produce the desired business outcome. Rose, Bud, and Thorn
clarifies the problem and the project charter shows the big picture road map and project focus.
RAPID outlines the work that needs to be completed, who needs to be involved, when the work
needs to be completed, and who has the final decision at every step in the plan. Mind mapping helps
bring structured chaos to vast amounts of data and ideas, and helps individuals and teams see
patterns and connections. Personas summarize the mindset, needs, and goals, and make the data
meaningful, memorable, and visible. The experience diagram gives visibility to the needs assessment,
the topics that need to be addressed, who needs to be involved, and the amount of time new leaders
need to complete the program.
• Give members of the TD team and the group that will be affected by the decision the opportunity
to experience the design thinking approach. Say, "Let's solve this problem fast by looking at the
issues: understanding all of the components, people, and so on, and iterating using prototypes; in
other words, learn-build-act, learn-build-act.” This approach also increases the partnership between
talent development and the business, and results in exceptional performance. Michelle Webb from
Accenture encourages talent development professionals to learn by researching internally and
externally, reviewing the science and technology. To research:
• internally, ask what new leaders coming into the role think they need to know
• externally, learn what other companies are doing well and what insights they have
• science, discover well-respected sources with unbiased and objective methods
• technology, find out which technology works best—virtual reality, augmented reality,
collaborative tools, or mobile.
• Here are some of the common questions about design thinking that Michelle asks:
• How much do I need to get people involved in the process?
• How do I make it agile?
• How much will this cost?
• How much time will it take?
• How do I estimate and budget for this process?
• How do I make this agile and still meet the deadlines?

Questions for Reflection and Further Action


1. What ideas from this chapter might you want to explore further?

2. What ideas in this chapter resonate with your thinking?

3. What is your current practice when you are asked to build a program that is identified as critical to
the future success of the business?

4. How might you incorporate some of these project management and design thinking tools and
techniques into your team’s work?

9
The 70-20-10 Framework:
Options for Impact
Alan Abbott and Rachel Hutchinson

We’ve all been there. You spent months coming up with a well-structured, engaging instructor-led class.
Everybody passes the assessment and the post-course evaluations are positive.
But then you follow up a few months later and find you’ve made almost no measurable impact
on employee performance. When you set up a focus group or send out surveys to see what happened,
the participants confess that they either forgot all the material or never found a way to apply it to
their jobs.
One solution to this scenario is the 70-20-10 framework. It takes formal training initiatives and
supports them with robust on-the-job support and opportunities to learn from peers. As a result, learn-
ing is extended and reinforced, and participants are better able to apply their new knowledge to the job.
The origin of the 70-20-10 framework can be traced back to research conducted in the 1990s by
Morgan McCall, Robert W. Eichinger, and Michael M. Lombardo when they were colleagues at the
Center for Creative Leadership. Responses to a survey sent to successful executives indicated that they
learned 70 percent from challenging work, 20 percent from supportive relationships, and 10 percent from
formal education. The three men then published their findings in the third edition of The Career Architect
Development Planner (Lombardo and Eichinger 2000).
Over the course of the succeeding years, the 70-20-10 framework came to embody an approach
to learning and development that deemphasized classrooms and coursework in favor of on-the-job and
social learning in the form of self-evaluation, reflection, on-the-job support, group projects, mentoring,
and discussion groups.
The framework has become popular in learning and development to foster and apply new skills and
close performance gaps. The 70:20:10 Institute was created to meet the demand for consultation on the
framework, and many global companies have adopted 70-20-10 approaches.
As the influence of 70-20-10 has grown, criticisms have also emerged:
• In the original research, the sample size was small (about 200 people).
• Self-reported data can be undermined by personal biases and blind spots.
• Formal learning, the longtime standard for developing employees, seems overly deemphasized
in the model.
• The presentation of the findings as 70-20-10 seems to betray a round number bias—the
economic concept that people tend to pay special attention to numbers that are “round” in
some way.
However, talent professionals who use the 70-20-10 approach rarely attempt to follow that equation
exactly. The 70:20:10 Institute writes:

The numbers are essentially a reminder that people learn most from working and interacting
with others in the workplace (70+20). The specific ratio (70-20-10), in any given situation,
will vary, depending on the work environment and the organizational results required.

Model Ratio: 10% | 20% | 70%
Model Topic: Course work and training | Developmental relationships | Challenging assignments
Other Terms: Educational learning events | Exposure to and interaction with others | Experiences within the context of the workflow
Examples:
• 10%: Structured training programs, classes and courses, workshops, e-learning, mlearning
• 20%: Communities of practice, mentoring, coaches, feedback, user-generated content, collaborative platforms
• 70%: Performance support, rotational assignments, project teams (especially action learning groups)
Source: Arets, Jennings, and Heijnen (2019)

The 70-20-10 framework is similar to the Pareto Principle, which states that roughly 80 percent of
the effects come from 20 percent of the causes. The Pareto Principle is not absolute, and many excep-
tions can be found. What is important is that it helps you realize that the majority of effects come from
a minority of causes. With the 70-20-10 framework, most of what we learn happens when we are trying
to do tasks (perform). As learning professionals, we need to be advocates and help learners in every area
and any way we can. Traditionally, L&D has spent the majority of its resources on formal learning.

To be sure, formal learning remains a critical component of the talent development repertoire.
For example, would you want your pilot or doctor to learn only through practice or peer exchange?
Probably not. But deliberate practice with feedback and coaching is what makes your pilot able to
make a split-second decision when needed. A prime example is Capt. Chesley “Sully” Sullenberg-
er’s safe landing on the Hudson River in 2009 after the jet’s engines were disabled. This event is now
remembered as the Miracle on the Hudson.
You likely can think of other examples in your own life where expert intuition used by highly
skilled and experienced individuals led to success. Experts can walk into a situation and quickly
“see” a fix for the problem because the conditions remind them of other situations. Many of us can
do this when we assess an organizational problem. Of course, most of us won’t have to react to a
life-threatening experience, but we do have smaller day-to-day instances where our experiences
affect our decisions. It could be as simple as how you react when a videoconference call is not
connecting—a decade ago you would have called IT. Today you’d do some trial and error or search
the Internet for solutions. Formal training cannot anticipate every single variable, and even if it
could, your brain still couldn’t retain that much information. These examples show that looking
beyond the 10 percent has a huge potential impact on your ability to add value to the organization
and on the ability of all learners to add value.
L&D has a unique opportunity to influence people within an organization, which means that
what we focus on can positively—or negatively—affect business outcomes. When we limit our role
to formal learning, we may:
• Prevent learning opportunities from being made available to our team members.
• Design solutions in a way that does not encourage learning from one another.
• Limit 70 to 90 percent of on-the-job, real-world, in-the-moment learning to only
external, unvalidated sources such as Internet searches.
People will always search for information and exchange ideas and practices with others. If you
don’t get in front of that or support it by providing structure and relevance, you will surrender your
influence and undermine your effectiveness.
Additionally, research continues to validate the fundamental assertion of 70-20-10—develop-
ing complex workplace skills usually requires learning from outside the classroom and incorporates
evidence-based social and experiential learning activities. Deemphasizing, but not eliminating,
formal learning (such as lectures, e-learning programs, and workshops) and supplementing it with
activities like group projects, rotational assignments, coaching, and discussion groups isn’t just
being done by organizations that follow 70-20-10; it’s also being done by secondary and postsec-
ondary schools throughout the United States.
How does the 70-20-10 framework work? According to Vivian Heijnen, co-author of 70:20:10
Towards 100% Performance, the process includes working your way through five phases—performance
detective, performance architect, performance master builder, performance game changer,
and performance tracker (Arets, Jennings, and Heijnen 2016). As repetitive as it may seem, starting
each phase with the word performance is key because it keeps you aligned to the fact that you are not
creating a training program or a learning solution—you are affecting the ability of people in the
organization to overcome a business issue (that is, their performance).
The five phases are guided by a series of workbooks with job aids, planners, process flows,
reflection opportunities, and prompting questions. It is the responsibility of a learning and devel-
opment group to create, update, and train people in the use of these workbooks. Let’s take a minute
to look closer at each phase:
• Performance detective. This phase includes a root cause analysis that the sponsors
and talent team, and, if needed, the learners themselves, will complete. This ensures
alignment with key stakeholders about the exact scope of the problem and how it is
affecting the business. This phase is particularly important because the performance
detective frequently uncovers obstacles that are not primarily knowledge gaps and
require support from stakeholders outside learning and development.
• Performance architect. In this phase, you design a potential solution, and then
move into co-creation of components of the solution. It’s important to make sure that
everything is focused back on the core issue affecting the business. You already know
which problems are caused by skills, knowledge, experience, processes, or systems gaps,
because they were defined in the performance detective phase. If the architect phase is
successful, you’ll leave with a general set of goals and a way to measure success.
• Performance master builder. The performance master builder role uses the critical
tasks identified by the performance detective as a starting point, and then co-creates
effective solutions based on the performance architect’s design. From that point, the
process uses standard checklists to bring together resources and tasks to develop the
optimal solution.
• Performance game changer. Think of this as the road map or a launch pad for
the change. Here, you might be creating a series of milestones, communication plans,
learning opportunities, opportunities for reflection, meetings, and so forth. You are
ensuring the successful implementation of your solutions.
• Performance tracker. In this final phase, you return to the first problem definition
and desired performance to determine if your solution has moved the needle on
performance. If the performance architect said success was a 20 percent increase in sales,
this is when you find out if you met your goal and analyze why you did or did not. If
you did not meet your goal, you may consider starting the process from the beginning.
Chances are, there was a failure in one of the first two phases.

This was a simplified version of the model described in 70:20:10 Towards 100% Performance.
Throughout the entire process, you are always considering the business problem and then design-
ing to reduce or eliminate it. You are never designing to increase skills; instead, you’re focusing on
performance. Of course, to eliminate or reduce the problem, you will have to increase skills, change
the way people work, and change their mindset. The difference is in how you approach it. You can
be confident that your solution is highly relevant and highly likely to succeed because you know
what you are trying to solve, what is influencing it, and what success looks like to your stakeholders,
and you have a structured way to overcome the problem.
One of the reasons that L&D is often seen as outside the core business is that it uses specific
training language. Since 70-20-10 is so closely aligned to core performance measures, this language
barrier is less of a problem than it is for other models. When following the framework, you will start
using business terms, understand the core business areas, know the core tasks of your various teams,
determine which other areas of the business influence that team, and become much more embed-
ded in the organization. You will also develop a much deeper and broader network, increasing your
credibility and personal brand as the organization starts to see you not only as an expert in your
field but also an expert on your organization’s business. Because L&D touches so many areas of the
business, you have unique opportunities to gain a far broader understanding of the organization
than most people who are not in the C-suite.
So, why should you adopt the model? Why is this such a relatively new topic of conversation in
L&D? As Seth Godin stated at the ATD International Conference & EXPO in May 2019, the rate
of change you are experiencing today will never slow down. In fact, the hectic pace of business you
experience now is the slowest it’s ever going to be. All organizations are experiencing the same pace
of change, and our employees must grow, adapt, and learn more and more quickly. Thus, we need to
evolve just as quickly to serve them. In today's world, employees must be able to learn in the flow of work
because the work is constantly changing, and their roles are adapting even if their titles remain the
same. This also means that simply continuing to do things the way they were always done is sure
to fail. We need to be able to identify good practices (current and future) and apply them to new
situations. Experience helps you make better decisions and exchange can change your perspective,
but if no one has experienced what you are doing, you may need to supplement with formal learning
on the topic. And, of course, L&D may need to curate more and foster a new organizational culture
that is OK with sharing things that will become outdated and having peers point out when that
happens. It is not a simple transition, but it is a highly valuable one for your L&D teams to make.
So, what does the 70-20-10 framework look like in practice? To answer that, we turn to two
case studies.

Case Study: Hilti
By Rachel Hutchinson

Hilti is a multinational manufacturer and marketer of products, software, and services for the construction
industry based in Liechtenstein. In 2015, the company began overhauling its entire approach to learning
and development.
The L&D team cast its net broadly in the search for useful information that could help form
a new learning strategy. They solicited feedback from more than 4,000 employees, created focus
groups, pored over research whitepapers, and benchmarked against almost 60 organizations.
The team saw some emerging organizational changes that would require adaptation. For
example, the normal turnover rate was increasing due to promotions and expansion, meaning a full
third of the company’s leadership changed every year. Hilti’s speed of business was accelerating,
and change was happening faster than the team could develop and roll out new formal training. As
a result, frontline leaders were asked to be more independent and agile every year.
Demographically, Hilti was achieving greater diversity in gender, age, and educational back-
ground. Previously, it had hired heavily from a pool of candidates with a construction background,
but that pool was shrinking and the skills a construction background brought were becoming less
essential. In addition, it was increasingly becoming a Gen Y and Gen Z company, which meant that a
majority of employees weren’t used to learning primarily through formal education. Instead, employ-
ees resoundingly stated that they wanted to learn on the go—often in noisy construction environ-
ments, inside car “offices” while on the road, or on public transportation while commuting to work.
The team had to figure out how to go from primarily offering formal learning to making it
possible for the workforce to search for the information they required at the moment of need, whether
that meant demonstrating a tool to a customer or preparing a presentation for a customer CEO. And
nearly 75 percent of its workforce wanted to skip the two-week orientation and just have easily
accessible step-by-step content.
The 70-20-10 approach became the backbone of the new learning strategy. Much of the formal
online training was supplemented with workbooks and job aids that would guide the participant and
manager through learning on the job. When the company shifted to a new digital and social learn-
ing platform, it became easier to provide access to bite-size, peer-created learning moments.
The team didn’t try to do away with formal learning either; instead, it tried to increase what was
available for the 70 and 20 parts of the model. It was able to modify an existing classroom session
to start with a peer exchange, which cost nothing and allowed local facilitators to see the power of
exchange.
The new approach had a number of advantages:
• It de-emphasized formal learning without eliminating it.
• It empowered the learner and manager to create solutions.
• It limited development to topics aligned with critical company goals.
• Learning became part of the flow of work and not distinct from it.
• It created a form of measurement beyond merely passing a test that may or may not
reflect the participant's ability to transfer knowledge to the workplace.
Hilti rolled out two major programs as part of the new learning strategy to help make the
mindset shift from sending team members to the repair shop to helping them learn what was needed
when it was needed. The first was the 70-20-10 Expert Program, which was aimed at employees
at the director level and above in departments such as learning and development, Lean, and logis-
tics, where there was high interest and they’d previously used many large-scale formal learning
programs. This group would become the champions needed to support the success of the initiative.
The second program was the 70-20-10 Master Builder Program, which was for any employee
responsible for developing training. To be eligible, participants had to have an ongoing project
geared toward helping build a competency or improve performance. Some of these participants
were instructional designers, but many were from other departments and performed part-time
training.
These programs also allowed participants to learn about 70-20-10 in a 70-20-10 environment.
They had a mixture of formal 70-20-10 training, a 70-20-10 project, and social support with other
participants. Participants received feedback from Hilti stakeholders, peers, and the 70:20:10 Insti-
tute, all while interacting within the new digital learning platform so that they could see how it
could be used to change performance.
These two programs transformed Hilti’s entire approach to developing its employees. For
example, the New Leader Onboarding Journey, a six-month structured path for new management,
started in 2018. During the first 90 days of the program, when participants are first adjusting to
their new positions, everything is delivered and shared online. They spend this time communicat-
ing with peers, completing a self-assessment, finding on-demand support, referencing job aids, and
identifying areas they need to work on during the rest of the program. Around the three-month
mark, participants attend classroom training, where they receive formal training on topics such as
emotional intelligence and plan out the next three months of their development. They then create a
90-day action plan for their team or department, which is then approved by the next level up. The
70-20-10 framework has successfully helped align learning and development programs to the needs
of the company. Participants in the New Leader Onboarding Journey report that they have a much
better perspective on expectations.
Figure 9-1 shows an example of a 70-20-10 training request. Notice that the form asks for the
desired business outcome, not the subject of the training.

Figure 9-1. Training Request Example

Please complete the entire form and then email it to LearningAndDevelopment@email.com

• Requestor Name
• Request Date
• Project Title
• Project Sponsor and Steering Board Names
• Project Lead (Individual person or group)
• Support Level Requested From Global Learning and Development (Full involvement, partial involvement, consultation only—please explain)
• Project Description
• Suggested Timeframe
• Business Priority (1=low / 5=high)
• Desired Outcome (What will be achieved by this project?)
• Business Need or Objectives (Define the business issue or what is driving this project, as well as the business goals this project aligns with)
• Target Audience
• Secondary Audiences
• Additional Comments

Case Study: UPS
By Alan Abbott

UPS, based in Atlanta, Georgia, is the world’s largest package delivery company and a leading global
provider of specialized transportation and logistics services. Among its many responsibilities, the talent
management department creates and curates learning and development content for more than 50,000
members of management.
UPS’s talent management department is influenced by a number of training and development strate-
gies. However, the 70-20-10 framework is central to its leadership career development guides. Sheryl Lee,
a manager at UPS Talent Management, has been preaching the framework for years, calling it “the secret
sauce of our development program.”
Lee has found that by including 70-20-10 as a reference in personal development and promoting it
whenever possible, she hasn’t had to force the concepts on HR professionals. When talent managers kept
coming to Lee for more information on the framework, she created a job aid that provides examples of
appropriate activities for each component of 70-20-10. Education might include accessing some curated
content or attending a class. Exposure might include asking for feedback from peers, obtaining a mentor,
or expanding a network. Experience activities include solving real problems in a business resource group,
championing a new initiative that stretches the employee’s abilities, or volunteering expertise in a new
situation. The job aid, which is now available in nine languages, can be accessed at a number of places in
the company LMS, and it forms the basis for two management development initiatives.
Every year, UPS’s talent management employees work with managers to create a development plan
for themselves. When they enter their plans into the learning management system, they’re presented with
a 70-20-10 template requiring every task to be categorized as formal education, experience, or exposure.
They’re encouraged to keep the time commitment roughly proportional to 70-20-10.
New leaders at UPS participate in the supervisor learning path, which has a mixture of curated
instructional content, experience documents to prompt reflections on different areas of day-to-day perfor-
mance, and exposure exercises that guide learners to peers, teams, and business resource groups that could
give them access to social learning.

Summary
The 70-20-10 framework is used for developing employee skills and knowledge in a way that de-emphasizes
but doesn’t eliminate formal training. Instead, it places formal training within a framework that includes
on-the-job training and social learning. Articles, classes, and videos are only part of a journey that also
includes stretch assignments, job aids, workbooks, discussion boards, meetings, and group work.
Companies are increasingly acknowledging the need to extend learning beyond a single event, but
aren’t always sure how to deliver this additional training or justify the time commitment for learners. It
can also be difficult to concisely and clearly explain the goals and methodology of a training initiative.

The 70-20-10 framework solves these issues and a number of other persistent challenges in training
and development.
You might be wondering how 70-20-10 differs from ADDIE and SAM. The biggest difference is
that ADDIE and SAM start with learning as the core component—they analyze what people need to
learn, create learning objectives, develop and implement mostly formal learning solutions, and evaluate
to determine if anything needs to be redesigned before the next iteration or learning solution. The
70-20-10 framework looks at the business problem. What does the business need to be more effective
and reach the business goals? In the organizational ecosystem, what is affecting the current perfor-
mance? What systems, processes, or behaviors need to be adapted? From there you can incorporate
formal learning solutions, work with your Lean department to optimize processes, or work with IT to
modify restrictive systems. L&D becomes truly integrated into the business and acts as a business part-
ner, not a repair shop.
Even though the term is 70-20-10, the numbers are really a generalization. Each business problem
may require somewhat different ratios. While there are structured phases to follow, it’s unlikely that any
two solutions will be identical. You also need to remember that, like any change management effort,
shifting your team’s and your stakeholders’ mindsets about the role of learning and development will
take time. And it’s possible that some won’t want to come on the journey with you—not every stake-
holder is going to be an early adopter of a new approach. You are responsible for communicating what
is changing, why it is changing, and, above all, what benefit it has for the greater organization.
Even when you have the initial momentum, it’s important to let stakeholders know that the specific
ratio of learning opportunities will vary by learner and development area. The framework is elegant
and easy to understand. You could even leave the numbers out entirely and explain to your stakeholders
that you’re creating a system that involves formal education, on-the-job experience, and social exposure.
What matters most is that the framework provides a path to extend learning outside the classroom into
evidence-based, on-the-job, and social-learning activities. This will change the way you look at and talk
about performance solutions in your organization and improve your connection to the organization.
Adoption can take many forms. Hilti overhauled its entire learning and development approach,
trained key stakeholders in the framework, and marketed the changes that were happening. UPS
gradually expanded the framework's influence until talent managers in other business units were
asking for ways to promote it. There is no right or wrong way to start using 70-20-10. No matter
how you go about it, you’ll likely be taking a multiphase, iterative journey as you further define your
ecosystem for learning.

Key Takeaways
• The 70-20-10 framework starts with the business problem rather than the needs of the employees.
• The percentage ratios are suggestions, not rigorous standards.
• The 70-20-10 framework does not eliminate formal training, and in many cases recognizes that
formal training is the starting point. However, it does say that learning can take place in a variety of
formats and locations, and suggests using a variety of learning modalities, such as formal learning,
social learning with others, and learning from a variety of experiences.
• While the framework construct uses training language, the language of business and organization can
be used to make the process more appealing to executives.

Questions for Reflection and Further Action


1. How does your learning strategy ratio align with the 70-20-10 framework both specifically and in
general terms, including incorporating a variety of ways to build capability? What about from a
budget perspective? What about for different types of employee roles?

2. Can you incorporate ideas from this framework and the case studies to advance social learning and
on-the-job training into your current development programs?

3. How might technologies, tools, and techniques enable the three types of learning to be more fluid
or for the ratios to change? For example, how might adding an augmented reality experience into a
formal class focus on performance and add value to the experience?

4. Does having a variety of experiences (such as work details or rotational assignments), project teams,
and discussion groups guarantee capability building and performance improvement? How might
you know?



Section 3
Leading and
Developing People

This section examines ways leaders can personally create the environments and opportunities for
enabling others to excel in their roles as talent professionals. How do you work with others to help them
build their personal capabilities and help them convert the learning into performance actions and
behaviors that affect team and organizational success through continual coaching? People development
is the essence of the role for a leader and is even more critical for leaders of learning professionals. As
Elaine Biech stated in Capabilities for Talent Development, "We need to ratchet up our role to become
advisor to leaders. We need to know what their problems are and have the skills, tools, and techniques
to help them solve them" (Galagan, Hirt, and Vital 2019).

This section provides ideas for making an organizational impact using a variety of ways to set the
learning team up for success. Chapter 10 starts with determining how you will advertise a role when
hiring. This must be intentional to ensure that your team has the collective and future-focused skills
needed. Sarah Siegel and Elizabeth Huttner-Loan of IBM share their story for assessing their team needs
and benchmarking other organizations to understand who they were hiring—even paying attention to the
titles used and the specific requirements detailed in the job descriptions. They did not leave this to the
hiring department—they did their own research. The intentionality continued as they made selections
and onboarded the new team members.

Chapter 11 provides 10 unique ways to enable a team to continually learn and share with one another
to leverage the collective capability. The ideas model what Lew Platt, the former CEO of HP, once said:
"If only HP knew what HP knows, we would be three times more productive." Laura Solomon and
Caroline Fernandes of IBM explore opportunities available through badging, game mechanics, and
content bundles and discuss sharing through watercooler sessions, learning toolkits, webinars, and
coaching circles. The value is in having a variety of tools so that participants do not get bored, but also
in continually adding tools that the learning team can experience first and then use with others as part
of their professional practice.

While it is critical to stay current in how you practice your craft, Alissa Weiher explores several other
ways talent development leaders enable themselves and others to build capabilities. First, you continue
to build personal awareness of your leadership style and how you show up to others. You invest time
and energy in your personal development and model self-directed learning. You serve as the guiding
light for developing those on your team through coaching and creating an environment where they are
encouraged to learn, take actions, and improve their performance. You demonstrate discipline and
perseverance by earning credentials and advanced degrees. You engage with professional organizations
to further your leadership and learning. And finally, you invest time in branding your collective
department as experts in their profession, as professionals willing and able to engage at the senior
levels to determine how the organization will gain or maintain competitive advantage.

10
Identifying and
Onboarding the Best Talent
Sarah Siegel and Elizabeth Huttner-Loan

I, Sarah Siegel, am a seasoned learning design leader at IBM. Until a quarter of my team retired, and
another quarter was “poached” by the commercial part of our organization—with my full cooperation,
of course—my team was the most expert one among the several like-missioned teams across the organi-
zation. The senior learning designers who reported to me had at least 10 years of experience in the field,
and a number had been learning designers for 20 to 30 years.
The team included a mix of people with learning PhDs and master's degrees in instructional design. In a
couple of cases, their experience predated the Internet for commercial use, let alone digital learning. They kept
their skills relevant through continuous learning—both because they were self-motivated and because we
required it. For example, I asked the entire team to learn how to build a chatbot, because I think of them
as 21st-century performance support.
So, when faced with replenishing my team, I knew we had a big task ahead of us, and the new
members would have big shoes to fill. This is the story of how we did that.

What Do Job Descriptions for the Next Generation of Learning Experience Designers Include?
The senior learning experience designer and the more junior learning experience designer job
descriptions we posted included criteria that were informed by the skills and descriptions listed in
our competitors’ web-based job descriptions. To remain competitive in the marketplace, we bench-
marked against our competitors to ensure that the talent we were recruiting had relevant skill sets
for meeting the future needs for learning.
We noticed that like us, they were looking for learning designers with video production skills.
However, what was not explicitly listed in their job descriptions was the ability to educate through
storytelling and deliver consumer-grade experiences. By “consumer grade” we mean an experience
that feels as smooth, rich, and useful as any platform our learners might use outside the workplace.
Our senior learning experience designers would also need to act as consultants and trusted advisors
to our portfolio leaders. So, we made sure to interview for all those qualities.
Both roles also required at least five years of experience (eight for the more senior role) as
an instructional designer, UX designer, interaction designer, product designer, or a similar role, as
well as proficiency in visual design and video production; copywriting experience was also
preferred. In addition, we expected everyone to have a portfolio that showcased their digital and
face-to-face learning offerings.
We expected the learning designers in both the junior and senior roles to be skilled in adult
learning principles, effective instructional approaches, needs analysis, Agile methods, education
trends, and design thinking. In addition, their learning solutions needed to incorporate best prac-
tices in adult learning theory, neuroscience, and leadership.
Requiring candidates to incorporate neuroscience best practices into their learning solutions is
fundamental, although our competitors rarely refer to it explicitly. The neuroscience best practices
we most commonly leverage are Carol Dweck’s and David Rock’s principles for encouraging a
growth mindset in our employees.
Our learning offerings need to be science-based, human-centered, and “sticky” or applica-
ble. For example, our chief leadership, learning, and inclusion officer routinely asks, “Where’s the
science behind that?" when we share a diversity and inclusion learning journey. We
know to do our homework up front so that we can answer these questions about any offering, jour-
ney, or series we design.
Of course, the human-centered, sticky attributes can be more subjective, and we achieve them
through employing top learning scientists and learning designers whose work garners awards and
excellent net promoter scores (NPS). We use NPS not to measure learning effectiveness, but rather
to see how engaging our offerings are.
Comparing our job posts, we found that the senior learning experience designer role listed six skills that the more
junior role did not include. Senior learning experience designers need to manage stakeholders and
subject matter experts in the content development process, ensuring alignment, engagement, and
expectation management. They must serve as effective partners for stakeholders and sponsors across
the company at all levels, and coach and mentor junior developers in use of tools and instructional
design methodologies (Figure 10-1).

Figure 10-1. Comparison of Required Technical and Professional Skills

Senior Learning Experience Designer Skills
• At least 8 years of experience as an instructional designer, UX designer, interaction designer, product designer, or similar role.
• Master's degree in instructional design, human-computer interaction, user experience design, user-centered research, behavioral science, or training and development.
• Deep knowledge of adult learning theory.
• Exceptional writing and storytelling skills.
• Deep understanding of Agile methodology, responsive design principles, and instructional application design.
• Innovative design approach.
• Experienced in developing learning offerings in response to a given business problem.
• Proficient in visual design, video production, and editing; copywriting experience is preferred.
• Skilled with graphic tools and authoring tools.
• Experience with directing video shoots and managing vendors in developing multimedia content.
• Ability to work independently and switch gears due to the demand of the business to meet new deadlines.
• A strong portfolio that showcases digital and face-to-face learning offerings.

Learning Experience Designer Skills
• At least 5 years of experience as an instructional designer, UX designer, interaction designer, product designer, or similar role.
• Bachelor's or equivalent degree in instructional design, human-computer interaction, user experience design, user-centered research, behavioral science, or training and development.
• Exceptional writing and storytelling skills.
• Solid understanding of Agile methodology, responsive design principles, and instructional application design.
• Innovative design approach.
• Adept and efficient at designing learning offerings in response to a given business problem.
• Proficient in visual design, video production, and editing; copywriting experience is preferred.
• Skilled with graphic tools and authoring tools.
• Experience with directing video shoots and managing vendors in developing multimedia content.
• Ability to work independently and switch gears due to the demand of the business to meet new deadlines.
• A portfolio that showcases digital and face-to-face learning offerings.

Senior learning experience designers need to serve as thought leaders and subject matter experts
in learning design, as well as scope and estimate the level of effort for project deliverables, applying
strong project management discipline for setting expectations, execution, and resolving issues. Finally,
they must develop all materials relevant to the program rollout, as shown by the responsibilities
in Figure 10-2.

Figure 10-2. Comparison of Required Technical and Professional Responsibilities

Senior Learning Experience Designer Responsibilities
• Design end-to-end learning solutions (e-learning, instructor-led face-to-face and virtual, and self-paced) that incorporate best practices in adult learning theory, neuroscience, and leadership.
• Manage stakeholders and subject matter experts in the content development process, ensuring alignment, engagement, and expectation management.
• Serve as an effective partner for stakeholders and sponsors across the company at all levels.
• Coach and mentor junior developers in use of tools and instructional design methodologies.
• Serve as a thought leader and subject matter expert in learning experience design.
• Scope and estimate level of effort for project deliverables.
• Apply strong project management discipline for setting expectations, execution, and resolving issues.
• Develop insights from multiple sources of data and feedback about how our programs perform to continually improve them.
• Conduct independent research and interviews with subject matter experts and translate findings into cogent and compelling learning content.
• Facilitate pilots with audiences of all levels of leadership.
• Develop all relevant materials related to program rollout, including learner materials, pilots, and facilitator enablement.
• Edit and perform quality assurance on your own work.

Learning Experience Designer Responsibilities
• Design end-to-end learning solutions (e-learning, instructor-led face-to-face and virtual, and self-paced) that incorporate best practices in adult learning theory, neuroscience, and leadership.
• Develop all relevant materials related to program rollout, including learner materials, pilots, and facilitator enablement.
• Maintain a balance of innovation and experimentation with sound adult learning theory and instructional design principles.
• Scope and estimate level of effort for project deliverables.
• Develop insights from multiple sources of data and feedback about how our programs perform to continually improve them.
• Conduct independent research and interviews with subject matter experts and translate findings into cogent and compelling learning content.
• Facilitate pilots with audiences of all levels of leadership.
• Edit and perform quality assurance on your own work.

So, in looking at these two figures, what skills categories did we seek?
• Outside-in orientation. Curiosity about where workplace learning and learning technology
is headed; for example, considering industry peers, ATD, and related organizations’ best
practices, and generally keeping an eye out for what’s next.
• Collaboration. Tendency to be team players, rather than lone wolves; for instance, offering
examples of succeeding through teamwork and being generous in sharing credit.
• Independence and ownership. Sense of autonomy and accountability for each project; for
example, ensuring that their projects were on time and on budget.
• Writing skills. They were imaginative; for example, creating compelling stories and writing
engaging, scenario-based assessments.
• Creativity and innovation. For example, one of the candidates created a clever course that
included gamification called “Night of the Learning Dead.”
• Business results. Measuring the impact of the learning solutions they designed; for
example, shorter customer service calls through better performance support and the associated
savings per rep.
• Agile. They included sponsor users and designed iteratively; for example, releasing the
essential all-manager version of a course, followed by the essential all-employee version, and
then fleshing out a couple modules for each post-launch.

How Did We Interview Promising Candidates?


The team interviewed 16 candidates for the learning designer and senior learning designer roles.
Eight made it to the second round of interviews. One requirement we had of candidates was that
they lived in proximity to one of our hubs around the country, because we believe in co-location for
collaboration.
We were looking to fill experience and diversity gaps on our team, so we strategically planned to
mitigate bias in hiring as follows:
• Every candidate replied to the same core questions and we had a series of skills categories,
in which we scored the examples they shared on a 1 to 3 scale (1=low and 3=high).
• The first-round interview with each candidate was done by two people (most often by
WebEx). Any candidate who advanced to the second round was interviewed by my manager
and me. The pairs of interviewers always included a current senior learning experience
designer or stakeholder. I was the leader to whom the candidate would be reporting.
We nearly hired one of the premier candidates. The person was affable and skilled, and we all
felt a kinship with them. However, after the second round of interviews, my management and I realized
the candidate was too similar to the rest of the team in experience and demography. I was startled
that we came so close to offering the candidate a position, despite all our bias mitigation tactics. Ulti-
mately, we told the candidate that while their experience was deep, the team needed designers with
different experiences, especially in their ability to design consumer-grade digital learning.
We had three positions open, and we offered them to a total of six candidates. We got there too
late with one of them—another firm had snapped her up. That was a big lesson learned: Don’t linger
in what appears to be a strong market for job seekers. Two candidates came back and told us that
their management had made counteroffers they couldn’t refuse. Having that experience cemented our
impression that it was a candidate’s job market. Fortunately, three of the candidates accepted with plea-
sure—two were new hires and the third transferred to us from a different department in the company.
Our company culture heavily promotes continuous learning, especially to ensure that our
current workforce can upskill itself and successfully self-reinvent. With that focus in mind, we took a
calculated risk with a learning designer and transferred in a recent graduate of our HR Leadership
Development Program (HRLDP).
HRLDP is a cohort of our highest-potential HR professionals. We calculated the risk and deter-
mined it wasn’t giant for three reasons. The candidate:
• was an HRLDP graduate
• had created memorable informal learning experiences across Latin America
• came from a country that was among our company’s most successful last year.
We figured that the current team could learn best practices from him and that his experience would
in turn complement theirs.

Our Six Hiring Tips


Based on our experiences in hiring for the team, we offer six tips:
• Work with a premier talent acquisition professional. Our company is large enough
that we have them on staff. They pulled the best applications and pre-interviewed those candidates
for us. If any were promising, they were recommended for a first interview.
• If you’re with a smaller firm, bear in mind that you are closer to the ground
on what you need, which is a good thing. Remember, even great recruiters are not
necessarily experts in the learning space. So as the manager, you’ll need to do an additional
level of screening after the recruiter.
• Networking and always building a pipeline of potential candidates as a manager
is key. Managers can’t fully rely on recruiters and should always be recruiting too. For
example, I keep in contact with my best professors from my graduate program, and they
inform the pipeline for earlier-career hiring.
• When we met candidates, we asked the outside-in question first. One candidate for
the learning experience designer role said they used Quora and Reddit for reference, as well as
doing more traditional research. We liked hearing that they looked for workplace learning and
learning technology trends in modern sites that rely on crowdsourcing. Of course, we expect
our learning designers to be excellent at traditional research, but modern designers need to
make use of modern tools.
• Being with a mature company can be a selling point. Your candidate will join
a strong design community. When we talked with candidates about collaboration,
independence, and ownership, several said they were craving collaboration opportunities and
spoke of nearly total independence and ownership in their current roles—necessarily, because
they were all working in startups.
• If you are hiring three or more learning designers at once, and your company
has a continuous learning culture, consider building organizational capability
with an internal transfer who comes from a learning-adjacent role. Fill one of
the more junior roles with someone who has proven to be high potential and has succeeded
in prior learning-adjacent assignments. What do we mean by “learning-adjacent”? We mean,
for example, a diversity and inclusion stint where the candidate produced live, large-scale
employee events that would qualify as informal learning experiences when looked at through
the learning lens.

How Did We Onboard the Candidates Who Accepted Our Offer?


We had a terrific way to onboard learning professionals—or so we thought. I designed a moment-
by-moment learning plan for our new hires’ first week. The learning plan included our leaders’
all-hands replays, all-IBMer required courses, resources on design, links to training on the authoring
tools and our learning experience platform, and learning activities focused on diversity and inclusion.
It was action packed!
I chose a mix of content because I wanted them to succeed fast. And I included Diversity &
Inclusion offerings because inclusion and inclusiveness are among our culture’s values. New hires
needed to see that they made a good decision in joining us; just look at all the resources they had at
their fingertips!
The learning plan had a confident name too: “Success Accelerator for IBM Learning Experience
Designers.” I even created a companion agenda with a bit of context for the activities (Figure 10-3).

Figure 10-3. Day 4 Agenda for a New IBM Learning Experience Designer

Day 4: Thursday, July 11, Cambridge


8:30 a.m.–12 p.m. Orientation to how we build typical learning offerings. Review:
° Experience playbook (use .pptx template for any slides needed and also rely on icon library
and unsplash.com for photos)
° Learning bundles
° Learning plan builder and how to use it, plus how to use the YL service center
° Getting started with the open-source Adapt authoring tool
° Check out the badge criteria, including additional details required ahead of working on the
training and skills digital learning platform, based on Moodle
° Just FYI: IBM Web Publisher authoring tool
12–12:45 p.m. Lunch
1–3 p.m. Virtual onboarding class
3–4:30 p.m. Review and imagine how to use Watson Campaign Automation (and see the developer content)
to support leadership development learning. Continue completing the required 30-day learning
content in Essential and bonus links.
4:30–5 p.m. Debrief on day 4

Of all the lessons learned in this chapter, some of the biggest emerged during the onboarding process.
Chief among them was that during a new employee’s first week, learning is a two-way process—that is, if
you did a good job with hiring and you’re a willing listener. It’s vital to get feedback from your new hires.
You hired them for a reason!
The week 1 agenda included daily debriefs with their manager (me) to check in and hear impressions
of what the new employees were learning and answer any questions. By day 3, when we connected on
a videoconference call, Liz, the newest employee, looked discouraged. Could she possibly be the same
person who showed up on day 1 with a smile full of shining optimism?
“I feel like I’m a bit behind,” Liz said. “I’m not sure how best to prioritize my learning.”
As Liz described the impact of my action-packed plan, I realized I may have tried too hard. I’d
wanted to equip our new learning designers for success and give them everything they would need all
in one place. However, in this case “everything” was the problem. We talked further, and Liz kindly
offered to send me an alternative showing how it could have been designed more effectively.
I am glad she felt safe enough to make the offer. Perhaps she did so because the day prior, I had
assigned an IBM Archives podcast in which Thomas Watson Jr. addressed University of Pennsylva-
nia students on the importance of conflict in design. Liz brought it up, saying how happy she was that
he thought conflict was part of the mix in good design.
Maybe Liz also felt safe because I had been inclusive and encouraging up to that point and had
also said that in my experience at IBM, good ideas were welcome, and their implementation was
always a question of when, not if. Whatever inspired her, Liz went for it, and she is the author of the
onboarding analysis and tips that follow here.

Onboarding Rebooted: A Prototype for the Week 1 Learning Experience Based on Three Days at IBM
By Elizabeth Huttner-Loan

How might we offer an onboarding experience that helps a new team member acquire essential
knowledge, complete procedures, and connect with key team members, while also allowing the
new team member to have agency in terms of spending more time on certain pieces of training and
meeting an approved personal goal?
There are a few key questions we should ask:
• What is really essential? I can only answer from my extremely limited perspective.
• What is the right balance of read and watch versus activity-based training?
That will vary by learner.
• How can we prompt the new team member to save content they will likely
need in the future? Should we just put this content together, or would it be better to
invite the learner to compile it in a way that makes the most sense for them?
• Who are the stakeholders for this learning experience? Can we make the
stakeholders transparent to the new team member? It would be helpful to know more
about the ecosystem.
• How can the new team member collect and share reflections on their
learning experience?
• What kind of data should we collect about how the new team member
approaches the orientation experience and outcomes after completion?
What are the options for collecting this data?
I’ve noticed that there are different activity categories in the onboarding experience. However, there
is overlap between them, and it would be possible to categorize them differently (Figure 10-4).

Figure 10-4. Categories and Descriptions of Onboarding Experiences

• Setup and Logistics. This includes getting on the IBM network with a personal computer, configuring that computer, and obtaining ID badge access. Some setup, like benefits, can't be done in week 1. Some of this is more about the manager making sure the new team member is added to various lists and online resources.
• Meet IBM. This is about IBM as a company and includes information about corporate strategy, messages from the CEO, and IBM's approach to design.
• Meet Leadership, Learning and Inclusion. This includes viewing organizational charts, recordings of key meetings, and program overviews.
• Required by Law. This includes first 30 days' training, such as business conduct guidelines, sexual harassment prevention, workplace bullying prevention, and cybersecurity.
• Connect With IBMers. This includes meetings and interactions with team members. It is a separate category to distinguish it from more solitary tasks.
• Role Specific. This includes resources and learning experiences related to the actual work of learning design.
• Reflect. This is about considering all the onboarding content, reacting to it, and reflecting personally or with a manager.

Flow of Onboarding Experiences Over Week 1


These suggestions reflect a choice to have a more logistics-focused day 1 and a lighter schedule (Figure
10-5). We want to make sure the new team member connects with the manager as needed.
Key differences from the original flow include:
• There is time throughout to allow the new hire autonomy to schedule a quick meeting when
needed. This is something I wanted. For instance, I had a colleague offer to sit down with me to
explain some logistics.
• In general, there's more time for the new team member to reflect.
• Day 1 accounts for a long setup phase and focuses on IBM as a whole rather than drilling down.
• Day 2 has more of a focus on getting IBM requirements out of the way and introducing our
organization's mission.
• Role-specific work doesn't truly start until day 3. This is to make the onboarding process less
overwhelming.

Figure 10-5. Onboarding Experience Categories With Time Percentages

The new hire gets an idea of how much time in their day to allocate to the various categories, but has choice in terms of what activities to pursue within the categories.

Day 1
• 60% Setup and Logistics: Could include paperwork completion and laptop configuration
• 20% Connect With IBMers: Could include a welcome lunch with team members
• 20% Meet IBM: Could include watching presentations from the CEO

Day 2
• 10% Setup and Logistics
• 50% Required by Law: Could include completing required online training
• 20% Meet Leadership, Learning and Inclusion: Could include reviewing key all-hands replays and face-to-face or online introduction to upline leaders, depending on location
• 10% Connect With IBMers
• 10% Reflect: Could include generating questions based on experiences so far

Day 3
• 10% Setup and Logistics
• 20% Connect With IBMers: Could include a meeting with a more experienced team member
• 40% Role Specific: Could include learning applications needed to create learning experiences or whatever is essential for the role
• 30% Reflect: Could organize notes and then share thoughts and reflections with manager

Day 4
• 10% Setup and Logistics
• 20% Required by Law
• 30% Role Specific: Could include getting started on a project
• 20% Meet Leadership, Learning and Inclusion
• 10% Meet IBM
• 10% Connect With IBMers

Day 5
• 30% Role Specific
• 40% Choice: Could include answering emails, organizing their workspace, and revisiting any content that was confusing
• 10% Connect With IBMers
• 20% Reflect
Much of the material from the original flow fits, but a few things will need to shift to week 2. I think
it could be a good opportunity to revisit the concept of “essential,” specifically for week 1. How can we
make it even clearer what’s absolutely essential for week 1?
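For teams that keep onboarding plans in a script or spreadsheet rather than on a slide, here is a minimal sketch, in Python, of how the week 1 allocation in Figure 10-5 could be captured as data and sanity-checked. The category names mirror the figure, but the code itself is illustrative only and is not part of IBM's process.

```python
# Illustrative sketch: capture the week 1 allocation from Figure 10-5 as data
# and verify that each day's percentages add up to 100.
week_one = {
    "Day 1": {"Setup and Logistics": 60, "Connect With IBMers": 20, "Meet IBM": 20},
    "Day 2": {"Setup and Logistics": 10, "Required by Law": 50,
              "Meet Leadership, Learning and Inclusion": 20,
              "Connect With IBMers": 10, "Reflect": 10},
    "Day 3": {"Setup and Logistics": 10, "Connect With IBMers": 20,
              "Role Specific": 40, "Reflect": 30},
    "Day 4": {"Setup and Logistics": 10, "Required by Law": 20, "Role Specific": 30,
              "Meet Leadership, Learning and Inclusion": 20,
              "Meet IBM": 10, "Connect With IBMers": 10},
    "Day 5": {"Role Specific": 30, "Choice": 40,
              "Connect With IBMers": 10, "Reflect": 20},
}

for day, allocation in week_one.items():
    total = sum(allocation.values())
    assert total == 100, f"{day} allocates {total}%, not 100%"
    print(day, allocation)
```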

Summary
Successfully hiring and onboarding learning professionals involves finding designers with consumer-grade
skills to amplify and complement your current team's experience, especially when the team
is educating through storytelling and scenarios. In our experience, you can help mitigate bias by using
the same interviewing format for all candidates. Then you can pause and ask yourself if your favorite
candidates complement the team’s skills and diversity. If not, think twice before hiring them. Finally,
listen to your new employees’ points of view. If you did a good job hiring, they can take your mission
to new heights.
It is our hope that learning from my team-rebuilding challenge and Liz’s onboarding odyssey helps
learning leaders at organizations of any size hire the best learning designers and avoid onboarding
pitfalls. We were glad to share IBM’s job descriptions for the next generation of learning designers,
along with how we succeeded in finding candidates with the skills we needed, how we mitigated bias in
our candidate interviews and hiring, and how we onboarded them, as well as how we might have done
so even more effectively! We hope you appreciate the tips and encourage you to adapt them to your
own environment.

Key Takeaways
• The knowledge and skills needed by learning practitioners to excel today and be future-ready
are much more varied than in the past and include the science of learning, design thinking, and
understanding the ways technologies can support and enable learning.
• When a learning-team member suggests a new concept or solution, the leader might ask, "What does
the science [of learning] say about the effectiveness of this approach?"
• While most organizations have a unique recruiting and onboarding process, it is important to
periodically benchmark with other companies to have a firsthand glimpse of what they do.

Questions for Reflection and Further Action
1. What processes does your learning team employ to ensure new hires can significantly enhance
the overall team skills and portfolio?

2. How might you incorporate some of the ideas from the process and lessons learned in this
case study?

3. How frequently does your team review and update the descriptions you use to hire new learning
practitioners?

11
Enabling Continuous Learning:
Tools and Approaches to
Make It Happen
Laura Solomon and Caroline Fernandes

Don’t we all hope that the learning we design and develop will be valued and applied after it is first
experienced? Unfortunately, despite our best intentions, after the course is completed, the whitepaper is
read, or the performance support tool is tried, it is forgotten. Sending reminders or establishing
requirements are just two of the typical approaches for reinforcing the value and applicability of a learning
asset. It’s a universal challenge for learning professionals, and one that many continue to address with
varying levels of success.
As many organizations focus on skills and reskilling, the need for continuous learning has gained
significant importance. The motivation to make development a priority is a shared responsibility between
an individual and their organization and is a topic for its own book.
This chapter focuses on the tools and strategies for continuous learning that our global company uses
to inspire transfer and retention as well as application and sustainment. Also included are examples of
how some of these tools could be adapted by other learning professionals.
Each description of the tools and strategies outlined here includes:
• the purpose of the tool or strategy
• when to use it

• when not to use it
• examples of how this tool or strategy can be used.
The tools are organized alphabetically so you can easily reference them.

Badging
Badges are an effective way to encourage learning designers to increase their skills and continually
grow their capabilities. They’re also a way for individuals to validate their learning activities, skills,
and achievements, such as completing courses, publishing papers, creating apps, or achieving other
significant and measurable accomplishments. The Open Badges organization has established standards
and processes that enable individuals to earn badges and to display any they have earned externally.
The concept of Open Badges dates back to 2010 and the work of the Mozilla and MacArthur
foundations, as well as the research of Erin Knight, founding director of the Open Badges project
at Mozilla. The initial intent was to recognize that learning happens in different ways and that
there are new skills and literacies that are important in modern society. The Open Badges website
describes badges as "visual tokens of achievement, affiliation, and authorization that are shareable
across the web. They represent a more detailed picture than a résumé as they can be presented in
ever-changing combinations, creating a constantly evolving picture of a person's lifelong learning."
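For learning leaders who partner with technical colleagues to stand up a badge platform, the sketch below shows, in Python, roughly what the metadata behind a single earned badge might look like, loosely modeled on the Open Badges idea of a badge definition plus a record for one earner. The badge name, issuer, addresses, and criteria are illustrative placeholders, not drawn from any specific platform or from IBM's program.

```python
# Illustrative only: a hypothetical badge definition and earner record,
# loosely following the Open Badges idea of a badge class (what can be
# earned) plus an assertion (who earned it and when).
badge_class = {
    "name": "Enterprise Design Thinking - Practitioner",  # hypothetical badge name
    "description": "Applied design thinking practices to a learning design project.",
    "issuer": "Example Learning Organization",             # placeholder issuer
    "criteria": "Complete the course and submit a project reviewed by the badge team.",
}

assertion = {
    "recipient": "designer@example.com",        # placeholder earner identity
    "badge": badge_class,
    "issued_on": "2020-01-15",
    "evidence": "https://example.com/evidence/123",  # link to the earner's work
}

# Because this metadata travels with the badge, anyone viewing it can see what
# was earned, who issued it, and what the earner did to qualify.
print(f"{assertion['badge']['name']} issued to {assertion['recipient']} "
      f"on {assertion['issued_on']}")
```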

Elements of Badging
Organizations can issue Open Badges and individuals earn and display them. The elements of badging
include:
• specific skills, knowledge, or achievements that warrant certification
• standards and processes for earning badges
• a team to administer the badging process
• a platform or infrastructure to deploy the program.

When to Use Them


Badges are useful when you have a skill or an activity that requires a substantial amount of effort to
accomplish and you want to provide a way to validate achievements and give internal and external
recognition for effort.
To establish a badge program, consider partnering with an organization such as Pearson VUE
Acclaim, which can provide a digital badge program administration platform. Pearson’s Acclaim was
acquired by Credly in 2018. According to the Credly website, “the Credly Acclaim platform provides
a comprehensive global solution for recognizing skills, capabilities, and achievements.”

An internal team should be responsible for creating standards, as well as a process and plan for
administering the many aspects of a badge program, such as reviewing and vetting badge proposals and
working with the organizations or teams within your company that want to offer badges.
Badges are also a good way to motivate or incentivize learners to complete a course, develop a skill,
or learn a body of knowledge. They incorporate aspects of gamification that encourage people to progress
through activities and can be displayed on an organization’s profile pages and LinkedIn profiles, among
others.
Badging provides a way for an organization to measure and track completion of a course or curriculum,
as well as recognize those who have earned those credentials. Badges also allow organizations to
identify a person with a particular skill or subject matter expertise they may need to hire or access for
assistance with a client or project.

When Not to Use Them


Badges are likely not a good investment when the skill or activity is not robust enough to warrant
certification, or when it lacks value or recognition outside your team or organization. Badges are also not
recommended if the requirements for attaining them are likely to change.

Examples From the Field


Our organization currently offers more than 50 badges on a range of topics, most of which focus on
technical skills. We also offer a badge for learning designers on the topic of enterprise design thinking. Within
this topic, there are different levels of certification that can be achieved, such as practitioner, advocate,
leader, coach, and co-creator.

Bundles
Bundles provide a unique way to package learning content. They incorporate gamification elements to
drive completion while enabling progress tracking. Bundles are a form of e-learning in that they're
a digital way to engage in content, but they apply a particular format and incorporate certain elements
that make them unique. While they do share some of the same elements as a learning journey or learning
path, there is more flexibility around the order in which topics are consumed and additional features that
promote engagement.

Elements of a Bundle
The different elements of a bundle include:
• Subtopics. Also referred to as activities, these divide the content into smaller consumable
chunks, ideally between five and 15 minutes.
• Images. An image or graphic for each subtopic engages the learner and provides a
memorable way to retain and reference the content.
• Introduction. While there is typically an order in which the topics should be consumed
and an overview that introduces them, the learner can often tackle a topic in any order
they choose.
• Content. Typically, the content within a subtopic consists of brief sections of text
interspersed with graphics, videos, and reflection questions to keep the learner engaged.
• Points. Each subtopic or activity is assigned a specific number of points, which tend to be
based on the amount of time it takes to complete that activity. For example, a five-minute
introduction activity could earn you 100 points, while a 15-minute activity is worth 300 points
(see the sketch after this list).
• Duration. Provides the estimated amount of time it should take a learner to complete a topic.
• Progress tracking. Once a learner views a subtopic, an image of an eye appears on that
activity. When they complete that subtopic, a check mark replaces the eye. When a bundle is
part of a larger collection of bundles, a progress bar may indicate the learner’s rank based
on the number of points they have accumulated after completing each activity. Progress
tracking may even be connected to an organization’s overall initiative to drive learning. An
example of this is connecting it to a program that tracks the amount of time each team
member spends learning and then assigns credit or points based on their completion.
• Encouragement. Progress tracking helps encourage learners to complete activities, as do
encouraging comments such as “I’ve checked it out!” “Congratulations!” and “Let’s keep
going!”
• Assessment. Bundles typically include some way for learners to test their knowledge of
the topic. In some cases, learners answer several multiple-choice questions at the end of a
subtopic, which they can keep taking until they get them all correct. In other cases, there is
a separate subtopic for testing learners' knowledge. We created one called "Show what you
know,” which includes 12 scenarios with multiple-choice questions. The learner is required
to get at least nine of the 12 questions correct, but they can also take the test as many
times as they want until they pass.
• Bundle topics. We create and use bundles to deliver many kinds of content to a large and
global audience. One example is an enablement program for learning professionals across the
organization, but we also use bundles to target selling skills, coaching, resilience, and agile ways
of working. (Figure 11-1 is an example of a bundle we created for learning consultants and designers.)

Figure 11-1. Testing Your Knowledge
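To make the points element above concrete, here is a minimal sketch, in Python, of how points and progress might be tallied. It assumes a simple scale of 20 points per minute, which matches the five-minute and 15-minute examples in the list; the activity names, durations, and scale are hypothetical, not taken from our platform.

```python
# Illustrative only: tally points and progress for a bundle. The scale of
# 20 points per minute simply matches the examples in the text
# (a 5-minute activity = 100 points, a 15-minute activity = 300 points).
POINTS_PER_MINUTE = 20

# Hypothetical subtopics with their durations in minutes
activities = {
    "Introduction": 5,
    "Meaningful metrics": 15,
    "Show what you know": 10,
}

def points_for(duration_minutes: int) -> int:
    """Points awarded for completing an activity of the given length."""
    return duration_minutes * POINTS_PER_MINUTE

completed = {"Introduction", "Show what you know"}  # activities the learner finished

earned = sum(points_for(activities[name]) for name in completed)
total = sum(points_for(minutes) for minutes in activities.values())

print(f"Progress: {earned}/{total} points ({earned / total:.0%} of the bundle)")
```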

When to Use Bundles
Bundles are effective for providing online content around a topic that learners can consume at their
own pace and access when needed. Bundles can also be used to deliver courses that track learner
progress and completion.
Note: A bundle differs from a toolkit (covered later in this chapter), because it is typically focused
on a more specific topic, with subtopics or activities that follow a consistent format and allow for
tracking.

When Not to Use Bundles


Avoid using a bundle when the learning content requires live or face-to-face interaction, skill practice, or
synchronous collaboration. While a bundle is more easily updated than an online toolkit, avoid creating
one if there will be frequent changes to the content.

Examples From the Field


We use bundles to deliver an enablement program for learning professionals across the organization.
The program includes an array of content to help learning consultants and designers stay relevant,
innovative, and successful. The program covers approximately 16 hours of content on disruptive
learning technologies, aligning learning with corporate strategies, applying digital to learning,
measuring the impact of learning, learning best practices, and providing valuable experiences for a
client, among others.
The bundle’s core topics are:
• learning’s disruptive transformation
• know our business
• meaningful metrics
• learning that works
• creating value.

Discussion Guides
Discussion guides are learning assets that enable a person to share what they learned with others.
These guides highlight key concepts to engage people in discussions for sharing insights, best practices,
lessons learned, and ways to apply them. Studies such as those covered in the May 2018 edition of the
British Psychological Society's Research Digest have shown that teaching others is an extremely effective way
for people to increase their own understanding and retention of what they have learned (Jarrett 2018).

Elements of a Discussion Guide


Discussion guides typically include:

• Facilitator notes. Because the person using your discussion guide isn’t typically a
trained facilitator and may not even be a subject matter expert, we provide easy-to-follow
guidance for leading the session, such as how to introduce the topic, key concepts to present,
questions to engage attendees in discussion, and ways to close the session to maximize
engagement and learning impact.
• Slides. Slides should be minimal because the primary focus is on discussion and not content
presentation. However, slides can be useful for guiding the facilitator, visually highlighting key
points, and keeping participants engaged, especially if the session is virtual.

When to Use a Discussion Guide


Consider creating a discussion guide when you want someone who has completed a course or workshop
to reinforce what they learned by teaching or sharing it with others.
Discussion guides can also be used when you need to cascade information from a certain group of
individuals to a larger audience, such as when learning directors, managers, or supervisors attend a
program and need to impart the information to the designers and developers they lead.

When Not to Use a Discussion Guide


Avoid using discussion guides when the content or skills you need people to learn require a trained
facilitator or subject matter expert. Also avoid using them if there is more content than can be covered
within an hour, unless you plan to schedule a series of discussions over an extended period of time.
Discussion guides are not the best tool to use if the learning content requires more than just a
discussion, such as skill practice or coaching.

Examples From the Field


We used discussion guides as part of a large-scale global initiative to give leaders who participated in a
workshop the opportunity to engage their team members in rich conversation about key topics. We posted
these guides online so leaders could easily access, download, and print them as needed. Each one had
three sections:
• Prepare
° scheduling the meeting (such as timing, face-to-face, or virtual)
° selecting the topic to discuss
° collating links to the learning assets to review and prepare.
• Discuss
° providing a brief overview of the topic using facilitator notes and slides.
• Continue
° adding a link to enroll in the workshop so leaders could easily offer it to their team members.

Learning Design Guild
Learning design guilds are groups of peer learning designers who create a plan and schedule for
collaboration and sharing. They may also take turns presenting their work to one another to exchange ideas,
increase awareness of assets for reuse, build a repository of resources, and strengthen their network of
subject matter experts.

Elements of a Learning Design Guild


A learning design guild includes the following elements:
• a group of peer learning professionals who are interested in participating and sharing
• an individual who is willing to organize the guild and administer the meetings
• a plan or an agenda for meetings to include frequency, length, and whether they’re face-to-face
or virtual
• engagement that involves sharing ideas, resources, and help.

When to Establish a Learning Design Guild


A learning design guild is a great approach when you have a significant number of learning designers who
are interested and committed to participating as well as contributing. They can be especially valuable in a
decentralized organization where designers have little or no opportunity to work together.
One key to administering a guild is having a substantial number of contributors to generate enough
ideas and assets to share. Also critical is an individual or two who are willing to create, market,
launch, and lead the learning design guild for your organization.

When Not to Establish a Learning Design Guild


Avoid starting a learning design guild if you do not have a significant number of interested and committed
designers or someone to maintain the effort over time.

Examples From the Field


We created a Quarterly Guild Playback series for learning designers from across our organization to learn
about new offerings from fellow learning designers. Each playback session features three learning
designers who take about 15 minutes to share an overview of their offering. While the primary purpose is to
share new offerings, these sessions also give designers exposure to one another and access to subject matter
experts whom they can reach out to if they have questions or want input on a related project.

Lunch & Learns


Lunch & learns are informal gatherings of peers during lunch time to increase knowledge about a topic.
They are typically, but not always, positioned as a means to learn from one another. While peers may
rotate responsibility for teaching others about a specific tool, technique, or topic, the organization may
also invite subject matter experts from within or outside the organization to present.
Different from a watercooler session (covered later in this chapter), lunch & learns have a specific
topic, a learning objective, assigned facilitators, and time for questions.

Elements of Lunch & Learns


What makes some lunch & learn efforts effective while others seem to die out after the first session or two?
A few elements are key to success and longevity:
• current and relevant topics that participants care about
• a structure or an agenda that maintains the focus of the session while providing opportunities
for engaging participants
• a subject matter expert or presenter (this can be an internal or external thought leader or one
of the participants)
• a skilled facilitator to guide the discussion, engage participants, and ensure learning objectives
are met (this role can be rotated among the participants)
• an individual who “owns” the offering and takes the lead to schedule and administer the
sessions.
Lunch & learns can be face-to-face, virtual, or hybrid (a combination in which some participants are
face-to-face while others participate remotely). Virtual or hybrid sessions should provide an opportunity
for participants to see one another through video using one of many virtual collaboration tools (such
as WebEx, Zoom, or Skype).

When to Use Lunch & Learns


Consider using a lunch & learn when you have a substantial group of peers (ideally more than six) who
are interested in and committed to learning about new topics, and an individual who is willing to take the
lead to coordinate the sessions. This is also a great way for team members who have participated in an
external conference or learning session to bring back their new knowledge and share it with their team.

When Not to Use Lunch & Learns


Simply put, don’t host lunch & learns if you do not have a large enough group of peers to warrant the
investment, especially if you plan to include external subject matter experts. It may be difficult to recruit
an individual to give up their time when only a small number of people will benefit. Similarly, if you plan
to focus on peer-to-peer learning, individuals will also want to feel like the amount of time they invested to
prepare and present was worth it. This can be an issue if you expect a significant number of last-minute
no-shows, especially if your group is already small. Additionally, if the group is too small, it will be difficult
to have a robust discussion and range of perspectives.

Examples From the Field
While a lunch & learn can focus on any topic, we offered a session on the basics of learning design for
learning consultant team members in our organization. Although not tasked with the actual design of
learning solutions, they are most often the first point of contact for a learning request, such as “We need a
30-minute workshop on leadership.” The hour-long session focused on the importance of asking the right
questions to clarify the objectives of the requested solution and setting expectations for what is achievable
within the allotted time.

Microlearning
Research shows that receiving digestible, relevant learning content that you can apply right away can drive
more than 20 percent additional retention, which is an essential ingredient for changing behavior.
Microlearning organizes content into digestible chunks so that it can be quickly consumed, often
within five minutes. It is typically understood to be composed of relatively small learning units and short-
term learning activities that are most often experienced as e-learning. After conducting a poll and research
study, ATD Research concluded that a microlearning activity was most effective when it lasted between
two and five minutes.

Elements of Microlearning
Microlearning typically includes:
• content that is designed in small and easily consumable learning units or activities
• a delivery system for deploying or accessing the microlearning content (such as through email
or a website)
• a marketing strategy to promote usage and engage learners.

When to Use Microlearning


Microlearning can be used in different ways:
• Before training. It can be used prior to a live class or webinar to introduce learners to the
content or to reserve valuable face-to-face or synchronous time for live interaction, discussion, or practice.
• During training. Offering microlearning between the sessions of an ongoing course or series
can help maintain the momentum and retain learners while reinforcing the concepts or content
they learned and preparing them for the next session.
• After training. Distributing microlearning after a course can reinforce what was learned,
providing guidance for applying behaviors and skills as well as links to resources for more
in-depth material on a topic.
• As a stand-alone solution. Microlearning may be used on its own with topics that
are small and digestible, or for courses with topics that can be broken into a series of
microlearning units that are distributed over time.

When Not to Use Microlearning
Avoid microlearning when the content you need to deliver requires more than five minutes to share.
In these cases, the learning objective will be compromised.

Examples From the Field


We used microlearning after a three-hour global leadership development workshop to reinforce
what was covered in the session, encourage application, provide additional content, and maintain
the momentum of the initiative the workshop supported. For the 10 weeks following the workshop,
learners received an email that contained content they could consume within five minutes. The final
microlearning was titled “Keep the Momentum,” and included a link to a website that served as a
library of all the microlearning they had received, to review and reference as needed.
Each microlearning unit built on the topics covered in the course and contained the following
elements:
• a catchy title that included the learner’s name and highlighted the amount of time it
would take to consume the content; for example, “Take five minutes to create better team
interactions, be a more resilient leader, or set a tone for extraordinary results”
• a two- to three-minute video of an exemplar or thought leader describing how they apply
that behavior or skill
• three to five key points to remember or practical actions they could take now
• a link to an article where they could learn more about the topic.
While sending content via email incurs the challenge of getting the learner to open it, we used
Watson Campaign Automation (WCA)—an IBM tool that leverages artificial intelligence and
advanced analytics—to personalize the distribution of our microlearning content. WCA allowed us
to select the best time and day of the week to send individual emails using its send time optimization
feature.

Online Toolkit
An online toolkit is a way to offer learning assets or performance support tools about a topic that
is easily accessible and consumable. While a bundle packages content that targets a set of learning
objectives oriented to a particular curriculum, a toolkit organizes a variety of resources that the
learner can use when needed (for example, habits, reflection questions, videos, articles, books).

Elements of an Online Toolkit


Toolkits typically include:
• learning assets that enable or support the performance of a set of skills or behaviors
• a platform for accessing the assets.

When to Use an Online Toolkit
Online toolkits are useful for providing online content about a topic to a global audience that needs to
access it at any time. Toolkits work best with content that is not intended to be consumed in a
particular order at a particular time, such as performance support tools. Topics such as resilience,
engagement, and feedback could be well served by a toolkit that provides different types of assets, such as
guidance for applying a skill, ideas for habits, in-depth content, or templates for planning.

When Not to Use an Online Toolkit


Toolkits aren’t useful if the learning content requires live or face-to-face interaction, skill practice, or
synchronous collaboration. Also avoid creating an online toolkit if your content is likely to change
often, as it could be challenging for your developers to update it on an ongoing basis.

Examples From the Field


We created and deployed an online toolkit to a large global audience of leaders to help them increase
employee engagement. While it is a broad subject, we used data from our annual engagement survey to
identify drivers of high engagement and built the content around those drivers. Our toolkit addressed 10
topics and each topic focused on two subtopics; for example, the topic “connect” addressed the subtopics
of “empathy” and “transparency.”
A toolkit for learning designers might be framed around ATD’s Capability Model, ADDIE, or even
an assessment that you have conducted to determine the knowledge and skills gaps of your target audience.
• If the toolkit is based on ADDIE, it may have six topics: overview, analysis, design,
development, implementation, and evaluation.
• Subtopics for design might be synchronous, asynchronous, self-paced, face-to-face,
development tools, or design strategies.
Our toolkit on engagement included several different elements. The following examples show how
you can apply each of those elements to a toolkit for designers and developers:
• Essence. A brief description of the topic and its relevance to learning designers.
• Video. Exemplar learning designers share stories and examples of how they addressed
the topic.
• Think and reflect. Questions that trigger insight and encourage the user to identify with
their own situation. For example, “Which of these approaches might you consider for your
design project?” or “What challenges do you face to make continuous learning a priority and
how do you address them?”
• Tips and habits. A quick reference of actions to take and habits to adopt. For example,
“Schedule time on your calendar to read a current article or speak to a peer designer
about their project" or "For each of my designs, I will engage at least one thought leader
from outside my organization to consider different approaches and perspectives, as well as
challenge my assumptions."
• Go deeper. Links to additional resources for more in-depth information (such as articles,
websites, tools, and books).
In addition to the 10 topics and content on engagement, our toolkit included many different
assets. The following examples show how you can apply each asset to a toolkit for learning designers
and developers.
• Kickoff or introduction video. A two-minute video by a senior learning leader in your
organization who introduces the toolkit or initiative on continuous learning and inspires
others to participate.
• Assessment guidance for where to start. Provide direction for where to start or
which topics to focus on first. Two options to consider:
° If you already asked designers to complete an assessment to determine their knowledge and
skills gaps, provide a link to the survey results and use the data to suggest certain topics
that would best help them improve.
° If you did not offer an assessment or the designer did not have results, they could access
a self-assessment with questions about the extent to which they agree with a set of
statements. This would provide the insight needed for determining where to start.
• Build the case. Slides on the topic with notes to present to your team.
• Cards. Small cards that contain sentiments and reflection questions that learning leaders
could print and use to trigger ideas and engage their teams in rich conversation about
continuous learning.
• Co-creation. The opportunity for designers to add their own examples of what worked
for them (for example, making continuous learning a priority or designing a novel learning
solution for a client request).
• Discussion guides. Slides with facilitator notes that learning leaders can use to lead a
discussion with their teams. (See the discussion guides section of this chapter.)
• Guidance for using the toolkit and tools. A video demonstrating the toolkit showing
how to use the different elements.
• Key moments recommendations. Recommendations for moments identified by data
that have a high impact on engagement, such as an employee who has been in the same
job longer than two years or someone who hasn’t received feedback from you in the past
six months or more.
• New content notifications. An ability to subscribe to notifications that would let a
designer know when new content was added so they could stay up-to-date.
• Manager enablement. Resources to enable senior learning leaders to cascade the
information to the leaders they manage and their teams.

• Survey app. This enables a learning leader to send a five-question survey to their
teams at any time throughout the year to get input on the key drivers of continuous
learning.
• Workshops. Two-hour workshops delivered face-to-face or virtually that focus on
helping learning leaders engage designers and increase their knowledge and skills on
key topics.

Peer-to-Peer Development
Also called coaching circles, peer-to-peer development provides opportunities for small cohorts of
learners to share their experiences of applying the behaviors and skills they learned in a class or
workshop as a way to reinforce concepts, deepen learning, and gain insight. Individuals may also use this
safe and supportive environment to practice skills through role play for coaching and feedback. Cohorts
engage in a series of sessions over a set period (for example, once a week for six weeks). This approach
enables participants to develop the comfort and trust they need to be vulnerable and open and get the
full benefit of the experience.
How often have we worked hard on designing and developing a learning program only to receive
participant feedback that the best part was being able to connect with peers? Clearly, colleagues who
share similar experiences and challenges tend to be the best source for identifying options and sharing best
practices. That is the principle behind peer-to-peer development.

Elements of Peer-to-Peer Development


Peer-to-peer development includes the following elements:
• Skilled facilitator. While the primary focus is on peer-to-peer sharing, it’s important to
have a trained and experienced individual facilitating the sessions. While the facilitator does
not need to be an expert on the topics being discussed, they do need to be skilled at creating
a safe space for meaningful sharing, which includes engaging participants, maintaining
focus and boundaries, and asking good questions that encourage insight. A key challenge for
many facilitators is to avoid providing solutions or giving advice, instead encouraging this
input from the group. An option to consider is providing a facilitator for the first three to
four sessions to model the role and demonstrate how an effective peer development group
works. Then by the fifth session, you can pass the facilitator role to a group member (or
rotate it among group members), freeing up the facilitator to launch another circle or focus
on another project.
• Session structure. Although this approach may seem unstructured because participants
determine the actual content and flow of the discussion, a certain amount of structure is
needed to set the session topic, length, and participants to include.

• Pre-work. You may also choose to provide pre-work before each session. Pre-work can
be used to focus the discussion and trigger ideas. The pre-work could be an article, a self-assessment,
a video, a reflection question, or a suggested action (for example, interview a
subject matter expert on a skill you want to develop and note the behaviors you believe
were most effective and plan to try). Whatever the activity, consider limiting the pre-work
to 15 to 30 minutes and providing adequate time for completion, ideally a week before
the session.
• Ground rules or group norms. It’s also important to establish ground rules or group
norms that ensure confidentiality and set expectations for how participants will interact.
Confidentiality is usually nonnegotiable, but the facilitator may ask participants to
determine the code of conduct and suggest behaviors they believe will foster the greatest
participation, benefits, and outcomes.
• Cohort size. Most papers outlining the benefits of peer coaching and coaching circles
say six to eight is the ideal number of participants. You want enough individuals to have a
range of ideas and perspectives, yet not so many that people are denied the opportunity to
contribute. While participants who just listen will also benefit from the discussion, the value
is significantly greater if they can ask questions and get feedback about their own situations.
• Face-to-face or virtual. There are pros and cons to either option, and both can be
quite effective. If participants are all co-located and you have a designated private meeting
space, then it makes sense to meet in person. However, if this is not possible, virtual is
extremely effective if participants can see one another using a webcam and tools such as
Zoom, WebEx, or Skype. The advantage of having remote sessions is that you can include
individuals from a wide range of locations while considering the time zone and common
language. The diversity you gain by including individuals from different geographies often
creates a richer conversation and greater insight, due to their unique perspectives. Avoid a
hybrid approach where some participants are face-to-face while others are virtual, because
it is extremely difficult to engage everyone equally.

When to Offer Peer-to-Peer Development


Peer-to-peer development is a useful approach when the content has already been taught or learned
and there is value in having peers share their experiences, best practices, and lessons learned.
Providing a safe place to role play and practice interpersonal skills, along with peer coaching, is
another reason to use peer-to-peer development.
An example would be after a face-to-face or virtual leadership development class or workshop.
The approach provides participants with the opportunity to go more in depth on certain topics,
engage in discussions about how they are applying skills and behaviors, and share lessons learned.

When Not to Offer Peer-to-Peer Development
Peer-to-peer development should not be considered if the facilitator or participants are unable to commit
to joining at least three sessions. While a group of peers may suggest meeting on their own after a
particularly engaging program, the commitment tends to wane over time without true focus and structure.

Examples From the Field


We used peer-to-peer development after a three-day leadership program for senior managers. After the
program ended, alumni were invited to enroll in a six-week peer-to-peer development experience. The
session facilitator was an experienced coach who also facilitated the leadership program that participants
attended. Each session focused on a different topic from the program, and participants were encouraged
to share how they were applying what they learned with their teams.
We also used peer-to-peer development to engage HR specialists in a four-session experience that
focused on three specific topics related to trust-based relationships, detecting crossing-the-line behavior,
and having difficult conversations. The first three sessions took place a week apart and were preceded
by 15-minute pre-work assignments. The fourth and final session was held several weeks after the third
session, which gave participants ample time to apply what they had learned.

Webinars
Webinars, also known as live virtual classrooms, provide synchronous learning around a particular topic
using technology to engage participants from remote locations. While webinars can be used to deliver
core curriculum when you can’t gather participants in the same location, they can also be used for
continuous learning as part of a blended solution. There are many different types of webinars—from
a virtual lecture to a highly interactive learning experience where the participants engage in a range
of activities (such as idea sharing on a whiteboard, discussions, small group problem solving, and skill
practice in breakout rooms).

Elements of a Webinar
Webinars typically include the following elements:
• topics of interest
• subject matter experts presenting or facilitating the topic
• technology that allows participants to see each other (such as Zoom or WebEx)
• an agenda and a design that engages learners and promotes learning.

When to Use a Webinar


Use webinars when there is a topic of interest in which learners can benefit from live interaction with
a subject matter expert. Webinars are also useful when participants are not co-located and distance

Enabling Continuous Learning | 149


or travel restrictions make it difficult to bring them together for a face-to-face experience. If opting for a
webinar, ensure that the experience is highly interactive. Most virtual tools offer features to engage the
learner—such as polling, breakout rooms, and whiteboards where they can brainstorm, post a response,
and even draw. However, what’s most important is the ability for participants to see one another.

When Not to Use a Webinar


If you’re unable to bring participants together at a time that’s convenient for all due to different time zones
or schedules, webinars may not be an option to consider. If the session's objectives don't require live
interaction with an expert or peers, then it may not make sense to invest in a webinar. Webinars may also not be an
ideal option if you have a lot of content that can’t be consumed in one- to three-hour sessions. It’s difficult
to keep participants engaged in virtual sessions that are longer than this, even if you include a break. If
you have more than three hours of content, consider chunking it into a series of consecutive sessions over
a few days. You could also use the webinar for core content, and then provide the rest of the content with
self-paced solutions.

Examples From the Field


We used webinars, complemented by an online community, to help facilitators (primarily) learn
the fundamentals of learning design. The program consisted of six hour-long live virtual sessions, which
included such topics as:
• introduction, needs analysis, and learning effectiveness
• personas, learning solutions, and objectives
• content curation
• presentations, assessments, and feedback
• PowerPoint presentations.
Participants were also added to an online community in which they could view the program structure
and session agendas, download pre-work and post-session materials, access recorded versions of the
sessions, and submit assignments. Within the community, participants could also engage in discussion
forums on topics they learned during the live sessions. The community continues to be open, so participants
can review the materials at their own pace and at any time. It is also open to any learners who want to brush
up on learning design concepts.

Watercoolers
Watercoolers are informal gatherings of peers with no specific agenda. They provide an opportunity for
individuals to come together either face-to-face or remotely to discuss current issues, share ideas, and ask
for feedback or help. Although not the primary objective, watercoolers also serve to strengthen networks.
Participants engaging in watercoolers usually have a common bond, such as all being learning professionals.

Elements of a Watercooler
Watercoolers typically include:
• a group of peers with a common interest who see value in coming together on a regular basis
(such as an hour each month or bimonthly)
• an individual who is willing to take on the role of coordinator to schedule sessions and
facilitate as needed
• a space to either meet face-to-face or a platform that lets participants see one another (such
as WebEx or Zoom).

When to Use Watercoolers


Watercoolers are useful when you have a substantial number of peers who have a limited opportunity
to interact, because of either proximity or assignments, but could benefit from connecting on a regular
basis.
Having a substantial number of peers who are interested in and committed to participating (for
example, six or more people) is even more important than for a lunch & learn, because watercoolers rely
on the participation and input of attendees, whereas lunch & learns focus on a topic presented by an
individual. Watercoolers also need an individual who takes responsibility for organizing the events and
someone to facilitate the sessions to keep them engaging and beneficial.
A key challenge of watercoolers is the perception that they are a “nice-to-have option when I have
extra time to attend," which most people don't have. Instead, participants should believe that the
information and relationships gained from attending watercoolers are useful to their work and help them
be more effective. If that doesn't occur, attendance tends to decline and sessions get canceled.

When Not to Use Watercoolers


When you are lacking enough interested individuals to participate, it’s likely not the right time to initiate
watercoolers.

Examples From the Field


We established a monthly virtual watercooler series in which learning designers from across the company
come together for a 45-minute WebEx meeting. The purpose is to give learning designers an opportunity
to do the following:
• bounce learning design ideas off one another
• bring thorny learning design challenges to the group for advice
• ask lingering questions from the quarterly learning design guild playback series
• be reminded that we belong to a mighty community of learning designers across the
organization.

Summary
While organizations wish that their employees would be intrinsically motivated to continuously learn and
apply what they gain from the education they experience, the reality is that it often requires enablement
through tools and strategies.
Even learning professionals—who should be the most knowledgeable about the importance and
benefits of continuous learning—are often at risk of putting their own development on the back burner
due to work constraints and competing priorities. The tools and approaches described in this chapter offer
a range of options for engaging with others in continuous learning, individually or in cohorts, remotely
or co-located, and synchronously or asynchronously. Consider the unique needs of your learners and the
characteristics of your organization to identify the tools and strategies that would provide the greatest
impact. Then try one or two or more!

Key Takeaways
• Continuous learning is more important than ever as organizations are challenged to ensure their
employees have the skills needed to address current and anticipated customer expectations.
• Consider these tools and strategies to inspire and enable the learning professionals in your
organization to engage in continuous learning:
• badging
• bundles
• discussion guides
• learning design guilds
• lunch & learns
• microlearning
• online toolkits
• peer-to-peer development
• webinars
• watercoolers.

Questions for Reflection and Further Action


1. What learning programs do you have that could be enhanced by adding one of these continuous
learning approaches?

2. Which of the approaches listed would work best for the needs of your own team?

3. Which approaches have you used in the past? How have they worked? How do they differ from the
way they are listed here?

4. How might you apply or customize one of these approaches to promote your own continuous
learning?

12
Develop Your Team to
Develop Their Team
Alissa Weiher With MJ Hall

Think for a moment about the role of the learning function. Our goal used to be gaining a seat at the
table, and we were thought of as operators for the 1-800-TRAIN calls from managers. However,
the world of work has changed, and with it, so has what the organization expects and receives
from the learning team. With our expertise in modern learning concepts (especially those related to
the science of learning), the ability to incorporate new technologies, and a focus on alignment with
the business goals and objectives, the learning team is now a key catalyst for successful change efforts.
We are seen as designers of myriad types of learning experiences enabling new capabilities for the
entire enterprise.
This move was partly the result of our functional domain morphing from a focus on training to
learning and then to performance. Another aspect comes from research by Mary Broad and John
Newstrom in the early 1990s, which indicated that the responsibility for improving the transfer of
training from the classroom to the job must be shared by managers and trainees, as well as trainers.
Further research, including Mosher and Gottfredson’s complete journey, depicts learning, transfer, and
sustainment components, and again places much of the responsibility for moving from learning to
performance on the manager.
While this concept has talent departments creating learning experiences to teach and support
managers in their role of developing people, what is the implication for managers in the learning
function? Not only do they serve as the enterprise-wide domain experts in learning, but, as managers
and leaders, they also are responsible for developing their own team—and modeling excellence as
people development leaders.
There is an old adage that the cobbler’s children wear no shoes. Too often we hear this being
applied to talent development professionals. We spend so much time and resources investing in
growing the knowledge, skills, and abilities of others, we often fail to make the same investment
in ourselves and our team. The problem with this is that to stay current and effectively address the
evolving needs of our corporate employees, our team needs to access current research, technologies,
approaches, and options. Chamorro-Premuzic and Swan (2016) remind us that what enables success
today will not enable success in the future, so it is imperative that we are agile in our willingness and
ability to learn and prepare for the future. It is exceedingly important for our TD professionals to
ensure they can continue to shape and evolve future learning experiences. As managers, we need to
create an environment for our team to share and develop individual expertise in different domains
and topics. While we need to coach our teams and emphasize their professional development, we
also need to invest in our own personal development and model self-directed learning. And finally,
we need to invest time in branding the members of our department as experts in their profession.

Creating an Environment for Staff to Stay Current


How do you help your team stay up-to-date with the latest in the field of talent development when we
live in a VUCA (volatile, uncertain, complex, and ambiguous) world of constant change? We have experienced so many evolutions over the years.
From the introduction of e-learning to the blended classroom, mobile learning, virtual reality, and the
current era of modern learning, change is ongoing. There is always something new on the horizon, and
while it’s often just a shiny object, it may also enable our work to be faster, better, and even stickier. It is
important to be aware of any new and innovative models, resources, and technology; however, this can
only happen if the learning leader creates time, space, and motivation for their team to stay current.

Start by Assessing the Current Situation


The first place to start with staying current is identifying what is already available within the team. Lew
Platt, former CEO of HP, frequently stated: “If only HP knew what HP knows, we would be three times
more productive.” And, as the saying goes, the whole is always greater than the sum of its parts. As a
business unit, what resources are already available within your team? While a 30-minute team brainstorm
could produce a huge list, consider some of the following ways to dig deeper.
• Start with a list of your team members, then look at their roles. Each role probably has new
developments, products, and thinking associated with it. Consider what related subtopics your team members
might be interested in. This can help identify many topics, as well as who can take ownership
for what—a divide-and-conquer approach.

• Next, take an inventory of what professional associations are represented within the group. Do
team members regularly read certain publications? What thought leaders and experts do they
follow via blogs and social media? Take the time to talk with your team and identify what they
are already doing in this space. Develop a visual and have fun with the content touchpoints
available within the team.
• Does your organization provide access to a broader learning database, such as LinkedIn
Learning, Mindtools.com, or Harvard Business Review? What are other functional departments
focusing on? Does anyone in your organization use Agile or Lean? What about game
mechanics or design thinking? How can your department stay current by connecting
with them?

Build a System for Sharing and Learning From One Another


Take a look at how you can start to share information and intentionally learn from one another. Do
you have a knowledge management system in your organization or a shared drive where people can
store information? Do you use an internal social-learning platform that enables you to easily share
links and resources and tag others you think will be interested in the information? Start with what you
have in your organization and see what works for you. If you find you don’t have a good platform for
sharing, you may need to get your IT department to explore options and alternatives. Find out what
other departments are using. If your expertise does not lie in this area, whose does? Let them take the
lead to help you successfully get this off the ground!
Lead by example and serve as a champion. This means not only sharing any information
you find that will be useful to others, but also taking the time to review what others share and
provide feedback and share your own related insights. This can be intentional—such as creating
time and space at team meetings for people to share and discuss resources they’ve found. But it
can also be indirect and subtle. One example is simply putting the book or articles you are reading
in a conspicuous place in your office—then when a team member notices them, be ready to share
your takeaways. Another example is calling out trends when they’re mentioned during meetings.
This might be something like, "I am curious about how virtual reality might mesh with what
we are doing for Project XYZ" or "Have you ever wondered if blockchain is going to affect
our work?”
Serve as a resource. When a team member is asking how to do something or for knowledge
access, consider what resources you can direct them to. Are there people in your network they can
speak with? Is there a good TED Talk online? Did you recently read a relevant article? Guide them
to other tools and resources to support their interests and further learning. Consider going on a
virtual scavenger hunt together. When your team sees you engaging in the behavior, it will help
them see where they can go to find the information they need for self-directed learning.

Participating in professional organizations is an easy way to stay current while simultaneously building
skills. For example, if a team member is focused exclusively on crafting e-learning content, groups like the
eLearning Guild or the Online Learning Consortium may be helpful. On the other hand, if you are leading a
TD organization, ATD can offer greater breadth. Other professional organizations include Training Industry,
the Institute for Corporate Productivity, Chief Learning Officer, and the Society for Human Resource
Management (appendix 1). Some organizations offer membership at the individual level, and others offer it
at the corporate level. These avenues create a great space for professional networking and often provide
access to additional tools and resources that can support further self-directed learning, frequently at lower
cost. If your learning team belongs to more than one organization, you can have different members
serve as champions by sharing the information and ideas they take away with everyone on the team.
Most professional groups host an annual conference, which provides additional opportunities for
learning. Consider carving out a little money each year in your budget so team members can attend a large
industry conference. These three- to four-day intensive programs generally host keynote speakers who are
thought leaders in the field; short, relevant sessions presented by practitioners and consultants; opportu-
nities to meet with vendors; and networking events with other learning professionals. Like any learning
experience, if you assess what is available ahead of time, determine your personal objectives, focus on the
experiences that are most important, and preplan your path, you will be able to make huge gains in a short
time. Additionally, if several members of the team are attending, you can divide and conquer to take in
the maximum amount of information, and then share and compare.
One rule for sending team members to conferences should be that they bring the information back home
for all to learn. However, this involves much more than simply completing a block on their individ-
ual development plan (IDP) or impact map. This means they must review their notes, photos, hand-
outs, downloads, and other collateral, and put together an old-fashioned debrief to present to the team.
They should also share which thought leaders stood out most and what they were saying, the technology
demonstrations, any new books they heard about, and the intriguing conversations they had with other
practitioners. And most important, they should share at least one practice they are now including in their
repertoire as a result of the conference. As the manager and the learning coach for your team, make sure
to set this “bring it back” expectation before they leave, and help them set their objectives and plan their
itinerary. As soon as they return, show excitement about what they learned, ensure that time is available
for them to share with the team, help them connect the dots with what is happening in your organization,
and look for ways to use some of their new ideas.
Staying current does not need to be costly or a one-person endeavor. There are a wide variety of
online communities that learning professionals can join, many blogs to follow, and a plethora of easily
curated content. As these resources continue to grow and evolve, staying connected through avenues such
as LinkedIn will allow you to see new communities of practice and hear from others in your network about
what new tools and resources people are accessing.

Offer Opportunities Through Certifications
Certifications can provide focused development and make your development strategy more targeted. If you
work for an organization where many of your team members are subject matter experts with little to
no adult learning experience but great technical expertise, they may benefit from a focused certificate
program in adult learning or a specific TD competency. For instance, after three people with no formal
L&D education or experience joined my team, the two facilitators attended a training certificate course
to learn some basic facilitation and adult learning skills, and the instructional designer attended a design-
ing learning certificate program to learn how to apply design to adult learning. This helped them establish
a solid foundation and common language while increasing their confidence and competence.
If your organization is experiencing rapid growth and you are trying to build future leaders from
within the organization, you may find certifications highly valuable. Certifications can also help your
organization work through competency modeling or assessing a leader’s learning agility. Obtaining a certi-
fication in 360-degree assessments may help you work with leaders on development plans that better equip
them for the future. However, it is imperative that you evaluate your organization’s strategy to ensure your
investments align with the long-term goals of the business.
One starting point for building your learning team’s capacity is the new ATD Capability Model,
which serves as a template for success in talent development. The model details what TD practitioners
need to know and do to be successful now and in the future. It also responds to trends affecting talent
development, such as digital transformation, data analytics, information availability, and partnerships
between talent development and the business.

Serve as a Learning Coach


As a manager responsible for creating an environment for employees to continually up their game, serving
as a learning coach has specific duties and tasks. Ensuring that all team members have a customized and
current IDP, impact map, or action plan is a critical step. These are not just pro forma documents to check
off at the beginning and the end of the year; they are dynamic documents that guide frequent discussions
and inform continuous feedback on progress. It is not one size fits all. Partner with each team member to
create a plan that recognizes their goals and creates opportunities for continuous growth and development
in a way that is motivating but also builds capacity aligned with the needs of the organization.
According to Broad and Newstrom (1992), coaching by the manager is imperative if formal training is to transfer to the job.
And as Shriver (2018) points out in the Four Moments of Truth model, the manager and the employee
need to negotiate the expectations for the training beforehand as well as discuss how the training will result
in application on the job. These discussions should set the employee up for a higher level of success during
the training program. There should be objectives and questions that help the employee and manager dive
into the experience with more interest, intentionality, and energy. More important, the discussion should
include an IDP or action plan the trainee is expected to complete during the experience. As Shriver states,

“If trainees clearly understand that they will be expected not only to learn something useful, but also to
create a strategic document that reviews what they learned, how they plan to implement that learning on
the job, and the projected impact the applied knowledge could have on productivity, it stands to reason
that they will approach the learning event with an orientation toward action and achievement as opposed
to ‘box checking’ and completion.”
The conversations after the training is finished are even more important. These are not just an
exchange about the topic itself (for example, why the learning experience was needed in the first place). Discus-
sions should start soon after the experience to probe for connections with the work and options for appli-
cation. This first conversation should involve the manager reviewing the IDP or action plan, making
thoughtful comments about the content, and offering support for the implementation. To truly make a
difference, these coaching conversations should continue over several months, eventually including more
feedback on the application process itself, progress toward the target, and new learning content.

Modeling Personal Development


Before you can identify the best way to continue your development and invest in intellectual currency, you
must have a solid understanding of yourself as a leader and align your development with your organiza-
tion’s key goals, values, competencies, and strategies.
The first step is assessing your strengths and understanding how you show up for others. There are a
variety of commercial assessments and tools on the market for this—and you may already be using them
as part of your practice. As a leader for the learning team, you should use every assessment instrument
your team administers within the organization, which might include MBTI, Emergenetics, a 360-degree
survey, or an assessment on emotional intelligence or learning agility.
You may consider an advanced professional degree as one option for your professional development.
This is one I took advantage of early in my time as a midlevel manager. My organization offered tuition
reimbursement, making it more affordable, and I saw it as an opportunity to create a faster path for
growth. Additionally, because the more senior-level positions required an advanced degree, pursuing a
master’s degree felt like a worthwhile investment.
A few things you’ll want to consider when exploring this option for yourself or your team members
are time, cost, and learning preference. While many courses are only three hours of class time per week,
this does not account for reading assignments, research, group projects, or writing papers. I found I
usually needed an additional three to five hours per week per class to complete those extra activities.
It was critical that I understood how much time I could commit in any quarter to determine how many
courses I could take at once. As for cost, look into the benefits your organization provides. Does it offer
tuition reimbursement, scholarships, or salary increases upon achievement of certain degrees? What is
the budget that you are willing and able to personally commit to offset costs not covered by your orga-
nization? It’s also important to look at the cost of books, university fees, or labs. Finally, make sure you

160 | Chapter 12
know how you learn best. Personally, I was very unfocused during my online courses—while I could get
the work done, I was often disengaged in the process. Figure out what will work best for you—taking
courses online with greater time flexibility or attending in-person classes where you are face-to-face
with other learners and your professor.
There are a variety of options you can take advantage of to continually support building intellec-
tual currency, including belonging to professional associations and attending conferences. However, as
a leader you should go one step further and get involved. This might include serving on committees,
writing blogs or articles, presenting at conferences, or even contributing to publications as a chapter or
book author.

Branding Your Team as Stellar Producers


Building a high-performing team that delivers results and contributes to your
organization's competitive advantage is a worthy goal in and of itself. It enables you to help build
capability at the system level and hopefully create a culture of learning throughout the enterprise.
You can lift this to a higher plane by sharing your practices with professional colleagues or assessing
your practices with a competitive standard using a national award. In the learning arena there are
several ways to do this, such as the ATD BEST awards, CLO magazine’s LearningElite, and Training
magazine’s Training 100 (appendix 1). At the organizational level there is the Baldrige Performance
Excellence criteria and process.
Meeting such criteria involves submitting an application with standard questions, sharing data on
efficiencies and results, and demonstrating alignment with organizational goals, objectives, and strategies.
While the application process can be time-consuming and challenging because of the specific documen-
tation needed, there are huge benefits for the learning team:
• Employees gain an enhanced awareness of the business impact of practices within the learning
and talent space, thus making them feel more connected to the business.
• An application helps develop a common language around the practice, process, and entire
learning ecosystem.
• The team is forced to consider how their practices are aligned with the performance that drives
the business results, as well as the role of measurement. Additionally, it enables the learning
team to identify the many positive stories associated with their practice.
• The application process helps employees learn about what’s necessary for a systematic learning
culture, as well as how to manage processes and report results as a business in and of itself.
Responding to the application questions helps all L&D employees learn how to accurately tell
their performance story.
• Most award applications ask for programmatic data and information that is the same as what
internal executives request. Thus, much of the application content can be reused.
• Feedback from outside practitioners is useful for continuous improvements because it reports
identified strengths and opportunities to improve.
• Reward and recognition for members, teams, and departments is a huge benefit, serving
as a spotlight on projects and employees. This is even more important in tough
economic times when many benefits are reduced.
• Participants respond well to attending an award ceremony and the public recognition it offers.
• External validation for internal processes exemplifies the values of continuous improvement,
innovation, and transparency.
• Executives see that the program is validated externally.

Summary
Ultimately, if you understand the direction your organization aspires to go and align your learning
strategy accordingly, you should be able to identify key opportunities to invest in the development of your team
and yourself. This will help promote growth from within, engagement, and retention because your
team members will see that you are willing to invest in their growth. When you do this, your team
will no longer be the cobbler’s children with no shoes; rather, they will be the cobbler’s children in
Jimmy Choos.

Key Takeaways
• You know development is important—you are a leader within your organization. Make sure you are
prioritizing your team's development so they can most effectively meet the needs of the business and
stay on top of learning trends.
• Build a network of learning leaders you can learn from and leverage as a sounding board. You are
not the first person to face the challenge in front of you. And your organization isn't the first either—
take time to learn from the experiences of those who have gone before you and be willing to share
with those who follow after you.
• One size does not fit all. Take the time to understand your team members' learning preferences and
motivators. This will help you direct them to the tools and resources that best align with how they like
to take in new information.
• Lead by example. If they see you taking the time to invest in your own development, they are more
likely to believe they can carve out time to invest in their development as well.
• Ensure accountability. While development opportunities are essential for knowledge acquisition,
the application is far more critical. Take the time to support the creation of an IDP, action plan,
or impact map, and continue having follow-up conversations around the experiences your team
members are having in application. Learn about their roadblocks, help them remove barriers,
and celebrate the successful implementations!

Questions for Reflection and Further Action
1. What is your defined budget for the year? Of that, what percentage should be earmarked for the
development of your talent development team members?

2. What local opportunities or communities of practice exist for you and your team to actively
participate in? How might you encourage engagement?

3. What is your plan to document and share your team’s favorite resources; for example, how can you
live out Lew Platt’s quote?

4. What actions have you taken to align your team’s development needs to the organization’s strategy?

5. How and when are you serving as a catalyst to build and share your network and make new
connections for members on your team?

Section 4
Making an Impact

This section considers how the learning profession demonstrates organizational value in reaching goals,
objectives, and business results. Data, analytics, metrics, and evaluation processes often arise as sore points
for learning leaders. While their goal is designing, developing, and delivering learning content, leaders are
under constant pressure to prove that every penny invested produces the desired results.
There are many things to consider when addressing impact. What is the focus—learning or performance,
or equal shares of both? What tools do you use? How do you design the measurement and when? How do you
make decisions about metrics that are as diverse as qualitative feedback, Kirkpatrick's Four Levels of
Evaluation, the Phillips ROI Methodology, and Brinkerhoff's Success Case method? Do you focus on outcomes
or outputs?
Once you have the data, how do you make sense of it? How do you share it with stakeholders who come
from different perspectives—and especially those who do not understand the language used in our functional
domain? How do you display the data? How do you turn numbers into a compelling story? How do you make
data dance? The chapters in this section will give you many ideas.
In chapter 13, Rachel Hutchinson of Hilti provides an overview of measures that meet the needs of the
participants, evaluate the effectiveness of programs, and clearly communicate markers of achieved success to
stakeholders. Using a learner-centric approach, the team at Hilti uses data to inform the original design of the
learning solution as well as make updates and changes. In addition to looking inward, they also benchmark
metrics from other companies and use stories and visuals to show impact.
In chapter 14, Ron Dickson uses the story of a fictionalized learning professional's journey from irrelevant
measures to mission-critical dashboards to demonstrate how to move from important training department data
to data that is valuable to customers and stakeholders. Part of the process includes conducting analyses and
interpretation to highlight meaningful trends, point out milestones or anomalies, and provide context as
needed. While businesses are rarely fully "data-driven," the tools covered in this chapter can help ensure that
learning-related business decisions are informed by data.
From a different perspective in chapter 15, Graham Johnston of Deloitte shares the reasoning and actions
behind moving from a measurement strategy to implementing and maintaining a learning impact strategy.
This is approached in two ways: the value and impact that the learning function provides to the business and
the effectiveness of individual learning solutions. For both, impact is achieved by defining outcomes up front,
and then using those to shape planning and design, the measurement approach, and its continuous
improvement efforts.

13
Metrics Matter
Rachel Hutchinson

Getting started with analytics can be scary, or seem way too complex. A quick Google search of “learning
evaluation methods” brings up dozens of theories mentioning the Kirkpatrick model, which seems to get
more overwhelming the more levels you look at! Then, once you pick a model to work with, you quickly
learn that you:
• Don’t have the business knowledge to know what to ask.
• Receive useless responses because you’re not experienced at gathering data.
• Get good data, but are not sure what insights they provide.
• Have so much data, you don’t even know where to start your narrative to management.
Don’t worry. We’ve all faced these fears, and it is possible to come out the other side without a degree
in statistics!
This chapter looks at some ways that companies have analyzed learner data and been successful
at showing their value to the organization. You are likely bringing value already; the focus here is
on how to help others see this value and make sure you and your team are confident that you’re
delivering effective solutions. But while we focus on looking at the data to see what is working,
we should also be looking for what is not working so that we can improve it in the next iteration.
An example of this could be when you deliver a minimum viable product (MVP) for user testing,
such as a group of performance support pieces for your HR system. You may find that while people
are using the system, they are not updating their coaching notes in the system. Your data show that
the performance support helped them use the system (one goal achieved) but did not help them

change their mindset to realize how critical documenting the coaching conversation was (room for
improvement in the next iteration).

Dashboards and Scorecards


All organizations use metrics to determine effectiveness. These metrics are not always the same, but they
do have one thing in common: They ensure that there is alignment with performance expectations. For
many years, L&D organizations focused primarily on how the learner felt about the experience, which
left them ill equipped to demonstrate evidence-based causal relationships between training and the
ability of the workforce to perform differently in response to new conditions or requirements. Further
complicating any such effort was the reality that many things in the environment beyond training
affect performance—economic changes, willingness to adapt, processes, coaching—making a causal
relationship difficult to prove. So, organizations have begun looking to simpler correlations and specific
behavior changes that the learning solution is designed to achieve. Leading and lagging indicators are
examples of the type of metrics that allow learning solutions to be quickly evaluated for effectiveness and
modified more agilely if needed.
Leading indicators are things that suggest the performance will improve because users are taking
measurable actions that correlate with future success, such as adoption of a new IT system seen through
logins or engagement with the platform. Lagging indicators, on the other hand, are the results of the
performance improvement, such as an increase in the number of promotions from within. These data
can be positive (show what is working) or negative (what is not working). In the example of adopting a
new system, the leading indicator could be that you are not seeing an expected positive trend (increase) in
logins as access to the system increases. Another leading indicator showing failure of your initial approach
could be that you see an initial login, but no further engagement with the system. These negative data
points are critical because they can help you immediately adapt the approach and continue to verify. This
is why knowing what possible leading indicators you could use can be extremely helpful in an iterative
design approach.
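
As an illustration only (not from the original text), here is a minimal sketch of how a leading indicator might
be monitored in practice. It assumes hypothetical weekly login counts for a newly launched system and uses a
simple least-squares slope to flag whether the trend is moving in the expected direction; every number below
is an invented placeholder.

# A minimal, hypothetical sketch of monitoring a leading indicator:
# are weekly logins to a new system trending up as access expands?
# The counts below are invented placeholders, not real data.
weekly_logins = [120, 135, 128, 160, 171, 169, 190, 205]

def simple_slope(values):
    """Least-squares slope across evenly spaced periods (weeks)."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    covariance = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
    variance = sum((x - mean_x) ** 2 for x in xs)
    return covariance / variance

slope = simple_slope(weekly_logins)
if slope > 0:
    print(f"Leading indicator is positive: logins rising by roughly {slope:.1f} per week")
else:
    print("Warning: logins are flat or falling; revisit the adoption approach")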
Connecting your learning solutions to business results ensures that you are affecting performance. It
provides your team with a higher level of engagement with your organization’s key business goals. Know-
ing what works, and what doesn’t, allows you to adjust the function, priorities, and resources in alignment
with any business changes.
Many ways exist to measure and report on the effectiveness of learning. Let’s take a look at how to
demonstrate our value in ways it will be seen through the eyes of our management teams.

How to Know Your Stakeholders Find Value in Your Work


The most certain way to make sure stakeholders see the value in your team’s work is to involve them
throughout the entire process. We talk a lot about consultative approaches; however, this works only
if you can speak the language of the business and understand when stakeholders aren’t sharing clear
measures of success. Consider the following example:

A department head in your organization says: “My team leaders are not coaching their teams appropriately. What
do we have that can help make them better coaches?”
“When you say they are not coaching appropriately, what does that look like?” you respond following a
consultative approach. “What are they doing—or not doing—that you believe is not effective?”

This is where you can find out if the stakeholder initially believes that you can add value.
• Worst case, they might respond: “Don’t we have a coaching course that we can send them
through? I heard from Andreas that his team coached much better after going through a
course.”
• Or they might respond: “Because they are not coaching their team members, performance is
suffering. And all I get are excuses for why we are not hitting our targets.”
• Best case, you can get a response like: “Well, it is not all of them. Mark does a good job—I see
him scheduling one-on-ones with his team, and actually he has a good promotion rate and is
the closest to hitting his team targets. It is more that I never see the others doing any coaching
at all, and our whole department is at risk if we can’t hit our targets this year.”
If you get the first response, you have further to go, but it is not impossible. It likely just means
that some probing questions are called for because your team’s value as root cause analyzers and busi-
ness-minded people is not well planted in their head. If you get the second response, the stakeholders have
given you some clues that could help—they believe that team leaders are not coaching, which is affecting
performance. Both of those things are relatively easily quantified. If you get the third response, you have
a benchmark example that you can use to draft KPIs—time spent coaching, promotion rate, team target
completion. Of course, this does not mean that Mark is the perfect benchmark! It simply means that you
know the stakeholder can quantify what they want by offering an example of someone they see as best
or good practice. That is your starting point—you can use the data for Mark as a baseline to see if he is
actually better or only perceived to be better.
Realistically you could now draft a sample dashboard without any data. It could show four graphs on
the visual representation of your dashboard:
• net promoter score (NPS)—would participants recommend this to a peer?
• hours spent coaching
• promotion rate
• team target completion or rating.
Some of these are longer term. Perhaps you need to further define some leading indicators; for
example, how many people are defined in the HR system as being ready to move to a new position
within 12 months?
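
As a purely illustrative sketch (the chapter itself includes no code), the draft dashboard above could start life
as little more than a handful of named metrics plus a standard NPS calculation. The sample ratings and field
names below are hypothetical placeholders, to be replaced once real data sources are identified.

def net_promoter_score(ratings):
    """NPS = percent promoters (9-10) minus percent detractors (0-6) on a 0-10 scale."""
    promoters = sum(r >= 9 for r in ratings)
    detractors = sum(r <= 6 for r in ratings)
    return 100 * (promoters - detractors) / len(ratings)

# Draft of the four graphs, with placeholder values until real data are pulled.
draft_dashboard = {
    "nps": net_promoter_score([10, 9, 8, 7, 6, 9, 10, 4]),  # sample survey ratings
    "avg_coaching_hours_per_week": None,   # to come from calendars or the CRM
    "internal_promotion_rate": None,       # to come from the HR system
    "team_target_completion_pct": None,    # to come from business reporting
}
print(draft_dashboard)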

Metrics Matter | 169


How to Know Your Learners Find Value in Your Solution Offers
While we often focus on communicating the effectiveness of learning to management teams, we also need
to ensure that learners find value in the learning solution. You can gather input from learners through
various tools, including:
• NPS—would participants recommend it to a peer?
• Kirkpatrick’s Level 1 and 2 evaluations
• focus groups
• qualitative feedback
• participation levels for nonmandatory items or platforms
• engagement levels (repetitive returns).
There is no fixed way of gathering this information. For example, we use NPS within certain talent
or leadership programs, as well as our team mailboxes to gather input on our effectiveness at respond-
ing to ad hoc learner needs. We may select a few programs to evaluate at Level 1 or Level 2, especially
if we are using third-party facilitators or content. For unique programs for specific populations, we may
use a focus group, such as when we were looking into a marketing competence solution. We analyze
data quarterly in our digital learning platform and use participation and engagement levels to evaluate
what is working.
Here is how Michael Marschall, learning business partner at Hilti, evaluates our primary global lead-
ership talent development program, IMPACT.

IMPACT is an experiential leadership talent development program that helps current team leaders who have
been identified as future potential executive leaders get ready for senior leadership roles (for example, to serve on a
management team of one of our market, region, or global organizations).
We measure various elements to find out if we are doing the right things with our IMPACT Leadership
Talent Development Program. These measurements help us decide if the virtual and face-to-face L&D elements
we use create the desired impact or if we need to change elements of the nine-month learning journey. They also
help us evaluate if our selection process for the program and the people development after program completion deliver the
expected results.
The key elements we use are:
• a confidential 360-degree survey before the program and a 360-degree follow-up survey after program
completion to measure personal development progress
• a group reflection after each face-to-face week to find out what worked for the participants and what
did not
• a structured confidential online survey at the end of the program to assess the dimensions we are driving
with the program setup (using rating scales as well as free text answer options)
• a net promoter score measurement for previous participants two years after program completion,
plus a free text field for them to share the reasons behind their rating—this helps to get their
judgment long after the initial excitement and fresh impressions have washed off and real life has
kicked in
• a tracking of participants’ career development for six years after program completion—this allows us
to measure things like how many people from which class make it to the desired management level or
higher; how many program alumni are promoted to the desired level in a given calendar year; or how
many participants leave the company within six years of completing the program.
These measures ensure that we meet the needs of the participants, evaluate the effectiveness of our program,
and communicate to our stakeholders clear markers of the success we have achieved. This brings confidence in our
approach and keeps the approach and methodology fresh.

Your approach to data can be a set of complex data points, or you can identify a single thing that
you want to track. For example, when launching a new digital learning platform, we knew that awareness
of the platform was a key metric. Therefore, we evaluated awareness of the platform, also referred to as
adoption rates (Figure 13-1).

Figure 13-1. Monthly Views Exceed 1 Million
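
For illustration only, the adoption metric behind a figure like this can be as simple as dividing monthly active
users by total headcount; the figures below are invented placeholders, not Hilti numbers.

# Hypothetical adoption-rate calculation; both inputs are placeholder values.
monthly_active_users = 6_500
total_employees = 10_000
adoption_rate = monthly_active_users / total_employees
print(f"Platform adoption this month: {adoption_rate:.0%}")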

Let’s look at another example: A specific location rollout of a marketing program with these offerings:
• an external digital course on the basics of marketing (four to eight weeks)
• an internal digital course with on-the-job assignments (six to 12 weeks)
• a classroom session showing on-the-job application in a final presentation (two days).
Our HR team realized that people promoted into marketing roles were not completing the program
as expected. Through research interviews, a focus group, and surveys within marketing competence
programs, we identified two primary challenges with the learning methodology design. Then we were
able to dig in and look for root causes.

The team leaders were not supporting the team members’ journeys through the learning
programs for two distinct reasons. The first was that team leaders were unaware of the benefits
of the learning journey, so they did not make the development of marketing functional expertise
a priority or include it in each person’s performance management form. Team leaders also stated
that they did not have the relevant coaching guides to support the team members. This meant that
they didn’t have the necessary coaching and development conversations in conjunction with the
learning journey.
The team members found it very difficult to prioritize learning moments and the marketing exper-
tise development, and the digital learning program was relegated to a low-priority task. We found that
while participants were signing up for the courses, they might not receive (or be aware of) the coaching
guide to share with their team leader. Additionally, for programs requiring multiple courses, there were
misconceptions about who was eligible. Even if they initially heard details about the journey, many
forgot or received a confusing message. Based on our data, we developed a solution proposal to share
with stakeholders. This proposal included what we saw that did not work and our hypotheses of what
would work.
Each measurement tool is only as effective as how you use it. Trying to find analytics after creating
the learning solution will only make your job harder. Instead, start during the discovery phase to define
what needs to change and then identify how you can affect it with the learning solution and how you can
see the result.

Benchmarking for Comparisons


All too often, we look only internally for ways to measure our success. However, we can often learn
more about our effectiveness by looking to other companies that might have prioritized specific areas to
change in parallel with our own efforts. In the L&D field, most people are willing to share their success-
es and failures with other practitioners.
Benchmarking provides a more objective view of our strategy and what we offer our internal team
members and leaders. It can also provide insights from a new perspective and keep us from having to
make mistakes on our own because we can learn from others’ mistakes.
There are many opportunities for benchmarking, and networking is a good one. Many of us see
networking as “not part of my job,” but our network allows us to exchange information and insight
with peers. I learn so much when I share stories on a topic and then members of my network share
theirs! For example, our relationship with Grundfos started out as a customer referral request, but it
has evolved into a back-and-forth exchange over how we create our respective learning strategies, what
barriers we encounter, advice on how to overcome or avoid issues, and other brain-charging moments.
One of the fastest ways that we can increase our opportunities to benchmark is through member-
ships in groups. These organizations usually offer some type of annual report, structured experience
exchanges, or method of connecting their partners. For example, at Hilti we are members of the ATD
Forum, ATD CTDO Next, and IMD Learning Network, which provide direct access to hundreds of
organizations around the globe. We used our partnership with the ATD Forum to benchmark and
experience exchanges on onboarding new hires and developing new leaders. Our partnership with
IMD and others helped us identify 75 companies to benchmark against while looking at our learn-
ing strategy rework. And our partnership with Catalyst allowed us to benchmark our diversity and
inclusion efforts.
For a more informal method of increasing your network for benchmarking, consider speaking at
conferences and offering to connect with people on LinkedIn, Viadeo, or Xing.
We used benchmarking, along with other research methodologies, during the development of our
learning strategy. Our project manager created a comprehensive methodology using a combination of
internal and external data, including:
• research and benchmarking studies of more than 50 external organizations
• a megatrends study from industry experts
• internal connections to our corporate midterm strategy.
From these, the project manager derived a current state and implications document and looked inter-
nally to understand what was currently working inside Hilti. This showed our as-is learning practice
(Figure 13-2).

Figure 13-2. Hilti Benchmarking Process

To ensure a diverse sample, the project manager selected a wide range of companies, looking at
things like:
• organization size
• Fortune 500 status
• emerging companies
• type of industry
• global, regional, and local geography.
The project manager then took these findings and compared them with the direction that Hilti
wanted to go as well as what we knew about Hilti as an organization. She then compiled the outcomes
into a simple list of six learning trends that would affect our learning strategy (Figure 13-3).

Figure 13-3. Six Learning Trends

Because the project manager used, and communicated the use of, such a diverse group of
benchmarking companies, the management teams were fully confident in her selected outcomes.
Benchmarking was both an effective way to gather data and a great method to gain stakeholder
buy-in.

Analytics
Data exist all around us. Much of our focus has been on creating surveys, performing needs analyses,
and figuring out what to do with the data we have access to. However, data are only numbers unless you

use them to tell a compelling story and create change. Next we’ll take a look at what methods you can
use to analyze the data points you have.
Without data, all our decisions would be based only on gut instinct. While that might lead to
success, it would be difficult to know when your brain's well-worn pathways are holding you back and keeping you
tied to a habit that is no longer relevant or providing value.
For example, when we were designing our new team leader onboarding program, we realized that
changing behaviors in a manager would be hard to measure. Therefore, we defined the leading and
lagging indicators (Figure 13-4).

Figure 13-4. Examples of Leading and Lagging Indicators

For our purposes, we also looked for effectiveness data points that could be pulled globally to reduce
the workload of the local learning and development teams.
Our longer-term vision is to be able to also look across learning behaviors and correlate those with
business performance. For example, we are building systems to look at things like:
• What learning activities are unique to team leaders who have higher promotion rates?
• What learning activities are unique to salespeople who have higher sales rates?
• What learning activities are unique to salespeople who have higher service contract
conversions?
We would then be able to proactively recommend those learning activities in the development plans
for peers who are underperforming.
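
A rough sketch of what such a correlation check might look like, assuming hypothetical per-person records;
every column name and value below is invented for illustration, and correlation alone does not establish that
the learning caused the outcome.

import pandas as pd

# Invented per-person records: learning behaviors alongside a business outcome.
records = pd.DataFrame({
    "coaching_course_completed": [1, 0, 1, 1, 0, 1, 0, 0],
    "elearning_hours":           [6, 1, 4, 8, 2, 5, 0, 3],
    "promoted_within_2_years":   [1, 0, 1, 1, 0, 0, 0, 1],
})

# Pearson correlation of each learning behavior with the outcome of interest.
correlations = records.corr()["promoted_within_2_years"].drop("promoted_within_2_years")
print(correlations.sort_values(ascending=False))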

Making Data Dance


Salespeople use stories every day to connect with the hearts and minds of their customers to inspire action,
whether that’s to purchase a product, analyze their asset management, or meet with a higher-level
decision maker. Stories are not magic; they are simply a way of structuring information (facts and
figures) in a format that makes it easier for audiences to receive and process the information. There
is plenty of neurological data as to why, but the important part for us is that storytelling connects
the speaker and receiver emotionally and motivates cooperative behavior.
We have defined an approach to make data palatable to audiences. First, define your goal. For
example, if you are trying to increase adoption of your digital learning platform, then that is your
goal. If you are trying to find out what learning approaches work best with a specific audience, then
that is your goal. As soon as you know that, consider who your stakeholders are. Or it could occur
in reverse—you could have a group of stakeholders who request an update on a project or program.
In any case, before you begin gathering data, you must know your ultimate goal and your audience.
Once you have those, you can gather data and determine which pieces will support your
hypothesis and engage your stakeholders. For example, if you wanted to find a way to encourage
people to adopt a solution for business storytelling, you would select data that show how commu-
nicating effectively will improve results or affect the business by increasing the ability to adapt to
change quickly (through faster decisions, clarity in decisions, or alignment on direction to move).
Your current tools are not the limitation; your ability to communicate is the limiting factor, because change
is concerning, it will affect all parts of the business, and you need to change the way you are
communicating.
We always do a stakeholder map that shows where stakeholders fit in our project, what engages
them, and how they like to be communicated with. We can extrapolate some of this from our knowl-
edge of the level or area of the business, while other information comes through direct knowledge
and experience. We then create an executive summary to keep us aligned and ensure we have a
common core message for all of our data, which can be repeated in each instance. The message can
be simple, like “Stop teaching, start learning” before we deliver effectiveness data on our digital
learning platform, or it can be more complex, like creating a persona and scenario “day in the life”
as we did for our new team leader onboarding program. No matter your message, however, the
stakeholder map is your central hub where all of your data originates and returns.
The next step is defining your methodology. What are the audience’s needs and motivations?
What is your motivation? How can you connect your desire to theirs? Do you need pre-alignment
(this is particularly critical if you are bringing in unpleasant, controversial, or surprising data)?
Here’s an example of how data were used in an HR newsletter to drive awareness of how
quickly users were engaging with the digital learning platform. In this case, we knew that stakehold-
ers were concerned about whether the platform was going to be used.

Metrics—What Gets Measured, Gets Done Update
The beginning of the fourth quarter brought additional strides in learning engagement at Hilti. Seventy percent of
all Hilti employees are now active on our learning platform! We can see that as content grows on the platform,
usage and relevance of the platform increases (Figure 13-5).

Figure 13-5. Percentage of Growth in Learning Engagement

This is another place where Hilti is overachieving against external benchmarks. We intentionally chose not
to populate the platform with a lot of global materials at the start to ensure that only highly relevant materials
were provided. We can also see that our efforts to co-create content across a large population are working—more
than 1,000 Hilti team members (6 percent) are creating and uploading content to the platform (Figure 13-6)!

Figure 13-6. Percentage of Learning Platform Use

Impressively, our desire to learn is high—Hilti team members have viewed more than 5 million pages since the
launch of Fuse six months ago. Much of this is through the efforts of our regions.

The whole presentation was 20 slides long; this excerpt is intended to show how some key messages
were highlighted. Your message can be presented in person or converted to video and explained
verbally. The benefit of video is that it can then be placed online and viewed independently by partic-
ipants (Figure 13-7).

Figure 13-7. Presentation Examples

By presenting the same data in different ways and using various methodologies to deliver it (text,
verbal, video), we were able to meet the needs of a diverse set of stakeholders. We were also able to share
the message via the newsletter and video so that it could be duplicated locally.

Summary
Data are not only valuable tools in your toolbox; they are also critical to earning your seat at the executive table.
You cannot consider only what you want to say or what you are proud of. You have to focus on the things
that will encourage buy-in, participation, action, or decisions from your stakeholders. The focus of this
chapter was to help you continuously improve and ensure that your narrative of value-add is being distrib-
uted. While we often focus on looking at the data to see what is working, we should also watch for what is
not working so we can improve it in the next iteration.

Key Takeaways
• Leading and lagging indicators for learning solutions need to be identified to assess effectiveness.
• By using analytics, talent development professionals and business leaders can see a direct correlation
between the learning solution and business impact. These findings should influence the continuation,
change, or stopping of the solution.
• Verify the data and data collection process. Don't spring into automation without a clear
understanding of the data and its accuracy.
• Be willing to change your measurements. If you are measuring the wrong things, you are not telling a
compelling, actionable story.

Questions for Reflection and Further Action


1. What insights are important to your organization? How are they gleaned from the data available?

2. What business imperatives or step-changers does your overall organizational strategy focus on
and why?

3. What data are you aware of, and do you have access to them? How can you use those data to tell a story
that others understand?

14
A Dashboard Journey
Ron Dickson

Maeve was pleased she was asked to create a metrics dashboard on training for her division’s business
leaders. A learning specialist with a recent graduate degree in adult education, Maeve knew her stuff: She
was comfortable with Kirkpatrick’s four levels of evaluation and was familiar with the data available in
her company’s learning management system. The deadline for the first prototype was tight, but Maeve
worked through the weekend to create a mock-up that included division training totals (number of learn-
ers, total hours delivered, average completions per employee), percentages of training delivered by topic
as well as modality, results of satisfaction surveys, and average test scores (Figure 14-1). She also included
placeholders for measures of behavior change on the job, business results attributed to training, and the
ROI of those results. In addition, she planned to include industry benchmarks like the ratio of trainers to
learners as a measure of her department’s efficiency.
Dara, Maeve’s boss, liked her proposal.
“This is a great overview of what we do,” he said. “Our customers will be able to see at a glance how
well we’re supporting their employees. I can’t wait to hear what the business leaders have to say.”
The morning of her presentation, Maeve rehearsed her narrative, made some last-minute adjust-
ments to the graphics, and arrived at the conference room with plenty of time to spare. When she was
called for her spot on the agenda, Maeve had to suppress a smile. She could hardly wait for the formal OK
to proceed with creating the actual dashboard.

Figure 14-1. Division Training Dashboard

That afternoon, Dara waved Maeve into a chair on the other side of the desk.
“How did it go?” he asked.
“Not great,” Maeve sighed. “They said the dashboard might tell a training person something useful
but didn’t give them any insight.”
“Really? I thought your suggestions were very thorough.”
“Me too. But Peg from finance said they were mostly activity indicators, not business indicators.”
“Sounds like Peg—she doesn’t like anything that doesn’t give her an excuse to trim someone’s budget.
Did anyone offer any productive feedback?”
“Helen, the senior VP of our division, said it was a good start. She’s going to ask Alden Jones, a data
visualization analyst from her previous group, to work with me on the next version. Apparently, this guy
has a knack for sharing data that managers can relate to. We’re meeting later this afternoon to get started.”

Back at her office, Maeve stuffed the printout of her slides into a folder and dropped it into her desk draw-
er. She tried not to think about it while plugging away on the email messages she had been putting off to
meet the presentation deadline. After a while there was a knock on her door.
“Maeve?”
Maeve looked up, surprised to see that two hours had passed. She stood and offered her hand.
“You must be Alden.”

“That’s me. Helen asked me to stop by and lend a hand.”
Maeve motioned toward the spare chair and then offered the folder she retrieved from the drawer.
Alden waved it aside. “We’ll get to that. Let’s talk first. Why did your managers ask for a dashboard?”
“Because I have the training data,” Maeve said, hoping she didn’t sound as annoyed as she felt.
“Yes, but why did they want to see it? I mean, what questions do they need to answer? What decisions
do they need to make that your data could help with?”
“They, uh, never really said.”
Alden smiled. “They almost never do. It’s surprising how many managers assume that a dashboard is
a good thing but don’t think through how they would use it. They don’t realize that without a reason for
a dashboard, they won’t even look at it. In fact, that should be our first guiding principle.” He picked up a
marker and wrote on Maeve’s whiteboard: “We don’t have time to be in the curiosity business.”
“OK, but if they don’t know—or won’t say—then what data should I include?”
“Good question. What do senior leaders always care about?”
“Money!” Maeve said immediately.
“Got it in one. What else?”
“Profit. Efficiency. Quality. Customer service. Whatever the CEO cares about.”
“That’s right,” Alden said, scribbling as Maeve spoke. “Did your draft dashboard have those things?”
“Not directly, no. But they did ask for a training dashboard.”
Alden turned to the whiteboard and wrote: “They don’t always speak our language.”
“What do you mean?”
“Think about the requests you get for training,” he said. “Often the person asking for ‘training’ really
wants communication or a job aid or a whitepaper. People outside the training world don’t always make
the distinctions we make.”
“All right,” Maeve said. “But they must think there’s a connection between the things they care about
and training. Otherwise they would have asked someone else for a dashboard. What’s the disconnect?”
“Why do we train people?”
“What?”
“Well, they must see a connection between those things,” said Alden as he pointed at the whiteboard,
“and the work you do.”
“So they already believe that training can save money, increase profit, improve efficiency, boost
customer service, and help drive whatever the CEO cares about.”
“It would appear so,” Alden said. “Let’s take a look at your draft.”
Maeve spread out the mock-up of the dashboard she'd prepared. Alden studied the pictures and read
the notes she had made on each.
“These are good,” he said, turning the last page. “Really good.”
“Then why—”

“Because they aren’t the audience for this. If the satisfaction surveys are trending down, senior lead-
ership isn’t going to chase down the root cause and figure out what action to take—they trust you to do
that. Whenever someone asks for data, especially in a dashboard, your first task is to find out what actions
or decisions the data will be used to improve. Your manager and your training peers are probably the
right audience for your original, but we’ll talk about that later. Your division leadership wants to see how
the work you do connects to these things,” he explained, tapping the list of interests Maeve had dictated.
“What do you mean?”
“Well—they’re interested in money, right? Do you have access to financial data related to training?”
“You mean our spending?”
“That, and any training-related spending from other departments.”
“But we’re the only training department.”
“True, but any manager can spend directly from their funds on conferences, seminars, or tuition
reimbursement.”
“So, any money that goes out the door for any kind of training or learning?” Maeve asked.
“Yes. But also the dollar value of the time spent as a learner or instructor. Those are the indirect
costs of learning.”
“We don’t control how much people get paid!”
“No, but you often can control the length of your classes. You can use indirect cost estimates to
help decide whether it’s worth investing in online learning to shorten delivery time, so you will want to
include that on your department dashboard as well. Your senior leadership will want to see that so they
know how much they are investing—directly and indirectly—in the training and development of their
employees.”
“That’s a lot of data.”
“It is. You’ll also want to show the financial history so they can see whether those numbers are
trending up or down and, when possible, any events outside your control that might drive a change in
spending for training.”
“Like opening a new site.”
“Exactly,” Alden replied. “You would need new-hire training as well as site orientation at least. With
enough data, leadership (or you) will be able to predict how much budgets will have to shift when a new
site opens.”
Maeve took a deep breath and let it out slowly. “I’m beginning to see what you mean by thinking
through what decisions or actions a dashboard can inform. I need to approach this from a very different
angle.”
“Good idea,” Alden said. “Why don’t we meet again next week to talk about how to display the data
you get?”
Maeve agreed and they set a time to meet.

Digging Deeper
During the next few days, Maeve contacted several departments to locate training-related data that would
be of use to senior leadership.
• Finance provided training-and-development-related spending for each department in Maeve’s
division. Fortunately, finance used separate cost codes for tuition reimbursement to accredited
schools and spending on nonaccredited entities like vendors and professional development
seminars. They could provide information for the current year plus the two previous years.
• LMS administration reported total hours spent in training by learners and instructors
by worldwide region. They were also able to determine which hours were spent in physical
or virtual classrooms as well as in online learning modules. Unlike financial data, reporting
included only the current year plus one full prior year.
• Human resources would not release individual pay ranges for learners or instructors due
to data privacy concerns. However, after some discussion, they agreed to provide a midrange
average for individual contributors and managers in each worldwide region. Maeve could use
these numbers to estimate the cost of learner and instructor hours (a rough sketch of that calculation follows this list).
• Program owners in Maeve’s department provided details on two key programs the CEO
had championed: Customer Service Excellence (CSE) and Accelerated Leadership Excellence
(ALE). For each program, Maeve received a spreadsheet of who had attended since the
program started, including name, employee number, department, location, and date of
completion.
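
Purely as an editorial illustration (not part of Maeve's story), an indirect-cost estimate of the kind the HR
bullet describes could be roughed out as follows; the hours and midrange rates below are invented.

# Hypothetical back-of-the-envelope estimate of indirect training cost:
# hours spent in training valued at a midrange hourly rate per region.
hours_by_region = {"Americas": 12_500, "EMEA": 9_800, "APAC": 7_200}    # learner + instructor hours
midrange_hourly_rate = {"Americas": 55.0, "EMEA": 48.0, "APAC": 32.0}   # from HR's midrange averages

indirect_cost = {
    region: hours * midrange_hourly_rate[region]
    for region, hours in hours_by_region.items()
}
print(indirect_cost)
print(f"Total estimated indirect cost: ${sum(indirect_cost.values()):,.0f}")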
When Alden and Maeve met again, they reviewed what she had collected.
“Notice anything different?” Alden asked.
“Yes,” Maeve laughed. “Almost none of this was in my original draft. And none of it covers the
measures I learned about in college.”
“Right. That makes sense, in a way. Our university programs were focused on our areas of study, not
on how those areas intersect with different fields. But we’re getting there. Let’s see if we can put together a
dashboard with the data you have now. I like software that creates interactive visualizations because your
viewer can create a custom view, but you can create good static graphics in most standard office software.
The most important thing for now is to use a program you’re comfortable with so you can adapt your
graphics as needed.”
“Let’s start with interactive visualizations then,” Maeve said. “I want to make it as easy as possible for
the viewer to see at a glance the most important parts.”
Alden and Maeve spent the rest of the afternoon huddled in front of the computer. At the end of the
day they had created three primary visualizations (Figure 14-2):
• Finance and human resource data resulted in two visualizations:
° A combination chart used bars showing the total spending for each year (two completed
years and the current partial year) with an overlaid line to show how much was spent
per employee. Maeve and Alden considered breaking down spending by department or
quarter but decided to wait until the business leaders had a chance to provide feedback.
° The second visualization used finance data to show how much spending in each
department was assigned to the training category. Maeve knew this category was fairly
broad and included consulting, purchases of books and supplies, rental for off-site events,
and catering.
• LMS administration data were presented as a series of stacked bars showing total
learning completions by month for each department. While Maeve still believed that
learning modalities were important, she reasoned that the business partners would be more
interested in how the different departments were consuming training in total, rather than
the specifics of how they did it. She made a note to explore the topic further in upcoming
meetings.
• Program data was used to create a three-part visualization combining data for CSE and
ALE. The visualization included a map showing where most attendees were based, bars
showing attendees by department, and a pie chart showing how many in total had attended
each program.

Figure 14-2. Alden and Maeve’s Interactive Dashboard
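For readers who prefer to script their charts rather than build them in a dashboard tool, here is a minimal sketch of the first finance visualization described above: yearly spending bars with a per-employee line overlaid. It uses Python with matplotlib, and the totals, headcounts, and labels are invented for illustration only.

```python
import matplotlib.pyplot as plt

# Illustrative totals only: two completed years plus the current partial year
years = ["Year 1", "Year 2", "Year 3 (partial)"]
total_spend = [1_850_000, 1_720_000, 980_000]   # total training spend per year
headcount = [2_400, 2_450, 2_500]               # average employees per year
spend_per_employee = [s / h for s, h in zip(total_spend, headcount)]

fig, ax_bars = plt.subplots()
ax_bars.bar(years, total_spend, color="steelblue", label="Total spend")
ax_bars.set_ylabel("Total training spend ($)")

# Overlay the per-employee line on a secondary axis so both scales stay readable
ax_line = ax_bars.twinx()
ax_line.plot(years, spend_per_employee, color="darkorange", marker="o",
             label="Spend per employee")
ax_line.set_ylabel("Spend per employee ($)")

ax_bars.set_title("Training Spend by Year")
ax_bars.legend(loc="upper left")
ax_line.legend(loc="upper right")
fig.tight_layout()
plt.show()
```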

“Those look good,” Maeve said. “And they are much easier to understand than the spreadsheets I
started with.”

“Agreed,” Alden replied. “You still have the original data and can make it available for anyone who
needs it, but the point of a good dashboard is to help them know if they need more detail or not. Do
you see any areas where that might be true?”
Maeve paged through the visualizations and stopped at the finance charts.
“There,” she said, pointing to the purple line that represented HR. “That department has the
fewest employees and consumes the least training, but it often spends more each month on training than
any of the others.” She immediately pulled up the original spreadsheet, located the months with the
highest totals, and filtered the column describing each expense. “It looks like HR is functioning as
the central purchaser for shared expenses, like software or technical training vendors.”
“Good eye. How long would it have taken to find that information by just reading the
spreadsheets?”
“Forever. Maybe longer.”
“Probably,” Alden said, “but you were able to spot the anomaly and find the explanation in
just a few minutes. You’ll have the answer at hand if anyone asks during your next presentation.”
“Which is tomorrow morning,” Maeve said, glancing at the clock. “I should get to work on my
presentation.”
“Fair enough. Let’s talk after your meeting.”

A Better Reception
The next day Alden returned to Maeve’s office.
“Well?”
Maeve smiled broadly. “It was a hit. The managers liked being able to see their data compared
to that of their peers. Those with lower enrollments in CSE and ALE were scribbling like mad, so
I think we’ll see a spike in attendance before the next CEO review. Even Peg from finance looked
impressed when someone asked about the purple line and I was able to explain how centralized
billing made the numbers look a bit off.”
“Well done. She’s always been my harshest critic.”
“Thanks. Funny thing, though—even after they tore apart my first proposal, they were asking
questions about satisfaction surveys. They wanted to know if that data gave us any insights into
whether the learning was supporting the business.”
“Does it?”
“I was just looking for that. Here is what I have.”
Maeve tilted the screen so they could both see the massive spreadsheet of LMS data. Together
they looked at the standard responses to the company’s standard learner survey, a set of 10 questions
that all learners answered after completing any training housed in the LMS (Figure 14-3).



Figure 14-3. Student Evaluation

Student Evaluation
1. The course achieved the stated goals and objectives.

2. The materials (manual, videos, etc.) were valuable to my learning.

3. The course activities effectively supported my learning experience.

4. My knowledge in this area increased as a result of this course.

5. This course helped me learn things that are important to achieving my group’s goals.

6. I can use what I have learned back on my job.

7. The instructor knew the subject well.

8. The instructor presented in a manner that held my interest.

9. The instructor checked for understanding and answered the questions.

10. Please provide additional comments (on topics not mentioned above):

“There may be something,” Alden said, “but I’m curious what you think now that you’ve done a
business dashboard as well as a learning data dashboard.”
“Well, a lot of this data wouldn’t inform decisions that the business leaders would make,” Maeve
said. “For example, we asked whether learning objectives were clear, but we would not ask the business
leadership to take action based on low scores. Like you said before, that data belongs on a dashboard for
the training department.”
“Agreed. Anything for the business leaders?”
“Here,” Maeve said, sorting through the list of questions. “These two: ‘This course helped me learn
things that are important to achieving my group’s goals’ and ‘I can use what I have learned back on my job.’
They tell us that the content is relevant, and the employees don’t expect to encounter barriers to using it.”
“I think that’s right,” Alden agreed. “If we can sort it by course or curriculum, we can see if the
division’s employees are getting the right information and are supported by their own managers in
applying it.”
Working together, Maeve and Alden created a visualization that totaled the average response to each
question on a five-point scale (Figure 14-4).
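Computationally, that visualization is just a grouped average. A minimal sketch of the calculation, assuming the LMS survey export can be loaded into a table with one row per response (the column names and scores here are hypothetical):

```python
import pandas as pd

# Hypothetical slice of the LMS survey export: one row per learner response
responses = pd.DataFrame({
    "question": ["Q5 relevant to group goals", "Q5 relevant to group goals",
                 "Q6 usable on the job", "Q6 usable on the job"],
    "course": ["CSE", "ALE", "CSE", "ALE"],
    "score": [4, 5, 3, 4],   # responses on a five-point scale
})

# Average score per question: the values behind the bars in Figure 14-4
avg_by_question = responses.groupby("question")["score"].mean().round(2)
print(avg_by_question)

# The same grouping by course supports a more detailed breakdown if leaders ask for one
avg_by_course = responses.groupby(["course", "question"])["score"].mean().round(2)
print(avg_by_course)
```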
“Should we break this out by department?” Maeve asked.

Figure 14-4. Survey Bar Charts

“You could,” Alden said, “but I would wait. Doing that will result in a lot of supplemental charts that
may detract from your primary message. Perhaps first check in with the department leaders to see which
courses are most critical and only go into detail on those. They may also ask you to include how many
people actually responded to the survey to get a sense for how representative these summaries are. But this
is a good start for now.”
“That makes sense. But—” Maeve bit her lip. “I can’t help wondering why we have all the rest of this
data if we’re not reporting it. Habit maybe?”
Alden laughed. “Maybe. Or maybe we haven’t found the right audience yet.”
“So—the training department dashboard?”
“Possibly. You may also bring more of it into the business dashboard. You’ll figure that out as you
collect feedback from the business managers. But this is a good piece of work. Congratulations.”
“Thanks. One thing surprised me, though.” Maeve pulled up her original presentation and pointed
to some of the placeholders. “None of the managers asked me about application or impact measures, not
to mention ROI.”
“Why did that surprise you?”
“I thought those were the things all business leaders cared about.”
“And nary a word about Kirkpatrick’s four levels?”
“You were right,” Maeve said, pointing to her notes from their first meeting. “You said, ‘They don’t
always speak our language.’”
“They don’t. Truth is, they may never care about higher levels of evaluation, and certainly they won’t
need to see them for every training product. That’s a good thing—evaluations of application, impact, and
ROI can be very expensive to do and disruptive to the business. They are important, but you’ll want to use
those only for programs where it’s critical to do so.”
“What about the rest of the time?”
“Most of the time your business partners trust you to do your job, especially once you’ve shown them
you’re tuned in to their concerns by analyzing and reporting data they already care about. You should
accept that trust. When the time is right, you’ll expand your range of measures, but you may never do
enough to make that level of evaluation a standard dashboard item.”
“So how do you get that information in front of the right people?” Maeve asked. “I don’t meet regu-
larly with all the leaders.”
“There are a number of ways. Department newsletters or a message from you would be good. You
can also tie it to the dashboard with a briefing sheet.”
“I’ve never done a briefing sheet. What’s that?”
“I send a standard message to the audience whenever their dashboard is updated—it’s kind of like
the summary that you might receive from your financial planner highlighting trends and changes so you
don’t have to look at everything to see where the most significant changes have occurred. The briefing
sheet directs the reader’s attention to key changes or emerging trends in the data.”
“Like the purple line on the financial chart.”
“Exactly. In your briefing sheet, you might point out the purple line, why it seems unusual, and any
anticipated shifts based on historical data. After hitting the highlights of the dashboard items, you could
summarize any important data that aren’t part of the dashboard, like an ROI study, web traffic to a new
training site, or initial audience reactions to a new training program. Let me show you.”
Alden opened his computer and pulled up a memo from his email archive. “See? This briefing sheet
is for financial data, but the idea is the same: If managers aren’t connected to your dashboard’s location
or are viewing the email on their phone, they can still see the highlights and then decide if it’s urgent to
access the graphics.
“The bullets here,” Alden continued, pointing to the end of the briefing sheet, “summarize infor-
mation that might be important but isn’t updated regularly and thus isn’t a good fit for the dashboard.”
“So the briefing sheet orients them to the dashboard, which can orient them to the source data—if
they need it.”
“Exactly! You’ve created a path for them to follow.”
“Send me that so I can use it as a template?”
“Done.”
Maeve glanced up at the clock. “Looks like another day is done. We still didn’t get to the training
department dashboard, though.”
“That’s another day’s work,” Alden said. “But, if you follow the principles we applied to the business
dashboard and focus on the decisions and actions the audience needs the dashboard to inform, you’ll do
fine. And I’m always ready to help.”
“Thanks,” Maeve said. “You’ll be on speed dial.”
Alden stood and they shook hands. “Walk you out?”
“No thanks. I’m going to finish up my notes and start my first briefing sheet. I want to get back to the
leadership team while the topic is still fresh.”
As Alden left, Maeve returned to her computer. Reviewing the visualizations on the new dashboard,
she drafted the following memo.

Managers,

Data in the training dashboard have been updated through the end of the most recent month and can be found
at this link. Going forward, updates to this dashboard will be provided to you on the 15th of each month
with data through the end of the previous month.
• Total investments are on track to decline for the third consecutive year. If headcount remains stable,
the cost per employee could be as much as 10 percent less than previous years, indicating an increase
in efficiency.
• Training spending reflects seasonal patterns from previous years. HR spending has a larger than
expected presence due to its role as a central purchaser for shared training items and services.
• Total attendance exhibits seasonal variance consistent with previous years. If that pattern continues,
we should be past our busiest month and see lower numbers for the remainder of the year.
• Attendance in CSE and ALE programs is consistent with historical trends. Tech support is the
greatest consumer of the CSE program; that figure is reflected in the large number of completions
associated with the service center in Memphis.
• Learner feedback continues to show that learners find the content relevant and usable. However,
scores for both questions declined in Q2, when course completions were at their highest levels. I will
be exploring further to see if increased volume led to a drop in training quality.

Please contact me with questions or concerns.

Regards,
Maeve



With that, she sent the briefing sheet to Alden for review, invited him to drop by the following after-
noon to talk about the training dashboard, and logged out for the day.

A Little Closer to Home


The next morning, after Alden had replied with a thumbs-up emoji, Maeve sent the briefing sheet to the
managers in her division. Then, using the guidelines she and Alden had discussed for the business dash-
board, she got to work on the training department dashboard. First, she outlined the things her training
stakeholders would care about (Figure 14-5).

Figure 14-5. Stakeholder Priorities for Business and Training Dashboards

Business Dashboard              Training Dashboard
Money                           Budget
Efficiency                      Development time; Delivery time
Quality                         Learner feedback; Business feedback
Customer service                Learner feedback
Whatever the CEO cares about    Whatever the manager cares about

Next, she spent some time reflecting on the decisions that might be informed by training dashboard data.
• Budget tracking would be helpful, although the dashboard would contain less detailed
information than the financial reports all managers receive. Still, having a high-level update
easily visible each month would serve as a reminder to the team that the department had to
function like its own small business, even though it was not a profit center for the company.
• Maeve’s department already used the Workplace Learning and Performance Scorecard as its
benchmark for development time; including it on the dashboard would help confirm that the
team was working as efficiently as their counterparts in other companies. However, Maeve
was concerned that the time to develop training from scratch or even to do a comprehensive
update might be too lengthy to lend itself to monthly reporting. Delivery time (average
course length) would indicate whether content was being delivered efficiently and hours per
learner would indicate whether their target audience was making good use of the training
department’s products.
• The extra information from survey results housed in the LMS would be a rich source of quality
data. Maeve was aware that experts disagreed on which questions—if any—should be asked
at the end of a training event. While she knew their standard survey was less than perfect, she
decided that imperfect data was better than none and chose to include questions that solicited
learners’ perspectives about the relevance of what they learned, whether they believed their
manager would support them in applying what they learned, and if they believed they would
see positive results as they applied what they learned. For customer service, Maeve included the
results for a question on how easy it was to find and enroll in training.
• For “whatever the manager cares about,” Maeve reproduced the data on ALE and CSE
from the business dashboard. She smiled as she recalled hearing her boss often saying that the
CEO’s interests were her interests.
After lunch, Alden knocked at Maeve’s open door.
“Now a good time?” he asked.
“Never better. Come on in.”
When they finished reviewing Maeve’s draft training dashboard, Alden smiled broadly.
“Great job,” he said. “You’ve treated the training department like its own business.”
“Well, you know what they say: ‘We don’t have time to be in the curiosity business.’”
“Exactly! I don’t see anything that wouldn’t inform a decision or lead to action by your team. And
I like that you’ve duplicated information about the two marquee programs that also appear on the
business dashboard. That’s a good way to make sure you are looking at business data as well as training
department data.”
“That’s what I thought.”
“What are your next steps?”
“I need to build out the visualizations and review them with my team. It’s important to get this in
place quickly to support the idea of making better use of data to inform our work.”
“You’re not calling it ‘data-driven’ then?”
Maeve shook her head. “We need a lot more than the dashboard to do our work: We’ll draw on
customer conversations, the deep experience of the team, and best practices from other companies. Even
then, the dashboard will continue to evolve as we learn what data are significant and which are stable
enough that we don’t need to review them every month.”
“I was just kidding,” Alden chuckled. “I know how you feel about that phrase.”
“It is one of my hot buttons. But I think we can help replace ‘data driven’ with ‘data-informed’ if we
can provide informative data in a useable format.”
“Agreed. And you’ve created a good foundation to help both your business partners and your team
make data-informed decisions about training. Let’s meet again in a few months to see what you’ve discov-
ered from the dashboards, how you might improve them, and what other kinds of measures you might
want to add to your data portfolio.”
“Sounds good. I look forward to it. In the meantime, I have plenty to keep me busy here.”
“No doubt. I’ll get out of your way for now, but call if you need help.”
Alone in her office, Maeve captured some final notes as a guide to developing future dashboards,
using Alden’s quote as a title.



We Don’t Have Time To Be In The Curiosity Business
• Dashboards need to inform and improve decision making, not merely display data.
• Dashboards should be created to inform the perspective of the business being served.
• Existing data sources will likely contain much of the information needed for good dashboards. And, if not,
start with what is already available.
• Visualizations are more effective than numbers. (Obviously, the data used to create visualizations must be
accurate and grounded in evidence—as Tableau’s Kim Magden points out, “You can have terrible data
that is displayed beautifully and influences others [the pretty face syndrome]. You can have powerful data
that is displayed inappropriately and falls flat. Neither of these options drives business results.”)

As she left for the day, Maeve reflected on what she’d created and how much more could be achieved
with well-structured dashboards. She saw how they could inform, and be informed by, strong dialogue
with her business partners. More than ever, she realized that she needed to maintain open dialogue with
her partners in the business to understand the decisions they made in relation to their employees’ training
and development. Creating a tool that presented data in ways that made those decisions easier and more
effective could help make the relationship between the training department and the business more than
just white space on an org chart.

Summary
All aspects of business face the challenge of dealing with data in quantities and varieties never before seen;
the training and development groups that support those businesses are similarly challenged. Dashboards
are popular tools for summarizing, tracking, and presenting that data. Developing dashboards that enable
the business—by answering questions, tracking or predicting trends, or informing decisions—is a critical
skill. Unfortunately, many dashboards present information that is confusing.
In this chapter, you saw how one fictionalized learning professional proceeded on the journey from
irrelevant to mission-critical dashboards. You learned that while much of the data available to the training
department is important to it, that data isn’t important to its customers or stakeholders. You also discov-
ered how to draw connections between the products and services provided by training and the needs and
interests of business leaders. In addition, you saw how data from nontraining sources like finance, HR,
and program owners can be used to provide a more complete data dashboard.
Traditional training data like course completions, survey results, and test scores can be useful, but
likely won’t appear on a business-facing dashboard. Those items would fit better on a dashboard for the
learning team (those who will be accountable to take action based on those items).
Whether the dashboard informs business leaders or training professionals, analysis and interpreta-
tion are valuable services. The person performing the analysis can provide a succinct memo to highlight
meaningful trends, point out milestones or anomalies, and provide context as needed. While businesses are
rarely fully “data driven,” the tools covered in this chapter can help ensure that learning-related business
decisions are data-informed.

Key Takeaways
• Dashboards need to inform and improve decision making, not merely display data.
• Dashboards should inform the perspective of the business being served (even when that “business”
is the training department).
• Use briefing sheets to direct the reader’s attention to key changes or emerging trends in
dashboard data.
• Existing data sources will likely contain much of the information needed for good dashboards. If
you don’t have all the data you want, start with what you have.
• High-level evaluations (application, impact, and ROI) are important, but should be used only when
it’s critical to do so due to their cost and potential disruption to the business.
• Visualizations are more effective than numbers.

Questions for Reflection and Further Action


1. What data do you currently collect that could be reported in a business-focused dashboard?

2. What decisions do your customers need to make that could be better informed with data from the
training function?

3. If asked, could you name the top five focus areas that your customers would want more data about?

4. From your perspective, how does being data-informed differ from being data driven, and why is the
distinction important?



15
Impact: Making It Happen
Graham Johnston

Learning functions have different roles in their respective organizations, based on how they are viewed and
used to support the business strategy and how they drive the overall organizational culture of learning. But
they all share a common objective to be effective, value-added, and impactful in building capabilities that
drive improved performance for the organization. Measurement is how we know if we’ve achieved that,
but it is just a means to that end. We measure learning to understand what worked and what didn’t, but
more important, to direct our efforts to improve so we can maximize our impact. In this respect, everyone
in the learning function is responsible for driving impact and should—as performance consultants to the
business—have the mindset and capabilities to do so.
The goal is to implement and maintain a learning impact strategy, not just a learning measurement
strategy, and there are two ways to approach this:
• determining the value and impact that the learning function provides to the business
• showing the effectiveness of individual learning solutions.
For both, impact is achieved by defining outcomes up front, and then using those to shape planning
and design, the measurement approach, and its continuous improvement efforts. Let’s explore these further.

The Value and Impact of the Learning Function


The learning function at a large professional services firm uses a four-part construct to define the
macro-level value and impact to the business and provide a basis for strategic planning and how the
function assesses and reports its performance (Figure 15-1).

Figure 15-1. The Learning Function’s Value and Impact Construct

[A four-part construct: Business Alignment and Impact, Effectiveness, Efficiency, and Innovation]

While designed for the professional services environment, this construct is applicable for learning
functions of any size and scope and across all industries. It includes the following components that define
the goals for a high-performing learning function:
• Business alignment and impact. Develop and deploy solutions that address business
priorities, build required capabilities, and enable business performance.
• Effectiveness. Enable learning, provide content that is aligned with role requirements, and
improve individual, team, and organizational performance.
• Efficiency. Optimize resources and manage budget, schedule, and vendor spending.
• Innovation. Incorporate leading practices and creative problem solving to address
development issues, needs, and challenges.

Defining Value and Impact


This construct should be used first to guide our planning efforts by defining the outcomes we want to
achieve, and then as a basis for measurement. As we develop the learning function’s annual strategy and
plan, we should be identifying goals against each of the four components by asking:
• What business priorities should we support or enable through learning? What does the business
need its people to know and be able to do?
• How can we best provide learning and development solutions that address role-focused
learning needs and help people perform better?
• How can we operate efficiently and do more with less?
• How can we creatively address problems and opportunities for how we design, develop, or
deploy learning solutions?

With these questions answered, we have defined the goals and outcomes that serve as our north star
for implementing our strategy and plan, and determining how we measure and articulate success.

Measuring Value and Impact


So how do we measure success for this macro-level impact? Let’s start with a few guiding principles. For
one, don’t let the initial perception that an outcome or goal cannot be measured convince you to not
pursue it to begin with. For example, if a business priority dictates that part of the workforce has a certain
set of capabilities, we wouldn’t decide to not build those just because we weren’t sure if or how we would
know we were successful, right?
And that ties to the other principle, which is to cast a wide net when determining the ways
success can be measured. We all tend to seek out the data—or more specifically the numbers—but
quantitative measures may not be available, and they’re not always representative of a given compo-
nent’s success. Additionally, qualitative proof points, anecdotes, and testimonials can be equally if
not more telling than the numbers. Think back to the example about the business-driven capability
need. Sure, we can quantify the number of people who completed associated learning programs, and
maybe we can even capture assessment data where it’s collected. But does that tell us if the capability
was really built, or if people performed better as a result? What if we asked stakeholders how they
were seeing people better apply this capability? In the professional services firm context, we might
solicit feedback from project team leaders and clients around how project team members have gotten
better at diagnosing client needs and opportunities. That is certainly reflective of our success in
supporting that business objective.
What other numbers can we review? For example, what can we look at to know if we are:
• Aligned with and affecting the business? Business development or sales measures,
operational performance, and talent-focused measures, such as retention and individual and
team engagement.
• An effective learning function? Aggregate learning program evaluation data, team leader
feedback, and individual and team performance.
• An efficient learning function? Cost savings and cost avoidance, our responsiveness to the
business, and our design, development, and deployment timeframes.
• An innovative learning function? Creative problem solving; enabling or accelerating
methods, tools, and solutions; and simply trying things out, whether they are successful or not.
So when and how often should we measure the impact of the learning function? It depends
on the organization, but a best practice should be to at least capture impact against this four-part
construct on a bimonthly basis. These are, after all, the outcomes the learning function seeks to
achieve, and so logically we should take a regular pulse of how we are performing against them. With
updated, telling, and actionable data on hand, we can proactively and responsively articulate impact
to the business. This cadence also allows us to regularly validate progress or identify and respond to
any necessary course corrections along the way.

Articulating Value and Impact


We’ve covered how we can define the impact of the learning function and how to measure that. But how
should we demonstrate our value and impact to the business? We already have a leg up because we’re
focusing on what’s most important to them—their business priorities and the capabilities the workforce
needs to perform at a high level. Effectiveness, efficiency, and innovation may not be as important, but
they do reflect things that should matter to the business.
You might be thinking that while this depiction of the learning function’s impact makes sense, your
stakeholders in the business aren’t asking for this. A professional services firm faced a similar situation—
historically they had reported on learning activity and output, but not necessarily impact. The business
was used to and expected to see information on budget and spend, learning solutions that were developed
and deployed, learning hours offered, participants and completions, and aggregate learning program
evaluation data. But as the organization’s learning impact strategy evolved, that information was comple-
mented by qualitative and quantitative data showing accomplishments against the impact construct, which
included business alignment and impact, effectiveness, efficiency, and innovation. Business stakeholders
hadn’t previously sought this out, but as it became part of the regular conversation around the learning
function’s performance, they developed an immediate appreciation for it and began to view learning in
a different light—as an enabler for organizational performance. This level of preparation, proactiveness,
and responsiveness also served to keep the learning function out of a defensive posture where they would
have otherwise been asked to show ROI or their investments would have been challenged.
Dashboards, scorecards, and other tools can be used to capture and report on the learning function’s
impact, but what’s most important is that value- and impact-focused conversations are occurring with
stakeholders—no matter what the communication vehicle may be. The professional services firm devel-
oped a quarterly dashboard that captures quantitative and qualitative data against the four components
of the impact construct, which was then shared with key business and talent leaders. The learning lead-
ers also captured their own accomplishments against the construct using a shared document that team
members updated with qualitative and quantitative proof points at the end of every month. This meant
current information was always available to draw from when the need or opportunity arose to speak to
how the learning function served as a performance enabler and key engine for building critical capabilities
to achieve business outcomes.
Learning functions are best positioned to drive impact if they plan for it up front by defining outcomes
with the business, execute and maintain their strategy with those outcomes in mind, and measure and
demonstrate their impact against them. With these insights into how to optimize the aggregate perfor-
mance of the learning function, let’s focus now on the effectiveness of individual learning solutions.

Learning Solution Effectiveness
The effectiveness of learning solutions of any type—formal learning, on-the-job development, curated
content, mentors, networks, and so forth—is defined by the learning gained, the applicability of the content
or experience to one’s role, and, most important, its influence on performance. Similar to the value and
impact of the learning function, the effectiveness of individual learning solutions rests on the up-front defi-
nition of outcomes and how those shape the design, measurement, and continuous improvement.

Defining Effectiveness
As learning professionals, we’re all accustomed to being asked to “develop a training course” before
any discussion has occurred around drivers and needs, and what type of learning solution—if any—is
most appropriate. An up-front definition of outcomes—how the information will help the business or
what learners should be able to do, for example—by both the learning team and business stakeholders is
important for steering the learning content in the right direction and for achieving effectiveness. There
are three types of outcomes to define at the outset and get alignment on: business objectives, performance
objectives, and the learner experience (Figure 15-2).

Figure 15-2. Three Types of Learning Effectiveness Outcomes


[Learning effectiveness outcomes: Business Objectives, Performance Objectives, and Learner Experience and Emotions Objectives]

Business objectives reflect the business needs or priorities that the learning solutions are intended to
support and could include:
• Service and solution delivery. An organization needs to improve or increase a service it
provides to its customers, requiring that its people develop and apply a specific capability.
• Sales and business development. An organization seeks to grow business in a market
segment and needs to strengthen knowledge of services and solutions, as well as customer
relationship and sales skills.
• Operational performance. An organization must demonstrate regulatory or compliance
requirements, achieve process enhancements, or improve cost and revenue management.
• Talent. An organization is looking to improve retention for its new hires after their first year
and looks to improve their engagement.

Typically, at least one of these business objectives is driving the learning needs, and it’s important to
confirm that objective so it can be referenced throughout the solution’s life cycle.
Most business objectives have a corresponding capability or set of capabilities that define the second
type of outcome—performance objectives. More and more, performance objectives (what the learner
needs to be able to do) are taking the place of learning objectives (what the learner needs to know)—or
at least complementing them—because just learning something is not enough and performance is what
matters most. When it comes to marketing a solution to a learner, sharing how that solution will help
their performance will be more resonant and compelling than simply telling them what they will learn.
There are typically multiple performance objectives for a given learning need, which helps define content
components and the delivery structure.
The last type of outcome—which isn’t often applicable for learning needs—is around the learner’s
experience and emotions, or how the learner should feel. Take, for example, the retention objective and
need to improve employee engagement discussed earlier in this chapter. In that instance, campus hires
may be more engaged because they feel prepared to perform in their new role, connected to the orga-
nization, energized and excited to perform, or inspired to make an impact. These could be the desired
outcomes for an onboarding experience, where the primary outcome is new hire engagement and second-
ary outcome is improved retention.
Defining these outcomes provides a foundation that shapes the learning solution design, measure-
ment, and refinement—all toward achieving effectiveness.

Designing for Effectiveness


The design of learning solutions is a separate discussion, but the relevant point here is how the defined
outcomes should inform decisions in the design process. When selecting learning solutions, determining
content to include, or deciding how content should be delivered and what practice, application, and feed-
back should be incorporated, we should be asking ourselves regularly if and how the design decisions align
with business objectives, performance objectives, and the learner experience and emotions. What we want
to avoid is omitting design elements that are needed to meet the outcomes or, more commonly, including
design elements that don’t serve them. It’s important to remember the outcomes we defined
up front so we don’t go down the path of building a learning solution that doesn’t do what we need it to.

Measuring Effectiveness
A risk in learning effectiveness measurement is that we work from data that are available but aren’t
actually relevant to our outcomes, resulting in unnecessary work and a mischaracterization of solution
effectiveness. Most of us have had experiences where we’ve captured and reported on data that isn’t
particularly telling toward what we are trying to achieve, such as tracking down and highlighting reten-
tion data when reduced attrition wasn’t actually a goal for the development experience, or proficiency
data for a capability the learning program wasn’t actually intended to build. This is where a definition
of success in the learning solution outcomes is so important, because it focuses our measurement efforts
on data that validates that the outcomes were achieved, as well as where and how they weren’t, so we
know where to direct further analysis or refinement.
When defining the outcomes for a given solution or experience, a best practice is to identify the
measures, methods, and sources for each business objective, performance objective, and learner expe-
rience or emotion. This directs the measurement approach at the outset and helps capture the right
qualitative and quantitative data. Let’s examine how best to measure these three categories of outcomes.
By definition, the measurement of business objectives dictates that we are capturing business data
rather than learning data. This can be challenging because we don’t own that data and it may be diffi-
cult to access. However, the reality is that we need to point to the data to demonstrate how we have
enabled the business objectives, even indirectly. Take, for example, the business objective of increased
sales in a customer segment. In defining that outcome, the business may target a 20 percent increase in
sales. We then set out to develop and deploy a series of programs to build service or solution knowledge
and customer relationship and sales skills. After that curriculum has been developed and delivered, and
after the audience has had a chance to apply it and perform, we would look to the business to see if that
20 percent increase was realized. While the learning function is not solely responsible for meeting that
target, it’s the best indicator of our influence. Therefore, it’s important to set these expectations with the
organization when agreeing on business objectives, and to establish access to the business data we need to
demonstrate our effectiveness.
The shift from learning objectives to performance objectives presents more of a challenge because
it’s easier to measure if someone has learned something than it is to measure if they performed better. But
the latter is still the right outcome to focus on, and there are different ways we can determine how perfor-
mance has improved. Some performance objectives are easier to measure than others; for example, if
we’re looking for an increased ability to produce widgets, we can point to the widget production rate. But
if we’re talking about a capability like critical thinking, we need to consider the different ways a learner has
successfully applied that capability. The first and most common method is to get the learner perspective,
often through post-program evaluations conducted upon program completion that ask what the partici-
pant learned, if the content is applicable to their role, and if it will help improve performance. Then we
can ask those content and performance questions again 30 to 90 days after program completion to gauge
what a learner thought was going to happen versus what actually did. However, just because the learner
says their performance improved doesn’t necessarily mean that it did, and that’s where we can link to other
means for assessing performance. Feedback from team leaders or supervisors or even customer feedback
on a given capability can be very telling; for example, hearing a manager say, “Since he attended that
program, I’ve really observed an increase in Steve’s critical thinking skills, and here’s how,” is powerful.
And where it may be difficult to isolate and diagnose a specific capability or performance objective, you
can bundle them, asking, “Since attending that program, has the team become much better at identifying,
assessing, and addressing the client’s issues, needs, and opportunities?”
When it comes to the learner’s experiences and emotions, the best way to assess that is to ask the
learners. Let’s go back to the campus hire engagement example. Here, the solution might be a six- to nine-
month holistic onboarding experience designed to make them feel prepared to perform, connected to the
organization, energized and excited, and inspired to make an impact. Campus hires could be asked how
they feel against those four elements at multiple points throughout the experience, with the goal being
that those feelings are stronger and stronger each time.

Refining for Effectiveness


We’re not doing our job as performance consultants if we simply measure and report on solution effective-
ness and stop there. Very few learning solutions are perfect, and the real value to the business comes when
we identify and address the parts of our learning solutions that weren’t as effective as we had hoped. For
example, what if evaluation data showed that participants learned a lot from the program and the content
was applicable to their role, but there wasn’t a noticeable improvement in their performance? Sharing the
good news and the not-so-good news with stakeholders and making a commitment to improve go a long
way toward being seen as a trusted business advisor.
The first thing we need to do is make measurement and continuous improvement core components
of the learning solution life cycle, as well as part of expectations for all learning professionals as part of
the impact mindset. We also need to collect telling and actionable data, and there’s an art and a science to
this. The science outlines how we define outcomes for the learning solution and measures, methods, and
data sources for each, while also determining what data we need or, more important, what data we do not
need. The art comes in how we solicit that data, so it tells us what we need to know and is targeted enough
to react to. Evaluation, survey, interview, or focus group questions represent the primary means for this.
These guiding principles can help drive response rates and overall respondent engagement:
• Focus questions around the three primary components of effectiveness as it pertains to
performance objectives—learning gained, applicability of the content to role, and impact on
performance.
• Don’t ask questions seeking data you don’t need or that you have gathered from other sources.
• Do not include multiple questions that are or even appear to be similar.
• On evaluations or surveys, don’t include “double-barreled” questions, which have more than
one statement for learners to respond to in a single question.
• Pose questions as absolute statements so respondents can more easily indicate their level of
agreement. (For example, “What I learned in this program is essential to my work.”)
• Limit the number of questions asked and use plain language and simple sentence structure so
the intent is clear.

These guidelines are intended to prevent respondents from disengaging, hastily answering ques-
tions, or not answering them at all, which will maximize the quantity and quality of data that we can
analyze and act upon to improve learning solutions.
What could a continuous improvement process to refine the learning structure, content, and
delivery look like? For example, if we find that business objectives haven’t been met in the way the
business had hoped, it could lead to a conversation around the role of the learning solution and if
something could be or needs to be changed. For performance objectives and learner experiences and
emotions, it is easier to isolate data to show if those were achieved, and if not, why. For example, if
program evaluation data show that learning was achieved but respondents didn’t think the content
was applicable to their role, it could trigger an examination of whether the program’s target audi-
ence was accurate or if there was a proper understanding of performance expectations. If the data
showed that learners weren’t effectively applying a given capability, that would direct us to assess
the associated content and how we’re delivering it, including methods for practice, application, and
feedback.

Summary
As a learning function, our goals should be the same ones as the business, and we should speak
their language with them. Their success should be our success, so that’s how we can ensure we’re
focusing on the right things. We exist not just to enable learning, but to drive individual, team, and
organizational performance. That mindset is what makes us valuable and impactful as performance
consultants.
The conversation is no longer just about measurement—it’s about impact and how we come
to a shared definition of it up front, use it to guide our planning and design, and then to inform
that measurement and our continuous improvement efforts. In creating and maintaining a learning
impact strategy, we bring value and impact to the business at the macro level as a high-performing
learning function, while also delivering effective learning solutions that address specific outcomes.
All learning professionals should adopt this impact mindset, and everyone should be held account-
able for impact and responsible for analysis and continuous improvement in addition to design and
development.

Key Takeaways
• Change the lens for how we define, drive, and demonstrate the value and impact of the learning
function, to be better positioned as trusted advisors and performance enablers for the business.
• Speak the language of the business with the business and make their definition of success our
definition of success.
• Define outcomes—including business and performance objectives—up front to inform and target
design, measurement, and continuous improvement.
• Focus on telling, actionable data that align with those outcomes, and give equal attention to
qualitative data and the power of anecdotes and observations.
• To whatever extent there are gaps and needs, there is a tremendous opportunity to enhance how the
learning function is viewed and used and to strengthen its role and brand.

Questions for Reflection and Further Action


1. What does your learning function’s impact strategy look like?

2. To what extent do your business stakeholders view you as performance enablers?

3. What would they say if asked about your accomplishments and how you helped them achieve their
objectives?

4. What is the level of ownership and accountability for driving impact across your entire learning
function?

Section 5
Stakeholder Collaboration

This section focuses on understanding, communicating with, and influencing those we serve in our
organizations. While the primary focus is on stakeholders—those who hold the keys to success via
knowledge, influence, or budget—it can also include customers, vendors, and business colleagues. We
included a section on stakeholders because we recognize the value of collaborative partnerships and
governing arrangements when it comes to our ability to build organizational capabilities. But this
section also reflects on how stakeholder relationships can be dependent on the way the learning
function is structured.
Marie Wehrung starts chapter 16 with a business case for the overarching purpose—for example, why
stakeholders are important. She then uses her expertise in human-centered design to provide explicit
details for comprehensively understanding all stakeholders using the stakeholder mapping process. This
process is not just about naming stakeholders and identifying their respective roles; it includes
understanding their point of view and the influence they have in corporate decision making.
One of the goals for writing this book was to provide an opportunity for Forum members to share
interesting projects and practices with the external community. This section features entries from
several members who volunteered to write partial chapters using stories about collaborating with and
managing stakeholders. In chapter 17, four members worked with the editors to share snippets. Bryan
McElroy provides a scenario reflecting what can happen when stakeholders are not involved. Rachel
Hutchinson explains how to use the power-interest matrix. Emily Isensee shares her experience using
advisory groups, and David McGrath uses his sales background to explain the importance of building
deeper relationships with targeted stakeholders.
In chapter 18 we look at structure and how it influences the configuration of stakeholders, especially
in terms of a governance board. Learning structure generally runs from completely centralized to
decentralized; a popular hybrid, the federated model, sits in the middle. Using the ideas about
structure and governance as context, Kozetta Chapman shares a scenario for learning about different
structures and the importance of governance. Graham Johnston then provides lessons learned from a more
established and mature model.
16
Go Slow to Go Fast
Marie Wehrung

“If you work for and eventually lead a company, understand that companies have multiple stakeholders including
employees, customers, business partners, and the communities within which they operate.” —Don Tapscott

In our fast-paced world, the last thing anyone wants to do is slow down the process of getting work done.
The more people involved in a given project, the more time and effort it takes to manage them and the
more likely it is that their questions and input will inhibit quick action. Why, then, is it
advisable to not only consider stakeholders, but also actively engage and collaborate with them?
Stakeholder collaboration is a classic example of going slow to go fast. It can involve identifying
relevant stakeholders, determining the best ways to involve them, communicating with them, asking
them to weigh in on the work in question, and coming to consensus with them. While all of this engage-
ment with stakeholders takes some time, the investment on the front end ultimately saves time on the
back end. Analyzing and soliciting input from your most knowledgeable and significant stakeholders
can help shape your initiative, ensure it aligns with the business strategy, secure their support, and yield
a better outcome as a result of their participation. Key stakeholders who have high interest in your
work, and wield much influence, may assist you in accessing necessary resources for your project, such
as time, money, and people.
Stakeholder collaboration enables you to gain buy-in for your products or services, reduces the need
for rework, creates champions who can help positively spread the word about your initiative, and may
expand your bandwidth if stakeholders are so engaged that they want to partner with you. We look at
issues from different vantage points than stakeholders, so securing their input helps to paint a clearer
picture of their needs, as well as the best ways to prioritize, satisfy, and serve them.
When you do not consider all stakeholders or collaborate with at least some of them, you run the risk
of either treating all people affected by your work in the same way, or giving disproportionate weight to
issues raised by stakeholders who aren’t the targets of the solution, which may produce the wrong solution.
Not all stakeholders are created equal, so it is essential to identify all the groups touched by your work,
understand the roles they play and how your work affects them, and determine whose interests require
greater consideration.
You also run the risk of unwittingly creating obstacles to your solutions, programs, and initiatives. For
example, stakeholder groups who feel left out may become impediments to the process—speed bumps
that slow down or attempt to derail your work because they lack buy-in, and have no voice to address their
concerns, objections, and questions. Once upon a time, my organization had a process for reconciling
costs associated with travel, which was burdensome, time-consuming, and hated by all who had to carry it
out. After hearing complaints from the community over a number of years, the financial powers adopted
a new process and tool for reconciling travel expenses. They trained users on the new tool, sang its praises,
and trumpeted how they had heard users’ complaints and now had the solution to address them. Unfor-
tunately, they had arrived at that solution in a vacuum. They did not engage stakeholders to identify the
problems with the existing process or what they desired in a solution. Instead, the financial powers took
their understanding of the issues people had with the travel reconciliation process, and found a solution
on their own that they believed would be best for the organization. The result? The people who actually
had to use the new tool and process disliked it even more, and almost missed the old one. The financial
powers then had to overcome an abundance of bad press within the organization and expend a lot of time
and energy to help users understand how to leverage the new process and tool in the least burdensome
way possible.
The moral of the story? Taking it slower on the front end actually benefits you (in terms of time,
buy-in, and collaboration) on the back end.
Going slow to go fast doesn’t seem so undesirable now, does it?!

Setting the Context


Practices around stakeholder collaboration will vary and be influenced by a number of factors, such as
the industry, size, span of reach (global, national, or local), status (public or private, for profit or nonprofit),
and organizational culture. My perspective on and experience with stakeholder collaboration comes from
the world of academia. Specifically, I work at a midsize, nonprofit, private university with a single campus
in one location in North America. While we are not considered a global organization, we enjoy and
experience international influence from our collaborations around the world, as well as our international
students, staff, faculty, scholars, alumni, donors, and more. We have approximately 7,000 students (57
percent undergraduate, 43 percent graduate) and 3,300 employees (72 percent noninstructional staff,
28 percent faculty). My talent and organization development team has four members and is part of the
22-member HR department. Our team budget is microscopic, so we have to be strategic in our approach
to talent development, which also means partnering with and leveraging our campus stakeholders to
expand our reach and impact.
Culture is the strongest driver of our practices around stakeholder collaboration. We are a high-
touch, inclusive organization. Work gets done through building relationships and leveraging social capital.
Individuals and departments who drive their initiatives without appropriate consideration of and involve-
ment from stakeholders do so at their peril.
This chapter does not attempt to address stakeholder collaboration in all possible settings. Rather,
it offers points for you to consider, tools to try out, and concepts to translate into your organization, no
matter its size, complexity, or culture.

Identifying Your Stakeholders


Before you can collaborate with stakeholders, it is essential to identify everyone with an interest—a stake—in
the program, product, or process you’re designing. Ideally, you would work with your team to ascertain
every individual who is affected by your work, can wield influence and power over it, or has a vested inter-
est in its success (or failure). Expect to discover many more stakeholders for your product (or program,
process, or design) than you might initially have anticipated! This chapter will present two different meth-
ods for identifying stakeholders. The first uses graphics and visual imagery to represent stakeholders; the
other uses words and lists to accomplish the same. Neither tool is better than the other, so let personal
preference drive the method you choose.

Stakeholder Mapping—The Graphic Method


According to LUMA Institute (2012), a leading professional training and coaching organization, a stake-
holder map is a network diagram of all the people involved with or affected by a given system design, such
as an experience, change process, product, or service.
Stakeholders are people who can influence or touch a product or service during its life cycle. Stake-
holder mapping is a method to understand and frame a project or change initiative. It clearly scopes out
who has what level of input and interest, which helps align decisions. A stakeholder map is especially
helpful in visualizing and understanding the people involved, the roles they play, and the relationships they
have, including hierarchies and interactions and, most important, their influence on the execution of the
strategy, system, product, or service under design.
Stakeholder maps are not lists. They’re visual documents that use icons to provide a sense of real
people and their impact; groupings, arrows, and colors to show relationships; and speech bubbles to show
thoughts, positions, influence, and impact. This results in a deep understanding of the stakeholders that
influence the work, and facilitates our ability, as learning leaders, to align learning with the business strategy.
The basic steps for conducting stakeholder mapping are to:
• Identify a subject area on which to focus.
• Convene a team of collaborators with diverse perspectives.
• Generate a broad list of stakeholders.
• Draw simple icons to represent individual people (don’t use a single icon to represent categories
of people).
• Write a label to describe each individual’s specific role or title.
• Write a speech bubble to summarize stakeholder mindset, thoughts, and feelings.
• Draw lines with arrows connecting the people.
• Label lines to describe relationships between people.
• Circle and label related groupings (LUMA Institute 2012).
Are you ready to try your hand at stakeholder mapping? Use the following exercise to walk through
the process.

Activity: Using the Graphic Method for Stakeholder Mapping


Time: 65 minutes

Group size: As presented here, this method is designed for groups of five to seven participants. If you
have a larger group, you could break it into smaller groups, or adjust the instructions to accommodate
working with the entire group at the same time.

Materials:
• pens
• permanent markers
• easel poster sheets—one per group
• sticky notes (2” x 2” work fine) with at least 15 sheets per person
• whiteboard markers in red and green

Expected outcomes: The activity provides the opportunity for a small group to develop a plan, do
a task, and gain a deeper understanding of the people involved in an initiative (for example, a strategy,
product, process, or program design). More specifically, it allows the participants to:
• Establish shared ideas and understanding about stakeholders.
• Focus on people within the work context and their perspectives on an initiative.
• Identify those who will influence your success, either positively or negatively.

Process:

Step 1 (5 min). Identify a trigger question to use during the brainstorming process. For example, "In general, who are the stakeholders needed to execute an organizational learning strategy?" Customize the question to meet your needs. Note: The trigger question should relate to the initiative at hand; the answer to the trigger question (refer to step 2) should yield stakeholders who have roles, and varying degrees of interest and influence, in the success of the initiative.

Step 2 (5 min). Individual ideation: Individually and without talking, do a brain dump listing all stakeholders in response to the trigger question from step 1. Draw an icon representing each individual on a sticky note and label the role. Be sure to include any stakeholder who plays a role in your initiative (as identified through answers to the trigger question). Note: Be sure to represent individual people or roles as single icons; avoid representing categories of people as single icons. Have fun creating your icons. However, if you feel stuck, you can always create a simple icon by drawing a circle for the head and an upside-down U shape for the body, labeled with the role (for example, "Trainer").

Step 3 (20 min). Group visualization: In round robin format, have each participant post a sticky note on the poster and share the stakeholder. If others have drawn the same role on one of their sticky notes, post them on top of each other. You'll likely see instances where the title is different, but the role is the same. Then have the next person post one of their roles. Repeat this process until all sticky notes have been posted. (Example icons might include roles such as "ISD Business Partner" and "CFO.")

Step 4 (10 min). Clustering: After every participant has posted their stakeholders, cluster related roles together. For example, there may be several roles within the training department or under the chief learning officer, while others are related to technology or finance, and so on. Draw a circle around each group on the poster. Do this step as a group.

Step 5 (10 min). Influence labeling: Discuss the influence stakeholders have on the change initiative. Then, draw in speech bubbles representing their top-of-mind emotion, thoughts, or position related to the change. Put the comments on the sticky notes or in the circled group. Additionally, highlight any specific relationships within the group that will directly affect the results. Do this step as a group. (Example speech bubbles: a CEO asking "How does this relate to profit?" and a manager saying "We do not have time for this training!") Some criteria for determining influence include:
• Contribution (value): Does the stakeholder have information, counsel, or expertise on the issue that could be helpful?
• Legitimacy: How legitimate is the stakeholder's claim for engagement? What is the stakeholder working to achieve?
• Willingness to engage: How willing is the stakeholder to engage or prioritize this work? (This group would have to add this to their work agenda.)
• Influence: How much influence does the stakeholder have? (You will need to clarify who or what they influence, such as strategic decisions or budget allocations.)
• Necessity of involvement: Is this someone who could derail or delegitimize the initiative?
• Blockers: Is there anyone who may block strategy implementation?

Step 6 (10 min). Network relationships and connections: Draw a circle around all the roles on the poster that are related. Then draw solid arrows between roles with a tight or significant connection. Each arrow goes in one direction, originating with the end that has the most influence. Label the actual connection, such as "boss and subordinate." Finally, draw a dotted line if there is a slight connection, such as "lunch mates." Do this step as a group. You can color-code the arrows and use your own definitions of the connections.

Step 7 (5 min). Identify the top influencers or blockers: Put a green star on the sticky notes for the top three to five influencers for the project. Put a red X in a circle on the top three to five blockers. Do this step as a group.

Instructions taken from the ATD Forum Lab and LUMA Institute (2012).

Stakeholder Mapping FAQs
The following are some frequently asked questions about stakeholder mapping:
• Do you have to use sticky notes for stakeholder mapping? No, you don’t—but there
are several advantages when you use sticky notes. As the participants identify their stakeholders,
groups emerge. For example, collectively there may be six or eight roles within one area, such
as talent management or HR; when individual roles are posted, they may not be connected.
However, during the discussion and the drawing of relationships, the roles may be regrouped,
which is much easier if the icons are on sticky notes. An alternative approach is to draw the
icons with markers on a whiteboard. A benefit of this method is that you can erase the icons
easily, which is convenient if you want to indicate stakeholder influence by icon size.
• What is the purpose of the icons, speech bubbles, and connecting arrows? Why
not just list their names? The human-centered design model focuses on people. Framing
a problem, issue, or challenge in terms of the people within the context is key; therefore,
knowing the targeted, specific role is important. You can further enhance this understanding
of each stakeholder by identifying their perspectives and who they are related to within the
context of the issue.
• Do you need to interview the people in the organization prior to generating a
stakeholder map? Interviewing stakeholders before creating the first draft of a stakeholder
map is helpful, but not mandatory.
• Can a team create the stakeholder map in a virtual environment? Yes, creating the
stakeholder map in a virtual environment is possible, although the software used will determine
how easy it is to carry out the project virtually.
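
If your team does build the map with software rather than on a poster, even a lightweight data structure can hold the map's core elements: roles, speech bubbles, relationships, and influence. The following Python sketch is purely illustrative; the roles, quotes, and relationships are hypothetical examples rather than a prescribed format.

```python
# Illustrative sketch of a digital stakeholder map.
# All roles, quotes, and relationships below are hypothetical examples.

stakeholders = {
    "CFO": {"speech_bubble": "How does this relate to profit?", "influence": "high", "flag": "blocker"},
    "CLO": {"speech_bubble": "Will this align learning with strategy?", "influence": "high", "flag": "influencer"},
    "ISD Business Partner": {"speech_bubble": "I can help scope the design.", "influence": "medium", "flag": None},
    "Frontline Manager": {"speech_bubble": "We do not have time for this training!", "influence": "medium", "flag": "blocker"},
}

# Directed relationships: (from, to, label), where "from" holds more influence.
relationships = [
    ("CFO", "CLO", "approves budget for"),
    ("CLO", "ISD Business Partner", "boss and subordinate"),
    ("Frontline Manager", "ISD Business Partner", "provides subject matter experts to"),
]

# Render a simple text version of the map.
for role, info in stakeholders.items():
    flag = info["flag"] or "neutral"
    print(f"{role} [{info['influence']} influence, {flag}]: {info['speech_bubble']}")
for source, target, label in relationships:
    print(f"{source} --{label}--> {target}")
```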

Stakeholder Analysis: The List and Grid Method


Stakeholder analysis is the method by which you identify and categorize your stakeholders, and it is the first
step in stakeholder management. Stakeholder management is the process of engaging and managing productive
relationships with stakeholders. You can’t accomplish that without knowing who your stakeholders are.
Here are the basic steps to conducting stakeholder analysis:
• Identify a product (or service, program, process, or initiative) on which to focus.
• Think broadly about and brainstorm a wide-ranging list of stakeholders.
• Use a 2 x 2 grid to prioritize your list of stakeholders according to the degree of power and
interest you think they have in your area of focus.
• Determine how your stakeholders feel about your project.
• Color-code the names of stakeholders on the 2 x 2 priority grid according to the degree of
backing you can expect from them.
The MindTools website makes resources available to assist you in conducting a stakeholder analysis.



Activity: Using the List and Grid Method for Stakeholder Analysis
Here’s your opportunity to give stakeholder analysis a try. This activity walks you through the process.

Step 1: Identify Your Stakeholders


Brainstorm all stakeholders by thinking broadly about anyone and everyone who might be affected by
the experience or have an interest in the product (or service, program, process, or initiative) that you are
designing or promoting. Capture them in a list. These categories of individuals and organizations that
might have a stake in your work include the following examples:
• your boss
• senior executives
• your co-workers
• your team
• customers
• your family
• prospective customers
• shareholders
• alliance partners
• suppliers
• lenders
• analysts
• future recruits
• key contributors
• legal
• government
• trades associations
• the press
• interest groups
• the public
• the community
• key advisors.
While organizations may be stakeholders, you can’t conduct business with an organization. You
can only conduct business with the people within an organization. So, be sure to specify the right indi-
viduals, or at least the right roles, with whom you will need to interact, so you include and consider all
of the right players.

Step 2: Prioritize Your Stakeholders


The stakeholders on your list do not all carry the same weight when it comes to having an impact on your
product (or service, program, process, or initiative). You can find a template of a power-interest matrix
for stakeholder prioritization (Figure 16-1) on the MindTools website, which features both an interactive
screen app and a downloadable template.
Where you place a stakeholder on the grid determines the appropriate action for managing that
stakeholder:
• High-power, highly interested people (manage closely)—you must fully engage these
people, and make the greatest efforts to satisfy them.
• High-power, less-interested people (keep satisfied)—put enough work in with these
people to keep them satisfied but not so much that they become bored with your message.
• Low-power, highly interested people (keep informed)—adequately inform these
people, and talk to them to ensure that no major issues are arising. People in this category can
often be very helpful with the detail of your project.
• Low-power, less-interested people (monitor)—again, monitor these people but don’t
bore them with excessive communication.

Figure 16-1. Power-Interest Matrix for Stakeholder Prioritization

A 2 x 2 grid with interest (low to high) on the horizontal axis and power (low to high) on the vertical axis, creating four quadrants: Manage Closely (high power, high interest), Keep Satisfied (high power, low interest), Keep Informed (low power, high interest), and Monitor with minimum effort (low power, low interest).
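
To make the quadrant logic concrete, here is a minimal Python sketch that sorts a brainstormed list of stakeholders into the four quadrants and attaches the recommended management approach. The stakeholder names and ratings are hypothetical; any rating method (scores, surveys, or team judgment) could feed the same logic.

```python
# Hypothetical stakeholders rated for power and interest (True = high, False = low).
stakeholders = [
    {"name": "Senior VP, Operations", "high_power": True, "high_interest": True},
    {"name": "CFO", "high_power": True, "high_interest": False},
    {"name": "Frontline Supervisor", "high_power": False, "high_interest": True},
    {"name": "Peer Department Lead", "high_power": False, "high_interest": False},
]

def management_approach(high_power: bool, high_interest: bool) -> str:
    """Return the recommended action for a stakeholder's power-interest quadrant."""
    if high_power and high_interest:
        return "Manage closely"
    if high_power:
        return "Keep satisfied"
    if high_interest:
        return "Keep informed"
    return "Monitor (minimum effort)"

for s in stakeholders:
    print(f"{s['name']}: {management_approach(s['high_power'], s['high_interest'])}")
```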

Step 3: Understand Your Key Stakeholders


Once you have identified and prioritized your key stakeholders, the third step in stakeholder analysis is
to determine how those stakeholders feel about your project. MindTools suggests that you ask a series of
questions to arrive at that understanding. While you may think you know their answers, directly asking
stakeholders these questions may also benefit you. This approach can accomplish two things: It is more
likely to yield straightforward answers to your questions, and the process also may enable you to build
relationships with at least some of your stakeholders.
Think about asking your stakeholders questions like:
• What financial or emotional interest do they have in the outcome of your work?
• Is their financial or emotional interest in the outcome of your work positive or negative?
• What information do they want from you, and what’s the best way of communicating with them?
• What is their current opinion of your work? Is it based on factual information?
Stop and think for yourself about responses to questions like:
• What motivates your stakeholders most of all?
• Who influences their opinions generally, and who influences their opinion of you?
• Who else might be influenced by their opinions?
• Do some of these influencers or influenced become stakeholders in their own right?
• If they aren’t likely to be positive, what will win them over to support your project?
• If you don’t think you’ll be able to win them over, how will you manage their opposition?
Consider color-coding the degree of backing you can expect from the stakeholders for your initiative,
noting the names of supporters and advocates in green on the priority grid, the names of neutral parties
in yellow, and the names of blockers and critics in red. This visual representation of your stakeholders,
and their priority stake in your initiative, will help you see the status of your stakeholders at a glance, and
enable you to pay the closest attention to any potentially challenging stakeholders.

Managing Your Stakeholders


With stakeholder identification complete, it’s time to shift your attention to the larger process of stakehold-
er management. While I won’t address this topic in depth, I will underscore the importance of thinking
about how you engage with your stakeholders and capturing that in a defined plan. Whether you identify
your stakeholders through stakeholder mapping or stakeholder analysis, you end up in approximately the
same place. You have a clear sense of who your stakeholders are, what their key interests and issues are
as they relate to your initiative, how much influence they wield, and the degree to which they are likely to
advocate for your work.
Creating a communication plan is one way to organize the information you have gathered from and
about your stakeholders. It allows you to see important elements in one place and keep them top of mind.
These elements include:
• who the stakeholder is
• how closely you need to manage communications
• the key interests and issues the stakeholder has in the initiative
• the current support you have and desired degree of support you would like from the stakeholder
• what you want from the stakeholder
• the best ways to communicate with and manage the stakeholder.
A template for a stakeholder communications worksheet (Figure 16-2) is available for download
from the MindTools website. There you also will find guidance about preparing a stakeholder manage-
ment plan.

Figure 16-2. Stakeholder Communications Worksheet

The worksheet contains one row per stakeholder, with the following columns:
• Stakeholder Name
• Communications Approach (from the power-interest matrix: manage closely; keep satisfied; keep informed; monitor)
• Key Interests and Issues
• Current Status (advocate; supporter; neutral; critic; blocker)
• Desired Support (high; medium; low)
• Desired Project Roles (if any)
• Actions Desired (if any)
• Messages Needed
• Action and Communication

For more information about stakeholder communications, visit MindTools.com/rs/StakeholderComms.
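
One lightweight way to keep such a worksheet current is to store each row as a structured record that the team updates as the project evolves. The sketch below is illustrative only; the field names simply mirror the worksheet columns, and the example stakeholder is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class StakeholderCommsRow:
    """One row of a stakeholder communications worksheet."""
    name: str
    approach: str          # manage closely / keep satisfied / keep informed / monitor
    key_interests: str
    current_status: str    # advocate / supporter / neutral / critic / blocker
    desired_support: str   # high / medium / low
    desired_project_roles: str = ""
    actions_desired: str = ""
    messages_needed: str = ""
    action_and_communication: str = ""

# Hypothetical example row.
row = StakeholderCommsRow(
    name="Dean of Students",
    approach="manage closely",
    key_interests="Wants tools that reduce supervisors' administrative burden",
    current_status="neutral",
    desired_support="high",
    messages_needed="Show time savings and alignment with the university mission",
)
print(row)
```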

The ultimate purpose of the communication plan and goal of this exercise is to inform your stake-
holder management strategy. Thoughtful development of the communication plan enables you to look
at your initiative through the eyes of your stakeholders, clarify what you want from each one, identify
the messages you need to convey to communicate that, and honor the stakeholder’s preferred method
and frequency of communication. Deliberate execution of the communication plan should enable you
to maintain the supporters you already have, and win over (or at least neutralize) the critics and blockers.
Review the plan regularly to ensure it remains current. Then step back to appreciate the payoff of the
time and effort you put into identifying and managing your stakeholders: a successfully launched product
(or service, program, process, or initiative).

Collaborating With Stakeholders


You’ve used stakeholder mapping or stakeholder analysis to identify your stakeholders. Perhaps you’ve
developed a communication plan to manage your stakeholders, and keep the right individuals apprised
of the status of your project in the right ways. You’re convinced of the value of involving at least some
stakeholders in your work in a more active way. Now what?

A Tale of Developing World-Class Staff


A trio of leaders from the HR department of a university—the directors of compensation, employee
relations, and talent development—were on a mission to build out tools to support the organization’s
employees in managing performance. The trio saw performance management as the strategic alignment
of an employee’s efforts and performance with the university’s mission, vision, and goals. They sensed a
gap between the way they hoped supervisors understood and engaged in performance management, and
the way supervisors actually engaged in and executed performance management. Before determining
the kinds of tools to create, and getting to work designing them, they decided to check in and test their
understanding of performance management with their constituents. They knew that gathering such infor-
mation would provide useful data and guidance for the tool development process. But to whom should
they turn to solicit such input?
The trio engaged in a stakeholder mapping exercise to answer that question. They asked themselves
about the stakeholders needed to execute a broad performance management strategy. They identified
people who would care about any tools, resources, or processes put in place to support performance
management, and people who would care about the results. These included the new supervisor, the
seasoned supervisor, the employee, the faculty member, the vice president, the dean, and HR, repre-
sented by the directors of compensation, employee relations, and talent development. They labeled
each with the influence and impact they would have on the initiative. They recognized the relationships
and connections among the stakeholders, and noted the top influencers and blockers.



The university is a place that thrives on relationships. Its people care deeply about the institution and
want to have a say in how it functions. Honoring that, the trio decided to hold a series of focus groups to
collect feedback from employees. They worked with the results of their stakeholder mapping exercise and
compiled lists of individuals to invite to be part of the process.
“We would like to get your perspective on the current state of performance management at our
university—what’s working, what’s not, and what has potential,” they said in the invitation email.
As they suspected, there was much interest in the topic. During the next few months, the trio conducted
a series of five focus groups, in which 75 individuals participated. While the HR leaders learned a lot
about the types of tools and information people wanted, one thing became painfully clear: The trio’s
definition of performance management was not the same as the participants’ definitions. The trio under-
stood performance management to be the strategic alignment of an employee’s efforts and performance
with the university’s mission, vision, and goals. The focus group participants, on the other hand, under-
stood performance management to mean predominantly one thing: annual performance reviews. These
supervisors wanted the right form, the right process, and the right timing, but that was the extent of their
interpretation of performance management.
The trio came away from the focus group experience with a few realizations:
• They had a lot of work in front of them to broaden people's thinking about performance
management.
• They had a lot of work to do, period, to assemble the tools their focus group participants
indicated they needed or wanted. They also had limited time—and no budget to pay an external
vendor—to pull those tools together.
The HR leaders knew that if they were going to make timely headway on assembling performance
management tools, they were going to need help. Who better to engage in this process than the employees
who would use the tools to get the job done? In a flash of brilliance, the trio decided to form a project team
of interested parties composed of stakeholders they had already identified. They would task this project
team with researching best practices and identifying resources (both within and outside the university)
that supported staff development. The trio then would take the information from the focus groups, pair it
with their own insight and experience, and build out the performance management tools the campus so
desperately needed.
It was time to get into action. But what exactly did that look like? How should they go about creating
such a team? They decided to put some basic project management principles into practice, creating a
project charter that defined the background, problem to solve, scope, deliverables, and milestones. Then
they found a date to hold an inaugural meeting of the project team and put it on the calendar, so prospec-
tive team members would understand what their first commitment would be, should they be selected to
work on the project. They scheduled two informational meetings during which they would describe the
project, the concept of the project team, the anticipated time commitment, and the application process.

Finally, they invited several hundred key stakeholders to spread the word, attend one of the meetings, and
consider participating on the project team.
Forty individuals accepted the invitation to one of two information sessions. After giving the
presentation about the project, the trio sent everyone a link to the application to participate in the
project group. In addition to asking for general work-related demographic information (including name
and contact information for the applicant’s supervisor, as well as the applicant’s own years of supervi-
sory experience and number of direct reports), the project team application requested three important
pieces of information:
• Confirmation that the individual could commit 10 hours per month to the project. (An
applicant could not go further in the application process unless they agreed to the statement.)
• What about this project interests you?
• How can your involvement on this project enhance the team and project deliverables?
Eighteen employees completed and submitted the application. The trio reviewed these applications
and selected 10 (six supervisors, four nonsupervisors) for the team. They contacted, notified, and congrat-
ulated each one on their selection for the team. They then confirmed each individual’s availability to
attend the inaugural team meeting.
The newly assembled project team was a beautiful sight to behold. Each member not only was enthu-
siastic and excited about being on the team, but also was mostly unfamiliar to the HR leaders and their
fellow teammates. There was no chance anyone could have put together that precise team simply by
thinking of and inviting potential participants off the top of their head. The trio felt proud of the outcome
of the recruitment process.
To get the team off the ground successfully, the HR leaders guided the group through a process to
design a team charter. The document they created addressed the following categories:
• Team members (names, contact information)
• Context:
° What problem is being addressed?
° What is the present state and the desired future state?
° What result or delivery is expected?
° Why is this important?
• Mission and objectives (what the team has to achieve):
° Mission of the team
° Objectives (specific, measurable, realistic, time-bound)
° Natural deadlines
• Roles:
° Are there certain people who should be considered core team members, attending nearly
every meeting or function and doing the lion's share of the work?
° Are there certain people who should be considered staff to the team?
° Should certain people be asked to serve as “ad hoc” team members, available to be on call at
a specific time?
° Will the team have one or more chairpersons? If more than one person is chair, will one be
considered lead or first point of contact?
° Will the team have specific process facilitators? Will they be core team members, facilitating
often, or simply called in when needed? What specific responsibilities do they have?
• Authority and empowerment (what team members can and can’t do to achieve the
mission):
° How much time should team members allocate to the team mission, and what priority do
team activities have relative to other ongoing activities?
° How should team members resolve any conflicts between their day jobs and the team
mission?
° What budget is available, in terms of time and money?
° Can the team recruit new team members?
° What can the team do, what can it not do, and what does it need prior approval to do?
• Resources and support available (resources available to the team to accomplish its goals;
training and coaching support available to the team to help it do its job)
• Operations (how the team will operate on a day-to-day basis):
° Confidentiality and communication concerns:
- What is the agreement related to confidentiality and communication?
- What reports will be submitted when and to whom?
° Meetings:
- Will regular meetings be held? If so, how often, where, and when?
- Who will prepare the agenda for meetings?
- Will the agenda be pre-published and materials sent out beforehand to team members?
- Whose participation is required at meetings? What are the group's expectations regarding attendance and participation?
° Team decision method:
- What will be the primary mode of making decisions for the team?
- What will be the backup method(s) of making decisions if the group fails to come to agreement using the primary decision-making mode?
- Who will make decisions when the group is deadlocked?
- How will you manage conflict in the team (task, relationship, process)?
- Who will ensure there is appropriate conflict?
- How will you check agreement on process and task decisions?

By the time the team worked through the charter, they had a purpose (of their own definition)
for their existence, a game plan for how they would operate, and a name they created for themselves
(the Developing World-Class Staff project team), and were ready to launch and get to work! The final
step at this stage was to reach out to another group of key stakeholders—the supervisors of the team
members. The HR leaders thanked the supervisors for supporting their team members’ participation
on the project team, updated the supervisors on the status of the team, provided them with relevant
information (including the project charter, anticipated milestones and deadlines, and the general
project timeline), and invited them to ask questions at any time (and to continue to support their team
members in this endeavor).
Over the course of the next year the project team was totally self-directed, owned the project, and
contributed much more time to their work than the HR leaders ever could have anticipated. The trio had
to step in a few times to make some course corrections and help the team get back on track, but for the
most part the team adhered to the project charter and did its work in alignment with that.
The result? Exactly one year after being chartered, the project team presented its final report and
recommendations to the entire HR leadership team. The 10 members of the Developing World-Class
Staff team documented the problems they explored, the work they conducted, and their ideas for how to
address the challenges they identified. For its part, the trio took those recommendations, subdivided them
into actions that were quick wins (0–6 months), medium-range wins (6–12 months), and long-range wins
(greater than 12 months), and used the plan to guide its work. The mission of the project team officially
was accomplished.
But the story didn’t end there. Six months later, the TD leader, in partnership with the other two
members of the trio, sought to assemble an advisory team to help guide the work of that particular
HR function. The first place the leader turned to recruit prospective team members was the successful,
recently disbanded project team.
Half of those team members answered the call to serve on the Transformation Team. They were
joined by an equal number of colleagues from across the campus, all with a vested interest in talent
development. Because this team is advisory in nature, it operates differently than the project team
did. Nevertheless, the members of the transformation team self-selected to be part of smaller project
teams (informed by the original project team’s recommendations). It is on those smaller teams that
members have more autonomy and drive their projects forward, reporting out to the larger body at
its quarterly meetings. The investment made in the Developing World-Class Staff team continues to
pay dividends.
Why would the trio have invested so much time and effort into recruiting for, selecting, and guid-
ing the work of these project teams? Wouldn’t it have been a lot easier to find a vendor to provide
tools and resources on performance management, make the information available online, and be
done with it? The trio's answer would be yes . . . followed by an even bigger no. The value-added
portion of this process came from having a team of influential colleagues who stepped up to be
change advocates for the work. Time and again they have gone above and beyond to help not only
implement their own recommendations, but also drive other new initiatives. Even today, three of
the past project team members are serving on a committee (outside HR) focused on identifying and
sponsoring growth and engagement initiatives for staff. They continue to contribute to the important
work of talent development.
It did not escape the trio’s attention that their efforts, and the work of the project team, spanned
at least an 18-month period. Such a timeline is not uncommon in the world of academia, which is
entrenched in tradition dating back centuries, and often is slow to change. In a different industry or
organization, that sort of timeline would never be acceptable. The point of the story is not to advocate
for a slow change process, but to demonstrate the value of engaging and collaborating with stakeholders
to get work done.

Summary
Lives are busy, filled with unrelenting demands on our time and an unending list of projects and
initiatives on our plates. Given that reality, it can be difficult to conceive of committing to something
that you know will take even more of your time and attention than you have to give. Stakeholder
collaboration could be seen as a burden that takes more time than it’s worth, but nothing could be
further from the truth.
Your stakeholders can help shape your initiatives, provide their support, and partner with you to
turn out a better outcome as a result of their input. Key stakeholders who have high interest in your
work, and wield much influence, may hold the key to assisting you in securing necessary resources
(such as time, money, and people) for your projects. Identifying stakeholders, whether through stake-
holder mapping or stakeholder analysis, provides a number of benefits. It enables you to gain buy-in
for your products or services, and reduces the need for rework. It also creates champions who can
help positively spread word about your initiative, and potentially expands your bandwidth if your
stakeholders are so engaged—like the Developing World-Class Staff team was—that they wish to
partner with you in your work. We look at issues from different vantage points than stakeholders,
so securing their input helps paint a clearer picture of their needs, and the best ways to satisfy and
serve them.
When you do not consider all stakeholders, you run the risk of unwittingly creating obstacles to
your programs and initiatives. Stakeholder groups that feel left out may become impediments to the
process—speed bumps that slow down or attempt to derail your work because they lack buy-in and have
no voice to address their concerns, objections, and questions. In the end, taking it slower on the front
end might actually benefit you on the back end (in terms of time, buy-in, and collaboration).

Key Takeaways
• Are you designing a new learning program? A change initiative? A process? Are you building out a set of tools and resources for use by your client groups? Before you get too far down the road, identify your stakeholders. Conduct a stakeholder mapping or stakeholder analysis process.
• Name your stakeholders and their roles, relationships, levels of influence, and degrees of interest in your work. Figure out what you need from them (support, money, time), and what they need from you (type, frequency, style of messaging).
• Create a plan for communicating with your stakeholders about your project (or process, program, or design).
• Determine how you can best engage and collaborate with your stakeholders and help them help you get work done. Then manage your plan, follow through with it, and update it as circumstances change. The result? Your stakeholders are likely to help remove obstacles (or not become them), offer their support, provide needed resources, wield positive influence, and become trusted partners and collaborators, working alongside you to move your work forward.

Questions for Reflection and Further Action


1. How might you introduce the concept of stakeholder identification and stakeholder collaboration to
your team or organization?

2. How might a deeper understanding of stakeholders help you in your role as a leader in talent
development charged with influencing direction and change?

3. How might stakeholder identification, stakeholder collaboration, or stakeholder management be useful in your work to build, update, or change some aspect of your learning strategy?



17
The Impact of Coalitions
Bryan McElroy, Rachel Hutchinson, Emily Isensee, and
David McGrath With MJ Hall

“Find the appropriate balance of competing claims by various groups of stakeholders. All claims deserve
consideration, but some claims are more important than others.” —Warren Bennis

Have you ever spent time with your team developing a great instructional design project for a client only
to have it rejected or picked apart by colleagues or stakeholders, resulting in major rework for the design
team? Your story might be similar to this one:

Joe’s team spent three months developing a new e-learning project and was extremely pleased with the results.
They worked with subject matter experts to come up with what they thought was a winning solution and met the
cost and time constraints. They used vendors to make sure the latest research was included. The only gate left before
deploying the first iteration was getting final approval from the stakeholders.
When Joe received the stakeholders’ approval email, the sender also copied someone who had not been
identified as a stakeholder with approval power. Very unexpectedly, the project went from almost deployed to back
to the design table because the new stakeholder requested major changes. The team had to completely regroup, go
through the storyboarding phase again, get new approvals, and develop a revised training option, this time with
proper approval from all major stakeholders.

This was a tough lesson for Joe’s team. It delayed project delivery and created additional work for
them. What was their big lesson? Project teams need to ask the correct questions and identify all major
stakeholders, especially those with approval power, very early in the project life cycle. Identifying stake-
holders and getting their buy-in is more than a nice-to-have option. Whether it is for a small redesign or
a major new offering, conducting stakeholder analysis and then managing stakeholder relationships is
imperative for success.

Who Are Stakeholders?


Stakeholders are any individual, group, or organization actively involved in the project or whose interests
may be (or are perceived to be) positively or negatively affected by the performance or completion of
the project. Stakeholders may also exert influence over the project, its deliverables, and the project team
members. In some organizations, employees, customers, and clients are included under the stakeholder
umbrella; in others, they are separated.
To design initiatives that work effectively and efficiently, project management teams or learning and
development management teams need to identify both internal and external stakeholders and determine
the requirements and expectations of all parties involved. Some refer to managing this relationship as
stakeholder management, although we try to use the term stakeholder alignment to ensure that our focus stays
on the partnership between us and various stakeholders. Stakeholder alignment translates to defining and
defending scope, as well as identifying, analyzing, and planning responses to manage communications on
progress, success measures, and risks.
Stakeholders have varying levels of responsibility and authority, and these can change over the course
of the project life cycle or as strategies change in your organization. Identifying stakeholders and under-
standing their relative degree of influence on a project or on your department is critical. It is most helpful
to focus on understanding their needs and aligning stakeholders and stakeholder groups through effective
communication.

Identifying Stakeholders
Generally, when thinking about stakeholders and customers, one considers a quick list—literally some-
thing that can be done easily in five minutes. A brainstormed list can be a great starting point. However, to
truly “put people first,” those potential stakeholders need more emphasis than a title on a list. Not involv-
ing or identifying all of them early in the process can have seriously negative results. There are some things
you want to consider when thinking about whom you need to include in your group of stakeholders.
To think broadly about this, consider asking these questions:
• Who is benefiting from this project? If you’re building a project to benefit a specific
group, it’s important to consider what they want. This could be accomplished by holding focus
groups with select members, sending out surveys, or meeting directly with the group. Start with
the end in mind by talking about what the perfect outcome would be for them. These could
become some of your goals to accomplish later.
• Whose workload will change because of this project? Often this is a different group
from who will benefit. Maybe this project will force this group of stakeholders to take on new
work while maintaining their current responsibilities or duties, or maybe it will make one
of their existing operations longer or more involved. For example, Joe’s company recently
went through a major change involving tracking the movement of inventory in and out of
warehouses using barcodes. When a change like this is enacted, the up-front time needed
to develop a complete list of stakeholders can be tremendous. But the benefits down the
road can be huge. In this scenario, it was important to not think about only the people who
would benefit from having a more controlled inventory, but also the people who would have
to barcode every item in the warehouse. Both groups are stakeholders and contribute to the
project’s success.
• Have you included everyone? Think broader than the obvious and include everyone
with any link or connection. When a person who needs to be involved or informed has been
identified, ask them: “Do you know of anyone else we need to include as a stakeholder?” Use
everyone, including vendors, as a resource to reach every corner of the organization.

Why Are Stakeholders Important?


Learning professionals are essentially business learning advisors (BLAs) who enable organizations to
enhance their performance results. As such, they play a vital role in designing solutions with every
internal employee. This BLA role is different from developing and designing curriculum and requires
unique skills. However, even without specific training, learning professionals bring some of these skills
to the team, especially those related to creating environments for people to learn to work together
more collaboratively. These skills, frequently referred to as soft skills, are developed through teaching,
facilitating groups, and designing experiences into content. Keeping people as their priority during the
business-solution-generation process is a key way learning professionals add value.
Getting stakeholder buy-in early is critical to the learning team, but it requires using these soft
skills. As with the example of the barcoding project, engaging with stakeholders early can mean
better communication over the length of the project, more responsiveness to the needs that arise,
and even additional resources if or when they are needed. Stakeholders do this most when they are
informed and involved.
It is also important to think about the individuals who pull the strings in the areas affected by the
project. Although it should always be the goal to inspire individual contributors who will have to carry out
the work, if their managers and others in the chain above them don’t see the value, they could shut down
the project outright. Do not forget or ignore them. Sometimes the individual contributors may not buy in
to the change, but if their leader supports the project, it will likely continue. Generally, the CEO won't
need to be involved or informed for most projects. If the project goes to the CEO level for status checks,
everyone below them needs to be in the communications loop. A rule of thumb is that it is almost always
better to inform too many people than too few. The cost of a stakeholder disagreeing with the direction
of the project later in the process can be catastrophic, as Joe’s scenario showed.

User Story 1. Organizing Your Stakeholders


By Bryan McElroy

While there are different methods for identifying, understanding, tracking, and managing stakeholders,
it’s most important that it’s actually done and continually kept a priority. The best methods for this are in
a project manager’s toolkit.
First, identify stakeholders with a tool like stakeholder mapping. Once this is done, identify who is
responsible for what. For example, in Joe’s case, they failed to identify a stakeholder with go-or-no-go
approval authority, and this cost the team time and resources. One way to prevent this is to use the RACI
matrix, which designates people according to who is responsible, accountable, consulted, or needs to be
informed. This is probably the most common tool for categorizing your stakeholders.

RACI Matrix
While the RACI matrix is generally used to assign project management roles and responsibilities, it can
also easily serve as a framework for identifying stakeholders associated with any project. Let’s look a
little closer:
• Responsible. These are the stakeholders tasked with doing the work. Although there can be
more than one person who is responsible, too many could make the task more confusing.
• Accountable. This person signs off on and approves the task or work product, and can
also be thought of as the project owner. For this role there should be only one person. The
stakeholder in the accountable column may need to round up other necessary approvals and
give the final go or no-go, but to eliminate confusion only one person should be accountable for
the task.
• Consulted. These stakeholders need to have input and possibly preliminary approval before
the accountable person signs off on the project. They might be experts, people who have done
this kind of work before, or even those who must control the financial aspects of the task.
Active participants in this group usually need to be kept in the loop.
• Informed. While they might not contribute to the task, the informed group needs to be kept
aware of the decisions being made and stay updated on the task timetable. Sometimes they’ll
need to communicate with others, such as those in marketing or advertising, or they could be
working on a task that is dependent on the completion of the current task.

The matrix itself includes the roles or names in the column header and the work task activities along
the rows (Figure 17-1).

Figure 17-1. Example RACI Matrix

         Person 1   Person 2   Person 3   Person 4
Task 1       C          A          R          I
Task 2       R          A          C          I
Task 3       R          A          I          C

Some things to consider when using the RACI matrix:


• Is there at least one R for every task? Someone must be aware that they are leading the
charge of getting the work done.
• Are there too many Rs? This can make for too many cooks in the kitchen. If you are
unable to split up the tasks to improve clarity, you must make sure the dividing lines of the work
are clear to all the Rs on the task.
• Is there an A for each task associated with the project? This part is critical. Just like
a task won’t get done if there isn’t an R for it, the Rs won’t know if they are allowed to begin
work or if what they did received final approval without an A assigned to the task.
• Are those listed as Cs slowing down the process? You might want to move one or
more individuals from the C column to the I column. You can always put them back to the C
column if the success of the task depends on it, but any changes to a stakeholder’s role must be
clearly communicated.
• Does everyone agree with and understand their role? Not everyone will be happy
with their role 100 percent of the time, but the real question is, will they act in their role? Make
sure everyone signs off on their assigned role, and don’t be afraid to address any concerns when
necessary. Putting together a RACI matrix is a task unto itself, but in the end the benefits are
worth it. (A short sketch of how the R and A checks might be automated follows this list.)
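
When the matrix lives in a spreadsheet or a similar structure, the structural checks above (at least one R, and exactly one A, per task) can be automated. This Python sketch is illustrative only; the matrix mirrors Figure 17-1, and the roles are hypothetical.

```python
# Hypothetical RACI matrix mirroring Figure 17-1: task -> {person: role}.
raci = {
    "Task 1": {"Person 1": "C", "Person 2": "A", "Person 3": "R", "Person 4": "I"},
    "Task 2": {"Person 1": "R", "Person 2": "A", "Person 3": "C", "Person 4": "I"},
    "Task 3": {"Person 1": "R", "Person 2": "A", "Person 3": "I", "Person 4": "C"},
}

def check_raci(matrix: dict) -> list:
    """Return warnings for tasks with no R, or without exactly one A."""
    warnings = []
    for task, assignments in matrix.items():
        roles = list(assignments.values())
        if roles.count("R") == 0:
            warnings.append(f"{task}: no one is Responsible for doing the work")
        if roles.count("A") != 1:
            warnings.append(f"{task}: there must be exactly one Accountable person")
    return warnings

for message in check_raci(raci) or ["Every task has an R and exactly one A."]:
    print(message)
```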
Two other ways to understand stakeholders and eliminate the situation Joe faced are the positive-
negative matrix and the stakeholder profile. In practice, many aspects related to understanding and
working with stakeholders can be combined into a database.

Positive-Negative Matrix
A helpful way to understand stakeholder roles in moving a project forward is to use a stakeholder matrix
of categories for those who might affect your project positively and those who might have a negative
influence. For example, consider the warehouse managers for a network of 150 stores: if one of them does
not like the proposed changes, they would be considered a negative influence and have a negative impact.
However, it’s unlikely that they could shut down the project. So, you would need to keep them informed
about the project, but it would be unwise to invest any time trying to sway their opinion. Alternatively, if
the CEO has any opinion about the project (positive or negative), they would automatically be placed in
the category of having a lot of power over your initiative. If they have a negative opinion of what you are
doing, it would be a good idea to spend time trying to win them over. If you are not successful, the project
will most likely be dead in the water.

Stakeholder Profile
Profiling is an essential aspect of managing relationships with customers or stakeholders because it
allows you to understand and document their characteristics, attitudes, and behaviors, especially as
they relate to talent projects. Using this analytical tool can start with identifying a few key elements
for each stakeholder, such as their demographics, expectations, and contributions to the business. New
information can be added as it surfaces to keep the tool dynamic. Additionally, it is most important to
know and document stakeholder influence. What is their influence within the project? What is their
influence within the company? And most important, what is their role outside the project and how does
it influence others?
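
A profile can also be kept as a simple record that the team updates whenever new information surfaces. The sketch below is illustrative; the fields and the example profile are hypothetical and should be adapted to whatever elements your team decides to track.

```python
from dataclasses import dataclass

@dataclass
class StakeholderProfile:
    """Key elements of a stakeholder profile, updated as new information surfaces."""
    name: str
    role: str
    expectations: str
    contribution_to_business: str
    influence_in_project: str    # e.g., low / medium / high
    influence_in_company: str
    roles_outside_project: str

# Hypothetical example profile.
profile = StakeholderProfile(
    name="Regional Warehouse Manager",
    role="Consulted subject matter expert for the barcoding rollout",
    expectations="Minimal disruption to daily receiving operations",
    contribution_to_business="Owns inventory accuracy for 12 stores",
    influence_in_project="medium",
    influence_in_company="high",
    roles_outside_project="Chairs the operations safety council",
)
print(profile)
```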

User Story 2. Aligning Your Stakeholders


By Rachel Hutchinson

The power-interest matrix is an effective tool for ensuring stakeholder alignment (Figure 17-2). The first
step is identifying all stakeholders and placing them into the appropriate quadrant. Once the team thinks
that all stakeholders are included, they can generalize and approach each quadrant with targeted levels of
communication and support. This step breaks large numbers of individual stakeholders into four groups:
high power and high interest, high power but low interest, low power and high interest, and low power
and low interest.
Figure 17-2. The Power-Interest Matrix

A 2 x 2 grid with power on one axis and interest on the other, creating four quadrants: High Power, High Interest; High Power, Low Interest; Low Power, High Interest; and Low Power, Low Interest.

For example, assume you are creating a new leadership onboarding program because your organization is growing quickly and you are promoting people with little experience to leadership positions. The quadrants might end up looking like this:
• High power and high interest:
° human resources heads
° head of talent acquisition
° senior-level directors of areas with high growth.

• High power and low interest:
° heads of departments with slow growth
° heads of departments with a solid succession plan
° heads of business units from areas not affected by the promotion rate.
• Low power and high interest:
° local learning and development team members
° people who want a team lead role
° existing team leads.
• Low power and low interest:
° team members who have no interest in leadership roles
° existing team leads
° other areas of human resources.
Additionally, you can color-code the roles in the matrix. For example, roles filled by managers
who are strong advocates and supporters could be green. Blockers and critics could be coded red.
Neutral managers could be coded yellow. Creating a power-interest matrix and planning your
method of ensuring the right level and style of communication and engagement will reflect on the
professionalism of your department. Communicating with key stakeholders and decision makers
starts long before you begin a project. This needs to be an ongoing journey from the initiation of
your annual or biannual learning strategy.
This matrix is valuable at the overall department level as well as the individual project level.
Knowing the main influencers and decision makers across your organization is critical because
they can affect a wide range of projects and either help or hurt the credibility of your entire depart-
ment. You can enhance your understanding of and ability to communicate with stakeholders by
paying close attention to the language they use when they deliver messages to your organization.
Their word choices can tell you a lot about how they think and prefer to be communicated with.
Pay attention to key statements or phrases. For example, we had an executive who often used the
phrase the way we work. While it might seem like a simple thing, this phrase made his team feel
more comfortable with him, which resulted in getting his buy-in. Another executive always referred
to “customer engagement.” This was a clue to us to demonstrate how talent projects benefited
customer engagement when we published business cases for projects.
Another way to think of this is the Six Thinking Hats model developed by Edward de Bono in
the mid-1980s (Figure 17-3). The de Bono model views thinking from different perspectives, refer-
ring to these views as different colored hats:



Figure 17-3. The Six Thinking Hats Model

• White: Focuses on available data, including trends and gaps
• Red: Shares options for the responses of others based on emotion and a gut reaction
• Black: Identifies fatal flaws and negative outcomes
• Yellow: Points out the positive benefits
• Green: Suggests free-wheeling creative options
• Blue: Serves as the moderator during the hat-thinking process, saying things like "Let's hear some green hat ideas"

While the de Bono model focuses on specific ways to think, the concept can also be used to put
on the hat of each stakeholder for a deeper understanding of their position, and to use that under-
standing to prepare messages. For example, the messaging language might use data if it is aimed at
financial professionals. By considering how a person thinks, their pain points, and their focus, the
word choice and the approach can be more personalized. It does not change the message, but it does
change how the message is delivered. Focusing on phrasing desired outcomes to match the way the
recipient or audience thinks can positively affect how they receive the message. The same presen-
tation or communications can be designed for each of the six mindsets in the model to customize
messaging. It is an easy way to address differences in communication styles.
Stakeholders play a huge role in driving change and influencing others along the way. Quite
often, learning experiences or assets and performance solutions are a component of a systems-level
change in the organization. Success is enhanced by knowing advocates and resisters to a change and
planning an approach that more closely aligns with their preferred communications style. Even if
someone is not hierarchically a decision maker, they may be a powerful organizational influencer.
An example of this would be when implementing a new technology system. A few people are often
early adopters who consistently advocate for changes. If they also have a broad and powerful reach
because of their experience or personality, while not typically branded stakeholders, they need to be
considered in your communications plans. This is another reason for casting a wide net and including
peripheral stakeholders.

User Story 3. Using Advisory Groups to Engage Stakeholders


and Gain Buy-In for Learning Initiatives
By Emily Isensee

Getting stakeholders to buy in early to both contribute to and support a new learning initiative is no easy
task. Remember, this is not their main responsibility—they have their own day jobs. To remedy this, my
team has spent the last two years experimenting with using advisory groups to effectively achieve our
objectives of getting stakeholders to contribute to and support our learning initiatives.
This effort began when we were tasked with creating and rolling out a competency model to a
company that had never used competency models before. Most people we talked to in the organization
were not familiar with competency models for development, so we realized early on that we had our work
cut out for us. We decided the best way to gain early buy-in on this new concept from stakeholders was to
invite a group of employees and leaders to collaborate with us. We called them an advisory group because
they would be the primary project advisors from start to finish. Additionally, this was not the SME group
tasked with helping us write the competencies; rather, their task was to help us hone the competencies
and assist in creating a plan to implement them into the hiring, onboarding, and employee development
processes.
Members were nominated by senior leaders based on being high performers, good at providing feed-
back, known influencers among their peers, and representing global offices. The nomination process
also served as a way to inform senior leaders of the project and ensure alignment and commitment to its
success.
Nominees were sent information explaining the role of the advisory group. The guidelines for the
role included providing checks and balances for the project; effectively contributing to the discussions by
having enough deep knowledge about the role to generate ideas, provide feedback, and advise on imple-
mentation; and championing the initiative to the sales team.
They were asked to attend monthly 60-minute meetings and spend up to an hour a month complet-
ing surveys and reviewing documents. The secondary goal was creating a group of advocates and cham-
pions who would promote the project to their peers and teams. We ended up with 12 advisory group
members, which allowed for a good representation of roles, regions, and experience levels. The smaller
size also ensured time for all members to contribute on conference calls.
Each meeting of the project team and advisory group had a different objective, ranging from
providing feedback on the competency key actions to brainstorming the best way to communicate a
new competency-based interview process to hiring managers. Oftentimes, we did surveys before the
meetings to get a baseline and then used the meetings to ask questions to better understand that feed-
back. We often kicked off our meetings by sharing how we had integrated their feedback from the last
meeting into the project.
By involving this advisory group in the process, we ended up with a competency model that had a
job requirement comprehensiveness rating of 90 percent (the standard rating is 80 percent or above) and
a roll-out plan that included messaging and training that truly addressed the needs of those receiving it.
Additionally, we had a group of project champions who started discussing the competency model project
and its benefits with their leaders and colleagues before the formal rollout even began. This resulted in a
successful rollout with significant early buy-in from employees and leaders at all levels.

As we were doing the rollout, one of the directors on the advisory group asked if his team could
be the pilot because he believed the project would best support the development of his managers and
employees. Listening to him launch it at his leadership team meeting, we were impressed by how articulately he
described (unprompted) the competency model, the value, and why it was important for his team to imple-
ment it. This experience alone showed the benefit of the advisory group process.
Using an advisory group was an experiment that turned out to be beneficial. The following are some
key lessons from the process:
• Recruit the advisory group early to ensure they can be involved with every stage of the project.
• Use senior leaders from all parts of the business or globe to both get senior leader buy-in and
ensure wide representation.
• Clearly communicate the role expectations and time commitment up front and be conscious of
employee time.
• Show how the feedback you are gathering is being implemented in the project.
• Get creative in leveraging the advisory group as change champions in the roll-out process.
Since the success of using this advisory group, our team has repeated the model on other projects.
We continue to successfully gain early stakeholder buy-in and ensure we are truly meeting the needs of
our end users.

User Story 4. Building Powerful Stakeholder Networks
By David McGrath

Building a business operation requires strong relationships internally and externally. As a support depart-
ment within the organization, the value of these relationships and having allies within the business ecosys-
tem should not be underestimated by the talent team. Generally, all talent teams will have regular interactions
with their business partners. Many in talent development might be deceived into thinking that they have
tight relationships, but when support is necessary for a major TD project, the bond often falls short of
what is needed. Relying on your standard cadence of meetings or casual interactions in the cafeteria is
not enough to build the connections that create advocates and strong promoters, especially with various
business challenges like resource allocation, finances, and priorities.
There are many benefits to investing in deeper stakeholder relationships. They can:
• Offer extra support when you are working on a project, such as serving as a subject matter
expert, which can be very time consuming.
• Help sell a project and work as change agents to influence others who are hard to influence.
• Serve as a sounding board for projects and simply as great colleagues with a different
perspective.
The real test, however, is when they proactively contact you for advice on various matters—this is
when the investment truly pays off.

In the sales arena, relationships win over almost everything. You can execute well and deliver a top
product, but when you really need action and support, it’s the personal relationships developed over time
that prevail. There is a common saying, “Business decisions are made on the golf course.” This saying
applies, and relationships are also critical for resolving challenges or when a project needs
extra support. For example, if you are behind in your deliverables or receiving conflicting requests from
business partners, the solution may hinge on the relationship you have with stakeholders who can give you
an easy pass with just a quick phone call.

Building Connections
So how can you develop these stakeholder relationships for project support, advice, and solutions? There
are countless books on sales and account management with a variety of techniques and ideas, and they’re
all rooted in trust, integrity, honesty, reliability, and consistency. However, some key sales practices can also
benefit talent professionals; for example:
• Be selective when forming close bonds with stakeholders. Do not try to be intimate
with everyone; form trusting business relationships with all major stakeholders but target only a
few key associates to develop those deeper, more time-intensive relationships.
• Target close relationships with stakeholders where there is an authentic
connection, thereby leveraging an existing opportunity. This will make them feel valued and, in
some ways, special. You will not hit it off with everyone, so select carefully.
To make these connections happen naturally, get out of the office! Go see people! Get to know
their projects within the organization—but also get to know their personal interests. Pilot programs with
"friendlies" and engage them first. For example, our CLO works more closely with three of the 20 distri-
bution centers in our U.S. operations. When I first noticed this, it seemed funny that he was generally
going to the same three sites. Well, it was not funny; he was being smart and calculating in doing so.
He wasn’t ignoring the other distribution centers—he still supported them; he just prioritized his efforts
and relationships.
You cannot be everywhere and connected with everyone at the same level. Having strategic and
personal relationships with a select number of stakeholders will be critical in building and maintaining a
successful learning function.
Think like a salesperson. No matter your role, you are always selling, whether it's your-
self, the learning function, your team, or your products and services. Building credibility is critical.
Targeting and forming a few deep relationships will win the day or at least be an essential component
to success. Again, do not assume that because you are in assessment meetings or meet regularly with
a leadership team or steering committee that you are in the know and have the total respect of other
team members. It is the side meetings, lunches, and other connecting situations that will be vital
when push comes to shove.

To break down functional walls and build strong, lasting relationships with those in all areas of
the organization, reach out, walk around, and connect outside the walls of the learning team.

One More Tool to Consider


While organizational wiring diagrams of major internal stakeholders are typically used to show how
things work in an organization, it is widely acknowledged that they don't give you the whole picture.
An Organizational Network Analysis (ONA) is a technique you can use to understand the inner
workings of an organization in terms of informational flow, collaboration, and location of expertise.
Researcher, professor, and ONA expert Rob Cross states, “ONA can provide an X-ray into the inner
workings of an organization—a powerful means of making invisible patterns of information flow
and collaboration in strategically important groups visible” (Cross and Thomas 2009).
The ONA involves a series of surveys for gathering information and platforms for analyzing.
Based on the analyses, it uses circles, connecting lines, and color-coding to identify the most prom-
inent or go-to people in the organization, the peripheral people, and the various subgroups. It can
also provide insights on the advantages or disadvantages of the physical space. For example, when
subgroups are physically dispersed, they lose the connecting opportunities provided by serendipitous
meetings and watercooler conversations.
An ONA is an excellent technique to use prior to a major change effort or as part of an organi-
zational effort to increase collaboration and knowledge sharing. However, it also requires expertise
and technology.
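
For readers who want to see the mechanics, the following is a minimal, hypothetical sketch in Python of the kind of analysis an ONA platform automates, using the open-source networkx library. The names, survey responses, and measures shown here are invented for illustration only; real ONA work depends on validated survey instruments and purpose-built tools.

# Hypothetical sketch: spotting "go-to" and peripheral people from survey data.
import networkx as nx

# Each pair means "the first person reported going to the second for information."
survey_responses = [
    ("Ana", "Raj"), ("Ben", "Raj"), ("Cam", "Raj"),
    ("Raj", "Dee"), ("Dee", "Ana"), ("Eli", "Cam"),
]
graph = nx.DiGraph(survey_responses)

# In-degree centrality: how often someone is named as a source of help.
go_to_scores = nx.in_degree_centrality(graph)
# Betweenness centrality: who sits on the paths that connect subgroups.
broker_scores = nx.betweenness_centrality(graph)

for person in graph.nodes:
    print(f"{person:4s} go-to: {go_to_scores[person]:.2f} broker: {broker_scores[person]:.2f}")

In this toy data, the person named most often surfaces as a prominent go-to resource, while someone rarely named on either measure appears peripheral; both are exactly the kinds of people the communication plans discussed earlier need to account for.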

Summary
There are many ways to identify, engage, collaborate with, manage, and align with stakeholders. The
big message is that it must be part of how learning leaders lead and manage. There are a variety of
tools to assist with this effort, including those at the surface level, such as stakeholder analysis and
mapping. Other tools include RACI, the power-interest matrix, and stakeholder profiling. Still other
tools, such as the Organizational Network Analysis and different ways of assessing influence, enable
you to better understand those stakeholders who have the power to sway decisions and thus guide
direction. Thinking like a salesperson or creating an advisory group are personal and homegrown
ways to build constructive relationships for moving ideas forward into execution and implementation.
The relationships you form with stakeholders, whether they last for a project or develop into a
longer-term business partnership, are what will define you and your TD organization. Your deliver-
ables, solutions, consultations, and advice will establish your credibility, but stakeholder relationships
are what position and establish your talent organization as an invaluable resource and function that
has more demand than supply (which is a great problem to have). Your ability to expand, increase
resources, and gain a seat at the highest leadership discussions will become standard, but that is built
one relationship at a time.

Key Takeaways
• Developing key relationships within and across the enterprise requires inquiry, advocacy, and
connection.
• Because there are so many variables associated with a program or process, organizations have many
influencers and decision makers. It is imperative to identify and communicate with each for programs
and processes to be successful.
• It is easy to focus on project design and development and be blindsided by stakeholders.
• Continually use tools and personal creativity to build coalitions that help you better understand how
organizational systems operate and how learning supports and enhances operations and results.

Questions for Reflection and Further Action


1. What is your current understanding of the stakeholders in your organization?

2. How can you be more strategic in knowing your key decision makers, influencers, and other critical
stakeholders?

3. Which tools are you using to identify, engage with, manage, and align with stakeholders to ensure
that projects and change initiatives are supported throughout the organization?

4. How might you communicate with stakeholders in a manner that is more customized for the role
they play in the organization?

5. How do these suggestions apply to working with vendors?

18
Structure and Governance
Kozetta Chapman and
Graham Johnston With MJ Hall

From a success perspective, developing capability for the organization is not just about designing and
implementing an array of learning experiences for employees. It requires looking at the organization as
a system to understand the needs, and from that point of view determining what capabilities are needed
and the best approach for designing and delivering them. Implicit in systems thinking is how the people,
processes, and structures are integrated and interact and work together to produce products and services
for others to accomplish the mission. The people part of the equation includes all employees and part-
ners; the processes include all actions or steps taken to achieve an operation; and the structures generally
imply the division and hierarchy for job functions as well as decision authority and reporting. The business
results delivered as outputs and outcomes, whatever they are, require highly integrated and coordinated
work efforts from everyone in the system. Because of these interdependencies, alignment and integration
require cross-functional optimization among the structures, processes, and people.
Quality management expert W. Edwards Deming understood the interconnectedness of people, processes,
and structures. He taught that while employees work within the system, the role of leadership is to work
on the system. He writes in his book Out of the Crisis (2000), "The aim of leadership should be to improve the
performance of man and machine, to improve quality, to increase output, and simultaneously to bring
pride of workmanship to people. Put in a negative way, the aim of leadership is not merely to find and
record failures of men, but to remove the causes of failure: to help people to do a better job with less effort.”

Thus, the learning leader not only uses their expertise to create value related to performance capabil-
ity at the individual, team, and business levels, but they also work within structures and operating mech-
anisms such as the governance process. Deming further defines this concept in his 14 Points for
Management.

Learning Structures
The learning function and the hierarchic reporting unit it belongs to are affected by the organizational
structure. For example, in some organizations, learning is embedded in the HR department, while in
others learning and HR might both be grouped with other functions in a talent group. In still other orga-
nizations, learning may be an independent stand-alone unit reporting to the CEO.
In most organizations, regardless of where they are located, the training and learning functions
generally operate somewhere along a continuum—at one end, the function is completely centralized as
an organizational unit, while at the other, it is completely decentralized and resides in many different units
that are associated with business lines or geographies. However, because it is a continuum, there are many
options to blend these two extremes, and the resulting structure depends on a variety of things, such as
size, locations, lines of business, industry, and type of work. For example, many organizations have
a centralized function for crosscutting issues such as leadership development and compliance training
and a decentralized function for training in specific areas of domain expertise. This is sometimes called a
federated structure.
According to Training Industry (2012), training administration in a federated structure is
managed by the corporate training organization operating through a company-wide LMS, LCMS,
or learning portal, while the other functions remain decentralized. Another option for learning struc-
tures is the inclusion of a corporate university, which is typically started to allow for more focus on
aligning with the organization’s strategic initiatives. This model can be part of a structure anyplace
along the continuum.
A centralized model has the following advantages:
• consistency of delivery
• standardization in the development process and materials
• transparency for what is available across business lines
• economies of scale to leverage costs, especially when investing in technology and working with
vendors and consultants
• less risk with compliance.
A decentralized model has these advantages:
• greater autonomy
• closer to the employee and therefore is more attuned to their specific needs, operations, and
goals
• can be more easily targeted to specific needs and tailored in content and context to the learner
• need for training can be realized more quickly because of fewer levels in the reporting
structure.
A hybrid model, such as a federated model, has these advantages:
• flexibility, with some functions being centrally organized to provide more content consistencies
across the business units, and others closer to the employees they serve
• the administration of learning can be easily centralized (such as using a company-wide
LMS), while the content development for specialized subjects (such as sales training) can be
decentralized.
In reality, most organizations do not have a clear-cut structure. In a recent informal poll of 28
members of the ATD Forum, 60 percent used a form of the hybrid structure and many did not have a
name for it. Leadership development and compliance training were generally centralized in these hybrids.
When asked why they used the hybrid model, the most prevalent reasons were stronger alignment with
the business, senior leaders’ buy-in, and better use of resources.

Governance
Governance, whether at the corporate level or for learning, exists as a process for people in the organiza-
tion with the decision authority to provide both oversight and advancement of initiatives, such as launch-
ing new products and results. The Baldrige Criteria for Performance Excellence states that governance
is the system of management and controls exercised in the stewardship of the organization. According
to Rita Mehegan Smith (2011), there are five oversight areas: accountability, operational effectiveness,
program service and quality, effective controls, and adherence to enterprise priorities.
Because governance involves decisions and approval by people, it is a process that links stakeholders
from different divisions or business units and provides opportunities to expand the thinking when assessing
initiatives, for example by including perspectives from engineering and finance. Linking groups of people together to solve problems
or advance results can help develop credibility between and among these stakeholders. When stakeholder
synergies are coupled with the same directional focus for solving a challenge or reaching a common goal,
the result will most likely be higher levels of impact. Because governance involves stakeholders, it plays a
major role in the success of both accepting initiatives and launching them. Every learning team needs a
method for thinking about how governance works and which stakeholders need to be engaged in defining
and leading the learning strategy at the highest levels. At the corporate level, governance has a variety
of names, including board of directors, councils, forums, and steering groups, and each of these terms can be
combined with words that suggest levels (such as strategic or executive), places (such as regional or global), or
roles (such as learning or technology).
Governance in learning is particularly useful in decentralized and federated structural models, because
it provides a formal framework for stakeholders to manage decisions about how learning, talent develop-
ment, and performance improvement practices work within the organization. The governance process
also increases transparency between the investment and the practice, which is helpful in the current world
of emphasizing return on investment. It can promote effectiveness in the quality of the results and efficien-
cies, especially fiscal responsibility.
Organizations can provide governance in a variety of ways. The stage-gate process is one option
that uses project management skills—learning projects are divided into a series of stages: analysis,
concept creation, solution approval, communication and launch, project rollout, and project
close. There is a formal meeting at the end of each stage with all governance stakeholders in which the
project status information is shared, problems are resolved, and risk, cost, schedule, and performance are
reviewed and discussed. All communications and decisions are documented and published for tracking
and auditing. Additionally, at each stage, the project must be signed off by the designated management
governance or steering board before the team can move through the gate.
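
To make the record keeping concrete, the following is a minimal sketch in Python of how a team might log gate sign-offs for tracking and auditing. The stage names echo the list above; the data structure, field names, and sign-off rule are hypothetical, not a prescribed tool or template.

# Hypothetical gate log for a stage-gate learning project. Each gate must be
# signed off by the designated governance or steering board before the team
# can move to the next stage.
from dataclasses import dataclass, field
from typing import List, Optional

STAGES = [
    "analysis", "concept creation", "solution approval",
    "communication and launch", "project rollout", "project close",
]

@dataclass
class GateRecord:
    stage: str
    decisions: List[str] = field(default_factory=list)  # documented decisions
    signed_off_by: Optional[str] = None                  # governance sign-off

def next_open_gate(records: List[GateRecord]) -> Optional[str]:
    """Return the first stage that has not yet been signed off."""
    for record in records:
        if record.signed_off_by is None:
            return record.stage
    return None

project_log = [GateRecord(stage) for stage in STAGES]
project_log[0].decisions.append("Needs analysis approved; scope confirmed")
project_log[0].signed_off_by = "Steering board chair"
print(next_open_gate(project_log))  # prints: concept creation
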
Smith (2011) provides a framework for the levels and roles and responsibilities for a more formalized
learning governance board, which lists options to consider for governance bodies based on needs and
objectives (Figure 18-1). As would be expected, the governance design is based on the structure, size,
organization, and so forth.

Building Governance One Step at a Time: A Use Case for Getting Started
Taylor was hired as a senior executive leader by a major transportation company with a federated training
model. It was a structural concept of which she was unaware, and the organization had no governance
model in place to help her. In spite of her situation, Taylor was excited about the possibilities that lay
ahead. She found that the company had thousands of employees in a variety of diverse roles who were
willing and ready to learn new skills, both for their current role and for the changing business. She saw this
as an opportunity to use the knowledge and skills she had gained throughout her 25-year career in learn-
ing and development to make a difference in the lives of employees and to have organizational impact.
Before her first day on the job, Taylor began to construct a master plan to provide dynamic and engaging
training to all of the organization’s employees. She envisioned that the return on investment metrics would
ultimately prove that the training opportunities she wanted to introduce would be best in class for the
transportation industry.
Taylor was eager and prepared to change the landscape of training for her new organization. However,
within the first few weeks, she noticed that collaboration and sharing of existing training resources wasn’t
happening among the business units. The organization had 10 training teams that serviced various
segments of the business. And, although these were formal entities, Taylor quickly learned that many
other groups also maintained isolated informal training teams unknown to those outside their group. And,
within those groups, the learning audiences were populated by both union and nonunion employees,
which presented its own challenges and constraints for training design and delivery.

Figure 18-1. Framework for a Learning Governance Board

Strategy Board
Role and Responsibility:
• Maintain alignment with the direction of the company
• Set learning philosophy and policies
• Identify and prioritize strategic learning needs
• Approve and fund annual learning plan
• Ensure learning is run as a business process
• Monitor impact of learning
• Act as a visible champion for learning
Membership (8–10 members):
• Standing members: CEO (sponsor), senior HR leader, learning leader
• Rotating members: senior business leaders from different lines of business, functions, and geographies; external thought leader (optional)
• Minimum of 4 meetings annually

Regional Advisory Board (as appropriate)
Role and Responsibility:
• Provide regional insight into the enterprise learning direction and priorities
• Identify and prioritize strategic regional needs
• Monitor impact of regional learning
• Act as a visible champion for learning within the region
Membership (4–6 members):
• Standing members: regional president (sponsor), regional HR leader, learning leader
• Rotating members: senior business leaders from different lines of business, functions, and geographies
• Minimum of 3 meetings annually

Curriculum Advisory Board
Role and Responsibility:
• Set direction for curriculum area of school or corporate university
• Identify and prioritize functional learning needs
• Provide directions on target audience and levels of penetration for learning
• Recommend program sponsors and design team members
• Assist in obtaining necessary funding
• Act as a visible champion for the curriculum
Membership (4–6 members):
• Standing members: functional business leader (sponsor; e.g., CFO, Sales, CHRO), learning leader
• Rotating members: senior leaders with content expertise from diverse lines of business and geographies
• Minimum of 2 meetings annually

Enterprise Learning Council
Role and Responsibility:
• Link and leverage learning synergies
• Recommend enterprise learning policies and processes
• Share best practices in learning
• Provide input into annual enterprise learning plan
• Act as a visible champion for learning
Membership (10–12 learning leader members, depending on the size and number of learning groups):
• Council leadership rotated annually
• Minimum of 4 meetings annually

A federated structure can work well for some organizations and there are advantages to the system,
but as Taylor soon learned, there are also many challenges. Having numerous training teams focused
on developing content that is relevant for their intended audience can be beneficial, but it also leads
to duplicate training modules; no standardization in work output; slower implementation times, which
could negatively affect the business; and inconsistencies in content and delivery. With different units
moving in so many directions, there was no champion at the highest level of the organization to influ-
ence alignment with corporate goals and objectives. In essence, the actions were all reactive, relying
on the skills of the individuals involved rather than representing a collective with a standardized process.
So, while Taylor’s initial goal had been to provide employees the training and learning experiences
they needed and wanted, her new challenge became creating alignment within a federated training model.
And this was a completely new task that she had not anticipated.
Based on research and extensive inquiry, Taylor discovered a way to mitigate the challenges and
create alignment through governance. The model she selected referenced the work of Grant Ricketts and
Brook Manville (2004), which states that “governance in general is a system of structures, processes, prac-
tices and values that enable an organization to make good and actionable decisions . . . [by]:
• preparing plans and managing implementations
• overseeing continuing operations
• surfacing and resolving problems
• resolving conflicts
• assigning responsibility and ensuring accountable actions.”
Through continued investigation, Taylor quickly realized that governance in learning is a widely
used tool for organizations and is particularly useful in the federated training model. The process
provides a formal framework for including stakeholders and managing decisions for how learning,
talent development, and performance improvement practices work within the organization. It also
includes a set of operating policies, procedures, and standards. The design, development, and imple-
mentation of a governance process within this federated learning structure became Taylor’s top
priority—and her passion.
The first phase of her change initiative was to establish relationships with other training leaders at
the organization. Using company data, Taylor identified the departments and leaders that supported the
largest audiences in the organization and scheduled an initial, collective meeting with them. The intent of
the meeting was to share business practices and determine how these major groups could collaborate to
positively influence the business. Under Taylor’s guidance, the organization developed opportunities for
leaders of the training functions to meet on a monthly basis to increase communication, leverage tools,
and share ideas. This first structure established within the governance model was the Learning Council.
Once the leaders were in the room, they realized the benefits that would result from their collaborative
efforts, and the meeting was a success.

Although this structure created upper-level collaboration, Taylor believed that more could be done
to cascade the alignment efforts. In speaking with managers, she found that they were only aware of the
materials their own teams were developing and were unfamiliar with other training managers within the
company. After careful deliberation and discussion with the Learning Council, the leaders determined
that this type of meeting would also benefit the managers of their training units. As a result, Taylor intro-
duced the Learning Forum (Figure 18-2). This group met monthly and included managers from various
areas of the business who would discuss and share methods to capture training analytics, the completion
of high-profile projects being developed by their teams, and knowledge and best practice considerations.

Figure 18-2. Cascading Governance Model for Learning Forum

After the governance model had been in place for three months, an opportunity arose to test collab-
oration and sharing between and among the various business units in the form of hosting an experiential
two-and-a-half-day workshop, called a Learning Lab, for 82 senior learning leaders from more than 35
companies representing a variety of industries. The workshop's design team was composed of a
training manager from each of the five distinct units and led by an outside partner. They spent three and
a half months designing, developing, and coordinating the action research workshop, which would take a
deep dive into ways to lead change and result in competitive advantage.
The workshop included a variety of presentations from other companies about the tools and training
approaches they used to enhance their employees’ ability to accomplish the company’s mission and vision
as well as reinforcing the organization’s strategic initiatives. The action research part of the workshop
included a product presentation by internal training managers from the Learning Forum that attendees
assessed by meeting collaboratively in small groups and providing feedback. Topics shared included the
hiring process, onboarding, customer-service training, using virtual reality in the training space, and using
the Emergenetics Profile, which is a self-assessment built on four thinking attributes and three behavioral
attributes. The workshop also included networking lunches and a panel discussion by the Learning
Council. This provided an opportunity for the group to collectively share how using the governance
model was improving training effectiveness and efficiencies and promoting organizational alignment
around competitive advantage.

The opportunity to work on a common project served as a major impetus for working horizontally
across boundaries in a variety of ways. The most obvious result was the benefit to the learning managers
of an extremely successful project. Using this experience as a foundation, managers are now creating
numerous synergies by sharing their approaches and processes. This helped them reduce duplication
and, most important, see the value of exchanging ideas and learning from one another to leverage best
practices.
While the successes with this governance process have been significant so far, Taylor realizes that
much work still needs to be done. She has several new ideas for more formal approaches to enhancing the
process and these are currently under consideration. Additionally, Taylor and her group are continuing
to benchmark other companies for best practice ideas, because she knows there is truth behind the Ralph
Waldo Emerson quote: “Good thoughts are no better than good dreams unless they are executed.”

The Deloitte Story: Governance From a Federated Model
by Graham Johnston

At Deloitte, learning and development is widely recognized as a key enabler for business priorities, a
driver for organizational performance, and critical to maintaining competitive advantage. Deloitte is a
professional services organization, and its people represent the products and services delivered to clients.
Accordingly, there is a proportionate commitment to and investment in development that drives the orga-
nization’s strong learning culture. Deloitte offers a holistic development experience for its workforce, one
that integrates learning with the work and builds technical, professional, leadership, and industry capabil-
ities to optimize the value for the clients they serve.
To understand learning governance at Deloitte, one needs to understand the organizational structure
of the talent development function. It operates as a federated or blended model with seven entities, includ-
ing a talent development team, for:
• each of Deloitte’s four primary businesses (consulting, tax, audit, and advisory)
• the internal support functions (such as finance, information technology, and talent)
• leader development, onboarding, and performance management
• learning specific to the industries served.
Additionally, a shared services team executes repeatable transactions with high quality and efficiency,
specializing in fields such as continuing professional education (CPE) accreditation, program delivery
support, reporting and analytics, and vendor management. It also includes an instructional design group,
which helps accelerate time from design to delivery and manage vendor spend.
In this model, the talent development teams are largely decentralized and aligned with their respec-
tive stakeholder groups to best address their unique needs. However, there is extensive coordination and
alignment between and among all seven teams, working against a single talent development strategy while
executing their individual talent development plans.

Each of the seven talent development teams has a chief learning officer—who reports to the
managing director for talent development and ultimately Deloitte’s chief talent officer—driving
business and talent strategy alignment. By having CLO leaders at the helm of each business’s learn-
ing structure, they achieve close integration of business objectives and supporting development
strategies. The CLOs and their teams work closely with Deloitte’s business leaders to understand
their priorities and practitioner capability needs, and to develop learning solutions that best achieve
performance objectives. This structure yields development solutions that build knowledge and skills
around how Deloitte goes to market, the clients it serves, and the different roles required of its
professionals. Additionally, these relationships between talent development and the business enable
ongoing communication about talent development performance, the evolving business drivers and
associated learning needs, and continuous improvement efforts, while also creating an environment
of accountability.
To this end, Deloitte’s talent development function follows a set of consistent, integrated
processes to define, implement, and maintain annual development strategy and plans:
• Talent development planning. Prior to each fiscal year, the CLOs meet with
their respective business leaders to discuss client service, market growth, operational
performance, and talent-related business objectives. These form the basis for defining
learning and development priorities, which the CLO uses to formulate the development
plan and budget for the coming year.
• Driving development impact. Deloitte regularly assesses talent development’s
performance in terms of efficiency, effectiveness, innovation, and business alignment
and impact. This provides a platform for demonstrating value and highlighting
opportunities to enhance impact on business outcomes. Deloitte also measures the
impact and effectiveness of individual learning solutions to validate that they are doing
what was intended or point to opportunities for improvement.
• Maintaining business alignment. The CLOs regularly meet with business
stakeholders throughout the fiscal year to understand evolving business objectives
and changes to associated capability needs, based on changes to the market. This
enables ongoing alignment between client service needs and learning strategies
and solutions.
This governance model is reflective of Deloitte's organizational structure and operating model,
and positions talent development to be an enabler of business strategy, as business priorities inform
development goals and solutions. This level of integration between business direction and talent
initiatives is a critical success factor for Deloitte.

Summary
The mission for talent management and particularly learning functions is ensuring that a compe-
tent workforce is in place with the right skills at the right time doing the right work. However, in
the haste of busy schedules, many learning teams simply respond to managers' requests. For greater
effectiveness with performance directed at delivering the desired business results now and in the
future, learning leaders need to advocate for assessing the organizational system to influence building
new capabilities. In between these extremes is keeping the learning operations flowing smoothly.
This includes reskilling the workforce when the work changes, ensuring skills are current, and upskill-
ing for the future by creating opportunities to expand the organizational capability in a way that
improves strategy execution, creates competitive advantage, and ultimately creates superior organi-
zational performance.
Thus, organizational impact is not only from what learning leaders are doing; many times,
it is a result of the structure and governance system in place. This is important for a variety of
reasons, but the most important is that they can influence the standing of learning within the
organization and the function’s ability to successfully implement processes for skilling, reskilling,
and upskilling, which enhance performance. Organizations that view learning as a major player
ensure employees are skilled, engaged, and motivated to deliver a high level of performance. But
they also continually look at the external environment and the organization as a system to assess
future employee capability needs to operate in a VUCA environment. With disruptive drivers ever
present and with the pace of change continually advancing, competitive organizations must pay
attention to both the structure of and the governance process for learning. However, they do this
in many different ways.

Key Takeaways
• The structure of the learning function plays a large role in how the work is organized, how
resources are used, and how people work together, all of which affect performance.
• The continuum from completely centralized to completely decentralized is long and provides
myriad opportunities for variations in between. Some organizations call one of these hybrids a
federated model.
• The governance process differs depending on the structure; however, governance involves key
stakeholders and therefore helps align learning to the business goals and objectives. This alignment
contributes to success in building capability for performance because of the support and buy-in from
various stakeholders and the more collaborative approach.

Questions for Reflection and Further Action
1. What is the learning structure in your organization? Why is this structure used? What are the pros
and cons?

2. What formal governance process, if any, is used in your organization and how does it add value?

3. If you do not have a formal governance structure, how might you use these ideas to get started?

Section 6
Enabling Learning
Using Technology

This section runs the gamut from fundamental technologies that help the learning function deliver, monitor, and track learning to emerging technologies that disrupt learning and enable personalization, learning paths, and trigger questions for reminders. While technology opens doors to making learning content more accessible, realistic, and immediate, it also represents the shiny new objects that surface almost monthly, tantalizing and frustrating learning leaders just from the sheer volume.

One theme that surfaces throughout this section is that while learning leaders do not need to be technical geeks, they must be digitally literate and know how technologies are influencing the work, the workers, and the learning content. This also implies a growth mindset and a willingness to experiment.

In chapter 19, Jerry Kaminski addresses how the learning leader deals with new and emerging technologies requested by customers, and in the narrative makes a critical point: The place to start is clarity on the learning need and the best way to close identified gaps. The next step is to gain an understanding of the current capabilities of the technologies already in use. Only after these steps are completed is it advantageous to research emerging technologies and their capabilities to advance learning options and solutions.

Terry Copley uses a jungle metaphor to describe Hilti's venture into emerging technologies in chapter 20. As the organization advanced from an LMS to game mechanics, the cloud, and finally virtual and augmented reality, the environment was scary and fraught with unexpected challenges. But there were also a few successes that encouraged further exploration and usage. Additionally, they partnered with and learned from experts who served as guides to navigate the terrain. While the technology was unfamiliar at the time, the team was on solid ground because their focus started with the learner—and the goal was how best to build learning assets for modern learning. The experience helped them realize that a different mindset is necessary to push us to get uncomfortable with the status quo and venture into a jungle.

Brandon Carson provides a big-picture perspective for how work, employee demographics, and learning are changing in chapter 21. The more uncertain and complex environment means that every worker at every level has new challenges that need to be addressed. In sync with the rest of the book, he expounds on the imperative for learning leaders to focus on the performance required for business results. Within this overview he encourages them to adopt a digital frame of mind and simultaneously keep a sharp focus on the human element. Additionally, he provides a case study for going mobile.

19
Is Tech the Answer?
Jerry Kaminski

“Don’t find customers for your products; find products for your customers.” —Seth Godin

Maria, vice president of operations, is reading Fast Company and considering using some new, really slick
learning technology for her next project.
Steve, the company’s chief talent development officer, is sitting in his office when the phone rings. It’s
Maria: “Steve, do you have a moment to discuss a proposition I have for the company?”
Steve is always open to hearing Maria’s suggestions. She starts by telling Steve about an article she
just read detailing how new technologies like artificial intelligence (AI), augmented reality (AR), virtual
reality (VR), and wearables are revolutionizing how training and talent development is delivered. Steve
sighs. He’s been looking into many of these technologies as well, but continues to get resistance from both
the IT department (“We can’t support those technologies”) and finance department (“Do you know how
much it will cost to have these toys in our portfolio?”).
Not wanting to be a naysayer, Steve continues to listen to Maria’s ideas and enthusiasm about
these technologies. She shares the article’s conclusion about how easy it is to develop solutions with
these products and how much time and cost it can save the organization. Steve begins jotting notes
from Maria’s ideas, and is particularly interested in reading the article and some of the sources she
is referencing.

“Steve, why can’t we develop wearable technologies for our technicians to use in the field?”
Maria asks. “We could track their completions of any required training, as well as provide our oper-
ator qualifications directly to our learning management system.”
Knowing that Steve is still listening intently, Maria adds, “I would also like to use AR and some
form of game mechanics in the design. Could we make it like that game show—ask the audience,
phone a friend, or eliminate two as a creative way to answer questions our technicians may have in
the field?”
She continues explaining the what of the solution, but lacks many details about the actual
content or the how or how-to needed to accomplish behavioral changes as a result of the solution.
Her entire focus is on the technology and how she thinks it could revolutionize the company and its
image as being tech-savvy.
What does Steve say? First, he asks for clarification: “Maria, can I ask some questions to help
narrow down how our talent development team could help?”
“Sure Steve, ask away,” Maria replies.
So he asks the following questions:
• “Do you have a target content area that we could focus this on and not be so broad?”
• “Is there a timeline for development? A budget?”
• “Do you have the resources for subject matter experts my team can work with?”
• “Do you have senior leadership support to start this endeavor?”
• “How soon do you want us to start?”
• “What’s your vision for communicating these new technologies, and do we have a change
management plan for the technicians?”
“Yes, I have answers to all these questions and more details,” Maria responds. “Let’s schedule
some time next week to outline a plan I can present during the next senior leaders roundtable the
following week. You know I like working with your group, and I think between the two of us, we can
hit a home run with this project.”
Now what? Where does Steve start? How can he accomplish all that Maria has asked and do
it within his resource constraints of budget, people, and technology? And most important, how can
he provide a solution with real, meaningful change and benefits? According to Gartner (2017), “80
percent of social business efforts will not achieve the intended benefits due to inadequate leadership
and an overemphasis on technology.”

Assessing New Technologies


Before we jump to emerging technologies, let's do a quick review of current technologies. Many
emerging technologies are grounded in the success of current ones, and understanding the current
landscape is just as important as moving into the emerging technologies.

Steve starts assessing new technologies, focusing on those that Maria outlined: AI, AR, VR, and
wearables. Maria also mentioned game mechanics. Steve’s team has been keeping tabs on these emerging
technologies for some time and has been trying to identify a proof-of-concept project that might allow
them to bring something into the company. This may be the ideal opportunity!
When assessing current and emerging technologies, the choices are all over the map in terms
of vendors, technologies, costs, timelines, and “glitz factor.” While nonlearning professionals may be
attracted by the glitz, they may not always be pleased by the outcomes. The good news is, Maria has
outlined her ideal outcomes and thoughts, which Steve can focus his assessment on. She is looking
specifically at wearables and game mechanics paired with the technical skills of the technicians.
Steve believes this would be an ideal marriage of technology and performance, and would allow him
to focus his attention on one technology rather than the entire spectrum. He is also excited because
he recently saw a demonstration of a solution with wearables that sounds exactly like what Maria is
looking to accomplish.
When reviewing existing technologies, consider how many will aid in the success of using emerging
technologies. Explore current technologies that affect a digitization strategy first, and then look at the
bigger picture of the emerging technologies for talent development.

Content Management
Content management is the process for collection, delivery, retrieval, governance, and overall management
of information in any format. Generally, it applies primarily to digital content, from creation through stor-
age and deletion. Examples of content include MS Word documents, PowerPoint, multimedia or e-learn-
ing files, movie files, audio files, graphics, or any asset used to support learning activities.
There are three major components to content management solutions:
• content storage capability
• tracking, logistic, and administration capability
• delivery systems.
Let’s start by defining some of these systems and how they affect technology use for talent development.
• Learning management system (LMS). An LMS is a software application for the
administration, documentation, tracking, reporting, and delivery of educational courses or
training programs. This is the heart of logistical tracking of students’ content completion,
tracking, and scheduling.
• Learning content management system (LCMS). An LCMS is a platform that allows
you to create, manage, host, and track digital learning content. You can think of it as a
one-stop shop for e-learning, from creation to delivery. The LCMS is essentially a storage
facility for all your learning assets, including content, graphics, videos, and audio. It also has the
ability to build a finished course.

What is the difference between an LMS and an LCMS? Because of their similarities, people
often think LMSs and LCMSs are the same thing. In fact, the term LMS has evolved to describe prod-
ucts that have both LMS and LCMS functionality, which can be incredibly confusing for first-time
users or buyers. But there are differences you can look out for to help choose the type of platform
that’s best for your organization.
The biggest difference is that an LCMS is used to create course content, while an LMS is used
to deliver that content to learners. Whether you have separate systems for your LCMS and LMS or
use one LMS suite that includes both (which is common), they work in tandem to manage the entire
learning process.
For example, say a developer needs to create e-learning content about updated managerial tech-
niques. They can use an LCMS to collaborate on and author this content, combining pre-existing
assets (such as videos, audio, and images) with content created in-system, such as text and assess-
ments. The LCMS can then help arrange all this material into a logically sequenced, comprehensive
course on the subject.
From there, developers can publish the finished content to an LMS. Trainees log in to the LMS
to take courses and assessments, while managers and corporate trainers can access it to track learner
progress and outcomes. Both the LMS and the LCMS should be integrated into your human resource
information system (HRIS) as well as be aligned with your HR and training data.
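
As a purely illustrative sketch of what that alignment can look like in practice, the short Python example below reshapes a hypothetical LMS completion record into the fields an HRIS might expect. Every field name, value, and system here is invented; an actual integration maps to the specific schemas of the LMS, LCMS, and HRIS in use.

# Hypothetical example of aligning an LMS completion record with an HRIS entry.
lms_completion = {
    "learner_email": "j.rivera@example.com",
    "course_code": "MGR-101",
    "status": "completed",
    "completed_on": "2024-03-15",
    "score": 92,
}

def to_hris_record(completion: dict, employee_ids: dict) -> dict:
    """Translate an LMS completion row into an HRIS training entry."""
    return {
        "employee_id": employee_ids[completion["learner_email"]],
        "training_item": completion["course_code"],
        "completion_date": completion["completed_on"],
        "passed": completion["status"] == "completed",
    }

employee_directory = {"j.rivera@example.com": "E04712"}
print(to_hris_record(lms_completion, employee_directory))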

Content Delivery Systems


Computer-based training (CBT or e-learning) was first introduced in the mid-1980s and championed
as the tool that would put an end to instructor-led training (ILT). In fact, many predicted that learn-
ing organizations would no longer train workers in a classroom setting. Forty years later, however, the
percentage of ILT remains very large—exceeding 50 percent in many organizations (much as it was
in the mid-1980s). What has changed is the breadth of training offerings; instead of CBT wholesale
replacing ILT, it just grew the base of offerings.
Many instructional designers and talent development managers agree that there is some consid-
erably bad e-learning on the market—especially those e-learning courses that are no more than
“sound on slide” presentations. The 1960s and early 1970s saw extensive use of 35mm slideshows
synced with audio tapes; having an automated voice system advancing the projected slides was a
revolutionary concept. Even with the advent of less expensive technology, graphics, video, audio,
touchscreens, and mobile connectivity (laptop, tablet, or phone), many e-learning programs continue
to look like those early sound-on-slide approaches. For advanced technology to be truly revolution-
ary or emerging, it needs to transform the way the tasks or training is conducted. Many emerging
technologies are transforming how we do e-learning, such as using higher levels of interactivity, simu-
lations, and virtual reality foundations to better prepare today's workers. There is still a significant
need for e-learning programs, and many companies still consider it to be an emerging technology (see
appendix 2 for a list of technology platforms and systems).
In addition to technology, organizations also use aggregators (or curators) and integrators to help
with their digital strategy. Here is a brief description of these different support organizations:
• Aggregators or curators. Content curation is the process of gathering information
relevant to a topic or area of interest. Services or people that implement content curation
are called curators. These services can be used by businesses and end users. In simple
terms, the process of content curation is the act of sorting through large amounts of
content on the web and presenting the best posts in a meaningful and organized way. The
process can include sifting, sorting, arranging, and placing found content into specific
themes, and then publishing that information to a targeted audience.
• Integrators. These are third-party companies (usually identified by LMS or LCMS
vendors) that assist organizations with the implementation of their systems. The success
of the integrator lies with its ability to fit all your systems together seamlessly and with
minimum user interruptions. In some instances, the vendor may serve as the integrator, or
a chosen partner will serve this role. The integrator brings considerable experience with
the LMS or LCMS, and often the company’s HRIS, which is another critical component.
Examples of aggregators or curators and integrators include Accord, Axonify, Bamboo, Coursera,
Degreed, EdCast, Exec Online, FUSE, Harvard ManageMentor, Joomla, LRN (ethics and compliance
content), Pathgear, Pinterest, Red Vector (technical content), LinkedIn Learning, Saba, SAP Hana, SAP
JAM, Skillsoft Percipio, Udemy, Workday (social learning platform), and Your Learning.

Emerging Technologies
To this point, we have explored many of the current technologies within the talent development arena.
However, these are all back-office technologies—those that reside predominantly in the development
and logistics area. The majority of emerging technologies are in the realm of instructional design and
content development, and don’t necessarily address delivery. If you recall from our opening story, Maria
was looking for AI, VR, and AR. Let’s explore these emerging technologies and see how they affect the
learning function.

Artificial Intelligence
AI is the theory and development of computer systems that are able to perform tasks that would
normally require human intelligence, such as visual perception, speech recognition, decision making,
and language translation. AI requires considerable development around rule-based outcomes, and
while it can supplement many routine tasks, it does require a higher level of sophistication to actually
replace human intelligence.


AI is commonly used in decision trees or procedural tasks. Use of Bloom’s Taxonomy is key when
developing an AI solution—questions and tasks must align with the six levels: knowledge, under-
standing, application, analysis, evaluation, and creation. While AI continues to grow and become
more widespread, there is still considerable research needed to make it work for human performance.
Consider this—ATMs were one of the first commercial applications of AI. They knew you were doing
only one of three tasks: withdrawing money, depositing money, or inquiring about balances. Once you
made that selection, the next question for the decision tree was whether it was for a checking, savings,
or money market account. The decision tree was fairly specific and did not require human interaction
or training on the user’s part to accomplish the tasks. Another use of AI is the self-checkout lines at
many stores today.
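To make the decision-tree idea concrete, here is a minimal rule-based sketch in the spirit of the ATM example. It is hypothetical—the prompts, account types, and actions are invented and are not taken from any real banking system.

```python
# Minimal rule-based decision tree: each node asks one question, the answer
# selects the next branch, and the walk ends at a leaf action. All prompts,
# options, and actions here are illustrative.
TREE = {
    "question": "What would you like to do?",
    "options": {
        "withdraw": {
            "question": "From which account?",
            "options": {
                "checking": "Dispense cash from checking",
                "savings": "Dispense cash from savings",
                "money market": "Dispense cash from money market",
            },
        },
        "deposit": "Accept deposit envelope",
        "balance": "Display account balances",
    },
}

def run(node, answers):
    """Walk the tree with pre-supplied answers; return the leaf action reached."""
    for answer in answers:
        node = node["options"][answer]
        if isinstance(node, str):      # reached a leaf: a concrete action
            return node
    return node["question"]            # ran out of answers mid-tree

print(run(TREE, ["withdraw", "checking"]))  # -> Dispense cash from checking
```

The point of the sketch is that the "intelligence" is entirely rule-based: every path through the tree has to be anticipated by the designer, which is exactly why this style of AI suits narrow, procedural tasks.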

Virtual Reality
VR has been prevalent for some time in complex, dangerous, or costly simulations, such as controlling
nuclear reactors. This technology is composed of computer-generated simulations of 3-D images or envi-
ronments that users can interact with in a seemingly real or physical way by using special electronic equip-
ment, such as a helmet with a screen inside or gloves fitted with sensors. VR is considered the ultimate
simulation, allowing the user to perform a series of tasks or experience the environment. Flight simulators
are another type of VR solution.

Augmented Reality
AR is a newer technology that shows the most promise and, when combined with wearable tech-
nology, has the potential to offer solutions never before imagined in the TD field. AR technology
superimposes a computer-generated image onto the user’s view of the real world, thus provid-
ing a composite view. For Maria's request, consider this scenario: A technician is dispatched to a
customer site that has a serious equipment issue. While the technician is fully qualified to work on
many pieces of equipment and platforms, they are not fully up to speed with this particular equipment
and layout. So, the technician puts on a pair of wearable glasses, pulls up the customer and the specific
equipment on their connected device, and then looks at the open equipment bay. The wearables pick
up a QR code that has been installed in the equipment bay, pull up the appropriate schematic diagram
along with step-by-step troubleshooting instructions, and display it superimposed onto the actual
circuitry. The technician is then able to troubleshoot and repair the equipment in a matter of minutes,
never having to leave the workspace, pull out a manual, or second-guess whether they have the right
diagrams or solutions. While all emerging technologies have merit,
AR is the leading contender for highly skilled jobs and roles. It allows for reality-based and real-
time use or training for complicated roles, and leads to faster, more efficient repairs and reduced
training for individuals.
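As a rough sketch of the plumbing behind that scenario—scan a QR code, look up the matching schematic and work instructions, and hand them to the display layer—consider the snippet below. It is a hypothetical illustration: the registry, field names, and lookup function are assumptions, not part of any vendor's AR toolkit.

```python
# Hypothetical QR-to-content lookup behind an AR overlay: the code installed in
# the equipment bay acts as a key into a content registry, and the returned
# record is what the headset would render over the live view.
EQUIPMENT_REGISTRY = {
    "QR-PUMP-07A": {
        "schematic": "schematics/pump_07a_rev3.svg",
        "steps": [
            "Isolate power at breaker 4",
            "Check inlet pressure against spec",
            "Replace seal kit if pressure drops below threshold",
        ],
    },
}

def resolve_overlay(qr_payload, registry=EQUIPMENT_REGISTRY):
    """Return the schematic and step-by-step instructions for a scanned QR code."""
    record = registry.get(qr_payload)
    if record is None:
        return {"error": "Unknown equipment — escalate to the help desk"}
    return {"schematic": record["schematic"], "steps": record["steps"]}

overlay = resolve_overlay("QR-PUMP-07A")
print(overlay["schematic"])
for step in overlay["steps"]:
    print("-", step)
```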

Wearables
Wearable technology is a category of electronic devices that can be worn as accessories, embedded in
clothing, implanted in the user’s body, or even tattooed on the skin. The devices are practical, hands-free
gadgets that are powered by microprocessors and enhanced with the ability to send and receive data via
the Internet. Wearables are not entirely new, but with the advent of smaller and faster technology and
connections, they’re becoming more popular. When wearables are combined with augmented reality, it
becomes possible to perform a multitude of tasks that no longer require extensive training, volumes of
manuals, or fear of incorrect solutions or sequences (as in the example of the technician from the AR
section). Wearables are revolutionizing the way people perform, work, and learn. Job aids were originally
developed to reduce the need to train for recall, and wearables combined with AR take that thought process
even further. These technologies will continue to expand and inspire new solutions as they come down in
price and become more accessible for developers. Just as digital video cameras and editing software revo-
lutionized the entire video media industry, so will wearables.

Drones
Drone technology is not specifically new or unique to talent development. A drone, in technological
terms, is an unmanned aircraft, more formally known as an unmanned aerial vehicle (UAV) or
unmanned aircraft system (UAS). Essentially, a drone is a flying robot that can be remotely controlled or
fly autonomously through software-controlled flight plans in its embedded systems, working in conjunc-
tion with onboard sensors and GPS. Video capture is the most common use of drones in talent devel-
opment, because they give you the ability to get high-resolution aerial photography that once was only
available using aircraft. Drone technology is not very complicated, but combined with many of the other
new technologies, it gives the TD function a much broader range of capabilities.

Gamification and Game Mechanics


Use of gaming theories falls into two distinct arenas: gamification and game mechanics. Gamification is
the process of adding games or game-like elements to something (such as a task or training) to encourage
participation and completion.
Game mechanics are the methods through which players (or other agents) interact with the game state,
thus producing gameplay. All games use mechanics; however, theories and styles differ as to their ultimate
importance to the game. In general, the process and study of game design are focused on developing
game mechanics that allow for an engaging, although not necessarily fun, experience. The interaction
of game mechanics determines the game’s complexity and level of player interaction, as well as game
balance when tied to the game’s environment and resources. Some forms of game mechanics have been
used for centuries, while others were invented within the past decade. Complexity in game mechanics
should not be confused with depth or even realism.
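To make these ideas a little more concrete, here is a minimal, hypothetical sketch of common game mechanics applied to learning—points for activities, a level, and a badge threshold. The activity names, point values, and thresholds are invented for illustration.

```python
# Minimal gamification sketch: award points for learning activities, derive a
# level from total points, and grant a badge at a threshold. Values are illustrative.
POINTS = {"complete_module": 10, "post_answer": 5, "peer_upvote": 2}
LEVEL_SIZE = 25          # points needed per level
BADGE_THRESHOLD = 50     # points needed for the "Contributor" badge

def score(activity_log):
    """Turn a list of activity names into total points, a level, and any badges."""
    total = sum(POINTS.get(activity, 0) for activity in activity_log)
    level = total // LEVEL_SIZE + 1
    badges = ["Contributor"] if total >= BADGE_THRESHOLD else []
    return {"points": total, "level": level, "badges": badges}

log = ["complete_module", "complete_module", "post_answer", "peer_upvote"] * 2
print(score(log))   # -> {'points': 54, 'level': 3, 'badges': ['Contributor']}
```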


Budgeting
So how much do these new and emerging technologies cost? When you start thinking about budgeting
and cost for emerging technologies, it’s like buying a car. Do I want a low-cost car, a midlevel car, or a
luxury model? Technology is much the same: There are price points for all solutions and budgets.
When you’re entering into a partnership with technology vendors, one of the first questions they’ll ask
is, “What is your budget?” They’re likely not trying to upsell you on more expensive solutions (although
there’s always a chance that they are); instead, they are trying to judge a solution that best fits your budget
amount. Obviously you can always grow your budget, but be realistic up front; you may tell your business
partners you can deliver X only to find out your budget can only deliver Y. A common mistake when
dealing with technology-related projects (not just within talent development but across all disciplines) is
overpromising a solution for the budget you have. Make sure everyone has realistic expectations of the
outcomes as well as the budget being spent.
Before you start developing your budget projection—especially for an emerging technology—there
are several cost considerations to keep in mind. They’ll affect how much you’ll be spending, as well as your
project’s ROI.

Audience Size
While your business partners may want a technology solution for all problems and audiences, consider
that it may not be as cost effective as you think. Normally solutions for fewer than 250 people are not
cost effective. To fully identify your minimum audience size, complete an ROI analysis using both the
cost of the solution and the cost of the problem. Audience size may vary based on the solution cost, but
always consider it in your decision-making process before agreeing to implement the solution. Ensure your
requestor confirms the audience size before committing to development.
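One simple way to sanity-check that rule of thumb is a back-of-the-envelope break-even calculation: find the smallest audience for which the per-person cost of the solution falls below the per-person cost of the problem. The figures below are invented for illustration only.

```python
# Back-of-the-envelope audience-size check (illustrative numbers only): find the
# smallest audience for which the solution cost per person is no more than the
# annual cost of the problem per person.
solution_cost = 250_000          # total cost to build and deploy the solution
problem_cost_per_person = 1_200  # estimated annual cost of the problem, per worker

break_even_audience = -(-solution_cost // problem_cost_per_person)  # ceiling division
print(f"Break-even audience size: {break_even_audience} people")
# With these assumed figures the solution only pays off at roughly 209+ people,
# which is in line with the chapter's rule of thumb of about 250.
```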

Audience Distribution
Just as audience size is critical, so too is audience distribution. When dealing with technologies such as
wearables or computer-based VR simulations, having a large, distributed audience can be costly. Make
sure to complete a thorough assessment of your audience’s makeup and locations, as well as your ability
to support large-scale technology. The term scalability is critical to the decision process. While you may be
able to develop an emerging solution for one location, implementing it worldwide to 50 locations may be
both a challenge and a huge cost consideration.

Content Volatility
One of the more overlooked budget cost factors is content volatility—how long will the content
remain current before it needs revision? Developing a solution that has a shelf life of two months means
you may be right back at square one developing a new solution. While up-front costs for equipment, staff,
or software are high, the cost for development may be even higher, especially if you use a vendor for
development. Emerging technology solutions can cost upward of $250,000 per project. So, make sure you
consider the content shelf life when doing your analysis and planning.

Technology Readiness
While it’s not a direct cost or budget consideration, you should consider the technology readiness of your
development team, IT department, and end users. Do they require specialized training and should that
be included in the budget? What about specialized hardware and software? Make sure you’re not selecting
an emerging technology that is too advanced, cumbersome, or costly for your user population. And if you
do, include that in your costing and budgeting analysis before you start. You’ll also want to involve your
communications and change management teams as you embark on your project—what good is a perfect
solution if nobody knows about it?

Cost
Again, consider if you want the low-, medium-, or high-cost solution. As with most projects, solutions are
available at all price points. When deciding, make sure the price point you select matches your expectations
for outcomes, as well as your requestor’s. Telling them they are getting top-of-the-line solutions, only to see
marginally different outcomes, will do considerable harm to your reputation and ability to request addition-
al projects. Cost savings could also include grants, joint-project budgets, and co-op work with your vendor.

Return on Investment
Lastly, after collecting all the cost factors, complete an ROI analysis. Identify the cost of the problem you
are attempting to solve and the cost of the solution you are implementing. Ideally, you want the cost of
the solution to be less than the problem. Exceptions to that are when a solution is mandated or you are
attempting a proof-of-concept project, which will have learning implications for future projects.
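As a minimal, hypothetical worked example of that comparison (the figures are invented; a real analysis would also account for timing, ongoing maintenance, and the exceptions noted above):

```python
# Simple ROI comparison with invented figures: estimated cost of the problem
# (e.g., lost productivity over the solution's shelf life) versus total cost of
# the solution you are implementing.
cost_of_problem = 900_000    # estimated cost of leaving the problem unsolved
cost_of_solution = 400_000   # development, hardware, rollout, and support

net_benefit = cost_of_problem - cost_of_solution
roi_pct = net_benefit / cost_of_solution * 100
print(f"Net benefit: ${net_benefit:,.0f}  ROI: {roi_pct:.0f}%")
# -> Net benefit: $500,000  ROI: 125%
# A mandated or proof-of-concept project might proceed even with a negative ROI.
```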

Summary
Determining your digital strategy is not a straightforward decision. First, you should be very cautious
and thorough in all your decisions, especially for emerging technologies. Next, consider using an accom-
plished vendor, benchmarking others who have implemented similar projects, doing your homework and
research, and being prepared for setbacks and failures. Finally, use an Agile approach to your work; that
is, use sprints to do small cycles of development, test your hypothesis at each one, and improve each time.
So, getting back to our story . . . What did Steve do?
Steve connected with a local vendor he had used in the past and outlined the project he wanted to
pursue. They came up with a solid project plan outlining a wearable glasses solution that linked to the
LMS as well as an internal help desk. The wearable used AR real-time overlays and work instructions.

Technicians put on the wearable glasses, which connected to the LMS, LCMS, and help desk. The glasses
then recognized a QR code displayed on the equipment and pulled up the schematics, overlaying them on
the viewer along with the troubleshooting instructions. Technicians were now able to diagnose customer
problems in half the time and repair customer equipment in a fraction of the time it used to take. Also, with
real-time access to the help desk, technicians had an immediate connection to a senior technician if
they encountered issues outside the normal work instructions.
The solution has been a huge success because it married two technologies (AR and QR codes),
revamped technician processes and procedures, and retooled the technicians. While all components were
critical to the change in performance, the technology was essential to achieving the desired
outcome. Both Steve and Maria were recognized at the last senior-leader town hall and received promotions.
While the ROI was huge in terms of the cost of fixing customer equipment problems, a larger benefit
was realized that no one had considered. The company became a leader in technical repairs—resulting
in six new customer contracts. That new business generated profit well into the multimillion-dollar
range, which gave Steve and Maria sufficient funding to add QR codes to every piece of equipment for
all customers going forward.

Key Takeaways
9 Don’t take on the world—start small, with technologies you are comfortable using. If you are not
currently doing e-learning, taking on a VR or AI project may be overwhelming. Start with solid,
existing technologies to build your team’s confidence before embarking on new technologies.
9 Select successful vendors who can help you succeed. Make sure you benchmark and ask others which
vendors can help provide the solutions you are looking to accomplish. Networking is key in helping
you identify a vendor that best meets your needs and aligns with your organization.
9 Expect it to cost more, take longer, and be more complicated than anticipated. Just remember,
technology can be very expensive. You will likely have to purchase hardware, software, equipment,
services, programming, maintenance, and even technical support. Be realistic when planning and
budgeting the project.
9 Be transparent and communicate frequently with the team, stakeholders, and end users. This goes
without saying for any project, not just technology-related ones.
9 Celebrate minor and major successes throughout the development. You are in it for the duration.
Have both intermittent and final benchmarks.
9 Keep accurate and thorough notes, records, and assets during the entire development cycle. Record
keeping is critical, especially if this is your first project. The better your notes and records are, the
better your next attempt will be. It will also offer you insights into timing and costs.
9 Complete lessons learned and close-out meetings when the project ends. Again, these are critical
steps not just for technology projects but any project in the project management life cycle.

Questions for Reflection and Further Action
While these questions can apply to any organization, the focus is on organizations just embarking on an
emerging technology solution. When first starting out, seek assistance from vendors and other organiza-
tions that may be more mature in their emerging technology journey. These questions help frame your
journey and are meant to keep you from getting stalled in your use of technologies.

1. What ideas from this chapter do you want to pursue further?

2. How might you find additional information?

3. What is your current practice when you receive a call from the Marias of the world?

4. Where does your organization rank regarding using technologies to leverage learning?

5. Based on your current situation and context, what is your next step?

6. Do you or your team have the required skills to take on these types of projects?

7. Does it make sense for your organization to go down this path right now? For this project?


20
Taking the Fear Out of the
Technology Jungle
Terry Copley

When we think about technology, all kinds of imagery can jump into our heads. Some technologies have made huge,
positive impacts on society—cellular phones, medical implements, and GPS navigation to name a few.
In the learning arena, technology’s impact has been profound; from CD-ROM-based e-learning to fully
online training to present-day microbursts of content that’s available anytime, on any device. But, even
with all the positives, some still feel the same way about technology that they would if they were suddenly
dropped into a jungle—alone, with no flashlight, little water, and no means of protection.
Even the words used to describe technology can seem scary—disruptive, emerging, artificial intelligence,
machine learning. We are now living in an age described in some early science fiction novels. However,
regardless of how scary the scenario may seem, change is happening, and it is happening fast.
At Hilti, a privately owned global company with approximately 30,000 team members, we
realized that our personnel demographics were changing, and wondered if we were still creating
the right development activities to engage our team members. So, in 2016 we started our journey to
understand learning technology by creating and implementing a new learning strategy supporting
our corporate strategy in driving business results. We knew that each piece had to support the other
layers and roll up to supporting our overarching long-term business strategy. This tiered approach
helped take the complexity out of the jungle and allowed us to focus on what we wanted to achieve
(Figure 20-1).

Figure 20-1. Tiered Approach Toward a New Learning Technology

We analyzed and synthesized data from interviews with 4,000 internal team members and bench-
marked more than 60 of the top learning organizations globally. We learned that:
• There was a very large gap between how our team members learned outside work (in their
personal lives) and how we trained internally.
• Our current view of technology was extremely small and completely outdated.
• We had a long way to go to even start to understand the trends of virtual reality, augmented
reality, social learning, machine learning, and big data analysis.
In the beginning, addressing our technology issues felt like a daunting task. But with the help of exter-
nal partners, lots of trial and error and pilot programs, and some early successes, we started the journey
and are still reaching for the stars.

The Beginning
We decided to structure our findings on technology needs and gaps from the overarching learning strategy
assessment into four aspirations:
• Move the current reality from our traditional LMS to a virtual learning environment.
• Move content from learning ownership to a more “Hilti brain” concept, in which social
sharing and social learning is the approach.
• Build gamification and learning into the daily business.
• Follow technology trends and venture into AI and VR to build immersive learning
environments.

Before we go into more detail on these four aspirations, it’s important to note that we did not have a
lot of learning technology expertise. We knew a lot about formal learning management, because we had
been doing online learning for more than a decade. But, in a way, this hampered us because at times we
could not even dream of the possibilities that could exist. We had a vision, but we knew we needed help
assessing the potential solutions.
Hilti learners were stuck in a restrictive and disconnected learning environment—they were in a
jungle where some things were beautiful and visible, but there were also many hidden things. They loved
attending classes and networking with peers because they saw they were not alone in their challenges.
However, this positive experience rarely carried back to the job. What they learned in the classroom did
not match the instructions or coaching they received from supervisors.
In the digital space they saw an impenetrable rainforest because the only digital learning programs
available to them were the lengthy, asynchronous e-learning SCORM courses stored on the LMS.
Anytime they wanted to refer back to something important they thought they remembered seeing, they
had to manually page through screen after screen to find it.
As a global organization doing business in 120 countries and 34 languages, we knew we needed to
take a broad-based approach to understand our learners throughout the organization. By interviewing
and conducting focus groups with team members and leaders worldwide, we were able to create a picture
of what today’s Hilti learner wanted to experience. Our modern learners wanted easy access to open-
source content from anywhere, even a construction site. They wanted to access content from their phones
or tablets, be able to collaborate with peers and experts, and have their team leaders see that they were
actively engaging in professional development. They believed that open-source access to learning content
would help them engage Hilti customers, leading to their own and the organization’s continued success.
And they did not just want informal learning; they also wanted formal learning they could access from
anywhere. This way, they could benefit without missing out on potential business opportunities (Figure
20-2). To reframe this using our jungle metaphor, they wanted a path cut through the jungle, with easily
visible and clear signposts showing how to get to various points of interest.
Our journey began with this pictorial overview of what the learner wanted, and then we developed
key use cases for our primary personas (corporate team member, sales team member, corporate leader,
sales leader) to see how different learning platforms could accommodate the diverse needs of our global
organization. We had several big challenges. First, there was no common language—all materials needed
to be easily searchable and sortable in all 34 languages. Second, we had no way to sort people by sub-jobs
(such as trade-specific salespeople), and third, we had very little IT support. However, our users wanted
videos and PowerPoint slides from the L&D staff, and these personas helped our management team buy
in to the learning strategy and the need for change. Knowing we had done our due diligence in research
meant that the organization trusted us to take the next steps without oversight. We had created our first
path through the jungle for stakeholders to follow.


Figure 20-2. The Modern Learner

[Figure 20-2 places the learner of the future at the center, surrounded by the themes that emerged from the interviews: networked and digital learning on the go, the classroom, open content (yours + ours), an engaged customer and a job well done, connection to the team leader, m-learning and knowledge on the go, learning beyond the classroom, information on the move, and collaborative learning and sharing.]

Several vendors declined to submit proposals after looking at what we wanted to achieve and the
barriers we would need to overcome. They saw the overwhelming tangle of the jungle we were trying to
traverse and could not see a way to help us build the paths through. However, after doing further research
on learning systems, we realized that we needed a next generation learning environment (NGLE) to
accomplish our goals instead of the more traditional learning management system (LMS).
We based our decision on research from the Fosway Group—a leading provider of research, analysis,
and advice on HR, digital learning, and talent management—whose nine-box grid allowed us to see the
relative positions of various technologies. An NGLE makes it possible to create more individualized, agile,
and adaptable learning that can happen in the flow of work, which was exactly what our modern learners
wanted. NGLEs also allow for both formal and informal learning, with full visibility and transparency to
any Hilti team member. Most important, they allow for easy experience exchanges and knowledge sharing
and are available for nearly any language.
After working with our IT team to evaluate vendors, we had three finalists. We then brought in a
third-party consultant that knew Hilti and technology—GitWit Creative—to determine how difficult it
might be to integrate into our existing systems and network. With their support and technical knowledge,
we were able to determine which vendor would meet our needs, align with our existing ecosystem, and
fit the timeline.

Social Sharing and Social Learning Is Now the Approach


We realized that much of the jungle could be untangled by moving content from learning ownership to
the “Hilti brain” concept, in which social sharing and social learning were the approach. We could see
that determining the correct vendor to meet our needs was a significant step forward, but there were still
challenges in front of us. One of these challenges was who would create the content. We learned that our
employees who were seen as experts often struggled to get their own jobs done because they had become
de facto coaches helping others. Given this, we knew that Hilti team members trusted what they heard
from peers over what they received from learning and development or even their own boss. We were
concerned that people were worried about putting something online that could be wrong—it was seen as
dangerous to advancing their career, as well as dangerous if they accidentally provided incorrect informa-
tion. While we joked about the “career limiting move” or “career killing move,” there was an underlying
realization that transparency is not always seen as a positive. Creating an environment that made one
“Hilti brain”—with lots of people from all around the world contributing—was a great concept, but it
was going to be hard to implement.
Fortunately, the experts who were often answering the same question many times bought in to sharing
their experiences. We would just need to provide more help so they could share their knowledge safely
and in a way that would be of greatest help to the team members who needed it. The desire was
to create a platform where they could put the answers and people could easily find them, which would
increase their productivity and improve their engagement with their job. It also allowed them to be recog-
nized as the experts that their peers saw them as.
Team members who wanted to learn from those who were doing the same or similar jobs also bought
in to the idea of having a single source to go to for support. They appreciated having just-in-time access
from any device, and loved having real-world examples from people they knew.
The biggest challenge related to who would create content was twofold. We had to gain manage-
ment buy-in that this was not dangerous (by convincing them that if incorrect information was shared,
we would have transparency and could correct it) and convince L&D staff that releasing ownership of
content would make their jobs more interesting and engaging.
To support our journey, we worked with several external consultants to help our L&D staff articulate
the future vision more concretely. Julian Stodd, a thought leader on co-creating or crowdsourcing content,
helped us see possibilities to expand our reach and meet our learner needs with internal experts. The
70:20:10 Institute helped us move beyond formal learning. Fuse Consultants, the vendor for our NGLE,
helped us design a platform that would support our goals to democratize learning while simultaneously
creating business impact. We brought in ansrsource to help us develop our 70:20:10 solutions in creative
and sophisticated ways and supplement our lack of knowledge and experience. We also found early adopt-
ers within our organization who were willing to jump in, try and iterate within the system, and eventually
succeed. Working with Sea Salt Learning, we identified key cultural barriers so we could develop methods
to overcome them. We applauded the iterations that business units made and shifted our focus to sharing
these lessons learned among departments and teams so we could make new mistakes instead of repeating
old ones.
In year 1, we evaluated engagement with the platform and quality of content provided, and saw a
slight upward trend. Some areas were engaging with the platform, while others created spaces for experts
to share knowledge. Some areas were not engaging with the platform; sometimes that was a specific
department within an otherwise active organization. So, we stepped back to see who was taking advantage
and find out why. We used system data to find out what types of content were best received, and added
optional surveys to various pieces of content and our team mailbox for anonymous input. We also did
focus groups to speak with random team members. We asked those responsible for our regional learning
to bring us their insights.
Some things were easy to solve, such as creating more how-to videos and tips, or even running small
contests to encourage people to try the system, even if they lacked confidence. We set an example by doing
raw selfie videos on various topics and challenging others to do one on something they knew. Slowly we
saw a shift.
By year 2, we had more than 2,500 people actively and repeatedly uploading content to the social
platform, and more than 650 people were voluntarily managing more than 250 learning communities.
The majority of our learning communities were not led or directly supported by L&D; they were a core
component of how the business area developed its people to perform in roles.
Once the momentum shifted, we also began to see powerful growth in engagement. Instead of visit-
ing once or twice to complete compliance, we saw users entering the system at all times of the day,
including weekends. They were accessing content from a range of mobile devices, starting conversations,
and directing others toward helpful content. We even saw people jumping in to correct erroneous content
before an administrator even saw it. Communities formed and came together for shared success. The
future is bright—with the right base of technology and the right changes to support it from a cultural
perspective, our Hilti brain is becoming a very powerful asset.

Build Gamification and Learning Into Daily Business


Because we encounter gamification elements in many aspects of our daily lives, when our learning
programs don’t incorporate gamification, they seem less useful, less engaging, and less interesting. With
attention spans decreasing and the content complexity increasing, we had a prime opportunity to intro-
duce gaming elements to our professional learning content. With a young workforce that was spending,
on average, 22 months in a role before changing positions, we also had a receptive group of learners.
They frequently needed to learn new things due to the rapid pace of change in our business, the overall
shift from a product focus to services and software, and their own movement through the organization.
Because we needed highly adaptable learning content and wanted to have as much content as possible
generated by local experts and peer leaders, we needed to take a simple approach to gamification.
After two years of implementing gamification, these are some of our examples:
• Leaderboards within communities on our NGLE. When we first started this, we did not
explain how people got points, but rather let them experiment to see what affected their points,
much like a video game would.
• Use of simple avatars or photos within learner profiles to encourage users to put their own
custom mark on the “game” of learning.
• Introduction of design principles within digital communities, such as carousels to enable
interaction and individualization.
• Automatic assignment to job-relevant communities.
• Responsive formal learning online courses that adapted to the device being used.
• HTML coding to ensure the user was personally greeted when landing on a learning topic.
• Real-world scenarios incorporated into formal and informal learning moments.
• Contests for learning developers to compete for best designed page or community.
• Recognition of most viewed or most engaged with user-created content.
• Contests for users for things like best-rated product demonstration video.
• Gates or phases to break formal learning programs into smaller parts so learners could see
progress on a day-by-day basis.
Some of these techniques require more knowledge of L&D or more access to the background of
the learning technology system than others. Luckily this was an area where some members had insights
because of today’s gamified world. By creating things that could be duplicated, such as the carousels,
contests, real-world scenarios, and gated solutions, we set a positive example of the direction we were
headed. This enabled us to provide easy-to-duplicate methods anyone could use. We learned that this was
key—we had to go first and pave the way, find a few early adopters who could influence larger populations,
and be sure to provide easy-to-duplicate methods to allow others to join us on the journey. Just think of
the viral social media challenges, like the ice bucket challenge, or YouTube videos of people dancing—the
first person has to be willing to look like an idiot dumping a bucket of ice water on their heads or doing a
weird dance move, but if all goes well, others will join in more willingly.

Follow Technology Trends to Build Immersive Learning Environments


Finally, we have started to look into emerging trends. Like venturing into the jungle, this was scary ground.
We had little to no functional knowledge in these areas and few realistic strategies for how to use them in
our learning ecosystem.


Our first real entry was AI. We are not experts in technology, but we understand who our learners are
and what their core business is. So, we worked with an external partner to create AI-generated role-play
scenarios for our sales teams. We started here for two reasons: First, our sales teams make up 60 percent
of our overall company. Second, the initial entry point from a cost perspective was relatively low when
compared with reach and scale. We built custom stories and content and worked with the vendor to ensure
quality scenarios. Everything was looking great. We then piloted the solution to end users and strategic
stakeholders. The initial feedback was positive, and the excitement for a "shiny" new way to produce role-play
scenarios was contagious. And yet we failed.
We failed because we left out one key population—the local and regional learning functions that
would need to implement and understand how to use, coach, and facilitate this new methodology. We
had become a disruptive force and had not prepared ourselves for the disruption. This was a huge
lesson for us.
During our journey, we realized that we had to partner with leaders in learning and technology.
We evolved several of our partnerships with organizations like ansrsource, Sea Salt Learning, Fuse
Universal, and 70:20:10 Institute. These relationships not only help us move the needle in learning
and technology, but they also challenge us to stretch the boundaries of our thought processes and what
we can become. Much of our relationship is built upon the inspiration we get from them, so when we
were approached to write this chapter, it made sense to reach out to them for their inspiration and
insights.
Regarding critical advancements in learning technology, Rajiv Nayarana, president and CEO of
ansrsource, says, “Technology and pedagogy have advanced alongside one another, and we’re now seeing
an increased focus on enabling technologies. That is, we’re witnessing organizations thinking beyond
manual and physical approaches and what technology is available. Instead, the focus includes blue sky
thinking that imagines how the best approaches for human thriving can be unleashed.”
This focus means that L&D can attempt things we never dreamed possible—things related to what
Nayarana refers to as the four major digital learning disruptions:
• Communications. Digital communications enable more timely and efficient delivery and
collaboration in learning.
• Platforms. The advent of learning platforms allows us to take a more centralized approach
to learning management and performance tracking.
• Mass customization. Adaptive and personalized systems allow students to affordably move
at their own pace and with their own preferences.
• Networking. Technological sophistication provides more authentic social engagement and
user-driven discovery and experiences.
Steve Dineen, chief storyteller and CEO of Fuse Universal, agrees that we have incredible opportu-
nities ahead of us: "From an innovation perspective, we're undoubtedly in a golden age," he says. "Tech-
nology is driving new ways of thinking. We can reframe what we do in learning because it allows us to do
things we could never do before. We can reach everyone who has a mobile phone in their hand and it’s
all about how we harness the new technologies coming to market—be that VR, data analytics, or AI—to
accelerate even quicker to our goal.
“It’s exciting to see how technology is able to bottle and codify the greatness of the best people
inside and outside an organization,” he continues. “In the world of corporate learning, organi-
zations would only be able to focus on delivering training to the top 10 percent. Now technology
enables 100 percent of an organization to be developed. Education is becoming democratized.”
Dineen goes on to remind us that “the attractiveness of technology was to solve some of the
problems of classroom training—around reach, scale, and distribution. In the first generation we
got the scale part right. In the second generation, we’re getting the learning aspects right.”
Julian Stodd, author and captain at Sea Salt Learning, also sees the community aspect and
related collaboration as a key development. However, he cautions that we may allow the lure of
technology to distract us. Because he is first and foremost a storyteller, Stodd sees learning tech-
nology of all kinds as a facilitating and enabling tool for story delivery. With this mindset, he finds
the democratization and fragmentation of technology as one of the most exciting pieces of the land-
scape today. The move toward a more diverse ecosystem permits, believes Stodd, “individuals and
communities to take control of their conversations, to own and shape their ‘sense making’ spaces,
and allow new technological innovations to flourish." He recently completed a research study with
the United Kingdom’s National Health Service (NHS) in which people shared which technologies
they found effective on a daily basis. Only one of the technologies was actually owned by the NHS.
In today’s world, people will find and use what works for them, not necessarily what the company
they work for provides.
While organizations are often talking about learning technology such as big data, analytics, or
AI (when what is really meant is machine learning), the conversations are more driven by a realiza-
tion that social technologies are more engaging than what the company is currently offering. This
is often hiding the fact that we are simply delivering really bad learning. Stodd says that instead of
focusing on “more of what we need, when we need it, facilitated by lightweight and rapidly dispos-
able technologies,” we distract ourselves by discussing the trends. As L&D professionals, we need to
be aware of what possibilities exist, but keep our focus on seeing these as tools that deliver learning
opportunities and affect our organization’s culture and performance.
Where these leaders disagree is related to where our focus needs to go next. Stodd’s team at Sea
Salt Learning helped Hilti move from a permission-based, controlling mindset where learning was
completely formal and much was mandatory to today’s environment, where people learn on the go
and perform at higher levels. He wants that experience for everyone and advises L&D leaders to “be
curious, learn to experiment, and be a humble, social leader. Put as much effort into disposing of the
old as you do into adopting the new. Be an evangelist, but ensure that a deep fairness and inclusion
sits at the heart of your work. Recognize that engagement is earned, not demanded. Recognize that
technology connects, but it's trust and pride that form communities. Be bold."
Nayarana believes the advent of artificial intelligence and machine learning will be the biggest
next disruptors. “AI is in a nascent stage in the learning industry,” he says, “but that is about to
change. Automation made possible by AI is disrupting work as machines are taking on many tasks
previously performed by humans. This is uprooting people from jobs. Because of the low cost asso-
ciated with proliferating algorithms, this trend will accelerate and in the very near future we will
see this technology further upset our already fragile social structures. But there is a silver lining.
An age of automation means that people can focus their skills on those things that are uniquely
human.”
Dineen believes we must first change our mindset and then think about what we can do to
determine what is critical to our future success in L&D: “Accepting what got us here is unlikely
to be the thing that provides the next evolution or revolution—to accept that there are new ways
of thinking and maybe throwing away some of the old. One of those big thinking pieces I see as a
trend is the move from thinking that learning output is success, to believing that business outcome
is success.
“The biggest question that comes from that is the challenge around how we measure the value
and impact,” he continues. “The big enabler toward that is having rich data and analytics that give
you instant insights, as well as the ability to interpret that data far more easily. This way, we’re able
to see when things don’t work and then say, ‘Well, they don’t work—let’s stop doing them.’”
Dineen advises us to find people who can help change the mindset within the team. “There are
some great thought leaders out there—people like Charles Jennings in the 70:20:10 Institute,” he
explains. “It’s a case of busting through the cobwebs, breaking that old way of thinking, and getting
people to start thinking about designing differently.”
First, he says, “it’s important that this is not just done at the top levels of learning—and learn-
ing leadership—but all the way through to the actual trainers on the ground. You might find that
there are different people you’d bring in to help you. . . . It’s really how you help your team transition
and get the new mindset, thinking, skills, and capabilities.” Second, he says, you need a few core
strategic ideas to use the new methodology, the new way of thinking. Ideally, you’d want to find
business owners or stakeholders who are progressive and open minded. That might come from the
launch of a new product or strategic communication—as long as you can pick out someone great in
the business who will let you do something new.
Dineen’s third piece of advice is to mix and match your team. “It’s great to have learning
experts, but we’re in a new world, and I think you want to bring in some other talent on top of that
from other industries such as marketing," he says. "Martech is having big impact; they are a couple
years ahead in some of the things they’re doing, and likewise, one of the things that I did years ago
was bring in graduates from film school. They weren’t trained in instructional design but they were
trained to think about narrative—beginning, middle, end, and stories, and so forth. That was a way
to kind of upskill from an internal perspective.
“Too often in L&D we are not bold,” Dineen continues. “We hesitate, we analyze, we consider, but
we do not move. Indecision sets in. We get comfortable, especially after we have made some big shifts.
We need to always be looking for the next challenge. Technology is there to facilitate change, but it is no
substitute for the culture that is being created in your organization on a daily basis.”

Summary
In most cases, technology surrounds us daily. If we think about it, we know what we like, what we
respond to, and what we engage with. We may not know how to do those things, but if they exist, it is
simply a matter of finding the right partners to work with. These could be thought leaders, challengers,
or research organizations. They could be technology geeks who make your visions come to life. Realize
that with the right partners, venturing into the ever-changing frontier of learning technology is an excit-
ing journey. You will face some unknowns and likely create new frontiers to cross for your organization.
Learning technology is changing, and so are your HR systems, logistics systems, planning systems,
financial systems, and sales systems. Digitalization is here to stay, so embrace it. And while it looks and
feels like a jungle when you first step into the area, with collaboration and inspiration you will soon start
seeing the order and beauty of the jungle.

Key Takeaways
• We failed to implement a new methodology because we left out one key population—the local and
regional learning functions that would need to implement and understand how to use, coach, and
facilitate it.
• A critical realization was that we always had to go first and pave the way, find a few early adopters
who could influence larger populations, and be sure to provide easy-to-duplicate methods to allow
others to join us on the journey.

Questions for Reflection and Further Action


1. What is the current state of your digital strategy and how might some of our lessons jump-start your
jungle adventure?

2. While you might not be able to hire vendors or thought leaders in the learning and technology
space, how might you learn from and be inspired by them?


3. How do you learn about the ways your colleagues in other industries like to learn?

4. How are you using technology to advance your personal learning?

21
The Role of L&D
in the Digital Age
Brandon Carson

The digital age is reshaping every aspect of business and leaving no industry untouched as its
three forces—technology, globalization, and demographic shifts—have brought the largest-scale
job transition since the Industrial Revolution. According to McKinsey, by 2030 as many as 375
million workers, or roughly 14 percent of the global workforce, may need to switch jobs (Bughin
et al. 2019). This massive shift means companies are going to have to reimagine their learning
strategies, and it also requires the learning and development (L&D) organization to assess how
it meets the changing needs of the workforce and the business. Learning leaders now face more
pressure to evolve their core practices to ensure L&D is a critical partner and, in some instances,
a leader in the face of this transition.

The L&D Shift


For L&D, the broad challenge lies in how to support the workforce in managing the increasing
complexity of work as businesses rapidly adopt and integrate new technology, overhaul internal
processes, and realign job architectures. Like all shifts, we are currently in a period of discomfort
as the shock of the change settles in and requires more from L&D. Instead of operating as an
order-taking support group, a headcount cost center, a “drag on the business,” or, in some instances,
a near-irrelevant policies and procedures provider, we must reorient and reinvent our strategy. The
digital age calls for a paradigm shift in how L&D functions, what skill sets it possesses, and even
where it’s placed in the organizational structure if we are to rise to the true needs of the business.
I’ve been in corporate learning for 25 years and, like many tenured learning professionals, during that
time I have witnessed transformations in how workers communicate, collaborate, and get their work done.
I’ve provided training solutions across modalities and leveraged new technologies to deliver training—
from the classroom, Laserdisc, and computer-based training (CBT), to the Internet and almost everything
in between. I’ve heard others in the industry discuss the potential “death of the instructional designer” as
new technology enables people who aren’t classically trained in human performance to create their own
learning solutions. I’ve been involved in creating sales training, leadership development training, techni-
cal training, and functional training. I’ve helped create learning programs for as few as 25 people and as
many as 400,000. And, I’ve been involved in creating learning strategies and implementing them across
both small and large enterprises. Throughout my career, I’ve found that three fundamentals always affect
workplace performance:
• New technologies. Technology-driven change is always compelling, but it also presents
challenges in the context of the work environment. It’s important for L&D to have a good
understanding of emerging technology and how it can best integrate with the company’s
existing infrastructure. More and more, L&D will need to be an engaged partner in the
discovery and implementation of technology in the workplace and need the skill sets to
determine if and when to adopt technology for learning.
• New business practices. Understanding that every business at scale is under pressure
to perform at higher and higher rates of productivity is a critical element in creating a
learning strategy. L&D should draw a straight line between training, performance, and the
organization’s bottom line using data to pinpoint where the intersection of the three is the
greatest. Having a solid understanding of the financial impact of bad and good performance
and being able to show the correlation to training is necessary when building the business case
for a training initiative. To show credibility as a business partner, leveraging data for proposed
solutions that show how learners and the business will benefit from them is also necessary.
For example, L&D can provide data-informed evidence of performance impact on business
metrics, predictive data analysis to determine which employee behaviors best drive customer
satisfaction, and identification of granular gaps in the knowledge, skills, and abilities that
positively (or negatively) affect operational effectiveness.
• Worker transformation. The business world requires more and more high-quality work
from every level of worker. L&D needs to ensure it’s tackling the core issues of human (and
machine) performance in the workplace. The big question to keep on the wall is, “How are we
continuously driving a culture of performance?” With every worker at every level now faced
with a more complex work environment, it's critical that L&D's imperatives revolve around
data-informed learning solutions, balancing skills and knowledge development, and linking
learning to business key performance indicators (KPIs).
Although change is a constant, the digital age brings new workforce models and platforms that are
quickly changing our understanding of what work is, as well as expectations the business has of its work-
ers and what they need to know to achieve these business goals. This isn’t a new paradigm, but it means
that businesses will rely more on their learning function than ever before to ensure they have a workforce
that’s able to perform. For those of us in L&D, this means we must be the most agile line of business in
the enterprise, because this evolved business need fundamentally requires us to focus on the conditions
that affect performance, as well as the performance itself. We must continually reskill ourselves and stay
closely aligned to changes in the business. But we also need to understand that human performance isn’t
just about the business informing us of what it needs its workers to be able to do. This alignment means
helping the business understand the broader concept of what performance is in the context of what needs
to be done and how best to achieve the outcomes needed. In other words, L&D needs to be looked at as
a consultative part of the business, where we provide expertise and perspective on what dynamics drive
outcomes and what realistic gains can be made within the constraints of the system in which the work is
performed. How does environment, learning, employee wellness, and human performance combine to
drive those outcomes?
Alignment is not just L&D taking the order for more training from business units; it lies in fundamen-
tally re-establishing the dynamics of the partnership, as well as a willingness to shift how L&D operates and
potentially where it sits in the organizational structure. For 300 years, we’ve had this view of education and
how it should happen, and in today’s schools it looks much like it did long ago. We have a similar malaise
in business, where many corporate learning functions are often structured as a support organization, away
from where the work is performed; in some situations, they are even forgotten or not involved in the conver-
sations early enough to drive true impact. Over time, we’ve built walls around our operation and siloed
ourselves away from the workforce, leading to a lack of an integrated approach to solving performance
problems. As a result, we now require more resources and more time to create solutions with dubious
impact. I believe we are nearing an inflection point and need to ask ourselves some basic questions:
• Does L&D need to fundamentally rethink its role and the methods that support that role as a
capability function focused squarely on performance improvement?
• Should L&D move closer to the senior leadership reporting structure? Maybe even report
directly to the CEO?
• Should L&D be given more responsibility in driving workplace transformation (instead of
just responding) by making recommendations, and even policy, for how to leverage data and
technology to improve workplace performance?
• Should L&D be more involved in the design of the work systems and embedding learning into
those systems?


Ultimately, L&D’s responsibility is to the worker performing the job correlated to the desired
business outcomes. Today’s worker must be capable of executing on rapidly changing business needs,
which means we must nurture more agility within our own L&D operation to help an agile work-
force properly adapt. To begin formulating a strategy to enable this new responsibility, look to three
imperatives:
• Build a digital mindset. The future of work is digital, and the “big three” technology
areas—AI, the cloud, and data science—will continue to transform how we deliver products
and services to customers. Focusing on how to build a digital mindset in the workforce is
essential. From now on, every worker in every industry completing every task will interact with
technology to get their work done. A digital mindset helps the workforce build the skills it needs
using a personalized digital development plan for every worker.
• Target the right development area and invest. The painfully slow and archaic process
of talent acquisition is causing businesses to lose traction in finding the best talent. Companies
need the best people at every level in their organization. They need people who are innovative,
persistent, and courageous. We need to stop the mindless, irrelevant, and often months-long
tyranny of dissecting each candidate’s life history. If they’re good, you need to get them into
your company. You need the right talent, but you need to move quicker in identifying them and
bringing them on board. It’s beyond time for L&D to take a strong role in partnering with HR
to strategically align talent development to the areas that result in the highest business impact.
It’s important for L&D to recognize their capability to lead given the constraints in their
organizational design. Job architectures must be redesigned to allow for new business processes,
tasks must be identified that move from human to intelligent machinery, and job design must
be revamped with more focus on leadership, critical thinking, and innovation at the top of
the list. Companies are investing heavily in technology and updated infrastructure, but are
flat or negative in training investment. This is going to be very costly. More than 50 percent
of America’s workforce is aged 50 or older, and more than 10,000 of them are retiring every
day, which will result in a skills deficit across our companies. We have to significantly increase
investment in continuous acquisition and skilling of the workforce.
• Keep a sharp focus on the human element. As every aspect of work life is transformed,
L&D (in partnership with HR) has a unique role to play in ensuring that the humanity of
work is preserved, recognized, and rewarded. Building deeper relationships, empathy, and
healthier habits that help the workforce be their best is truly a key aspect of the future of
work. Concerted efforts in well-being (including physical and mental), ergonomics, and
environmental design of the workplace will pay off in a more motivated, healthy, and
committed workforce. We won't be able to protect jobs made redundant by technology, but we
do have a responsibility to design optimal work environments for the workforce.

To position L&D to be most effective as the digital age unfolds, we need to keep our focus on these
three key areas to ensure we help drive performance for the consistently evolving workplace.
This next section presents a case study in which L&D reoriented itself in the organizational structure
and led a program at scale designed to not only improve workplace performance, but also to move the
organization from a transaction-based training mindset to a learning culture mindset.

Mobile Learning: Innovation at Scale


In today’s world, almost all workers are mobile. Constant connectivity has sparked a revolution in processes
and agility, as well as rapid change in how business is conducted. Workers have rapidly integrated mobile
devices into their work, and are heavily reliant upon them for communicating, connecting, finding infor-
mation, and sharing. Ready access to the information that helps workers perform their jobs has brought a
massive productivity shift to the workplace. At its most basic, mobile is a confluence of devices, people, and connectivity with
the ability to give a deep reach into the learner’s context. The impact of real-time knowledge consumption
and the inherent social utility of mobile devices provide a great opportunity for business to fundamentally
change the conversation when it comes to meeting customer needs.
In just the last several years, mobile technology has transformed the retail world. Customers use
mobile devices before, during, and after transacting with retailers to ensure their product and service
needs are met. They are becoming accustomed to receiving hyper-individualized attention as they make
buying decisions, and they’re using mobile to stay informed and aware. In many ways, retail customers are
entering the buying journey with more information than the workers who serve them have. Even though
workers have overwhelmingly adopted mobile, using mobile devices for learning is still in its infancy in
many large enterprises.
Earlier in my career, I joined a large U.S. big-box retailer that was interested in moving its training
from a transactional model to one that would provide on-demand product knowledge in the flow of work.
The company had a mix of part-time and full-time workers and a highly complex retail environment with
a strong focus on customer service. The bulk of its training was provided at the start of employment
and consisted of approximately 30 to 40 hours of selling skills and product knowledge training.
Because a large percentage of the workforce were not subject matter experts, it was important to have
reliable and credible information available at their fingertips when working with customers.
The idea was to embed learning directly into the flow of the work itself and make it available during
the interaction with the customer. We were hoping to achieve two outcomes: Provide real-time informa-
tion about complex products and services to a worker who may not possess deep knowledge of the subject
area, and ensure the customer’s needs were met.
We piloted a mobile performance support platform accessible from corporate devices or the employ-
ee’s personal device to provide quick access to relevant information about top-selling products and popu-
lar projects.

In this context, mobile learning added meaningful information at the moment of need instead of
delivering traditional product training away from the aisle, where the employee actually experiences the product.
Before the mobile learning pilot, observations showed that employees learned by using their own devices
to search product reviews and ratings, check inventory, and locate products in other stores.
A key reason this retailer placed a device in the hands of every employee was its recognition that
learning happens anywhere and anytime, and that most employees learn how to do their jobs while
on the job. The ability to move about while remaining connected to information that helps the employee
perform is the essence of learning embedded in the workflow. The employee equipped with a mobile
device is similar to the employee equipped with a pair of gloves and a measuring tape. It’s an essential
utility to extend their knowledge so they can continue working with customers, instead of going off the
floor and sitting in front of a computer for e-learning. Furthermore, customer research showed that the
number 1 customer desire in a retail shopping interaction was to receive quick and knowledgeable help
from an employee.
The company decided to create a proof of concept using a small amount of content and limited
functionality. The development team applied design thinking principles by going into a store to design a
series of conceptual mock ups. This enabled the team to get immediate feedback from the target audi-
ence. By ideating a proof of concept with the employee audience involved, the team quickly established
a user-centric design pattern, shaving a considerable amount of time from the typical design process.
For example, developers considered search functionality to be a core component of the user experience.
They quickly discerned that most employees would not use the search feature; instead, they wanted more
guidance through a menu system (consider the resource savings of not engineering a functionality that
wouldn't be used). Additionally, because the content itself was developed in design thinking mode, the
prototype tested not only the effectiveness of the content but also how employees accessed and used it.
Usability and content together form the mobile experience, and the prototype was designed to test both.
Once the prototype was built, it was delivered to a subset of stores. The primary use case for the
prototype was to observe employees using it in the context of their jobs, gauge reaction, and analyze
behavior patterns. The team also focused on customer reaction to employees using the app while working.
A key component in rolling out a mobile learning experience such as this is to identify how its usage in the
workplace affects not only employee behavior but also customer behavior. It was also important to receive
feedback from store leaders to discover whether the use of devices in the aisle affected customer service.
The app was well received during the prototype, which ran in the stores for 90 days; however, access
and usage of the app over time suffered because of device limitations—notifications were turned off and
the app icon was not easily accessible on the mobile devices.
The team took the feedback from the prototype phase and iterated the design. The content was
optimized to be more succinct and provide clearer guidance on helping the customer make purchasing
decisions. The team also integrated game mechanics to provide a more interactive and engaging
learning experience. With the addition of guided scavenger hunts and rapid-fire knowledge checks,
the employee could leverage the app to not only help customers but also become more fluent in the
work environment. The game mechanics assessed the employee’s product knowledge and, using reward
mechanisms such as points, badges, and leaderboards, provided a competitive and challenging learning
opportunity.
The pilot was conducted over a 90-day period in 78 of the retailer’s stores. A control group was
established to benchmark against the pilot to help the team determine a return on learning effective-
ness. Metrics such as app dwell time (the time spent in the app itself), motivation to complete game
activities, and proficiency based on assessment responses were analyzed to determine the pilot’s
success.
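
To make that comparison concrete, here is a minimal, hypothetical sketch of how pilot and control results might be compared on metrics like the ones named above. The record structure and numbers are invented for illustration; they are not the retailer's actual data or tooling.

```python
# Hypothetical illustration only: compare pilot and control stores on the kinds of
# metrics described above (dwell time, game activity, assessment proficiency).
# All field names and values are invented; this is not the retailer's data.
from statistics import mean

usage_log = [
    {"group": "pilot",   "store": 101, "dwell_min": 6.2, "activities": 4, "score": 82},
    {"group": "pilot",   "store": 102, "dwell_min": 4.8, "activities": 3, "score": 76},
    {"group": "control", "store": 201, "dwell_min": 0.0, "activities": 0, "score": 64},
    {"group": "control", "store": 202, "dwell_min": 0.0, "activities": 0, "score": 70},
]

def group_average(records, group, metric):
    """Average one metric across all stores in a group."""
    values = [r[metric] for r in records if r["group"] == group]
    return mean(values)

for metric in ("dwell_min", "activities", "score"):
    pilot = group_average(usage_log, "pilot", metric)
    control = group_average(usage_log, "control", metric)
    print(f"{metric}: pilot {pilot:.1f} vs. control {control:.1f} ({pilot - control:+.1f})")
```

In practice, the pilot team would also account for store-level differences (size, traffic, region) before attributing any gap to the app itself.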
For a retailer of such a large scale as this, it was important to prototype and pilot a learning expe-
rience that would potentially transform how it delivered content before launching a systemwide rollout.
Iterating through the design and development experience by connecting closely with the target audience
allowed the team to reduce its reliance on assumptions and deliver to actual audience needs. Another key
aspect of determining whether mobile would work was to recognize the change management necessary
to facilitate learning in the aisle. From the employee’s perspective, the leadership perspective, and the
customer perspective, it was important to fully understand whether (and how) the use of devices for learning
in the work stream would have a detrimental effect on the business. The team found that, quite to the
contrary, customers, leaders, and employees welcomed the experience into the environment because at its
core, it was designed to assist.
Mobile learning can demonstrate tangible ROI for a learning organization. Even if your workforce
is not mobile in the sense of moving around, they constantly use mobile devices to acquire information
and learn. Having a learning strategy that incorporates mobile prepares you to accommodate your learn-
ers across the channels they feel most comfortable with and to which they have access. You should never
incorporate emerging technology just because you can, and you should remain skeptical of emerging
trends. However, mobile technology has completely reshaped business and continues to deliver value in
how people get work done. Consider these guiding principles when thinking about making the move
to mobile learning:
• Place learning opportunities where they matter most for the workforce: at the point of need,
where it’s more relevant and engaging.
• Accelerate the creation of learning content and increase the speed of access to keep your
learners informed with up-to-date information.
• Stop the information fire hose and provide just the right amount at the right time.
• Untether your workforce from “dedicated training computers” and make information available
anywhere, anytime they need it.

• Leverage geolocation capabilities, internal sensors, text messaging, supportive notifications,
collaborative learning, and smaller chunks of content inherent to mobile to redefine learning
experiences.
• Move from a focus on only formal learning to incorporating opportunities for self-directed
learning where the employee has more control.
This pilot led to another challenging decision for the company: Build the platform technology inter-
nally, or buy a platform through a software as a service (SaaS) model? Allocating internal resources to
build a platform would place the company in the business of developing software. However, it would also
enable the company to customize the experience to their specific needs. Buying or licensing a platform
would require the company to adapt its mobile learning initiative to the constraints of the licensed system,
but it would also provide speed to market and place the expertise for software development on the vendor
instead of the business.
An initiative such as this expands the traditional focus of the L&D organization to become more
of a cross-functional partner. For example, to succeed at designing and developing the mobile resource
outlined in the case study, the L&D team had to work with retail operations, the IT organization, the
mobile team (which included consumer-facing teams), employee engagement, and marketing. Each of
these lines of business had input into the overall solution and how it would be deployed. Because it was
not being delivered through the learning management system, which was owned by L&D, there was much
more scrutiny. It also required L&D leaders to bring forward business plans for the learning solution and
collaborate with the various functions to ensure the solution had the necessary support and would show
evidence of its value.

Summary
With the world of work under constant change and the business relying more and more on upskilling
and reskilling its workforce, successful L&D operations must now balance their legacy operations with
more proactive delivery of timely and relevant solutions to their audiences. To be successful, L&D needs
to operate more like a business, balancing the learning needs of the workforce with business needs
(including costs), time constraints, technology challenges, company culture, and interdependent systems
and business operations. Today's learning solutions require much more than just good design. They require
flexible, adaptable approaches that present learning in the ways the workforce can best consume and
interact with it in context.

Key Takeaways
• The digital age is reshaping every aspect of business and leaving no industry untouched as its three forces (technology, globalization, and demographic shifts) have brought the largest-scale job transition since the Industrial Revolution.
• L&D needs to be looked at as a consultative part of the business, where we provide expertise and a perspective on what dynamics drive outcomes and what realistic gains can be made within the constraints of the system in which the work is performed.
• The digital age is requiring more and more high-quality work from high-performing humans, and we in L&D must take a leadership role in formulating the right workforce performance strategies.

Questions for Reflection and Further Action


1. Are we tackling the right issues for the digital age?

2. Does L&D need to rebrand as a capability enablement function focused squarely on performance
improvement?

3. Should L&D move closer to the senior leadership reporting structure? Maybe even report directly to
the CEO?

4. Should L&D be given more responsibility in driving (instead of just responding to) workplace
transformation by making recommendations and even policy for how to leverage data and
technology to improve workplace performance?

5. Should L&D be more involved in the design of the work systems and be involved in embedding
learning into those systems?

Section 7
Innovation

This section is included in the book because of innovation's role in enabling organizations to stay competitive, but also because innovation does not happen consistently and pervasively without the leader serving as a model and catalyst. It is up to leadership to make sure innovation is part of the culture. Additionally, given L&D's role as an expert in building organizational capabilities, learning functions need to lead by example and push all leaders to think differently about the way they develop and provide products and services to their people.
But what is innovation, and how does it happen? The Internet is chockablock with definitions ranging from being curious and challenging assumptions, to inventing new things, to experiencing creative friction, to generating ideas. Some even say innovation is using associational thinking to mix concepts, which results in something new and different. Others say innovation is not about the ideas that bubble up, but the actions taken to convert those ideas into something of value to customers.
Most of the Forum's members practice some form of innovation—but it looks different for each of them. In chapter 22, the Accenture team of Dana Alan Koch, Michelle Webb, and Tanya Gibson bring innovative processes to life by making research and innovation a priority. They share this research process, which starts with clarity on purpose and what they refer to as the "great question." Their research is conducted by a cross-functional team that has the right skills and strengths and uses a practical research methodology, including collaborating with external research partners and other internal innovation teams. In sharing their practice, they use one of the suggested communication tools: storytelling.
In chapter 23, Graham Johnston of Deloitte provides a variety of ways they not only "keep up" with innovation, but make it one of their core values. These processes enable the learning team to address the future of work and the distinct needs and preferences of the modern learner. For innovation to be pervasive, organizations must have a strong learning culture where learning is viewed as an enabler for individual, team, and organizational performance. The daily practices must drive engagement, connections, knowledge sharing, and collaboration across the entire organization. Human-centered design is one of the tools they have integrated.
Ann Quadagno and Catherine Rickelman of IBM share in chapter 24 a variety of tools and techniques they use for innovation to keep pace with changes happening in the business. They recognized early on that there are many ways to be innovative, but their big takeaways include doing your research, ensuring a tight connection to what the business needs, and continually getting more agile in how you develop and deploy your learning content. They provide their ideas and lessons learned about microlearning, bundling content, and design thinking.
22
Business Impact Through Learning, Research, and Innovation
Dana Alan Koch, Michelle M. Webb, and Tanya Gilson

Prologue
Your learning organization is humming along fine. Employees are gaining new skills, and business leaders
acknowledge learning impact. Of course, there are daily challenges, but you’ve got things under control and you
have a great staff. Yet, you are still uneasy about the future. Your company is increasing its interest in adopting new
technologies such as artificial intelligence, blockchain, data analytics, and immersive technologies, but you don’t
understand how these technologies could be used to drive learning impact. You’re also aware that learner mindsets
are changing, and brain science is revealing more about how people learn. To continue to successfully lead your
team, you believe you need to create a future-focused learning organization that thrives on change and innovation.
But where do you begin?

At Accenture, our learning team has been working at “getting better at getting better” for several years.
We’ve always had individuals with an innovative mindset who would innovate in the context of their
individual projects or learning programs. They would help us get better, but their efforts were often done
in isolation and on a shoestring budget. As the value of learning innovation became apparent to our lead-
ership, we saw an increased appetite to focus resources on research and innovation. A second inflection
point came when HR leadership began asking for “science” to inform programs and decisions. This led
to the creation of resources focused on bringing innovation out of the shadows, demonstrating leadership
commitment to innovation, driving science-based solutions, facilitating cross-pollination of innovative
ideas, enabling connections with other innovators in the company, and accelerating innovation overall. As
a result of this journey, the Talent Research and Innovation team was created. We are a small team with
a large impact—our research is driving significant positive change with our learning teams, our recruiting
efforts, and our performance achievement approach.
In this chapter, we will share how we have mobilized learning research and innovation to create a
compelling and effective future-focused learning and leadership development organization. As our team
has evolved, we have focused on several areas to drive success, which we discuss here—making research
and innovation a priority, defining a research and innovation purpose, engaging a cross-functional team,
having the right skills and strengths on the team, using a practical research methodology, innovating in an
agile way, engaging the learner in the innovation process, collaborating with external research partners
and internal innovation teams, and being intentional about communicating and sharing our work.

Making Research and Innovation a Priority


Whether you are a learning team of one or 1,000, unless research and innovation are a priority for your
business, they will never happen. There must be universal agreement that dedicated time for research and
innovation will drive business value and is worth the necessary resources. You may need to start small,
and that is okay. Even when innovation is small, it can lead to larger opportunities. By being thoughtful
and structured in how you approach innovation, you allow yourself the space to make new connections
between concepts and ideas and then create new business value.

“The everyday vortex of work will keep you from moving ahead with research and innovation unless it is a priority
to you and leadership.” —Allison Horn, Global Talent Lead, Accenture

While innovation can happen in pockets, to be transformational it must be part of a cohesive
strategy adopted at all levels of the business. Because this is uncharted territory, everyone must be
on board. An essential and early step is to get agreement on what innovation is and what it is not.
A simple definition of innovation is the creation and implementation of a new feature, capability,
product, or service leading to positive significant change. For businesses, it can also lead to competi-
tive advantage. For government agencies and nonprofits, it can lead to efficiency and satisfaction of
those served. Both organizational and learning leadership need to champion and believe in the value
of research and innovation by providing resources and giving permission to make mistakes and learn
from failure. Learning teams must act as both influencers and champions. Help leadership see the
value of research and innovation by starting small, having clear goals, and measuring success; then,
as you develop solid science-based and data-driven approaches, you can demonstrate the impact to
leadership. In addition, we regularly get learner feedback on our innovations and share that with
leadership to help with buy-in. Learners can provide essential input to research and give reality
checks to experimentation. For example, before revising the search engine and learner recommen-
dations approach in our LMS, we interviewed several employees. Through these interviews we came
to understand that it was not just about improving the LMS but improving overall access to learning
resources wherever they may be found. This created a fundamental shift in our vision of providing
learning resources to our employees, and it was driven by the voice of the learner.
Once research and innovation are seen as a business priority, it is critical that these efforts are prior-
itized to focus on business imperatives that drive value. Research and innovation should solve the big
questions that the business faces in the context of the organization’s vision, strategy, and priorities. Regu-
lar prioritization discussions between leadership and the research and innovation team will help ensure
energy is being exerted in the areas that are seen as most timely and influential.

Actions to Take
Try these actions to make research and innovation a priority:
• Stay deeply knowledgeable about the business for today and tomorrow by creating
opportunities for the learning team to share with business partners.
• Have prioritization discussions with the business to focus research and innovation work.
• Ensure that leadership provides resources (budget and dedicated time) and gives permission to
make mistakes.
• Plan to capture the voice of the learner in research and innovation efforts.

Defining Your Research and Innovation Purpose


A second part of making research and innovation a priority is to have a clearly defined responsibility or
charge. This is not only a team vision—it also outlines the scope of what you take on and what you don’t.
Business leadership should be part of defining your purpose because they are the primary beneficiary of
the value you bring.
Our research and innovation team are tasked with:
• Research. We conduct primary and secondary research on talent topics to ensure we
are grounded in science. We use our vast network of employees as well as global external
connections to gain knowledge and insight through interviews, surveys, focus groups, and other
human-centered design techniques. If you are not equipped to conduct primary research
internally, there are many external resources outlined in this chapter that can help with
primary research. Secondary research can easily be done by anyone through open-source
portals such as Google Scholar or a carefully curated news feed.
• Experimentation. We team with internal and external partners to evolve new concepts
and technologies as we innovate and solve for talent-related challenges. We assist with rapid
prototyping and piloting of new ideas, designs, or technologies based on our research.

• Innovation adoption. Following experimentation, we transition new innovations to other
teams, freeing us up to continue with new experiments or research. We continue to consult as
needed to help ensure the greatest impact during adoption.
• Research partnerships. Our team maintains relationships with professional associations,
research partners, and academic partners.
• Brand and reputation. Our team is exposed to some of the greatest innovations in talent;
we learn how an idea was born, how it was developed, and the impact it created. We have
the responsibility to share compelling stories internally so other teams can learn from and
adopt the innovation. We also share successes externally at conferences and through award
submissions and thereby contribute to the industry. This also becomes a source of meaningful
feedback on how our research and innovations resonate beyond our organization.
This is a lot of responsibility, but it is easy to see the connection between these different areas of our
team’s charge. Research will lead to innovation, which then needs to be tested or piloted. Successful pilots
lead to broader adoption and value. This journey from idea to business value becomes the backbone of
a story to share.
You will need to determine which subset of these responsibilities makes sense to include in your charge.
Factors such as team size, resources available, business intent for research and innovation, and responsibili-
ties of other teams in your organization can help determine your scope.

Actions to Take
These actions can help to define your research and innovation process:
• Align around a common, shared goal, keeping the learner at the center both by re-engaging
learners throughout the research and experimentation process and by asking for their input.
• Determine a clear and agreed-upon scope of responsibility and ensure it is well communicated.

Engaging a Cross-Functional Team


Diversity of perspectives and skills will help broaden research and innovation efforts and avoid siloed
thinking. It will also likely lead to healthy debate, questioning, and additional curiosity. A cross-functional
team may include some combination of dedicated staff, reviewers, stakeholders, and partners. These
teams will bring different perspectives and unique skills while working on a common purpose. Success-
ful cross-functional team members are those who bring their unique perspective, remove bureaucratic
boundaries and hierarchies, nurture a collaborative and high-trust environment, and maintain a laser
focus on increasing the speed to impact. It is important to keep these qualities in mind when considering
whom to include on a cross-functional team.
We are fortunate to have a dedicated cross-functional team working on a wide range of talent-related
research and innovation projects. We also leverage our network to create temporary agile teams when
needed. Typically, these include a cross-functional team of individuals from outside the learning and talent
development team who bring fresh perspectives and help avoid re-creating solutions that others have
already tried. Cross-functional players could come from:
• the business
• the field
• the sales team
• tech strategy
• learning delivery
• vendors
• the diversity team
• the recruiting team
• the onboarding team
• the performance team.
By working on research and innovation projects across a variety of talent areas (such as performance,
learning, onboarding, recruiting, and often HR), we make connections that would not be apparent in a
siloed organization. Here are two examples:
• Our talent research and innovation team’s learning experts collaborated with Accenture’s
onboarding team to create a unique onboarding game that eliminated almost all PowerPoint
slides. This new approach is highly engaging, instructive, and fun, and has sparked rave reviews
around the globe.
• Our work with Immersion Neuroscience began as a tool to improve immersion in our learning
experiences through unobtrusive measurement using a Fitbit-like wristband. It’s now being
used to better understand feedback conversations (Gerard 2019).
Within these cross-functional teams, we always ground ourselves in the big problem that we are
working to solve and define what success looks like. To quote Lewis Carroll, “If you don’t know where
you are going, any road can take you there.” By clearly identifying what success looks like and how it will
be measured, we’ve outlined the framework we’ll execute against. These cross-functional relationships
become key to making connections that deepen the impact of research and innovation.
The lesson here is that research and innovation are most successful when the team can see the chal-
lenges and innovations of other related talent teams or individuals. Small companies may institute a
research and innovation consortium where there is a set cadence to get together and share projects, chal-
lenges, tools, and lessons learned. You could gain broad visibility through an advisory group or by how you
define your testing or interview population.

Actions to Take
Try these actions when engaging with a cross-functional team:

• Identify and engage individuals with diverse skill sets and discipline knowledge who will
contribute to the shared purpose of a cross-functional team.
• Have a clear understanding of the strengths each individual brings and leverage those strengths
with intent.
• Cultivate an environment of psychological safety where team members can openly share and
fail safely.
• Stay relevant with the business by directly involving stakeholders in the research process.

Having the Right Skills and Strengths on a Team


Having the right set of skills working on research and innovation efforts is key to driving business value.
Effective team members could have expertise in performance, learning, organizational psychology,
behavioral psychology, cognitive science, interaction design, design thinking, visual design, measurement,
and agile processes. These skills can be fostered, built, or borrowed depending on your organization.
“Curiosity, passion, and a deep understanding of business priorities are also essential,” says Accenture
Global Talent Lead Allison Horn. “Curiosity drives better questions; passion drives an ongoing quest
for new insights and information, and being grounded in the realities of the business keeps the focus on
meaningful impact—not just activity.”
Our research and innovation team members have a learner mindset, are collaborative at ideating, are
constantly curious, and are strategic thinkers. We also understand the business and are good at relation-
ship building. We are positive, creative, and results driven. These strengths often combine to create deep
levels of impact. Let’s look at the strengths of learner mindset, strategic thinker, and relationship builder:
• learner mindset—having a deep desire to learn and continuously improve
• strategic thinker—quickly spotting relevant patterns and issues to create alternative ways to
proceed
• relationship builder—enjoying close relationships with others to achieve a goal.
The combination of these skills means our team is constantly learning, thinking strategically about
the business, and collaboratively working with others—a powerful mix indeed.
As individual team members, we also challenge ourselves to have an innovation mindset. We foster
this mindset by reading broadly, openly collaborating, making new connections, cultivating curiosity,
bypassing conventional wisdom, and reflecting often. By reading broadly, we look at the same or similar
topics from different angles or disciplines. As we openly collaborate and discuss ideas, we make new connections
between what is known and the new ideas that are inspired by reading broadly. Reflecting gives our brains
the space to let the ideas “percolate” and the time to revisit them in the context of the problems we are
trying to solve. We approach each new area of focus with open minds and bypass conventional wisdom as we
investigate and create insights. Curiosity becomes the engine of inquiry and often leads to things we never
before supposed.

No one team member can embody this complete innovation mindset, but collectively we can encour-
age and inspire one another by regularly sharing ideas, books, and articles we find inspiring, and fostering
a culture of open feedback.
For areas where we don’t have the right skill or the right depth of skill, we partner with our innovation
architecture network. This is a structured network of people, technology, and locations focused on bring-
ing innovation to life by digging deeper into business challenges, delivering disruptive innovations, and
scaling them faster. The innovation architecture is made up of six distinct groups (Figure 22-1) that imag-
ine and invent the future. You can learn more about Accenture’s innovation architecture on our website.

Figure 22-1. Accenture Innovation Architecture

One example of how we partnered with the innovation architecture network is our recent research
using artificial intelligence (AI). Our core team was trying to understand the future of learning pathways,
but we did not have the AI knowledge to conduct experiments to test our research findings and advance
innovation. However, by partnering with our Accenture Labs team, which has deep AI expertise, we were
able to combine our research insights with the technology to capitalize on those insights.
We recognize that many organizations will not have access to this type of robust internal innovation
architecture, but we share it as an example of the depth and breadth you may want to consider in estab-
lishing your own network. As you build your own innovation network you may want to start by attending
conferences with a diverse set of topics and presenters. Consider expanding your network to include
universities where you could engage postgraduate students as researchers. Your goal should be to connect
with people who will challenge your thinking in ways you haven’t previously considered and grow your
innovation network.

Actions to Take
Use these actions to ensure your team has the right skills and strengths:
• Invest time to understand the skills and strengths you and your team bring.

• Assess your network for groups or teams that can help inform your thinking and potentially
partner on projects and intentionally grow that network.
• Encourage your team to read broadly, openly collaborate, make new connections, cultivate
curiosity, and reflect often.

Having a Practical Research Methodology


To be successful with research and innovation, it is essential to have a methodology to execute against.
Our methodology focuses on solving for the great questions that allow us to discover, inspire, and
grow talent for today and tomorrow (Figure 22-2). We follow a science-based, agile, collaborative, and
measurable approach.

Figure 22-2. Talent Innovation Methodology

Start With a Great Question


We often say of our methodology, “It all begins with a great question.” Great questions focus on getting
to the essence of what will bring true talent innovation and result in improved experiences that drive
business impact.

“What I have learned is that when we need to transform or disrupt for the future, we must first start with a great
question, and have a strong research base upon which to build our innovative experiences.” —Shelby Kan, Talent
Research and Innovation Lead, Accenture

What constitutes a great question, and how do you create one? You must first think about the end goal
or vision; then work backward into framing a question that will lead you to that vision when answered.
To bring talent innovation, a great question typically won’t ask how to fix today’s problems or processes.
Rather, it focuses on how we should think or act differently to create the future. Here are a few examples:
• Initial question: “What will persuade employees to own their development?”
Great question: “What will empower employees to own their development?”
This reframing from persuade to empower helped us research the right ingredients to enable
ownership, including a culture that supports learning, technology infrastructure, and content
that is relevant, contextual, and compelling.
• Initial question: “How do we make our teams more effective?”
Great question: “How do we help teams achieve their best performance?”
The reframing put the focus on the big picture of what we wanted to enable teams to achieve
in support of the business strategy and culture, rather than a process or operational fix.

Research
The aim of talent innovation research is to generate insight that will serve as the base for experimenta-
tion and improved employee experiences. We learned that we achieve the best insight when we conduct
research and answer the great question through the lenses of science, the market, and the internal voice.
In the area of learning and development, scientific research includes brain science, learning
science, or pedagogy. Market research includes practices from other companies, lessons from society, or
studies of emerging technology. Internal voice research is understanding the perspectives and experi-
ences of your employees. The following are examples of the research questions you may ask to generate
insight in each area:
• Science. What do learning and brain science tell us about how we learn, interact, and grow
our knowledge or skills?
• The market. What do we see other companies doing to improve learning? How is technology
enabling great learning? What can we learn about learning from noncorporate sources such as
sports, the military, or government?
• Internal voice. What are the experiences of our people? What insights and pain points can
they share?
When conducting research, there is value in leveraging a broad range of sources, such as
academic publications, case studies, literature reviews, interviews, surveys, firsthand observations,
focus groups, and employee observational data. Diversity of research can come not just from the
research sources but also from the diversity of the team. Having a research team whose members bring
their own varied experiences, perspectives from the business, and cultural backgrounds is also key to
generating innovative insight.
Even if you are a team of one, try tapping into those who have an interest or aptitude for research
and may want to join your effort as agile team members. You can also draw on research partners from
a variety of sources, including universities, internal or external thought leaders, or third-party research
organizations.
When researching, it is important to have a growth mindset and be open to new concepts and unan-
ticipated results. Avoid limiting yourself to just looking at the best practices of other companies, what
you've done in the past, or conventional wisdom. If you are curious and refrain from forming opinions too
early in the process, you won't predetermine the outcome and will be more likely to have an innovative
breakthrough.
Once you have your findings, you can begin the process of deriving insight. There are many ways
to consolidate findings to identify the key themes. Often this involves an iterative process of gathering all
the individual insights (divergent thinking) and then consolidating them (convergent thinking) into core
themes. Design-thinking techniques such as affinity clustering are great tools for this, especially when you
have more than one researcher and need to bring together multiple points of view. Identifying the core
themes leads to uncovering the universal truths.
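
For teams that capture interview notes digitally, a rough computational stand-in for affinity clustering can help with the first convergent pass. The sketch below groups short insight statements into candidate themes with TF-IDF and k-means (scikit-learn); the statements, the cluster count, and the approach itself are illustrative assumptions, not part of the methodology described in this chapter.

```python
# Illustrative sketch: machine-assisted grouping of research insights into candidate
# themes, as a rough analogue to affinity clustering. Statements are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

insights = [
    "Learners want content they can reach from the sales floor",
    "Search is ignored when a clear menu is available",
    "Managers need proof that in-aisle learning helps customers",
    "Short videos are preferred over long courses on the floor",
    "Store leaders ask for evidence of customer impact",
    "Navigation should guide learners rather than rely on search",
]

vectors = TfidfVectorizer(stop_words="english").fit_transform(insights)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(vectors)

# Print each candidate theme so researchers can name it and test it against
# the original great question.
for theme in sorted(set(labels)):
    print(f"Candidate theme {theme}:")
    for insight, label in zip(insights, labels):
        if label == theme:
            print("  -", insight)
```

A human pass is still needed to name the themes and decide which of them rise to the level of universal truths.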

Universal Truths
The outcome of research should be the consolidated themes, insights, and fundamental principles that the
findings have revealed. These are the facts from the research that are true in any context and answer the
original great question asked. We call these our “universal truths.” They are the undeniable answers to the
research question we ask. The opportunity for innovation comes from anchoring to these fundamentals as
we design and execute to bring them to life in the business context.
For example, when we approached the transformation of performance management at Accenture,
we asked the question “How do we help the people of Accenture achieve great performance?” The
universal truths that emerged were that great performance happens when we:
• Bring the best of ourselves to our work (apply our strengths).
• Focus on the vital few priorities.
• Create engagement within our team.
• Discuss performance in the moment.
• Take forward-looking actions to grow.

Experiment and Measure


Experimentation allows you to test potential solutions that are grounded in your universal truths. This
involves the formation of a hypothesis, the identification of a measurement strategy, and the design and
execution of the experiment.
A hypothesis is a testable prediction of the outcome of your research. For example, the ques-
tion “How do we help teams achieve great performance?” led us to a hypothesis of “If teams spend
time learning as a team, then they will form more trust and grow together, leading to higher team
performance.”
In learning, one of the biggest challenges to overcome is the desire to solve for everything all at once.
Be careful of this, and avoid trying to test multiple hypotheses at the same time. An experiment should
clearly articulate what is being tested and what the intended outcome is, and it should resolve that outcome
before moving to the next experiment.

Measurement is at the core of experimentation, so it is important to narrow your focus to specific,
measurable items. You need to ask, “What factors may prove my hypothesis true?” This is what you will
measure. For example, in our hypothesis of “If teams spend time learning as a team, then they will form
more trust and grow together, leading to higher team performance,” we identified level of trust as a key
area to measure.
Your experiment design can come in many shapes and forms, but the most basic is called A/B
testing. In A/B testing, one group is the control group and the other is your test group, for which
you adjust a variable to identify and measure the impacts. Individuals are randomly assigned to the
groups and a pre-test is administered to both groups prior to the training program to assess a base-
line. The two groups are exposed to the same conditions in the experience except for the adjusted
variable. A post-test is then administered to both groups to identify the impact of the changed vari-
able on the target audience.
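
As a minimal sketch of the design just described, the snippet below randomly assigns participants, records a pre-test baseline, and compares average post-test gains between the control and test groups. The trust scores, the assumed effect size, and the group sizes are all invented for illustration.

```python
# Illustrative A/B skeleton: random assignment, pre-test baseline, post-test comparison.
# The "trust" scores and the assumed lift are invented numbers, not real results.
import random
from statistics import mean

random.seed(7)
participants = [f"employee_{i:02d}" for i in range(40)]
random.shuffle(participants)
control, test = participants[:20], participants[20:]

# Pre-test: baseline trust score for every participant before the experience.
pre = {p: random.gauss(60, 8) for p in participants}

def post_score(person, treated):
    """Post-test score; only the test group experiences the adjusted variable
    (team-based learning), modeled here as an assumed lift in trust."""
    lift = 9 if treated else 2
    return pre[person] + lift + random.gauss(0, 4)

control_gain = mean(post_score(p, False) - pre[p] for p in control)
test_gain = mean(post_score(p, True) - pre[p] for p in test)
print(f"Average trust gain -- control: {control_gain:.1f}, test: {test_gain:.1f}")
```

In practice, the difference between groups would also be checked for statistical significance before drawing conclusions from the experiment.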
The measurement findings from your experiments may prove, disprove, or identify ways in which you
need to reframe the hypothesis. If your outcomes don’t clearly prove your hypothesis, this is valuable as
well. We have found that giving ourselves the license to fail intelligently helps us accelerate our knowledge
and build better insights to shape our experiences. Successful experimentation leads to the design of great
solutions or experiences that are scalable in an organization.

Communication and Storytelling


Throughout the research and innovation process, it is important to validate your great question and exper-
iments with the business, and actively share your research insights and universal truths. Once experimen-
tation is complete, you can then look at the business implications and provide a scientific, research-based
recommendation for how to scale the solutions or experiences.
We have found that sharing insights and recommendations using a storytelling approach has the
most impact. Storytelling helps people understand the outcomes and impacts of the research and
experimentation in a relatable way, so leadership can see the possibilities that exist and be inspired
to act.

Actions to Take
These actions can help ensure you’re using a practical research methodology:
• Start with a great question that is broad enough to generate nonpredictable insight and
forward-looking thinking, while being specific enough to tactically research.
• Conduct research that includes scientific, market, and internal voice insights that can lead to
universal truths, which answer the original great question.
• Develop a sound hypothesis and measurement strategy at the outset (before experimentation).

Engaging the Learner in the Innovation Process
Engaging with learners as part of your research helps ensure you are solving problems with their needs,
wants, and insights in mind. You can then explore these against the business needs to understand what
gaps, friction points, and other barriers might exist. Learner involvement may seem obvious, but too often
it is not done. It may not happen because learners are too busy or inaccessible, or perhaps the project
timeline is slipping. Whatever the reason, do not proceed unless you engage with learners.
Our team gets the voice of the learner through interviews, focus groups, and surveys, and as partici-
pants in experiments. We engage them in prototyping and pilots. At Accenture, we are fortunate to have
a dedicated 3,000-square-foot research and innovation lab in our India Learning Center. This lab is in
the same physical space as our learning classrooms, giving us access to hundreds of participants when we
need learner input.
One thing is certain: If you don’t get learner input early and often, you will eventually get input when
you roll out an innovation or share your research. Learners often provide some of those “Duh! Why didn’t
we think of that?” moments.
Engaging learners will help ensure that research and innovation teams don’t have tunnel vision as
the team works to innovate. As learners engage, they have some degree of partnership in solving business
challenges and making a contribution. It is important to recognize learners’ contributions to research and
innovation through your company’s recognition channels.

Actions to Take
Try these actions to engage the learner in the innovation process:
• Ensure your research plan includes specific touchpoints to get the voice of the learner and
provides room for them to share their unique insights.
• Build your learner research skills by leveraging organizations such as IDEO and LUMA
Institute, which have strong human-centered design programs.
• Appropriately recognize learner contributions to your research and innovation efforts by giving
them an early preview of the solution or citing their insights as quotes in the research.

Innovating in an Agile Way


The agile approach of creating rapid incremental deliverables includes research sprints and building on
what you learn. A sprint is a time-bound activity that limits effort to a specific, short duration, which is
normally between two and six weeks. A sprint methodology is essential when you need a targeted effort
that will yield insight while keeping you from circling in research without reaching tangible answers.
Ultimately, these smaller steps lead to a more robust, larger solution with a higher likelihood of success
and acceptance. In other words, incremental innovation leads to larger, more impactful innovation in an
efficient way.

Our team experiments with early, small, and rapid prototypes. We then build on what we learn.
This approach helps keep us focused on the right things and ensures regular feedback leading to in-pro-
cess course correction. It also keeps us from becoming too committed and protective of our solution or
idea. For example, when we set out to redesign our onboarding program, we created storyboards and
low-fidelity videos (we were the actors and used smartphone video) to test the experience and impact
with our employees. Once the experience was refined, we increased the fidelity of the materials. As part
of this redesign, we introduced learning games, which we also tested with learners through mock ups
and sample game play. We did not get stalled by having perfect graphics or refined wording; instead,
we focused on rapidly getting the voice of the learner to help us evolve and iterate on our approach.
The result was an extremely popular and innovative approach to onboarding that is being rolled out
around the globe.
By not becoming too entrenched in a single approach, we free ourselves to swiftly fail forward to a
solution that is ultimately better for the program we are trying to address. Here is another example: Our
work on personalized learning began with a research sprint to define the landscape and identify unique
areas where experimentation could lead to business value. We followed this research with a simple mock
up to get learner reaction. Once we had the voice of the learner, we made adjustments and created a
limited-function prototype. That led to an enhanced prototype and eventually a small pilot. All these small
steps led to innovative approaches for using AI to provide more personalized learning recommendations
to all 485,000 of our employees.
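
The chapter does not describe the underlying model, but a simple way to picture personalized recommendations is matching a learner's skill-gap profile against content tags. The sketch below ranks a few invented catalog items by cosine similarity; the skills, assets, and weights are hypothetical, not Accenture's system.

```python
# Hypothetical sketch: rank learning assets for one learner by cosine similarity
# between the learner's skill-gap profile and each asset's topic weights.
from math import sqrt

skills = ["cloud", "data_science", "ai", "leadership"]

# Higher weight = larger development need for this learner (invented values).
learner_gap = {"cloud": 0.8, "data_science": 0.3, "ai": 0.6, "leadership": 0.1}

catalog = {
    "Intro to Cloud Architecture": {"cloud": 1.0, "ai": 0.1},
    "Responsible AI Foundations":  {"ai": 1.0, "data_science": 0.4},
    "Coaching Conversations":      {"leadership": 1.0},
}

def cosine(a, b):
    """Cosine similarity between two sparse skill-weight dictionaries."""
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in skills)
    norm_a = sqrt(sum(a.get(k, 0.0) ** 2 for k in skills))
    norm_b = sqrt(sum(b.get(k, 0.0) ** 2 for k in skills))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

for title in sorted(catalog, key=lambda t: cosine(learner_gap, catalog[t]), reverse=True):
    print(f"{cosine(learner_gap, catalog[title]):.2f}  {title}")
```

A production system would learn these weights from behavior and feedback rather than hand-coding them; the point here is only the shape of the matching step.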
Let’s look at some other examples of the types of research and innovation our team has engaged in
to drive business impact:
• Durable learning
° The great question: “How do we support learners so they experience deep, durable learning
that leads to lasting change?”
° To do this, we researched what makes learning stick through our academic partners, literature
reviews, and learning design pilots. Our learning approach is to create durable (or sticky)
learning for our people that is grounded in neuroscience and cognitive research.
• Extreme learner research
° The great question: “How do we empower lifelong learning in our employees?”
° We wanted to understand the motivations, habits, and any obstacles extreme learners face—
those individuals who seem to be wired to learn faster, more effectively, and in less time. This
insight allowed us to create a “better learner model,” which now empowers our employees to
learn more effectively.
• Physical design of learning
° The great question: “How do we design a physical space to be laser focused on providing an
optimal learning experience?”

° As we were building a new learning center in Bangalore, India, we began with research
on learning durability, Zen philosophy, and ancient Indian wisdom literature. We took the
insights gained and created an intelligent, purpose-driven design to create smart learners and
better learning experiences.
• Learning in the future
° The great question: “What will learning look like in two to five years?”
° The learning landscape has changed dramatically over the past several years. We now have
access to learning 24/7, the Internet has a seemingly unlimited amount of content for us to
consume, and we know that we need to be learning new skills faster than previously required.
In addition, new technologies are being introduced to facilitate learning, and artificial
intelligence is playing a larger role in learning. Rather than responding to current trends or
the latest shiny object, we looked to the future to effectively invest time and resources.
• Future of recruiting
° The great question: “How do we connect exceptional people to exceptional opportunities?”
° Because the digital economy and automation require new skills and role profiles, we decided to
study how we match people to roles considering potential, not just pedigree. Our great question
led us to insightful universal truths that will radically reshape our approach to recruiting.
• Blockchain and talent
° The great question: “How can we use blockchain to remove barriers and improve the
employee experience?”
° Transparency and trust in data access and usage is becoming a priority. However, we find that
the data we use causes more barriers than it removes. This, combined with an increased focus
on personalization, is requiring organizations to think differently about their data. Our great
question led to a set of key insights and collaboration across several different groups to get
those insights. Interest in this topic led us to additional research, and our journey continues.
We’ve also carried out projects related to immersive learning, compassionate leadership, swift trust,
psychological safety, forming habits, learning pathways, immersion neuroscience, and games for learning.
Each of these research projects has an associated great question and requires engaging learners to solve a
talent-related opportunity.

Actions to Take
Use these actions to innovate in an agile way:
• Do research in sprints—a short, structured, time period focused on a specific task or topic.
• Build rapid prototype experimentation into your broader plan by focusing on creating simple,
low-fidelity examples (such as a smartphone video versus a professionally produced video).
• Learn more about agile.

Collaborating With Research Partners
Research partners are experts at cultivating new and innovative practices from across the industry.
Some have a great breadth of knowledge across many talent areas; others are focused on a specific
area, such as learning, performance, or recruiting. There are also various cost models to consider;
some provide free research and trend information, but you must pay for unique research. Others
provide access only for a fee or with a contract. There are so many potential research partners in the
talent development field that it is important to find one that is the right fit for you. To determine the
right fit, consider:
• Does the potential partner have research and expertise in the areas of focus your learning
organization needs?
• Do they provide the type of content you are looking for, such as industry trends, case studies, or
consulting services?
• Do they match the culture of your organization? (Do they feel like “your kind of people” while
still challenging your thinking?)
• Are they affordable? (Note: The investment doesn’t have to be substantial to gain useful insights.
See, for example, Google Scholar, ResearchGate, or other open-source research groups.)
• Will working with the research partner help upskill your research and innovation team?
There are many advantages to engaging with research partners. You may find that someone else
has already answered the question you are trying to address. Or you may find that by engaging with
the research partner you develop a better hypothesis. The partner may help expand your professional
network, which may lead to more diverse input and ongoing idea exchanges.
At Accenture, we partner extensively internally, including the innovation architecture network
described in this chapter, and we also work with professional research and academic partners.

Professional Research Partners


Professional research partners help us monitor industry trends, keep current on technologies, and under-
stand social, legal, and economic influences on our profession. They are often great at finding and sharing
case studies. Often, we learn of another company that either has addressed or is looking to address a
similar problem to one we are currently exploring. Our research partners can connect us directly to repre-
sentatives from those case companies, and what ensues is typically a rich sharing of insights as well as a
sense of respect for other talent professionals.

Academic Partners
We maintain several partnerships with universities around the globe. We focus our relationships on areas
where we don’t have deep expertise internally. Universities are often excited to partner on research that
can be of mutual benefit—sometimes pro bono, other times for a fee. For example, we have partnered



with MIT on several brain science and learning experiments focused on how to improve retention in
video training. We brought the great question and MIT brought the neuroscience background to help us
determine how best to improve video training.

Actions to Take
Try these actions as you collaborate with research partners:
• When conducting research, begin by discovering what is already available by looking at
resources such as Google Scholar and ResearchGate.
• Identify a research partner that can add unique insights to your great question, such as
graduate universities or leading vendors.

Communicating and Sharing Our Work


Without communication, great innovations may never be adopted or drive impact. New insights
and discoveries are only beneficial if the business and others buy in to the innovation and begin
adopting new approaches. By communicating, storytelling, and connecting, you enable others to
see the value of the insights as well as be inspired by new ways of working. This can come in a
variety of formats and should be based on what works best within your organization. Look for
unique opportunities to get your message out, including formats such as magazines, podcasts, or
roadshows. Think broadly about the audiences your message needs to reach, as well as the most
effective channel for sharing it.
We have a variety of ways we communicate, including a regular cadence of virtual webinars, a
website that hosts our research findings and innovations, a quarterly digital innovation magazine, our
own internal video channel, a “persona” on our internal social media feed, and even a podcast called The
Learning Geeks (which is available on iTunes and Google Play).
We use innovative approaches to communicate our research; for example, we recently published
a series of eight four-by-six-inch Durable Learning cards. Each card provides a summary of a durable
learning principle, the research behind that principle, and ideas for how to apply that principle to
increase learning durability (stickiness) in a program. Teams and individuals are using these cards to
enhance their understanding of learning durability, review course designs through the lens of durability,
and communicate this research—and its importance—to sponsors and stakeholders.
Communication is such a vital part of our research and innovation activities that we have a social
media expert on our team who regularly shares our research and innovation on our internal social plat-
forms as well as external platforms such as LinkedIn.
Through effective communication we have been able to:
• Increase the rate of adoption of innovations.
• Inspire and build bridges with other teams.

• Create an informal network of cross-functional employees with an interest in helping with our
experimentation.
• Connect with others who are trying to solve the same problems.
• Build skills in telling the story of our research and innovation.
• Improve our research by having others probe and question it, which, in turn, informs our own
thinking.
• Disseminate our research broadly once it is complete.
• Build our brand.
Educating stakeholders, sponsors, and learning and talent professionals is a key aspect of the research
process. Through this communication, we find ourselves in the virtuous cycle of having our research and
innovation shape new questions or identify connections that we then get to further explore.

Actions to Take
These actions can help ensure you’re communicating and sharing your work:
• Design communication into your research and innovation plan so it’s easy for people to apply
the insights.
• Establish and grow a brand for your research and experimentation.
• When appropriate, establish a central repository where people can view your research and
innovation plans and collateral.
• Look for opportunities to publish your insights in professional journals or by presenting at
conferences.

Summary
Wherever you are as a learning organization, you can benefit from taking a research and innovation
focus. As L&D professionals, we are all too often consumed by the day-to-day mountains we are asked to
climb; however, we require dedicated time and resources to move our organizations forward to the future.
By working with the business to ask and investigate the right great question, we are ensuring that we are
focused on solving the correct problem at the right level. By researching, we explore a better tomor-
row and are better able to focus our investments in things that drive business impact. By experimenting in
an agile way and engaging with learners, we ensure that we are getting feedback early and often so we can
improve outcomes before we spend considerable time and budget.

“By grounding ourselves in research and innovation, we are able to move from opinions and conventional wisdom
to measurable, science-based answers for the big questions we struggle with as learning professionals.” —Allison
Horn, Global Talent Lead, Accenture



Here are just a few successes that have come as a result of our approach to talent research and
innovation:
• Impact on learners
° more engaged and informed learners
° personalized, AI-driven learning recommendations
° brain-friendly learning designs that leverage what we know about how the brain learns
• Impact on the organization
° an overhaul to our performance management approach, leading to performance
achievement
° a new, science-based approach to recruiting
° an improved, learner-centric onboarding experience
° evolving the skills of our talent and learning teams
• Impact on reputation
° winning several industry awards, including the CLO LearningElite Organization of the Year
in 2014 and 2018
° being in demand as presenters at conferences

Epilogue
For your first foray into learning research, you asked the great question: “What makes learning experiences brain
friendly?” The results led to great insights that you and your team now use to create more brain-friendly learning.
Learning teams and sponsors now consider brain science in the design and delivery of learning experiences,
making learning stick and increasing cost efficiency. You were excited when you heard one of your learners say,
“This was fantastic. Better than any training course I’ve ever taken.” You were even more excited when her
supervisor called and said, “I’m not sure what you did in that training, but I have seen a significant increase in my
team’s skills. Bravo!” Next on your prioritized research agenda is using artificial intelligence to improve learning
recommendations for your learners. For this you plan to find an external partner who will round out your skill set
with deep AI expertise. You now feel you have a path to the future. You have found, as a side benefit, that your
learning team is energized and excited about what lies ahead. The future is in good hands.

Key Takeaways
• Adopt a practical research methodology; it is central to successful research.
• Leverage external research organizations, which can save time and money—if they have the research
you are looking for.
• Start research by getting the great question right. This will lead to better data and actionable results.
• Keep the learner at the center of your research to drive more impactful results.
• Engage a cross-functional team in your research to ensure diversity of thought and perspectives.
• An effective communication strategy for research is critical to having an impact.

Questions for Reflection and Further Action


Wherever you start, begin with a great question. Start small; from small and simple
things, great things take place. Be constantly curious about the world around you and take
time to regularly reflect on what you are learning in your research. Most of all, enjoy the creative
and intellectual challenges of defining universal truths and inventing something new as you explore
uncharted territory.

1. What is the purpose of your research team, and how broad or narrow is your scope?

2. Who will use the research to produce results, and who will they produce results for?

3. Does the research you need already exist in the marketplace?

4. Does the team have the right skills mix to support your purpose and communication strategy?



23
Innovation in Learning
Graham Johnston

Innovation is part of the four-part construct that defines the value and impact that learning provides to
the business, and it’s one of the attributes of a high-performing learning function. But while we all may
agree that innovation is a good thing overall, we need to better understand why it is so important in the
learning industry, what it looks like for the learning function, and how we can create an environment
that encourages and enables it.

The Call for Innovation in Learning


Innovation is already being fueled at an unprecedented pace, largely by automation and the expo-
nential improvements across core technologies that are influencing—and will continue to influ-
ence—the work, the workforce, and the workplace. The future of work is being reimagined, and
this has significant implications for the future of learning. To sustain its value and impact in this
changing landscape, the learning function must not just keep up with innovation but make innovation
a core value and pioneer how we use it to enable and optimize development.
The half-life of many technology skills is shrinking, and new or enhanced skills are now more
in demand. Whereas deep domain expertise once enabled success, such skills quickly become
outdated and need to be refreshed. In addition, the future workforce still requires a blend
of professional and leadership skills for success, no matter how significant the shift toward auto-
mation and technology or the type of work they’re doing. Uniquely human capabilities—such as

resilience, divergent thinking, curiosity, and social and emotional intelligence—will be particularly
important, because they will allow professionals to manage ambiguity, develop creative solutions, stay
relevant amid constant change, acquire knowledge, and understand the implications for the future,
all while leading inclusively and empathetically. This, in turn, will help cultivate agile teams that can
be creative, adaptive, and excellent at problem solving, all while quickly pivoting in the ever-changing
landscape of work and technology. The professionals of the future must be nimble and flexible—able
to apply key capabilities across multiple domains throughout their careers.
How do we drive innovation in learning to address this future of work and the distinct needs and
preferences of the modern learner? For one, we need to be flexible in where and how learning occurs.
This also points to continuous, curated, and holistic learning content that is integrated into the flow
of work, where learning can occur on-demand and at the point-of-need. In addition, organizations
must have a strong learning culture, where learning is viewed as an enabler for individual, team, and
organizational performance, as well as a driver of engagement, connections, knowledge sharing, and
collaboration. Professionals should be able to own their career growth through meaningful learning
and development. Innovation in learning is critical for creating these learner-focused development
experiences that drive career progression and performance.

Creating a Culture of Learning Innovation


Bringing new and effective methods, tools, or solutions that better address learning issues, needs, and chal-
lenges requires that we first understand how innovation is defined. Merriam-Webster defines innovation as
“the introduction of something new, or a new idea, method, or device—a novelty.” At its core, innovation
is about creative problem solving and different ways to define and address those challenges, or—better
stated—those opportunities. Being innovative means that you need to approach things from a different
angle and perspective, and be willing to depart from the way something has always been done. Innova-
tion can include—but is not limited to—technology solutions; it is more about dissecting the challenge
or opportunity and then determining the solutions that optimize the associated experiences. Innovation
occurs when the manner in which the challenge or opportunity is framed drives the process for how the
solution is framed.
But we can’t just write “be innovative” in our daily planner and expect it to happen. Innovation isn’t
exactly an intrinsic skill, nor is it one that is necessarily taught. Rather, certain mindsets need to exist to
cultivate and foster innovation; we need to be:
• open-minded to issues, opportunities, and wild, innovative ideas
• empathetic to the learner’s situation and experience
• committed to collaboration and diversity of thought
• embracing of creativity
• willing to experiment with a bias toward action.

Innovation is anchored in convergent and then divergent thinking—which is different from typical
brainstorming—in that we first converge on the specific problem to solve and then diverge into the solution
exploration space, seeking radical ideas. Innovation also requires an almost relentless practice of asking
“why?” to get to the root of the challenge or opportunity. This is what yields the meaningful, sometimes
unexpected aha moments about the learner experience—the insights that lead to action. And that action
comes in the form of iterative processes—testing partial, imperfect prototypes to learn from, refine, and
test again. Finally, innovation is dependent on having the courage to think of and suggest radical ideas that
address the core problem. And that can’t happen if ideas are discussed and voted down before they are
even tested. If you have an idea or a solution, don’t over-deliberate or decide against it prematurely—be
willing to experiment!
This speaks to another important factor: the broader organizational environment for innovation.
You need people with the mindsets to foster innovative thinking, but you’ll also be helped by an orga-
nizational culture of permission and encouragement. For learning functions, it can be challenging
to innovate if the organization does not promote or enable it as part of the overall culture. A strict
process for making decisions or approving investments will stifle innovation, as will an environment
that doesn’t allow for experimentation. Many organizations have innovation as a core value, but they
need to back that up by clearing the path for creative problem solving so the learning function can
achieve value and impact.
However, there are ways to overcome a lack of organizational support for innovation if that’s what
you are facing. Innovation by definition includes small-scale ideating and prototyping. You don’t need
to employ an organization-wide rollout—instead, you should look for a single stakeholder group that’s
willing to test your solution. Then, leverage their success to build the momentum that can open the door
a bit wider.
When it comes down to it, the learning function needs either that license or the willingness to
ideate and experiment—the permission to try. And using the word try is intentional. We need permis-
sion to try, not permission to fail, because experimentation is what we’re going for—not failure. Think
of it this way: Baseball players aren’t encouraged to go strike out—they are expected to swing the bat
knowing that one of those times, they will make contact. And striking out swinging is always better
than striking out looking! Now, let’s explore one form of innovation that learning functions can employ:
human-centered design.

Human-Centered Design: Creating Development Experiences for and With the Learner
Learning solutions are typically planned, designed, and developed with the learner in mind—taking
into account learner capability needs and performance objectives. And subject matter experts represent
those learners by providing valuable input on the design. But designing for the learner is not nearly the



same as designing with the learner. Human-centered design (HCD) is a form of design thinking that
addresses this dual objective by putting the learner at the heart of the design process, and also serves as
a cornerstone for innovation.
HCD is a structured, iterative, and collaborative approach to creative problem solving that starts
with understanding the end users (our learners) and ends with solutions tailored to their specific needs. It
taps into skills often overlooked by more conventional problem-solving practices, such as our ability to be
intuitive, recognize patterns, and construct ideas that are emotionally meaningful as well as functional.
HCD drives innovation by elevating the voice of the customer to understand complex, human-centered
challenges and opportunities and address those by uncovering novel ideas. The approach is anchored in
advocacy for the end-user experience, which requires empathy, curiosity, and humility to generate new
and relevant solutions, focus on the desired impact, and mitigate the influence of bias and institutional
assumptions. Ultimately, HCD is a way of thinking that leads to a way of doing. Most important, it can
be learned! Let’s look closer at how you can use HCD to build innovative, end-user-focused, and effective
learning experiences.
The goal of HCD is to develop simple, unthought-of solutions using end-user data to make informed
decisions and embracing the innovation mindset. Counterintuitive for many, HCD flips the traditional
management model on its head; it starts with understanding end users and their experiences and defining
root causes rather than first defining operational goals and creating concepts or solutions based on them.
For learning, HCD focuses on creating an optimal development experience that addresses distinct learner
needs and preferences but is largely based on what the learners are sharing, not just what we are inferring.

The Human-Centered Design Process


There are a few different models for the HCD process, but they all reflect an amalgamation of the five
innovation mindsets and core design thinking action steps. The Discover-Define-Develop-Deliver model
(Figure 23-1) depicts a phased approach to identify, research, and define the problem; ideate and proto-
type solutions; and then test those to determine if and what to implement.

Figure 23-1. Discover-Define-Develop-Deliver Model

In the discover phase, we look at the challenge or opportunity from the learner’s perspective and then
gather information and inspiration to further define it. Empathy-based research techniques are important
for observing the situation and users to reveal root causes, uncover areas for improvement, and point to
potential solutions. We can use three key methods to build an empathetic understanding of end users:
observing them, engaging with them, and immersing ourselves in their experiences. By putting ourselves
into their world, we best understand what our learners do, feel, think, and say.
In the define phase, we combine research, observations, and insights to identify the real learner
needs. Areas for improvement or change are highlighted when we share findings and stories, synthe-
size insights to make sense of the data, and find patterns and meaning in the data. From this we
develop the “How might we . . . ?” question, which attempts to pinpoint an area for growth or change
that also keeps the door open to other possibilities. “How might we?” statements are a critical part
of the HCD process because they:
• Articulate the end state or outcomes we are trying to achieve.
• Provide guidance and strategic direction for future ideas.
• Provide focus and creative freedom for future brainstorming.
In addition, answering “How might we . . . ?” along with “so we can . . . ” is a great way to reframe
the problem in the define phase. It becomes the anchor point for the HCD process, and should be
regularly revisited as solutions are identified, developed, tested, and implemented to stay true to the
challenge or opportunity being addressed.
Next, in the develop phase we ideate and prototype creative, innovative solutions to consider and
further evolve. Using brainstorming and iteration to build and refine models, we generate ideas, refine
those ideas, and generate prototypes. We use the “How might we. . . ?” question to come up with creative
ideas that keep us grounded in the user’s need. Ideation is essentially iterative brainstorming, allowing
ideas to build off one another and giving the group the freedom to explore new possibilities. However,
this type of ideation is different from typical brainstorming because we focus first on converging around
the specific problem we need to solve, and then diverge into solution exploration, seeking radical ideas. The
process needs to be focused, deferring judgment to generate a large quantity of ideas with structured,
time-bound ideation methods. Prototyping is about producing an early, inexpensive, and scaled-down
version of the solution. It might be the whole solution, or simply core pieces to test for proof of concept.
Prototyping provides an opportunity to bring ideas to life, test the design, and explore how the end users
think and feel about the solution.
Finally, in the deliver phase, we shift from the divergent thinking that generated many ideas to
convergence: narrowing the proposed solutions to one or a few cohesive concepts, developing and
evolving those through prototyping and testing, and then moving into implementation. When we
have ideas, we have a tendency to talk about them, debate the pros and cons, and decide to move forward
based on what we think will or won’t work. But as designers, we need to avoid this process and instead



move swiftly from an idea to a concept model, building out ideas so we can learn from them. This is where
that mindset around a bias toward action comes in—developing those partial and imperfect solutions to
make the “rough but ready” prototypes. Then we iteratively test for feedback and measure outcomes,
refine the solution accordingly, and develop a compelling case to move forward. Ultimately, the solutions
we end up with fall at the intersection of user desirability (do they want this), technical feasibility (can we
do this), and business viability (should we do this).

HCD in a Learning Context


In an HCD project, this process is repeated at scale, synthesizing large volumes of data to pull together
our end-user experiences, frustrations, delights, and needs. The following scenario presents an example of
how HCD can be applied to develop or transform a development experience.
Let’s imagine that the learning function seeks to redesign the development experience for newly hired
or promoted managers across the entire organization. This is being undertaken because the existing new
manager experience is:
• delivered inconsistently across business units, locations, and geographies
• mostly focused on a live learning program that occurs just after the promotion cycle
• not aligned against a common set of standards, expectations, capabilities, and roles
• not effectively supporting the manager’s transition into the new role.
These issues are drawn primarily from learner feedback and reveal the need and opportunity to have
new managers help shape what an improved development experience should entail. Accordingly, the
learning team decides to use HCD to drive the transformation effort.
Starting with the discover phase, the learning team conducts a series of “voice of the customer work-
shops” to gather insights from a representative sample of end users and define the challenge and oppor-
tunity from their perspective. The end users in these focus group discussions include recently hired or
promoted managers who went through the current new manager development experience and rising
managers who will likely go through it in the future. Key questions the team asked of those who went
through the experience include:
• How did you feel during and following the experience, and how do you wish you felt?
• What content were you offered and what do you wish you were offered?
• What worked and what didn’t work regarding the delivery method and timing as it
pertains to your transition to being a new manager?
• What were you prepared for as a new manager and what do you wish you were better
prepared for?
The team also asks rising managers in the group about their expectations and for a wish list for
their upcoming development experience.
The learning team and the voice of the customer group synthesize the data collected from these

questions to jointly create new manager personas (fictional characters to represent types of end users and
new manager profiles), proposed journey maps depicting interactions and moments of need, and ideas
and prototypes for possible solutions. At this time, the learning team also conducts internal and external
research on similar development experiences within and outside the organization, as well as on other
factors informing new manager needs and preferences and enabling development solutions (future of
work, future of learning, modern learner, and so forth).
In the define phase, the learning team further synthesizes the voice of the customer session data as
well as research from other sources to find patterns and meaning that generate insights and highlight
areas for improvement or change. The challenge and opportunity are reframed by developing the “How
might we. . . ?” statement: “How might we create a consistent, innovative, and effective development
experience for all new managers that enables the transition into their new role?” This also serves as the
guiding principle for the ideation, prototyping, testing, and refinement that follows in the next phase.
In the develop phase, the learning team digs deeper into the personas, journey maps, ideas, and
prototypes identified by the voice of the customer group to build out solutions to prototype and further
evolve. This is where the learner-centric perspective can really fuel innovative design. For example,
what content can be pulled out of the live learning program and delivered via other means or at a
different point in the new manager transition to make it more accessible at the point of need? Or how
could a more holistic development experience be offered—one that complements the live learning
program with curated content, on-the-job development, and networking, mentoring, and coaching—to
facilitate the transition to manager? And finally, what innovative and creative solutions not previously
offered are they inspired to create, based on how they want the new managers to feel coming out of this
experience? The learning team ideates through iterative brainstorming to identify the possible solutions
in that new manager journey, and then defines prototypes of the solutions they wish to test.
In the deliver phase, the learning team tests the development experience solution prototypes,
measuring their impact and effectiveness, and shares their story to determine the way forward for those
solutions. They deploy each prototype against a subset of the new manager population, using a control
group for comparison to determine the effectiveness of each solution. That measurement effort is criti-
cal for understanding how human centered the solutions end up being, based on learner feedback from
surveys, interviews, and focus groups.
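To make that measurement step concrete, here is a minimal sketch—written in Python with entirely hypothetical survey scores, group sizes, and decision margin, not drawn from any actual program described here—of how a team might compare post-experience feedback from a prototype group against a control group before deciding whether a solution warrants further refinement:

# Minimal sketch: compare learner feedback for a prototype group vs. a control group.
# All data, labels, and the decision margin below are hypothetical illustrations.
from statistics import mean, stdev

# Post-experience survey scores (1-5 scale), one entry per respondent
pilot_scores = [4.5, 4.0, 4.8, 3.9, 4.6, 4.2]    # went through the prototype experience
control_scores = [3.2, 3.8, 3.5, 3.0, 3.6, 3.4]  # went through the existing experience

def summarize(label, scores):
    print(f"{label}: n={len(scores)}, mean={mean(scores):.2f}, sd={stdev(scores):.2f}")

summarize("Prototype", pilot_scores)
summarize("Control", control_scores)

# A simple decision rule; in practice this sits alongside interviews and focus groups.
MARGIN = 0.5
difference = mean(pilot_scores) - mean(control_scores)
if difference >= MARGIN:
    print(f"Prototype outperforms control by {difference:.2f}; refine and build the case to implement.")
else:
    print(f"Difference of {difference:.2f} is below the {MARGIN} margin; iterate on the design.")

Quantitative comparisons like this are only one input; the qualitative feedback described above carries equal weight in deciding which prototypes move forward.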
The team shares the results of this testing with new manager experience sponsors and stakeholders,
and together they make decisions around which solutions to drop, refine, or implement as-is for the trans-
formed development experience.

Summary
Organizations are evolving to keep up with the future of work, and the learning function is responsible
for driving the future of learning. Being aligned with and enabling the business, driving learning that



improves performance, and staying efficient are all essential attributes of a high-performing learning func-
tion. But that alone is not enough—we must also be innovative in bringing creative problem solving to
define, design, develop, and deploy learning solutions.
Human-centered design is one approach that can accelerate innovation in learning—because we
define and understand the challenge or opportunity from the lens of the learners and put them at the
center of the process. Empathic research on the learners, a deep look into root causes, and an itera-
tive process for ideation, prototyping, and testing solutions serve as a framework for innovation. While
HCD isn’t the only way to be innovative, focusing on understanding the human experience to deter-
mine user-centric solutions can facilitate our transition from training to development and from instruc-
tional designers to experience architects, and can also increase the learning function’s value and impact
in enabling performance.

Key Takeaways
• The call for increased innovation in learning is being driven by the changing nature of work
and what the workforce needs to know and be able to do. Learning functions need to be more
nimble, responsive, and creative in building a rapidly changing set of capabilities.
• Organizational permission and encouragement for innovation are important, but the learning
function needs at least to be willing to innovate and to find a stakeholder partner for ideation
and prototyping.
• Human-centered design is just one form of innovation in learning. It puts the learner at the
heart of the experience design process and enables learning to be built for and with the learner.

Questions for Reflection and Further Action


1. When you consider your learning challenges and opportunities, where could you benefit from a
more thorough examination of the learner experience from their perspective?

2. How might you use this better understanding of the user—along with collaboration, creativity, open-
mindedness, and a willingness to experiment—to yield creative ideas that will improve learning?

3. How could this be an opportunity to test more holistic learning solutions that extend the
development experience, further integrate learning into the work, and meet the needs and
preferences of your learners?

24
Changing Times:
Innovate for Impact
Ann Quadagno and Catherine Rickelman

“100 percent of jobs will change because of technology. Skills will be the number-one issue of our time.”
—Ginni Rometty, former Chief Executive Officer, IBM

Learning organizations are being challenged more than ever to respond to changing business needs,
rapid upskilling requirements, shifts in the markets, and learner expectations. In a recent C-suite
study from IBM’s Institute for Business Value, 65 percent of CEOs reported people skills as having
a strong impact on their business, and people skills were also named one of the top three factors
affecting the organization (Ikeda, Zaharchuk, and Marshall 2019). Other studies indicate that
constant upskilling and reskilling will be the norm for the future.
All these challenges influence IBM’s learning approaches and programs. As a learning team, we
need to meet business needs through skill development, job performance, talent retention, and improved
engagement. How connected are you to your company’s business goals and strategies and upskilling
requirements? Are you aware of the changing expectations of your learners and the innovative approaches
the learning industry has to offer?
Imagine you have a sales learning program that consistently has high satisfaction scores and proven
business impact. All is right with the world, correct? Well, not quite. The business is telling you that

salespeople need to be in the field and on quota faster. You also need to reduce the travel costs associated
with the learning. Now what? This is sales, which is not a topic that does well with e-learning. They need
practice, so you’ll need to think innovatively.
We did just that for our Global Sales School program. Using many of the innovative approaches
shared in this chapter, we were able to reduce training time, travel time, and budget, while still maintaining
the same level of learning impact.
However, there’s one thing you need to keep in mind—every shiny object is not necessarily the right
shiny object for you. It’s great to try new things; just make sure you’re trying them in a safe space so you
can see if they work for you, your organization, and the specific project.

Use Research to Ensure Your Innovation Has Impact


You have a great idea, an innovative idea in fact. But how can you be sure it is the right idea? Your gut?
Your experience? Your peers say so? That’ll give you a good start, but you also need to validate your ideas
through research.
Maybe you’re looking for an innovative way to implement a learning solution worldwide or the next
“thing” that will drive the right business impact. Innovations are being created and implemented all the
time, and you can find them through research that others have conducted, your own, or a combination.
Accessing research is easier than ever, and one of the best ways to research innovation is to immerse your-
self in the future of learning. You can attend webinars and conferences, read literature and articles, and
review surveys, among other things.

Leveraging Others’ Research


When looking for research from others, there are many free or low-cost resources you can take
advantage of. Consulting groups, for example, dedicate time to conducting research and finding and
collecting research from external sources, including ATD, Bersin by Deloitte, Gartner, Corporate
Executive Board, the Conference Board, Forrester, IDC, trade magazines, and many colleges and
universities. If you aren’t sure where to begin, start with a simple Internet search.

Conducting Your Own Research


The best data you can use is data that is specific to your audience and company. You can use a
variety of methods to collect your data, and mixing and matching methods will give you even
more insight (Figure 24-1). The variety will let you compare the information to see patterns. When
conducting research, be clear on the questions you are trying to answer. The research may move
you in a different direction or drive additional areas of study, but your beginning questions should
be well thought out.

Figure 24-1. Methods and Benefits for Data Collection

Approach Benefits

Surveys • Relatively inexpensive and can allow data collection from large populations
• Flexible and can be delivered in a variety of modes to a variety of individuals
• Offers anonymity—all data can be collected without the names of the respondents, leading
to more responses and better results
• Allows for a variety of evaluation methods
Data Analysis • Allows you to use data from different sources and evaluate them to look for the
information you need to make decisions
• You may have tools you haven’t even considered, such as your LMS or CMS
Pilots • Allows you to test different approaches in a safe environment and then make
improvements or try another approach
Interviews • Provides anecdotal data for use alone or with other data points
• This is an especially good approach for experiential and emotional data
Focus Groups • Collects information about feelings, perceptions, and opinions and allows for clarification
or digging deeper
• Quickly collects data and can save time and money
• Provides quotes and contacts for further research
Observation • Observe in the natural setting, which can provide insights that cannot be achieved using
any other method
• See how people act together and separately
• Document people’s nonverbal interactions

Don’t overlook the data that you may already have—look to your LMS or CMS for information
around average learning time per employee, insights into what type of learning programs are being
taken, what subjects are most popular, and so forth. And don’t forget that your HR department is a great
resource for general data about your audience, such as population characteristics, performance data, job
role distribution, training courses taken and completed, and hiring plans.

Forming Conclusions
Once you’ve collected the data, you need to evaluate it to find patterns and insights around your training
question or idea. The activity of consolidating the data will naturally lead you to the best grouping, and
you will start to see patterns. For example, does all or most of the research point to the same thing, like
“coaching is the most important skill for sales managers”?
When we were researching the curriculum for a new offering for sales managers, our hypothesis was
that coaching was one of the most important skills. When we surveyed our top sales managers about
which skills were the most important, coaching came out as the number-one skill. Evaluating research
from others, which also showed that coaching was the number-one skill for success, provided further



support. Based on our research, we built a groundbreaking learning solution that used several of the
innovations discussed here. This solution has proved to be very successful, based on both learner feedback
and business results.
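As a rough illustration of that consolidation step—using Python and entirely made-up survey responses and skill names, not our actual data—the following sketch tallies which skill respondents ranked as most important and checks whether a clear pattern emerges, mirroring the way coaching surfaced as the number-one skill above:

# Minimal sketch: consolidate survey responses to see whether one skill dominates.
# The responses, skill names, and the 50 percent threshold are hypothetical.
from collections import Counter

# Each entry is the skill a respondent ranked as most important
survey_responses = [
    "coaching", "coaching", "territory planning", "coaching",
    "negotiation", "coaching", "forecasting", "coaching", "negotiation",
]

counts = Counter(survey_responses)
top_skill, top_votes = counts.most_common(1)[0]
share = top_votes / len(survey_responses)

print("Vote counts:", dict(counts))
print(f"Top skill: {top_skill} ({share:.0%} of responses)")

# Simple pattern check: does a clear majority point to the same skill?
if share >= 0.5:
    print(f"Pattern found: '{top_skill}' looks like the number-one skill; validate it against external research.")
else:
    print("No dominant skill yet; gather more data or segment the responses.")

The same grouping-and-counting logic applies whether the data comes from surveys, your LMS, or focus group notes; the point is to let the consolidated data, rather than a single anecdote, confirm the pattern before you invest.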
To help decide if a new approach is worth pursuing, try some of these tips:
• Start researching what others have done with the innovation. What was the outcome? How did
they use it? What were the parameters of the use? What did they learn?
• Refine your approach based on what you learned from your research and put your business
goals and audience needs first. Use design thinking and Agile approaches to better ensure your
approach is based on solid information.
• Try it. Create a small pilot to test the approach; then observe, collect data, and talk about the
experience with peers and users.
• Evaluate the outcome. What worked and what didn’t? How could you change it to work even
better? Is the approach even worth pursuing? If so, how? Refine your assumptions and your
approach.
• Start implementing your solution in other programs, collect data, and refine your approach.
Continuously improve—innovate, test, revise, repeat.

Driving Learning Impact With the Lines of Business

Joe, a senior learning and talent developer in a large global company, just deployed an initial training program
around his company’s digital cloud solution suite. Rollout was smooth, and people seemed to find the experience
useful. That was until one of the cloud product owners came to him and said, “You need to redo this training
because the suite just added a new product. Didn’t you know? Oh, by the way, the update rolls out in two weeks.
And, since you have to change it anyway, could we share some new ideas we have?”

Sound familiar? Involving the business as part of learning solution design and deployment is obvious, but
we often forget to do so, given increasingly short deadlines, reduced or missing stakeholder involvement,
and the effort it takes to keep up with skill needs. We want the business’s opinions and feedback to ensure we’re
deploying relevant content that provides business impact. However, how regularly can you get their time
and attention? How many times have you deployed something only to find that something has changed?
Or the business decides to do it themselves, and then wonders why there are no measurable results?
Skill needs, changing client requirements, and shorter time to production make it even more important
for the business and learning function to have a closer partnership. Working together can:
• Improve the employees’ perception of the value of learning.
• Shorten the design and deployment cycle because the learning team knows in advance what
changes might be coming and can strategize on how best to address them.
• Create tools and templates so subject matter experts can author end-user content.

• Leverage business expertise in training challenges and create activities that bring an element of
authenticity to the programs.

Examples and Best Practices That Have Worked for IBM


Design thinking and Agile principles emphasize the importance of engaging with the end user right from
the beginning and co-creating with identified sponsor users who represent the target audience. Who better
to work with when you are going to need buy-in to advocate for or drive consumption of the solution? Could
employees and business leadership become your stakeholders or pilot users? Consider the following best
practices.

Engage the Business


Invite business contributors or visionaries to join a development initiative as sponsor users. They can be
as involved as time allows, but having this group (especially for transformational initiatives) saves time and
strengthens the relevancy of the program.
The IBM Manager Champion Group is a great example of this. Every year, the top 50 manag-
ers from across IBM are selected for a year-long cohort experience during which they participate in
projects that provide feedback to the business and the learning team. The cohort makes a yearly trip
to corporate headquarters for an in-person session during which they engage in identified business
challenges to provide a manager’s perspective on how to make our learning offerings, talent
initiatives, and business processes more effective. It’s innovative and aspirational as well as immedi-
ately applicable.

Use Business Leaders as Co-Facilitators


Create a faculty of experienced business professionals who can join the facilitation team for identi-
fied learning programs. At IBM we created a Faculty Academy to ensure business leaders brought
real business stories to the programs. Joining the Faculty Academy requires members to give a
certain amount of time every quarter; in return, they have the opportunity to take part in learning
development. This is especially effective when upskilling executives or senior managers, because faculty
members can guide practice assignments and help with content. Our onboarding program receives
rave reviews when we leverage business leaders to inspire and provide clarity about IBM, how it
functions, and what the business priorities are. This also prompts proactive engagement between
the teams.

Use Business Coaches


Identify business coaches and guides who are willing to give feedback during the apprentice or practice
portions of an employee’s experience. This can be especially effective if key knowledge or skills cannot be



found in many places. If digitized, the feedback from these coaching sessions can be made available for
enterprise-wide program reuse.

Conduct Showcases
We have quarterly virtual sessions at IBM where we demo different learning solutions; these sessions
are open to anyone interested in learning about a new tool or best practice. Through those sessions we
also share perspectives and discuss recent business priorities, and we can then leverage the information
we’ve acquired or developed. We share research to help inform business stakeholders of the value and
reasoning behind the approaches we use.
Throughout the year, periodically assess your own engagement activities, ask to be part of strategic
planning, or offer to share the latest in innovations as business teams are looking at their skill needs. Build-
ing habits like this so that they become part of your corporate culture can strengthen a learning organi-
zation’s ability to respond on demand, anticipate where new methods will be required, and create a joint
sense of ownership to drive the company’s impact in the market.

Creating Unique Learning Experiences Quickly


While ADDIE (analysis, design, development, implementation, and evaluation) remains the “grandfather”
instructional design model, learning teams can also use a variety of other approaches to drive continuous
improvement, create great outcomes, and improve customer satisfaction. Often, we no longer have the
time or access to content experts, and our learning assets are becoming outdated faster than ever before.
Learning and talent development professionals have to be nimbler, strengthening our ability to shift
direction as things change and drive measurable impact whenever possible. We, as an industry, have had
to rethink the way we work to create better learning experiences. Methods and models such as design
thinking, Agile, and user experience tools allow us to more directly involve target audiences in design and
development activities. Today’s employees are more sophisticated, more “digital”—and they expect more
from their experience. We are also being asked to tie our learning into the way we work. While IBM still
has a learning portal, learning teams are also integrated into business systems so the learning content is
right where employees need it. Here we look at design thinking and designing with Agile in more detail.

Design Thinking
Look at the following instructions:

Prompt 1: Take one minute to design a vase.

or

Prompt 2: Take one minute to design a better way for people to enjoy flowers in their home.

What was different about the two prompts? What made designing for the second prompt different?
The first prompt contained a solution. This constrained creativity and problem solving, creating a foregone
conclusion with no mention of a user or experience. It also failed to provide a context for how the product
would be used.
The second prompt opened the mind, providing the opportunity to get creative and to ask how
consumers will find value in your design. To improve an experience:
• Focus on human problems rather than technical solutions.
• Explore a wide range of options to lead to a better set of outcomes.
• Make mistakes. It is OK, and doing it early saves time and money.

“There’s no longer any real distinction [at IBM] between business strategy and the design of the user experience.
The last best experience that anyone has anywhere, becomes the minimum expectation for the experience they want
everywhere.” —Bridget van Kralingen, Senior Vice President, Global Markets, IBM

At IBM, design thinking includes three pieces: the principles, the loop, and the keys (Figure 24-2).
Let’s look closer at each.

Figure 24-2. Design Thinking at IBM

The Principles
Our principles represent our attitude. They allow us to see problems and solutions from a new point of view.
• Keep the focus on user outcomes: We must always prioritize the needs of the people who will
use our solution.



• Take time to see the world through one another’s eyes, which drives unique insights, advancing
the whole team’s thinking.
• Restless reinvention means everything is a prototype. When you think of everything as just
iteration, you’re empowered to bring new thinking to even the oldest problems.

The Loop
The loop is a behavioral model for discovering users’ needs and envisioning a better future.
• When designing, you maintain constant motion on a continuous loop of observing,
reflecting, and making adjustments to the solution.
• We observe by examining our users and their world to understand them, uncover their
needs, and get feedback on ideas. When observing, we set aside our assumptions and
immerse ourselves in our users’ environment; it is about taking it all in and seeing what
others look past.
• Then we reflect and look within to synthesize what we’ve learned. We articulate a point of
view and come up with a plan. We build understanding of a situation by making sense of
what we’ve observed or made. Reflecting individually is important but reflecting together
raises everyone’s collective understanding.

The Keys
The three keys—hills, playbacks, and sponsor users—are all about scalability. These core concepts help us
not only design, but also deliver meaningful outcomes.
• Hills turn users’ needs into project goals, helping the team align around a common
understanding of the intended outcomes to achieve.
• Playbacks are a time for your team and stakeholders to understand the work and voice their
input. They reveal alignment or discord about what the team intends to deliver.
• Sponsor users are real users. We collaborate with them to increase our speed and close the
gap between our assumptions and our users’ reality. They aren’t just passive subjects—
they’re active participants who work alongside the team to help deliver an outcome that
meets their needs.
At IBM, we look at our requirements, engage sponsor users from the beginning, and perform
sprints using flipcharts and other tools such as Mural to determine our user needs, learning outcomes,
and how to measure impact. Then we develop, deploy, and loop back around.
Figure 24-3 is an example of output after a design thinking session. The learning team evaluated a
specific persona and then created an empathy map examining what this persona might say/think/do/
feel—first in the “as is” state (meaning right now) and then in the “future state” in order to look at what
needs to change in the learning experience.

Figure 24-3. Example Empathy Map for a Specific Persona

A Story of Industry Gold


After a one-day design lab in October 2016, the IBM learning team engaged with senior industry executives to
reimagine how to develop expert industry sellers. These executives were spread out, so how could the team keep
in touch during the sprints and playbacks when the solution was on an accelerated development path? The team
chose to use the MeetingSphere tool for playbacks that needed brainstorming and consensus. The tool allowed the
team’s global members to brainstorm, vote on, and prioritize new ideas, as well as discuss and capture everyone’s
comments. And it created a permanent record of everyone’s input. The tool served as a great way for members to
keep in regular contact, receive ongoing feedback, and make improvements to the solution to meet the complex needs
of the 18 industries across IBM.

Using Agile Methods and Principles


There are similarities between design thinking and Agile. At IBM, we ascribe to three Agile principles:
• Clarity of outcome. Begin with clarity about the outcome and let it guide every step along
the way.
• Iteration over perfection. Listen, iterate, learn, and course correct rather than waiting
until it’s perfect.
• Self-directed teams. Build small teams with the right skills to encourage self-direction and
innovation.



You may already be familiar with Agile, especially if it is used in your company. We suggest you
consider how you can use these practices to change the way you identify, design, develop, and implement
your learning solutions. At IBM, design thinking and Agile are used prolifically. And, now that the IBM
Leadership, Learning, and Inclusion teams have made them a part of how we work, it is easier to partner
with the business and iterate learning solutions in a collaborative way.
Whatever the method, we encourage you to reflect as a team on how these processes could address
pain points that prevent you from becoming more innovative or that affect your ability to experiment and
improve learning impact. How can you experiment in ways that help you get better at getting better?

Innovation and Trends in Learning


Recently, we heard this discussion around the idea of innovation.

Julio: I was talking with Peter and he sees innovation as new technology. But I believe that new learning
strategies, approaches, and ways to implement them are also innovation.

Pam: Even though I use new and innovative learning approaches often, I don’t always think of them as innovative
unless they are delivered with technological innovations. How wrong is that?

Julio: Innovation can be new technology, new learning strategies, or new ways to use either.

Pam: You’re right! I can’t believe I fell into the trap of thinking only about technology!

How do you view innovation? Is it technology, strategy, or a combination of the two? In truth, incor-
porating innovation in learning will always require solid instructional design based on neuroscience and
cognitive science. The shiny new object might not be shiny at all if it lacks the solid instructional
design that gives it measurable impact on learning and performance. The best shiny object may simply be
a new way to use a more mature technology. Let’s discuss a few of those innovative learning approaches.

Curating and Filtering Content


Curation is the process of finding, selecting, organizing, contextualizing, and sharing the best and most
relevant digital content on a topic to meet a learner’s needs.
There is power in curating the best content because it allows for speed to market for content your
learners need. If you have an urgent need, you can deliver the best, most relevant content without requiring
a long development timeline, a large development budget, a dedicated development team with advanced
digital content development skills, or extensive time from subject matter experts.
There are two levels of curation (Figure 24-4). Level 1 is identifying and using the best-of-the-best
content from a variety of sources around a specific subject. Level 2 is filtering the right content from the
level 1 curated content, based on the learner, specific learning objectives, current and desired skill level,

job role, learning styles, length of content, relevance, and more.

Figure 24-4. The Two Levels of Curation

Trusted Identifiers:
• Learning professionals
• Subject matter experts
• Trusted sources

Learning Platform:
• Surface relevant content
• Strategically connect learners to content
• Enable intelligent learner searches

Specific Delivery:
• Programs
• Channels
• Recommended learning

(Level 1 spans the flow from trusted identifiers into the learning platform; level 2 spans the flow from the platform to specific delivery.)

At IBM we use a learning platform called Your Learning to aid in content curation at both levels. The
platform pulls in content from trusted advisors—including third-party content vendors, learning profes-
sionals, and subject matter experts—and continually manages it, adding or deleting based on relevance,
learner reviews, content owner management, and source metadata.
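As a rough illustration of level 2 filtering—a simplified sketch in Python with invented content records, learner attributes, and scoring weights, not a description of how Your Learning actually works—the following ranks a level 1 content pool against one learner’s role, topic, skill level, and available time:

# Minimal sketch of level 2 curation: filter and rank a level 1 content pool for one learner.
# All records, fields, and weights below are hypothetical illustrations.
from dataclasses import dataclass

@dataclass
class Asset:
    title: str
    topic: str
    level: str       # "beginner", "intermediate", or "advanced"
    minutes: int
    roles: tuple     # job roles the asset targets

level1_pool = [
    Asset("Coaching Conversations 101", "coaching", "beginner", 6, ("sales manager",)),
    Asset("Advanced Deal Coaching", "coaching", "advanced", 25, ("sales manager",)),
    Asset("Cloud Pricing Basics", "cloud", "beginner", 10, ("seller",)),
    Asset("Coaching Remote Teams", "coaching", "intermediate", 8, ("sales manager", "team lead")),
]

learner = {"role": "sales manager", "topic": "coaching", "level": "beginner", "max_minutes": 10}

def fit_score(asset, learner):
    """Higher score = better fit; a simple weighted match on role, topic, level, and length."""
    score = 0
    score += 3 if learner["role"] in asset.roles else 0
    score += 2 if asset.topic == learner["topic"] else 0
    score += 1 if asset.level == learner["level"] else 0
    score += 1 if asset.minutes <= learner["max_minutes"] else 0
    return score

recommended = sorted(level1_pool, key=lambda a: fit_score(a, learner), reverse=True)[:2]
for asset in recommended:
    print(f"{asset.title} ({asset.minutes} min, {asset.level}) -> fit score {fit_score(asset, learner)}")

A production platform layers in metadata, learner reviews, and machine learning, but the underlying idea is the same: level 1 builds a trusted pool, and level 2 narrows it to what is most relevant for this learner right now.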

Microlearning
Imagine: You are riding the train into work and have just a few minutes before you arrive at your station.
What a great time to learn! You open a video on your phone and discover a new technique that changes
the way you do business; later you get a quiz question that tests and reinforces what you learned. You
might even get a little practice exercise. That is the power of microlearning.
Microlearning allows anyone to focus on upskilling, even without a large amount of time. Let’s be
real, in society today, we have become conditioned to expect and be comfortable with shorter content; it
fits into our on-the-go, multitasking, wait-in-line lifestyle. To offer content that meets the needs of today’s
learners, think about learning solutions made of short, bite-size nuggets.
Microlearning can be leveraged in a variety of ways. It can be used to deliver and reinforce learning
content, assess retention, support performance just-in-time, promote practice, and achieve the benefits of
spacing in content. You can even bundle multiple microlearning nuggets, use them as pre- and post-event
learning, or partner them with more traditional approaches.
To make microlearning work, consider these best practices:
• Ensure that microlearning is the appropriate approach.
• Package it as a complete learning point.
• Ensure it is a learning experience and not just content.
• Use systematic instructional design.
• Be engaging.
• Deliver within the limits of learners' working memory and attention—three to seven minutes.
• Use visuals to make the content sticky.
• Be flexible, allowing learners to personalize if, how, and when they use the program.
• Design for accessibility.

Bundling
Bundling is a way to associate and deliver content that supports achieving mastery of learning objectives.
Often the content is micro, nonlinear, self-paced, and self-selected (Figure 24-5). It allows a structure to be
imposed or suggested, but often gives the learner the flexibility to determine their own path. Bundling can
be as simple as an email with links to microlearning or as sophisticated as a custom tool. It can be accessed
via the company intranet or embedded in work, other learning programs, communities, emails, wikis, or
any other place you can post a web link.
At IBM we created an internal platform that provides an easily accessible, engaging, reusable, and
configurable framework for housing, presenting, and tracking content that also supports internal and
external content access. The framework allows the learner to easily navigate and complete the learning
content that will help them most and fits the timeframe they have. We include features that drive engage-
ment, mastery, and recognition.

Figure 24-5. Bundling Attributes and Application

• Look and Feel: Apply custom branding and imagery for each project and program.
• Activity Content: Can be simple, with only a description and a link, or complex, with multiple sections of text, videos, and images; can include off-platform activities such as one-on-one practice, testing, and work study.
• Mini Quizzes: Check learner mastery when they complete an activity.
• Tracking: Provides personalized cues to what content has been accessed and completed; credit can be awarded on activity completion.
• Gamification: Points, ranking systems, and game-based learning help engage and motivate learners.
• Social: Users can share achievements on social platforms.
• IBM Watson: Users can leverage the power of IBM Watson to find content using natural language queries.
• Reporting: Detailed reports are available.
• Customizable: Customize activity content, points, ranks, and other design elements.
• Digital Badging: Awards digital badges for proven mastery via quizzes, tests, and performance evaluation.
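As a rough illustration only (not IBM's platform or schema), the attributes above could be modeled as a small data structure. Everything in this sketch, from the class names to the point values, is invented for the example.

```python
# Hypothetical model of a learning bundle: activities carry a format, duration,
# points, and completion state, and the bundle reports points earned.
# Class names, fields, and point values are invented for this sketch.

from dataclasses import dataclass, field


@dataclass
class Activity:
    title: str
    format: str      # e.g., video, article, podcast
    minutes: int
    points: int
    completed: bool = False


@dataclass
class Bundle:
    name: str
    branding: str
    activities: list = field(default_factory=list)

    def complete(self, title):
        # In a real platform this would also trigger tracking, a mini quiz,
        # or a digital badge check.
        for activity in self.activities:
            if activity.title == title:
                activity.completed = True

    def earned_points(self):
        return sum(a.points for a in self.activities if a.completed)


bundle = Bundle("Advanced Sales Coaching", branding="sales-blue")
bundle.activities.append(Activity("Coaching conversations", "video", 6, 10))
bundle.activities.append(Activity("Pipeline reviews", "article", 4, 5))
bundle.complete("Coaching conversations")
print(bundle.earned_points())  # 10
```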

IBM learning bundles are also integrated with our personalized learning platform, Your Learning,
which means our structured microlearning programs are fully integrated. The following are just a few
examples of how IBM has used bundling.

Advanced Sales Coaching: Master With Easy Navigation
Each box on the platform’s landing page represents an activity within the learning bundle. From this initial
page, users can easily identify:
• the subject
• how long it will take to complete
• the number of points they will earn as they "rank up" in accelerating sales
• the format of the content (for example, video, document, blog, article, book chapter, or podcast), which is represented by the icon to the left of the activity title and by the background image
• whether they've already viewed or completed the content, as shown by an orange status indicator
• recently added content, which is indicated by a "new" label.

The Story of Cloud and Cognitive Patterns


Two years ago, the IBM Cloud & Cognitive (C&C) solutions line of business was formed to address
the needs of IBM’s top integrated and industry clients, offering depth for the C&C portfolio. The
new leadership was also responsible for identifying new C&C opportunities and managing strategic
cross–business unit opportunities. The mission was to rapidly improve the skills of 378 cloud and
technical sellers to confidently and competitively sell these solutions. To do this, IBM designed an
extensive program focused on cloud and AI content. It presented the go-to-market strategies for 16
client conversation patterns with three to four use cases for each pattern. This amounted to a total
of 58 use cases. Phase 1 was launched in 45 days, and the solution itself was reused, repurposed, and
reinvented to reach more than 40,000 IBM employees. With a 15-person learning team, IBM ran a
series of five sprints over two months, bringing together more than 250 experts and creating more
than 700 use case elements, 58 assessments, and 200 videos.
The solution leveraged IBM learning bundles for delivery (Figure 24-6). The team first created
a framework using design thinking sessions and a lot of SME input. This pattern was then replicated
quickly because multiple development streams could work simultaneously. In the spirit of Agile,
sprints (or iterations) took two to three weeks, and we held daily standups and retrospectives at the
end of each sprint.
Design features included:
• use of microlearning and macrolearning
• content curated for easy consumption
• mastery assessments
• media variety
• practice and feedback to build skills
• digital open badging.

Figure 24-6. Cloud & Cognitive Patterns Bundle System

Small Private Online Course (SPOC)


You've probably heard of MOOCs (massive open online courses), which are often used by universities.
But have you heard of SPOCs (small private online courses)? At IBM, we call them digital learning guides
because we use them to guide learners through structured or semi-structured content. Not only can they
stand alone, but they can also be integrated in other approaches. For example, a SPOC can create the struc-
ture for digital face-to-face learning programs. SPOCs can contain all types of content, including microle-
arning, curated content, live virtual classroom sessions, internal and external content, and quizzes and tests.
A SPOC is typically supported by technology, which can be simple or complex. It can include content
presentation, tracking, quizzing, and testing, as well as student and instructor interaction, peer reviews, group
projects, and social connections. You can chunk content to help the learner navigate and choose the content
they want to focus on. Figure 24-7 shows an example of the SPOC interface for the Global Sales School
program, which is a blended solution that uses a SPOC for the digital portion of the program. SPOCs also
allow for facilitator moderation and monitoring, tracking, and reporting on progress.

Figure 24-7. Example of a SPOC Interface

[The screenshot shows the "Lead Client Success" guide with navigation for Course, Assignments, Discussion, and Peers; a week-by-week outline (weeks 1-3); and week 1 ("Research the essentials: Explore your clients' world") activities such as "Welcome" and "Collaborate," each with an estimated time to complete.]

Digital Face-to-Face
Converting traditional instructor-led training to digital face-to-face training is becoming more in demand
as skills become a currency and training organizations are asked to do more with less for increasingly
dispersed audiences. Because IBM is a global company that is not only geographically dispersed but also has a large remote workforce, it needs ways to generate the same impact as face-to-face learning with a faster, less expensive approach. This need continually drives us to evaluate ILT programs for transition to digital face-to-face, in whole or at least in part. Our move to increase our
digital approaches is also supported by learners; 75 percent of IBM learners reported that this approach
gave them confidence to learn new skills.
To ensure we make the best decisions for ILT program conversions and new learning programs, we
have identified four best practices for digital face-to-face learning. We have found that the most effective
experiences appropriately blend and leverage all best practice dimensions. Let’s look at each dimension:
• Interactivity
° One-on-one work. Includes role play and practice for skill improvement using
teleconference, videoconference, or in-person if co-located.
° Group activities. Use breakout sessions through videoconference or teleconference
breakout rooms to complete activities and build relationships.
° Mentoring. Pre-arranges virtual partnerships with mentor guides and integrates meetings
into the program schedule.
° Coaching. Pre-arranges virtual coaching partnerships using trained coaches.
° Individual activities. Use work-based activities with peer or manager reviews.
° Relationship building. Uses videoconferencing, group activities, and social learning to
build relationships.
• Approaches
° Schedule. Conduct multiple sessions that are no longer than two hours each and are
spread over several days or weeks.
° Microlearning. Quickly closes micro skills gaps in a self-paced environment by
capturing and maintaining learner attention.
° Content curation. Uses the best content available and is built to meet the objectives;
diverse content leads to expanded learning and engagement and appeals to different
individual learning styles.
° Cohort grouping. Cohorts build relationships and expand partnering.
° Collaboration over time. A learning journey provides for building long-term
relationships.
° Just-in-time. Allows flexibility by including self-paced learning to ensure access
when needed.
° Work-based learning. Integrates activities that kick off or complete a job
requirement.
• Technology
° Videoconferencing. Delivers lectures, enhances engagement, promotes relationship
building, and simulates the classroom experience.
° Self-paced learning. Integrates digital learning to support learning objectives and
reduce class time.
° Gamification. Increases engagement when used in any asset.
° Social learning. Hosts discussions via social networks, building relationships and
peer-to-peer learning.
° Video-based learning. Uses simulations to demonstrate behavior and provide
enhanced engagement.
° Simulations. Demonstrate behaviors and help evaluate the behavior being displayed.
• Feedback and Measurement
° Knowledge and skill assessment. Uses traditional online testing techniques or
gaming for assessment. Self-paced assessments can be made using a wiki, gaming
engine, or testing software. One-on-one interactions can be used to assess performance,
for example, with coaching or presentation skills.
° Peer reviews. Expand perspectives and peer-to-peer learning.
° Practice and feedback. Provide practice and feedback via in-person or virtual peer,
manager, or coach interactions.
° Stand-ups. These assessments can be recorded via video and used to measure against
criteria. Video tone and content can be evaluated with software.

Digital Face-to-Face Example
Bright Blue, an IBM program for new sales managers, is a cohort journey that uses many digital face-
to-face best practices. It is a mix of live virtual classroom sessions, SPOCs, small group activities, peer
reviews, microlearning, group projects, quizzes, and executive sponsorship. Mixing different learning
strategies helps the learners stay engaged with the content and one another and learn in multiple
ways (Figure 24-8).

Figure 24-8. Creating a Cohort Experience in Bright Blue

Digital Badges
Digital badging is a nano-credential program established using the Mozilla Open Badge standard, which issues badges to people who have accomplished résumé-worthy activities. The activities may involve deep learning, with the badge serving as proof of mastery and accomplishment, or they may represent the achievement of a certain level of experience or expertise. Badges are validated against a public set of criteria developed by the issuer (which can be an employer, educational institution, or company).
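For readers curious about the mechanics, the sketch below shows, in simplified form, the kind of metadata a badge carries. It is loosely inspired by the public Open Badges model rather than a literal rendering of the specification, and every name and value in it is invented.

```python
# A simplified, invented sketch of the kind of metadata a digital badge carries;
# loosely inspired by the Open Badges idea, not a literal copy of the spec.

badge_assertion = {
    "recipient": "learner@example.com",        # who earned the badge
    "issuedOn": "2019-06-01",
    "badge": {
        "name": "Cloud Selling Fundamentals",  # hypothetical badge name
        "description": "Demonstrated mastery of core cloud sales patterns.",
        "criteria": "Pass the end-of-course assessment with a score of 80% or higher.",
        "issuer": "Example Corp Learning",     # employer, school, or company
    },
}

# Print a one-line summary of the credential.
print(badge_assertion["badge"]["name"], "earned by", badge_assertion["recipient"])
```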
Learners can share their badges across social media. These badges also travel with the badge earner,
allowing them to broadcast their accomplishments and skills, thus helping them to enhance their personal
brand. Digital badges show the holder’s expertise and experience and improve their employability. Privacy
is protected because the earner owns the badge once they earn it, and they can decide if, how, and where
to share the badge.
What does the badge issuer get out of it? A leading-edge skills program that attracts, engages, retains,
and upskills the best talent.
In 2015, IBM launched the IBM Digital Badge Program with a pilot for online learning and saw
dramatic results. Not only did student enrollment increase by 129 percent; the number of enrollees who actually completed the courses also increased by 226 percent. Further, the number of students who passed
the end-of-course exam increased by 694 percent compared with the six-week period leading up to the
introduction of digital badges.
Since launching that first program, IBM has issued more than 2 million badges for employees, clients,
students, and partners around the globe. IBM offers five types of badges: knowledge, skill, proficiency, two
professional certifications (used internally and externally), and an achievement contribution. In addition
to increases in learning success, we’ve also seen an increase in engagement—87 percent of badge earners
said they were more engaged because of the program. Internally, we’ve also seen that digital credentials
rank third after technical and personal development in driving learners to take learning. Based on business
results, technical sellers with certification badges were more likely to make their revenue target than those
without. The program has also received an excellent net promoter score.

Innovation for Impact—Stories of Success


Earlier this chapter mentioned a story about IBM’s Global Sales School program, which had to reduce
learning time and still achieve the learning outcomes the business needed. After using many of the
approaches covered in this chapter, the program was able to successfully reduce seat time by 54 percent
and reduce travel by 25 percent. For entry-level sales training, we more effectively structured the new
sellers’ progress through their online learning and promoted connections with other new sellers. We
created a digital learning guide (DLG) that leverages a SPOC and a dashboard for facilitators and sales
managers. The DLG allowed us to schedule online learning by week rather than supplying a full sylla-
bus of activities because it uses a single screen of assignments over multiple weeks and highlights tasks
in red when they are overdue. It also includes a discussion area for sellers and facilitators, peer reviews
of assignments, and easy ways to connect with other participants. Facilitators and sales managers have
a dashboard that allows them to track seller progress and review assessments that outline areas for
improvement and exceptional performance.
Another success we’ve had is the Advanced Sales Coaching (ASC) program, which has become a
premier program at IBM. It was developed in response to internal and external research indicating that
coaching was the most critical skill for sales manager success and thus sales team success. Where better to
start than the most critical skill for the audience? The ASC program is based on the IBM learning bundles
and features:
• anywhere, anytime access on a laptop, an iPad, and mobile through an online app
• 16 business topics to support managers and sellers
• curated content
• microlearning
• self-paced and personalized content
• engaging, interactive video simulations
• challenging quizzes and games to keep it fun
• practical practice and feedback from professional coaching advisors
• a little bit of competition with peers to keep it interesting.
The results of the program were stellar, and the impact was real. Managers who participated actively
in ASC had far better business results than those who did not:
• Active ASC sales managers exceeded the revenue plan for three of four parts. Nonactive ASC
sales managers missed the revenue plan for three of four parts.
• Revenue attainment for sales managers who completed one or more ASC bundles exceeded
non-ASC sales managers by 7.8 percent.
• Revenue attainment for active ASC sales managers outpaced non-ASC sales managers by
6.9 percent.
• ASC sales managers exceeded quota by 6.5 percent.
As you can imagine, this program has been a game changer for IBM and has spurred many other
programs using similar techniques and innovations.

Summary
Companies are always trying to be first to market, and growing market share requires being ready to roll
out the next thing before anyone else. Training organizations are challenged similarly. If you aren’t updat-
ing or innovating quickly enough, you can lose leadership’s interest and investment. But innovating just to
bring in the next shiny object doesn't fulfill the L&D professional's responsibility to show impact in the form of newly created skills or faster responses to client needs. L&D professionals also need to be able to plan and develop for skills that haven't even been defined yet!
There are many ways to be innovative. Do your research to ensure you are tightly connected to the
business needs and become more Agile in how you develop and deploy your learning content. This chap-
ter provided insight into methodologies, tools, and design approaches we are using at IBM. We hope you
can leverage these ideas and make a difference in the effectiveness of your learning experiences.

Key Takeaways
• Being innovative can be a game changer for your organization. However, innovation without research, a focus on the audience and business, and good instructional design will not be successful.
• When you use an innovative approach to learning, you should be able to describe the reason you are using the innovation, as well as the expected outcome. The innovations discussed in this chapter include their anticipated outcomes; however, each situation is different, and you are the best judge of if, when, and how to use an innovation.
• There are many ways to combine different innovative approaches. Once you have done your research, be bold!
• As learning professionals, we owe it to our organizations to bring forward proven instructional approaches and new approaches that could have a big impact by making use of new technology, new research, changes in the business, and changes in the audience and the way we interact with the world.
• Don't leave the business out! Engage employees (who are your end users) in what you are working on and co-create learning experiences that include them as part of the development and deployment efforts.

Questions for Reflection and Further Action


1. How connected are you to the business goals in your organization? Is there a direct line of sight
from L&D to business goals?

2. Can you easily articulate to your team the priorities that drive the business needs in your
organization?

3. What innovative approaches should you learn more about to be more effective with your
business needs?

4. What resources should you regularly engage with to stay current on innovation and trends?

5. How do you assess and track changes in organizational needs at all levels within the organization?

6. Of the many approaches listed in this chapter, which can you add to your portfolio?

Section 8
Leader Behavior and Practices

This section is a change in format. In the big scheme of the book, Forum members have shared their practices, cases, tools, and techniques all within the context of their individual businesses. For the chapters in this section, we interviewed 11 learning leaders representing a variety of member companies:

• Alissa Weiher, Cochlear
• Carmen Reynolds, Boeing
• Chris Holmes, Booz Allen Hamilton
• Cory Bouck, Johnsonville
• Heather Durtschi, Walmart
• Jay Erickson, Hitachi Vantara
• Jeremy Jones, Asurion
• Marguerite Samms, Intermountain Healthcare
• Randall Gross, PeaceHealth
• Terrence Morley, NBCUniversal
• Suzanne Frawley, Plains All American Pipeline

The goal was twofold. First, we wanted to understand how they led other learning professionals. On a daily basis, how do they operate to enable what their team accomplishes? What would a fly on the wall see them doing, saying, and, if possible, thinking? The second goal was to delve deeper into how they personally practiced continuous learning. How do they keep ahead of the curve with all that is going on in the learning space and in the workplace in general?

Chapter 25 provides many examples of impact when leaders have informal and formal opportunities to interact with and support those they lead. These stories zero in on the many varied ways leaders create environments where others can excel in their respective roles and collectively work together to positively affect business results. The ideas range from getting uncomfortable to being the player's coach; from being resourceful to focusing on the future; and from using a systems view to modeling servant leadership.

Because leadership is a journey that never ends and continuous learning is critical for everyone, chapter 26 focuses on the ways these leaders build their own capabilities by constantly reskilling, upskilling, and new skilling themselves. Some of the examples include using assessments to gain greater awareness of themselves and how they show up to others; self-reflecting to gain clarity on values, goals, and purpose; intentionally seeking and using feedback to make continuous improvements; and journaling to capture both thoughts and ideas.
25
Leadership: Enabling Others to Move the Needle
MJ Hall

In a July 2019 Wall Street Journal article, Tim Sweeney, founder of Epic Games, was asked how he created
such blockbuster products as Fortnite. He replied, “I didn’t create Fortnite. But I did create and nurture the
company that built Fortnite” (Needleman 2019).
The skills needed by such leaders include setting direction, leading people, delivering results, using
business acumen, and building coalitions or some variation thereof, but what does this actually look like in
the moment, day in and day out? What are the practices and behaviors leaders use to enable the type of
performance that delivers results and enables competitive advantage in the marketplace? Like Sweeney,
many leaders in the learning arena use their skills to create an environment in which others can excel and
move the needle in their organizations. But they each do it differently.
The following stories were gathered from interviews with senior learning leaders within the ATD
Forum community. The questions focused on leader behaviors and everyday practices that create an
environment where others can excel in their roles and work together to positively affect business results.

Cory Bouck—Serve, Build, Inspire


Cory Bouck has spent his career as both a follower and leader in a variety of roles and places, including
the military, academia, business, elected office, and a nonprofit. In these roles, he was both a student of
people and a reflective practitioner. In 2013, as director of organizational development at Johnsonville,

Bouck captured his experiences and ideas in the book The Lens of Leadership: Being the Leader Others Want to
Follow. Today, he serves as the general manager for Johnsonville’s businesses in Asia. Bouck believes leaders
need to look at all actions and assess the results through the lens of leadership. According to him, “Lead-
ership is an accountability mindset focused on delivering great results through the efforts of others.” His
personal leadership model is “Serve, Build, Inspire.”
In Bouck's experience, a service mindset means that, as a learning leader, your business is creating environments where all employees can create value by excelling in their roles. As the Johnsonville credo
states: Some companies use people to build their business. We use the business to build the people.
This service to others includes:
• loyalty—even when you are not in full agreement with the final decision
• integrity, ethics, and character—transparency and trust at all levels
• initiative, resourcefulness, and self-reliance—constantly focusing on improving results
• professionalism—be productive, be innovative, be the expert, and be polite.
According to Bouck, the best leaders are also teachers who recognize that most of the learning occurs
on the job, not in a training room, and are always developing others. This may sound like it’s straight from
a leadership playbook, but it works: Set clear expectations; be accountable and make others accountable
too; and have those difficult conversations about performance, and have them in the moment, not at an
end-of-year review.
Pursue your own development and build a huge toolkit for helping others. Be curious and chase that
curiosity with sincere inquisitiveness. Develop the skill of questioning that opens others up to share and
get excited about their work and their development. Consider expanding your professional network and
giving more than you receive by mentoring and sharing ideas with others.
Bouck uses what he calls the confidence continuum—with complacency on one end and hubris on
the other—to illustrate how leaders and teams achieve excellence. They can be confident in carrying out
the task, event, or business plan because they have prepared thoroughly—they are committed experts. But
they also have a healthy paranoia that allows them to worry about what they do not know or have not yet
considered. This paranoia fuels their insatiable desire to learn more and get better.
Being at the ideal point on the confidence continuum will increase their likelihood of success and set
an example for others to do the same.

Suzanne Frawley—Resourceful
When people have the ability to find quick and clever ways to overcome difficulties, it can be easily inferred
that they are full of resources and tools for ideating solutions. In truth, they simply adapt well to new or
difficult situations, and they are able to think creatively, even on the fly. As an experienced TD leader who
has worked for a variety of organizations, Suzanne Frawley believes that being resourceful is a critical
leader behavior, given the constant changes in organizational processes, programs, and employee work.

As a director of talent management teams, part of her role is creating a talent and learning strategy.
Her Gallup CliftonStrengths (formerly StrengthsFinder) report indicates her top strengths include Strategic, Futuristic, and Arranger; in other words, Frawley has many of the attributes needed for being resourceful. When
creating a talent and learning strategy, it is critical to start with the big why and then figure out what and
who needs to be included, as well as how all the pieces fit together to maximize business productivity and
deliver the requisite results.
Frawley always begins the strategy development process by meeting with business leaders to learn
their perspectives and by asking questions such as, “What do you want the outcome to be? What is your
vision for the next one, three, or five years? What keeps you up at night? What makes you want to get up
in the morning and come to work?” During the development phase of strategy building, she continues to
meet with business leaders, share the current version of strategy, ask questions to ensure alignment with
their business needs, and seek feedback.
By taking this approach, business leaders recognize the work and their input when the talent strategy
is rolled out. This is the benefit of the iterative nature of the development process and building relation-
ships with other business leaders in the organization. Because of their engagement and buy-in along the
way, they more readily champion the strategy during execution. This can result in leaders contributing to
the development and kick-off, or, even, leading some of the learning initiatives associated with the strategy.
Frawley’s ability to be resourceful is enabled through intentional networking, having a large and active
group of mentors, and intentional socializing within the business, especially with other departments.
Additionally, she engages in formal training, conferences through professional networks, and, most of all,
deliberate practice—trying new ideas and approaches, being at ease with making mistakes, and asking
what worked, where she got stuck, and what she would do differently in the process. In her experience, it
is about being curious; reading; reviewing data from credible, reputable sources; and being comfortable
with multiple iterations, not perfection.

Randall Gross—Servant Leadership


Central to Randall Gross’s story as a leader is being a pastor for 18 years and leading a large congre-
gation with tremendous passion and love. Currently, Gross is system director of talent development at
PeaceHealth, a healthcare system in the Pacific Northwest; in this role he is an authentic and true servant
leader. His philosophy is that the greatest power comes from providing direction and then relinquishing
control and using influence and trust to enable the team to get the job done with excellence. Gross is
deeply passionate about the people he leads and serves, and this grounding in empathy traces back to his
values. He uses his service approach to lead from the heart and allow the team to deliver results and take
PeaceHealth’s TD practices to a new level.
His goal is for his employees to see the value in their work life and love what they do. This means
listening to them and adapting to meet the needs of both the learning customers and the L&D team. Gross

has found that listening is especially valuable when the work isn’t going smoothly or there is controversy. In
these times, he listens to understand, even if that involves hearing details and barriers that are outside the
grasp of an executive. By doing so, Gross is able to facilitate a collaborative dialogue with senior executives
and gain their buy-in. With this buy-in comes commitment and sponsorship, which in turn brings support
to L&D employees in their daily work. These interactions and working together cultivate a culture of trust.
Like many organizations, PeaceHealth needs to offer a huge breadth of content because of the
diverse roles in the organization, which leads to the old question of buy versus build. When making this
choice, the organization does its due diligence to understand the pros and cons of the options, as well
as ensuring it’s getting maximum value for its employees. Currently, PeaceHealth partners with DDI
(Development Dimensions International) to provide leadership development content and consulting on
succession management, and with Emtrain to provide training content that promotes a positive work envi-
ronment, including a code of culture, diversity, and workplace safety. To keep his stakeholders informed,
Gross hosts show-and-tell sessions and uses the feedback to make process improvements.

Heather Durtschi—Focus on Future


During her time with Walmart, Senior Director of Learning Content Design and Development Heather
Durtschi has learned that whether you love change or not, it is inevitable. The better prepared you are for
change, the better off your team and your organization will be. For Durtschi, this question is always top of
mind: “How do you stay in front of what’s coming without being able to see the future?”
Her team operates within an Agile framework and uses a variety of change management techniques
to ensure quick adoption of new processes and ways of working. Her content and creative design teams
work in both Scrum and Kanban format, flexing to where the business needs them most during a given
development cycle. Designers attend daily stand-up meetings and biweekly retros, where they are empow-
ered to share ideas, offer solutions, and suggest new opportunities for feedback and support.
In Durtschi’s organization, content managers own the curriculum from needs intake to measuring
impact—literally end-to-end—updating and refreshing it at whatever cadence the business requires. For
this to work with the utmost speed and effectiveness, she creates an environment of personal and team
accountability, empowering her associates to be autonomous decision makers and encouraging them to
adapt. Critically, when they do so, she trusts their judgment, which has led to outstanding learning results.
Walmart’s greatest challenge is the pace of change in the modern marketplace and the adaptability
that pace demands. An omnichannel strategy—communication channels and resources working in cooperation rather than in parallel—has become the reality of retail, and Walmart must continually balance its focus between traditional brick-and-mortar stores and online operations. For Durtschi, this means
ensuring that training and instruction stay relevant, even as retail continues to change. Training must be
provided when the learners need it, where they need it, and in an easily digestible amount they can quickly
apply. For example, as Walmart continues to install pickup towers in stores, Durtschi's team has trained associates on how to function efficiently in this new environment. The training can be done on the sales
floor using virtual reality, which decreases the in-class time required for previous training programs.
Walmart’s customers are also changing. To thrive, the retailer needs to change with them and embrace
new ways of working. The training for the new skill sets must be developed by the L&D team. Walmart's
learning team is focused on delivering targeted curriculum at scale. They leverage cutting-edge tech-
nology (such as mobile learning, VR, and AR) and deliver training to associates on the sales floor at the
moment of need, all without sacrificing the customer focus that helped the retail giant succeed in the first
place. Embracing change and staying close to future trends are critical for Walmart’s continued success.

Chris Holmes—Curation
Chris Holmes leads the functional L&D team at Booz Allen Hamilton, which like many companies isn’t
equipped to produce the volume of content needed. But Holmes and her team do not let this get in the
way of providing the requisite content the company needs. Instead, they strategically partner with vendor
organizations and use a curation-based learning model, rather than one that is development based. Booz
Allen Hamilton also designs cohort-based learning experiences that may comprise online learning from a
provider and another section facilitated by a different provider.
Holmes’s team is challenging the norms of what some would consider traditional learning roles or
staffing needs. For example, they have found that rather than focusing on finding the perfect instructional
designer, their team benefits more from having people who not only understand the principles of instruc-
tional design, but also can play other roles such as data analyst or videographer. Balancing the team with
an array of versatile skills is critical.
The ability to solve problems is also essential for a TD leader. Holmes believes that if you cannot iden-
tify a problem, break it down to its fundamental pieces, and understand it, you can’t be effective at your
job. A few years ago, for example, Holmes and her team had an off-site team-building day at an escape
room. Having a diverse team (including a technology strategist, electrical engineer, program manager,
software developer, and marketer) was very beneficial because everyone was able to solve a different piece of
the puzzle seamlessly. This allowed the team to leverage their strengths and emphasized the importance
of surrounding oneself with people who excel in other skills. And, they got the top score!
Building trust within a team is imperative. Holmes has found that she needs to be intentional about
investing time in people and building relationships, both in the talent space and in the core business units.
For her, trust develops through interactions with one person at a time and over time.

Jeremy Jones—Alignment
As director of learning and development at Asurion, Jeremy Jones believes that aligning L&D with the
organization’s goals and objectives is imperative; it gives the L&D team its purpose, or, as Simon Sinek
says, "the why." This is especially important in the age of VUCA and constant change. One of the ways Asurion's L&D function adapts is by engaging in conversations with major stakeholders, speaking the
language of the business, and making sure they have a seat at the strategy and planning table. They have
accomplished this by being accountable for consistently delivering results to gain credibility.
L&D was not involved in the planning process when Jones started in his current role, so his
first step was to identify key stakeholders. He worked with a senior manager in the finance depart-
ment to gain a better understanding of the planning process and a list of stakeholders from each
department. Next, he scheduled introduction meetings with each of them. The meetings focused on
gaining a better understanding of the stakeholder’s role in the planning process and the pain points
identified around operational performance. These monthly conversations about process improve-
ment, finance, quality, change management, and coaching eventually led to partnerships. After
completing a gap analysis on specific pain points, Jones was able to determine that focusing on the
new expert experience (that is, retention of those experts and how they perform relevant to tenure
performance) would have the most significant impact on the organization.
Jones believes the decision to partner is very simple: If L&D can bring value, it will be a
part of the equation; if not, there is no need to engage. By bringing value to the business and creat-
ing financial benefit, the team was able to create two goals for L&D:
• Increase the retention of new employees within the first 90 days by 10 percent each year.
• Reduce the time it takes for new employees to match tenured performance in key metrics
by 25 percent each year.
These goals became part of the annual planning process and are tracked for successful delivery.
The relationships the team created have resulted in additional opportunities to partner with
product launches and operational unit changes. Each launch or change has key performance indi-
cators that the team uses to measure changes based on learning (for example, did we do what
we said we would?). If the team doesn’t meet this commitment (which was discussed when build-
ing out the business case), then they don’t consider the product launch successful. Additionally,
the primary questions always come back to, “Did it deliver financial value?” and “Did we see
an increase in the number of high-potential employees who had an opportunity to get promoted
(employee experience)?”
The philosophy of “fail fast” is a big part of Asurion’s cultural fabric, and this is especially true
when developing content using ADDIE. When the process calls for quick iterations, it’s OK to fail
fast, because the lessons learned affect future work and performance. It is never a good idea to tie up
significant resources in something that leaves you needing to start from scratch if it fails on launch.
Jones lives by the credo, “If you don’t care who gets the credit, anything is possible.” He says
in the end it's about influencing others and collectively driving toward the expected outcome. Jones
likens the learning function to the offensive line of football—we don’t hear about them as much, but
they’re responsible for the success of the game.

Marguerite Samms—Systems View
As assistant vice president for leadership development at Intermountain Healthcare, Marguerite Samms
thinks of leadership as an organizational process, as well as how an individual leads. She recommends
starting with the organization’s mission, vision, and values and basing your strategy on those fundamen-
tals. Your strategy can then be articulated in strategic imperatives with clear goals and outcomes, which
are created through a strategic deployment process. At Intermountain Healthcare, the resulting initiatives
are managed by a performance management system—this occurs at the organizational level, team level,
and individual level and is continuously improved.
Samms believes that a leadership model should articulate the leadership brand and what it means to
be a leader at the organization. At Intermountain Healthcare, leadership excellence is supported by the
learning organization and the company’s deep commitment to being a model health system. Intermoun-
tain continuously strives to show other organizations how to create healthy communities, provide excellent
care when needed, and offer the most value to members of their insurance plans or patients using their
services. And, while many organizations visit Intermountain to learn from its processes, Samms thinks that
the real value happens when the learning is mutual—they are always learning and improving ways to help
people live the healthiest lives possible.
Samms believes that everything in an organization is interdependent. She plans and designs work so
it can adapt to constant change.
Samms has used the Baldrige Performance Criteria as a framework for evaluating how leaders are
prepared to meet the strategic plans of the organization. One example of this is the organization’s lead-
ership excellence strategy to prepare leaders for the future, which holds the clues to what capabilities
leaders will need to deliver on the strategy. These clues include items like what is important, how the work
should be done, what success looks like, and how success will be measured. If the strategy includes new
products, services, or growth, then it can be assumed that change leadership will be a core capability for
the future. For the next step, Samms advises learning professionals to identify what knowledge, skills, and
abilities (KSAs) their workforce needs today and in the future. Then they should consider which groups
need which KSAs and in what order. For example, new leaders may need change leadership communi-
cation skills, and senior leaders may need more skill in assessing and creating readiness for change. Once
this groundwork is in place, it is useful to create a grid of the learning content, tools, standard processes,
and audiences to map an interconnected approach. The next steps are to plan deployment or implemen-
tation for all appropriate groups. The most important part is to include a step in the annual planning
process for learning and improving. If this is part of the process, it reduces the number of disconnected
improvement initiatives.
Intermountain has an annual process in which the executive leadership team—which includes the
chief people officer—sets the strategies and identifies a limited number of goals. Then HR develops
aligned strategies and goals followed by the learning groups. Once the goals are drafted, each team in the learning function reviews them with stakeholders in back-and-forth, catch-ball conversations for ensuring
relevancy and alignment. Intermountain uses KPIs to measure outcomes rather than activity or processes.
Because the strategic imperative categories are meant to last multiple years, annual goals are connected
year-to-year and easily show progress against the strategic imperatives over time.
After a recent organization-wide transformation and restructuring to align the organization as one
and eliminate regions, 63 percent of the leaders found themselves in new job roles. To meet its mission in today's environment, where patients want a better experience, the organization needed to create a consistent and highly reliable experience for its patients and for its 38,000 caregivers (Intermountain believes
all employees are caregivers). This would require a more collaborative, interdependent leadership style.
As a result, the company began designing a new leadership excellence framework, which has three parts:
• a leadership-specific workforce planning process
• a clear definition of an Intermountain leader, expressed through a leadership brand and
leadership statements
• a competency map.
The first step of designing this framework was to develop the elements of the new leadership frame-
work and corresponding assessment and measurement tools. They did this by conducting an internal
study to identify the leadership mindsets and behaviors driving Intermountain’s specific outcomes. The
study involved 50 interviews at all levels and in diverse roles to establish the key outcomes, mindsets, and
behaviors to test. Participants were asked questions such as:
• How would you define the intended outcomes of what your group does? What is success?
• What role or roles do you think you, as leader, have to do to achieve those outcomes of
your group?
• What do you think it takes in a leader in your position to be successful in achieving those
outcomes? What knowledge, skills, abilities, or characteristics do you think are most needed?
The interview data was aggregated to form a quantitative survey to pinpoint the mindsets and skill
sets driving success in 13 Intermountain-specific outcomes tied to their strategies. Using regression and
factor analysis, the team identified the most important things leaders were doing to achieve key outcomes.
Six leader capabilities were found to be strongly tied to achieving the desired outcomes. The strongest
correlations were visioning (seeing the long view and being able to create purpose) and connecting to the
front line (understanding the front line, spending time with the front line, seeing the work, and so forth).
Other items included communicating, results orientation, self-awareness, and relationship building.
These results are now informing the leadership capability statements. The organization is seeking to
articulate which capabilities truly differentiate the outcomes of an Intermountain leader. The study find-
ings combined with cultural observations and future-focused strategies have led to five draft capabilities
that will differentiate the outcomes of an Intermountain leader: mission obsessed, envision the future,
act and learn, manage obstacles, and build relationships. The Leadership Excellence Executive Steering Team is now collaborating with the marketing and communications teams to articulate a final set of capa-
bility statements in a compelling and uniquely Intermountain way.

Alissa Weiher—Get Uncomfortable!


Get out of your comfort zone. This is something we hear often, yet lacking context, it generally rings
hollow. But in the career of Alissa Weiher, the director of talent development at Cochlear, this mantra has
fueled her growth mindset, which has helped her become a more effective talent leader. By stepping out
of her comfort zone, she has stretched her skills to new levels and been able to build teams that are more
cohesive, develop high-performing strategies, and cultivate strong relationships with members all through-
out the organization. Her advice is: “When someone asks you to do things that aren’t normally a part of
your day-to-day work, be willing to say yes, and give it a shot!”
Weiher’s first job in learning and development was as an L&D subject matter expert in banking—she
started as a teller and worked her way up to office manager before joining the L&D team. The role of facili-
tator came very naturally to her, and she quickly progressed through a variety of roles in L&D over the next
five years. When she went to her next organization, she found herself questioning certain decisions within
the organization and her own abilities as an L&D professional. There was an initial honeymoon phase of
about six months, and then some discomfort for a year, which helped her discover biases based on past
experiences. From there, she determined what she could do to help solve the organization’s challenges. Her
advice to new talent leaders is to know that you’re going to get uncomfortable when you hit one to two years
at a new organization. You’ll question the decisions, and everything won’t always make sense at the time.
But hang in there and get comfortable with being uncomfortable. Ask a lot of questions, build relationships,
learn the culture, and read industry articles. This will definitely benefit you and the organization in the end.
An example of operating outside her comfort zone is using new technologies, especially those having
a major impact on the learning and talent landscape. Weiher was trying to create a different type of
e-learning experience and started exploring rapid speed animation through online resources. She was just
starting to use the technology when she was asked to teach others about it at a conference. Even though
still a novice, she said yes and helped introduce it to other learning leaders from different organizations. At
the time, Cochlear was using VideoScribe, an animation storytelling software.
Weiher encourages the same mentality of risk taking and saying yes to opportunities in her team.
There have been a number of these over the years. One such opportunity involved having a team member
who was still relatively new to instructional design, and not yet comfortable facilitating, conduct a short
presentation at a learning technology conference. Another time, Weiher asked a manager who primarily
supported operations to facilitate an influencing workshop with sales leaders, a group they had little experi-
ence with. She promotes these new and different opportunities because she believes that experience comes
from trying something new. While there may have been some discomfort for her team, the new challenges
allowed for skill and relationship building, both of which are wins for her group.

Terence Morley—Focused Understanding
One of the biggest goals of any talent development organization is to meet employees where they are in
their career and move them along based on their potential, their interests, and the organization’s needs.
At NBCUniversal, Terence Morley and his team worked on a program designed to develop high-potential
early career talent. However, after a few cohorts, they realized that something wasn’t working. The team
conducted a formal needs analysis and realized they were not meeting participants at their point of need
because the talent gaps were too disparate. The big takeaway was the power of understanding audience
needs. Armed with these new insights, Morley’s team focused on developing three different experiences to
address these needs in a more targeted way. The new sessions included:
• INVEST—understanding the business
• INFLUENCE—influencing and communication skills
• CONNECT—bringing together cross-functional communities of practice.
Using three tightly focused areas rather than one session that addressed a broad array of topics was
highly successful and productive. Morley and his team learned that you can’t just train a population; you
need to be very specific about the population’s needs and expected outcomes related to their goals, pain
points, and gaps.
Morley believes in the importance of creating a mission and vision for the talent development team.
They collectively agreed upon six core values for which they hold themselves accountable—creativity,
excellence, agility, fun, substance, and uniqueness—and which are printed on their name badges for an
easy reminder. When the team recognizes one another for good work, they tag a specific value. Twice a
year they conduct an awards show called the “Chin Chins,” during which they recognize people for living
and embodying the core values.
Morley describes the first part of his career as being about delivering results. When he got to NBCU-
niversal, he realized that results aren’t the only thing that matters—the people you work with and the
talent in your organization are critical to success. While he is still very results-oriented and focused on the
process, Morley has learned to build individual relationships and adapt his leadership and working skill
sets to obtain the best results for the team. The results and desired outcomes must be customized to meet
the needs of the individual—and the business. The goal is to be the leader your employees need.

Carmen Reynolds—The Player’s Coach


When it comes to leading a talent development organization, talk is wonderful, but it is leadership by
example that truly galvanizes employees. Carmen Reynolds, currently the director of leadership and
professional development at Boeing, has built her career as a “player’s coach”—a leader willing to walk
the talk and demonstrate what savvy, smart leadership in the talent development space looks like.
Reynolds believes it comes down to modeling the behaviors that she and her team are trying to build
in others. This involves a willingness to be vulnerable and understand that you won’t have all the answers.

350 | Chapter 25
This is where appreciative inquiry comes in—asking the right questions and provoking a different way of
thinking.
“By creating an environment where people are empowered and not feeling like you need to be involved
in every little decision, you allow people the freedom to make choices and decisions and be accountable
for the results,” Reynolds says. “Think about how you want to solve something first. It’s OK to offer an
opinion, but be the player-coach and allow the individual the opportunity for leading and self-discovery.”
Reynolds has the opportunity to work with multiple teams while deploying learning programs (including instructional designers and operations specialists). This gives her a unique viewpoint for meeting people where they are. For example, she might say, "Here's how I approached X problem. I found it worked for
me, so maybe you can try that. When you’re a player-coach, you’re willing to do the type of work you
expect from others.”
As a talent development leader, Reynolds says it’s vital to be a “curious learner” and know as much
about the business as she can. Every year or two, she does something new in her role to add to her skill set.
This means continuing to get better at speaking the language of the business. “If someone recognizes that
I am a business leader who happens to do learning and development, I take this as the highest compli-
ment,” she says.
Reynolds believes the greatest challenge facing talent development professionals is the same for
any business leader—the need to stay agile and pivot based on market conditions. In some cases, it also
includes a willingness to put on a different uniform and play from a different perspective. “As learning
professionals, it’s important to have an ‘outside-in’ view,” she says. This includes:
• applying industry knowledge and global mindset to what you’re trying to deliver
• making sure the learning you’re looking to promote is applicable to your learners
• making sure that you’re driving relevance; her leader at Boeing often says, “Relevance trumps
innovation”
• meeting people where they are; “Regardless of where they start,” Reynolds explains, “I want
someone to walk out of a program and feel changed.”
Reynolds says she also emphasizes the importance of digital technology, which allows for equal
access to training for all learners—enabling on-demand, relevant learning versus a one-size-fits-
all training approach. “Putting more control in the hands of the learner is a big step forward,” she
explains. “The more accessible and engaging the learning experience, the more likely it is to be used
and to be effective.”

Jay Erikson—Clear Expectations


Jay Erikson, director of global learning at Hitachi Vantara, believes in hiring the right people, setting clear
expectations and goals, providing support, and, then, getting out of the way to let the performers perform.
It's this freedom and latitude that allows high-performing teams to be creative, develop innovative learning programs, and garner satisfaction in their work. However, greater freedom requires greater responsibility.
So, he makes sure that everyone on the team has a clear understanding of their roles and accountability
for their initiatives. His teams create clarity by having a consistent model for ownership across learning
initiatives, which is supported and reinforced by the structure of the learning organization.
Erikson says it is important for the organization to look at its purpose and constantly ask itself, "Why are we doing the things we're doing? Are we investing our time and resources in the priorities that matter most? What can we do differently to improve learning and business outcomes? How can we expand our circle of influence?" Asking these questions helps keep your focus on the right priorities; validate assumptions with data whenever possible.
In addition, Erikson notes that the learning organization needs to align with business partners
and executives to stay relevant and expand its circle of influence. Learning organizations can stay
aligned by:
• Finding business partners who want to work with them. They are usually willing to provide
support, will work toward shared success, and will be champions for the learning organization.
• Not insisting on doing it the “right way”; being pedantic may alienate some. One can be right
and still be wrong. Remain focused on the desired outcome, and collaborate and compromise
as needed.
• Staying aware of business priorities and new initiatives; listening for opportunities to develop
new capabilities that support strategic business initiatives. Business leaders want these strategic
initiatives to succeed and generally welcome support. Executing well on these initiatives will
open new doors.
Hitachi Vantara, like many companies, says Erikson, faces the challenge of equipping employees with
the knowledge and skills required to meet near-term objectives while preparing for a rapidly changing and
uncertain future. The massive reskilling of employees to take advantage of emerging technologies such as
analytics, the Internet of Things, artificial intelligence, and machine learning cannot come at the expense
of meeting quarterly and annual performance and financial goals. The learning organization must work
with business partners to balance these often-competing objectives.
Erikson’s focus on creating a learning organization is his passion, but one that is also supported by
senior executives, including the COO and CEO. Their modeling of continuous learning makes his efforts
seamless and more impactful.

Summary
As demonstrated in these stories, leaders are different, but they use similar skills, behaviors, and practices
in leading others. In Learning Leadership: The Five Fundamentals of Becoming an Exemplary Leader, Jim Kouz-
es and Barry Posner (2016) describe the five fundamentals of exemplary leadership based on extensive
research. These include:

• Believe you can.
• Aspire to excel.
• Challenge yourself.
• Engage support.
• Practice deliberately.
While those in the leadership and management functional area of talent development teach these
ideas and behaviors to others, we have the opportunity—and duty—to practice them every day by being
a player’s coach, meeting employees where they are, getting out of our comfort zones to learn, and getting
out in front of the change. If those responsible for developing capability within the organization are not
exemplary leaders, how can we expect the organization to have exemplary leaders at every level?

Key Takeaways
• Leadership is a journey, and there are many ways to get better, but a starting point is learning more
about yourself and how others see you—for example, understanding how you show up to others.
• Development is personal and takes knowledge, skills, and abilities—which mean learning, doing,
reflecting, and getting feedback—and then repeating the process over and over and over again.
• While a mountaintop experience happens periodically, the actions taken every day are what define
our leadership style and affect the work we do and the people we lead.

Questions for Reflection and Further Action


1. How do you show up as a leader and model exemplary leadership daily?

2. What are your differentiators in terms of leading others; what do others say about the actions you
take and the impact you make?

3. If you were given the corporate award for demonstrating outstanding leadership for this year, what
would the write-up say?

4. What are you doing to get better at getting better as a leader?

26
Getting Better at Getting Better:
Tools and Techniques
MJ Hall

Leadership in an organization is generally considered the pathway to competitive advantage. As the
owner of assessment and design for leadership development experiences, the talent development
profession has a plethora of effective resources, theories, and tools for working our craft. Our ownership of
training entrusts us with the responsibility for not only developing leaders at every level in the organiza-
tion, but also modeling the practices we teach to others.
David Langford, of Langford International, focuses his practice on using quality management and
systems principles with pre-K–12 teachers and, frequently, vocalizes this axiom: “Everyone a teacher;
everyone a learner.” He starts with teacher because teaching others is their job and it resonates. He adds
“everyone a learner” because of his focus on the collaborative team experiences needed to improve the
system and the need to learn continually regardless of your functional field.
As learning leaders in a complex environment with advancing technologies, shifting demographics,
and changing priorities, the work our organizations do is constantly being disrupted. In the Forum, a
consortium for connecting, collaborating, and sharing, Langford’s mantra is used with intentionality. To
stay competitive, organizations need to be flexible and ready to adapt when prudent. Adaptability requires
leveraging current successful practices. A network like the Forum can jump-start new initiatives, but only
if the members are willing to take the time to teach others, to share their ideas openly including lessons
learned, and to learn constantly themselves. It is a give-and-take proposition.

As part of this book project, we researched the components or skills identified with leadership to
serve as a guide for the chapters, and then crowdsourced with our members for further input and chapter
authoring. We also interviewed some of our members to shine a spotlight on what leadership in action
looks like, including the stories shared in other parts of the book. Additionally, we surveyed members for
the tools, techniques, and processes they use to build their personal leadership skill sets. While the number
of leaders surveyed was limited, their responses overlapped within broad categories, as did some of the tools
and techniques they employed. Because we know that growing our leadership skills is a lifelong journey, it is interesting
to see which tools and techniques senior practitioners in the learning field are using to learn, teach others,
and get better at getting better in their craft and in their daily roles of leading others.

Creating Awareness of Self


Knowing your strengths, weaknesses, and how you show up to others is one of the most critical aspects of
personal leadership. In Learning Leadership, Kouzes and Posner (2016) state: “Inside-out leadership is about
discovering who you are, what compels you to do what you do, and what gives you the credibility to lead
others.” It is about seeing yourself the way others see you, and there are a variety of ways to learn more
about yourself, including some specific tools in appendix 3.
Before setting goals for improving your leadership practices, you need to know where you currently
stand. The leaders we interviewed use assessments, self-reflection, and feedback to understand the influ-
ence they have as leaders.

Assessments
In leadership development, one of the first places to start is by getting a clear and accurate understanding
of the current state—how you show up to others. In Learning Leadership, Kouzes and Posner share how they
both independently purchased a copy of Jim Tweedy’s painting Self-Portrait because of how meaningful it
was in leader development. The painting depicts a chubby, friendly tabby cat using a mirror to help paint
a self-portrait. While the mirror reflects an accurate image, the tabby is painting itself as a very large and
stern-looking tiger.
Learning and development practitioners assess individuals in a variety of ways, and there are many
vendors offering tools to support those initiatives, including feedback, coaching, and follow-up actions to
enhance strengths using validated and job-related assessments. The repertoire available for most learning
teams includes behavior assessments like the widely used Myers-Briggs Type Indicator (MBTI), DiSC,
Harrison, Hogan Personality, and Emergenetics, as well as 360-degree surveys.
Jeremy Jones of Asurion believes that the MBTI and other assessments, as well as participating in the
Situational Leadership programs, helped him become a stronger leader. The MBTI helped him better
realize the importance of not labeling people, but rather understanding how and where to meet them.
When used for developing team skills to foster collaboration, these assessments help group members recognize
and appreciate the different strengths and natural attributes that individuals on the team contribute.
Group synergy can promote productivity and influence results in a positive manner. Situational Leadership
helped Jones understand situational approaches to leading, goals around performance (such as partnering
with the business), and other ideas for personal development. He is a big believer in taking his own
medicine and learning from all the assessments and tools employees use.
Alissa Weiher of Cochlear says that if you’re not taking the time to develop new capabilities, you’re
not staying at the forefront of the learning industry. She is certified to administer the MBTI and Gallup
StrengthsFinder assessments, and has used them with her teams. If someone is struggling or facing chal-
lenges, she likes to remind them of their strengths and what they excel at doing. As a leader, Weiher can
use insights from these instruments to shift her focus and what she is talking about with a team member
to make sure she’s addressing what is important to that person in that moment. She is also certified to
administer a variety of 360 assessments and has also used this process to receive feedback. By using a 360
assessment she’s able to gain additional insights for her own leadership development and to facilitate more
empathetic debrief conversations with others.
What happens when you think you’re one type of leader, but the data tell you otherwise? Carmen
Reynolds of Boeing found herself in this position after taking the MBTI. Her results indicated a prefer-
ence for extroversion, even though she didn’t feel like an extrovert. So, she decided to dive deeper and
took step 2 of the MBTI. The results had a huge impact on her self-awareness, taking her understanding
of the way she shows up to others to a different level. MBTI’s step 2 reveals five facets to each of the four
MBTI dichotomies (extroversion and introversion, intuition and sensing, thinking and feeling, and judging
and perceiving). For example, she loves to be organized and planned, but also thrives in the dynamism of
working under pressure. A big takeaway was how important it is for learning professionals to be consumers
of all the instruments to understand and empathize with employees and to make better decisions about
their use.
Heather Durtschi of Walmart also recommends leveraging internal assessments and credits the Birk-
man Method with helping advance her leadership work with her team. The Birkman Method is a suite
of questions used to measure the characteristics of the individual, the characteristics of the situation, and
the interaction between the individual and the situation. Durtschi uses insights gleaned from her own
results and the results of her team on the Birkman Method to enable collaborative partnerships that drive
business impact and forge lasting relationships.

Self-Reflection to View With a Different Lens


Jay Erikson of Hitachi Vantara is a firm believer in self-reflection, which is vital to his development and
well-being. He urges all talent leaders to spend a few minutes each day reflecting on key ideas related to
work, values, goals, and life, in general. He suggests recording your experiences and lessons learned in
a notebook or on a tech device or discussing them with a trusted colleague or friend. Your experiences
influence your behaviors, which influence your actions and decisions. Think about the results of your
actions and decisions. Why did you decide or act the way you did? How do you make decisions? Did the
outcomes match your expectations? What, if anything, will you do differently next time?
Erikson’s suggestion reflects a process similar to the Kolb Learning Cycle, which is used to debrief
team or group activities, such as an action learning team or a team challenge. It includes an accurate
assessment of what happened, how it happened, the results or impact, the insights gleaned from the expe-
rience, and how the experience will change behavior going forward. Periodically talking with a trusted
colleague, mentor, or coach about the experience and your assessment to obtain their feedback can be an
added boost to help you learn more about yourself and the way you show up to others.

Soliciting Feedback From Others


Feedback is difficult, yet asking for constructive feedback and accepting it without defensiveness is one
of the most effective ways to understand how you show up to others. When Claude Bolton served as the
commandant at the Defense Systems Management College (now Defense Acquisition University), he was
a U.S. Air Force general. He would say that his role was to provide ground support and air cover for every-
one in his command. Every six months he would ask his direct reports, his peers in the Pentagon, and his
superiors to respond to three questions:
• What am I doing that is working well?
• What am I doing that is not working?
• If you could change anything in the organization, what would it be?
The responses needed to be submitted by a certain date, were not to include the person’s name, and
could be delivered in any format (such as an email or an envelope pushed under the door). Within a week
of the request date, Bolton would express appreciation for the feedback and announce a summary of the
results in a staff meeting. Additionally, he shared three actions that he would take to improve. He would
also mention the feedback in more public all-campus meetings.
Terence Morley of NBCUniversal believes in consistently asking for specific feedback before he
embarks on a new initiative. For example, he might ask, “To be better at A, one skill I’m interested in
improving is X. Are you willing to give me feedback in how I currently do X?” Additionally, he will give
people more and better feedback if they ask for it. Morley believes that part of the ask is finding the people
who will help you take big steps in your career by providing honest and open feedback. Open and direct
feedback (such as actively seeking feedback through mentors and individuals he supports) has also helped
Jeremy Jones grow as a leader.

Building Personal Capability


It is easier to start improving your leadership practices once you have a realistic view of how others see
you. Development opportunities—whether formal or informal—are all around. They happen through
intense work challenges, formal training, and even while walking in the forest meditating. They are built
through intention and discipline, as well as happenstance and serendipity.

Using Tools
One tool that Cory Bouck of Johnsonville uses as a leader with others and for his own development is
Robert Brinkerhoff ’s Impact Map, which is traditionally completed in conjunction with the learner’s
manager (Figure 26-1). The map’s structure creates a line of sight between what is being learned and
better on-the-job behaviors by explaining the positive contributions to organizational goals and the result-
ing metrics (the four column headings are knowledge and skills, on-the-job application, job and team
results, and organizational goals). The theory supporting the impact map process implies that application
and feedback are necessary for newly learned content to morph into successful new behaviors on the job.
For example, the new skill might be to “Increase my team’s effectiveness when having conversations with
business clients.”

Figure 26-1. Example Impact Map



Another tool Bouck uses to move his learning into the doing realm is the What? So What? Now
What? tool. This creates a quick capture of the summary situation and the next steps:
• What? Summarizes the key takeaways from the experience.
• So What? Explains the application and impact of the What?
• Now What? Documents the next step actions and support that are needed.
This is a quick process for converting aha moments into behaviors for impact. While it can be a
documented activity for yourself and for those you lead, it can also be used during conversations with
colleagues and at meetings.

Learning as a Habit
Heather Durtschi believes that staying curious, asking pointed questions, and constantly seeking to under-
stand are essential learning components needed to continually develop her skills and practice. She cites
books, courses, and new technologies as integral to evolving as a learner and leader within talent develop-
ment. Recently, she has pushed herself to learn more about virtual reality and augmented reality because
they will likely play a prominent role in the future of learning. She has attended conferences in these areas,
benchmarked companies that are already leveraging emerging technologies, and reached out to vendor
partners to better understand their businesses. Retail is changing, and Durtschi knows that Walmart’s
training has to change with it. Jeremy Jones also nurtures this curiosity and interest in technology because
of his strong desire to know what is happening in the industry—and how it can influence and advance
individual and group learning.
Randal Gross of PeaceHealth always makes sure to use the assessments or learning tools that will be
part of internal course content. He has taken leadership development courses from the Center for Creative
Leadership and recognizes the influence they have had on enhancing his leadership style. He believes that
practice, reflection, feedback, and coaching are integral to the successful transition and sustainment of the
training on the job. He feels very strongly about the importance of feedback and engaging with leaders
he respects.
Being engaged in talent certificate programs has allowed Jones to obtain a better understanding
of the theory behind learning. Through these programs he appreciates the foundational capabilities
needed by talent professionals. Combining these efforts has provided a deeper understanding for what
effective learning looks like, and what he needs to include in the learning solutions he provides the
organization.
Chris Holmes of Booz Allen Hamilton encourages all talent development professionals to seek out
new opportunities for learning and then share the experience with others in the form of stretch assign-
ments or new roles or responsibilities. Learning agility is a fundamental skill that is tested when we stretch
beyond our comfort zone—and this investment is circular. Learning and helping others will always come
back in equal measure because it also helps you learn how to learn from others to advance your own
development. Additionally, what you learn when you're out of your comfort zone is cumulative, and the
experience and skills gained will rapidly enhance your strengths and abilities.
Jay Erikson believes that to be a great talent and learning practitioner, you need to be an avid learner.
He spends a lot of time reading, listening to podcasts, and taking training programs. He recommends
consuming training experiences not only for your own professional development but also to learn what
others in the industry are doing. Erikson requires each person on his team to include development in their
quarterly and annual goals. He recommends they allocate a minimum of two hours every week (ideally in
the mornings) for personal and professional development.
The research presented in Harvard Business Review’s “Good Leaders Are Good Learners” indicates
that those in a learner mode adopt a growth mindset and are more open to experimenting with different
strategies and approaches (Keating, Heslin, and Ashford 2017). Additionally, they are more prone to use
after-action sessions to learn from experience and to document ideas for moving forward. Most of these
sessions are modeled after the U.S. Army’s After Action Review (AAR), and focus on a discovery conver-
sation and note capture. It is a more expansive version of the Plus-Delta tool and uses questions such as:
• What happened?
• Why did it happen?
• What worked well?
• What did not work and why?
• What surprised you?
• What should we do differently next time?

Journaling
Suzanne Frawley of Plains All American Pipeline keeps a journal of what she is learning, capturing
ideas from articles, books, and podcasts. After trying something new, she jots down what worked, where
she got stuck, and what she would do differently if given a chance. Then she does weekly and monthly
reviews of her notes to decide what to keep doing and what upgrades to make. As appropriate, she
also adds these ideas to an initiative or project. One technique that Frawley finds useful in building her
ability to be a resourceful leader and in enhancing the journal process is Ryder Carroll’s Bullet Journal
method, which many bloggers affectionately call the KonMari approach for journaling. The process
involves what Carroll calls rapid logging, and consists of four components: topics, page numbers, short
sentences, and bullets. It also incorporates a variety of symbols and reviews to migrate tasks into the
future. While keeping a journal is helpful, using Carroll’s bullet format and ideas has expanded Fraw-
ley’s ability to capture information and has provided a more practical way for her to review it, thus
leading to more implementation.
Marguerite Samms of Intermountain Healthcare also uses a journal for self-reflection to gain a
deeper understanding of where she needs to grow. Through this reflection, she has discovered the freedom
to learn that comes with a willingness to be vulnerable. She uses this self-awareness in her interactions
with employees, leaders, and her own team to create a safe space for discussing real issues and for fostering
a healthy place to work. In her experience, her own team has reported increased trust and more fun at
work. She believes that staying true to her organization’s mission—helping people live the healthiest lives
possible—starts with herself and her interactions with the people around her.

Podcasting
As part of his professional research on durable learning and learning in the future, Dana Alan Koch of
Accenture realized the importance of learning about his learning as well as thinking about his point of
view on those topics. Inspired by Accenture encouraging its employees to share blog posts on its internal
social media site, Koch and two colleagues developed a public podcast called Learning Geeks. The podcast is
dedicated to discussing the things that are influencing the learning world and reflects the adage that we do
not know what we know until we tell others. This requires them to work through their point of view on the
topic and organize it into a logical format that connects emotionally with others using stories and examples.

Building Capability in Others


Leadership is about creating an environment in which others can empower themselves to successfully
meet their daily challenges. It thrives when the leader’s daily habits demonstrate how to use the tools and
techniques and intentionally stretch people's thinking and actions. Because of the value of learning from
others, modeling practices that promote self-directed research, investigation, and experimentation not
only helps the leader but also helps the employees.

Coaching Others and Using a Coach


Chris Holmes is a big proponent of performance coaching for enhancing an individual’s development.
She believes coaching is an effective tool to rapidly discover and understand your personal strengths
and weaknesses. Additionally, as managers and leaders, we are always coaching others. An executive
leadership program she participated in at Cornell University emphasized coaching as a management
style. “As a leader, it’s very important to bring others along in the decision-making and problem-solving
process not only from a client management standpoint, but also in developing talent throughout the
organization,” Holmes explains. “This is where you can be a great coach and help others develop.”
Having your own coach is also an effective way to learn how to make coaching part of your natural
management style.
Marguerite Samms manages her own learning by balancing it across the 70-20-10 framework. The 70 percent
is learning from her experiences through action and reflection; the 20 percent is learning from other
people, such as her manager, peers, mentors, coaches, and social media; and the 10 percent is learning
from formal classes in person or online. She is a professional certified coach, certified team coach, and
certified mentor coach, all through the International Coach Federation. Samms has found that coaching is
the most powerful way to learn, and she identifies a new personal coach annually to keep her development
moving beyond blind spots and habits that might otherwise get in her way. Additionally, she identifies a
new mentor specific to the development area she is focused on each year. She asks the individual for an
hour of their time every month, requesting they share stories of their leadership journey, hard-won lessons
from their experiences, introductions to their networks, and sponsorship into events where she can learn
and network.

Solving Problems With Gusto


A universal aspect of organizational life is the need to continually solve problems. Problems are every-
where—we can call the small ones “puzzles,” but those that are more complex, huge, and amorphous
are called “conundrums” or “wicked problems.” One critical way for organizations to stay competitive
is having a system in place for generically solving problems and for making improvements that are both
continuous and breakthrough. Problems and system improvements are rarely accomplished by individu-
als; they are initiated and solved by groups or teams working across boundaries. Thus, teams that interact
and are interdependent need a consistent, disciplined approach to problem solving so they can more easily
influence business challenges.
There is no single problem-solving method. For example, a puzzle-type problem generally has an
answer, but some information is missing. The goal is to find that information and the answer. Complex
problems, on the other hand, are multifaceted and usually have more than one solution. Through the
years a variety of models have been developed and the one you use might be different based on the level
of complexity of the problem or of your organization. The Shewhart Cycle, known as Plan-Do-Study-Act,
is one of the most well-known methods.
As a talent development leader, Suzanne Frawley is constantly honing her ability to solve
challenges, whether they are the conundrum types or general puzzles. Practicing these challenges
increases her resourcefulness, helps her anticipate future issues, builds her portfolio of solution
options, and enables her to flex more easily to ever-changing business needs. To stay current on new
developments, she networks with colleagues from other businesses, vendor reps, and professional
organizations.

Collaborating
To Terence Morley, collaboration stands out as critical when developing a new solution. NBCUniversal
uses a brain trust, which is what they call pulling people together across business lines and departments to
solve a challenge. Developing a team with different perspectives is a must when launching organizational
initiatives. For the TD team, this expands its understanding of internal clients and advances the
credibility of the talent developers. Morley's team also leverages Slack to stay updated on new content
related to what's going on in the talent and media industry. The platform is designed to help teams orga-
nize conversations, integrate tools, share and archive files, and talk face-to-face.

Using Stories
Storytelling for Terence Morley means adding creativity, energy, and real-life situations to the learning
and instruction his team is providing. To him, storytelling makes a difference in how effectively they
are delivering new information. Whether it is recommending a solution for a business or organization
or during a weekly staff meeting, he believes it’s important to tell great stories that touch people’s
emotions. The story about how NBCUniversal acquired the rights to the Harry Potter franchise and
why it is so important in their theme parks business is one of Morley’s favorites. The more you can bring
in stories and weave in the tales, the better. In addition, Morley often brings in personal experience and
relates what he has been through to further humanize his perspectives when addressing others. Sharing
a personal fear related to having difficult conversations helps others appreciate both the difficulty and
the necessity of the situation in question.
Boeing’s Carmen Reynolds has also found that teaching is more about being a great storyteller than
anything else—it does not always matter what the theory is behind it. It’s all about how the topic or point
is presented and how you can make it come alive and relevant for others.

One-on-One Meetings
Jay Erikson conducts one-on-one meetings with his direct reports every week to discuss their priorities
and how he can support them. These consistent meetings are essential for building trust and for open
communication. The call structure is largely dependent on the management style that fits the individual
and their responsibilities, and it’s critical to listen attentively and focus on them and not on yourself. In
addition, Erikson’s organization, Hitachi Vantara, uses the Marcus Buckingham StandOut Assessment
to better understand what motivates and energizes each employee. He works with his team to find
frequent opportunities for each person to do the things that align with their strengths and energize them.
Alissa Weiher likes to do her one-on-one meetings as walk and talk sessions with her employees. The
physical exercise is good for everyone and it creates healthier behaviors that spark more authentic conver-
sations. In her one-on-one relationships, Weiher leverages the skills she has gained from taking Situational
Leadership courses. Using this training and her understanding of preferences from the MBTI, she is
intentional in how she adjusts the way she interacts with each individual on her team.

Inquiry
Another tool Terence Morley uses is asking questions. By asking “What does your ideal vision of a prod-
uct or program look like?" or "What does program success look like?" he not only invites others into
the conversation in a meaningful way, but the resulting conversations generally produce ideas that can
improve the current thinking.
Alissa Weiher also recommends using questions. She likes asking, “What do I, my team, and my
organization need to do next to be more competitive?” She is also a strong advocate of Michael Bungay
Stanier’s 2016 book, The Coaching Habit: Say Less, Ask More & Change the Way You Lead Forever, because it
distills coaching to seven simple questions that elicit rich insight and prompt team members to talk and
reflect more in coaching conversations.
Michelle Webb of Accenture uses questions to leverage her natural curiosity and habit of working
out loud. She purposely seeks out new opportunities and connections by asking insightful questions
and talking to others about her projects and research. This leads to formulating new questions—for
example, asking questions about the responses to her questions—enabling her to find new areas for
exploration.

Expanding Horizons
While leadership is inside out and takes daily practice, one of the most fundamental ways to learn
and get better at it is through learning with and from others, hence, making intentional connections
that build professional relationships that grow over time. According to MIT Sloan Management Review,
in “The Social Side of Performance,” one distinguishing factor of high performers “is their ability to
maintain and leverage personal networks" (Cross, Davenport, and Cantrell 2003). The most effective
performers create and tap large, diversified networks that are rich in experience and span all organi-
zational boundaries. As Andrew Hargadon, a professor at UC Davis and a thought leader for innova-
tive practices, especially associational thinking, stated at the ATD Forum's 2014 Spring Lab, "One's
network is the innovation.”
Randal Gross relies on his own personal network to meet the ongoing demands in the workplace,
especially given the speed of change. He reaches out to others to gain perspective on the best ways to
develop new learning content because he believes that having a sounding board to learn what others are
doing in the field makes a huge difference. The ATD Forum is an example of the partnering support
PeaceHealth receives from networking. In addition, Gross has found a key to his success as a TD leader is
partnering not just externally, but also across his own organization. This helps him to know the learning
consumers and their work, and creates an awareness of the work the learning team does for the organiza-
tion. It is definitely a win-win.
Getting plugged into the ATD Forum network, the experiential labs, and the ATD organization
at large has equipped Jeremy Jones with new relationships that he has found extremely valuable. Being
able to have personal interactions on challenging learning and development topics with people who have
similar responsibilities but work in different industries has helped him gain new insights on content and
delivery methods, which have, in turn, influenced his internal practices.



Alissa Weiher also reports the benefits of being in the right professional communities, which requires
intentional networking and communicating with a variety of learning leaders. These conversations can
run the gamut of the talent spectrum and range from how they source vendors, to the different alignment
strategies they have used, to how they decide which learning solutions to design and to promote. Some of
the best decisions she has made over the years were made in part by relying on the guidance and insight of
other people in the talent industry that she knows and respects. She cites her involvement with the ATD
Forum as being instrumental in helping her to grow as a leader. It’s also a relationship that pushed her
out of her comfort zone. By getting involved, she has served as a teacher and as a learner, thus benefiting
from contributing valuable intellectual leadership to the mission of the Forum and learning from other
talent leaders. For Weiher, having a network of learning professionals outside her organization is critical.

Summary
In his classic works on leadership, Noel Tichy used the metaphor of leadership as the engine that runs
the organization. The organization with the best-running engine wins the race. Organizations with the
capacity to thrive in a complex world have leaders who continually learn from every experience and use
that knowledge to develop teachable stories for sharing with others to help them learn. True learning
leaders also generate ideas by asking questions and by wondering about how things in the business
work. They focus on targets and goals and creating positive emotional energy in others.
It is evident from many research reports and leadership books that learning is the most important
job for leaders. It seems intuitive that this would be an imperative for leaders of learning teams. But which
comes first—the passion for learning or the decision to serve as a learning professional and leader? And
are they always tied together? From the experiences and examples provided in this chapter, it seems as
though they go together. So, we as TD leaders need to continue to be curious, to keep asking questions,
and to learn something every day. It is the only way we can stay in the race and continually get better
at getting better and catalyze curiosity in others so they excel in their practice.

Key Takeaways
• With the world of work continually being disrupted, work in all organizations is changing. This
requires leaders to be in a constant state of learning—learning about the work, about the workers,
about themselves, and about the future.
• There are many tools and techniques available for continually improving our craft as leaders in
general, and specifically as leaders of learning professionals. The ones we select and use are personal
and dependent on our situation.
• Building our own skills and capabilities helps our team and adds value to our organization. It also
increases our confidence and the belief that we can do even more.

Questions for Reflection and Further Action
1. Have you thought about your own learning story recently and captured it in writing? If so, what
insights does it provide into how you serve as a model for other learning leaders? If not, why not
start today using the tools in appendix 3?

2. What do you personally do to obtain feedback from others about your performance on a regular
basis (such as monthly or quarterly)? Once you get feedback, how do you use it to build personal
capability? How do others know you are using it?

3. How intentional are you in determining your focus for improvement, and how do you convert that
desire into habitual actions?

4. If your team members were asked to report the ways they have advanced their personal learning by
modeling your actions, what would be on their lists?



Acknowledgments

The two of us often say that we have the best jobs in the world—serving as the hub for supporting
senior talent leaders from more than 60 member organizations to connect, collaborate, and share
their knowledge to leverage the expertise of the entire consortium. The companies that belong to
the ATD Forum truly understand the value of supporting performance excellence in real time and
simultaneously building capability for the future. Their common goal is enabling competitive advan-
tage within their organizations by staying ahead of the ever-changing challenges they face. As part
of the larger group, we also work very closely with an advisory group—a rotating group of senior
leaders within the Forum community who have demonstrated personal leadership and volunteer to
serve as strategic guides and a networking lynchpin for the consortium.
Members of the ATD Forum community support one another in a variety of ways. They build
professional relationships that extend beyond the formal Forum venues, which expands their ability
to share ideas, insights, practices, and suggestions. Many of these exceptional learning leaders have
worked together for years and are open to constant experimentation. Thanks to the group’s extensive
comradery, newcomers are always welcomed with enthusiasm.
This book would not have been possible without this tightly connected community. For that
reason, our first acknowledgment of gratitude is to all Forum members and leaders—over the past 29
years you have built a structure that allows this learning community to thrive. Our members come
from a huge array of industries, sizes, and locations. Under the umbrella of talent development, they
serve in a variety of roles. Each contributor has gifts they share within the Forum community, and
we are so grateful and proud.
Our list of acknowledgments is long; more than 50 individuals from Forum member companies
played a role in some aspect of the book's development. Most of the contributing chapter authors had
never written for public consumption, but they had a desire to share and the willingness to experi-
ment and give it a go. Once they volunteered, we did what the Forum always does—collectively we
created a working project plan and then tweaked it along the way.
When we began developing the content framework, we decided to give the authors freedom
within that framework to write in their own style and use their own terminology to capture their
experiences and stories. However, this freedom to write also presented challenges for some of our
first-time writers. One question we heard a lot at the beginning of the process was, “What should a
chapter look like?” Knowing that seeing a draft chapter would help to kick-start the writing process,
we asked for volunteers to work with us to quickly draft a few chapters. Jerry Kaminski volunteered
immediately. And later, when we bumped into roadblocks, we were fortunate enough to have the
assistance of writer Chris Connors who helped ensure we met our deadline. We are grateful to both
of them.
During the first draft writing phase we held weekly hour-long coaching sessions, which were
divided into three basic parts: discussing the current status of the book, sharing writing sugges-
tions and what each author was doing, and asking questions and following up on details. We were
constantly amazed by how the participants all supported and helped one another. Someone would
ask a question and others would offer suggestions, whether that was sharing resources and ideas or
even collaborating on a chapter. Each web session was followed by an email summary, and all docu-
ments were posted in a Dropbox folder. This not only helped with the actual writing process, but also
kept the energy high and built an esprit de corps within the group.
Since the essence of the book is leadership and what it looks like in practice, we decided that this
would not be an authored chapter or two, but a broader collection of practices and actions. Thus,
another major member contribution came from those who were willing to be interviewed by Chris
Connor about their leadership practices—Alissa Weiher, Carmen Reynolds, Chris Holmes, Cory
Bouck, Heather Durtschi, Jay Erikson, Jeremy Jones, Marguerite Samms, Randall Gross, Suzanne
Frawley, and Terence Morley. This group of 11 senior learning leaders was very open—they shared
examples of not only how they led their teams, but also how they continued to build their personal
leadership skills. The stories they told and the wide variety of tools, practices, and techniques they
use revealed that leading is a personal, continuous, and never-ending journey.
And still there were other Forum members who wanted to contribute to the book, but did not
have the bandwidth to commit to writing as an author. Once we realized what the review process
would require and the breadth of the subject matter, we again asked for volunteers. In true Forum
fashion, we ended up with 12 reviewers: Amanda Gunter, Chad Peters, Chanda Binkley, Douglas
Holt, Emily Isensee, Jennifer Chung, Kendall Mealy, Kevin Metsers, Lisa Gary, Lucinda Ehlen,
Michael Bolen, and Richard Coco. The work they did was critical in providing clarity of content and
in keeping to our very aggressive timeline.

By the time we held our 2019 Fall Lab, which focused on leveraging strategic tools, we had
already finished writing the manuscript and had turned it over to our developmental editor. However,
the case studies shared during the event were so thought provoking that we decided they needed
to be included in the book. The presenters—Jill Carter, Mark Lemon, Susana Sipes, and Taylor
Harlin—quickly agreed to be interviewed. We were able to take the case summary and presentation
documents they shared at the lab, along with their interviews, and craft a new chapter. This became
chapter 5.
Members who have joined the Forum since the book was written are interested in helping to
spread the word and carry the book forward to the next phase. As with all major projects, writing
a book takes a variety of strengths and much discipline, and the whole is definitely greater than the
sum of the individual parts. The magic behind this project is the synergy that exists within the Forum
community, which continuously connects, collaborates, and shares. Just like the excitement we have
when members share their business challenge stories in a case study or at a lab, we are thrilled with
this body of knowledge and expertise that we have created to share with all talent professionals,
especially those leading the learning profession. But most importantly, we are appreciative of all the
contributing members and the work they did to make this happen.
In addition to the Forum members, there were others who played a critical role. First and fore-
most was Tim Ito, our boss at the time, and former vice president for content and marketing at ATD.
Tim understood and appreciated the essence of connecting, collaborating, and sharing within a
consortium. He also recognized the unique elements of the lab’s experiential “skunk-works” type
of design model. But most of all he grasped the wealth of knowledge and experiences the Forum’s
members have and so willingly share. He initiated the development of member case studies and
worked with the ATD Research team to make them a reality. When the need arose for a book cover-
ing the wide gamut of functions in the learning space, Tim knew where to go. Once we said yes, he
served as a catalyst, a coach, and a confidant.
As with all writing projects at ATD, our “go-to” person was the recently retired Pat Galagan.
She was excited about the project, providing ideas, samples, and a connection to the “queen” of
talent development publishing—Elaine Biech. Elaine provided even more resources and hints. To
these two we are most grateful. As the book progressed, the entire ATD Press staff provided exper-
tise to enable a publication-ready book on schedule.
The last shout out of gratitude is to our endorsers. When we requested their support, the world
was normal, albeit within the VUCA construct, and our lives had a semblance of order and struc-
ture. By the time they received the uncorrected proofs to review, the world had turned upside down
with the COVID-19 crisis. We truly appreciate that they recognize the work learning practitioners
do on a daily basis to enable organizations to build performance capability and honored it with their
support, even in this time of extreme disruption.

We appreciate everyone who contributed to this project—more than 50 percent of Forum
member companies played a role. One of the sayings in the Forum is that we are always working
to get better at getting better. This book is for the entire talent profession. The stories, examples,
and actions will enable them to continuously get better at getting better with leading and guiding
others. It will help them create and support work environments that are ahead of the curve, allowing
learning professionals to become masters at advising and guiding business leaders on changes that
influence the future of the organization—building people capability. An army of contributors made
this book possible and validated their professionalism and interest in helping others to get better at
leading and managing the learning function. To each and every one, we are extremely grateful and
honored to be on your team.

—MJ and Laleh

Appendix 1
Suggestions for Professional
Organizations

Resources for Staying Current


If you have a team member who is focused exclusively on crafting e-learning content, groups like the
eLearning Guild or the Online Learning Consortium may be helpful. If you are leading a TD organization
and need greater breadth, an association like the Association for Talent Development (ATD), which has
local chapters in many cities, might work better.
Other professional organizations include:
• Training Industry (trainingindustry.com)
• i4cp (Institute for Corporate Productivity; i4cp.com)
• CLO (Chief Learning Officer; chieflearningofficer.com)
• SHRM (Society for Human Resource Management; shrm.org)

Branding Learning Through Professional Recognition


In the learning arena there are several ways to do this, including:
• Applying for an ATD BEST Award (td.org/best-awards)
• Applying for CLO's LearningElite (learningelite.clomedia.com)
• Applying for Training magazine's Training Top 125 (trainingmag.com/training-magazine-ranks-2019-training-top-125-organizations)
Additionally, the Malcolm Baldrige Performance Excellence criteria and process are geared toward
the organizational level (msqpc.com/business-solutions/baldrige-assessment).

Appendix 2
Technology Platforms and Systems

Because there are rapid changes in technology every couple of years, it is important to consult a variety
of vendors that provide these types of tools and solutions. The following are examples of various systems
that are available today.

Learning Suites
Note that many of these offer cloud-based suites as well as on-site versions. Examples include:
• Blackboard • Oracle (Taleo) • SAP (Success Factors)
• Cornerstone OnDemand • Oracle (iLearning) • Skillsoft (SkillPort)
• Grovo • PeopleFluent • Skilljar
• Halogen • Plex • SumTotal
• MS SharePoint • Saba • Travantis CourseMill

Cloud-Based Suites
Cloud-based suites generally reside outside your systems and are accessed using an Internet connection.
The software is generally purchased through a subscription, which means that ownership and all updates
are the responsibility of the vendor that is providing the cloud-based solution. Some examples include:
• Absorb LMS • Docebo • LearnUpon
• Adminstrate • Dokeos • Litmos
• Adobe Captivate Prime • Expertus One • Matris LMS
• Adobe Connect • Fuse Universal • MS SharePoint LMS
• Adobe Experience Manager • G-Cube LMS • Opigno
• Agylia • GnosisConnect The Academy LMS • Propsel Enterprise & Distribute
• Blackboard • Grovo • SeerTEch
• Canvas • Halogen • Talent LMS
• Cornerstone OnDemand • ILIAS

Open-Source Systems
Note that open-source systems are also gaining in popularity. They are all online, and many provide free
learning opportunities. Open-source systems allow you to enter markets with less initial capital outlay,
but they may require considerable resources to maintain and update. Be cautious when pursuing them.
Examples include:
• Eliademy • ILIAS • OpenEdX
• Forma.LMS • Moodle • Opigno

Appendix 3
More Tools and Techniques
for Enhancing Personal
Leadership Capability

The tools and techniques used by the array of leaders in this book provide ideas for “getting better at
getting better.” Start by looking at and assessing the hand you were dealt. Understand how you show
up to others. Be clear on your personal best leadership experiences and leverage them to build further
capabilities. Know your point of view on leadership and model it in your daily work and life. Intentionally
seek feedback, take time for personal reflection, use a coach, and be a coach. Journal to capture ideas and
turn the lessons learned into stories, blogs, and podcasts for others. Leverage your network, be curious,
and ask questions. And then really stretch yourself to be “the little engine that could” and write a chapter
for a book.
This leadership journey activity is a great way to get you started. It’s adapted from Noel Tichy’s work-
shops and his book, The Leadership Engine (1997), and also incorporates ideas from Learning Leadership by Jim
Kouzes and Barry Posner (2016).

Leadership Journey: A Timeline Continuum and Personal Best Leadership Log

Context
Understanding the principles behind leadership and being a leader in action are two very different things.
Unfortunately, some get them confused and think that knowing and doing are the same. Leadership
opportunities happen daily in our homes, schools, offices, places of worship, and communities. Some are
formal and some are informal. Some are planned and some are impromptu. Thinking about and docu-
menting experiences where you have actually demonstrated leadership behaviors can inform you not only
of your leadership skills and behaviors, but also on the impact of your leadership actions on other people;
the project, task, or activity; and, ultimately, the organization.

Purpose
The purpose of this activity is to document and diagram key leadership activities in your life in two ways.
The first is a continuum over time from an important starting point. While the time may be from your
earliest experience to today, it may be just the last 10 or 15 years. Whenever the timeline starts, indicate the
year(s), such as teen years, 20s, 50s, 1990–1995, or 1995–2000, and so on. The second way this organizer
documents key leadership experiences is by impact, from high to low, or by personal experience, from
positive to negative.

Process
1. Using the Leadership Journey Timeline Continuum Template and starting at an appropriate time in
your life, fill in the years in the first row of each column, dividing them between the year you chose
and today. Write the leadership activity in the second row of each column; intentionally place the
activity somewhere between the top and bottom of the row, depending on whether it was a positive
(high impact) or negative (low impact) experience. The words should tell what you did and the results.
For example, someone starting at age 13 might list “Eagle Scout” as a high impact experience. There
is also a row for notes and references to trigger memories or to make other relevant comments.
2. After documenting the leadership activities in each column, write a summary takeaway statement
about your leadership in the last row. (This is something you learned from the process of outlining
your experiences.)
3. Review your Leadership Journey Continuum Timeline. Select one experience that can be labeled
your “personal best” and use it to document the Personal Best Leadership Log.
4. The Personal Best Leadership Log provides an opportunity to unpack a leadership experience with
the goal of delving deeper into how and why it worked, as well as any lessons learned that can be
repeated in the future. Respond to the questions in the Personal Best Leadership Log using the left-
hand column only. When you go back and review the log later, respond in the right-hand column.

Leadership Journey Timeline Continuum Template


Year or Age | Year or Age | Year or Age | Year or Age | Year or Age
Positive/High Impact
Negative/Low Impact
Notes and References
Summary Takeaway Statement:

Personal Best Leadership Log
Name:
Topic: Gaining a deeper understanding of a personal best leadership experience
Instructions: Review your Leadership Journey Continuum Timeline. Select one experience that can be labeled your
“personal best.” Respond to the questions in a journal format using the left-hand column. At a later time, respond to the
original questions or to your original responses in the right-hand column.
Date of original comments: Date of follow-up comments:
• Label the experience. When and where did it happen? What was the situation or context? Who was involved?
• What specific tasks were involved in the situation?
• What actions did you take? Describe what you did personally, not the actions of others.
• What were the results of your actions, especially as they relate to the performance and development of others? What did you accomplish? What was the end result?
• How might you use this experience to lead in the future?

General Notes and Commentary:

References and Resources

Accenture. n.d. “Innovation Architecture.” accenture.com/us-en/innovation-architecture.


American Society for Training & Development (ASTD). 2014. ASTD Handbook: The Definitive Reference for
Training & Development. Alexandria, VA: ASTD Press.
Andreatta, B. 2019. Wired to Grow 2.0: Harness the Power of Brain Science to Learn and Master Any Skill. Santa
Barbara, CA: 7th Mind Publishing.
Arets, J. 2018. “5 Myths About the 70:20:10 Reference Model.” 70:20:10 Institute, October 1.
702010institute.com/5-myths-702010-reference-model.
Arets, J., C. Jennings, and V. Heijnen. 2016. 702010 Towards 100% Performance. Maastricht: Sutler Media.
Arets, J., C. Jennings, and V. Heijnen. 2019. “What Is the 70:20:10 Model?” 70:20:10 Institute, March 6.
702010institute.com/702010-model.
Association for Talent Development (ATD). 2015. Aligning for Success: Connecting Learning to Business
Performance. Alexandria, VA: ATD Press.
Association for Talent Development (ATD). 2017a. Chief Talent Development Officers: Driving Strategy and
Performance. Alexandria, VA: ATD Press.
Association for Talent Development (ATD). 2017b. Microlearning: Delivering Bite-Sized Knowledge.
Alexandria, VA: ATD Press.
Association for Talent Development (ATD). 2018a. Needs Assessments: Design and Execution for Success.
Alexandria, VA: ATD Press.
Association for Talent Development (ATD). 2018b. Personalized and Adaptive Learning: Shaping Employee
Development for Engagement and Performance. Alexandria, VA: ATD Press.

Association for Talent Development (ATD). 2019a. Effective Evaluation: Measuring Learning Programs for
Success. Alexandria, VA: ATD Press.
Association for Talent Development (ATD). 2019b. IBM: Reaching Today’s Learner With AI. Case Study.
Alexandria, VA: ATD Press.
Association for Talent Development (ATD). n.d. “Talent Development Capability Model.” ATD.
td.org/capability-model/access.
Ausubel, D. 1963. The Psychology of Meaningful Verbal Learning. New York: Grune & Stratton.
Bain & Company. n.d. “The RAPID Decision Making Model.” bain.com.
Baldrige. n.d. “Baldrige Excellence Framework (Business/Nonprofit).” National Institute of Standards and
Technology (NIST). nist.gov/baldrige/publications/baldrige-excellence-framework/businessnonprofit.
Baldrige. n.d. “Baldrige Organizational Profile.” National Institute of Standards and Technology
(NIST). nist.gov/baldrige/baldrige-organizational-profile.
Baldrige. 2018. “2019-2020 Baldrige Excellence Framework and Criteria (Business/Nonprofit) Now
Available.” Press Release. National Institute of Standards and Technology (NIST), December 18.
nist.gov/news-events/news/2018/12/2019-2020-baldrige-excellence-framework-and-criteria
-businessnonprofit-now.
Bersin, J. 2010. “How to Build a High Impact Learning Culture.” Josh Bersin (blog), June 14. joshbersin
.com/2010/06/how-to-build-a-high-impact-learning-culture.
Biech, E. 2007. Thriving Through Change: A Leader’s Practical Guide to Change Mastery. Alexandria, VA: ATD Press.
Biech, E., ed. 2014. ASTD Handbook: The Definitive Reference for Training & Development, 2nd ed. Alexandria,
VA: ASTD Press.
Blanchard, K. 1985. Leadership and the One Minute Manager. New York: Morrow.
Block, P. 2005. Flawless Consulting: A Guide to Getting Your Expertise Used, 3rd ed. San Francisco: Pfeiffer.
Bouck, C. 2013a. The Lens of Leadership: Being the Leader Others Want to Follow. Self-published.
Bouck, C. 2013b. “Tools of the Trade.” T+D magazine, May. td.org/magazines/td-magazine/tools-of-the-trade.
Bresciani Ludvik, M.J., ed. 2016. The Neuroscience of Learning and Development. Dulles, VA: Stylus Publishing.
Brinkerhoff, R.O., and A.M. Apking. 2001. High Impact Learning: Strategies for Leveraging Performance and
Business Results from Training Investments. New York: Basic Books.
Broad, M., and J.W. Newstrom. 1992. Transfer of Training: Action Packed Strategies to Ensure High Payoff From
Training Investments. New York: Basic Books.
Brown, P.C., H.L. Roediger III, and M.A. McDaniel. 2014. Make It Stick: The Science of Successful Learning.
Cambridge, MA: The Belknap Press of Harvard University Press.
Bryan, L. 2008. “Enduring Ideas: The McKinsey 7-S Framework.” Podcast. McKinsey Quarterly, March.
mckinsey.com/business-functions/strategy-and-corporate-finance/our-insights/enduring-ideas-the
-7-s-framework.

Bughin, J., E. Hazan, T. Allas, K. Hjartar, J. Manyika, P.E. Sjatil, and I. Shigina. 2019. “‘Tech for Good’:
Using Technology to Smooth Disruption and Improve Well-Being.” McKinsey Global Institute, May.
mckinsey.com/featured-insights/future-of-work/tech-for-good-using-technology-to-smooth
-disruption-and-improve-well-being.
Buss, L. 2016. “Group Flow: Don’t Let Time Constraints Curb Creativity and Engagement.” ATD Insights,
November 1. td.org/insights/group-flow-dont-let-time-constraints-curb-creativity-and-engagement.
Caporale, B. 2015. Creative Strategy Generation: Using Passion and Creativity to Compose Business Strategies That
Inspire Action and Growth. New York: McGraw-Hill Education.
Carroll, R. 2018. The Bullet Journal Method: Track the Past, Order the Present, Design the Future. New York: Portfolio.
Carson, B. 2017. Learning in the Age of Immediacy: 5 Factors for How We Connect, Communicate, and Get Work
Done. Alexandria, VA: ATD Press.
Chambers, D., K. Rabren, and C. Dunn. n.d. “Dr. Deming’s 14 Points for Management.” The
W. Edwards Deming Institute. deming.org/explore/fourteen-points.
Chamorro-Premuzic, T., and M. Swan. 2016. “It’s the Company’s Job to Help Employees Learn.”
Harvard Business Review, July 18. hbr.org/2016/07/its-the-companys-job-to-help-employees-learn.
Chen, L., and R. Hutchinson. 2017. “Stop Training and Start Learning.” TD magazine, April.
td.org/magazines/td-magazine/stop-training-and-start-learning.
Cole, M. 2017. “Just How Micro Is Microlearning?” ATD Insights, March 23. td.org/insights/just-how
-micro-is-microlearning.
Cook, J. 2017. “How Can You Stay Current in Your Field When Work, Jobs, and Even Professions Are
Constantly Changing?” Training, February 11. trainingjournal.com/articles/feature/how-can-you
-stay-current-your-field-when-work-jobs-and-even-professions-are.
Covey, S. 1994. The 7 Habits of Highly Effective People: Powerful Lessons in Personal Change. New York: Simon
and Schuster.
Craig, A., and D. Yewman. 2013. Weekend Language: Presenting With More Stories and Less PowerPoint. Portland,
OR: DASH Consulting.
Cross, R., T.H. Davenport, and S. Cantrell. 2003. “The Social Side of Performance.” MIT Sloan
Management Review, October 15. sloanreview.mit.edu/article/the-social-side-of-performance.
Cross, R.L., and R.J. Thomas. 2009. Driving Results Through Social Networks: How Top Organizations Leverage
Networks for Performance and Growth. San Francisco: Jossey-Bass.
Davachi, L., T. Kiefer, D. Rock, and L. Rock. 2010. “Learning That Lasts Through AGES.” The
NeuroLeadership Journal 3.
Davenport, R. 2006. “Uncommon Sense: View From the Learning Executive.” LX Briefing, 1:3.
marshallgoldsmith.com/wp-content/uploads/2015/10/ASTD_2006.pdf.
Davis, J., M. Balda, D. Rock, P. McGinniss, and L. Davachi. 2014. “The Science of Making Learning
Stick: An Update to the AGES Model.” The NeuroLeadership Journal 5, August.

Deloitte. 2015. “Learning Transformation.” Infographic, June 19. deloitte.com/content/dam/Deloitte
/global/Documents/HumanCapital/gx-cons-hc-learning-transformation-placemat.pdf.
Deming, W.E. 2000. Out of the Crisis. Cambridge, MA: MIT Press.
Denning, S. 2018. Age of Agile: How Smart Companies Are Transforming the Way Work Gets Done. New York:
American Management Association.
Dineen, S. 2016. “Why Do LMSs Fail? An Engagement Issue Explained | Steve Dineen, Fuse Universal
CEO.” Video. Fuse Universal, July 14. youtube.com/watch?v=vJmQUy6lV2c.
Drucker, P. 2010. The Five Most Important Questions Self-Assessment Tool: Participant Workbook, 3rd ed. San
Francisco: Jossey-Bass.
Duggan, W. 2013. Creative Strategy: A Guide for Innovation. New York: Columbia University Press.
Edmondson, A.C. 2019. The Fearless Organization: Creating Psychological Safety in the Workplace for Learning,
Innovation, and Growth. Hoboken, NJ: John Wiley & Sons.
Ericsson, A.K., and R. Pool. 2016. Peak: Secrets from the New Science of Expertise. Boston: Houghton
Mifflin Harcourt.
Expert Program Management (EPM). 2018. “RAPID Decision Making Model.” EPM, April 23.
expertprogrammanagement.com/2018/04/rapid-decision-making-model.
Fritz, R. 2011. The Path of Least Resistance for Managers. Newfane, VT: Newfane Press.
Fyfe-Mills, K. 2018. “Digital Badge Program at IBM.” ATD Insights, July 2. td.org/insights/digital
-badge-program-at-ibm.
Galagan, P., M. Hirt, and C. Vital. 2019. Capabilities for Talent Development: Shaping the Future of the Profession.
Alexandria, VA: ATD Press.
Gallup. n.d. “Clifton Strengths.” CliftonStrengths Online Talent Assessment. gallup.com/cliftonstrengths.
Gerard, B. 2019. “Immersion, Learning and the Importance of Trust.” Accenture, Future Workforce Case
Study, January 22. accenture.com/us-en/case-studies/future-workforce/case-study-immersion-learning.
Gerard, B., D. Koch, and J. Gittleson. n.d. Learning Geeks. Podcast. learninggeekspod.com.
Gilbert, T. 1978. Human Competence: Engineering Worthy Performance. New York: McGraw-Hill.
Glotz, S. 2014. “A Closer Look at Personas: What They Are and How They Work.” Smashing
Magazine, August 6. smashingmagazine.com/2014/08/a-closer-look-at-personas-part-1.
Godin, S. 2019. “How Much Is That Piece of Paper in the Window?” Seth’s Blog, October 4.
seths.blog/2019/10/how-much-is-that-piece-of-paper-in-the-window.
Gottfredson, C., and B. Mosher. 2012. “Are You Meeting All Five Moments of Learning Need?”
Learning Solutions, June 18. learningsolutionsmag.com/articles/949/are-you-meeting-all-five
-moments-of-learning-need.
Gray, D., S. Brown, and J. Macanufo. 2010. Gamestorming: A Playbook for Innovators, Rulebreakers, and
Changemakers. Cambridge: O’Reilly Media.

Hackett, J.P. 2007. “Preparing for the Perfect Product Launch.” Harvard Business Review, April.
hbr.org/2007/04/preparing-for-the-perfect-product-launch.
Hall, MJ. 2014. Designing WorkLearn Networks: Making Magic Happen With Your Profession. Lake Placid, NY:
Aviva Publishing.
Hall, MJ. 2017. “Mapping a Journey to Inform the Future.” TD, February. td.org/magazines/td-magazine
/mapping-a-journey-to-inform-the-future.
Hall, MJ. 2018a. “Know Your Stakeholders.” TD, February. td.org/magazines/td-magazine/know-your
-stakeholders.
Hall, MJ. 2018b. “Personas: Designing Personalized, Learner-Centric Experiences.” ATD Insights,
October 31. td.org/insights/personas-designing-personalized-learner-centric-experiences.
HCM Advisory Group. 2012. “Transforming Learning into a Strategic Business Enabler.” HCM
Advisory Group, October. cedmaeurope.org/newsletter%20articles/Clomedia/Transforming%20
Learning%20into%20a%20Strategic%20Business%20Enabler%20(Oct%2012).pdf.
Hess, E.D. 2011. Learn or Die: Using Science to Build a Leading-Edge Learning Organization. New York:
Columbia Business School Publishing.
Horth, D.M. 2019. “Navigating Disruption With RUPT: An Alternative to VUCA.” Center for Creative
Leadership (blog), July 17. ccl.org/blog/navigating-disruption-vuca-alternative.
Horwath, R. 2015. “The Strategic Thinking Manifesto.” Strategic Thinking Institute, March 7. strategyskills
.com/pdf/The-Strategic-Thinking-Manifesto.pdf.
IBM. n.d. “What is an IBM Digital Badge?” IBM Skills Gateway. ibm.biz/badging.
IBM. 2018a. “Incumbents Strike Back: Insights From the Global C-Suite Study.” IBM Institute for Business
Value, February. ibm.com/downloads/cas/Y9JBRJ8A?mhsrc=ibmsearch_a&mhq=Incumbents%20
strike%20back%3A%20Insights%20from%20the%20global%20c-suite.
IBM. 2018b. “Plotting the Platform Payoff: Chief Executive Officer.” IBM Institute for Business Value,
May. ibm.com/downloads/cas/NJYY0ZVG?mhsrc=ibmsearch_a&mhq=Plotting%20the%20
platform%20payoff.
IBM Learning. n.d. “IBM Learning Channel on YouTube.” youtube.com/channel/UC9gOawWmee
QdFAE6oAHEscA.
Ideo.org. 2015. The Field Guide to Human-Centered Design. Stanford: Design Kit.
Ikeda, K., D. Zaharchuk, and A. Marshall. 2019. “Agility, Skills and Cybersecurity: Three Keys to
Competitiveness in an Era of Economic Uncertainty.” IBM Institute for Business Value, February.
Jarrett, C. 2018. “Learning by Teaching Others is Extremely Effective—A New Study Tested a Key
Reason Why.” The British Psychological Society, Research Digest, May 4. digest.bps.org.uk/2018/05
/04/learning-by-teaching-others-is-extremely-effective-a-new-study-tested-a-key-reason-why.
Jefferson, A., and R. Pollock. 2014. “70:20:10: Where Is the Evidence?” ATD Insights, July 8.
td.org/insights/70-20-10-where-is-the-evidence.

Jennings, C. 2016. “From Courses to Campaigns: Using the 70:20:10 Approach.” 70:20:10 Institute,
November 8. 702010institute.com/courses-campaigns-using-702010-approach.
Jung-Beeman, M., A. Collier, and J. Kounios. 2008. “How Insight Happens: Learning From the Brain.”
The NeuroLeadership Journal 1.
Kaufman, R., and I. Guerra-López. 2013. Needs Assessment for Organizational Success. Alexandria, VA: ASTD Press.
Keating, L.A., P.A. Heslin, and S.J. Ashford. 2017. “Good Leaders Are Good Learners.” Harvard Business
Review, August 10. hbr.org/2017/08/good-leaders-are-good-learners.
Kelley, T., and D. Kelley. 2013. Creative Confidence: Unleashing the Creative Potential Within Us All. New York:
Crown Business.
Kim, W.C., and R. Mauborgne. 2015. Blue Ocean Strategy: How to Create Uncontested Market Space and Make
the Competition Irrelevant. Boston: Harvard Business Review Press.
Kirkpatrick Partners. n.d. “The One and Only Kirkpatrick Company.” kirkpatrickpartners.com.
Kouzes, J.M., and B.Z. Posner. 2016. Learning Leadership: The Five Fundamentals of Becoming an Exemplary
Leader. San Francisco: Wiley.
Lafley, A.G., and R. Martin. 2013. Playing to Win: How Strategy Really Works. Boston: Harvard Business
Review Press.
Langford. n.d. “Experience Langford Quality Learning.” Langford International. langfordlearning.com.
Latreille-Phifer, A., R. Hutchinson, and T. Copley. 2016. “Technology Helps Put the Power of Learning
in the Employees’ Hands.” ATD Insights, September 21. td.org/insights/technology-helps-put-the
-power-of-learning-in-the-employees-hands.
Levenson, A. 2016. “Measuring and Maximizing the Impact of Talent Development.” TD at Work.
Alexandria, VA: ATD Press.
Lombardo, M.M., and R.W. Eichinger. 2000. The Career Architect Development Planner, 3rd ed.
Minneapolis: Lominger.
LUMA Institute. 2012. Innovating for People: Handbook of Human-Centered Design Methods. Pittsburgh:
LUMA Institute.
Mann, J., T. Austin, A. Walls, C. Rozwell, and N. Drakos. 2012. “Predicts 2013: Social and Collaboration
Go Deeper and Wider.” Gartner Research, November 28. gartner.com/en/documents/2254316
/predicts-2013-social-and-collaboration-go-deeper-and-wid.
Manorek, K. 2019. “The Three Pillars of Learning Governance.” Corporate Learning Trends, D2L,
February 1. d2l.com/corporate/blog/the-three-pillars-of-learning-governance.
Martin, R.L. 2017. “Strategic Choices Need to Be Made Simultaneously Not Sequentially.” Harvard Business
Review, April 3. hbr.org/2017/04/strategic-choices-need-to-be-made-simultaneously-not-sequentially.
McGoldrick, B., and D. Tobey. 2016. Needs Assessment Basics, 2nd ed. Alexandria, VA: ATD Press.

McGraw, M. 2019. “What Is Organizational Network Analysis? And How Does It Benefit Companies?”
The i4cp Productivity Blog, August 1. i4cp.com/productivity-blog/what-organizational-network
-analysis-is-and-how-it-benefits-companies.
Medina, J. 2014. Brain Rules: 12 Principles for Surviving and Thriving at Work, Home, and School. Seattle: Pear Press.
Mind Tools Content Team. 2009. “Deming’s 14-Point Philosophy: A Recipe for Total Quality.” Mind
Tools, February 7. mindtools.com/pages/article/newSTR_75.htm.
Mind Tools Content Team. 2018. “What Is Stakeholder Management?” Mind Tools, April 27.
mindtools.com/pages/article/newPPM_08.htm.
Mind Tools Content Team. 2019. “McKinsey 7-S Framework.” Mind Tools, June 5. mindtools.com/pages
/article/newSTR_91.htm.
Mind Tools Content Team. n.d. “Stakeholder Analysis.” Mind Tools. mindtools.com/pages/article
/newPPM_07.htm.
Needleman, S.E. 2019. “The Man Behind ‘Fortnite.’” Wall Street Journal, June 15. wsj.com/articles/the
-man-behind-fortnite-11560571201.
Norman, D.A. 2010. Living With Complexity. Cambridge, MA: MIT Press.
Open Badges. n.d. “Discover Open Badges.” openbadges.org.
Osterwalder, A., and Y. Pigneur. 2010. Business Model Generation: A Handbook for Visionaries, Game Changers,
and Challengers. Hoboken, NJ: John Wiley & Sons.
Ravanfar, M.M. 2015. “Analyzing Organizational Structure Based on 7s Model of McKinsey.” Global
Journal of Management and Business Research: Administration and Management 15(10): 7–12.
pdfs.semanticscholar.org/9fd1/4d415ed96b1dcafa9d84ddde97ecabe5dbda.pdf.
Richman, L. 2012. Improving Your Project Management Skills, 2nd ed. New York: AMACOM.
Ricketts, G., and B. Manville. 2004. “Governing the Learning Organization in an Era of Strategic
Human Capital Development and Management.” Saba and United Way. learninggovernance
.com/uploads/ASTD_Ricketts_Manville_HCM_Governance.pdf.
Rivers, R. 2007. WLP Scorecard: Why Learning Matters. Alexandria, VA: ASTD Press.
Robinson, D.G., and J.C. Robinson. 2011. The Strategic Business Partner: Aligning People Strategies With
Business Goals. San Francisco: Pfeiffer.
ROI Institute. n.d. “ROI Institute: Return on Investment.” roiinstitute.net.
Rossett, A. 1999. First Things Fast: A Handbook for Performance Analysis. San Francisco: Jossey-Bass.
Rumelt, R.P. 2011. Good Strategy, Bad Strategy: The Difference and Why it Matters. London: Profile Books.
Schellenger, V. 2015. “Stop and Think Questions.” LinkedIn Pulse, July 22. linkedin.com/pulse/stop
-think-questions-bob-tiede.
Shriver, S. 2018. The Four Moments of Truth. Cary, NC: Leadership Studies.

Skrobe, R. 2019. “A Simplified Approach to Design Sprint Problem Framing.” Medium, Dallas Design
Sprints, February 11. medium.com/dallas-design-sprints/a-simplified-approach-to-design-sprint
-problem-framing-65d3bf271693.
Smith, R.M. 2011. Strategic Learning Alignment: Make Training a Valuable Business Partner. Alexandria, VA:
ASTD Press.
Stanier, M.B. 2016. The Coaching Habit: Say Less, Ask More & Change the Way You Lead Forever. Toronto: Box
of Crayons Press.
Stodd, J. n.d. “Julian Stodd’s Learning Blog.” julianstodd.wordpress.com.
Story, J. 2019. “How to Use the McKinsey 7-S Model for Marketing.” Smart Insights, February 28.
smartinsights.com/marketing-planning/marketing-models/mckinsey-7s-model.
Suarez, J.G. 2014. Leader of One: Shaping Your Future Through Imagination and Design. Self-published.
Tauber, T., and W. Wang-Audia. 2014. “Meet the Modern Learner: Engaging the Overwhelmed,
Distracted, and Impatient Employee.” Bersin by Deloitte Research Bulletin, November 26.
legacy.bersin.com/uploadedfiles/112614-meet-the-modern-learner.pdf.
Taylor, K., and C. Marienau. 2016. Facilitating Learning With the Adult Brain in Mind. San Francisco: Jossey-Bass.
Tichy, N.M. 1997. The Leadership Engine: How Winning Companies Build Leaders at Every Level. New York:
Harper Business.
Torrance, M. 2018. “Demystifying Agile in Instructional Design.” ATD Insights, March 7.
td.org/insights/demystifying-agile-in-instructional-design.
Training Industry. 2012. “Federated Training Organizational Model.” Training Industry Glossary,
October 31. trainingindustry.com/glossary/federated-training-organization-model.
Udell, C., and G. Woodill. 2019. Shock of the New: The Challenge and Promise of Emerging Learning Technologies.
Alexandria, VA: ATD Press.
Vance, D. 2018. “Alignment Revisited.” Chief Learning Officer, April 10. chieflearningofficer.com/2018
/04/10/alignment-revisited.
Waters, T. 2017. “How to become an Agile Team.” Medium, July 12. medium.com/agile-in-learning
/how-to-become-an-agile-team-918e055f7d9d.
Wentworth, D. 2017. The New Wave of Digital Learning. Brandon Hall Group and Litmos, August. litmos
.com/wp-content/uploads/2017/11/ebook-The-New-Wave-of-Digital-Learning-BrandonHall.pdf.
Xiao, L. 2018. “A Guide to Problem Framing.” UX Planet, December 13. uxplanet.org/a-guide-to-problem
-framing-ae58713364ec.

About the Authors

Alan Abbott
Instructional Design Supervisor, UPS
Alan Abbott realized the importance of effective instructional design a decade ago when, as an adjunct
instructor at Indiana University Southeast, he was teaching a course for the second time in two semesters.
The first time he taught the class, some of the students failed to grasp the relationship between some
of the material. Without a complete schema, they struggled with some of the assignments. The second
time he taught the class, he kept the same assessments, assignments, and material, but rearranged how
he covered it. There was a full grade improvement across the entire class. Ever since, he’s been interested
in how educators can create and experiment with their learner-centric programs. These days, Alan’s an
instructional design supervisor at UPS in the talent management department. Previously, he’s worked as
an instructional designer and facilitator in the nonprofit and healthcare fields.

Alissa Weiher
Director, Talent Development, Cochlear
Alissa Weiher has progressed in a variety of roles in the learning and development industry for more than
15 years. She serves as the director of talent development at Cochlear, where she leads a team of talent
development professionals. As an expert in adult learning, she is versed in all aspects of the practice and
focuses on leadership development, organizational redesign, change management, and coaching. Alis-
sa received a master of professional studies in organizational and professional communications with a
concentration in organizational development, training, and learning from the University of Denver and
carries the SPHR certification.

J. Ann Quadagno
Principal Learning Strategist and Implementor, IBM
J. Ann Quadagno is a principal learning strategist and implementor for IBM Leadership, Learning &
Inclusion, and focuses on global programs to upskill and reskill internal and client employees. She specializes
in innovative ways to address business problems through learning solutions. She has been in the consult-
ing and learning design space for more than 20 years. She has an MS in instructional systems from Florida
State University.

Brandon Carson
Director of Learning, Delta Air Lines
Brandon Carson is the author of Learning in the Age of Immediacy (ATD Press, 2017). He is an award-win-
ning, innovative, and highly focused leader with a progressive track record of learning strategy and execu-
tion. He has extensive experience developing global learning strategies for companies such as Apple,
Microsoft, Yahoo, and Home Depot. He is currently the director of learning at Delta Air Lines. He holds
an MEd in learning technology and design and a BA in business communications, as well as advanced
ISPI certification in analysis. He resides in Atlanta, Georgia.

Bryan McElroy
Senior Manager of Learning and Development, Rush Enterprises
Bryan McElroy is the senior manager of learning and development for Rush Enterprises, the largest
network of commercial truck dealers in North America. Applying techniques learned working for more
than a decade in the highest levels of film, television, and audio production in Hollywood, California,
Bryan came to the L&D field focusing on creating compelling e-learning content. Having held many
leadership positions, Bryan currently leads a team of L&D professionals who strategically supply the
organization with all modes of training for all employees, from executives to high-potentials to new hires
and everyone in between.

Caroline Fernandes
Senior Learning Designer, IBM
Caroline Fernandes designs and develops learning solutions (face-to-face, live virtual sessions, and self-
paced offerings on various platforms) for IBM. Her projects have included designing and developing
employee enablement for the company’s new performance management discipline and technology,
enablement for industry sellers to build knowledge and selling skills for industries they serve, and programs
to build employee skills in areas such as boosting personal impact, time management, and thriving on
change. Caroline also leads a team of instructional designers, mentoring them on their learning consul-
tancy and instructional design skills. Prior to IBM, Caroline worked as a senior instructional designer in



Praxis Interactive Services, designing and developing learning solutions for K–12 education programs,
software training, and technology certification programs. Earlier, she worked at Lionbridge designing
courses for technology certification programs, soft skill development, and software training. Caroline has
a master’s in communication arts from New York Institute of Technology, and a BA in economics from
the University of Mumbai.

Casey Garhart
Senior Instructional Designer, IBM
Casey Garhart is a senior instructional designer with IBM’s Leadership, Learning & Inclusion group,
splitting her time between work in diversity and inclusion and courses for technical leaders. Her work
encompasses face-to-face and virtual classes, online learning, and ongoing activities designed to support
learning. Some of her courses in the area of diversity and inclusion are available outside IBM. Casey has
been designing interactive instruction for more than 30 years—starting with interactive videodisc in 1978.
She has a PhD in instructional design from Penn State University and has taught classes at Penn State,
American University, and the University of Wisconsin-Madison. Her work prior to joining IBM included
creating materials for the hearing impaired, simulations for the military, and support for management
consultants.

Catherine Rickelman
Senior Talent and Learning Solution Architect, IBM
Catherine Rickelman is a senior talent and learning solution architect for IBM Leadership, Learning &
Inclusion. She has architected enterprise-wide solutions for IBM’s onboarding and recruitment programs
for newly hired employees and executives, management and leadership development, diversity and inclu-
sion offerings, and technical reskilling priorities. Catherine has been in the consulting and learning design
space for more than 20 years. She has an MBA in international business from Rollins College.

Chris Garton
Learning and Development Manager, Asurion
Chris Garton leads a team of learning and development specialists focused on delivery of training at
Asurion in Houston, Texas. He partners with operations, workforce, design, and project teams to align
resources and drive success among their newest employees. Over his past seven years with Asurion, Chris
has gained experience at multiple levels and won the prestigious Superhero Award in 2016 for his dedica-
tion. He earned a BS in materials science and engineering from the University of Florida. While originally
pursuing a career in the engineering space, he found that his passion for training and the positive culture
at Asurion were too good to pass up. Chris enjoys nearly any outdoor activity, and you will frequently find
him camping, hiking, working out, or traveling with his family.



Dana Alan Koch
Institute for the Applied Learning Sciences Lead, Accenture
Dana Alan Koch has more than 30 years of experience in learning and talent development. In recent
years he has focused on blending the best of cognitive science, brain science, and instructional science to
build better learning programs, better learners, and better learning professionals. His team has innovated
with immersive learning technologies, immersive game design, wearables for learning, AI content domain
mapping, and chatbots for learning. Dana has been an integral part of the company’s work in durable
learning, building better learners, and learning in the future research. He holds several patents for his
work in learning. Dana has a BA from Brigham Young University in organizational communications and
an MA from Northwestern University’s Institute for the Learning Sciences. He is a frequent presenter at
conferences. Dana was a contributing author to Big Learning Data (ATD Press, 2014) and ATD’s Foun-
dations of Talent Development and ATD’s Action Guide to Talent Development (ATD Press, 2018). He served as
chair of the ATD Forum Advisory Group in 2013 and 2014 and served on both the ATD Public Policy
Council and the editorial board of CTDO magazine. He regularly joins two colleagues—Bob Gerard and
Jake Gittleson—to discuss learning topics on the Learning Geeks podcast. Dana is the proud father of three
beautiful daughters and lives in St. Charles, Illinois, with his wife, Julie.

David McGrath
Senior Manager, Consulting Services, Grainger
David McGrath works with Grainger’s largest customers helping them reduce their total cost of oper-
ations by improving productivity. This is achieved by applying his continuous improvement skills (Six
Sigma Green Belt) and user design expertise. He works with a talented team to innovate and develop new
products that are implemented across the organization. Due to his passion for creativity and new prod-
uct development, he is currently completing an MS in product design and development management at
Northwestern University’s McCormick School of Engineering. His background in talent development,
marketing, and ecommerce allows him to bring a broad perspective to any situation. Some highlights of
his work in talent development include building a change management practice, program development
and strategy (such as sales, product knowledge, and ecommerce), and international projects in South
America and Asia. When he’s not working David enjoys adventure and traveling, including high-altitude
mountaineering.

Elizabeth Huttner-Loan
Senior Learning Designer, IBM
Elizabeth Huttner-Loan joined IBM in July 2019 as a senior learning designer, where she has worked on
projects related to employee resilience and leadership development. Previously, Elizabeth was a Digital
Learning Lab fellow with the MIT Office of Open Learning, and served as senior manager for online
course development for the MIT Teaching Systems Lab. She designed and managed MOOCs pertaining



to educational transformation and technology. Elizabeth values creating meaningful online educational
experiences with an emphasis on active learning. Before MIT, Elizabeth worked at the American Acade-
my of Arts and Sciences. She holds a BA in government from Claremont McKenna College and an EdM
in technology, innovation, and education from the Harvard Graduate School of Education.

Emily Isensee
Senior Learning Manager, Tableau
Emily Isensee works on the sales enablement team at Tableau, driving the learning and leadership
development strategy for its sales team. Previously, she worked for Brave Leaders, Brené Brown’s online
learning community, helping organizations integrate e-courses into their talent development strategy to
create braver leaders and more courageous cultures. She also worked on the learning and development
team at the Bill and Melinda Gates Foundation, where she managed a foundation-wide, award-winning
leadership development program. Emily’s areas of expertise include leadership development, facilitation
and training, sales enablement, change management, team development, curriculum design, employee
onboarding, and program management and evaluation. She has an MS in organization development
from the Graziadio School of Business and Management at Pepperdine University and is currently pursu-
ing a coaching certification through Co-Active Training Institute (CTI).

Graham E. Johnston
Leader—Development Strategy & Innovation, Deloitte
Graham Johnston leads the development strategy and innovation team within Deloitte’s development
and performance function, managing various initiatives that advance L&D for Deloitte professionals.
As a learning strategist and performance consultant, he enables innovative, human-centered, and holis-
tic development experiences that address business needs and drive individual, team, and organizational
performance. Graham also provides consultative services and thought leadership to the development and
performance teams to support their strategies and solutions. He serves as Deloitte’s lead representative on
several learning and development groups, and is a regular speaker at related industry conferences. Graham
was previously the talent development leader for Deloitte’s federal practice, and prior to that was in
Deloitte Consulting’s human capital practice, providing talent management, organization development,
workforce planning, and change management solutions for clients. He lives in Maryland just outside
Washington, D.C., with his wife, son, and daughter.

Jerry Kaminski
Manager, Learning and Development; Instructional Design and Vendor Management, Consumers Energy
Jerry Kaminski leads the instructional design team and vendor management at Consumers Energy. He
has more than 35 years of talent and organization development experience in a multitude of industries,



including information technology, retail, government, consulting, automotive, and currently, energy util-
ities. Jerry started his career as an anti-submarine-warfare technician in the U.S. Navy, moving from that
role to instructor, and ultimately a program development director for the learning command. He has also
held instructional designer, e-learning manager, learning architect, and chief learning officer positions. He
holds undergraduate and graduate degrees in instructional design and CPT-ISPI, SPHR, SHRM-SCP,
and CSSGB certifications. He is a four-time winner of an ATD BEST award. When not doing talent
development activities, he can be found either restoring his nearly 100-year-old home or racing sailboats
on the Great Lakes.

Joan McKernan
Senior Learning Consultant, IBM
Joan McKernan is a learning professional with more than 25 years of experience and skills in meeting the
strategic business needs of talent development. She has a strong background in strategic planning for learn-
ing solutions, all phases of custom learning development, and creating leadership solutions. Her demon-
strated experience includes working with internal and external global clients; working across commercial,
government, and military industries; building development programs for a variety of talent audiences,
including professionals, leaders, managers, and executives; applying the latest methods in instructional
design, neuroscience, and agile practices; and employing a wide variety of learning formats. Roles in her
learning career have included project manager, learning consultant, leadership development consultant,
instructional designer, learning scientist, and team lead. Other roles include assessment coach and general
learning mentor. She has also served as a facilitator and presented at national conferences.

John Kelly
Talent Development Lead, Grainger
John Kelly has more than 30 years of experience in organization development, coaching, training, and
strategic business partner transformation for human resources and other staff functions. He has 20 years
of experience as an internal consultant for organizations in the healthcare, manufacturing, and technical
fields. He also worked as an external consultant helping a variety of corporations, agencies, and schools
collaborate better and more effectively develop their leaders. He holds a BS in psychology from Xavier
University in Cincinnati and is working on a master of organizational behavior at Benedictine University in
Chicago. John also co-leads the Northwest Chicago Organization Development professional development
network. He has experience in change management, culture assessment, HR transformations, coaching,
strategic planning, performance consulting, and various other organization development specialties.



Kozetta Chapman
Training and Development Manager, American Airlines
Kozetta Chapman is a learning professional who collaborates with others to develop strategies that are
effective, innovative, and transformative. In her current role, she is responsible for leading the technical
operations training group, which creates curriculum, videos, and immersive technology components for
team members. Kozetta has 20 years of experience in training design, facilitation, leadership develop-
ment, change management, and inclusion and diversity. She has a bachelor’s degree in sociology and a
master’s degree in human relations from the University of Oklahoma. Kozetta is passionate about helping
others succeed and is heavily involved in mentoring, professional development, and coaching. Kozetta and
her husband, Darius, reside in Texas where they enjoy relaxing with family and friends.

Laura Solomon
Senior Leadership Learning Designer, IBM
Laura Solomon designs and develops leadership learning solutions (face-to-face, online, and virtual) as
part of a global design team for IBM’s 40,000 emerging leaders, managers, and executives. Laura’s recent
projects included designing and developing a corporate-wide initiative on positive leadership; transition
programs for IBM’s newly appointed vice presidents, managers, and middle managers; a microlearning
series; an engagement toolkit; and a peer-to-peer development program on building trust-based relation-
ships. Prior to IBM, she was an assistant vice president at Merrill Lynch, developing and managing leader-
ship development initiatives for high-potential managers that included assessment, individual development
plans, job rotations and mentoring. Before that, she was manager of training and development at Zany
Brainy and Staples. Laura has a master’s in education, instructional design, from the University of Massa-
chusetts, Boston, and a master’s of professional studies in art therapy from Pratt Institute in New York.

Leanne Drennan
Senior Consultant, Talent Development Strategy and Design, IBM
Leanne Drennan is a senior consultant for talent development strategy and design at IBM Leadership and
Learning. She leads the team creating distinctive development and learning experiences for IBM’s global
sales force, from onboarding new sales professionals to coaching sales executives. Focused on innovative,
work-based learning, she and her team enable sales professionals to create client value and exceed sales
targets. Her designs have improved the performance of IBM sales professionals and leaders around the
world, and received recognition from ATD, Brandon Hall, and ISPI. After a 15-year career in sales, sales
management, and sales operations, Leanne followed her passion to empower others to reach their maxi-
mum potential. She facilitated sales and client engagement education to a broad range of professionals in
Asia, Europe, and North America. She earned an MS in education from Northwestern University, with a
specialization in organizational behavior from the J.L. Kellogg Graduate School of Management.



Lisa Gary
Chief Learning Officer, Trane Technologies
Lisa Gary has more than 25 years of learning and development experience, working in financial, food,
and manufacturing industries. She is passionate about learning and always applies a business perspective
to the field. For the last six years, Lisa served as Ingersoll Rand’s CLO and was responsible for developing
enterprise strategic capabilities of leadership development, operational excellence, strategy and market
analytics, product management, innovation, and sales excellence. She was also responsible for ensuring
global learning strategies were aligned for Asia Pacific, EMEA, India, and Latin America. In March 2020,
Ingersoll Rand split into two publicly traded companies, Ingersoll Rand and Trane Technologies; Lisa
joined the Trane Technologies company as the CLO. She and her team are focused on building the next
set of strategic capabilities as well as a reskilling and upskilling strategy. Lisa served on the ATD Forum
Advisory Group in 2017-2018 and served as its chair in 2019. She has a BS in marketing from Virginia
Tech and an MBA from the University of North Carolina, Charlotte. Lisa is a NeuroLeadership Institute
Results Certified Executive Coach as well as a lifelong learner—tackling anything from blacksmithing to
clogging. Lisa is the proud mom of a grown daughter and son. She lives in Cowan’s Ford, North Carolina,
with her husband and two rescue mutts.

Marie Wehrung
Director, Talent and Organizational Development, Rice University
Marie Wehrung is the director of talent and organizational development at Rice University in Houston,
Texas, where she strategizes and oversees programs that drive organizational effectiveness and enable
organizational, team, managerial, and individual success. She consults with individuals and departments
to enhance workplace performance and provides professional development resources to meet organi-
zational needs. Marie has a bachelor’s degree in biological sciences from Smith College and a master’s
degree in human resources development from Texas A&M University–Central Texas. She’s especially
passionate about applying design thinking to the work she does and affecting lives through coaching. She’s
an active member of the ATD Forum, having served three years on its advisory group (one year as chair).
She appreciates having the opportunity to give back to colleagues in the field that she loves. Marie lives in
Houston with her husband, two daughters, and one cat.

Michelle M. Webb
Institute of Applied Learning Sciences Research Lead, Accenture
Michelle Webb has more than 18 years’ experience exploring how to improve the Accenture talent expe-
rience. Through her voracious appetite for learning and application, Michelle has had the opportunity
to research, experiment, create, and consult on a wide range of initiatives that have changed how people
live, learn, and work. Her research has dived into topics including durable learning, personalized learning,



learning pathways, blockchain, virtual reality, leadership development, and futurism research. Beyond
learning, Michelle is passionate about health and well-being, organization, reading, and spending time
with her family. She lives near Denver, Colorado, with her husband and blended family of five kids.

Rachel Hutchinson
Director, L&D, Head of Global Portfolio and Community Management, Hilti North America
Rachel Hutchinson’s passion is engaging people in change journeys while developing and empowering
people. She is energized by solving puzzles: finding the root cause of a performance issue, identifying the
biggest difference makers for the organization’s future direction, and discovering how to inspire people to
bring their best ideas and best implementation to their daily work. With a CPTD and a master’s
in business administration, with an emphasis on data analytics from Oklahoma State University, Rachel
believes that developing people is the biggest edge you can have over competitors from any industry. As
a huge virtual team proponent, she manages a diverse global team from a mountain cabin in Colorado.

Ron Dickson
Senior Learning Specialist, Honeywell
Ron Dickson is a strategically oriented analyst and program manager focused on transforming learning
and development through analytics, standardized measurement systems, and data-driven innovations.
He serves as a senior learning leader at Honeywell Aerospace, where he recently was awarded the Chief
Engineers Coin Award. He served as an engagement impact analyst at the nonprofit Experience Matters,
and for many years was a measurement and analytics manager for Intel’s corporate learning team. Ron
was instrumental in the creation of the Workplace Learning & Performance Scorecard, the online real-
time benchmarking and decision support tool maintained by ATD.

Sandi Maxey
Senior Vice President, Learning and Professional Development, Sandy Spring Bank
Sandi Maxey joined Sandy Spring Bank in 1999. Her current position is senior vice president and manag-
er of the learning and professional development department. She is responsible for developing and execut-
ing the bank’s enterprise talent strategy. Sandi’s areas of expertise include leadership and management,
coaching, career development, sales management, and the client experience. She has 30 years of banking
experience with more than 25 years in learning and development. Sandi holds an AB in sociology from
the University of Georgia, an MBA from Frostburg State University, and is a graduate of the Stonier
Graduate School of Banking.



Sarah Siegel
Manager, IBM Learning Design, IBM
Sarah Siegel manages the learning design team at IBM and is a learning experience designer herself. In
her management role, she aims to hire and engage the industry’s most gifted learning experience designers
to help upskill the workforce of today and the future. In her senior learning designer role, Sarah focuses on
diversity and inclusion, including a groundbreaking course called Global Religion and Culture. She holds
an MA in organizational leadership with a specialization in adult learning and leadership from Columbia
University.

Suzanne Frawley
Director, Talent Management, Plains All American Pipeline
Suzanne Frawley, CHRS, CPTD, has a passion for setting people and organizations up for success. She
is the director, talent management, for Plains All American Pipeline and leads succession planning, lead-
ership and professional development, performance management, organization design, and talent acqui-
sition. Suzanne’s previous experience includes roles in leading learning and organization development
functions and as a strategic HR business partner. She credits the ATD Forum for introducing her to design
thinking, which she has incorporated into her projects, workshops, and meetings.

Tanya Gilson
Future of Work Researcher and Innovator, Accenture
Tanya Gilson has a curiosity for bringing new ways of thinking to the complex challenges within the
ever-evolving talent landscape. During the last 15 years she has worked in a variety of strategic roles
including talent transformation and optimization projects, leveraging technology to obtain data-driven
insights, and bringing research-based innovation to recruitment, onboarding, learning and talent develop-
ment, and performance. Most recently, Tanya has been exploring the effect of compassion in the work-
place, how we will be learning in the future, and what the next generation of recruitment will look like.
With a love for continuous learning, she is developing new skills in foresight research, kitesurfing, the
science of well-being, and growing Japanese maples. She lives in Nelson, New Zealand, where she enjoys
spending time in the wilderness and walks on the beach with her partner and their dog.

Teri Lowe
Training and Development Manager, UPS
Teri Lowe has more than 30 years of experience as an educator and learning and development profession-
al. She has worked for the past 20 years for UPS in various technical writing, instructional design, and lead-
ership and talent development roles. Teri’s current role is as a training and development manager whose
team is responsible for the project management, design, development, deployment, and measurement of



company-wide leadership training. She has been an active contributor to UPS’s learning strategy, content
strategy, leadership development strategy, measurement and evaluation strategy, learning leaders collab-
oration, and enterprise learning initiatives. Teri has a PhD/ABD in rhetoric and composition from the
University of Louisville, an MA in English from Cleveland State University, and a BA in psychology and
sociology from Ohio Northern University. Teri is an active member of the ATD Forum and serves on the
Advisory Group.

Terry Copley
Head of Solutions and Experience Management, Hilti
Terry Copley is a champion of using performance improvement to grow bottom-line performance and
achieve outstanding results. He prides himself on being a self-starter who is willing to go above and
beyond to get the desired results. Terry is a proven coach, facilitator, and leader in sales training, and
makes sure that he keeps his facilitation skills sharp. In his current role, he is responsible for leading the
teams that design and implement 70-20-10 programs that facilitate closing skills gaps. His goals are to
provide professional development and continuous learning for team members to increase engagement,
improve Hilti’s ability to recruit top talent by offering best-in-class development opportunities, and overall
to achieve specific business goals. He loves to be creative and thinks in pictures or images that tell a story,
so he tries to make time to stay involved enough in Hilti’s projects to bring his creativity into play.



About the Editors

MJ Hall, PhD, MBA, MEd, is an experienced WorkLearn strategist, performance coach, and business
learning advisor. Her expertise includes designing, developing, and facilitating innovative collabora-
tive experiences that focus on leveraging employee capabilities for positively influencing organizational
productivity and delivering results. Prior to ATD, MJ served as a Level IV professor, director of leader-
ship development, and special assistant to the commandant at the Defense Acquisition University in Ft.
Belvoir, Virginia. MJ has also held numerous operational assignments with other military branches within
the Department of Defense (DOD), including chief of the Program Management Office at the U.S. Army
Engineer School. She also consulted with civilian agencies while serving in an executive rotational assign-
ment with the Federal Consulting Group.
MJ earned a doctoral degree in educational leadership from George Mason University, an MBA from
Long Island University, an MEd from the University of Maryland, College Park, and a BA in teaching
from High Point University (HPU). She has certificates in design thinking from the Darden School of
Business and LUMA Institute, and served for more than 10 years as an examiner for the Baldrige National
Quality Award. She is the author of Designing WorkLearn Networks: Making Your Magic Happen and of more
than 100 other articles, blogs, chapters, reports, and instructions. MJ has served as a keynote speaker
on many occasions, including the Carson Scholarship for the North Carolina Award banquet, and has
spoken at numerous domestic and international conferences. Among her many honors, MJ was awarded
the DoD Exceptional Civilian Medal, the U.S. Vice President’s Hammer Award for Innovative Practice,
and the 2017 HPU Alumni Service Award for her volunteer work with doctoral students at the Stout
School of Education.

Laleh Patel is the senior manager for the ATD Forum, steering the engagement and direction of this
consortium of senior talent development leaders. In this role, she oversees and manages the community’s
overall business strategy and day-to-day operations, and engages with executives to develop research and
products that address their most pressing talent development challenges and drive engagement and content
strategy for the ATD Forum.
Laleh has had a few roles at ATD within the past 10 years. Prior to managing the Forum, she worked
as a research associate for ATD, conducting surveys and analyzing and reporting on industry trends. She
wrote the 2010 State of the Industry report. Laleh has also worked in the survey research team at the Society
for Human Resource Management, and as a global advertising account executive at Lowe Worldwide.
Laleh holds an undergraduate degree in psychology from University College London, a master’s
degree in industrial and organizational psychology from George Mason University, and a graduate certif-
icate in survey design and data analysis from George Washington University.



Index
Page numbers followed by f refer to figures.

A
academic partners for research, 305
Accenture, 98, 291–309, 297f, 362
action plan, 159–160
ACT Model, 42–43
   access, 42
   credibility, 42
   trust, 42
ADDIE model
   and “fail fast” philosophy, 345–346
   and impact map, 65
   and innovation, 324
   in online toolkits, 145
   for organization analysis, 82
advisory groups, coalitions created with, 234–236
affinity sort activity, 49, 49f
affinity matrix, 99, 99f
aggregators, defined, 259
Agile
   design thinking vs., 327–328
   in digital age, 282
   for innovation, 302–304
   and SAM, 82
   at Walmart, 344
analogies, neural schema connected using, 87
analytics, 174–178, 175f, 177f, 178f
annual conferences, 158
annual needs assessments, 80–81
annual reports, 19
ansrsource, 271, 274
artificial intelligence (AI), 259–260, 297
ASC (Advanced Sales Coaching) program, 336–337
asking questions for capability building, 364–365
Association for Talent Development (ATD), 15, 59, 102, 158
Asurion, 345–346
ATD Forum, 243, 365
augmented reality (AR), 260
Ausubel, David, 87
awards, 161
awareness of self, 356–358

B
badging
   for continuous learning, 135–137
   digital, 335–336
Bain & Company, 100
Baldrige Criteria for Performance Excellence, 4, 161, 243
banking industry, 16–17
behavior assessments, 356
benchmarking, 172–174, 173f, 174f
Bersin, Josh, 20, 35, 102
biases
   mitigating, in talent identification and onboarding, 132f
   personal, 32–33
Biech, Elaine, 61–63
Biech change tools, 61–63, 62f, 63f
Birkman Method, 357

Blanchard, Ken, 72, 356 coalitions, 227–238
Block, Peter, 44 advisory groups for creating, 234–236
Bloom’s Taxonomy, 260 and alignment of stakeholders, 232–234
Boeing, 350–351 and definition of stakeholders, 228
Bolton, Claude, 358 getting buy-in for, 229–230
Booz Allen Hamilton, 345 and identification of stakeholders, 228–229
Bouck, Cory, 65, 341–342, 359–360 Organizational Network Analysis for creating, 238
branding of your team, 161–162 positive-negative matrix for creating, 231–232
British Psychological Society Research Digest, 139 Power-Interest matrix for creating, 232–234, 232f
Broad, Mary, 155, 159 and profiling, 232
budgeting for technologies, 262, 263 Six Thinking Hats model for creating, 233–234, 234f
building capability in others, 362–366 stakeholder networks, 236–238
build the business case for learning (Step 2 of SLA model), using RACI matrix for creating, 230–231, 231f
19–21 Cochlear, 349
Bullet Journal Method (Carroll), 361 collaboration
bundling leadership and, 363
continuous learning through, 137–139, 139f with research partners, 304–305, 322
innovation and, 330–331, 330f, 332f communicate your business results (step 4 of SLA model),
business coaches for innovation, 323 22–23
business leaders communication
as co-facilitators, 323 for innovation, 306–307
perceptions of, 15, 17 for stakeholder collaboration, 218–219, 219f
role of, in preparation phase, 47 community banks, 16
business learning advisors (BLA), 229 competition, for emotional arousal, 92
business model canvas, 23, 24f computer-based training (CBT), 258. See also virtual learning
business need, linking solution to, 45 confidence continuum, defined, 341–342
business objectives, defining and measuring, 201–203 confidentiality in peer-to-peer development, 148
business results, communicating, 22 consultation process, 43–50, 45f, 46f
about, 43–47
data gathering, 47–50
C evaluation, 50, 51f
capability development (teams), 155–163 project planning and management, 50
and assessing the current situation, 156–157 content curation and filtering, 328–329, 329f
by branding your team as stellar producers, 161–162 content delivery systems, 258–259
and building a sharing/learning system, 157–158 content management, 257–258
by modeling personal development, 160–161 content volatility, 262–263
by serving as a learning coach, 159–160 continuous improvement process, 204–205
and staying current in the field, 156 continuous learning, 135–153
by using certifications, 159 and badges, 135–137
Capability Model (ATD), 159 discussion guides for, 139–140
The Career Architect Development Planner, 3rd ed., 109 learning design guilds for, 141
Carter, Jill, 61, 63 lunch & learns for, 141–143
Center for Creative Leadership, 109, 360 and microlearning, 143–144
centralized model of learning, 242 online toolkits for, 144–147
certifications, 159 and peer-to-peer development, 147–149
change use of bundles in, 137–139, 139f
defining, 41–42 watercoolers for, 150–152
preparing for, 61–63, 62f, 63f webinars for, 149–150
change management skills assessment, 62f convergent thinking, 312–313
change readiness predictor, 62f Covey, Stephen, 50, 78
client relationship manager (CRM), 77 Cross, Rob, 238
cloud-based suites, 371 cross-functional teams, 294–398
coaching circles, continuous learning through, 147–149
The Coaching Habit (Stanier), 364–365

culture
  of innovation, 312–313
  organizational, 211
curators, defined, 259
current situation, assessment of, 8–9, 156–157
curriculum advisory board, roles and responsibilities of, 244f

D
dashboards, 168, 181–195, 182f, 186f
data analysis, benefit of, 321f
data gathering, 47–50
  about, 45
  fictionalized case example, 185–186
  individual structured interviews for, 48–50, 49f
  for innovation impact research, 320–321, 321f
  for refining effectiveness, 204–205
  for strategic alignment, 4–5
  team-facilitated experience for, 48
de Bono, Edward, 233–234
decentralized model of learning, 242–243
deliverables, project gate, 46, 46f
Deloitte, 248–249
Deming, W. Edwards, 241
designing slides for discussion guide, 140
design thinking, 98–100, 99f, 324–327
Dewey, John, 36
digital age, learning and development operations in, 279–287. See also technology
digital badging, 335–336
digital face-to-face learning, 333–335, 335f
digital learning disruptions, 274
Dineen, Steve, 274–276
Discover-Define-Develop-Deliver model, 314–317, 314f
discussion guides for continuous learning, 139–140
divergent thinking, 99, 312–313
drones, 261
Drucker, Peter, 4
Durtschi, Heather, 344–345, 357, 360
Dweck, Carol, 124

E
effectiveness of learning solutions, 201–205
  defining, 201–202
  designing for, 202
  measuring, 202–204
  refining, 204–205
Eichinger, Robert W., 109
Eisenhower matrix, 57–58, 57f
e-learning. See virtual learning
emerging technologies, 259
Emergenetics, 160, 247, 256
emotion, 91–93
end state, defined, 79
engage leaders in key learning activities (Step 3 of SLA model), 21–22
engaging learners in innovation, 301–302, 323
enterprise learning council, roles and responsibilities of, 244f
Erikson, Jay, 351–352, 357–358, 361, 364
executive sponsors, strategic alignment ensured with, 6–7
expanding horizons, leadership and, 365–366
experimentation for innovation, 300–301
experts, looking to, 36

F
facilitator notes in discussion guide, 140
“fail fast” philosophy, 345–346
federated structure of learning, 242, 248–249
feedback
  with digital face-to-face training, 333
  from new hires, 130
  for self-reflection, 358
fill-in jobs (in impact matrix), 56–57
Five Moments of Need (Gottfredson and Mosher), 78
Flawless Consulting (Block), 44
focus and attention, 88
focus groups
  benefit of, 321f
  fictionalized case example, 220
  for refining effectiveness, 204–205
formal learning, limitations of, 111
Fosway Group, 270
Four Moments of Truth model (Shriver), 159
Frawley, Suzanne, 342–343, 361, 363
front-end analysis (FEA), 76–78, 77f
future focus, 3

G
Gallup StrengthsFinder, 357
gamification and game mechanics, 261, 272–273
getting buy-in for coalitions, 229–230
Gilbert, Thomas, 79
Gilbert’s Grid, 78–80, 80f
goal deployment processes, 6
Goals Objectives Strategy Tactic (GOST), 20
Gottfredson, Conrad, 155
governance strategy for strategic alignment, 7
Grainger, 67
Graphic Method (for stakeholder mapping), 211–215
Gross, Randall, 343–344, 360, 365
group membership, benchmarking through, 172–173
GSS (Global Sales School), 320, 332, 336
H
Hall, MJ, 36, 73
Hargadon, Andrew, 365
Harless, Joe, 75–77
Harlin, Taylor, 65–67
Heijnen, Vivian, 111–112
Hilti
  benchmarking at, 172–174, 173f, 174f
  metrics of, 177, 177f
  70-20-10 framework at, 114–115, 116f
  technologies at, 267–277, 268f
hippocampus, 86, 91
Hitachi Vantara, 351–352
Holmes, Chris, 347, 360, 362
Horwath, Rich, 20, 36
Hoshin Kanri process, 6
human-centered design (HCD), 19, 313–317
human performance technology (HPT) model, 76
hybrid model of learning, 242–243

I
IBM
  Advanced Sales Coaching (ASC) program, 336–337
  Cloud & Cognitive (C&C) solutions, 331, 332f
  Digital Badge Program, 335
  Faculty Academy, 323
  Global Sales School (GSS), 320, 332, 336
  innovation at, 319, 323–337
  Manager Champion Group, 323
  onboarding at, 130–131, 131f, 132f
ice breakers, 87
IDP (individual development plan), 159–160
impact
  impact effort matrix, 56–57, 56f
  impact map, 65–67, 65f, 159, 361, 361f
  innovation for, 319–338
  measuring (See measuring value and impact)
IMPACT, 170–171
individual development plan (IDP), 159–160
industry practices, assessing, 35–36
influence, determining, 101
Ingersoll Rand (IR), 6–12
Ingersoll Rand University (IRU), 8–11, 9f, 10f
Innovating for People (Luma Institute), 19
innovation, 311–319
  at Accenture, 291–309, 297f
  Agile approach to, 302–304, 327–328
  and bundling, 330–331, 330f, 332f
  and collaboration with research partners, 304–305, 322
  and communication/sharing, 306–307
  content curation/filtering for, 328–329, 329f
  creating a culture of, 312–313
  defined, 292, 312
  defining the purpose of, 293–294
  and design thinking, 324–327
  and digital badging, 335–336
  and digital face-to-face learning, 333–335, 335f
  Discover-Define-Develop-Deliver model, 314–315, 314f
  engaging cross-functional teams for, 294–297
  engaging learners in, 301–302, 323
  and human-centered design, 313–317
  at IBM, 323–337
  for maximum impact, 319–338
  and microlearning, 329
  need for, in learning, 311–312
  as priority, 292–293
  and research methodology, 298–301, 298f, 320–322
  showcases for, 324
  small private online courses for, 332, 333f
  and unique learning experiences, 324
  using business coaches for, 323
insights, fostering, 90–91
instructor-led training (ILT), 333
integrators, defined, 259
Intermountain Healthcare, 61, 347–349
internal consultants, use of (in preparation phase), 47
internal voice research, 299
interviews
  benefit of, 321f
  for refining effectiveness, 204–205
  for talent identification and onboarding, 127–128
investigative report organizational scan, 63–64, 64f
iteration, 95–109
  and design thinking, 98–100, 99f
  at Hilti, 271–272
  and project management, 100–106, 101f, 103f, 104f
  and understanding the process, 96–98, 97f

J
Jennings, Charles, 276
job descriptions for talent identification and onboarding, 124–127, 125f, 126f
Johnsonville Sausage, 341–342
Jones, Jeremy, 345–346, 356–357, 360, 365
journaling, 361–362
junior learning experience designer, 124–125, 125f, 126f

K
Kaufman, Roger, 60–61
keys (in IBM design thinking), 326
Kirkpatrick, Donald, 35
Kirkpatrick’s levels of evaluation, 50
Knight, Erin, 136
know the business (Step 1 in SLA model), 18–19
Koch, Dana Alan, 362
Kolb Learning Cycle, 358
Kouzes, Jim, 353

L
Lafley, Alan G., 8
lagging indicators, defined, 168, 175f
Langford, David, 355
LCMS (learning content management system), 257–258
leadership, 355–367
  and awareness of self, 356–358
  and building capability in others, 362–366
  and collaboration, 363
  and expanding horizons, 365–366
  importance of, 8
  and learning as a habit, 360–361
  and personal capacity, 358–359
  tools for enhancing, 359–362, 359f, 373–375
Leadership and the One Minute Manager (Blanchard), 72
The Leadership Engine (Tichy), 373
leadership journey timeline, 373–375
leading indicators, defined, 168, 175f
Lean organizations, goal deployment processes in, 6
learner experience
  defined, 202
  measuring, 204
learning. See also continuous learning
  as a habit, leadership and, 360–361
  measuring business impact of, 23
  structures for, 242–243
learning brief, 46
learning charter, 46
learning coach, serving as a, 159–160
learning content management system (LCMS), 257–258
learning design guilds for continuous learning, 141
Learning Geeks (podcast), 362
learning governance board, 244, 244f
Learning Leadership (Kouzes and Posner), 353, 356
learning log, 373–375
learning management system (LMS), 257–258
learning recall, 33, 85
learning request forms, 81–82
learning strategy, defined, 43
learning suites, 371
Lee, Sheryl, 117
Lemon, Mark, 61, 63
The Lens of Leadership (Bouck), 65, 342
Levenson, Alec, 5
List and Grid Method (for stakeholder analysis), 215–218
LMS (learning management system), 257–258
Lombardo, Michael M., 109
long-term memory, hippocampus and amygdala role in, 91
loop (in IBM design thinking), 327
LUMA Institute, 19, 211
LUMA System of Innovation, 36
lunch & learns for continuous learning, 141–143

M
Mager, Robert, 76–77
market research, 299
Marschall, Michael, 170
Martin, Roger, 8, 36
McCall, Morgan, 109
McKinsey 7-S framework, 67–68
measuring success, 199
measuring value and impact, 197–207
  and articulation of value/impact, 200
  and defining value/impact, 198–199
  effectiveness, 201–205
  principles of, 199
  and value/impact construct of learning function, 197–198, 198f
“Meet the Modern Learner” infographic, 35
memory
  long-term, hippocampus and amygdala role in, 91
  and neural networks, 86
memory retention
  spacing out learning for, 89–90
metrics, 167–179. See also measuring value and impact
  analytics, 174–178, 175f, 177f, 178f
  benchmarking, 172–174, 173f, 174f
  dashboards, 168, 181–195, 182f, 186f
  of learners finding value, 170–172
  scorecards, 168
  and stakeholder involvement, 168–169
microlearning
  continuous learning through, 143–144
  innovation and, 329
mind mapping, 103, 103f
mobile learning
  technologies for, 283–286
Morley, Terence, 350, 358, 363–364
Mosher, Bob, 155
multitasking, 88
Myers-Briggs Type Indicator (MBTI), 356–357

N
Nayarana, Rajiv, 274
NBCUniversal, 350
needs assessment, 59–66
  Biech change tools, 61–63, 62f, 63f
  in consultation process, 45
  impact map, 65–67, 65f
  investigative report organizational scan, 63–64, 64f
  at NBCUniversal, 350
  training needs assessment vs., 59
Needs Assessment (ATD Research), 59
Needs Assessment Basics, 2nd ed. (McGoldrick and Tobey), 63
needs assessments, annual, 80–81
net promoter scores (NPS), 124
networking
  for benchmarking, 172
  in digital learning, 274
  for talent acquisition, 128
neural networks, 86–87
neuroscience in learning design, 85–94
  emotions, 91–93
  and focus/attention, 88
  and fostering insights, 90–91
  neural networks, 86–87
  and spacing, 89–90
Newstrom, John, 155, 159
next generation learning environment (NGLE), 270
North Box (Hoshin Kanri process), 6

O
observations, benefit of, 321f
onboarding. See talent identification and onboarding
one-on-one meetings for capability building, 364
online courses, emotional arousal in, 92–93
online toolkits for continuous learning, 144–147
Open Badges, 136
open-source systems, 372
optimal strategic zone, 58–59, 58f
organizational change readiness audit, 62f
organizational culture, stakeholder collaboration affected by, 211
Organizational Network Analysis (ONA), 238
Out of the Crisis (Deming), 241

P
Pareto Principle, 110
PeaceHealth, 343–344
peer-to-peer development, 147–149
performance architect (phase 2 of 70-20-10), 112
performance assessments, 74. See also performance gaps, determining and addressing
performance coaching, 362–363
performance detective phase (phase 1 of 70-20-10), 112
performance game changer (phase 4 of 70-20-10), 112
performance gaps, determining and addressing, 55–70
  Biech change tools for, 61–63, 62f, 63f
  Eisenhower matrix for, 57–58, 57f
  example of, 55
  impact effort matrix for, 56–57, 56f
  impact map tool for, 65–67, 65f
  investigative report organizational scan for, 63–64, 64f
  McKinsey 7-S framework for, 67–68
  needs assessment for, 59–60
  optimal strategic zone concept for, 58–59, 58f
  SWOT analysis for, 68–69, 68f
performance master builder (phase 3 of 70-20-10), 112
performance objectives, defined, 202–204
performance tracker (phase 5 of 70-20-10), 112
personal biases, 32–33
personal capacity in leadership, 358–359
personal development, modeling of, 160–161
personality assessments, 356–357
personal learning journey, 27–39, 41–52
  and ACT Model, 42–43
  consultation process in, 43–46, 45f, 46f
  data gathering in, 47
  and defining change, 41–42
  and looking around, 28–31
  and looking at industry practices, 35–36
  and looking at practices, 34–35
  and looking at the past and the future, 37
  and looking inward, 31–34
  and looking to the experts, 36
  and project evaluation, 50, 51f
  project planning/management phases in, 50
  as team-facilitated experience, 48
  and use of structured interviews, 48–50, 49f
persona profiles, 103–105, 104f
Phillips, Jack, 35
Phillips, Patti, 35
pilots, benefit of, 321f
Platt, Lew, 156
Playing to Win (Lafley and Martin), 8
podcasting, 362
positive-negative matrix, 229–230
Posner, Barry, 353
Power-Interest matrix, 230–232, 230f
preparation (in consultation process), 45, 47
priming questions, 87
problem solving methods, 363
process of discovery, 48, 48f
professional organizations, 369
professional research partners, 305
profiling stakeholders, 232
project charter, 100
project evaluation, 42, 45, 50, 51f
project gates, 46, 46f, 246
project management
  in consultation process, 45, 50
  iteration and, 100–106, 101f, 103f, 104f
project plan (document), 46
project planning (in consultation process), 45, 50
Q
quarterly business review scorecard, 46
quick wins (in impact matrix), 56–57

R
RACI matrix, 230–231, 231f
RAPID Decision-Making Model, 100, 101f
regional advisory board, roles and responsibilities of, 244f
requests for training (1-800-TRAIN problem), 71–84, 155
  and annual needs assessments, 80–81
  initial response to, 72–73
  and “invitation to the dance” concept, 73–75
  on learning request forms, 81–82
  and “turning the conversation,” 75–76
  using ADDIE and SAM approaches with, 82
  using front-end analysis to address, 76–78, 77f
  using Gilbert’s Grid to address, 78–80, 80f
research
  for business alignment, 19
  defining purpose of, 293–294
  ensuring innovation impact, 322–324
  market, 299
  methodology for, 298–301, 298f, 320–322
  with professional research partners, 305
  with research partners, 304–305
  scientific, 299
retail industry, 283–285, 360
return on investment (ROI) analysis, 263, 285
Reynolds, Carmen, 350–351, 357, 364
Rock, David, 124
Rose, Bud, Thorn, 99, 99f
Rossett, Allison, 76
Rumelt, Richard, 36

S
SAM (successful approximation model), 82
Samms, Marguerite, 347–349, 361–363
scientific research, 299
scorecards, 168
self-reflection by leaders, 357–358
senior learning experience designer, 124–125, 125f, 126f
senior-level goals, strategic alignment with, 5–6
70-20-10 framework, 35, 109–122, 110f
  benefits of using, 113
  case study (Hilti), 114–115, 116f, 271–272
  case study (UPS), 118
  origins and development of, 109–111
  and Pareto Principle, 110
  in performance coaching, 362–363
  phases of, 111–113
70:20:10 Institute, 110, 274
70:20:10 Towards 100% Performance (Arets, Jennings, and Heijnen), 111
sharing, building a system for, 157–158
Shewhart Cycle, 363
showcases, innovation through, 324
Sipes, Susana, 67
Six Thinking Hats model (de Bono), 233–234, 234f
SLA model, 17–23, 18f
  build the business case for learning (Step 2), 19–21
  communicate your business results (Step 4), 22–23
  engage leaders in key learning activities (Step 3), 21–22
  know the business (Step 1), 18–19
small private online courses (SPOC), 332, 333f
Smith, Rita Mehegan, 17–18, 20, 23, 243–244
social sharing and learning, technologies for, 271–272
spacing out learning for memory retention, 89–90
sprint (in Agile methodology), 302
stakeholders
  coalitions aligned with, 232–234, 236–238
  defined, 228
  identifying, 211, 228–229 (See also stakeholder analysis map)
  involvement of, metrics and, 168–169
stakeholder analysis map
  for business alignment, 21
  defined, 211
  for metric presentation, 176
  for stakeholder collaboration, 211–219, 213–215f
stakeholder collaboration, 209–225. See also coalitions
  benefits of, 209
  context of, 210–211
  fictionalized case example, 219–224
  and identification of stakeholders, 211
  and management of stakeholders, 218–219, 219f
  risks of, 210
  using stakeholder analysis for, 215–218, 217f
  using stakeholder mapping for, 211–219, 213–215f
stakeholder governance, 241–251
  at Deloitte, 248–249
  fictionalized case example, 244, 246–248
  and learning structures, 242–243, 245f
  as system/process, 243–244
stakeholders, profiling, 232
Stanier, Michael Bungay, 364
staying current in the field, 156
stellar producers, branding your team as, 161–162
Stodd, Julian, 271, 275–276
storytelling for capability building, 363
Stowell, Steven J., 58
strategic alignment, 3–39, 41–52
  asking questions about, 4–5
  and building the case for learning, 19–21
  and communicating results, 22–23
  data gathering for, 4–5
  and engaging leaders in learning, 21–22
  executive sponsors for ensuring, 6–7
  governing strategy for, 7
  at Ingersoll Rand, 6–12
  levels of, 15–16
  proactive approach to, 15–26
  with senior-level goals, 5–6
  skills required for, 4
  and understanding the business, 18–19, 23, 24f
Strategic Learning Alignment (Smith), 17–18, 18f
The Strategic Thinking Manifesto (Horwath), 36
strategy board, roles and responsibilities of, 244f
strategy document, 46
stress, reducing, 92
structured interviews for data gathering, 48–50, 49f
subject matter experts (SMEs), 19, 21, 25, 93, 124, 142, 313
success, defining, 4–5
successful approximation model (SAM), 82
surveys, benefits of, 321f
SWOT analysis, 48, 68–69, 68f

T
taking breaks, importance of, 88
talent acquisition professionals, 128
talent identification and onboarding, 123–133
  for cross-functional teams, 295
  in digital age, 282
  hiring tips, 128–129
  interviews, 127–128
  job descriptions, 124–127, 125f, 126f
  onboarding procedure and examples, 129–131, 129f, 131f, 132f
Talent Management Framework (Bersin), 20, 20f
team development. See capability development (teams)
team-facilitated experience, data gathering through, 48
technology, 255–265
  artificial intelligence, 259–260
  assessing new, 256–257
  and audience size/distribution, 262
  augmented reality, 260
  budgeting for, 262, 263
  content delivery systems, 258–259
  and content management, 257–258
  and content volatility, 262–263
  for digital face-to-face training, 333
  drones, 261
  emerging, 259
  gaming and gamification, 261, 272–273
  at Hilti, 267–277
  for immersive learning environments, 273–276
  and learning and development operations, 279–287
  for mobile learning, 283–286
  platform/system resources, 371–372
  readiness for, 263
  return on investment with, 263
  and social sharing/learning, 271–272
  virtual reality, 260
  at Walmart, 345
  wearables, 261
thankless tasks (in impact matrix), 56–57
Thriving Through Change (Biech), 61, 63
Tichy, Noel, 366, 373
Training Industry, 158
Training Industry, 242
training needs assessment, needs assessment vs., 59
truths, universal, 300

U
universal truths, 300
UPS, 117

V
value
  learners finding, 170–172
  measuring (See measuring value and impact)
value and impact construct (of learning function), 197–198, 198f
virtual learning
  bundles, 136, 137–139, 139f
  capabilities of, 274
  discussion guides, 140
  emotional arousal in, 92–93
  lunch and learn, 142
  peer-to-peer development, 148
  webinars, 149–150
virtual reality (VR), 260

W
Wall Street Journal, 341
Walmart, 344–345
watercoolers, continuous learning, 150–152
Waterfall approach, 82
wearable technologies, 261
Webb, Michelle, 98, 365
webinars, continuous learning, 149–150
Weiher, Alissa, 349, 357, 364–366
worker transformation, workplace performance affected by, 280–281
workplace performance, fundamentals affecting, 280