
Notes on Implementation

Paul Brest and Jason Bade

It is through implementation that your plan meets reality and often hits a wall. These notes
sketch a variety of implementation problems that a social problem solver is likely to encounter.
Recognizing the problems in advance can help you adjust your strategy during the design phase
to avoid or mitigate them.

What insights for implementation of a childhood obesity prevention strategy can be gleaned from
the story of the 1976 Swine Flu Vaccination Program and the following notes?

Read David Sencer and J. Donald Millar, Reflections on the 1976 Swine Flu Vaccination
Program.1

In The Day After an AIDS Vaccine is Discovered: Management Matters, Martin Levin draws on
the 1976 Swine Flu Vaccination Program to speculate about why the discovery of an effective
AIDS vaccine would not lead to a quick decrease in HIV/AIDS-related illnesses and deaths.

The subtitle of Levin’s article, “Management Matters,” emphasizes the point that the
implementation of a program is not a mechanical process, but rather “a process of policy making
through learning by doing,” which “occurs in the field through a process of iteration, adaptation,
and ex post facto error corrections.”

Referring to problems with the Swine Flu Vaccination Program, Levin writes:

The management problems and delays will result from many serious conflicts: scientific
controversy over the vaccine’s effectiveness and safety; threats of lawsuits over side
effects and demands of manufacturers calling for indemnification from them;
professional and institutional timidity among health care providers; media
sensationalization of rare cases. All these conflicts will discourage the public from
embracing the vaccination program. A lack of leadership is likely because this is all so
controversial, and because formal authority is so fragmented in the health care field. But
even with the best of leadership, any vaccine program will find its implementation and
management difficult because it will face a complex situation filled with booby traps …

As the head of the CDC, David Sencer pressed for mass inoculation to address the potential swine
flu epidemic. But later, as New York City Health Commissioner during the HIV/AIDS epidemic,
Sencer, constrained by social and political sensitivities, did not move to regulate bathhouses, a
major venue for transmitting the virus.

1 Emerging Infectious Diseases • www.cdc.gov/eid • Vol. 12, No. 1, January 2006

7/31/2015 Page 1
Levin points to a potential issue with the AIDS vaccine that did not arise in the case of the swine
flu vaccine: Who should have priority in getting vaccinated—gay men, intravenous drug users,
sexually active youth?—and who would pay for individuals who cannot afford the vaccine?

***
With the caveat that nothing can substitute for expertise in a particular domain to anticipate the
ways that its specific conditions can make a strategy go wrong, here are six general ways in
which implementation can fail.

1. Misaligned stakeholders’ interests. Both within and outside your organization, you may
encounter principal-agent problems, where people who are duty-bound to carry out a policy may
have competing personal and professional interests. Begin by trying to identify the interests of all
key stakeholders. Then think with a “dirty mind” or “think like a rat” to consider their motives
for impeding implementation and how they might game the system to hinder success. Now
change your attitude, and consider how you might enlist these and other stakeholders as allies.
Two elements of human-centered design, ethnography and user testing, are valuable tools here.

Consider this example: In 2006, four years before Congress enacted the Affordable Care Act,
known as “Obamacare,” Massachusetts Governor Mitt Romney persuaded the legislature to
adopt “Romneycare,” a plan based on a very similar structure.2 Research leading to the Massachusetts
plan was conducted in private by a team of financial and organizational experts from Bain & Co.,
JP Morgan, and Harvard Business School, who examined many examples of health plans in the
context of Massachusetts’ health and demographic data to come up with a scheme that required
individuals to purchase health insurance. Rather than announcing a plan and selling it to the
legislature, the team tested their proposal in individual meetings with stakeholders,
including hospitals, medical groups, insurers, health advocates, and legislators. They listened to
objections, invited suggestions, and modified the plan based on feedback. Ultimately, the plan,
with compromises not to the Governor’s liking, was passed by a Democratic legislature.

Timothy Murphy, who later became Romney’s Secretary of Health and Human Services,
described the process:

We purposely put ourselves in the firing line with people who viewed us with great
skepticism. We had to gain credibility with them early on. We wanted them to have
confidence that we understood their business, and that we weren’t married to any
particular solution. We wanted them to have a personal relationship and open line of
communication with us, to make them comfortable. After every meeting we’d debrief.
What did we learn that we didn’t know before? What adjustments can we make to
accommodate this group’s concerns? After a while, you start to gain confidence that the
plan you are formulating is robust, that it makes sense.

2
The story and quotations are taken from William Eggers and John O'Leary, If We Can Put a Man on the
Moon: Getting Big Things Done in Government (2009).
Based on this experience, Murphy advises: “Don’t be afraid of information that conflicts with
your preconceived notions. Also don’t be afraid of those people who are likely to disagree with
you. Be open to alternative ways of defining your problem and developing various approaches.”

When you are unlikely to get stakeholders to agree, you might give up on consensus and settle
for a compromise that preserves the essentials of your plan but gives up inessentials to
accommodate the interests of others, as Romney did.

2. Misaligned organizational culture. Even when all stakeholders’ incentives are properly
aligned and the necessary infrastructure is in place, implementing large program or policy
changes can be hindered by an organizational culture deeply rooted in the status quo. Consider
some health care providers’ efforts to convert from a volume-based fee-for-service (FFS) model
to a value-based model of health care, one that achieves the best health outcomes at the lowest
cost. Despite excellent examples (e.g., Kaiser Permanente) and strong economic incentives from
payers such as insurance companies and Medicare, many providers that tried to switch ended up
reverting to FFS within a few years. Poor management and organizational culture contributed to
the failure: Successful accountable care organizations not only require open communication
between medical workers within and across departments; they also require a dramatic shift from
an ingrained institutional inclination to over-test and over-treat to a culture in which cost
effectiveness is a key factor in medical decision making. It takes years, perhaps decades, to effect
such attitudinal and organizational changes.

3. Poor management. Good management is valuable even when nothing goes wrong, and good
management is essential for an organization to respond to challenges to a plan as it is
implemented. For an example of a serious management problem, consider the Web enrollment
debacle associated with the rollout of the Affordable Care Act in 2013.

4. Beneficiaries’ self-defeating behavior. People do not always act in their own best interests.
Such irrationality has to be understood (through ethnography and social science) and
accommodated (with user testing and nudging).

5. Inadequate infrastructure. For present purposes, we define “infrastructure” as conditions
necessary for the successful execution of a plan, but presumptively beyond your remit—that is,
goods or services that you might assume someone else would provide. For example, you might
assume that there will be a transportation system—at least roads built and maintained by the
government—or that there will be people available who can be hired to work on the projects.

Consider a vaccination program in the developing world. The necessary physical infrastructure
includes refrigerators to store the vaccines and roads or paths to deliver the vaccines to villages.
The human infrastructure you rely on includes skilled personnel to administer the vaccines. But
some of these elements may just not be present, and will have to be incorporated in your logic
model: for example, the lack of refrigerated transport and storage might require you to fund that
infrastructure development yourself or partner with someone who can. (By contrast, sometimes
you may find an unexpected partner who can provide an essential service—for example, one
organization found that Coca-Cola trucks would distribute its vaccine, and packaged the vaccine
to fit in the same cartons.)

6. Faulty assumptions. In addition to all of the above, it is useful to be pessimistic about the basic
nuts and bolts of execution and operation. Where might things go wrong even if everyone is
motivated to make the plan succeed? What assumptions underlying your theory of change might
be flawed? A session or two in which you, your team members, several stakeholders, and even a
few outsiders think imaginatively about failure scenarios can be extraordinarily useful for
making your strategy resilient to unexpected shocks. You might undertake a “pre-mortem”:
assume that the strategy has failed and ask how that might have happened.
