
SHOP TALK

1.DECISION TREE
2.MANAGERIAL GRID
3.SATISFIER/DISSATISFIER
4.THEORY X/THEORY Y
5.BRAINSTORMING
6.T-GROUP TRAINING
7.CONGLOMERATION
8.MANAGEMENT BY OBJECTIVES
9.DIVERSIFICATION
10.THEORY Z
11.ZERO-BASED BUDGETING
12.DECENTRALIZATION
13.QUALITY CIRCLE
14.EXCELLENCE
15.RESTRUCTURING
16.PORTFOLIO MANAGEMENT
17.MBWA
18.MATRIX
19.KANBAN
20.INTRAPRENEURING
21.CORPORATE CULTURE
22.ONE-MINUTE MANAGEMENT
23.DOWNSIZING
24.SELF-MANAGED TEAMS
25.TQM & 6-SIGMA
26.LEARNING ORGANIZATION
27.ENTERPRISE RESOURCE PLANNING
28.KNOWLEDGE MANAGEMENT

Decision tree
In operations research, specifically in decision analysis, a decision tree is a
decision support tool that uses a graph or model of decisions and their
possible consequences, including chance event outcomes, resource costs,
and utility. It is commonly used to identify the strategy most likely to reach a
goal. Decision trees can also be used as a descriptive means for calculating
conditional probabilities.

In data mining and machine learning, a decision tree is a predictive model;
that is, a mapping from observations about an item to conclusions about its
target value. More descriptive names for such tree models are classification
tree or regression tree. In these tree structures, leaves represent
classifications and branches represent conjunctions of features that lead to
those classifications [1]. The machine learning technique for inducing a
decision tree from data is called decision tree learning, or (colloquially)
decision trees.

General

In decision analysis, a "decision tree" (and a closely related model form,
an influence diagram) is used as a visual and analytical decision support
tool, where the expected values (or expected utility) of competing
alternatives are calculated.

You start a Decision Tree with a decision that you need to make. Draw a
small square to represent this towards the left of a large piece of paper.

From this box draw out lines towards the right for each possible solution,
and write that solution along the line. Keep the lines apart as far as possible
so that you can expand your thoughts.

At the end of each line, consider the results. If the result of taking that
decision is uncertain, draw a small circle. If the result is another decision
that you need to make, draw another square. Squares represent decisions,
and circles represent uncertain outcomes. Write the decision or factor above
the square or circle. If you have completed the solution at the end of the
line, just leave it blank.

Starting from the new decision squares on your diagram, draw out lines
representing the options that you could select. From the circles draw lines
representing possible outcomes. Again make a brief note on the line saying
what it means. Keep on doing this until you have drawn out as many of the
possible outcomes and decisions as you can see leading on from the original
decisions.
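The squares-and-circles structure described above maps naturally onto a small data structure. The sketch below is illustrative only (the class and field names are my own, not from the text): decision nodes hold the options you control, chance nodes hold probabilistic outcomes, and children may themselves be further nodes or terminal values.

```python
from dataclasses import dataclass, field

@dataclass
class Chance:
    """An uncertain-outcome node (a circle): outcomes with probabilities."""
    label: str
    # (probability, description, child) triples; a child may be a terminal
    # value, another Decision, or another Chance node
    outcomes: list = field(default_factory=list)

@dataclass
class Decision:
    """A decision node (a square): options the decision maker controls."""
    label: str
    # (description, child) pairs
    options: list = field(default_factory=list)

# A tiny tree: decide whether to launch; the market response is uncertain
launch = Chance("market response", outcomes=[(0.6, "good", 100_000),
                                             (0.4, "poor", 10_000)])
root = Decision("product strategy", options=[("launch", launch),
                                             ("do nothing", 0)])
```

Drawing the diagram on paper and building such a structure are the same activity: each line you draw out from a square becomes an entry in `options`, and each line from a circle becomes an entry in `outcomes`.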

An example of the sort of thing you will end up with is shown in Figure 1:

Once you have done this, review your tree diagram. Challenge each square and
circle to see if there are any solutions or outcomes you have not considered. If there
are, draw them in. If necessary, redraft your tree if parts of it are too congested or
untidy. You should now have a good understanding of the range of possible
outcomes of your decisions.

Evaluating
Now you are ready to evaluate the decision tree. This is where you can work out
which option has the greatest worth to you. Start by assigning a cash value or score
to each possible outcome. Estimate how much you think it would be worth to you if
that outcome came about.

Next look at each circle (representing an uncertainty point) and estimate the
probability of each outcome. If you use percentages, the total must come to 100% at
each circle. If you use fractions, these must add up to 1. If you have data on past
events you may be able to make rigorous estimates of the probabilities. Otherwise
write down your best guess.

This will give you a tree like the one shown in Figure 2:

Calculating Tree Values
Once you have worked out the value of the outcomes, and have assessed the
probability of the outcomes of uncertainty, it is time to start calculating the values
that will help you make your decision.

Start on the right hand side of the decision tree, and work back towards the left. As
you complete a set of calculations on a node (decision square or uncertainty circle),
all you need to do is to record the result. You can ignore all the calculations that lead
to that result from then on.

Calculating The Value of Uncertain Outcome Nodes
Where you are calculating the value of uncertain outcomes (circles on the diagram),
do this by multiplying the value of the outcomes by their probability. The total for
that node of the tree is the total of these values.

In the example in Figure 2, the value for 'new product, thorough development' is:

0.4 (probability good outcome) x £500,000 (value) = £200,000
0.4 (probability moderate outcome) x £25,000 (value) = £10,000
0.2 (probability poor outcome) x £1,000 (value) = £200

Total: £210,200

Figure 3 shows the calculation of uncertain outcome nodes:

Note that the values calculated for each node are shown in the boxes.
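The rule at each circle is a plain expected-value sum: multiply each outcome's value by its probability and add them up. A minimal sketch, using the probabilities and values from the worked example above:

```python
def chance_node_value(outcomes):
    """Expected value of an uncertain-outcome (circle) node.

    outcomes: list of (probability, value) pairs; the probabilities at a
    circle must sum to 1, as described above.
    """
    total_p = sum(p for p, _ in outcomes)
    assert abs(total_p - 1.0) < 1e-9, "probabilities at a circle must sum to 1"
    return sum(p * v for p, v in outcomes)

# 'New product, thorough development' from the example:
# 0.4 * 500,000 + 0.4 * 25,000 + 0.2 * 1,000 = 210,200
ev = chance_node_value([(0.4, 500_000), (0.4, 25_000), (0.2, 1_000)])
```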

Calculating The Value of Decision Nodes


When you are evaluating a decision node, write down the cost of each option along
each decision line. Then subtract the cost from the outcome value that you have
already calculated. This will give you a value that represents the benefit of that
decision.

Note that amounts already spent do not count for this analysis - these are 'sunk
costs' and (despite emotional counter-arguments) should not be factored into the
decision.

When you have calculated these decision benefits, choose the option that has the
largest benefit, and take that as the decision made. This is the value of that decision
node.

Figure 4 shows this calculation of decision nodes in our example:

In this example, the benefit we previously calculated for 'new product, thorough
development' was £210,200. We estimate the future cost of this approach as
£75,000. This gives a net benefit of £135,200.

The net benefit of 'new product, rapid development' was £15,700. On this branch we
therefore choose the most valuable option, 'new product, thorough development',
and allocate this value to the decision node.
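At a decision square, then, each option's net benefit is its already rolled-back outcome value minus its future cost, and the node takes the value of the best option. A sketch using the rolled-back value computed earlier (£210,200); the cost split shown for 'rapid development' is hypothetical, chosen only so that its net benefit matches the £15,700 quoted above:

```python
def decision_node_value(options):
    """Value of a decision (square) node.

    options: dict mapping option name -> (outcome_value, future_cost).
    Returns (best_option, net_benefit). Sunk costs are deliberately
    excluded: only future costs along each branch are subtracted.
    """
    nets = {name: value - cost for name, (value, cost) in options.items()}
    best = max(nets, key=nets.get)
    return best, nets[best]

best, net = decision_node_value({
    "thorough development": (210_200, 75_000),  # rolled-back value from above
    "rapid development": (55_700, 40_000),      # hypothetical split; net £15,700
})
```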

Result
By applying this technique we can see that the best option is to develop a new
product. It is worth much more to us to take our time and get the product right, than
to rush the product to market. It is better just to improve our existing products than
to botch a new product, even though it costs us less.

Influence diagram
A decision tree can be represented more compactly as an influence diagram, focusing
attention on the issues and relationships between events.

Uses in teaching
Decision trees, influence diagrams, utility functions, and other decision analysis tools
and methods are taught to undergraduate students in schools of business, health
economics, and public health, and are examples of operations research or
management science methods.

Creation of decision nodes


Three popular rules are applied in the automatic creation of classification trees. The
Gini rule splits off a single group of as large a size as possible, whereas the entropy
and twoing rules find multiple groups comprising as close to half the samples as
possible. These algorithms proceed recursively down the tree until stopping criteria
are met.
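To make the splitting criteria concrete: the Gini impurity of a set of class labels is 1 minus the sum of squared class proportions, and splits that produce purer groups score lower. A minimal sketch, not tied to any particular library:

```python
from collections import Counter

def gini(labels):
    """Gini impurity: 1 - sum(p_i ** 2) over the class proportions p_i."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

# A pure node has impurity 0; an evenly mixed two-class node has impurity 0.5
pure = gini(["yes", "yes", "yes"])
mixed = gini(["yes", "no", "yes", "no"])
```

The entropy criterion works the same way with `-sum(p_i * log2(p_i))` in place of the Gini formula; both reward splits whose child groups are dominated by a single class.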

Advantages
Amongst decision support tools, decision trees (and influence diagrams) have several
advantages:

Decision trees:

• are simple to understand and interpret. People are able to understand
decision tree models after a brief explanation.
• have value even with little hard data. Important insights can be
generated based on experts describing a situation (its alternatives,
probabilities, and costs) and their preferences for outcomes.
• use a white box model. If a given result is provided by a model, the
explanation for the result is easily replicated by simple math. (An example of
a black box model is an artificial neural network since the explanation for the
results can be excessively complex for a decision maker to comprehend.)
• can be combined with other decision techniques. The following example
uses Net Present Value calculations, PERT 3-point estimations (decision #1)
and a linear distribution of expected outcomes (decision #2):

• can be used to optimize an investment portfolio. The following example
shows a portfolio of 7 investment options (projects). The organization has
$10,000,000 available for the total investment. Bold lines mark the best
selection, 1, 3, 6 and 7, which will cost $7,740,000 and create a payoff of
$2,710,000. All other combinations would either exceed the budget or yield a
lower payoff:
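Two of the techniques named in these bullets can be sketched briefly. The first helper discounts a stream of cash flows to a Net Present Value; the second treats project selection as a small 0/1 knapsack problem and brute-forces all affordable subsets, which is exactly the "best selection under a budget" search described above. All figures below are hypothetical, since the original worked figures are not reproduced here:

```python
from itertools import combinations

def npv(rate, cashflows):
    """Net Present Value: sum of cashflows[t] / (1 + rate)**t, t starting at 0."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def best_portfolio(projects, budget):
    """Affordable subset of (cost, payoff) projects with the highest total payoff.

    Brute force over all subsets -- fine for a handful of projects, such as
    the 7-option portfolio described above.
    """
    best = ((), 0, 0)  # (indices, cost, payoff)
    for r in range(1, len(projects) + 1):
        for combo in combinations(range(len(projects)), r):
            cost = sum(projects[i][0] for i in combo)
            payoff = sum(projects[i][1] for i in combo)
            if cost <= budget and payoff > best[2]:
                best = (combo, cost, payoff)
    return best

# Hypothetical branch: invest 100 now, receive 60 in each of the next two years
value = npv(0.10, [-100, 60, 60])

# Hypothetical projects (cost, payoff) against a 6,000 budget
projects = [(3000, 1200), (2000, 600), (4000, 1500)]
sel, cost, payoff = best_portfolio(projects, 6000)
```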

Managerial grid model
The Managerial Grid Model (1964) is a behavioral leadership model developed by
Robert Blake and Jane Mouton. This model identifies five different leadership styles
based on the concern for people and the concern for production. The optimal leadership
style in this model is based on Theory Y.

A graphical representation of the Managerial Grid

As shown in the figure, the model is represented as a grid with concern for production as
the X-axis and concern for people as the Y-axis; each axis ranges from 1 (Low) to 9
(High). The five resulting leadership styles are as follows:

The impoverished style (1,1)

In this style, managers have low concern for both people and production. Managers use
this style to avoid getting into trouble. The main concern for the manager is not to be held
responsible for any mistakes, which results in less innovative decisions.

Features:
1. Does only enough to preserve job and job seniority.
2. Gives little and enjoys little.
3. Protects himself by not being noticed by others.

Implications:
1. Tries to stay in the same post for a long time.

The country club style (1,9)

This style has a high concern for people and a low concern for production. Managers
using this style pay much attention to the security and comfort of the employees, in hopes
that this would increase performance. The resulting atmosphere is usually friendly, but
not necessarily productive.

The produce or perish style (9,1)

With a high concern for production, and a low concern for people, managers using this
style find employee needs unimportant; they provide their employees with money and
expect performance back. Managers using this style also pressure their employees
through rules and punishments to achieve the company goals. This dictatorial style is
based on Theory X of Douglas McGregor, and is commonly applied by companies on the
edge of real or perceived failure. This style is also used in crisis management.

The middle-of-the-road style (5,5)

Managers using this style try to balance between company goals and workers' needs. By
giving some concern to both people and production, managers who use this style hope to
achieve acceptable performance.

The team style (9,9)

In this style, high concern is paid both to people and production. As suggested by the
propositions of Theory Y, managers choosing to use this style encourage teamwork and
commitment among employees. This method relies heavily on making employees feel as
a constructive part of the company.
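The five styles sit at fixed points on the 9x9 grid, so a (production, people) score pair can be mapped to the closest named style. The nearest-anchor rule in this sketch is my own simplification for illustration, not part of Blake and Mouton's model:

```python
# The five named styles at their grid coordinates (production, people)
STYLES = {
    (1, 1): "impoverished",
    (1, 9): "country club",
    (9, 1): "produce or perish",
    (5, 5): "middle-of-the-road",
    (9, 9): "team",
}

def nearest_style(production, people):
    """Map a (production, people) pair, each scored 1-9, to the closest anchor."""
    return min(STYLES, key=lambda p: (p[0] - production) ** 2 + (p[1] - people) ** 2)

# A manager scoring 2 on production and 8 on people lands nearest "country club"
style = STYLES[nearest_style(2, 8)]
```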

Inspiration for Conflict Style Inventories

The managerial grid has also served as the inspiration for several conflict management
style inventories, notably the Thomas-Kilmann Inventory (TKI) and the Kraybill Conflict
Style Inventory.

SATISFIER/DISSATISFIERS

Getting the right information is essential to making the right decisions, and good research
helps by providing you with that information.

Key Driver Analysis:

With Key Driver Analyses, the focus is placed on identifying the “key drivers” of satisfaction, loyalty,
or retention. The goal of this type of analysis is to identify the smallest set of attributes
(performance/features/attributes) that has the greatest capacity to alter/improve overall
satisfaction, loyalty, or retention – those elements that are most important in driving
satisfaction/loyalty/retention. In general, two broad categories of methods are used:

• Indirect Methods – “back room” statistical and/or mathematical methods used to identify
key drivers. Three of the more common methods used to identify underlying relationships
and relative importance are correlations, factor analysis, and multiple regression. These
types of methods can often be easily used with existing CSM databases and are minimally
intrusive. However, the results are limited by the quality and depth of the information available,
as well as by multicollinearity in the data.
• Direct Methods – methods used to identify key drivers directly from customers in a separate
research initiative. Some of the more common methods used to identify relative importance
are importance ratings, trade-off methods (rank ordering, constant sum, and pair
preference), and card sorts. These types of methods require separate/additional research,
but greatly increase the likelihood of discovering new issues because they are not limited
by the information currently in-house.
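As an illustration of the indirect approach, attributes can be ranked by the strength of their correlation with overall satisfaction. The sketch below uses made-up survey ratings; a real analysis would also need to address multicollinearity, for example via multiple regression as noted above:

```python
from math import sqrt

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Made-up ratings: overall satisfaction vs. two candidate driver attributes
overall    = [1, 2, 3, 4, 5]
attributes = {"on-time": [1, 2, 3, 4, 5], "in-flight": [1, 1, 2, 2, 3]}

# Rank attributes by |correlation| with overall satisfaction: the "key drivers"
drivers = sorted(attributes,
                 key=lambda a: abs(pearson(attributes[a], overall)),
                 reverse=True)
```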

Satisfier/Dissatisfier Analysis™:

The Satisfier/Dissatisfier Profile™ is a method of analyzing customer satisfaction, loyalty, or


retention information developed by Bosma Research International. The Profile reveals the extent to
which meeting or not meeting customer expectations across specific performance attributes affects
overall satisfaction, loyalty, or retention. In particular, the Profile identifies satisfiers (things that, if
done well, create satisfaction, loyalty, or retention) and dissatisfiers (things that, if done poorly,
create dissatisfaction, disloyalty, or defection).

The Profile can best be described as a composite representation of the specific aspects that promote
satisfaction/loyalty/retention as well as the aspects that tend to arouse
dissatisfaction/disloyalty/defection. The power of the Profile can be viewed in the following two
examples.

The Results of a Traditional Key Driver Analysis:

The traditional key driver analysis can identify the relative importance of Scale of Operations,
Aircraft, In-flight Service, and On-Time Performance as key drivers of overall customer satisfaction
for this market segment.

The Results of the Same Scenario Using a Satisfier/Dissatisfier Analysis™:

An analysis of the same scenario using a Satisfier/Dissatisfier Analysis™ reveals that with this
particular market segment, the negative impact and/or consequences of not meeting customers’
expectations across these four service areas is substantially greater than the potential increase to
be gained. It becomes very clear that there is little room for error across any of these four service
areas with this customer group. As a result, a service priority for the airline would likely need to be
doing things right the first time with this customer group.

The Results of the Same Satisfier/Dissatisfier Analysis™ for a Different Segment:

This second example presents both the power and capacity of the Profile to identify clear differences
in service/product expectations across different market segments.

Brainstorming
Brainstorming is a group creativity technique that was designed to generate a large
number of ideas for the solution of a problem. The method originated in a 1953 book
called Applied Imagination by Alex Faickney Osborn, an advertising executive. Osborn
proposed that groups could double their creative output by using the method of
brainstorming. Although brainstorming has become a popular group technique,
researchers have generally failed to find evidence of its effectiveness for enhancing either
quantity or quality of ideas generated. Because of such problems as distraction, social
loafing, evaluation apprehension, and production blocking, brainstorming groups are little
more effective than other types of groups, and they are actually less effective than
individuals working independently. For this reason, there have been numerous attempts
to improve brainstorming or replace it with more effective variations of the basic
technique.

Although traditional brainstorming may not increase the productivity of groups, it has
other potential benefits, such as enhancing the enjoyment of group work and improving
morale. It may also serve as a useful exercise for team building.

Approach
There are four basic rules in brainstorming. These are intended to reduce the social
inhibitions that occur in groups and therefore stimulate the generation of new ideas. The
expected result is a dynamic synergy that will dramatically increase the creativity of the
group.

1. Focus on quantity: This rule is a means of enhancing divergent production,
aiming to facilitate problem solving through the maxim, quantity breeds quality.
The assumption is that the greater the number of ideas generated, the greater the
chance of producing a radical and effective solution.
2. No criticism: It is often emphasized that in group brainstorming, criticism should
be put 'on hold'. Instead of immediately stating what might be wrong with an idea,
the participants focus on extending or adding to it, reserving criticism for a later
'critical stage' of the process. By suspending judgment, one creates a supportive
atmosphere where participants feel free to generate unusual ideas.
3. Unusual ideas are welcome: To get a good and long list of ideas, unusual ideas
are welcomed. They may open new ways of thinking and provide better solutions
than regular ideas. They can be generated by looking from another perspective or
setting aside assumptions.
4. Combine and improve ideas: Good ideas can be combined to form a single very
good idea, as suggested by the slogan "1+1=3". This approach is assumed to lead
to better and more complete ideas than merely generating new ideas alone. It is
believed to stimulate the building of ideas by a process of association.

Diagram of a brainstorming session

Outline of the method


Set the problem

One of the most important things to do before a session is to define the problem. The
problem must be clear, not too big, and captured in a definite question such as “What
service for mobile phones is not available now, but needed?”. If the problem is too big,
the chairman should divide it into smaller components, each with its own question. Some
problems are multi-dimensional and non-quantified, for example “What are the aspects
involved in being a successful entrepreneur?”. Finding solutions for this kind of problem
can be done with morphological analysis.

Create a background memo

The background memo is the invitation and informational letter for the participants,
containing the session name, problem, time, date, and place. The problem is described in
the form of a question, and some example ideas are given. The ideas are solutions to the
problem, and used when the session slows down or goes off-track. The memo is sent to
the participants at least two days in advance, so that they can think about the problem
beforehand.

Select participants

The chairman composes the brainstorming panel, consisting of the participants and an
idea collector. Ten or fewer group members are generally more productive than larger
groups. Many variations are possible but the following composition is suggested.

• Several core members of the project who have proved themselves.


• Several guests from outside the project, with affinity to the problem.
• One idea collector who records the suggested ideas.

Create a list of lead questions

During the brainstorm session the creativity may decrease. At this moment, the chairman
should stimulate creativity by suggesting a lead question to answer, such as Can we
combine these ideas? or How about a look from another perspective?. It is advised to
prepare a list of such leads before the session begins.

Session conduct

The chairman leads the brainstorming session and ensures that the basic rules are
followed. The activities of a typical session are:

1. A warm-up session, to expose novice participants to the criticism-free
environment. A simple problem is brainstormed, for example What should be the
next corporate Christmas present? or What can be improved in Microsoft
Windows?.
2. The chairman presents the problem and gives a further explanation if needed.
3. The chairman asks the brainstorming panel for their ideas.
4. If no ideas are coming out, the chairman suggests a lead to encourage creativity.
5. Every participant presents his or her idea, and the idea collector records them.
6. If more than one participant has ideas, the chairman lets the most associated idea
be presented first. This selection can be done by looking at the body language of
the participants, or just by asking for the most associated idea.
7. The participants try to elaborate on the idea, to improve the quality.
8. When time is up, the chairman organizes the ideas based on the topic goal and
encourages discussion. Additional ideas may be generated.
9. Ideas are categorized.
10. The whole list is reviewed to ensure that everyone understands the ideas.
Duplicate ideas and obviously infeasible solutions are removed.
11. The chairman thanks all participants and gives each a token of appreciation.

Process of conducting a brainstorming session

The process

• Participants who have an idea but no opportunity to present it are encouraged to
write down their idea and present it later.
• The idea collector should number the ideas, so that the chairman can use the
number to encourage quantitative idea generation, for example: We have 44 ideas
now, let’s get it to 50!.
• The idea collector should repeat the idea in the words he or she has written it, to
confirm that it expresses the meaning intended by the originator.
• When more than one participant has an idea, the one with the most associated idea
should have priority. This is to encourage elaboration on previous ideas.
• During the brainstorming session the attendance of managers and superiors is
strongly discouraged, as it may inhibit and reduce the effect of the four basic
rules, especially the generation of unusual ideas.

Variations

Nominal group technique

The nominal group technique is a type of brainstorming that encourages all participants
to have an equal say in the process. It is also used to generate a ranked list of ideas.

Participants are asked to write down their ideas anonymously. Then the moderator
collects the ideas and each is voted on by the group. The vote can be as simple as a show
of hands in favor of a given idea. This process is called distillation.

After distillation, the top ranked ideas may be sent back to the group or to subgroups for
further brainstorming. For example, one group may work on the color required in a
product. Another group may work on the size, and so forth. Each group will come back to
the whole group for ranking the listed ideas. Sometimes ideas that were previously
dropped may be brought forward again once the group has re-evaluated the ideas.

It is important for the moderator to have received training in this process before
attempting to take on the moderating task. The group should be primed and encouraged
to embrace the process. Like all team efforts, it may take a few practice sessions to train
the team in the method before tackling the important ideas.

Group passing technique

Each person in a circular group writes down one idea, and then passes the piece of paper
to the next person in a clockwise direction, who adds some thoughts. This is repeated
until everybody gets their original piece of paper back. By this time, it is likely that the
group will have extensively elaborated on each idea.

A popular alternative to this technique is to create an "Idea Book" and post a distribution
list or routing slip to the front of the book. On the inside cover (or first page) is a
description of the problem. The first person to receive the book lists his or her ideas and
then routes the book to the next person on the distribution list. The second person can log
new ideas or add to the ideas of the previous person. This continues until the distribution
list is exhausted. A follow-up "read out" meeting is then held to discuss the ideas logged in
the book. This technique takes longer, but allows individual thought whenever the person
has time to think deeply about the problem.

Electronic brainstorming

Electronic brainstorming is done via email. The chairman or facilitator sends the question
out to group members, and they contribute independently by sending their ideas directly
back to the facilitator. The facilitator then compiles a list of ideas and sends it back to the
group for further feedback. Electronic brainstorming eliminates many of the problems of
standard brainstorming, such as production blocking and evaluation apprehension. An
additional advantage of this method is that all ideas can be archived electronically in their
original form, and then retrieved later for further thought and discussion. Electronic
brainstorming also enables much larger groups to brainstorm on a topic than would
normally be productive in a traditional brainstorming session.

Conclusion
Brainstorming is a popular method of group interaction in both educational and business
settings. Although it does not appear to provide a measurable advantage in creative
output, brainstorming is an enjoyable exercise that is typically well received by
participants. Newer variations of brainstorming seek to overcome barriers like production
blocking and may well prove superior to the original technique. How well these newer
methods work, and whether or not they should still be classified as brainstorming, are
questions that require further research before they can be answered.

T-Groups
History
In 1947, the National Training Laboratories Institute began in Bethel, ME. They pioneered the use of T-
groups (Laboratory Training) in which the learners use here and now experience in the group, feedback
among participants and theory on human behavior to explore group process and gain insights into
themselves and others. The goal is to offer people options for their behavior in groups. The T-group was a
great training innovation which provided the base for what we now know about team building. This was a
new method that would help leaders and managers create a more humanistic, people serving system and
allow leaders and managers to see how their behavior actually affected others. There was a strong value of
concern for people and a desire to create systems that took people's needs and feelings seriously.

Objectives of T-Group Learning


The T-Group is intended to provide you with the opportunity to:

• Increase your understanding of group development and dynamics.


• Gain a better understanding of the underlying social processes at work within a group (looking
under the tip of the iceberg)
• Increase your skill in facilitating group effectiveness.
• Increase interpersonal skills
• Experiment with changes in your behavior
• Increase your awareness of your own feelings in the moment; and offer you the opportunity to
accept responsibility for your feelings.
• Increase your understanding of the impact of your behavior on others.
• Increase your sensitivity to others' feelings.
• Increase your ability to give and receive feedback.
• Increase your ability to learn from your own and a group's experience.
• Increase your ability to manage and utilize conflict.

Success in these goals depends, to a large extent, on the implied contract that each participant is willing to
disclose feelings that she or he may have, in the moment, about others in the group, and to solicit feedback
from the others about herself or himself. The focus is upon individual learning; some participants may learn
a great deal in most of the above areas, others learn relatively little.

Method
One way of describing what may happen for a participant is --

1. Unfreezing habitual responses to situations -- this is facilitated by the participant's own desire to
explore new ways of behaving and the trainer staying non-directive, silent, and providing little
structure or task agenda
2. Self-generated and chosen change by the participant: experimenting with new
behaviors, and practicing description rather than evaluation of behavior
3. Reinforce new behavior by positive feedback, participants own assessment of whether what is
happening is closer to what she/he intends, supportive environment, trust development

Sources of Change in Groups

• Self-observation - participants give more attention to their own intentions, feelings, etc.
• Feedback - participants receive information on the impact they have on others
• Insight - participants expand self-knowledge
• Self-disclosure - participants expose more of themselves to others
• Universality - participants experience that others share their difficulties, concerns or hopes
• Group Cohesion - participants experience trust, acceptance & understanding
• Hope - participants see others learn, achieve their goals, improve, and cope more effectively
• Vicarious Learning - participants pick up skills and attitudes from others
• Catharsis - participants experience a sense of release or breakthrough

A Description

The T-group provides participants with an opportunity to learn about themselves, their impact on others
and how to function more effectively in group and interpersonal situations. It facilitates this learning by
bringing together a small group of people for the express purpose of studying their own behavior when they
interact within a small group.

A T-Group is not a group discussion or a problem solving group.

The group's work is primarily process rather than content oriented. The focus tends to be on the feelings and
the communication of feelings, rather than on the communication of information, opinions, or concepts.
This is accomplished by focusing on the 'here and now' behavior in the group. Attention is paid to particular
behaviors of participants, not to the "whole person"; feedback is non-evaluative and reports on the impact of
the behavior on others. The participant has the opportunity to become a more authentic self in relation to
others through self disclosure and receiving feedback from others. The Johari Window is a model that looks
at that process.

The training is not structured in the manner you might experience in an academic program or a meeting
with an agenda or a team with a task to accomplish. The lack of structure and limited involvement of the
trainers provides space for the participants to decide what they want to talk about. No one tells them what
they ought to talk about. The lack of direction results in certain characteristic responses; participants are
silent or aggressive or struggle to start discussions or attempt to structure the group.

In the beginning of a T-Group participants are usually focused on what they experience as a need for
structure, individual emotional safety, predictability, and something to do in common. These needs are what
amount to the tip of the iceberg in most groups in their back home situation. By not filling the group's time
with answers to these needs, the T-Group eventually begins to notice what is under the tip of the iceberg. It
is what is always there in any group but often unseen and not responsibly engaged. So, participants
experience anxiety about authority and power, about being included and accepted in the group, and about intimacy.

Depending on forces such as the dynamics of the group, the past experience and competence of
participants, and the skill of the trainers -- the group, to some extent, usually develops a sense of itself as a
group, with feelings of group loyalty. This can cause groups to resist learning opportunities if they are seen
as threatening to the group's self-image. It also provides some of the climate of trust, support and
permission needed for individuals to try new behavior.

As an individual participant begins to experience some degree of trust (in themselves, the group and the
trainers) several things become possible --

• The participant may notice that his/her feelings and judgments about the behavior of others are not
always shared by others; that what he/she found supportive or threatening was not experienced in
that way by others in the group; and that how one responds to authority, acceptance, and affection
issues differs from that of others (often more related to one's family of origin than to what is
happening in the group). Individual differences emerge in how experiences are understood.
• The participant may begin to try on new behavior. For example, someone who has always felt a
need to fill silence with noise and activity tries being quieter and still.
• Participants begin to ask for feedback from the group about how their behavior is impacting others.
• Participants may find that they are really rather independent and have a relatively low level of
anxiety about what is happening in the group. They will exhibit a broader range of behavior and
emotions during the life of the group. In fact their leadership is part of what helps the group develop.

The role of the trainers

• To help the group and individuals analyze and learn from what is happening in the group. The
trainer may draw attention to events and behavior in the group and invite the group to look at its
experience. At times the trainer may offer tentative interpretations.
• To offer theory, a model or research that seems related to what the group is looking at.
• To encourage the group to follow norms that tend to serve the learning process, e.g., focusing on
"here & now" rather than the "then & there".
• To offer training and coaching in skills that tend to help the learning process, e.g., feedback skills,
EIAG, etc.
• To not offer structure or an agenda. To remain silent, allowing the group to experience its anxiety
about acceptance, influence, etc.
• To be willing to disclose oneself, to be open with the group, and on occasion to offer feedback and
challenge a participant.
• To avoid becoming too directive, clinical, or personally involved.

Possible Problems
• T-Group methods usually encourage self-disclosure and openness, which may be inappropriate or
even punished in organizations. This was an early lesson: when managers thought they could take
the T-Group method into the back-home organization, they discovered that the methods and the
assumptions of a T-Group did not fit. T-Groups consisted of participants who were strangers; they
didn't have a history or a future together and could more easily focus on here-and-now behavior.
Another issue was that in the organization there were objectives, deadlines and schedules related to
accomplishing the work of the company or group. Groups with a task to accomplish could not take
the same time that would be used in a T-Group. These difficulties helped lead to the development of
Organization Development and team building. What had been learned in T-Groups was combined
with other knowledge and these new disciplines emerged as ways to address the values raised by the
T-Group experience.
• The T-Group experience can open up a web of questioning in a participant. Ways of behaving that
the person has used for many years may be called into question by others in the group and oneself.
This has in some cases brought the participant to question relationships in the family or at work.
While this can be a very constructive process that leads to the renewal of relationships, it has on
occasion led to the breakdown of a relationship. While such a breakdown might, in time, have come
to the relationship without participation in a T-Group, it remains a painful and possibly damaging
experience.
• Participants being forced or pressured to attend, by an employer or other person with influence, are
on the whole less likely to have a positive learning experience. Employers or others who want to
require the participation of others may enhance the chance of having a productive outcome if -- they
attend a lab themselves before sending others; they speak with the lab coordinator before the event
to discuss what might realistically be expected and what the leader could do to assist in the learning
process when the participant returns home.
• Very rarely, there have been situations in which a participant has had a psychiatric problem. One
report said, "The possibility of negative psychiatric effects of ST, and especially its role in inducing
psychiatric symptoms, is yet to be clarified." This reinforces the value of participation based on
intrinsic motivation; a norm that discourages people in therapy from attending without the approval
of their therapist; and trainers staying focused on the learning areas suited for T-Group experiences.

CONGLOMERATION
Conglomerates are corporations consisting of a number of different companies operating
in diversified fields. In geology circles, the analogous definition of conglomerate is
something consisting of loosely cemented heterogeneous material. Conglomerates in
business are organizations built on the acquisition of firms that are usually in a type of
business indirectly related, if at all, to the acquiring company's other corporate divisions.
The parent company is what holds these loosely related companies together.

Although conglomerates existed before World War II, they became increasingly popular
during the late 1950s and early 1960s. One reason for the adoption of the conglomerate
strategy was that such entities could make acquisitions and grow yet maintain immunity
from the anti-trust prosecution that companies making acquisitions in the same line of
business often found themselves facing. Thus, businesses that were constrained within
their own industry were able to freely expand into different markets. In addition, of
particular importance at the time was that the conglomerate strategy allowed firms
heavily engaged in defense contracts to diversify and reduce the risks associated with
overspecialization.

One of the best examples of conglomeration and its focus on diversification was Textron
Incorporated. After early beginnings as a parachute and textiles manufacturer, Textron,
which was headed by Royal Little, began to acquire unrelated companies in an effort to
expand profits and experience beneficial tax treatments. This diversification became so
wide reaching that Textron began to acquire companies whose products ranged from
cement to helicopters; by 1963, Textron was no longer even in the textile business.

Some experts believe that the techniques used by conglomerates to achieve the
astounding growth for which they were noted were a direct violation of sound corporate
operating principles. Conglomerates exercised few, if any, limits on diversification, often
purchased less than 10 percent of their acquisitions, operated with complex capital
structures, and exhibited high debt-to-earnings ratios. This loose-cement method often
meant that a conglomerate's stock price could fall as quickly as it rose. In addition,
conglomerate corporations often paid debt securities, such as bonds, debentures, and
preferred shares, for the companies they acquired. This was derisively referred to as
"funny money" because the payment did not represent ownership in the acquiring
company, and the company being acquired would surrender outright ownership although
in return it would receive nothing more than evidence of the acquirer's indebtedness.

Litton was one of the first conglomerates to take advantage of this acquisition technique.
It was not, however, the originator of this corporate form. Before Litton came companies
like U. S. Hoffman, Penn-Dixie Industries, Merritt, Chapman & Scott, and Aeronca,
Incorporated. Each of these conglomerates started out small, made a series of
acquisitions, and quickly became top stock market performers. They all failed, however,
because they either purchased poorly performing companies, failed to add any substantial
businesses, squeezed the worth from their acquisitions, or used slick accounting methods
to appear stable.

Although the exact origins of conglomerates are unclear, Litton seems to be the model
that lit the fuse on the conglomerate explosion of the late 1950s and early 1960s. The
company, which was created and led by Charles "Tex" Thornton, began in 1953 by
purchasing three privately held companies with $3 million in combined sales. For the
next fifty-seven straight quarters, a period spanning fourteen years, the company reported
increases in quarter-to-quarter earnings per share. In 1968, sales reached an astounding
$1.68 billion before the earnings record and the company collapsed. The company's stock
price dropped from a high of 120 3/8 in 1967 to 8 1/2 in 1973—a 93 percent decline.
Despite its failure, which Litton blamed on management problems, the company's
earnings record encouraged dozens of new firms to take on the conglomerate form.
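That 93 percent figure checks out arithmetically (stock prices were then quoted in eighths of a dollar):

```python
# Litton's stock fell from a 1967 high of 120 3/8 to 8 1/2 in 1973.
high = 120 + 3 / 8   # 120.375
low = 8 + 1 / 2      # 8.5
decline = (high - low) / high
print(f"{decline:.0%}")  # prints "93%", matching the figure above
```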

While efficient management helped many conglomerates improve the performance of
acquired companies, others were seemingly more interested in earning profits from
securities. Acquiring companies for stocks and bonds and later selling off portions of the
acquired companies generated profits and funds for expansion. James J. Ling of Ling-
Temco-Vought (LTV) used this method of building conglomerates to achieve remarkable
success. By selling off portions of acquired companies and using the money to expand,
Ling took his company from the 204th largest industrial organization in America to the
14th spot in just four years. He would eventually step down as chairman, however, after
the government mounted serious anti-trust challenges and LTV began to suffer
substantial losses, including a $10.59 per share loss in 1968.

Business went well for conglomerates until 1969, when antitrust indictments challenged
some of them and business began to slow. In 1969 and 1970, ten national investigations,
including studies by the Federal Trade Commission, Securities and Exchange
Commission, and Department of Justice, began to explore the conglomerate culture. This
increased scrutiny, along with the publication of stories detailing securities manipulations
of certain conglomerate promoters, began to greatly affect their ability to continue doing
business in the same way. As the economy began to slow in the early 1970s, the
managers of some conglomerates were proved to have been far less efficient than they
had claimed. Nearly a quarter of the conglomerates doing big business in the 1960s failed
to make it beyond the 1970s. But while most conglomerates survived the recession of the
early 1970s, they were no longer regarded with the enthusiasm they had enjoyed for over
a decade.

The trend at the end of the twentieth century was for conglomerates to move from being
large, unfocused behemoths to firms that created organizations focused on core
capabilities. This means more companies began to avoid acquisitions that clashed with
their business mix and focused on acquiring companies with related synergies. This, of
course, is in direct opposition to the mindset of the conglomerate boom, when market
focus and business streamlining were, at best, secondary concerns.

MANAGEMENT BY OBJECTIVES
Management by objectives (MBO) is a systematic and organized approach that
allows management to focus on achievable goals and to attain the best possible
results from available resources. It aims to increase organizational performance
by aligning goals and subordinate objectives throughout the organization.
Ideally, employees get strong input to identify their objectives, time lines for
completion, etc. MBO includes ongoing tracking and feedback in the process to
reach objectives.
Management by Objectives (MBO) was first outlined by Peter Drucker in 1954 in
his book 'The Practice of Management'. In the 1990s, Drucker himself
downplayed the significance of this management method, saying: "It's just
another tool. It is not the great cure for management inefficiency...
Management by Objectives works if you know the objectives; 90% of the time
you don't."
Core Concepts
According to Drucker, managers should "avoid the activity trap": getting so
involved in their day-to-day activities that they forget their main purpose or
objective. Instead of just a few top managers, all managers should:
• participate in the strategic planning process, in order to improve the
implementability of the plan, and
• implement a range of performance systems, designed to help the
organization stay on the right track.
Managerial Focus
MBO managers focus on the result, not the activity. They delegate tasks by
"negotiating a contract of goals" with their subordinates without dictating a
detailed roadmap for implementation. Management by Objectives (MBO) is about
setting yourself objectives and then breaking these down into more specific
goals or key results.
Main Principle
The principle behind Management by Objectives (MBO) is to make sure that
everybody within the organization has a clear understanding of the aims, or
objectives, of that organization, as well as awareness of their own roles and
responsibilities in achieving those aims. The complete MBO system aims to get
managers and empowered employees acting to implement and achieve their
plans, which automatically achieves the plans of the organization.
Where to Use MBO
The MBO style is appropriate for knowledge-based enterprises when your staff is
competent. It is appropriate in situations where you wish to build employees'
management and self-leadership skills and tap their creativity, tacit knowledge
and initiative. Management by Objectives (MBO) is also used by chief executives
of multinational corporations (MNCs) for their country managers abroad.
Setting Objectives
In Management by Objectives (MBO) systems, objectives are written down for
each level of the organization, and individuals are given specific aims and
targets. "The principle behind this is to ensure that people know what the
organization is trying to achieve, what their part of the organization must do to
meet those aims, and how, as individuals, they are expected to help. This
presupposes that the organization's programs and methods have been fully
considered. If they have not, start by constructing team objectives and ask team
members to share in the process."6
"The one thing an MBO system should provide is focus", says Andy Grove, who
ardently practiced MBO at Intel. So make your objectives precise and keep their
number small. Most people disobey this rule, try to focus on everything, and end
up with no focus at all.
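As an illustrative sketch only (none of this appears in the source), a small set of precise objectives with measurable key results, reviewed against actuals, might be modeled like this; all names and numbers are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Objective:
    """One precise objective with a small number of measurable key results."""
    owner: str
    aim: str
    key_results: dict = field(default_factory=dict)  # target value per key result
    actuals: dict = field(default_factory=dict)      # measured results at review time

    def progress(self):
        """Fraction of key results met or exceeded at review time."""
        if not self.key_results:
            return 0.0
        met = sum(self.actuals.get(k, 0.0) >= target
                  for k, target in self.key_results.items())
        return met / len(self.key_results)

# A hypothetical country manager's objective, aligned with a corporate aim
obj = Objective(owner="country manager",
                aim="grow regional revenue",
                key_results={"revenue_m": 12.0, "new_accounts": 30},
                actuals={"revenue_m": 13.1, "new_accounts": 24})
print(obj.progress())  # 0.5 -- one of the two key results was met
```

Keeping `key_results` deliberately short mirrors Grove's rule: a handful of precise, reviewable targets rather than an unfocused list.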
For Management by Objectives (MBO) to be effective, individual managers must
understand the specific objectives of their job and how those objectives fit in
with the overall company objectives set by the board of directors. "A manager's
job should be based on a task to be performed in order to attain the company's
objectives... the manager should be directed and controlled by the objectives of
performance rather than by his boss."1
The managers of the various units, sub-units, or sections of an organization
should not only know the objectives of their unit but should also actively
participate in setting these objectives and take responsibility for them.
The review mechanism enables leaders to measure the performance of their
managers, especially in the key result areas: marketing; innovation; human
organization; financial resources; physical resources; productivity; social
responsibility; and profit requirements.
However, in recent years opinion has moved away from the idea of placing
managers into a formal, rigid system of objectives. Today, when maximum
flexibility is essential, achieving the objective in the right way is more important.
Balance between Management and Employee Empowerment
The balance between management and employee empowerment has to be struck,
not by thinkers, but by practicing managers. Turning aims into successful
actions forces managers to master five basic operations:
• setting objectives,
• organizing the group,
• motivating and communicating,
• measuring performance, and
• developing people, including yourself.
These Management by Objectives (MBO) operations are all compatible with
empowerment, if you follow the main principle of decentralization: telling
people what is to be done, but letting them achieve it their own way. To make
the principle work well, people need to be able to develop personally. Further,
different people have different hierarchies of needs and thus need to be managed
differently if they are to perform well and achieve their potential.
Empowerment recognizes "the demise" of the command-and-control system, but
remains a term of power and rank. A manager should view members of his or her
team much as a conductor regards the players in the orchestra, as individuals
whose particular skills contribute to the success of the enterprise. While people
are still subordinates, the superior is increasingly dependent on the
subordinates for getting results in their area of responsibility, where they have
the requisite knowledge. In turn, these subordinates depend on their superior for
direction and "above all, to define what the 'score' is for the entire organization,
that is, what are standards and values, performance and results."

Diversification
A risk-management technique that mixes a wide variety of
investments within a portfolio. The rationale behind this technique
contends that a portfolio of different kinds of investments will, on
average, yield higher returns and pose a lower risk than any
individual investment found within the portfolio.

Diversification strives to smooth out unsystematic risk events in a
portfolio so that the positive performance of some investments
will neutralize the negative performance of others. Therefore, the
benefits of diversification will hold only if the securities in the
portfolio are not perfectly correlated.

Diversification is a familiar term to most investors. In the most general sense, it can be
summed up with this phrase: "Don’t put all of your eggs in one basket." While that
sentiment certainly captures the essence of the issue, it provides little guidance on the
practical implications of the role diversification plays in an investor's portfolio and offers
no insight into how a diversified portfolio is actually created. In this article, we'll provide
an overview of diversification and give you some insight into how you can make it work
to your advantage.
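The condition that the securities not be perfectly correlated can be made concrete with the standard two-asset portfolio-variance formula, σp² = w1²σ1² + w2²σ2² + 2·w1·w2·ρ·σ1·σ2. A minimal sketch (the weights and volatilities below are hypothetical, not from the text):

```python
import math

def portfolio_vol(w1, s1, s2, rho):
    """Volatility of a two-asset portfolio with weights w1 and 1 - w1."""
    w2 = 1 - w1
    var = (w1 * s1) ** 2 + (w2 * s2) ** 2 + 2 * w1 * w2 * rho * s1 * s2
    return math.sqrt(var)

# Two assets, each 20% volatile, held 50/50 (illustrative figures)
for rho in (1.0, 0.5, 0.0):
    print(f"correlation {rho:+.1f}: volatility {portfolio_vol(0.5, 0.20, 0.20, rho):.1%}")
```

At ρ = 1 the portfolio is exactly as volatile as its parts (20.0%), so diversification buys nothing; at ρ = 0 volatility drops to about 14.1%.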

Another way to reduce the risk in your portfolio is to include bonds and cash. Because
cash is generally used as a short-term reserve, most investors develop an asset allocation
strategy for their portfolios based primarily on the use of stocks and bonds. It is never a
bad idea to keep a portion of your invested assets in cash, or short-term money-market
securities. Cash can be used in case of an emergency, and short-term money-market
securities can be liquidated instantly in case an investment opportunity arises, or in the
event your usual cash requirements spike and you need to sell investments to make
payments. Also keep in mind that asset allocation and diversification are closely linked
concepts; a diversified portfolio is created through the process of asset allocation. When
creating a portfolio that contains both stocks and bonds, aggressive investors may lean
toward a mix of 80% stocks and 20% bonds while conservative investors may prefer a
20% stocks to 80% bonds mix.
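As a rough illustration of the aggressive versus conservative mixes above, the expected return of a blend is just a weighted average; the 8% stock and 4% bond return assumptions below are hypothetical, not from the text:

```python
def blend(stock_w, stock_ret=0.08, bond_ret=0.04):
    """Expected return of a stock/bond mix as a weighted average (illustrative)."""
    return stock_w * stock_ret + (1 - stock_w) * bond_ret

print(f"aggressive 80/20 mix: {blend(0.8):.1%} expected return")   # 7.2%
print(f"conservative 20/80 mix: {blend(0.2):.1%} expected return")  # 4.8%
```

The aggressive mix earns more on average in exchange for the larger swings of its stock component.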

Regardless of whether you are aggressive or conservative, the use of asset allocation to
reduce risk through the selection of a balance of stocks and bonds for your portfolio is a
more detailed description of how a diversified portfolio is created than the simplistic
eggs-in-one-basket concept. With this in mind, you will notice that mutual fund portfolios
composed of a mix that includes both stocks and bonds are referred to as "balanced"
portfolios. The specific balance of stocks and bonds in a given portfolio is designed to
create a specific risk-reward ratio that offers the opportunity to achieve a certain rate of
return on your investment in exchange for your willingness to accept a certain amount of
risk. In general, the more risk you are willing to take, the greater the potential return on
your investment.

What Are My Options?


If you are a person of limited means or you simply prefer uncomplicated investment
scenarios, you could choose a single balanced mutual fund and invest all of your assets in
the fund. For most investors, this strategy is far too simplistic. While a given mix of
investments may be appropriate for a child's college education fund, that mix may not be
a good match for long-term goals, such as retirement or estate planning. Likewise,
investors with large sums of money often require strategies designed to address more
complex needs, such as minimizing capital gains taxes or generating reliable income
streams. Furthermore, while investing in a single mutual fund provides diversification
among the basic asset classes of stocks, bonds and cash (funds often hold a small amount
of cash from which to take their fees), the opportunities for diversification go far beyond
these basic categories.

With stocks, investors can choose a specific style, such as focusing on large caps, mid
caps or small caps. In each of these areas are stocks categorized as growth or value.
Additional choices include domestic stocks and foreign stocks. Foreign stocks also offer
sub-categorizations that include both developed and emerging markets. Both foreign and
domestic stocks are also available in specific sectors, such as biotechnology and health
care.

In addition to the variety of equity investment choices, bonds also offer opportunities for
diversification. Investors can choose long-term or short-term issues. They can also select
high-yield or municipal bonds. Once again, risk tolerance and personal investment
requirements will largely dictate investment selection.

While stocks and bonds represent the traditional tools for portfolio construction, a host of
alternative investments provide the opportunity for further diversification. Real estate
investment trusts, hedge funds, art and other investments provide the opportunity to
invest in vehicles that do not necessarily move in tandem with the traditional financial
markets. These investments offer yet another method of portfolio diversification.

Different Types of Risk
Investors confront two main types of risk when investing:

• Undiversifiable - Also known as "systematic" or "market" risk, undiversifiable risk is
associated with every company. Causes are things like inflation rates, exchange rates,
political instability, war, and interest rates. This type of risk is not specific to a
particular company or industry, and it cannot be eliminated or reduced through
diversification; it is just a risk that investors must accept.
• Diversifiable - This risk is also known as "unsystematic risk"; it is specific to a
company, industry, market, economy, or country, and it can be reduced through
diversification. The most common sources of unsystematic risk are business risk
and financial risk. Thus, the aim is to invest in various assets so that your assets
are not all affected the same way by market events.
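The split between the two risk types shows up in the textbook result for an equal-weighted portfolio of n assets: portfolio variance is (average variance)/n plus (1 − 1/n) times the average covariance, so the diversifiable term shrinks as 1/n while a systematic floor remains. A sketch with illustrative numbers (not market data):

```python
import math

def equal_weight_vol(n, avg_var=0.04, avg_cov=0.01):
    """Volatility of an equal-weighted n-asset portfolio (illustrative figures).

    Portfolio variance = avg_var / n + (1 - 1/n) * avg_cov: the first
    (diversifiable) term vanishes as n grows, while the average-covariance
    (systematic) term remains as a floor.
    """
    return math.sqrt(avg_var / n + (1 - 1 / n) * avg_cov)

for n in (1, 10, 100):
    print(f"{n:3d} holdings: {equal_weight_vol(n):.1%}")
```

With these figures, volatility falls from 20% with one holding toward the 10% systematic floor (the square root of the average covariance), but it never reaches zero; that is exactly why undiversifiable risk "must be accepted."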

Why You Should Diversify


Let's say you have a portfolio of only airline stocks. If it is publicly announced that
airline pilots are going on an indefinite strike and that all flights are canceled, share prices
of airline stocks would drop. Your portfolio would witness a noticeable drop in value. If,
however, you counterbalanced the airline-industry stocks with a couple of railway stocks,
only part of your portfolio would be affected. In fact, there is a good chance that the
railway stocks' prices would climb as passengers turn to trains as an alternative form of
transportation.

But you could diversify even further since there are many risks that affect both rail and
air because each is involved in transportation. An event that reduces any form of travel
hurts both types of companies - statisticians would say that rail and air stocks have a
strong correlation. Therefore, to achieve superior diversification, you would want
to diversify across not only different types of companies but also different types of
industries. The more uncorrelated your stocks are, the better.

But it's also important that you diversify among different asset classes. Because different
assets - such as bonds and stocks - will not each react in the same way to adverse events,
a combination of asset classes will reduce your portfolio's sensitivity to market swings.
Generally, the bond and equity markets move in opposite directions, so, if your portfolio
is diversified across both areas, unpleasant movements in one will be offset by positive
results in another.

There are additional types of diversification and many synthetic investment products
have been created to accommodate investors' risk-tolerance levels; however, these
products can be very complicated and are not meant for beginner or small
investors. For those who have less investment experience and do not have the financial
backing to enter into hedging activities, bonds are the most popular way to diversify
against the stock market.

Unfortunately, as the world has seen cases of fraud and earnings manipulation, even the
best analysis of a company and its financial statements cannot guarantee that it won't be
a losing investment. Diversification won't prevent a loss, but it can reduce the impact of
fraud and bad information on your portfolio.

Summary
Diversification can help an investor manage risk and reduce the volatility of an asset's
price movements. Remember though, no matter how diversified your portfolio is, risk can
never be eliminated completely. You can reduce risk associated with individual stocks,
but general market risks affect nearly every stock, so it is important to diversify also
among different asset classes. The key is to find a medium between risk and return; this
ensures that you achieve your financial goals while still getting a good night's rest.

The X, Y and Z of Management Theory
Introduction:

Achieving a clear understanding of human nature is an important aspect of management
in the work place. In order for managers and workers to work together as an effective and
productive unit, the workers must know how they fit into the overall scheme of things,
and the managers must have a clear understanding of how they can maximise
productivity by supporting their employees through the appropriate leadership style. It is
also extremely important for managers to realistically evaluate the working environment,
as well as the characteristics of the task, in order to decide how he or she deals with and
directs employees.

Aside from knowing how human nature dictates a worker's actions, the manager must
also be aware of the specific working environment, personalities, and motivational forces,
which drive employees. This can then be used to decide which actions are necessary to
motivate the work force, and to obtain maximum productivity.

The purpose of this paper is to discuss two theorists, Douglas McGregor and William
Ouchi, and the theories, which made them well known in the organisational development
and management arenas. McGregor, with his "Theory X" and "Theory Y", and Ouchi,
with the notion of a "Theory Z", both look at the attitudes of managers and workers with
very similar, as well as contrasting views of how workers are perceived by management,
and how workers perceive their role in the company. In these theories, the various authors
discuss how each plays an important part in the understanding of workers by
management. A comparison and contrast of these two theorists will be presented, which
will show how each might view various aspects of the relationship which exists between
management and workers, in such areas as motivation, leadership, power, authority, and
conflict, to name a few.

Douglas McGregor - Theory X & Theory Y:

In 1960 Douglas McGregor defined contrasting assumptions about the nature of humans
in the work place. These assumptions are the basis of Theory X and Theory Y teachings.
Generally speaking, Theory X assumes that people are lazy and will avoid work
whenever possible. Theory Y, on the other hand, assumes that people are creative and
enjoy work (Goldman). Although "X" and "Y" are the standard names given to
McGregor's theories, it is also appropriate to mention here that other names for these
management theories have been used as well, and are sometimes interchanged with "X"
and "Y". For instance, one author refers to Theory X as "Autocratic Style", and Theory Y
as "Participative Style" (DuBrin). Yet another author writes that Theory X and Theory Y
are sometimes termed as "hard" and "soft" management, although careful to point out that
these terms can be used incorrectly (Benson). This information is presented in order to
illustrate the different terminologies, which have been used to describe McGregor's
theories, and will be used in this paper as well.

William Ouchi - Theory Z:


Another theory which has emerged, and deals with the way in which workers are
perceived by managers, as well as how managers are perceived by workers, is William
Ouchi's "Theory Z". Often referred to as the "Japanese" management style, Theory Z
offers the notion of a hybrid management style which is a combination of a strict
American management style (Theory A) and a strict Japanese management style (Theory
J). This theory speaks of an organisational culture which mirrors the Japanese culture in
which workers are more participative, and capable of performing many and varied tasks.
Theory Z emphasises things such as job rotation, broadening of skills, generalisation
versus specialisation, and the need for continuous training of workers (Luthans).

Much like McGregor's theories, Ouchi's Theory Z makes certain assumptions about
workers. Some of the assumptions about workers under this theory include the notion that
workers tend to want to build co-operative and intimate working relationships with those
that they work for and with, as well as the people that work for them. Also, Theory Z
workers have a high need to be supported by the company, and highly value a working
environment in which such things as family, cultures and traditions, and social
institutions are regarded as equally important as the work itself. These types of workers
have a very well developed sense of order, discipline, moral obligation to work hard, and
a sense of cohesion with their fellow workers. Finally, Theory Z workers, it is assumed,
can be trusted to do their jobs to their utmost ability, so long as management can be
trusted to support them and look out for their well being (Massie & Douglas).

One of the most important tenets of this theory is that management must have a high
degree of confidence in its workers in order for this type of participative management to
work. While this theory assumes that workers will be participating in the decisions of the
company to a great degree, one author is careful to point out that the employees must be
very knowledgeable about the various issues of the company, as well as possessing the
competence to make those decisions. This author is also careful to point out, however,
that management sometimes has a tendency to underestimate the ability of the workers to
effectively contribute to the decision making process (Bittel). For this reason, Theory
Z stresses the need for enabling the workers to become generalists, rather than specialists,
and to increase their knowledge of the company and its processes through job rotations
and continual training. In fact, promotions tend to be slower in this type of setting, as
workers are given a much longer opportunity to receive training and more time to learn
the intricacies of the company's operations. The desire, under this theory, is to develop a
work force which has more of a loyalty towards staying with the company for an entire
career, and which is more permanent than in other types of settings. It is expected that once an
employee does rise to a position of high level management, they will know a great deal
more about the company and how it operates, and will be able to use Theory Z
management theories effectively on the newer employees (Luthans).

Theory Analysis, Comparisons & Contrasts:
While several similarities and differences surround the ideas of these two theorists, the
most obvious comparison is that they both deal with perceptions and assumptions about
people. These perceptions tend to take the form of how management views employees,
while Ouchi's Theory Z takes this notion of perceptions a bit farther and talks about how
the workers might perceive management. Table 1 below shows a quick "snapshot"
comparison and contrast of the two theorists, and how they might apply the concepts
shown to their particular management theories.

Comparison & Contrast of Management Theorists

Motivation
McGregor (Theory X & Y): Tends to categorise people as one type or another: either
being unwilling or unmotivated to work, or being self motivated towards work. Threats
and disciplinary action are thought to be used more effectively in this situation,
although monetary rewards can also be a prime motivator to make Theory X workers
produce more.
Ouchi (Theory Z): Believes that people are innately self motivated to not only do their
work, but also are loyal towards the company, and want to make the company succeed.

Leadership
McGregor (Theory X & Y): Theory X leaders would be more authoritarian, while Theory Y
leaders would be more participative. But in both cases it seems that the managers would
still retain a great deal of control.
Ouchi (Theory Z): Theory Z managers would have to have a great deal of trust that their
workers could make sound decisions. Therefore, this type of leader is more likely to act
as "coach", and let the workers make most of the decisions.

Power & Authority
McGregor (Theory X & Y): As mentioned above, McGregor's managers, in both cases, would
seem to keep most of the power and authority. In the case of Theory Y, the manager would
take suggestions from workers, but would keep the power to implement the decision.
Ouchi (Theory Z): The manager's ability to exercise power and authority comes from the
workers' trusting management to take care of them, and allow them to do their jobs. The
workers have a great deal of input and weight in the decision making process.

Conflict
McGregor (Theory X & Y): This type of manager might be more likely to exercise a
"power"-based conflict resolution style, especially with the Theory X workers. Theory Y
workers might be given the opportunity to exert "negotiating" strategies to solve their
own differences.
Ouchi (Theory Z): Conflict in the Theory Z arena would involve a great deal of
discussion, collaboration, and negotiation. The workers would be the ones solving the
conflicts, while the managers would play more of a "third party arbitrator" role.

Performance Appraisals
McGregor (Theory X & Y): Appraisals occur on a regular basis. Promotions also occur on
a regular basis.
Ouchi (Theory Z): Theory Z emphasises more frequent performance appraisals, but slower
promotions.

With respect to overall management style, McGregor's Theory X and Theory Y managers
seem to have a much more formal leadership style than do Ouchi's Theory Z managers.
McGregor's two types of managers hold different views of the workers, while their
views of the tasks remain the same in both cases: that is, one of specialisation and doing
a particular task, although Theory Y suggests that the workers would become very good
at their particular tasks, because they are free to improve the processes and make
suggestions. Theory Z workers, on the other hand, tend to rotate their jobs frequently, and
become more generalists, but at the same time become more knowledgeable about the
overall scheme of things within the company. Several parallels indeed exist between
these two theorists: namely, McGregor's Theory Y and Ouchi's Theory Z both see the
relationship between managers and workers in a very similar light. For instance, they
both see managers as "coaches", helping the workers to be more participative in their
endeavour to be more productive. They both are more group oriented than the Theory X
assumptions, which seem to be more individual oriented. One of the most notable
similarities between McGregor's Theory Y and Ouchi's Theory Z appears in the form of
the type of motivation that makes the workers perform in a way that enables them to be
more productive. While the Theory X worker is said to require coercion, threats, and
possibly even disciplinary action, Theory Y and Theory Z workers are, again, self
motivated. This allows them to focus on the task, and also their role within the company.
Their desire is to be more productive and enable the company to succeed. Theory X
workers, on the other hand, seem to have just enough self motivation to show up at work,
punch the time clock, as it were, and do only that which is necessary to get the job done
to minimum standards.

Summary & Conclusions:

Many assumptions are made in the work place, based on observations of the workers, and
their relationship with management. The types of tasks being performed, as well as the
types of employees which make up a particular organisation can set the stage for the
types of leadership roles which will be assumed by managers. Theory X, which assumes
that workers are lazy and do not want to work, seems to be giving way to theories
which suggest that workers tend to be more participative and creative. Creativity
and motivation naturally lend themselves to a more effective organisation. While
McGregor's Theory Y seems to address the more motivated type of employee, Ouchi's
Theory Z seems to take that notion a step farther by implying that not only are
assumptions about workers made, but assumptions about managers as well. That is to say
that under Ouchi's theory, managers must be more supportive and trusting of their
employees, in order to receive the benefit of increased participation in the decisions of
the company. As comparing and contrasting these two theorists makes clear,
understanding the assumptions made about people helps managers and workers create a
more productive environment in the work place.

ZERO-BASED BUDGETING
Zero-based budgeting requires that the existence of a government program or
programs be justified in each fiscal year, as opposed to simply basing budgeting
decisions on a previous year’s funding level. Zero-based budgeting is often
encouraged by fiscal watchdog groups as a way to ensure against unnecessary
spending. Zero-based budgeting, or some modified version of it, has been used in the
private and public sectors for decades. Indeed, it is my understanding that the first
use of zero-based budgeting in government has been traced back to Gov. Jimmy
Carter’s use of it in Georgia in the early 1970s.

As with most policies, there are both benefits and costs to be taken into account
when considering zero-based budgeting. Case studies about businesses and
governments that have adopted zero-based budgeting, or some hybrid of it,
generally report some improvement quantitatively or qualitatively. That is, the
process has either saved money, improved services, or both.
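The contrast with incremental budgeting can be sketched in code: under zero-based budgeting, every decision package must re-justify its full cost each cycle, and packages are funded from a base of zero in benefit order rather than receiving last year's allocation plus an increment. A minimal toy model (all package names, costs, and benefit scores below are invented for illustration):

```python
# Toy sketch of zero-based budgeting: decision packages are re-justified
# and ranked every cycle; funding starts from zero, not from last year's base.
# Package names, costs, and scores are hypothetical.

def zero_based_allocation(packages, total_budget):
    """Fund packages in descending benefit order until the budget runs out."""
    funded, remaining = [], total_budget
    for pkg in sorted(packages, key=lambda p: p["benefit_score"], reverse=True):
        if pkg["cost"] <= remaining:
            funded.append(pkg["name"])
            remaining -= pkg["cost"]
    return funded, remaining

packages = [
    {"name": "road maintenance",        "cost": 400, "benefit_score": 9},
    {"name": "legacy reporting system", "cost": 300, "benefit_score": 2},
    {"name": "health clinics",          "cost": 500, "benefit_score": 8},
    {"name": "new records archive",     "cost": 200, "benefit_score": 6},
]

funded, left_over = zero_based_allocation(packages, total_budget=1000)
print(funded)     # highest-benefit packages funded first
print(left_over)  # unspent remainder is visible, not rolled into a base
```

Note that a low-benefit package inherited from a previous year (the "legacy reporting system") simply fails to make the cut, which is exactly the behaviour an incremental budget tends to hide.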

Advantages of Zero-Base Budgeting

1. Results in efficient allocation of resources as it is based on needs and benefits
2. Drives managers to find cost-effective ways to improve operations
3. Detects inflated budgets
4. Useful for service departments where the output is difficult to identify
5. Increases staff motivation by providing greater initiative and responsibility
in decision-making
6. Increases communication and coordination within the organization
7. Identifies and eliminates wastage and obsolete operations.
8. Identifies opportunities for outsourcing.
9. Increases restraint in developing budgets
10. Reduces the entitlement mentality with respect to cost increases
11. Makes budget discussions more meaningful during review sessions

Disadvantages of Zero-Base Budgeting

1. Difficult to define decision units and decision packages, as the process is very
time-consuming and exhaustive.
2. Managers are forced to justify every detail related to expenditure. The R&D
department is threatened, whereas the production department benefits.
3. Necessary to train managers. ZBB must be clearly understood by managers at
various levels, otherwise it cannot be successfully implemented. The budgeting
is also difficult to administer and communicate because more managers are
involved in the process.
4. In a large organization, the volume of forms may be so large that no one
person could read it all. Compressing the information down to a usable size
might remove critically important details.

5. The honesty of managers must be reliable and uniform. Any manager who is
prone to exaggeration might skew the results.
6. May increase the time and expense of preparing a budget.
7. May be too radical a solution for the task at hand. You don’t need a
sledgehammer to pound in a nail.
8. Can make matters worse if not done in the right way. A substantial
commitment must be made by all involved to ensure that this doesn’t
happen.

Zero-based budgeting can be useful for shaking up a process that may have grown
stale and counterproductive over time. But I must offer three serious warnings.

First, the success of a change like this hinges strongly on leadership that is
dedicated to the task. If those appointed to conduct budget reviews are unwilling to
truly assess every item in their budget, word will get out quickly that this new
budgeting technique is more symbolism than substance. Indeed, it is incumbent
upon proponents of zero-based budgeting to ensure that those reviewing the budget
do not have a pecuniary interest in maintaining the status quo. Allowing people who
will be most affected by the elimination of programs to conduct their own reviews
may be counterproductive, since most people are quick to defend their own
interests.

Second, don’t attempt to do zero-based budgeting for every department, every year.
Such a move may prove impossible to manage. Instead, choose several departments
and/or agencies, and rotate through every facet of state government over time. In
Oklahoma, which has recently adopted zero-based budgeting, officials are applying
the method to two departments and several agencies each year. Once those reviews
are complete, the same departments and agencies will not see another zero-based
review for eight years.

Third, ensure that each review is conducted by referencing all aspects of a
department, agency or program to its stated goals. This makes the very purpose
of the entity being reviewed transparent, and can increase the opportunities
available for making objective measurements of a department, agency or program’s
success rate.

As with most programs or reforms of programs, it must be done right, or it should
not be done at all. For example, department, agency or program directors who feel
endangered by this kind of scrutiny will be delighted to be placed in charge, so that
they can do it wrong, waste everyone’s time, and give a cutting-edge management
tool like zero-based budgeting a bad name, all at the same time.

Decentralization
Decentralization is the process of dispersing decision-making closer to the point of
service or action. It occurs in a great many contexts in engineering, management science,
political science, political economy, sociology and economics — each of which could be
said to study mass decision-making by groups too large to consult with each other
directly.

Law and science can also be said to be highly decentralized human practices. There are
serious studies of how causality and correlations of phenomena can respectively be
determined and agreed across an entire nation, or indeed across the entire human species
spread across the planet. While such institutions as the International Criminal Court or
Intergovernmental Panel on Climate Change seem highly centralized, in fact they rely so
heavily on the underlying legal and scientific processes that they can be said to simply
reflect, as opposed to impose, global opinion.

A central theme in all kinds of decentralization is the difference between:

• a hierarchy, based on authority: two players in an unequal-power relationship; and

• an interface: a lateral relationship between two players of roughly equal power.

The more decentralized a system is, the more it relies on lateral relationships, and the less
it can rely on command or force. In most branches of engineering and economics,
decentralization is narrowly defined as the study of markets and interfaces between parts
of a system. This is most highly developed as general systems theory and neoclassical
political economy.

Decentralization in History

Decentralization and centralization are themes that have played major roles in the history
of many societies. An excellent example is the gradual political and organizational
changes that have occurred in European history. During the rise and fall of the Roman
Empire, Europe went through major centralization and decentralization. Although the
leaders of the Roman Empire created a European infrastructure, the fall of the Empire
left Europe without a strong political system or military protection. Viking and
other barbarian attacks further led rich Romans to build up their latifundia, or large
estates, in a way that would protect their families and create a self-sufficient living place.
This development led to the growth of the manorial system in Europe. This system was
greatly decentralized, as the lords of the manor had power to defend and control the small
agricultural environment that was their manor. The manors of the early Middle Ages
slowly came together as lords took oaths of fealty to other lords in order to have even
stronger defense against other manors and barbarian groups. This feudal system was also
greatly decentralized, and the kings of weak "countries" did not hold much significant
power over the nobility. Although some view the Roman Catholic Church of the Middle
Ages as a centralizing factor, it played a strong role in weakening the power of the
secular kings, which gave the nobility more power. As the Middle Ages wore on,
corruption in the church and new political ideas began to slowly strengthen the secular
powers and bring together the extremely decentralized society. This centralization
continued through the Renaissance and has been changed and reformed into the present
system, which balances central government against a decentralized distribution of
power. The Roman Empire and the Middle Ages demonstrate the problems with both radical
decentralization and radical centralization, underlining the importance of centralized
nation-states with decentralized representation.

Kaizen and Quality Circles

Quality circles are typically said to have originated in Japan in the 1960s, but others
argue that the practice started with the United States Army soon after 1945, with
the Japanese then adopting and adapting the concept and its application.

Quality circles are not a panacea for quality improvement, but given the right top
management commitment, organisation and resourcing they can support continuous
quality improvement at shop-floor level.

What is a quality circle?

a group of staff who meet regularly to discuss quality-related work problems, so that
they may examine and generate solutions to them. The circle is empowered to promote
and bring the quality improvements through to fruition.

Thus the adoption of quality circles (quality improvement teams) has a social focus. There
must be commitment from senior management, unit management and supervision, other
staff and of course the circle members. A team of 6-9 people needs to participate freely
together, to challenge assumptions and existing methods, examine data and explore
possibilities. They need to be able to call in expertise and ask for training. The quality
circle needs a budget so that members can be responsible for tests and possible pilots.
They need a skilled team leader who works as a facilitator of team efforts, not a dominator.

The circle needs to have a very good approach to

• analysing the context of the problem and its situation

• defining exactly what the problem is and the relationship between its
component parts

• identifying and verifying that the causes are indeed the causes. These must be
understood, otherwise the solutions developed may fail to address the real problem

• quantifying the problem. Problem definition requires quantitative measurement and
often a consensus of qualitative judgement. The impact of the "problem", if it
continues, must be comprehended. Where is it affecting other parts of the "problem
system"?

• understanding the quality objectives to be achieved and evaluating the resources
that can be brought to bear on the problem and possible solutions. Objectives relate
to both what must be done and what we would like to do, if only everything else
will fit into place

• generating solutions: in the classical "functional, problem analysis" cycle, solution
generation involves conceiving what might be done

We can typically develop options ranging from "do nothing" to "do everything". The
options (MAX/MIN, optimistic/pessimistic, high/low budget, etc.) are all models
to be tested against objectives and constraints.

• We must also recognise that there are tensions between

1. resource constraints and solutions, and
2. the imagining processes of solution development. These must then be
elaborated and grounded in detailed planning and operational
implementation.
• such implementation planning and management of the change/operational
programme involves scheduling, work allocation, capacity management,
communicating, development of information monitoring systems and overall
coordination and control of the solution programme.

Other techniques may also be brought into use by quality circle participants, e.g.

• process flow charts
• brainstorming
• cause and effect analysis
• reverse engineering
• value analysis
• pareto analysis

Team members will need training and support to apply these to the context and issue they
are experiencing.
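Of the techniques listed above, Pareto analysis is the most mechanical to illustrate: tally the causes of defects, sort them by frequency, and focus on the "vital few" causes that account for roughly 80 percent of the problems. A minimal sketch (the defect categories and counts below are invented):

```python
# Minimal Pareto analysis sketch for a quality circle.
# Defect categories and counts are hypothetical.
from collections import Counter

defects = Counter({
    "mislabelled parts":  57,
    "solder faults":      23,
    "scratched casings":  11,
    "missing screws":      6,
    "other":               3,
})

total = sum(defects.values())
cumulative = 0
vital_few = []
for cause, count in defects.most_common():  # causes in descending frequency
    cumulative += count
    vital_few.append(cause)
    if cumulative / total >= 0.8:           # stop once ~80% of defects are covered
        break

print(vital_few)  # the "vital few" causes the circle should tackle first
```

In this invented tally, just two of the five categories account for 80 percent of all defects, which is where the circle's limited time and budget are best spent.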

Management have to believe in the quality team process, listen to proposals and enable
feasible solutions to be progressed through pilot stages and into full operation.
Open-mindedness and a desire to avoid blocking are essential. It is a useful philosophy
to realise that experimentation enables learning.

Excellence

Excellence is the state or quality of excelling. It is superiority, or the state of being good
to a high degree. Excellence is considered to be a value by many organizations, in
particular by schools and other institutions of education[1], and a goal to be pursued.

However, according to Keathley[2], the pursuit of excellence is "not to be a quest for
superiority", and is not about "competition" or about "outstripping others", which is
"usually done for one's own glory or significance or for the praise or applause of men".
Instead, he quotes Harbour distinguishing between success and excellence, saying that
"Success means being the best. Excellence means being your best. Success, to many,
means being better than everyone else. Excellence means being better tomorrow than you
were yesterday. Success means exceeding the achievements of other people. Excellence
means matching your practice with your potential."

Keathley also asserts that the pursuit of excellence "should not be limited by the nature of
the task" (i.e. that one should pursue excellence no matter how humble or menial one
considers the task to be) and "works against a half-hearted, drift along or
go-with-the-flow kind of mentality".

Kamali[3] agrees, citing Muhammad in a hadith saying that "God loves it when a worker
undertakes a work and he does it to perfection." Kamali states that thus "Perfection is
manifest in artistic beauty and in human endeavour that seeks to achieve excellence. The
pursuit of excellence is highly recommended at all times and places, be it in the home, in
the neighbourhood or the mosque; indeed everywhere."

Restructuring
Restructuring is the corporate management term for the act of partially dismantling and
reorganizing a company for the purpose of making it more efficient and therefore more
profitable. It generally involves selling off portions of the company and making severe
staff reductions.

Restructuring is often done as part of a bankruptcy or of a takeover by another firm,
particularly a leveraged buyout by a private equity firm. It may also be done by a new
CEO hired specifically to make the difficult and controversial decisions required to save
or reposition the company.

Characteristics
The selling of portions of the company, such as a division that is no longer profitable or
which has distracted management from its core business, can greatly improve the
company's balance sheet. Staff reductions are often accomplished partly through the
selling or closing of unprofitable portions of the company and partly by consolidating or
outsourcing parts of the company that perform redundant functions (such as payroll,
human resources, and training) left over from old acquisitions that were never fully
integrated into the parent organization.

Other characteristics of restructuring can include:

• Changes in corporate management (usually with golden parachutes)
• Sale of underutilized assets, such as patents or brands
• Outsourcing of operations such as payroll and technical support to a more
efficient third party
• Moving of operations such as manufacturing to lower-cost locations
• Reorganization of functions such as sales, marketing, and distribution
• Renegotiation of labor contracts to reduce overhead
• Refinancing of corporate debt to reduce interest payments
• A major public relations campaign to reposition the company with consumers

Results
A company that has been restructured effectively will generally be leaner, more efficient,
better organized, and better focused on its core business. If the restructured company was
a leveraged acquisition, the parent company will likely resell it at a profit once the
restructuring has proven successful.

PORTFOLIO MANAGEMENT
How to Do It Right
Portfolio management is a tool with clear benefits, among them a holistic view of IT
projects across the enterprise and the alignment of IT with corporate strategy. But it isn't
easy. We've found some portfolio managers willing to share their secrets.

Ron Kifer, vice president of program management at DHL Americas, is a veteran of the typical project
and portfolio planning—or lack of planning—process in many companies. "The last three organizations
I've been in had the same scenario. They didn't have defined processes for reviewing project
proposals; projects were pretty much recommended by senior vice presidents in each business area,"
he says. "They were attempting to do many more projects than they had the capacity to do. Bad
projects squeezed out good projects. There was no visibility of what was being done throughout the
organization."

That's a recipe for disaster. At a time when CEOs are demanding that technology investments return
value, CIOs who don't have control over their IT project portfolios are fighting losing battles.
Surprisingly, that's a good number of you: A recent report by AMR Research contends that as many as
75 percent of IT organizations have little oversight over their project portfolios and employ
nonrepeatable, chaotic planning processes.

But if you're not doing it already, portfolio management can help you gain control of your IT
projects and deliver meaningful value to the business. Portfolio management takes a holistic view
of a company's overall IT strategy. Both IT and business leaders vet project proposals by matching
them with the company's strategic objectives. The IT portfolio is managed like a financial portfolio;
riskier strategic investments (high-growth stocks) are balanced with more conservative
investments (cash funds), and the mix is constantly monitored to assess which projects are on
track, which need help and which should be shut down.
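The financial-portfolio analogy can be made concrete with a small sketch that monitors the mix of risky versus conservative investments. The categories, project names, and costs below are invented for illustration:

```python
# Sketch of monitoring an IT portfolio's risk mix, by analogy with a
# financial portfolio. Categories, projects, and costs are hypothetical.
from collections import defaultdict

projects = [
    {"name": "mobile app pilot", "category": "strategic/high-risk", "cost": 250},
    {"name": "ERP upgrade",      "category": "core/conservative",   "cost": 550},
    {"name": "network refresh",  "category": "core/conservative",   "cost": 200},
]

def mix(projects):
    """Return each category's share of total portfolio spend."""
    totals = defaultdict(float)
    for p in projects:
        totals[p["category"]] += p["cost"]
    grand = sum(totals.values())
    return {cat: cost / grand for cat, cost in totals.items()}

shares = mix(projects)
print(shares)  # e.g. 25% high-risk strategic bets, 75% conservative core work
```

Recomputing this mix as projects are added, re-scoped, or shut down is what "constantly monitored" amounts to in practice: the steering committee decides what balance it wants and checks the actual shares against it.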

But it's all in the execution. Jeff Chasney, executive vice president of strategic planning and CIO at
CKE Restaurants, notes that "some companies do it poorly and some do it well." The companies
profiled in this story reveal their best practices for doing it well.

Why You Need Portfolio Management


Think about how IT investments are managed in your company; do any of the following scenarios
ring true? Million-dollar projects, which may or may not match the company's objectives, are
awarded to business units headed by the squeakiest executives; weak IT governance structures
mean that business executives don't have clear ideas of what they're approving and why; the CIO
ends up selling projects that should be generated and sold by line-of-business heads; the company
doesn't build good business cases for IT projects or it doesn't do them at all; and there are
redundant projects.

A strong portfolio management program can turn all that around and do the following:

Maximize value of IT investments while minimizing the risk

Improve communication and alignment between IS and business leaders

Encourage business leaders to think "team," not "me," and to take responsibility for projects

Allow planners to schedule resources more efficiently

Reduce the number of redundant projects and make it easier to kill projects

All that means more pennies in your piggy bank. Dennis S. Callahan, executive vice president and
CIO of Guardian Insurance, and Rick Omartian, CFO of Guardian's IT group and chief of staff, claim
that portfolio management has reduced their company's overall IT applications expenditure by 20
percent and that, within that spending reduction, maintenance costs have gone from 30 percent to
18 percent. Eric Austvold, a research director at AMR Research, says companies doing portfolio
management report saving 2 percent to 5 percent annually in their IT budgets.

There's no single right way to do IT portfolio management. Vendors, consulting companies and
academics offer many models, and often companies develop their own methodologies. Off-the-shelf
software is available from a variety of vendors (see "Tools of the Trade," right). But there are
plenty of hurdles to doing it well. There are, however, best practices and key logical steps that can
be gleaned from organizations such as Brigham Young University (BYU), DHL Americas and Eli Lilly,
which have integrated portfolio management into the fabric of IT management, as you'll see in this
story.

Here are the key steps in creating and managing your IT investment portfolio.

Gather: Do a Project Inventory


Portfolio management begins with gathering a detailed inventory of all the projects in your
company, ideally in a single database, including name, length, estimated cost, business objective,
ROI and business benefits. Merrill Lynch maintains a global database of all its IT projects using
software from Business Engine.

In addition to project plan information, Merrill Lynch's users—almost 8,000 from Asia, Europe,
India and the United States—add weekly updates on how much time they spend working on
projects. "We use that as our internal cost assignment tool back to the business, so that the
business is paying for every technology dollar monthly," says Marvin Balliet, CFO of global
technology and services.

When Kifer joined DHL Americas as vice president of program management in 2001, one of his first
tasks was getting control of project portfolio activities. He created an inventory, put that into a
master project schedule, gained an understanding of the resource requirements of all the projects,
then did a reconciliation of the projects and reduced the schedule to a manageable level.

Creating a project portfolio inventory can be painstaking but is well worth the effort. For many
companies, it may be their first holistic view of the entire IT portfolio and any redundancies. A good
inventory is the foundation for developing the projects that best meet strategic objectives.
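At bottom, a project inventory is just one structured record per project. A minimal sketch of such a record, using the fields the text lists (name, length, estimated cost, business objective, ROI, business benefits); the example projects and values are invented:

```python
# Minimal sketch of a project-inventory record. The fields mirror those the
# article lists; the example projects and figures are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ProjectRecord:
    name: str
    length_months: int
    estimated_cost: float
    business_objective: str
    estimated_roi: float                      # e.g. 0.22 for 22%
    business_benefits: list = field(default_factory=list)

inventory = [
    ProjectRecord("CRM rollout", 9, 1_200_000, "customer retention", 0.22,
                  ["single customer view", "faster onboarding"]),
    ProjectRecord("payroll consolidation", 4, 300_000, "cost reduction", 0.35,
                  ["one payroll vendor"]),
]

# A single inventory makes portfolio-wide questions trivial to answer, e.g.:
total_cost = sum(p.estimated_cost for p in inventory)
print(total_cost)
```

A real inventory would live in a shared database (Merrill Lynch's, per the text, is built on Business Engine software), but the value comes from the uniform record shape: once every project carries the same fields, redundancies and totals fall out of simple queries.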

Evaluate: Identify Projects That Match Strategic Objectives


The next steps involve establishing a portfolio process. The heads of business units, in conjunction
with the senior IT leaders in each of those units, compile a list of projects during the annual
planning cycle and support them with good business cases that show estimated costs, ROI,
business benefit and risk assessment. The leadership team vets those projects and sifts out the
ones with questionable business value. At Eli Lilly, a senior business ownership council comprising
the information officer and senior business leaders in each business unit takes on this role.

Next, a senior-level IT steering committee made up of business unit heads, IT leaders and perhaps
other senior executives meets to review the project proposals; a good governance structure is
central to making this work. "Portfolio management without governance is an empty concept," says
Howard A. Rubin, executive vice president at Meta Group. Conversely, putting portfolio
management in place can force companies with weak governance structures to improve them. (For
more on governance, read "The Powers That Should Be".)

One of the core criteria for which projects get funded is how closely a project meets a company's
strategic objectives for the upcoming year. At clinical diagnostics company Dade Behring, an
executive leadership team, which includes the CEO, creates five strategic initiatives, such as CRM
or organizational excellence. The IT governance council, made up of business leaders and senior IT
leaders, then evaluates projects based on how well they map against those initiatives. "We also try
to assess risk from a technology point of view, a change-management point of view, the number of
people that a project will impact and whether it will involve huge reengineering," says Dave
Edelstein, CIO and senior vice president of regulatory affairs, quality systems, and health, safety
and environment. Using methodology borrowed from the product development group (modified for
IS, but keeping terminology that business executives are familiar with), projects are placed "above
the line"—those that should be funded—or "below the line"—those that shouldn't.
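One simple way to express the kind of "above the line / below the line" split described here is to score each proposal by its fit against the strategic initiatives, penalised by a risk estimate, then fund down the ranked list until the budget runs out. This is only an illustrative sketch, not Dade Behring's actual methodology; the initiatives, weights, projects, and costs are all invented:

```python
# Illustrative "above the line / below the line" cut: rank proposals by
# strategic fit minus risk, then fund down the list to the budget line.
# Initiatives, weights, projects, risks, and costs are hypothetical.

INITIATIVE_WEIGHTS = {"CRM": 3.0, "organizational excellence": 2.0}

def score(project):
    """Crude score: sum of supported-initiative weights, minus risk."""
    fit = sum(INITIATIVE_WEIGHTS.get(i, 0.0) for i in project["supports"])
    return fit - project["risk"]

def split_at_budget(projects, budget):
    above, below, remaining = [], [], budget
    for p in sorted(projects, key=score, reverse=True):
        if p["cost"] <= remaining:
            above.append(p["name"])      # funded: above the line
            remaining -= p["cost"]
        else:
            below.append(p["name"])      # deferred: below the line
    return above, below

projects = [
    {"name": "customer portal",  "supports": ["CRM"],                       "risk": 1.0, "cost": 500},
    {"name": "process redesign", "supports": ["organizational excellence"], "risk": 0.5, "cost": 300},
    {"name": "data-center move", "supports": [],                            "risk": 2.0, "cost": 400},
]

above, below = split_at_budget(projects, budget=800)
print(above)
print(below)
```

The design point is that the line is drawn by the scoring model plus the budget, not by whichever sponsor argues loudest: a project that supports no stated initiative lands below the line regardless of who proposed it.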

At DHL Americas, a project portfolio review board evaluates the one-page project opportunity
assessment for every proposal. Membership on the board includes IS and 12 vice presidents from
across all areas of the business. "Those vice presidents are not the senior vice presidents—they're
the next level down, the lieutenants," Kifer says. "Portfolio management doesn't work at the senior
vice president level; they don't have time to commit to portfolio management."

A good evaluation process can help companies detect overlapping project proposals up front, cut
off projects with poor business cases earlier, and strengthen alignment between IS and business
execs.

Prioritize: Score and Categorize Your Projects


After evaluating projects, most companies will still have more than they can actually fund. The
beauty of portfolio management is that ultimately, the prioritization process will allow you to fund
the projects that most closely align with your company's strategic objectives.

Ernie Nielsen, managing director of enterprise project management at Brigham Young University, is
a frequent lecturer on portfolio management and a founding director of Stanford University's
Advanced Project Management Program. He instituted an extremely thorough prioritization and
scoring methodology at BYU.

Under his plan, projects are placed into portfolios—Nielsen thinks multiple portfolios are a good
idea in many companies because they allow like projects to be pooled together. In his case, the IT
department uses four:
large technology projects (more than $50K), small technology projects (less than $50K),
infrastructure technology projects, and one covering executive initiatives. Think of the first three as
peer portfolios; the executive one is a slightly different animal. The main job of the executive
portfolio management team (each portfolio has its own team) is to distribute funds appropriately to
the other three. (There are plenty of other ways to categorize initiatives; see "Powerful Portfolios,"
right.)

In the case of the large tech portfolio, its management team—made up of project sponsors,
function managers (for example, representatives from engineering, financial services and
operations, and Nielsen himself) and product portfolio managers (people with long-term project
leadership responsibilities in areas such as student services or data management)—vetted projects
and came up with a list of 150 for the portfolio team to score. (Nielsen uses Microsoft Project and
Pacific Edge's Project Office to plan and prioritize.)

They then prioritized them using a model that has four key tenets:

1. Identify four to seven strategies. BYU's Office of Information Technology does this yearly
(for example, limiting technology risk, increasing the reliability of the infrastructure).

2. Decide on one criterion per strategy. For example, the team decided the criterion for limiting
technology risk would be whether the technology had been implemented in a comparable
organization and the benefits could be translated to BYU easily.

3. Weigh the criteria.

4. Keep the scoring scale simple. BYU uses a scale of one to five. For the technology risk
strategy, five might mean that it has been used in a comparable organization and the benefits
could be transferred easily; three could mean it's hard to do because it would require changing
processes; one might mean they haven't seen it work anywhere else.

Following the scoring, the team drew a line based on how many projects it could do with existing
resources. In the case of the large technology portfolio, the line was calculated where demand (the
list of projects) met supply (resources—in this case, the cumulative dollar value of available
application engineers plus overhead); the line was a little less than halfway down the list. Those
projects above the line could be done in 2003. The team then presented that list to the president's
council, which approved it in an hour and a half, a process that used to take weeks, according to
Nielsen.
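Nielsen's scoring model and the "line" can be sketched in a few lines of Python. Everything here is invented for illustration: the project names, costs, criterion scores, weights, and the supply figure are hypothetical, not BYU's actual data.

```python
# Hypothetical sketch of the BYU-style prioritization: weighted criteria
# scoring (tenets 1-4) followed by an "above the line" cutoff where
# cumulative project cost meets the supply of resources.

# One criterion per strategy, each criterion weighted (tenets 1-3).
weights = {"limit_risk": 0.5, "reliability": 0.3, "service": 0.2}

projects = [
    # (name, cost, scores on a simple 1-5 scale per strategy -- tenet 4)
    ("Portal upgrade",  400_000, {"limit_risk": 5, "reliability": 3, "service": 4}),
    ("CRM rollout",     900_000, {"limit_risk": 2, "reliability": 2, "service": 5}),
    ("Backup overhaul", 250_000, {"limit_risk": 4, "reliability": 5, "service": 2}),
    ("Data warehouse",  700_000, {"limit_risk": 3, "reliability": 4, "service": 3}),
]

def weighted_score(scores):
    return sum(weights[k] * v for k, v in scores.items())

# Rank by weighted score, best first.
ranked = sorted(projects, key=lambda p: weighted_score(p[2]), reverse=True)

# Draw the line where demand (the ranked list) meets supply (available
# engineering dollars plus overhead).
supply = 1_200_000
funded, spent = [], 0
for name, cost, scores in ranked:
    if spent + cost > supply:
        break  # everything from here down is "below the line"
    funded.append(name)
    spent += cost

print("Above the line:", funded)
```

The same skeleton scales to 150 projects; only the data and the weights change.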

There is no one method to categorize your IT investment portfolio. One approach is to categorize it
as you would your own financial portfolio, balancing riskier, higher reward strategic investments
with safer categories, such as infrastructure. Meta Group's Rubin recommends a portfolio divided
into three investment categories: running (keeping the lights on), growing (supporting organic
growth) and transforming the business (finding new ways of doing business using technology).
Those categories can then be cross-tabulated with four to five value-focused categories, such as
how those investments support revenue growth, reduce costs or grow market share.

Since 1999, Eli Lilly has used Peter Weill's model to categorize its IT investments (see "Powerful
Portfolios" for a closer look at the model offered by Weill, director of the Sloan Center for
Information Systems Research and senior research scientist at MIT's Sloan School of
Management). Under the Weill model, companies view their IT portfolios on multiple levels and at
different stages, by visualizing their investments in aggregate and placing them in four categories,
with the percent of IT expenditures apportioned across each. "We tend to want to have 5 percent
[of our projects] in strategic areas, 15 percent to 20 percent in the informational category, and the
remaining percentage split between the infrastructure and transaction modules," says Sheldon Ort,
Lilly's information officer for business operations. He says that at the enterprise level, those
percentages have remained fairly consistent. That model allows Lilly to balance the risk and reward
of its IT investments. (The average percentage of annual IT spend of the 57 companies in Weill's
2002 survey breaks down as follows: infrastructure, 54 percent; transactional, 13 percent;
informational, 20 percent; strategic, 13 percent.)
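As a sketch of how such a categorized portfolio might be checked against targets, the snippet below compares an invented spend breakdown with percentage ranges echoing the Lilly figures quoted above; all dollar amounts are hypothetical.

```python
# Sketch: comparing a four-category IT portfolio (Weill-style) against
# target allocation ranges. Spend figures are invented; the target ranges
# echo the Lilly percentages quoted in the text.

spend = {
    "strategic":      600_000,
    "informational":  2_000_000,
    "infrastructure": 5_500_000,
    "transactional":  1_900_000,
}

total = sum(spend.values())
mix = {cat: round(100 * amount / total, 1) for cat, amount in spend.items()}

targets = {"strategic": (5, 5), "informational": (15, 20)}  # percent ranges

for cat, (lo, hi) in targets.items():
    flag = "on target" if lo <= mix[cat] <= hi else "off target"
    print(f"{cat}: {mix[cat]}% (target {lo}-{hi}%) -> {flag}")
```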

The payoffs that come from a thorough evaluation and prioritization process are the primary reason
portfolio management is so effective. First, communication between IS and business leaders
improves. And portfolio management gives business leaders a valuable, newfound skill—the ability
to understand how IT initiatives impact their companies.

Second, business leaders think "team," not "me," and take responsibility for projects. One tried-
and-true method for how a business leader got money for his unit's projects was to scream louder
than everyone else. Portfolio management throws that practice out the corner office window;
decisions are made based on the best interests of the company. At BYU, Nielsen observes that
after its portfolio process was implemented, "instead of vice presidents fighting for their own lists
of projects, they noticed projects below the line, not in their areas. They said to one another, 'I
could provide some funds for you to get [your project] above the line.'"

Third, portfolio management gives business leaders responsibility for IT projects. "I'm no longer in
a position where I have to sell these projects to the business," says Dade Behring's Edelstein. "If
I'm doing a project for marketing, it's the marketing exec who has to sell the project to the rest of
the team." Merrill Lynch's Balliet says, "When we started, the technology people were proposing
the projects. Now the businesspeople propose the projects and [take responsibility] for risk
profiling, ongoing operational costs and timeliness of delivery."

Finally, everybody knows where the dollars are flowing and why, which is especially important to
CEOs and CFOs who are increasingly demanding that technology investments deliver value and
support strategic objectives.

Review: Actively Manage Your Portfolio


A top-notch evaluation and prioritization process is undermined rather quickly if the portfolio is not
actively managed following approval of the project list. Doing that involves monitoring projects at
frequent intervals, at least quarterly. At Blue Cross and Blue Shield of Massachusetts, a project
management office, which reports directly to Senior Vice President and CIO Carl Ascenzo, has that
responsibility. Once or twice a month, the project management office gets financial and work
progress perspective updates from project leaders. That information goes into a database, and
Ascenzo reports to the entire company monthly, giving the project inventory and its status. He
assigns project status—green (good), yellow (caution) or red (help!)—and includes an explanation
of the key driver causing a yellow or red condition. The IT steering committee meets once a month
to make decisions to continue or stop initiatives, assess funding levels and resolve resource issues.
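A minimal sketch of the green/yellow/red roll-up described above, with invented project names and drivers:

```python
# Hypothetical monthly status roll-up: flag non-green projects, with the
# key driver, for the IT steering committee. All entries are invented.

inventory = [
    {"project": "Claims portal",  "status": "green",  "driver": None},
    {"project": "Data migration", "status": "yellow", "driver": "vendor delay"},
    {"project": "ERP upgrade",    "status": "red",    "driver": "budget overrun"},
]

# Anything not green goes to the committee with an explanation of the driver.
attention = [p for p in inventory if p["status"] != "green"]
for p in attention:
    print(f"{p['project']}: {p['status'].upper()} - {p['driver']}")
```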

At CKE Restaurants, the IT steering committee meets monthly to review at least three of
the initiatives under way. "In my opinion, quarterly is too long," says Chasney. CKE,
under the Carl's Jr., Hardee's and La Salsa Fresh Mexican Grill brand names, operates
approximately 3,300 restaurants worldwide. Frequent reviews allow Chasney to redirect resources
more quickly.

Monitoring project portfolios regularly also means projects that have run off the rails can be killed
more easily. "People have an aversion to stopping projects, but the majority of projects I cancel
are done because there's a change in company strategy—a change in priority or direction," says
Chasney. For example, if there's a strategy decision to focus on SAP, then it makes sense to cancel
a new system that interfaces with PeopleSoft, he says. Chasney states another simple but powerful
principle that eludes many companies: "You can't complete projects just because you started
them."

Hurdles to Portfolio Management


Yes, portfolio management is a good thing. But getting to nirvana requires a serious commitment
from both the business and IS sides, as well as a whole lot of sweat equity. Here are some of the
pitfalls and ways to overcome them.

Democracy ain't easy. Taking power away from business leaders accustomed to calling the shots
will not always go smoothly.

"Business leaders who didn't have decisions scrutinized previously now are [having] decisions
decided by group consensus," says DHL's Kifer. But Kifer says that quickly "people realize it does
work and that 12 people can make better decisions than one or two making unilateral decisions."

There's no single software that does everything. "There are really good budget packages,
resource management packages and fairly good portfolio management packages, but no package
that ties it all together," says Gordon Steele, CIO and vice president of IT at Nike, who is in the
process of implementing portfolio management. (See "Tools of the Trade," for a list of some
leading portfolio management vendors.) Steele is currently exploring a partnership with a portfolio
management vendor to see if such a software tool can be developed.

Do you need to buy portfolio software? There's no right answer. Some say it's a necessity. "It's a
better investment now to buy rather than build," says Meta Group's Rubin. Gopal Kapur, founder
and president of the Center for Project Management, begs to differ. "Far too often people get the
software and say they have portfolio management. But they don't—they don't have the foundation
for portfolio management," he says. Microsoft Excel and Project are commonly used by companies
to track and manage projects; some companies build their own tools.

Getting good information isn't easy. Take, for example, the transparency of your cost
structure. "You need good information around all technology costs and investments," says Merrill
Lynch's Balliet. In 1999 and 2000, he and his team looked hard at all the IT dollars and categorized
them into service "buckets," then put them in chargeback buckets related to those activities. For
example, Balliet says that they created a phone monitoring tool and told some units, "You pay for
the calls you make."

In addition, you must update the database regularly. "You need to have the constant status of each
project so you can react quickly to market changes," says Balliet.

It's still hard to make tough decisions on whether to undertake—or cancel—projects.


Kifer, no slouch at portfolio management, says DHL Americas currently has 20 percent more
projects in its portfolio than it can support. "We won't probably start half of those," he says. "[But]
an organization has a tendency to say, You'll figure out a way to make those work."

It's an additional time constraint on busy executives. Good portfolio management means
good IT governance means regular IT governance committee meetings. "Just about every company
today has its people stretched," says Chasney. As noted earlier in the story, that concern is
addressed at DHL Americas, where the lieutenants of time-constrained senior vice presidents serve
on the project portfolio review board.

In the grand scheme, however, the challenges of implementing portfolio management pale in
comparison to the value it brings to your IT investments. "It forces IT and businesspeople to talk
about investments from a business perspective," says Weill. "That's its most powerful feature."

Management By Walking Around
Actually a valuable and useful practice. The concept is that in most interpersonal work
situations, most of the time you only see what the other person is presenting. Since they
know your expectations, they will manage their presentation so as to satisfy you. On the
other hand, when you walk around you get to witness what people are actually doing.
Another way to accomplish this is to seat the managers in the same cube farm as the
workers, right smack dab in the middle.
Not a bad idea most of the time, but a right pain when either you want to have "a word"
with one of the team, or they want a word with you. Wandering off to an office/meeting
room is a bit obvious, and seems to make the conversation a bit more "official" than is
sometimes appropriate.

Definitely better than Management By Phone, Management By Email or Management By Memo.
However, it does take more than just walking around and watching. When a manager
sticks a head into a developer's cubicle and asks "How is it going?", the conversation
tends to not be very valuable. And if it is done too often, developers may try to look busy
all the time, fearful that if they are caught sitting there and thinking, they'll get into
trouble. Many developers want to be judged solely on the results of their work, and don't
like it when someone watches how the work gets done.

Twelve Guidelines for Managing By Walking Around (MBWA)
Do it to everyone.

You may remain in such close contact with your direct reports that MBWA is
redundant with them. The real power of the technique lies in the time you spend with
those in lower levels of your area of responsibility. Get around to see those who work
for your direct reports and any others whose work is important to you.

Do it as often as you can.

MBWA sends positive messages to employees. It reveals your interest in them and in
their work, and it says you don’t consider yourself "too good" to spend time with
them. MBWA also enables you to stay in touch with what is going on in your
department, section or unit. Put aside at least thirty minutes a week to spend with
all employees. Aim for once a quarter to see those you must travel long distances to
visit.

Go by yourself.

MBWA is more meaningful when you visit with employees alone, and one-on-one. It
encourages more honest dialogue and speaks loudly of your personal commitment to
the idea.

Don’t circumvent subordinate managers.

Some employees may take advantage of your presence to complain about a
supervisor who is your subordinate. Counsel them to discuss the issue fully with their
supervisor first. If you have cause to question the supervisor’s judgement, don’t
indicate so to the employee, but follow up privately with the supervisor.

Ask questions.

MBWA is a great opportunity to observe those "moments of truth" when your
employees interact with your clients. Ask them to tell you a little bit about the files,
projects or duties they are working on. Take care to sound inquisitive rather than
intrusive.

Watch and listen.

Take in everything. Listen to the words and tone of employees as they speak to you
and to each other. You’ll learn a lot about their motivation and their levels of
satisfaction. In the words of Yogi Berra, "You can observe a lot just by watching."

Share your dreams with them.

As a Yukon Dog Team handler used to say, "The view only changes for the lead dog."
MBWA is a solid opportunity to make sure that when you lead the sled in a new
direction, the employees behind you won’t trip over themselves trying to follow. Tell
them about the organization’s vision for the future, and where your vision for the
department / unit/ section fits in with the "big picture." Reveal the goals and
objectives that you want them to help you fulfill together as a team. Ask them for
their vision, and hold an open discussion.

Try out their work.

Plop down in front of the computer; get behind the wheel; pick up the telephone;
review a project file. Experience what they endure. Sample their job just enough to
show your interest in it, and to understand how it goes. Think of great ways to
reconnect with your front line workers, and gain a current understanding of exactly
what they are dealing with during a typical work day.

Bring good news.

Walk around armed with information about recent successes or positive initiatives.
Give them the good news. Increase their confidence and brighten their outlook. So
often employees are fed only gloom and doom. Neutralize pessimism with your own
optimism, without being non-credible.

Have fun.

This is a chance to lighten up, joke around, and show your softer side without being
disrespectful or clowning around. Show employees that work should be fun and that
you enjoy it too.

Look for victories rather than failures.

When you find one, applaud it. When you run
into one of the many unsung heroes in your job site, thank them on the spot, being
careful not to embarrass them in front of peers or to leave out other deserving
employees.

Don’t be critical.

When you witness a performance gone wrong, don’t criticize the performer. Correct
on the spot anything that must be redone, but wait to speak to the wrongdoer’s
supervisor to bring about corrective action.

Matrix management
Matrix management is a type of management used by some organizations.

Under matrix management, all people who do one type of work are in a pool. For
example, all engineers may be in one engineering department and report to an
engineering manager. These same engineers may be assigned to different projects and
report to a project manager while working on that project. Therefore, each engineer may
have to work under several managers to get his or her job done.
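The dual reporting lines described above can be modeled minimally; everything in this sketch (the names, titles, and project assignments) is invented.

```python
# Minimal model of matrix reporting: each engineer keeps one permanent
# functional manager (the pool) and picks up temporary project managers
# as assignments change. All names are invented.

from dataclasses import dataclass, field

@dataclass
class Engineer:
    name: str
    functional_manager: str                               # permanent "solid line"
    project_managers: list = field(default_factory=list)  # temporary "dotted lines"

alice = Engineer("Alice", functional_manager="Engineering Manager")
alice.project_managers += ["PM, Billing Project", "PM, Mobile App"]

# Alice answers to several managers at once -- the defining matrix trait.
print(alice.functional_manager, alice.project_managers)
```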

Proponents suggest that there are two advantages to matrix management. First, it allows
team members to share information more readily across task boundaries. Second, it
allows for specialization that can increase depth of knowledge and allow professional
development and career progression to be managed.

The disadvantage of matrix management is that employees can become confused due to
conflicting loyalties. A properly managed cooperative environment, however, can
neutralize these disadvantages. In order for the system to work, all parties must be willing
to talk to each other to learn what their different objectives and goals are.

Matrix management can put some difficulty on the project managers because they must
work closely with other managers and workers in order to complete the project. The
functional managers may have different goals, objectives, and priorities than the project
managers, and these would have to be addressed in order to get the job done.

One advantage of matrix management is that it is easier for a manager to loan an
employee to another manager without making the change permanent. It is therefore easier
to accomplish work objectives in an environment where task loads are shifting rapidly
between departments.

The matrix
Most organizations fall somewhere between the fully functional and fully projectized
organizational structure. These are matrix organizations. Three points along the
organizational continuum have been defined. They are:

Weak/Functional Matrix – A project manager (often called a project administrator
under this type of organization) with only limited authority is assigned to oversee the
cross-functional aspects of the project. The functional managers maintain control over
their resources and project areas.

This matrix still retains most of the problems associated with a functional organization.
The project administrator’s role is to attempt to alleviate communication issues between
functional managers and track overall project progress.

Balanced Functional Matrix – A project manager is assigned to oversee the project.
Power is shared equally between the project manager and the functional managers.

Proponents of this structure believe it strikes the correct balance, bringing forth the best
aspects of functional and projectized organizations. However, this is the most difficult and
complex system to maintain, as the sharing of power is a very delicate proposition.

Strong/Project Matrix – A project manager is primarily responsible for the project.
Functional managers provide technical expertise and assign resources on an as-needed
basis.

Because project resources are assigned as necessary there can be conflicts between the
project manager and the functional manager over resource assignment. The functional
manager has to staff multiple projects with the same experts.

Visual representation
Representing matrix organizations visually has challenged managers ever since the
matrix management structure was invented. Most organizations use dotted lines to
represent secondary relationships between people, and software packages, such as Visio
and PowerPoint support this approach. Until recently, ERP and HRMS systems did not
support matrix reporting. Late releases of SAP support matrix reporting. Oracle
eBusiness Suite can also be customized to store matrix information. Most dedicated
org-chart software vendors do not support matrix reporting; the one exception is
HumanConcepts.

Kanban Systems
This article describes the 6 types of Kanban system available and what you need to do to
choose, design, implement, and operate Kanban systems, size buffer stocks (the number
of Kanbans), choose containers and signalling mechanisms. It shows the need to integrate
the system with your planning systems. It includes the impact on people, automated
materials handling systems and some important do's and don'ts. This type of system is
sometimes called a "pull" system.


Types of Kanban Systems


You may previously have thought that there were only one or two types of
Kanban system! In fact there are six, and here they are:

One card systems

In a one-card system:

A signal is sent back from the consuming work centre to the supplying work centre (or
supplier). This is a signal:

a. To send some more (a transfer batch), via a buffer stock.

b. To produce some more (a process batch), at the supplying work centre.

NB. Empty containers acting as a signal are a potential hazard as any empty container is
a signal to fill it. Also occasionally containers have been known to go missing! Usually,
for these reasons, the signal is separated from the container.
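The one-card loop can be sketched as a toy simulation; the card count and container quantity below are invented, and the signal is modeled as a separate token rather than the (hazard-prone) empty container itself.

```python
# Toy one-card kanban loop: emptying a container releases a card, and the
# supplying work centre refills the buffer only while it holds a card, so
# work-in-process can never exceed the number of kanbans in circulation.

from collections import deque

NUM_KANBANS = 3      # cards (and containers) circulating in the loop
CONTAINER_QTY = 10   # parts per container

buffer = deque([CONTAINER_QTY] * NUM_KANBANS)  # full containers in the buffer stock
free_cards = 0                                 # "produce some more" signals

consumed = 0
for _ in range(5):              # the consuming work centre empties five containers
    if buffer:
        consumed += buffer.popleft()
        free_cards += 1         # the signal travels back to the supplier
    if free_cards:              # supplier fills one container per cycle
        buffer.append(CONTAINER_QTY)
        free_cards -= 1
    assert len(buffer) + free_cards == NUM_KANBANS  # fixed-stock invariant

print("Parts consumed:", consumed)
```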

Input / Output Control Kanban

This imposes input / output control: the signal travels directly from the end of a
line or section to the preceding section or raw material stores. In this case the supply
chain is treated as one unit rather than a series of linked operations. So, as one transfer
batch is completed (output) another is launched on the first operation (input), thus
ensuring that work in process cannot build up. However there are some special
considerations required in the operation of the system, but we have used adaptations of
this system to manage workflow in a number of environments including job shops.

Kanban Accumulator

In this method Kanban signals are allowed to accumulate at the supplying work centre
until the production batch size is reached.

In this case buffers can be depleted or exhausted depending on the accumulation rules.
Also because buffers can be exhausted, slightly higher mixes can be accommodated.
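The accumulation rule can be sketched as follows; the batch size and the number of incoming signals are invented for illustration.

```python
# Sketch of a kanban accumulator: replenishment signals pile up at the
# supplying work centre, and a production batch is launched only once the
# accumulated signals reach the production batch size. Figures invented.

BATCH_SIZE = 4            # kanban signals needed to trigger one batch
signals, batches = 0, 0

for _ in range(10):       # ten kanban signals arrive over time
    signals += 1
    if signals >= BATCH_SIZE:
        batches += 1           # launch a production batch...
        signals -= BATCH_SIZE  # ...and consume the accumulated signals

print(batches, "batches launched,", signals, "signals still accumulating")
```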

Dual Card System (2 Card System)

This method, first used by Toyota, separates the replenishment (send some) signal, which
is produced from the Kanban system, from the "produce" signal, which is produced by a
scheduling system such as MRP.

• MRP says which job is next.
• The Kanban says make it now.

This method can deal with higher mixes. It can also deal with larger batch sizes, caused
by long changeovers, where scheduling is necessary, although you should be trying to
reduce batch sizes (See Previous Technique of the Week: T019 Avoiding set ups and
Reducing Changeover Times). In this case the buffer is depleted, and can be exhausted.
In addition a planning system such as MRP1 (See "Levels of Planning and Control"
below) is also necessary to schedule the work.

Variable Quantity (fixed frequency) System

In some situations it is more convenient to replenish items used by fixed-frequency
deliveries (or collections), rather than respond to fixed-quantity replenishment requests.
This method forms the basis of supplier "top up at point of use" systems, where a supplier
visiting your point of use will top up stocks to a predefined maximum level. We have
also used this method as the mechanism to drive "replacement systems" for maintaining
stocks of critical spares items or maintaining "van stock" for on-the-road service
engineers. (See Previous Techniques of the Week 015: Replacement Systems).

POLCA System ("Quick Response Manufacturing" Rajan Suri)

This is mentioned for completeness only and is said to be prescribed for high-mix,
variable-route situations. However at this point, in our opinion, it is worth considering
other simplification techniques (see "Organisational Redesign"), or as a last resort, the
use of scheduling tools (See "Advanced Planning & Scheduling").

Attributes of Kanban Systems


Some champions of Kanban systems suggest that the system is universally
applicable and has no disadvantages. This is not true! There are circumstances
where a Kanban system can be positively harmful, and if it is not designed and
managed correctly, the results can be disastrous. We have rescued several, and in
two cases the system was responsible for the accumulation of serious customer
backlogs! The system has advantages and disadvantages, and some of these are:

Advantages

• Low fixed stock (number of Kanbans in system)
• Low lead-time
• Quality problems visible
• Highly stable

Disadvantages

• Inflexible (transfer batch fixed, except with "Variable Quantity Systems" above)
• Can cause stoppages (often viewed as an opportunity to solve a problem)
• Highly stable! (But you may need to change it, or it may be an unstable
environment)

Where appropriate

The technique can be applied to any pair of resources, or pairs in a series of resources
(including clerical operations), where one feeds the other. It is important to choose
suitable pairs. However you also need to be careful to select the appropriate Kanban
system for your situation. Some systems are more appropriate to particular situations. In
particular the mix, variability, and numbers of resources in the supply chain network (e.g.
one to many, many to many, many to one) are key. Also there needs to be a method of
handling small orders or prototypes (not difficult if thought about at the start). (See
Previous Readers Question Q019: When is Kanban not Appropriate? Do I need extra
equipment?). There are also some prerequisites which you need to consider such as
having a planning process which is integrated with the Kanban system. (See
"Participative Sales and Operations Planning"). If this is not done the system will
eventually fail!

Note: Just because your end product or service is not suitable, it is possible that some
aspect or segment of your business may be suitable. It is quite possible and sensible to
segment control systems to suit the needs of different parts of a business. The skill is in
selecting suitable segmentation strategies. But we have seen a number of examples of the
"one size fits all" philosophy being positively damaging!

Just-in-Time/Kanban
Introduction
Just-in-time production, or JIT, and cellular manufacturing are closely related, as a cellular
production layout is typically a prerequisite for achieving just-in-time production. JIT leverages the
cellular manufacturing layout to significantly reduce inventory and work-in-process (WIP). JIT
enables a company to produce the products its customers want, when they want them, in the
amount they want.

Under conventional mass production approaches, large quantities of identical products are
produced, and then stored until ordered by a customer. JIT techniques work to level production,
spreading production evenly over time to foster a smooth flow between processes. Varying the
mix of products produced on a single line, sometimes referred to as "shish-kebab production",
provides an effective means for producing the desired production mix in a smooth manner.

JIT frequently relies on the use of physical inventory control cues (or kanban) to signal the need
to move raw materials or produce new components from the previous process. In some cases, a
limited number of reusable containers are used as kanban, assuring that only what is needed
gets produced. Many companies implementing lean production systems are also requiring
suppliers to deliver components using JIT. The company signals its suppliers, using computers or
delivery of empty, reusable containers, to supply more of a particular component when they are
needed. The end result is typically a significant reduction in waste associated with unnecessary
inventory, WIP, and overproduction.

Method and Implementation Approach


Key elements of JIT, and techniques for achieving JIT, are discussed below.

Load leveling. This technique involves determining appropriate quantities and types of products
needed in a given day to meet customer orders. This technique allows organizations to produce
products with a variety of customer specifications each day (using a daily schedule), in a smooth
sequence that minimizes inventory and delay. Takt time is critical to the daily scheduling required
in leveled production described above. It is the rate at which each product must be completed to
meet customer needs, expressed in amount of time per part.

Production Sequencing. This involves calculating the pattern for making each product type in the
required amount for any given day, by calculating the takt time for the daily quantity of each type.

Kanban. Often referred to as the "nervous system" of lean production, kanban is a key technique
that determines a process's production quantities and, in doing so, facilitates JIT production and
ordering systems. Contrary to more traditional "push" methods of mass production, which are
based on an estimated number of expected sales, kanban's "pull" system creates greater
flexibility on the production floor, such that the organization only produces what is ordered.

More specifically, a kanban is a card, labeled container, computer order, or other device used to
signal that more products or parts are needed from the previous process step. The kanban
contains information on the exact product or component specifications that are needed for the
subsequent process step. Kanban are used to control work-in-progress (WIP), production, and
inventory flow.

In this way, kanban serves to ultimately eliminate overproduction, a key form of manufacturing
waste. Different types of kanban include: supplier kanban (indicate orders given to outside parts
suppliers when parts are needed for assembly lines); in-factory kanban (used between processes
in a factory); and production kanban (indicate operating instructions for processes within a line).

Kanban are a critical part of a JIT system. In implementing a kanban system, organizations
typically focus on four important "rules".

• Kanban works from downstream to upstream in the production process (i.e., starting with
the customer order). At each step, only as many parts are withdrawn as the kanban
instructs, helping ensure that only what is ordered is made. The necessary parts in a
given step always accompany the kanban to ensure visual control.
• The upstream processes only produce what has been withdrawn. This includes only
producing items in the sequence in which the kanban are received, and only producing
the number indicated on the kanban.
• Only products that are 100 percent defect-free continue on through the production line. In
this way, each step uncovers and then corrects the defects that are found, before any
more can be produced.
• The number of kanban should be decreased over time. Minimizing the total number of
kanban is the best way to uncover areas of needed improvement. By constantly reducing
the total number of kanban, continuous improvement is facilitated by concurrently
reducing the overall level of stock in production.
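The four rules above can be illustrated with a minimal pull-system sketch. The class name, card counts, and lot sizes below are hypothetical assumptions, used only to show how downstream withdrawals cap upstream production:

```python
from collections import deque

# Minimal sketch of a kanban "pull" loop: a downstream withdrawal frees
# kanban cards, and the upstream process produces only against freed cards,
# so stock never grows beyond the number of circulating kanban.

class Workstation:
    def __init__(self, name: str, num_kanban: int, lot_size: int):
        self.name = name
        self.lot_size = lot_size
        self.stock = num_kanban * lot_size   # start with all lots full
        self.free_cards = deque()            # cards detached by withdrawals

    def withdraw(self, quantity: int) -> int:
        """Downstream pulls parts; each emptied lot frees one kanban card."""
        taken = min(quantity, self.stock)
        self.stock -= taken
        for _ in range(taken // self.lot_size):
            self.free_cards.append("kanban")
        return taken

    def produce(self) -> None:
        """Replenish only against freed cards, in the order received."""
        while self.free_cards:
            self.free_cards.popleft()
            self.stock += self.lot_size

station = Workstation("assembly", num_kanban=3, lot_size=10)
station.withdraw(20)      # a customer order pulls 20 parts
print(station.stock)      # 10 remain before replenishment
station.produce()         # upstream replaces only what was withdrawn
print(station.stock)      # back to 30, never more
```

Because production occurs only against freed cards, stock can never exceed the number of kanban times the lot size; reducing the number of kanban over time (the fourth rule) directly lowers the work-in-progress ceiling.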

Implications for Environmental Performance


Potential Benefits:
JIT/kanban systems help eliminate overproduction. Overproduction affects the
environment in three key ways:

1. increases the number of products that must be scrapped or discarded as waste;
2. increases the amount of raw materials used in production;
3. increases the amount of energy, emissions, and wastes (solid and hazardous)
that are generated by the processing of the unneeded output.

JIT/kanban systems reduce the amount of necessary in-process and post-process
inventory, thereby reducing the potential for products to be damaged during handling and
storage, or through deterioration or spoilage over time. Such damaged inventory typically
ends up being disposed of as solid or hazardous waste. Frequent inventory turns can
also eliminate the need for degreasing processes for metal parts, since the parts may not
need to be coated with oils to prevent oxidization or rust while waiting for the next
process step.
JIT typically requires less floor space for equal levels of production ("this is a factory, not a
warehouse"). Reductions in square footage can reduce energy use for heating, air
conditioning, and lighting. Reduced square footage can also reduce the resource
consumption and waste associated with maintaining the unneeded space (e.g.,
fluorescent bulbs, cleaning supplies). Even more significantly, reducing the spatial
footprint of production can reduce the need to construct additional production facilities, as
well as the associated environmental impacts resulting from construction material use,
land use, and construction wastes.
JIT/kanban systems also help facilitate worker-led process improvements, as workers
are more motivated to make product improvements when there is no excess inventory
remaining to be sold.
Finally, excess inventory results in increased energy use associated with the need to
transport and reorganize unsold inventory; JIT/kanban systems reduce this burden by
minimizing such inventory.
Potential Shortcomings:
JIT can result in more frequent "milk runs" for parts and material inputs from sister
facilities or suppliers, leading to an increased number of transport trips. This can
contribute to traffic congestion, as well as environmental impacts associated with
additional fuel use and vehicle emissions. Through efficient load planning, however, the
environmental implications of increased milk runs can be significantly reduced or
eliminated.
JIT/kanban may not succeed at reducing or eliminating overproduction and associated
waste if the products produced have large and/or unpredictable market fluctuations.
JIT, when not implemented throughout the supply chain, can just push inventory carrying
activities up the supply chain, along with the associated environmental impacts from
overproduction, damaged goods, inventory storage space heating and lighting, etc.

What is Intrapreneuring?
Though the term intrapreneuring has been in use for a while, it may not really be in
practice as much as it should be. Put simply, intrapreneuring is the process of
allowing--and encouraging--your employees to initiate and oversee new ideas or
improvements within the framework of your organization. It means developing the
kind of corporate culture that will allow employees to find all the opportunities for
innovation and ownership they're craving--without their having to leave your
organization to do it.

Does this mean that you should stop the presses right now and let every employee do
his or her own thing? That would be taking the concept too far. Not all employees
are feeling that entrepreneurial itch, and not all employees are even ready to be
given the kind of responsibility we're talking about here. However--and this is
important--some employees are not only ready for these kinds of opportunities, but
they'll be ready to leave if they don't get them. You don't want to lose your best and
brightest employees. And, by practicing intrapreneuring, you won't have to.

We've all heard the lament in exit interviews or resignation announcements that a
given employee liked his or her job, but was leaving to "branch out" or was leaving
"in search of something more," or was leaving "to start his or her own business." A
large measure of this desire is that the employee is looking to gain responsibility for
his or her work. Today's employees are looking for more than just a desk and a
paycheck; they're looking for the opportunity to create, establish, and nurture an
identity--a corporate identity, a work-related identity--of their own. They want
something to be theirs.

Employers are slowly beginning to recognize these drives, and some are finding
innovative ways to meet these needs, so that their best employees will have reasons
to stay. They realize that they must provide challenges for their rising stars. They
must provide opportunities for autonomy and for leadership. They must find a way,
within the framework of the existing company, to allow their star employees some
room to create, to maneuver, and, therefore, to thrive. You can do this in your
company, too.

The key in any intrapreneuring concept is making sure that the new idea, or the
revision of an old idea, fits within the framework of your existing company. Giving
employees autonomy and ownership is fine, but you need to make sure that your
new division, arm, or even corner of the room fits in well with the direction in which
the company is already headed. You're looking for complementary concepts here.

The New Intrapreneurial

The entrepreneurial spirit can be an important competitive advantage, especially if harnessed in large
organizations, which contain some of our best people and resources. But to accomplish this, large
corporations must grant employees the kinds of freedoms entrepreneurs now enjoy. In the right
environment, freedom and cooperation will generate productivity.
Innovation involves turning new ideas (inventions) into business successes. Large corporations are
good at creating ideas but poor at developing them commercially. Dissatisfied would-be intrapreneurs
are then driven to venture capitalists to become entrepreneurs. To keep their creative people and
implement new ideas, firms must first understand the complicated process by which innovation really

works.
Both top management and potential innovators must also recognize certain truths about
intrapreneurs. They circumvent or ignore orders, follow intuition, go beyond formal job descriptions,
share responsibility with a team, and pursue goals passionately but realistically.

Who Is the Intrapreneur?

Intrapreneurs bridge the gap between inventors and managers; they take new ideas and turn them
into profitable realities. Without them there is an innovation gap. They have vision and the courage to
realize it. They can imagine what business and organizational realities will follow from the way
customers respond to their innovations; they can plot the necessary steps from idea to actualization.

To make things work, intrapreneurs cross organizational boundaries to do other people's jobs. They
have a need to act, and they don't wait for permission to begin. Their dedication frequently shuts out
other concerns, including family life. They pursue only goals that they set themselves and that have
personal meaning. Successful intrapreneurs learn to overcome mistakes and to manage risk. The
typical intrapreneurial personality lies somewhere between that of the traditional manager and that of
the traditional entrepreneur.

Intrapreneuring in Action

What the firm IHS Helpdesk Services has done with its new phone-support arm is an
excellent example of what we're talking about here. IHS is a company that provides
help desk staffers, on-site, in client corporations and organizations. IHS used to only
provide on-site service. However, a young employee, a person already identified by
management as someone with great potential, approached management with a simple
idea: why not provide 24/7 phone support as well? That way, IHS could be the "first
line of defense," so to speak: people looking for help at their client companies would
call an 800 number and speak with a trained IHS staff member first, to see if the
problem could be resolved. The IHS staffer could then determine whether the
designated person on call for the client needed to be disturbed. Basically, the new
division of IHS Helpdesk Services would be hired by clients to be their on-call people.
The idea took off, business is booming, and the young pro stayed on with IHS to run
the new division.

The IHS example (IHS, by the way, was itself a spinoff of this type, from Leverage
Technologies in New York) highlights a good marriage of new ideas and existing
company services. Look around your own company to begin to identify needs and
opportunities for your rising stars. Intrapreneuring can help you keep your best and
brightest employees, and they, of course, can help you maintain and improve your
position in the marketplace.

Intrapreneuring at Your Company

Here are a few quick ways you can get started with intrapreneuring initiatives in
your own company:

• Identify your best and brightest employees. Not only are these folks your
greatest assets, but they're also likely to be your greatest risks to leave. Start
looking for ways to keep your most important people from day one.
• Identify the strengths and weaknesses of your star employees. Look for ways
to play to their strengths and improve their weaknesses. Discuss their goals
and dreams with them, so that they know from the start that they won't
necessarily have to leave to find what they're looking for. (You should, by the
way, be having these sorts of conversations with all of your employees on a
regular basis, regardless of intrapreneuring.) See if you can find ways to help
them achieve their goals in alignment with the goals and objectives of the
company.
• Ask your at-risk employees--those most likely to leave--to come up with ideas
that might allow them to stay. Give them the chance to sell you on their
plans.
• Support these ideas in any ways you can. Find ways to make employees even
more excited about their own ideas, and they'll want to stay.

It is important to keep in mind that if, for some reason, you can't follow through in
supporting your employees and their ideas, you need to offer that feedback
immediately, with an explanation of why you couldn't support the idea. You should
also know that you will then need to provide some other sort of opportunity in order
for the employee to stay.

Intrapreneuring is really a win-win idea--your very best employees become more
energized and motivated, and want to stay, and, if and when they're successful, your
bottom line improves. The key to workforce stability is finding ways for your
employees to feel happy, successful and important...and intrapreneuring can do just
that.

Organizational culture
Organizational culture, or corporate culture, comprises the attitudes, experiences, beliefs and
values of an organization.

It has been defined as "the specific collection of values and norms that are shared by people and
groups in an organization and that control the way they interact with each other and with
stakeholders outside the organization. Organizational values are beliefs and ideas about what
kinds of goals members of an organization should pursue and ideas about the appropriate kinds or
standards of behavior organizational members should use to achieve these goals. From
organizational values develop organizational norms, guidelines or expectations that prescribe
appropriate kinds of behavior by employees in particular situations and control the behavior of
organizational members towards one another." (Hill & Jones, 2001)

Senior management may try to determine a corporate culture. They may wish to impose corporate
values and standards of behavior that specifically reflect the objectives of the organization. In
addition, there will also be an extant internal culture within the workforce.

Work-groups within the organization have their own behavioral quirks and interactions which, to
an extent, affect the whole system. Task culture can be imported. For example, computer
technicians will have expertise, language and behaviors gained independently of the organization,
but their presence can influence the culture of the organization as a whole.

Strong/Weak cultures

Strong culture is said to exist where staff respond to stimulus because of their alignment to
organizational values.

Conversely, there is weak culture where there is little alignment with organizational values and
control must be exercised through extensive procedures and bureaucracy.

Where culture is strong—people do things because they believe it is the right thing to do—there
is a risk of another phenomenon, Groupthink. "Groupthink" was described by Irving L. Janis. He
defined it as "...a quick and easy way to refer to a mode of thinking that people engage in when they
are deeply involved in a cohesive ingroup, when members' strivings for unanimity override their
motivation to realistically appraise alternatives of action." This is a state where people, even if
they have different ideas, do not challenge organizational thinking, and therefore there is a
reduced capacity for innovative thoughts. This could occur, for example, where there is heavy
reliance on a central charismatic figure in the organization, or where there is an evangelical belief
in the organization’s values, or also in groups where a friendly climate is at the base of their
identity (avoidance of conflict). In fact, groupthink is very common; it happens all the time, in
almost every group. Members who are defiant are often turned down or seen as a negative
influence by the rest of the group, because they bring conflict (conflicting ideas) and disturb the
central culture. In cultural studies, culture is seen as ethnocentric (Barone, J.T, Switzer, J.Y), or
culturocentric, meaning that we tend to think that our culture/subculture is the best. The stronger
the culture, the greater the risks of groupthink.

By contrast, bureaucratic organizations may miss opportunities for innovation, through reliance
on established procedures.

Innovative organizations need individuals who are prepared to challenge the status quo, be it
groupthink or bureaucracy, and also need procedures to implement new ideas effectively.

Classifying organizational culture

Several methods have been used to classify organizational culture. Some are described below.

Geert Hofstede demonstrated that there are national and regional cultural groupings that affect the
behavior of organizations.

Hofstede identified five dimensions of culture in his study of national influences. (Note that these
are Hofstede's five dimensions of national cultures, which are distinct from his six dimensions of
organizational cultures; see Hofstede, G., Culture's Consequences, 2001.)

• Power distance - The degree to which a society expects there to be differences in the
levels of power. A high score suggests that there is an expectation that some individuals
wield larger amounts of power than others. A low score reflects the view that all people
should have equal rights.
• Uncertainty avoidance reflects the extent to which a society accepts uncertainty and risk.
• Individualism vs. collectivism - refers to the extent to which people are expected to
stand up for themselves, or alternatively to act predominantly as a member of the group
or organization.
• Masculinity vs. femininity - refers to the value placed on traditionally male or female
values. Male values for example include competitiveness, assertiveness, ambition, and
the accumulation of wealth and material possessions.
• Long vs. short term orientation - describes a society's "time horizon," or the importance
attached to the future versus the past and present. In long term oriented societies, thrift
and perseverance are valued more; in short term oriented societies, respect for tradition
and reciprocation of gifts and favors are valued more. Eastern nations tend to score
especially high here, with Western nations scoring low and the less developed nations
very low; China scored highest and Pakistan lowest.

Deal and Kennedy

Deal and Kennedy[2] defined organizational culture as "the way things get done around here." They
measured organizations in respect of:

• Feedback - quick feedback means an instant response. This could be in monetary terms,
but could also be seen in other ways, such as the impact of a great save in a soccer match.
• Risk - represents the degree of uncertainty in the organization’s activities.

Using these parameters, they were able to suggest four classifications of organizational culture:

The Tough-Guy Macho Culture. Feedback is quick and the rewards are high. This often applies to
fast moving financial activities such as brokerage, but could also apply to a police force, or
athletes competing in team sports. This can be a very stressful culture in which to operate.

The Work Hard/Play Hard Culture is characterized by few risks being taken, all with rapid
feedback. This is typical in large organizations, which strive for high quality customer service. It
is often characterized by team meetings, jargon and buzzwords.

The Bet-Your-Company Culture. Big-stakes decisions are taken, but it may be years before
the results are known. Typically, these might involve development or exploration projects, which
take years to come to fruition, such as oil prospecting or military aviation.

The Process Culture occurs in organizations where there is little or no feedback. People become
bogged down with how things are done not with what is to be achieved. This is often associated
with bureaucracies. While it is easy to criticize these cultures for being overly cautious or bogged
down in red tape, they do produce consistent results, which is ideal in, for example, public
services.

Charles Handy

Charles Handy[3] (1985) popularized a method of looking at culture which some scholars have
used to link organizational structure to Organizational Culture. He describes:

• a Power Culture which concentrates power among a few. Control radiates from the center
like a web. Power Cultures have few rules and little bureaucracy; swift decisions can
ensue.
• In a Role Culture, people have clearly delegated authorities within a highly defined
structure. Typically, these organizations form hierarchical bureaucracies. Power derives
from a person's position and little scope exists for expert power.
• By contrast, in a Task Culture, teams are formed to solve particular problems. Power
derives from expertise as long as a team requires expertise. These cultures often feature
the multiple reporting lines of a matrix structure.
• A Person Culture exists where all individuals believe themselves superior to the
organization. Survival can become difficult for such organizations, since the concept of
an organization suggests that a group of like-minded individuals pursue the
organizational goals. Some professional partnerships can operate as person cultures,
because each partner brings a particular expertise and clientele to the firm.

Edgar Schein

Edgar Schein[4], an MIT Sloan School of Management professor, defines organizational culture as
"the residue of success" within an organization. According to Schein, culture is the most difficult
organizational attribute to change, outlasting organizational products, services, founders and
leadership and all other physical attributes of the organization. His organizational model
illuminates culture from the standpoint of the observer, described by three cognitive levels of
organizational culture.

At the first and most cursory level of Schein's model are organizational attributes that can be
seen, felt and heard by the uninitiated observer. Included are the facilities, offices, furnishings,
visible awards and recognition, the way that its members dress, and how each person visibly
interacts with each other and with organizational outsiders.

The next level deals with the professed culture of an organization's members. At this level,
company slogans, mission statements and other operational creeds are often expressed, and local
and personal values are widely expressed within the organization. Organizational behavior at this
level usually can be studied by interviewing the organization's membership and using
questionnaires to gather attitudes about organizational membership.

At the third and deepest level, the organization's tacit assumptions are found. These are the
elements of culture that are unseen and not cognitively identified in everyday interactions
between organizational members. Additionally, these are the elements of culture which are often
taboo to discuss inside the organization. Many of these 'unspoken rules' exist without the
conscious knowledge of the membership. Those with sufficient experience to understand this
deepest level of organizational culture usually become acclimatized to its attributes over time,
thus reinforcing the invisibility of their existence. Surveys and casual interviews with
organizational members cannot draw out these attributes--rather much more in-depth means is
required to first identify then understand organizational culture at this level. Notably, culture at
this level is the underlying and driving element often missed by organizational behaviorists.Using
Schein's model, understanding paradoxical organizational behaviors becomes more apparent. For
instance, an organization can profess highly aesthetic and moral standards at the second level of
Schein's model while simultaneously displaying curiously opposing behavior at the third and
deepest level of culture. Superficially, organizational rewards can imply one organizational norm
but at the deepest level imply something completely different. This insight offers an
understanding of the difficulty that organizational newcomers have in assimilating organizational
culture and why it takes time to become acclimatized. It also explains why organizational change
agents usually fail to achieve their goals: underlying tacit cultural norms are generally not
understood before would-be change agents begin their actions. Merely understanding culture at
the deepest level may be insufficient to institute cultural change because the dynamics of
interpersonal relationships (often under threatening conditions) are added to the dynamics of
organizational culture while attempts are made to institute desired change.

Gabrielle O'Donovan

Gabrielle O'Donovan is an author, management consultant and university lecturer. In 'The Corporate
Culture Handbook' [ISBN 109041-48-97-2], O'Donovan defines culture as an organic group
phenomenon, whereby tradition passes on acquired learning to successive generations while
innovation builds capacity to evolve with the environment. The interplay between these
complementary forces manifests in the shared beliefs and assumptions of the workforce. It is
visible in shared attitudes, behaviours and artefacts, and determines the quality of (business)
outcomes and results (O'Donovan, 2006).

The levels of culture are defined as the drivers, expressions and reflections of culture. Drivers
include shared needs and the central paradigm of the workforce. Expressions of culture include
shared attitudes and behaviours. Reflections of a culture include artefacts, outcomes and results.

O'Donovan has also created two new typologies of culture to meet the needs of 21st century
business leaders who are looking to create a service culture, a culture of ethics, or a culture of
learning and innovation. The first typology employs a systems perspective, to demonstrate how
the dual forces of tradition and innovation coexist to create necessary friction. This model
provides a useful frame of reference for those seeking to create a service culture or a culture of
learning and innovation. The second typology employs a moral perspective, to demonstrate the
role of principles and values in organizational culture. From this typology three distinct cultures
emerge—a culture of ethics, a black and white culture and a shades of grey culture.

Elements of culture
G. Johnson described a cultural web, identifying a number of elements that can be used to
describe or influence Organizational Culture:

• The Paradigm: What the organization is about; what it does; its mission; its values.
• Control Systems: The processes in place to monitor what is going on. Role cultures
would have vast rulebooks. There would be more reliance on individualism in a power
culture.
• Organizational Structures: Reporting lines, hierarchies, and the way that work flows
through the business.
• Power Structures: Who makes the decisions, how widely spread is power, and on what is
power based?
• Symbols: These include organizational logos and designs, but also extend to symbols of
power such as parking spaces and executive washrooms.
• Rituals and Routines: Management meetings, board reports and so on may become more
habitual than necessary.
• Stories and Myths: build up about people and events, and convey a message about what is
valued within the organization.

These elements may overlap. Power structures may depend on control systems, which may
exploit the very rituals that generate stories which may not be true.

Entrepreneurial Organizational Culture


Stephen McGuire defined and validated a model of organizational culture that predicts revenue
from new sources. An Entrepreneurial Organizational Culture (EOC) is a system of shared
values, beliefs and norms of members of an organization, including valuing creativity and
tolerance of creative people, believing that innovating and seizing market opportunities are
appropriate behaviors to deal with problems of survival and prosperity, environmental
uncertainty, and competitors’ threats, and expecting organizational members to behave
accordingly.

Critical Views on Organizational Culture


Writers from Critical management studies have tended to express skepticism about the
functionalist and unitarist views of culture put forward by mainstream management thinkers.
Whilst not necessarily denying that organizations are cultural phenomena, they would stress the
ways in which cultural assumptions can stifle dissent and reproduce management propaganda and
ideology. After all, it would be naive to believe that a single culture exists in all organizations, or
that cultural engineering will reflect the interests of all stakeholders within an organization. In
any case, Parker (2000) has suggested that many of the assumptions of those putting forward
theories of organizational culture are not new. They reflect a long-standing tension between
cultural and structural (or informal and formal) versions of what organizations are. Further, it is
perfectly reasonable to suggest that complex organizations might have many cultures, and that
such sub-cultures might overlap and contradict each other. The neat typologies of cultural forms
found in textbooks rarely acknowledge such complexities, or the various economic contradictions
that exist in capitalist organizations.

One of the strongest and most widely recognised criticisms of theories that attempt to categorise
or 'pigeonhole' organisational culture is that put forward by Linda Smircich. She uses the
metaphor of a plant root to represent culture, describing how it drives organisations rather than
vice versa. Organisations are the product of organisational culture; we are unaware of how it
shapes behaviour and interaction (as also recognised through Schein's (2002) underlying
assumptions), so how can we categorise it and define what it is?

The One Minute Manager

by Kenneth Blanchard and Spencer Johnson

Summary:

The One Minute Manager reveals three secrets to productive and efficient managing, as
told through a young man's search for the perfect managing and leading skills. The book
is focused on, not surprisingly, a one minute manager: a venerable leader who is highly
spoken of by his employees, his three secrets being the key to his success.

The first secret is One Minute Goals. This involves a meeting of the manager and the
employee where goals are agreed on, written down in a brief statement, and occasionally
reviewed to ensure that productivity is occurring. This whole process takes a "minute,"
which truly means it is a quick meeting; it is not limited to just sixty seconds. The purpose
of one minute goal setting is to confirm that the responsibilities of each worker are
understood, since confusion leads to inefficiency and discouragement.

The second secret to one minute managing is One Minute Praisings. This involves being
open with people about their performance. When you catch someone doing something
right (a goal of the one minute manager), you praise them immediately, telling them
specifically what they did correctly. Pause to allow them to "feel" how good you feel
regarding their importance to the organization, and finish by shaking hands.

The third secret is the One Minute Reprimand. Being honest with those around you
involves reprimanding when a wrong has occurred. The first step is to reprimand
immediately and specifically. This mirrors the second secret, and it holds an
important aspect of the first: it enables an understanding of responsibilities and
how to complete them correctly. Following the reprimand, shake hands and remind the
person that he or she is important and that it was simply the performance you did not
like. The one minute reprimand consists of the reprimand and the reassurance, both
equally important. If you leave the latter out, you will not be liked by those around you,
and people will attribute mistakes to their being worth less, which is far from the truth.

Downsizing in the Macroeconomy
The concept of downsizing can have a longer-term and perhaps subtler effect on the labor
force. Management can repeatedly threaten to wield the job-cutting axe, and to the extent
that workers believe that the threat is credible, they will be less bold in asking for salary
raises. Management clearly has an incentive to reduce the wage bill. As many managers
participate in profit-sharing plans and stock options, the lower the wage bill (for a given output
and productivity level), the higher the profits. The incentive is clearly for management to portray
a doom-and-gloom attitude about the state of the firm to its workers, while conveying a rosy
outlook to its shareholders.

The years 1993 to 1997 provided the labor force with almost idyllic settings. Unemployment
dropped significantly, and the jobs being created were at higher salaries than the average of
what already existed. Yet, highly publicized downsizings instilled an almost irrational fear in
the work force of impending and wide-spread layoffs. In a study by The New York Times
following the AT&T downsizings, 75% of households reported a close encounter with a job
layoff in the past fifteen years. One third of the respondents knew someone who had
recently lost a job or had been laid off, while 10% of those surveyed claimed that a job loss
had "precipitated a major crisis" in their lives. An article in The Denver Post read: "[This] is an
explosive social backdrop for new massive job losses through downsizing in the developed
world. In America, it is time for government to bring businesses, finance, and labor together
for discussions on compensation packages to prevent workers from bearing the entire cost
of downsizing while shareholders take all its profits." Such a dire account of the state of the
labor market seems more applicable to the Great Depression than to a booming economy.
"Nobody's safe anymore," announced Maine's Portland Press Herald. The Austin American-
Statesman proclaimed "Millions run scared in today's workplace."

This relates to the issue of perceptions in the labor market. If newspapers report massive job
cuts, and don't report the wide-scale hirings, perceptions of unemployment might be higher
than actual unemployment. Workers might assume that the probability of their being
unemployed is greater than it actually is. Blanchflower suggests that the equilibrium wage
should tend to rise in this situation. Workers who perceive a high chance of being laid off
should receive a wage premium for their willingness to be employed in a risky position. In
the Blanchflower world, the fear of downsizing increases the aggregate corporate wage bill in
the economy. For a given revenue stream, this will lower profits and shareholder value. In
such a model, downsizing is a horrible strategy for the management to undertake as it
increases the wage bill. The counter-argument proposed by Rosen seems to explain wage
inflation in the US better than the Blanchflower model. Rosen suggests that the fear of
unemployment will tend to lower the equilibrium wage, as workers are afraid to ask for
raises, since they believe that there are many unemployed individuals who would be more
than happy to replace them in their jobs at a lower wage. Additionally, those seeking work
are willing to return to employment at below-market wages, which are still better than
unemployment benefits. In the Rosen model, the perception of downsizing bids up equity
values. The more that workers believe that downsizing is prevalent, the lower the wage bill
for a given revenue. This tends to increase profits, and therefore the value of the company.

If anything, downsizing appears to be a nation-wide scare tactic, and one that worked for a
long time. The Conference Board conducts a monthly economic survey, asking respondents
to characterize the labor market six months from the survey date as "more jobs," "fewer
jobs," or "no change." Despite impressive economic growth during the era of downsizing,
more respondents claimed that there would be fewer jobs in the near future. Once the
AT&T downsizing had captured widespread media attention, 20% of those surveyed
predicted fewer jobs (up from 16% the previous month), while 11% estimated there would
be more jobs (down from 14% the previous month). Such a dire change in outlook (see the
"spike" in the time series at January 1996 in Figure II.1.) remains completely unjustified by
the promising economic news that was released at the time.

Figure II.1.

A study by President Clinton's Council of Economic Advisors reported in late April 1996
that Americans' fears of corporate downsizings "could be overstated" given that two-thirds
of jobs created since 1994 pay above-average wages. Job reductions were found to have risen
slightly, but were outpaced by higher-paying "quality" jobs. As this report was released
during an election year, the Dole campaign countered that it was a politicized report
designed to boost Clinton's poll ratings. Still, a survey in the report revealed that a growing
number of Americans feared downsizings, and cited AT&T's much-publicized decision to
cut 40,000 workers. The study was conducted in April of 1996, two months after AT&T had
revised that number to 18,000 workers and had actually laid off fewer than 1,000 by that
time.

Perhaps this widespread fear, however unjustified, may explain why wages have not
risen in response to tightness in the labor market as they have historically. Standard
macroeconomics states that a rise in aggregate demand will increase the demand for inputs
such as labor. To induce more workers to join the labor force, wages should rise. Both the
wage increases and the price hikes stemming from stronger aggregate demand are supposed
to cause inflation. Yet a rise in prices does not seem apparent as a result of the robust
economy. An economy that is strong and at the same time exhibits little
inflationary pressure is ideal for equity values. In the formula for the price of a share of
stock, the numerator (expected future earnings) rises as firm earnings are expected to rise,
and the denominator (the rate at which cash flows are discounted) falls as inflation falls. A
rising numerator and a falling denominator indicate, without ambiguity, a higher share price.
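The pricing logic in this paragraph can be written out explicitly. Below is the standard discounted-cash-flow valuation, a textbook expression rather than one given in the thesis:

```latex
P_0 = \sum_{t=1}^{\infty} \frac{E[CF_t]}{(1+r)^t}
```

Here $E[CF_t]$ is the expected cash flow in period $t$ (the numerator) and $r$ is the discount rate (the denominator), which rises and falls with expected inflation. A strong economy raises each $E[CF_t]$ while low inflation lowers $r$, so every term in the sum, and hence the price $P_0$, increases.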

Perhaps sentiments in the labor market generated by downsizing are causing this
phenomenon. Workers are not reacting to tight employment in the manner textbooks claim
they should. The Conference Board's survey is a fairly reasonable proxy for the sentiments
of workers. The null hypothesis is that expectations of the future state of the job market
cannot predict the month-to-month change in inflation. This hypothesis is rejected at
the 95% confidence level for the modern US economy. Regressions demonstrate that changes in the
inflation rate are negatively correlated with the percentage of workers who feel that the
outlook for jobs is poorer, over the period 1980 to 1997. In other words, as the number
workers who are pessimistic about employment prospects in the economy rises, the change
in the inflation rate over that period falls, all things equal. Specifically, for every one-
percentage point of workers that anticipate more layoffs than hirings, the rise in inflation
slows down by 1.3 basis points (standard error 0.36 basis points, regression 1). The full
regression output is included in the appendix of the thesis (Chapter X). A similar result is
obtained for expectations six months in advance: for every pessimistic worker out of 100,
the rate of inflation falls by 0.9 basis points (standard error 0.21 basis points, regression 2).
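The kind of regression reported above can be sketched in a few lines. The data below are synthetic (the thesis's Conference Board and CPI series are not reproduced here), generated with a built-in slope of -1.3 basis points per percentage point of pessimists so the estimator's behavior can be seen; the function name and numbers are illustrative only:

```python
import random
from statistics import fmean

def ols_slope(x, y):
    """Univariate OLS slope: cov(x, y) / var(x)."""
    mx, my = fmean(x), fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

random.seed(0)
# Synthetic monthly observations for 1980-1997 (216 months): percentage of
# survey respondents expecting "fewer jobs"
pessimism = [random.uniform(10, 30) for _ in range(216)]
# Change in inflation, in basis points, built with a true slope of -1.3 bp
# per percentage point of pessimists, plus noise
d_inflation = [-1.3 * p + random.gauss(0, 5) for p in pessimism]

beta = ols_slope(pessimism, d_inflation)
print(round(beta, 2))  # close to -1.3
```

The thesis's regressions 3 and 4 additionally control for industrial production; the same least-squares logic extends to multiple regressors.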

To verify that this relationship is a result of the perceived derived demand for labor, not of
industrial production itself, the rate of inflation was regressed on the change in
the government's industrial production index, as well as the Conference Board statistics on
expectations in the labor market. Industrial production was not statistically significant for the
current economy, or the economy expected six months from that date, while the job data
retained its statistical significance at the 95% level (regressions 3 and 4).

Geoffrey Tootell, an economist with the Boston Federal Reserve, has studied this
phenomenon. Tootell wanted to test whether the recent downsizing craze had increased the natural
rate of unemployment in the economy. He regressed perceived changes in the NAIRU on
two phenomena that have been the subject of recent debate. The first is the massive job cuts
in the military and defense industries; the second (and related) observation is the
geographic mismatch in the economy: some regions have many job openings, while others
profess tight labor markets. Supposedly, large downsizings in concentrated areas would
cause great variation in regional unemployment and might increase the NAIRU. Tootell was
unable to reject the hypothesis that the military downsizing had no effect on the NAIRU.

The effects of downsizing on stock value now seem more evident. The effect on
undiscounted cash flow depends on perception: does the market consider the firm
repositioning itself to accommodate greater market share, or does the market believe the
firm is reducing capacity to brace for a downturn in demand? This effect is firm or industry-
specific. Yet there exists a subtler effect of downsizing that is economy-wide: the
effect of downsizing on the rate at which the expected cash flows of a share of stock are
discounted. Anecdotal and empirical evidence suggests that fear of downsizing, whether
legitimate or not, will put downward pressure on inflation (workers are less bold in asking
for pay raises) and therefore the rate used to discount cash flows.

While this analysis serves to explain what impact downsizing in general will have on stock
prices in general, a framework needs to be developed to account for how the downsizing of
one company will alter the equity value of that particular company. The next chapter
presents an event study that measures the effect upon the stock price of a downsizing
announcement.

Requirements of a Self-Managed Team Leader

Organizations are benefiting from the advantages of teams by using more of them in a wide
variety of ways. Cross-functional teams are regularly being formed and commissioned by
management to manage projects, design or improve products or processes, resolve chronic
problems or to conduct research on new equipment and technologies. Self-managed teams are
active in some companies, managing responsibilities associated with their everyday work.
Without defining their style, many successful leaders of these teams are using self-managed
team leadership principles and processes to improve performance and to achieve the desired results.

Self-managed team leadership is quite different from traditional leadership and provides an
alternative to the traditional leader's role. It affords the leader an opportunity to apply different
methods that neutralize the issues often associated with the traditional leadership model.
The self-managed team leader is most at home in the Team Based Organization (TBO) where his
or her style is supported and in line with the culture and values of the organization. However, a
staff member, supervisor or manager in a traditional organization can learn to become a self-
managed team leader if his or her supervisor, the organization’s culture and paradigms of the
organization will permit him or her to do so.

Self-managed team leaders lead without positional authority. Traditional leaders function outside
of their subordinate work group and use positional authority to provide instruction, conduct
communication, develop action plans and give orders on what is to be accomplished. If
necessary, positional authority includes the right to discipline a subordinate if he fails to comply
with orders or meet requirements. In practice, the power to discipline is seldom needed or
exercised, but subordinates recognize that their supervisor has the authority to take
disciplinary action if the situation warrants it. Relationships between supervisor and subordinates
are maintained at arm's length to ensure objectivity in making assignments and reviewing
performance. The focus of the leader's attention is on meeting the needs of his supervisor and
the organization. Two-way communications and positive response to a leader’s direction is
desirable, but not required.

Self-managed team leadership means moving inside one's subordinate work group to lead. In the
self-managed team leader's role, the leader decides to permanently or temporarily set aside positional
authority and to move inside the work group to provide direction, communication, group process
facilitation, coordination and support. When a leader has not been delegated positional authority
from higher-level management and is a member of the work group, none of the traditional issues
related to positional authority are present. However, sometimes the process breaks down
because the noncommissioned leader thinks the proper and most effective way to lead is by
following the traditional leadership model.

To move inside the work group, the traditional leader announces to subordinate staff members
that they are being delegated the authority to manage a defined area of responsibility or to make
a decision. The team has the responsibility and authority for reaching consensus decisions that
everyone can support. The leader makes it clear he or she will act as the team's facilitator to
coordinate the work, but will not make any independent decisions related to the delegated
responsibility area. He or she notes that the team will be held accountable for the outcome of its
decisions and actions. As a team leader experiences success and recognizes the benefits of this
process, he or she will define more areas for collective responsibility and decision-making and
spend more and more time inside of the group.

While team leaders do not have the power of positional authority, they do enjoy the authority that
comes from:

• their ability to communicate and represent the team's interests,
• a desire to help each member to develop and use their skills,
• a demonstration of concern for each member and the team,
• the ability to facilitate group processes,
• a knowledge of the group’s work processes,
• the ability to help the team to maintain its focus, and
• setting an example through one's behavior, personal values, energy and actions.

The self-managed team leader fulfills a skilled team role similar to that of a captain in a team sport,
but this role does not carry with it special status. Status is not at issue because the leader
maintains or accepts equal status with the other members of the group. The leader is not in a
position to give orders, to define or prescribe certain levels of individual or team performance.
The leader holds equal responsibility and accountability for the group’s performance with each
other team member. Ideas, options and collective decisions on how best to accomplish the
purpose and goals of the team are encouraged and supported by the team leader.

Self-managed team leadership defines a different role for the leader. The leader is not
responsible for making decisions, developing action plans or giving orders. In these situations,
the team is given the responsibility, authority and accountability for managing a defined area of
responsibility. When the work group is given control over one or more defined areas of team
responsibility, it is the leader’s role to use self-managed leadership skills and systematic
processes to help the team to operate effectively and efficiently. Everyone in the group is
encouraged to contribute by communicating and promoting their ideas, by "hitchhiking" on the
ideas of others and by exercising judgment to narrow down ideas or options. Everyone
recognizes that since the group makes decisions and develops action plans, the group will also
be held accountable for the outcomes of their management actions.

When a person accepts a position as a leader of a self-managed team or plays the role of a self-
managed team leader, he or she accepts the challenge of becoming both an exceptional leader
and an exceptional person. In effect, the team leader becomes accountable to the team for his or
her leadership performance. The team leader’s orientation is toward meeting the needs and
requirements of team members, a higher-level management authority and the organization.

With the above noted understanding of the self-managed team leader’s role, we can now briefly
define several of the requirements for effective team leadership.

The most important single factor in becoming a successful self-managed team leader is a servant
attitude. To have such an attitude, one must have or develop a sincere desire to assist the work
group to accomplish its responsibility by bringing out the best qualities and contribution of each
team member. This is something that cannot be taught. It requires an inner sense of security,
self-worth and self-control along with a desire to see others succeed. It is the cornerstone to
successfully fulfilling such roles as teaching, coaching or pastoring.

The team leader's orientation should be "How can I help create a working environment where my
fellow team members are willing to exert themselves to meet personal and team goals?" The
leader's mission is to free up team members to act collectively to use their intellect, creativity,
diversity, talents and skills to manage defined areas of team responsibility and to develop and
carry out action plans that capture the commitment and enthusiasm of everyone.

Some of the characteristics that team members often attribute to effective team leaders are as
follows:

• The team leader is a fellow worker and friend, not a supervisor;
• leads by example, not by giving directions;
• is a servant, not a master;
• is a peacemaker, not a warrior;
• is a coordinator, not an order giver;
• is a facilitator, not an individual decision-maker; and
• is a communications link, not a communications owner.

The team leader helps team members to identify their unique abilities and talents and then seeks
to provide the environment, resources and opportunities that will enable them to use their special
abilities to experience meaning from their work and contribute to team goals. The leader
recognizes that work can be a desirable and meaningful activity and that people seek to derive
fulfillment, purpose and joy from their employment situation.

To accomplish this, the leader finds ways to blend the needs of the organization with the higher
level needs of team members. The leader takes an active interest in each person in the group
and strives to build positive relationships with team members and among team members. In
effect the leader is in a continual process of finding ways to build and strengthen each member’s
skill set and self-worth based on their contribution to the group.

The secondary skills for team leader success are learned and can be developed by almost anyone
interested in leading a self-managed team. The key skills are group process facilitation, team
problem solving, team decision-making and team communications. Each of these skills requires
knowledge of systematic processes and tools to move a group forward to accomplishing its
mission in as efficient and effective a way as possible.

Group process facilitation is the use of consistent processes, methods, and tools that aid team
members in agreeing on how best to conduct business, accomplish work and manage defined
areas of team responsibility. Every time the team needs to conduct a meeting, make a decision,
develop an action plan or resolve a problem, the team leader uses group process facilitation skills
to enable team members to work together to carry out its management responsibilities. Much of
the team leader’s interaction with team members in conducting day-to-day work activities requires
the use of group process facilitation skills.

The team leader facilitates the team decision-making and problem solving processes. He or she
uses systematic step-by-step processes on a consistent basis to help members to make
unanimous or consensus decisions or to resolve problems. He or she helps to make the team
and the organization become more effective by harnessing the power of collective management
control, collective decision-making and problem solving over defined areas of team responsibility.

The team leader is the logical choice to handle the formal communications responsibilities for the
team. The leader coordinates the work of the team with process suppliers and customers and
with managers and staff personnel. He or she plans team meetings, prepares and distributes a
meeting agenda to team members and facilitates team meetings. The leader reviews meeting
minutes, posts a copy of the minutes on the team's communications board and sends a copy to
the next higher-level of management.

The team leader acts as the channel through which communications flow both inside of and
outside of the team. On issues concerning the team's support or performance, the leader takes
care to ensure that he or she is communicating the consensus of the group and not the leader's
own position. The leader thinks and communicates in terms of "we".

Performance Leadership in Meeting Customer Requirements by Doing the Right Things Right the First Time

Is this Total Quality or Six Sigma?

The answer is BOTH.

Six Sigma provides a structured approach to Total Quality.

In 1988, Motorola and the Westinghouse Commercial Nuclear Fuel Division (WCNFD)
won the first Baldrige National Quality Awards. Both Motorola and Westinghouse had
undertaken major quality improvement programs in the early 1980's.

Motorola used Six Sigma quality and Cycle Time reduction as the foundations of its
Continuous Improvement program. The goal was Total Customer Satisfaction (TCS). In
the late-1970's, Westinghouse began using Cycle Time reduction to dramatically reduce
its investment in inventory. In the early 1980's, WCNFD also focused on improving
process yield (fundamentally a Six Sigma approach). These similar Continuous Quality
Improvement (CQI) programs paid huge dividends. Motorola achieved a dominant
market position in pagers and cell phones and WCNFD did so in nuclear fuel.

Today, we see many corporations -- most notably GE -- adopting similar quantitative
quality improvement programs to achieve significant bottom line results. Strong
management leadership and support has been as vital in these successes as the quality
improvement techniques themselves -- Bob Galvin at Motorola, Mead D'Amore at
WCNFD, and Jack Welch at GE.

Six Sigma Process Quality

In 1985, Bill Smith at Motorola demonstrated a correlation between how often a product
was repaired during manufacture and its life in the field. Defect levels in the parts per
million (ppm) rather than in parts per hundred (%) were needed to improve the reliability
of semiconductors and electronic products in order to compete with the Japanese. Hence
the development of the Motorola Six Sigma quality program, with its landmark quality
level of 3.4 ppm defects.

Six Sigma was intended to improve the quality of processes that are already under control
-- major special causes of process problems have been removed. The output of these
processes usually follows a Normal distribution, with the process capability defined as ± 3
sigma.

The process mean will vary each time a process is executed using different equipment,
different personnel, different materials, etc. The observed variation in the process mean
was ± 1.5 sigma. Motorola decided a design tolerance (specification width) of ± 6 sigma
was needed so that there would be only 3.4 ppm defects -- measurements outside the design
tolerance. This was defined as Six Sigma quality.
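The 3.4 ppm figure can be verified directly from the standard normal distribution. The short calculation below is an illustration using Python's statistics module, not part of the original text; it shifts the mean 1.5 sigma and sums the tail areas beyond the ± 6 sigma tolerance:

```python
from statistics import NormalDist

nd = NormalDist()   # standard normal
shift = 1.5         # drift in the process mean, in sigma units
tolerance = 6.0     # design tolerance: +/- 6 sigma

# With the mean shifted 1.5 sigma toward one limit, the near spec limit sits
# 4.5 sigma away and the far limit 7.5 sigma away; sum both tail areas.
p_defect = nd.cdf(-(tolerance - shift)) + nd.cdf(-(tolerance + shift))
ppm = p_defect * 1_000_000
print(round(ppm, 1))  # -> 3.4
```

Almost all of the defect probability comes from the near tail; the far tail at 7.5 sigma is negligible.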

Six Sigma Process Improvement -- (D)MAIC

A more quantitative version of Deming's PDCA (Plan-Do-Check-Act) Process
Improvement methodology was developed to implement this statistical approach -- it is
commonly referred to as MAIC.

• Measure
• Analyze
• Improve
• Control

Key product-process performance variables are measured, analyzed, improved, and
controlled using statistical methods. The simple "statistical" quality tools that were
popularized in the Total Quality era are reinforced with Design of Experiments (DOE)
and more sophisticated Statistical Process Control techniques.
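As one concrete example of these tools, an X-bar control chart can be sketched in a few lines. This is an illustrative sketch with made-up measurement data, and the pooled-standard-deviation shortcut stands in for the usual A2 / R-bar chart constants:

```python
from statistics import fmean, stdev

def xbar_limits(subgroups):
    """Control limits for an X-bar chart: grand mean +/- 3 * (s / sqrt(n)).

    Uses the average within-subgroup standard deviation as a simple
    stand-in for the conventional control-chart constants.
    """
    n = len(subgroups[0])
    means = [fmean(g) for g in subgroups]
    grand = fmean(means)
    s = fmean(stdev(g) for g in subgroups)  # average within-subgroup sd
    margin = 3 * s / n ** 0.5
    return grand - margin, grand, grand + margin

# Made-up subgroups of 4 measurements each (e.g., a dimension in mm)
subgroups = [
    [10.1, 9.8, 10.0, 10.2], [9.9, 10.1, 10.0, 9.7],
    [10.2, 10.0, 9.9, 10.1], [10.0, 9.8, 10.3, 10.0],
]
lcl, center, ucl = xbar_limits(subgroups)
out_of_control = [i for i, g in enumerate(subgroups)
                  if not lcl <= fmean(g) <= ucl]
print(out_of_control)  # expect [] -- all subgroup means within limits
```

A subgroup mean falling outside the limits would signal a special cause worth investigating before the process is considered "under control."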

Process sigma is the primary unit of measure. It is determined from an analysis of the
number of defects observed in a process. Performance is compared to the Best-In-Class
sigma for that process to determine whether the process needs to be improved or the
product / service needs to be re-designed. When improvement is necessary, Design of
Experiments (DOE) is used to determine which product or process parameters are most
important and the specific parameter values that will give the best performance. SPC is used
to continually monitor product and process performance.
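The conversion from observed defects to a process sigma can be made concrete. The helper below is an illustrative sketch (the function name is my own), applying the conventional 1.5-sigma long-term shift described earlier:

```python
from statistics import NormalDist

def process_sigma(defects, opportunities, shift=1.5):
    """Long-term process sigma from observed defect counts.

    Converts defects-per-million-opportunities (DPMO) to a sigma level
    via the standard-normal inverse CDF, then adds back the conventional
    1.5-sigma long-term shift.
    """
    dpmo = defects / opportunities * 1_000_000
    return NormalDist().inv_cdf(1 - dpmo / 1_000_000) + shift

# 3.4 defects per million opportunities corresponds to Six Sigma quality
print(round(process_sigma(3.4, 1_000_000), 2))  # -> 6.0
```

The resulting sigma can then be compared against the Best-In-Class figure for the process, as the text describes.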

Similar to the problem-solving models where an initial step to define the problem was
frequently added, some practitioners prefer to precede MAIC with a Define step. They
feel that selecting and defining the right process is critical. Effort can easily be wasted
working on poorly selected, ill-defined processes -- as illustrated by many TQM failures.

Total Quality Management (TQM)

TQM is an overall business (quality) improvement system. It encompasses leadership,
strategic planning, and human resources as well as Process Improvement -- as seen in the
Baldrige Award Framework.

The previously described Six Sigma Process Improvement methodology would be
covered in criteria 3, 4, and especially 6 -- Customer and Market Focus, Information and
Analysis, and Process Management, respectively. The Baldrige criteria do not prescribe
the use of a specific quality improvement methodology such as Six Sigma. A business
can select or develop its own process, but it must show that results are obtained.

The Baldrige criteria do assess whether all personnel are enabled to contribute
effectively through work teams and individually. TQM provided a big impetus to
problem-solving teams, quality improvement teams (QITs), and cross-functional teams.

Companies generally trained teams to use simple statistical quality tools in solving
problems. These teams have been very effective in developing and implementing
consensus-based solutions to productivity and quality issues.

The core values and concepts of the Malcolm Baldrige National Quality Award are

• Customer-Driven Quality
• Strong Leadership
• Continuous Improvement
• Employee Participation (Teamwork)
• Fast Response
• Design Quality
• Management by Fact (measures)
• Partnerships
• Measurable Results

Customer requirements, design quality, measures, and
continuous improvement are key elements of Six Sigma Process
Improvement.

Six Sigma Improvement System

Many Total Quality improvement efforts did not achieve their objectives because there
was a lack of commitment to the specific improvement actions and to their effective
implementation. Six Sigma, as a system, overcomes that weakness by

• focusing on the common commitment to meeting customer requirements,
• developing a consensus set of improvement actions,
• prioritizing those actions, and
• establishing measures that assure accountability in implementation.

Many companies today are achieving dramatic results with a company-wide Six Sigma
Improvement System based on the previously described Six Sigma Process Improvement
methodology -- MAIC. Large numbers of technical personnel are trained as "black belts"
to lead teams in applying the statistically-based methodology. Most black belt training
programs focus heavily on these advanced statistical techniques.

High level executives are appointed as "champions" to drive the Six Sigma Program
within their segment of the company. Master Black Belts coach black belts and
coordinate Six Sigma projects. Some companies provide basic process improvement
training to Six Sigma project team members and refer to them as "green belts." Black
belts and / or teams are assigned process improvement projects with specific performance
improvement goals.

To reduce the workload on their key personnel, to lessen the need for extensive training,
and to minimize costs, small organizations (and some large ones, too) obtain external
facilitation and statistical methods support.

Balanced Scorecard

Kaplan's Balanced Scorecard (Harvard Business School) lends support to the
importance of approaching business in a total systems manner, such as TQM or Six Sigma
in the broad sense. Employee skills are the base of Kaplan's business model. Employees
work to improve quality and reduce cycle time (improve processes) so that deliveries can
be made on-time. This creates customer loyalty which in turn generates profits for the
company (Return on Capital Employed). Kaplan suggests using measures of employee
skills, process quality, process cycle time, and on-time performance to monitor business
performance in addition to the usual financial measures (which lag performance).

Brecker Six Sigma Improvement Methodology

The four-phase Brecker Six Sigma Improvement Methodology incorporates elements of
Value Analysis (VA), Quality Function Deployment (QFD), and QS9000 (ISO 9000:2000
is now similar) into the Six Sigma Improvement System to provide better results with less
effort and cost.

Implementation can be undertaken at three levels:

• Process (Phase 3)
• Product Line / Plant (Phases 2-3)
• Business (Phases 1-3).

Organizations can pilot this methodology at the product line / plant level (Phases 2-3)
before committing to company wide implementation and training. Traditional Six Sigma
training addresses Phase 3.

Phase 1: Key problem areas are identified and quantified.

Senior personnel analyze customer, financial, operational, and quality data to identify
improvement opportunities and quantify possible improvements. An Activity-Based
Costing approach is frequently taken. Improvement goals are aligned with strategic
business objectives. This is akin to DMAIC at the business level with the Critical to
Quality (CTQ) and Critical to Business (CTB) parameters being passed down from Phase
1 to Phase 3 (similar to QFD or Hoshin planning).

Phase 2: Potential product / process improvement solutions are quantified.

Product line / plant teams use value analysis style workshops to develop and evaluate
specific product / service and process improvements needed to meet quality, productivity,
and cost objectives. Lean thinking, Six Sigma, and other quality and productivity
concepts are considered.

Phase 3: Multi-functional teams improve key processes.

Multi-functional teams analyze products and processes in depth and develop detailed
implementation plans for improvements. Lean thinking, Six Sigma, Kaizen, and other
quality and productivity tools are used as appropriate.

Phase 4: Improvements are implemented and monitored.

Strong management support is essential in making significant and lasting improvements.
Decision-making needs to be crisp. Follow-up needs to be relentless. Improvement goals
and the implementation schedule must be met to achieve the projected returns.

LEARNING ORGANIZATION

“Contemplate to see that awakened people, while not being enslaved by the work of
serving living beings, never abandon their work of serving living beings.”

Thich Nhat Hanh

The Miracle of Mindfulness! (1976, p. 98)

Introduction
In a way those who work in a learning organization are “fully awakened” people. They
are engaged in their work, striving to reach their potential, by sharing the vision of a
worthy goal with team colleagues. They have mental models to guide them in the pursuit
of personal mastery, and their personal goals are in alignment with the mission of the
organization. Working in a learning organization is far from being a slave to a job that is
unsatisfying; rather, it is seeing one’s work as part of a whole, a system where there are
interrelationships and processes that depend on each other. Consequently, awakened
workers take risks in order to learn, and they understand how to seek enduring solutions
to problems instead of quick fixes. Lifelong commitment to high quality work can result
when teams work together to capitalize on the synergy of the continuous group learning
for optimal performance. Those in learning organizations are not slaves to living beings,
but they can serve others in effective ways because they are well-prepared for change and
working with others.

Organizational learning involves individual learning, and those who make the shift from
traditional organization thinking to learning organizations develop the ability to think
critically and creatively. These skills transfer nicely to the values and assumptions
inherent in Organization Development (OD). Organization Development is a “long-term
effort at continuous improvement supported at all levels of the organization, using
interdisciplinary approaches and modern technologies.”1 Organization Development is
the mother field that encompasses interventions, such as organization learning. OD is
about people and how they work with others to achieve personal and organizational
goals. Many times achieving goals means making changes that require creative thinking
and problem solving. French and Bell report that the values held by OD practitioners
include “wanting to create change, to positively impact people and organizations,
enhance the effectiveness and profitability of organizations, [to] learn and grow, and
exercise power and influence.” (1995, p. 77) Although values do shift over time, the
values held by OD practitioners mesh well with the characteristics of learning
organizations as outlined in this paper.

The paper is organized according to the five disciplines that Peter Senge (1990) says are
the core disciplines in building the learning organization: personal mastery, mental
models, team learning, shared vision, and systems thinking.2 Even though the paper
makes liberal use of Senge’s pervasive ideas, it also refers to OD practitioners such as

87
Chris Argyris, Juanita Brown, Charles Handy, and others. What these writers have in
common is a belief in the ability of people and organizations to change and become more
effective, and that change requires open communication and empowerment of community
members as well as a culture of collaboration. Those also happen to be the characteristics
of a learning organization. The paper is influenced by team meetings in which the five
authors prepared a class presentation on the topic of learning organizations. The team
worked to emulate a learning community within the group. The paper reflects the
learning, reflection, and discussion that accompanied the process.

Personal Mastery
Personal mastery is what Peter Senge describes as one of the core disciplines needed to
build a learning organization. Personal mastery applies to individual learning, and Senge
says that organizations cannot learn until their members begin to learn. Personal Mastery
has two components. First, one must define what one is trying to achieve (a goal).
Second, one must have a true measure of how close one is to the goal. (Senge, 1990)

Mental Models
Mental models are the second of Senge's five disciplines for the learning
organization.(Senge, The Leader’s New Work, 1990) Much of the work involving mental
models comes from Chris Argyris and his colleagues at Harvard University. A mental
model is one's way of looking at the world. It is a framework for the cognitive processes
of our mind. In other words, it determines how we think and act. A simple example of a
mental model comes from an exercise described in The Fifth Discipline Fieldbook. In this
exercise, pairs of conference participants are asked to arm wrestle. They are told that
winning in arm wrestling means the act of lowering their partner's arm to the table. Most
people struggle against their partner to win. Their mental model is that there can be only
one winner in arm wrestling and that this is done by lowering their partner's arm more
times than their partner can do the same thing to them. Argyris contends that these people
have a flawed mental model.

An alternative model would present a framework where both partners could win. If they
stop resisting each other, they can work together flipping their arms back and forth. The
end result is that they can both win and they can win many more times than if they were
working against each other. (Senge, 1994) Argyris says that most of our mental models
are flawed. He says that everyone has ‘theories of action’ which are a set of rules that we
use for our own behaviors as well as to understand the behaviors of others. However,
people don't usually follow their stated action theories. The way they really behave can
be called their ‘theory-in-use.’ It is usual:

1. To remain in unilateral control,
2. To maximize winning and minimize losing,
3. To suppress negative feelings, and

88
4. To be as rational as possible, by which people mean defining clear objectives and
evaluating their behavior in terms of whether or not they have achieved them.
(Argyris, 1991)

Argyris believes that people can be taught to see the flaws in their mental models. One
way to do this is to practice the left-hand column technique. In this exercise, one takes
some dialogue that occurred during a conversation and writes it in the form of a play
script on the right-hand side of a sheet of paper. In the corresponding left-hand column,
one records what he or she was really thinking during the conversation. An example is as
follows:

Left-hand column (what I'm thinking) / Right-hand column (what was said):

[Thinking: Everyone says the presentation was a bomb.]
Me: How did the presentation go?

Bill: Well, I don't know. It's really too early to tell. Besides, we're breaking new
ground here.

[Thinking: Does he really not know how bad it was?]
Me: Well, what do you think we should do? I believe that the issues you were raising
are important.

[Thinking: He really is afraid to see the truth. If he only had more confidence, he could
probably learn from a situation like this.]
Bill: I'm not so sure. Let's just wait and see what happens.

[Thinking: I can't believe he doesn't realize how disastrous that presentation was to our
moving ahead. I've got to find some way to light a fire under the guy.]
Me: You may be right, but I think we may need to do more than just wait.

(Senge, 1990, p. 196)

Professor Sue Faerman at the University at Albany suggests that there could be two left-
hand columns, one for what each partner to the conversation might be thinking.
(Faerman, 1996)

Left-hand column #1: What I think she was thinking.
Left-hand column #2: What I was thinking.
Right-hand column: What was said.

Teams
WHAT IS A TEAM AND WHY IS IT IMPORTANT?

89
A team, say Robbins and Finley, is “people doing something together.” It could be a
baseball team or a research team or a rescue team. It isn’t what a team does that makes it
a team; it is the fact that they do it “together.” (Robbins and Finley, 1995, p. 10) “Teams
and teamwork are the ‘hottest’ thing happening in organizations today...” according to
French and Bell. (1995, p. 97) A workplace team is more than a work group, “a number
of persons, usually reporting to a common superior and having some face-to-face
interaction, who have some degree of interdependence in carrying out tasks for the
purpose of achieving organizational goals.” (French and Bell, 1995, p. 169)

A workplace team is closer to what is called a self-directed work team or SDWT, which
can be defined as follows: “A self-directed work team is a natural work group of
interdependent employees who share most, if not all, the roles of a traditional
supervisor.” (Hitchcock and Willard, 1995, p. 4) Since teams usually have team leaders,
sometimes called coaches, the definition used by Katzenbach and Smith in French and
Bell seems the most widely applicable: “A team is a small number of people with
complementary skills who are committed to a common purpose, set of performance
goals, and approach for which they hold themselves mutually accountable.” (1995, p.
112)

TEAM BUILDING AND TEAM LEARNING

A recent concept in OD is that of the learning organization. Peter Senge considers the
team to be a key learning unit in the organization. According to Senge, the definition of
team learning is:

...the process of aligning and developing the capacity of a team to create
the results its members truly desire. It builds on the discipline of developing
shared vision. It also builds on personal mastery, for talented teams are
made up of talented individuals.

Before a team can learn, it must become a team. In the 1970s, psychologist B. W.
Tuckman identified four stages that teams had to go through to be successful. They are:

1. Forming: When a group is just learning to deal with one another; a time when
minimal work gets accomplished.
2. Storming: A time of stressful negotiation of the terms under which the team will
work together; a trial by fire.
3. Norming: A time in which roles are accepted, team feeling develops, and
information is freely shared.
4. Performing: When optimal levels are finally realized—in productivity, quality,
decision making, allocation of resources, and interpersonal interdependence.
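Tuckman's four stages can be sketched as a simple linear progression. The sketch below is a minimal illustration, not part of Tuckman's model itself; it also simplifies by treating "performing" as terminal, whereas real teams can regress to earlier stages.

```python
# Tuckman's stages as a linear progression: a team advances one stage at a
# time; in this simplified sketch, "performing" is treated as terminal.
STAGES = ["forming", "storming", "norming", "performing"]

def next_stage(current):
    """Return the stage that follows `current` (performing is terminal here)."""
    i = STAGES.index(current)
    return STAGES[min(i + 1, len(STAGES) - 1)]

stage = "forming"
for _ in range(3):
    stage = next_stage(stage)
print(stage)  # performing
```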

Shared Vision

90
The shared vision of an organization must be built of the individual visions of its
members. What this means for the leader in the Learning Organization is that the
organizational vision must not be created by the leader, rather, the vision must be created
through interaction with the individuals in the organization. Only by reconciling the
individual visions and developing them in a common direction can the shared vision be
created. The leader's role in creating a shared vision is
to share her own vision with the employees. This should not be done to force that vision
on others, but rather to encourage others to share their vision too. Based on these visions,
the organization's vision should evolve.

It would be naive to expect that the organization can change overnight from having a
vision that is communicated from the top to an organization where the vision evolves
from the visions of all the people in the organization. The organization will have to go
through major change for this to happen, and this is where OD can play a role. In the
development of a learning organization, the OD-consultant would use the same tools as
before, just on a much broader scale.

What is a shared vision? To come up with a classification for shared visions would be
close to impossible. Going back to the definition of a vision as a graphic and lifelike
mental image that is very important to us, Melinda Dekker's drawing [see p. 2] is as good
as any other representation of shared vision. The drawing will probably be interpreted
differently by different people, but there is still something powerful in the imagery that
most people can see.

Systems Thinking
Humankind has succeeded over time in conquering the physical world and in developing
scientific knowledge by adopting an analytical method to understand problems. This
method involves breaking a problem into components, studying each part in isolation,
and then drawing conclusions about the whole. According to Senge, this sort of linear and
mechanistic thinking is becoming increasingly ineffective for addressing modern problems.
(Kofman and Senge, 1993, p. 18) This is because, today, most important issues are
interrelated in ways that defy linear causation.

Alternatively, circular causation—where a variable is both the cause and effect of
another—has become the norm, rather than the exception. Truly exogenous forces are
rare. For example, the state of the economy affects unemployment, which in turn affects
the economy. The world has become increasingly interconnected, and endogenous
feedback causal loops now dominate the behavior of the important variables in our social
and economic systems.

Thus, fragmentation is now a distinctive cultural dysfunction of society.4 (Kofman and
Senge, p. 17) In order to understand the source and the solutions to modern problems,
linear and mechanistic thinking must give way to non-linear and organic thinking, more

91
commonly referred to as systems thinking—a way of thinking where the primacy of the
whole is acknowledged.

Conclusion
The concept of the learning organization arises out of ideas long held by leaders in
organizational development and systems dynamics. One of the specific contributions of
organizational development is its focus on the humanistic side of organizations. The
disciplines described in this paper “differ from more familiar management disciplines in
that they are ‘personal’ disciplines. Each has to do with how we think, what we truly
want, and how we interact and learn with one another.” (Senge, 1990, p. 11) The authors
of this paper see learning organizations as part of the evolving field of OD. To our
knowledge, there are no true learning organizations at this point. However, some of
today’s most successful organizations are embracing these ideas to meet the demands of a
global economy, where the individual is increasingly recognized as the organization’s
most important resource.

92
Enterprise resource planning
Enterprise Resource Planning systems (ERPs) integrate (or attempt to integrate) all
data and processes of an organization into a unified system. A typical ERP system will
use multiple components of computer software and hardware to achieve the integration.
A key ingredient of most ERP systems is the use of a unified database to store data for
the various system modules.

The term ERP originally implied systems designed to plan the use of enterprise-wide
resources. Although the acronym ERP originated in the manufacturing environment,
today's use of the term ERP systems has much broader scope. ERP systems typically
attempt to cover all basic functions of an organization, regardless of the organization's
business or charter. Businesses, non-profit organizations, non-governmental organizations,
governments, and other large entities utilize ERP systems.

Additionally, it may be noted that to be considered an ERP system, a software package
generally would only need to provide functionality in a single package that would
normally be covered by two or more systems. Technically, a software package that
provides both payroll and accounting functions (such as QuickBooks) would be
considered an ERP software package.

However, the term is typically reserved for larger, more broadly based applications. The
introduction of an ERP system to replace two or more independent applications
eliminates the need for external interfaces previously required between systems, and
provides additional benefits that range from standardization and lower maintenance (one
system instead of two or more) to easier and/or greater reporting capabilities (as all data
is typically kept in one database).

Examples of modules in an ERP which formerly would have been stand-alone
applications include: Manufacturing, Supply Chain, Financials, Customer Relationship
Management (CRM), Human Resources, and Warehouse Management.

Overview
Some organizations - typically those with sufficient in-house IT skills to integrate
multiple software products - choose to only implement portions of an ERP system and
develop an external interface to other ERP or stand-alone systems for their other
application needs. For instance, the PeopleSoft HRMS and Financials systems may be
perceived to be better than SAP's HRMS solution. And likewise, some may perceive
SAP's manufacturing and CRM systems as better than PeopleSoft's equivalents. In this
case these organizations may justify the purchase of an ERP system, but choose to
purchase the PeopleSoft HRMS and Financials modules from Oracle, and their remaining
applications from SAP.

93
This is very common in the retail sector, where even a mid-sized retailer will have a
discrete Point-of-Sale (POS) product and financials application, then a series of
specialised applications to handle business requirements such as warehouse management,
staff rostering, merchandising and logistics.

Ideally, ERP delivers a single database that contains all data for the software modules,
which would include:

Manufacturing
Engineering, Bills of Material, Scheduling, Capacity, Workflow Management,
Quality Control, Cost Management, Manufacturing Process, Manufacturing
Projects, Manufacturing Flow
Supply Chain Management
Inventory, Order Entry, Purchasing, Product Configurator, Supply Chain
Planning, Supplier Scheduling, Inspection of goods, Claim Processing,
Commission Calculation
Financials
General Ledger, Cash Management, Accounts Payable, Accounts Receivable,
Fixed Assets
Projects
Costing, Billing, Time and Expense, Activity Management
Human Resources
Human Resources, Payroll, Training, Time & Attendance, Benefits
Customer Resources and Marketing
Sales and Marketing, Commissions, Service, Customer Contact and Call Center
support
Data Warehouse
and various Self-Service interfaces for Customers, Suppliers, and Employees
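The single-database principle behind the module list above can be sketched with a minimal relational schema. The table and column names below are illustrative assumptions, not the schema of any real ERP product; the point is that all modules share one database, so an entity such as an employee is defined exactly once.

```python
import sqlite3

# A single shared database: every module reads and writes the same tables,
# so each entity (here, an employee) is defined exactly once.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE employee (emp_no INTEGER PRIMARY KEY, name TEXT);                     -- HR module
CREATE TABLE payroll  (emp_no INTEGER REFERENCES employee, salary REAL);           -- Payroll module
CREATE TABLE ledger   (entry_id INTEGER PRIMARY KEY, emp_no INTEGER, amount REAL); -- Financials module
""")

conn.execute("INSERT INTO employee VALUES (1001, 'A. Smith')")
conn.execute("INSERT INTO payroll  VALUES (1001, 4200.0)")

# Financials can join directly against HR data: no external interface is needed.
row = conn.execute(
    "SELECT e.name, p.salary FROM employee e JOIN payroll p ON e.emp_no = p.emp_no"
).fetchone()
print(row)  # ('A. Smith', 4200.0)
```

Because the modules share one store, the cross-module reporting benefit described earlier falls out of a plain SQL join rather than a custom interface.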

Enterprise Resource Planning is a term originally derived from manufacturing resource
planning (MRP II), which followed material requirements planning (MRP). MRP evolved
into ERP when "routings" became a major part of the software architecture and a
company's capacity planning activity also became a part of the standard software activity.
ERP systems typically handle the manufacturing, logistics, distribution, inventory,
shipping, invoicing, and accounting for a company. Enterprise Resource Planning or ERP
software can aid in the control of many business activities, like sales, marketing, delivery,
billing, production, inventory management, quality management, and human resources
management.

ERPs are often incorrectly called back office systems, indicating that customers and the
general public are not directly involved. This is contrasted with front office systems like
customer relationship management (CRM) systems that deal directly with the customers,
or the eBusiness systems such as eCommerce, eGovernment, eTelecom, and eFinance, or
supplier relationship management (SRM) systems.

94
ERPs are cross-functional and enterprise wide. All functional departments that are
involved in operations or production are integrated in one system. In addition to
manufacturing, warehousing, logistics, and Information Technology, this would include
accounting, human resources, marketing, and strategic management.

ERP II refers to an open ERP architecture built from components; the older, monolithic
ERP systems have become component-oriented.

EAS (Enterprise Application Suite) is a newer name for ERP systems that include
(almost) all segments of business and use ordinary Internet browsers as thin clients.

Before

Prior to the concept of ERP systems, departments within an organization would have
their own computer systems. For example, the Human Resources (HR) department, the
Payroll (PR) department, and the Financials department. The HR computer system (often
called an HRMS or HRIS) would typically contain information on the department, reporting
structure, and personal details of employees. The PR department would typically
calculate and store paycheck information. The Financials department would typically
store financial transactions for the organization. Each system would have to rely on a set
of common data to communicate with each other. For the HRIS to send salary
information to the PR system, an employee number would need to be assigned and
remain static between the two systems to accurately identify an employee. The Financials
system was not interested in the employee level data, but only the payouts made by the
PR systems, such as the Tax payments to various authorities, payments for employee
benefits to providers, and so on. This created complications. For instance, a person
could not be paid in the Payroll system without an employee number.
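The pre-ERP arrangement described above can be sketched in a few lines: two independent systems linked only by a shared, static employee number. All names and figures here are illustrative assumptions, not real systems.

```python
# Sketch of the pre-ERP situation: two independent systems that can only
# communicate through a shared, static employee number.
hr_system = {1001: {"name": "A. Smith", "salary": 4200.0}}  # HRIS records
payroll_system = {}                                         # Payroll records

def send_salary_to_payroll(emp_no):
    """The HRIS pushes salary data; the employee number is the only link."""
    if emp_no not in hr_system:
        raise KeyError(f"employee {emp_no} unknown to HRIS")
    payroll_system[emp_no] = hr_system[emp_no]["salary"]

send_salary_to_payroll(1001)
print(payroll_system)  # {1001: 4200.0}

# The complication the text describes: without a matching employee number,
# the person cannot be paid.
try:
    send_salary_to_payroll(2002)
except KeyError as err:
    print("cannot pay:", err)
```

An ERP removes this fragility by keeping a single employee record that all modules consult, as the "After" section explains.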

After

ERP software, among other things, combined the data of formerly disparate applications.
This made the worry of keeping employee numbers in synchronization across multiple
systems disappear. It standardised and reduced the number of software specialities
required within larger organizations.

Best Practices
Best Practices were also a benefit of implementing an ERP system. When implementing
an ERP system, organizations essentially had to choose between customizing the
software or modifying their business processes to the "Best Practice" functionality
delivered in the vanilla version of the software.

Typically, the delivery of best practice applies more usefully to large organizations and
especially where there is a compliance requirement such as IFRS, Sarbanes-Oxley or
Basel II, or where the process is a commodity such as electronic funds transfer. This is

95
because the procedure of capturing and reporting legislative or commodity content can be
readily codified within the ERP software, and then replicated with confidence across
multiple businesses who have the same business requirement.

Where such a compliance or commodity requirement does not underpin the business
process, it can be argued that determining and applying a best practice actually erodes
competitive advantage by homogenizing the business relative to everyone else in its
industry sector.

Evidence for this can be seen within EDI, where the concept of best practice, even with
decades of effort remains elusive. A large retailer, for example, wants EDI plus some
minor tweak that they perceive puts them ahead of their competition. Mid-market
companies adopting ERP often take the vanilla version and spend half as much as the
license cost doing customisations that deliver their competitive edge. In this way they
actively work against best practice because they perceive that the way they operate is best
practice, irrespective of what anyone else is doing.

Implementation
Because of their wide scope of application within a business, ERP software systems
are typically complex and usually impose significant changes on staff work practices
(if they did not, there would be little need to implement them). Implementing ERP
software is typically not an "in-house" skill, so even smaller projects are more cost
effective if specialist ERP implementation consultants are employed. The length of
time to implement an ERP system depends on the size of the business, the scope of
the change and willingness of the customer to take ownership for the project. A
small project (e.g., a company of less than 100 staff) may be planned and delivered
within 3 months; however, a large, multi-site or multi-country implementation may
take years.

The most important aspect of any ERP implementation is that the company who has
purchased the ERP product takes ownership of the project.

To implement ERP systems, companies often seek the help of an ERP vendor or of third-
party consulting companies. These firms typically provide three areas of professional
services: Consulting, Customization and Support.

Consulting Services
The Consulting team is typically responsible for your initial ERP implementation and
subsequent delivery of work to tailor the system beyond "go live". Typically such
tailoring includes additional product training; creation of process triggers and workflow;
specialist advice to improve how the ERP is used in the business; system optimisation;
and assistance writing reports, complex data extracts or implementing Business
Intelligence.

The consulting team is also responsible for planning and jointly testing the
implementation. This is a critical part of the project, and one that is often

96
overlooked.

Consulting for a large ERP project involves three levels: systems architecture,
business process consulting (primarily re-engineering) and technical consulting
(primarily programming and tool configuration activity). A systems architect designs the
overall dataflow for the enterprise including the future dataflow plan. A business
consultant studies an organization's current business processes and matches them to the
corresponding processes in the ERP system, thus 'configuring' the ERP system to the
organization's needs. Technical consulting often involves programming. Most ERP
vendors allow modification of their software to suit the business needs of their customer.

For most mid-sized companies, the cost of the implementation will range from around the
list price of the ERP user licenses to up to twice this amount (depending on the level of
customisation required). Large companies, and especially those with multiple sites or
countries, will often spend considerably more on the implementation than the cost of the
user licenses -- three to five times as much is not uncommon for a multi-site
implementation.

Customisation Services
Customisation is the process of extending or changing how the system works by writing
new user interfaces and underlying application code. Such customisations typically
reflect local work practices that are not currently in the core routines of the ERP system
software.

Examples of such code include early adopter features (e.g., mobility interfaces were
uncommon a few years ago and were typically customised) or interfacing to third party
applications (this is 'bread and butter' customisation for larger implementations as there
are typically dozens of ancillary systems that the core ERP software has to interact with).
The Professional Services team is also involved during ERP upgrades to ensure that
customisations are compatible with the new release. In some cases the functionality
delivered via a previous customisation may have been subsequently incorporated into the
core routines of the ERP software, allowing customers to revert back to standard product
and retire the customisation completely.

Customizing an ERP package can be very expensive and complicated, because many
ERP packages are not designed to support customization, so most businesses implement
the best practices embedded in the acquired ERP system. Some ERP packages are very
generic in their reports and inquiries, such that customization is expected in every
implementation. It is important to recognize that for these packages it often makes sense
to buy third party plug-ins that interface well with your ERP software rather than
reinventing the wheel.

Customisation work is usually undertaken as bespoke software development on a time
and materials basis. Because of the specialist nature of the customisation and the 'one off'
aspect of the work, it is common to pay in the order of $200 per hour for this work. Also,
in many cases the work delivered as customisation is not covered by the ERP vendor's
Maintenance Agreement, so while there is typically a 90-day warranty against software
faults in the custom code, there is no obligation on the ERP vendor to warrant that the

97
code works with the next upgrade or point release of the core product.

One often neglected aspect of customisation is the associated documentation. While it can
seem like a considerable -- and expensive -- overhead to the customisation project, it is
critical that
someone is responsible for the creation and user testing of the documentation. Without
the description on how to use the customisation, the effort is largely wasted as it becomes
difficult to train new staff in the work practice that the customisation delivers.

Maintenance and Support Services


Once your system has been implemented, the consulting company will typically enter
into a Support Agreement to assist your staff in keeping the ERP software running in an
optimal way. A Maintenance Agreement typically provides you with rights to all current
version patches, and both minor and major releases, and will most likely allow your staff
to raise support calls. While there is no standard cost for this type of agreement, they are
typically between 15% and 20% of the list price of the ERP user licenses.
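Using the 15-20% range quoted above, the annual maintenance fee for a given license list price is straightforward to compute. The list price below is a hypothetical figure for illustration only.

```python
def annual_maintenance(list_price, rate):
    """Annual maintenance fee as a fraction of the license list price.

    The 0.15-0.20 bounds reflect the typical range quoted in the text."""
    if not 0.15 <= rate <= 0.20:
        raise ValueError("rate outside the typical 15-20% range")
    return list_price * rate

# Hypothetical license list price, for illustration only.
list_price = 250_000.0
low  = annual_maintenance(list_price, 0.15)
high = annual_maintenance(list_price, 0.20)
print(low, high)  # 37500.0 50000.0
```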

Advantages
In the absence of an ERP system, a large manufacturer may find itself with many
software applications that do not talk to each other and do not effectively interface.
Tasks that need to interface with one another may involve:

• design engineering (how best to make the product)


• order tracking from acceptance through fulfillment
• the revenue cycle from invoice through cash receipt
• managing interdependencies of complex Bill of Materials
• tracking the 3-way match between Purchase orders (what was ordered), Inventory
receipts (what arrived), and Costing (what the vendor invoiced)
• the Accounting for all of these tasks, tracking the Revenue, Cost and Profit on a
granular level.
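The 3-way match in the list above is itself a small algorithm: the purchase order, the goods receipt, and the vendor invoice must agree before the invoice is approved for payment. The sketch below uses illustrative field names and zero tolerances; real ERP systems typically allow configurable tolerance bands.

```python
# Minimal sketch of the 3-way match: what was ordered, what arrived, and
# what the vendor invoiced must all agree before payment is approved.
def three_way_match(po, receipt, invoice, qty_tol=0, price_tol=0.0):
    """Return True when ordered, received, and invoiced details all agree."""
    return (abs(po["qty"] - receipt["qty"]) <= qty_tol
            and abs(po["qty"] - invoice["qty"]) <= qty_tol
            and abs(po["unit_price"] - invoice["unit_price"]) <= price_tol)

po      = {"qty": 100, "unit_price": 9.95}
receipt = {"qty": 100}
invoice = {"qty": 100, "unit_price": 9.95}
print(three_way_match(po, receipt, invoice))      # True

overbilled = {"qty": 100, "unit_price": 10.45}    # vendor invoiced a higher price
print(three_way_match(po, receipt, overbilled))   # False
```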

Change how a product is made, in the engineering details, and that is how it will now be
made. Effective dates can be used to control when the switchover from an old version to
the next will occur, both the date that some ingredients go into effect and the date that
some are discontinued. Part of the change can include labeling to identify version numbers.

Computer security is included within an ERP to protect against both outsider crime, such
as industrial espionage, and insider crime, such as embezzlement. A data tampering
scenario might involve a terrorist altering a Bill of Materials so as to put poison in food
products, or other sabotage. ERP security helps to prevent such abuse as well.

Disadvantages
Many problems organizations have with ERP systems are due to inadequate
investment in ongoing training for involved personnel, including those
implementing and testing changes, as well as a lack of corporate policy protecting
the integrity of the data in the ERP systems and how it is used.

Limitations of ERP include:

98
• Success depends on the skill and experience of the workforce, including training
about how to make the system work correctly. Many companies cut costs by
cutting training budgets. Privately owned small enterprises are often
undercapitalized, meaning their ERP system is often operated by personnel with
inadequate education in ERP in general, such as APICS foundations, and in the
particular ERP vendor package being used.
• Personnel turnover; companies can employ new managers lacking education in
the company's ERP system, proposing changes in business practices that are out
of synchronization with the best utilization of the company's selected ERP.
• Customization of the ERP software is limited. Some customization may involve
changing of the ERP software structure which is usually not allowed.
• Re-engineering of business processes to fit the "industry standard" prescribed by
the ERP system may lead to a loss of competitive advantage.
• ERP systems can be very expensive to install.
• ERP vendors can charge annual license renewal fees that are unrelated to the size
of the company using the ERP or its profitability.
• Technical support personnel often give replies to callers that are inappropriate for
the caller's corporate structure. Computer security concerns arise, for example
when telling a non-programmer how to change a database on the fly, at a
company that requires an audit trail of changes so as to meet some regulatory
standards.
• ERPs are often seen as too rigid and too difficult to adapt to the specific workflow
and business process of some companies—this is cited as one of the main causes
of their failure.
• Systems can be difficult to use.
• The system can suffer from the "weakest link" problem—an inefficiency in one
department or at one of the partners may affect other participants.
• Many of the integrated links need high accuracy in other applications to work
effectively. A company can achieve minimum standards, then over time "dirty
data" will reduce the reliability of some applications.
• Once a system is established, switching costs are very high for any one of the
partners (reducing flexibility and strategic control at the corporate level).
• The blurring of company boundaries can cause problems in accountability, lines
of responsibility, and employee morale.
• Resistance in sharing sensitive internal information between departments can
reduce the effectiveness of the software.
• There are frequent compatibility problems with the various legacy systems of the
partners.
• The system may be over-engineered relative to the actual needs of the customer.

99
Knowledge management
Knowledge Management refers to a range of practices used by organisations to identify,
create, represent, and distribute knowledge for reuse, awareness, and learning across the
organisation.

Knowledge Management programs are typically tied to organisational objectives and are
intended to lead to the achievement of specific outcomes, such as shared intelligence,
improved performance, competitive advantage, or higher levels of innovation.

Knowledge transfer (one aspect of Knowledge Management) has always existed in one
form or another. Examples include on-the-job peer discussions, formal apprenticeship,
corporate libraries, professional training, and mentoring programs. However, since the
late twentieth century, additional technology has been applied to this task, such as
knowledge bases, expert systems, and knowledge repositories.

Knowledge Management programs attempt to manage the process of creation or identification, accumulation, and application of knowledge or intellectual capital across an organisation. Knowledge Management, therefore, attempts to bring under one set of practices various strands of thought and practice relating to:

• intellectual capital and the knowledge worker in the knowledge economy;
• the idea of the learning organization;
• various enabling organizational practices, such as Communities of Practice and corporate Yellow Page directories for accessing key personnel and expertise;
• various enabling technologies, such as knowledge bases and expert systems, help desks, corporate intranets and extranets, Content Management, wikis, and Document Management.

While Knowledge Management programs are closely related to Organizational Learning initiatives, Knowledge Management may be distinguished from Organizational Learning by its greater focus on the management of specific knowledge assets and on the development and cultivation of the channels through which knowledge flows.

The emergence of Knowledge Management has generated new organisational roles and
responsibilities, an early example of which was the Chief Knowledge Officer. In recent
years, Personal knowledge management (PKM) practice has arisen in which individuals
apply KM practice to themselves, their roles in the organisation, and their career
development.

While it has been applied to all industrial sectors and increasingly to Government,
Knowledge Management is a continually evolving discipline, with a wide range of
contributions and a wide range of views on what represents good practice in Knowledge
Management.

Approaches to Knowledge Management
There is a broad range of thought on Knowledge Management, with no unanimous definition currently accepted or likely to emerge. Approaches vary by author and school. For example, Knowledge Management may be viewed from each of the following perspectives:

• Techno-centric: a focus on technologies, ideally those that enhance knowledge sharing and growth; in practice, often any technology that manipulates information.
• Organizational: How does the organization need to be designed to facilitate
knowledge processes? Which organizations work best with what processes?
• Ecological: Seeing the interaction of people, identity, knowledge and
environmental factors as a complex adaptive system.
• Combinatory: Combining more than one of the above approaches where possible
without contradiction.

In addition, as the discipline matures, academic debates within epistemology are increasingly present in both the theory and practice of knowledge management. UK and Australian standards bodies have both produced documents that attempt to bound and scope the field, but these have received limited acceptance or awareness.

Schools of thought in knowledge management


There are a variety of schools of thought in Knowledge Management. Examples include the Intellectual Capital movement, associated with Professor Nick Bontis, Professor Leif Edvinsson, and Tom Stewart (formerly of Fortune Magazine, currently of Harvard Business Review); a body of work derived from information theory, associated with Prusak and Davenport; complexity approaches associated with Snowden (see Cynefin); and narrative approaches associated with Denning, Snowden, Boje, and others. One school takes forward the ideas of Popper (McElroy and Firestone). The schools are many and varied, and the examples above are illustrative rather than exhaustive.

Key concepts in knowledge management


Tacit versus explicit knowledge

A key distinction made by the majority of knowledge management practitioners is Nonaka's reformulation of Polanyi's distinction between tacit and explicit knowledge. The former is often subconscious and internalized; the individual may or may not be aware of what he or she knows and how he or she accomplishes particular results. At the opposite end of the spectrum is conscious, or explicit, knowledge: knowledge that the individual holds explicitly and consciously in mental focus and may communicate to others. In the popular form of the distinction, tacit knowledge is what is in our heads, and explicit knowledge is what we have codified.

Nonaka and Takeuchi (1995) argued that a successful KM program needs, on the one hand, to convert internalized tacit knowledge into explicit codified knowledge in order to share it, and, on the other hand, to enable individuals and groups to internalize and make personally meaningful the codified knowledge they retrieve from the KM system.

The focus upon codification and management of explicit knowledge has allowed knowledge management practitioners to appropriate prior work in information management, leading to the frequent accusation that knowledge management is simply a repackaged form of information management (e.g., Wilson, T.D. (2002), "The nonsense of 'knowledge management'").

Critics have however argued that Nonaka and Takeuchi's distinction between tacit and
explicit knowledge is oversimplified, and even that the notion of explicit knowledge is
self-contradictory.

Other commonly used types of knowledge include embedded knowledge (knowledge which has been incorporated into an artifact of some type; for example, a tool has knowledge embedded in its design) and embodied knowledge (knowledge as a learned capability of the body's nervous, chemical, and sensory systems). These two types, while frequently used, are not universally accepted, any more than is the distinction between tacit and explicit.

The latest wave of Enterprise 2.0 social computing tools provides a more unstructured approach to information exchange and the development of new forms of community within and beyond the organisation. However, such tools are still based on text and are thus explicit in nature. An additional challenge these tools face is how to distill meaningful, re-usable knowledge from the variety of other content captured in tools such as blogs, wikis, and TWiki.

Knowledge capture stages

Knowledge may be accessed, or captured, at three stages: before, during, or after knowledge-related activities.

For example, individuals undertaking a new project for an organization might access
information resources to learn best practices and lessons learned for similar projects
undertaken previously, access relevant information again during the project
implementation to seek advice on issues encountered, and access relevant information
afterwards for advice on after-project actions and review activities. Knowledge
management practitioners offer systems, repositories, and corporate processes to
encourage and formalize these activities.

Similarly, knowledge may be captured and recorded before project implementation, for example as the project team learns lessons during the initial project analysis. Lessons learned during project operation may likewise be recorded, and after-action reviews may lead to further insights and lessons being recorded for future access.
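The before/during/after capture cycle described above can be illustrated with a small sketch. The Python below is a hypothetical, in-memory toy: the stage names match the three stages in the text, but the class and its fields are assumptions, and a real KM system would persist entries and support far richer search.

```python
# Illustrative lessons-learned repository keyed by capture stage.
# The class, fields, and sample entries are hypothetical.

STAGES = ("before", "during", "after")


class LessonsRepository:
    def __init__(self):
        self._entries = []

    def record(self, project, stage, lesson):
        # Reject stages outside the before/during/after cycle.
        if stage not in STAGES:
            raise ValueError(f"stage must be one of {STAGES}")
        self._entries.append({"project": project, "stage": stage, "lesson": lesson})

    def lessons_for(self, project, stage=None):
        """Retrieve a project's lessons, optionally filtered by stage."""
        return [e["lesson"] for e in self._entries
                if e["project"] == project
                and (stage is None or e["stage"] == stage)]


repo = LessonsRepository()
repo.record("ERP rollout", "before", "Review data-cleansing lessons from prior rollouts.")
repo.record("ERP rollout", "during", "Escalate interface defects within one day.")
repo.record("ERP rollout", "after", "Document customisations for the support team.")

print(repo.lessons_for("ERP rollout", "during"))
```

The stage tag is what lets a later project team ask targeted questions such as "what did previous teams learn during implementation?", which is the retrieval pattern the preceding paragraphs describe.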

Different organizations have tried various knowledge capture incentives, including making content submission mandatory and incorporating rewards into performance measurement plans. There is controversy over whether such incentives work in this field, and no firm consensus has emerged.

Ad hoc knowledge access

An alternative to encoding knowledge into, and retrieving it from, a knowledge repository such as a database is for individuals to access experts on an ad hoc basis, as needed, with their knowledge requests. A key benefit of this strategy is that the expert's response is rich in content, contextualised to the particular problem being addressed, and personalized to the person or people addressing it. The downside, of course, is that the strategy is tied to the availability and memories of specific individuals in the organisation. It does not capture their insights and experience for future use should they leave or become unavailable, and it does not help when experts' memories of technical issues or problems previously faced change over time. The emergence of narrative approaches to knowledge management attempts to provide a bridge between the formal and the ad hoc by allowing knowledge to be held in the form of stories.

Drivers of knowledge management


There are a number of 'drivers', or motivations, leading to organizations undertaking a
knowledge management program.

Perhaps first among these is to gain the competitive advantage that comes with improved
or faster learning and new knowledge creation. Knowledge management programs may
lead to greater innovation, better customer experiences, consistency in good practices and
knowledge access across a global organization, as well as many other benefits, and
knowledge management programs may be driven with these goals in mind.

Considerations driving a knowledge management program might include:

• making increased knowledge content available in the development and provision of products and services
• achieving shorter new product development cycles
• facilitating and managing organizational innovation
• leveraging the expertise of people across the organization
• benefiting from 'network effects' as the number of productive connections between employees in the organization increases and the quality of information shared increases
• managing the proliferation of data and information in complex business environments, allowing employees to rapidly access useful and relevant knowledge resources and best practice guidelines
• facilitating organizational learning

• managing intellectual capital and intellectual assets in the workforce (such as the expertise and know-how possessed by key individuals) as individuals retire and new workers are hired
• responding to a convincing sales pitch from one of the many consulting firms promoting Knowledge Management as a solution to virtually any business problem, such as loss of market share, declining profits, or employee inefficiency

Knowledge management enablers


Historically, there have been a number of technologies 'enabling' or facilitating
knowledge management practices in the organization, including expert systems,
knowledge bases, various types of Information Management, software help desk tools,
document management systems and other IT systems supporting organizational
knowledge flows.

The advent of the Internet brought further enabling technologies, including e-learning, web conferencing, collaborative software, content management systems, corporate 'Yellow Pages' directories, email lists, wikis, and blogs. Each enabling technology can expand the level of inquiry available to an employee while providing a platform to achieve specific goals or actions. The practice of KM will continue to evolve with the growth of collaboration applications made available by IT and through the Internet. Since its adoption by the mainstream population and the business community, the Internet has led to an increase in creative collaboration, learning and research, e-commerce, and instant information.

There are also a variety of organisational enablers for knowledge management programs, including Communities of Practice; before-, during-, and after-action reviews (see After Action Review); peer assists; information taxonomies; and coaching and mentoring.

Knowledge management roles and organizational structure

Knowledge management activities may be centralized in a Knowledge Management Office, or responsibility for knowledge management may be located in existing departmental functions, such as Human Resources (to manage intellectual capital) or the IT department (for content management, social computing, etc.). Different departments and functions may each have a knowledge management function, and those functions may not be connected other than informally.

Knowledge management lexicon


Knowledge management professionals may use a specific lexicon in order to articulate
and discuss the various issues arising in Knowledge Management. For example, terms
such as intellectual capital, metric, and tacit vs explicit knowledge typically form an
indispensable part of the knowledge management professional's vocabulary.
