Chapter 6

Automation, Business Process Reengineering and Client Server Technology:
A Three Stage Model of Organizational Change

Margaret T. O’Hara
University of Georgia

Richard T. Watson
University of Georgia

The idea of reengineering a business is not new. Although the
current focus is on reengineering business processes, the concept of
reinventing the business has been around at least since the turn of
the century. Reengineering is critical to a firm’s survival during
certain periods when there are major upheavals in the economy that
threaten the firm’s existence. In today’s chaotic business environ-
ment, many firms that do not reinvent themselves are doomed to
become part of business history.
In this chapter, we have two themes. First, we present a three-
stage model that illustrates the role of business process reengineering
(BPR) in organizational transformation. Then, we discuss why
client/server technology can have a significant impact on
reengineering management. We demonstrate how client/server
technology can be used both to reengineer work processes and to
redesign work management. We argue that client/server computing
has a unique capability to change how work is managed and, as such,
is an important agent in restructuring organizations to a form
appropriate to an information era.

BPR—An Historical Perspective

Background

The current emphasis on business process reengineering began
in 1990, with Hammer’s article in Harvard Business Review. Ham-
mer defined BPR as using “the power of modern information technol-
ogy to radically redesign our business processes in order to achieve
dramatic improvements in their performance” (Hammer 1990, p.
104). This definition has evolved to the one in vogue now:

“the fundamental rethinking and radical redesign of business
processes to achieve dramatic improvement in critical, contem-
porary measures of performance, such as cost, quality, service,
and speed.” (Hammer and Champy, 1993, p. 32).

The more recent definition downplays information technology’s role
in reengineering and adds a measurement dimension. Basic to these
and other definitions (Hunt 1993, Davenport 1993, Morris and
Brandon 1993) are the concepts of radical redesign and dramatic
improvement. If either of these concepts is missing, the effort falls
short of reengineering.
Recent interest in BPR represents the resurgence of a concept
that seems to swing in and out of fashion. There is evidence of
widespread reengineering efforts to increase blue–collar productivity
early this century. Interestingly, this earlier reengineering effort
began approximately fifty years after the start of the industrial era,
and today’s reengineering efforts are occurring approximately fifty
years after the start of the information era.
We see some clear parallels between the industrial and informa-
tion eras (Table 1). Our analysis of the industrial era indicates the
existence of three stages, each of which represents a different focus
for the organization. While some organizations may be firmly within
a single stage, it is much more likely that a firm will straddle two
stages at once. As our case data will indicate, an organization often
continues to automate some processes concurrently with a search for
ways to revolutionize other processes.

Table 1: Stages of the Industrial and Information Eras

The Industrial Era

During stage one of the industrial era, machines such as the steam
engine and the internal combustion motor had simple applications:
replace human or animal energy sources. Attention was not
given to radical improvement of work; there was no need. Large
productivity gains were easily reaped through the straightforward
substitution of machine for human. For instance, the horse–drawn
carriage was replaced by a truck. The truck was faster, required less
attention, and was not subject to the vagaries of the animal mind.
After replacing workers, firms searched for ways to redesign
work processes that existed before machines were used. This search
signaled the entrance of firms into Stage Two of the industrial era,
and sparked the concept of scientific management. Scientific
management, concerned with managing work and workers, offered
a radical departure from past practices, and Frederick Taylor played
a dominant role in this departure. We suspect that, were Taylor
reincarnated today, he would sound much like Hammer and Champy
in their preaching for radical departures from existing mindsets.
Taylor believed that productivity improvements would result
from following certain basic principles. One principle was to develop
a science of management to identify the best method for performing
a task (Stoner and Freeman 1989). To succeed, Taylor felt that labor
and management must undergo a “complete mental revolution.” He
carefully examined work methods, and redesigned them to achieve
radical results. For example, Bethlehem Steel increased the loading
and unloading of steel from 12.5 tons to 47.5 tons per day (a radical,
nearly fourfold improvement) by following Taylor’s methods. At another firm,
the work force was reduced to 35 from 120 while accuracy improved
by over 60% (Daft 1991). Thus, many firms changed the way in which
work was accomplished because of Taylor’s concept of scientific
management.
After changing their work methods, firms still needed to develop
a way to manage the new organizations that evolved from the
industrial revolution. In effect, firms needed to learn how to manage
industrial work. For many organizations, the pattern was similar.
Chandler (1962) presents this progression in his classic study of
organizational histories. First, firms vertically integrated. Vertical
integration created the need for a different management structure,
since each firm now performed several different functions. Thus, the
functional organization emerged, and areas such as finance, produc-
tion, and marketing were separately managed. Finally, as firms grew
larger and diversified, the multidivisional (M–form) organizational
structure emerged.
Moving to the M–form organization was not a trivial task.
Chandler’s work clearly depicts the struggle that corporations such
as Sears, General Motors, and Du Pont endured as they tried to move
into an M–form organization. Often it is only during times of crisis,
when the organization is threatened, that it will overcome the
barriers to change (Goss, Pascale, and Athos, 1993) and, in many
cases, this was the compelling reason for the adoption of the M–form.
Once adopted, however, the M–form proved an effective way for large
firms to manage work in the industrial age. For large organizations,
it has remained the dominant organizational form, and most large
firms have adopted it. Firms that have not adopted the M–form have
no need to—either because their core function is relatively simple
(such as mining companies) or because they are in highly customized
market niches (Hunt 1993).

The Information Era

The information era is also unfolding in three stages. Schein
(1994) refers to these stages as automating, informating (Zuboff
1988), and transforming the organization. In Stage One, productiv-
ity gains were made by replacing clerical workers with data process-
ing systems. The success of microcomputer word processing and
spreadsheet software is a demonstration of how information–era
technology, like industrial–era technology, was initially used to
harvest obvious gains by replacing workers or highly leveraging
existing work.
Technology, whether industrial– or information–based, enables
an organization to take on new functions, enter new markets, and
generally expand its operational scope. Given such expansion and
change, existing management structures often become ineffective
and even dysfunctional. Furthermore, the nature of work within the
firm also changes. Just as the unskilled manual laborer of the early
industrial era became a skilled blue–collar worker, so did the high
school graduate, a clerical worker of the early information era,
become a knowledge worker with an MBA in Stage Two. In this stage,
the focus shifts to redesigning work to realize a further round of
productivity gains. Thus, today’s BPR and total quality management
(TQM) efforts parallel the industrial era’s scientific management.
In an era when 70% of workers were employed in manufactur-
ing, Taylor’s scientific management concepts worked to improve
blue–collar productivity. Today, 70% of workers are in the service
sector, so focusing attention on improving white–collar productivity
makes sense (Ginzberg 1982). Additionally, we have achieved the
obvious gains from replacing workers with computers; future pro-
ductivity gains will come from radically redesigning work. Thus, the
current interest in BPR represents a transition of focus from Stage
One to Stage Two of the information era.
In the 1990s, individuals and organizations must be given the
tools to “work smarter” by better managing complexity (Schrage,
1993). To manage this complexity, firms must eventually enter the
third stage of the information era. They must redesign not only their
processes, but also themselves. In an intensely competitive and
global world, leading edge companies are already confronting the
third stage of the information era. These organizations are finding
that they need a new way to manage work. Functional and
multidivisional organizations are often not appropriate to the infor-
mation age; they hinder rather than enhance the rapid exchange of
information. Many enlightened organizations are already moving
beyond reengineering business processes to total organizational
transformation. These organizations are seeking new ways to
organize work, new ways to communicate, and new authority
relationships (Schein 1994).
There is no clear demarcation between the stages of the informa-
tion era; many firms are in transition from one stage to the other.
Specifically, Stages Two and Three may be intertwined: as busi-
nesses continue to reengineer selected processes they often begin to
reengineer their basic structures, thus leading to organizational
transformation –– Stage Three of the information era.
Being in the midst of this revolution makes it extremely difficult
to identify what the new organizational form will be. After all,
Chandler’s analysis was not completed until long after the M–form
had emerged as a successful solution to managing industrial–era
work. Nevertheless, current trends seem to indicate that managing
work in an information era involves creating flatter organizations,
autonomous work groups, and the devolution of power.
Scholars and early experience suggest that the M–form struc-
ture of the industrial era is giving way to a new design centered on
creating learning organizations and collaborative work teams. The
new organizational form is perhaps best encapsulated in the word
empowerment. Empowerment occurs when employees who perform
certain jobs are given the responsibility for determining how those
jobs should be done. It assumes that the people doing the work are
those most informed about how best to do the work and how best to
improve it (Pine 1993).
There are four keys to empowerment: information, knowledge,
skills, and control (Taylor and Felton 1993). First, employees must
have the knowledge and skills to perform their jobs, or they must be
able (and authorized and willing) to obtain what they need. Second,
employees must have available the information needed to perform
their jobs. This leads to less closely–guarded data within the
organization and implies that information sharing is commonplace
and encouraged. Finally, empowered workers have control over their
jobs. They can make decisions regarding any aspect of the task and
expect to be supported by upper management in those decisions.
Empowered workers are actively engaged in their jobs and in
first developing and then fulfilling organizational goals. They have
the authority to form autonomous teams with self–defined roles and
structure. Empowering workers leads to radically different organi-
zations.
In these new organizations, decision making and responsibility
are diffused so that the organization can react rapidly to environmen-
tal change and customer demands. The new organization is highly
responsive and adaptive. Intelligence and control are no longer
concentrated in divisional offices; they exist in all the tentacles of the
organization. Information technology (IT) can play a key role in
forming and sustaining the transformed organization.

The Link Between BPR and Information Technology

General Information Technology Enablers

Most current literature on business process reengineering acknowl-
edges the role that information technology (IT) must play in the
procedure. Some view specific information technologies, such as
work–group computing, as enabling organizations “to go beyond
traditional time and motion studies to reengineer business processes
and refocus organizations” (Tapscott and Caston, 1993a, p. 41).
Davenport and Short (1990) note that:

“information technology should be viewed as more than an
automating or mechanizing force; it can fundamentally reshape
the way business is done.” (p. 12)
And, as previously noted, Hammer (1990) suggested that businesses
use information technology to reengineer. Thus, it is clear that
information technology and BPR are intricately linked in the redesign
of work processes. It is also clear that BPR represents a transition
from Stage One—automating work—to Stage Two—redesigning
work.
Many of today’s business processes were developed and institu-
tionalized long before computers became the powerful tools they now
are. The true challenge in Stage Two is to redesign work processes
around existing and future information technologies to maximize
their potential. Thus, IT is viewed as an enabler in business
process reengineering rather than a solution in itself. For Hammer
and Champy (1993), IT is an “essential enabler” of any reengineering
effort. They caution, however, that “throwing computers” at existing
business problems does not lead to a reengineered business.
There are numerous examples of information technology being
used to enable BPR. In a current reengineering project, a firm chose
to reengineer its business communications processes, including the
mailroom. A survey of mailroom operations revealed that a large
percentage of the incoming mail handled actually consists of
inter–office memos. By implementing an electronic mail system to
handle such communications, the firm will eliminate this particular
mailroom process. Remaining mail will be scanned using an
electronic imaging system and distributed to the addressee electroni-
cally, largely eliminating handling (Kelly and Daniel 1994).
Electronic data interchange has often been cited as a technology
that redesigns work. Some organizations that conduct a large part
of their business electronically have completely eliminated, or greatly
reduced, entire departments and functions. When the management
at General Motors designed the operations for the new Saturn plant,
they worked the concept of EDI directly into the design and elimi-
nated the need for complex purchasing systems (Hammer and
Champy 1993).
While electronic mail, imaging, and EDI are good examples of
technologies that enable firms to reengineer by helping them rede-
sign work, none of these technologies is able to radically change the
manner in which work is managed. We maintain that the technology
that does this is client/server computing. While client/server
technology may be used simply to automate work or reduce costs
(Stage One enabler), it may also be used to redesign business
processes (Stage Two enabler). Furthermore, because this technol-
ogy parallels the organizational goals of empowerment and coopera-
tion by distributing information and processing power closer to the
user (Tapscott and Caston 1993a), it may also be a Stage Three
enabler.

Client/Server Computing

Client/server computing has been defined in a number of ways
(Huff 1992, Tapscott and Caston 1993a). The typical client/server
model rests on a simple division of labor: the client makes requests, and
the server fulfills them. The model has three components: data
management, application, and presentation. These components can
be distributed to the clients and servers in a variety of ways; however,
the presentation software usually resides entirely
on the client workstation. Application software may reside on either
the client or the servers, or it may be split among them. The data
management software usually resides only on the servers although
client workstations may have some software for data management.
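To make this division of labor concrete, the following sketch (our illustration, not drawn from any of the firms discussed later) pairs a tiny server that owns the data management component with a client that handles presentation. The account numbers, balances, and one-line request format are invented for the example.

```python
# A minimal, hypothetical sketch of the client/server split: the server owns
# the data (data management), the client formats the reply (presentation).
# Account numbers, balances, and the "BALANCE <account>" request line are
# invented for illustration only.
import socket
import threading
import time

ACCOUNTS = {"1001": 2500.00, "1002": 74.15}   # data management stays on the server


def serve(host: str = "127.0.0.1", port: int = 9090) -> None:
    """Answer a single 'BALANCE <account>' request and close."""
    with socket.create_server((host, port)) as srv:
        conn, _ = srv.accept()
        with conn:
            parts = conn.recv(1024).decode().split()
            if len(parts) == 2 and parts[0] == "BALANCE":
                reply = str(ACCOUNTS.get(parts[1], "UNKNOWN"))
            else:
                reply = "BAD REQUEST"
            conn.sendall(reply.encode())


def client(account: str, host: str = "127.0.0.1", port: int = 9090) -> None:
    """Send the request; presentation of the result stays on the client."""
    with socket.create_connection((host, port)) as sock:
        sock.sendall(f"BALANCE {account}".encode())
        reply = sock.recv(1024).decode()
    print(f"Account {account}: {reply}")


if __name__ == "__main__":
    threading.Thread(target=serve, daemon=True).start()
    time.sleep(0.2)            # give the server a moment to start listening
    client("1001")             # prints: Account 1001: 2500.0
```

In practice the application component can sit on either side, or be split between them, which is exactly the flexibility described above.
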
The client hardware is normally a workstation, equipped with a
graphical user interface (GUI). In some cases, the client workstation
may be equipped with enough software to handle certain requests
without server involvement. This software usually includes a
standard office automation package (word processor, spreadsheet,
and graphics) and may also include a statistical analysis package.
Application software and processing power are often distributed
among many networked systems; thus, in theory, each computer
system can be configured to perform its job most efficiently.

Table 2: Benefits and Costs of Client/Server Computing

Anecdotal evidence suggests that vast benefits are possible from client/
server computing, as well as a number of significant drawbacks
(Table 2). There has been much debate as to whether client/server
reduces costs when compared to a mainframe–based system. The
client workstation’s GUI normally provides some cost savings from
ease of use, reduced user training time, and productivity gains.
Typically, client/server user interfaces, such as Windows, Presenta-
tion Manager, and Macintosh, overcome many problems associated
with the less friendly, character–based interfaces of older operating
systems. Yet, many firms report increased costs associated with
managing the more complex networks and with training the informa-
tion systems personnel.
For many firms that have moved to client/server technology, the
primary benefits have come not from reduced costs, but in other
areas. A move to client/server may mean escape from an undue
reliance on the products of a single vendor. While true open systems
have not yet arrived, client/server technology is moving in that
direction more rapidly than its mainframe counterparts.
Another advantage of client/server is that it facilitates organi-
zations getting closer to their customers. As more customers request
information suited specifically to them, organizations must find a
way to meet those demands. Decision making authority must be
placed closer to the customer, and client/server computing is the
way to do this (Dodge 1993).
Client/server technology is not without its drawbacks. Early
movers experienced unreliable networks, difficult–to–manage operat-
ing systems, and a lack of the much–touted development tools. Tasks
such as data security, backup and recovery, and database manage-
ment are often more difficult in a client/server environment, simply
because there usually are more machines and links that can fail
(Gillin 1993).
Client/server technology represents more of a “philosophy of
operation” than a strictly defined architecture. The nature of client/
server is such that it can be configured differently for different firms,
thus offering the flexibility needed in a rapidly changing business
environment (Datamation 1994).

Case Studies

Data from three firms that are moving to or have moved to a
client/server computing environment are presented. Two of the
firms are banks; examining their move to client/server computing is
especially interesting because the banking industry was largely
responsible for the move to large mainframe systems twenty years
ago (Radding 1992). The third is a large collection agency. Specifics
about each firm are disguised so that confidentiality is maintained.
In two of the cases, the reasons for considering a move to client/
server and the degree of senior level management involvement are
similar. This suggests that the firms may be in the same stage of the
information era. In the first case, client/server was initially thought
of as a way to automate current practices and possibly eliminate the
mainframe. In Cases Two and Three, client/server technology is
redesigning work management and enabling the transformation of
the organization.

Case One: Large Bank (LB)

Through recent acquisitions and mergers, LB has grown consid-
erably during the last ten years. It now ranks among the ten largest
banks in the United States. The computing environment is strictly
mainframe, and the systems are totally outsourced. Billing for
mainframe usage is based on a sliding scale per processing cycle
used. At the branch offices, not every teller has a personal computer;
deposit and withdrawal information is not on–line. Currently, there
is a project underway to bring more technology to the branches,
eliminating much of the batch processing. An IS staff is maintained
at LB’s corporate office to manage financial production and report-
ing. There are three primary components of the reporting system: the
database management side, the development/ad hoc reporting
section, and the consolidation reporting department. It was the ad
hoc reporting group that first looked at a move to client/server
technology.
Production reporting at LB currently takes place in a COBOL
environment. The production system produces all standard reports
used by the financial group. A 4GL is used to run the ad hoc
reporting. The product is perceived as a resource hog, and the ad
hoc group is often criticized for its consumption of mainframe
processing cycles. Another difficulty with the current system is that
it runs under an operating system that has not been supported for
nearly ten years. LB cannot add any more users to the system;
therefore, they must make some change to it.
The ad hoc group’s original goal was to find a mainframe
replacement that would give identical functionality, use the same
programs with minimal changes, provide all existing services at a
substantially reduced cost and better performance level, and offer
more control than the outsourced system. According to the manager
for the ad hoc group, the current production system is “very
bureaucratic; changes require an Act of Congress.” Like many IS
departments, the ad hoc group has a workaround for this situation.
Ad hoc reports are not put into production, but are instead kept in
the ad hoc group’s own libraries.
Once the need for a new system was identified, the group asked
three large vendors to present proposals. At this point, a client/
server solution was not specifically requested. Somewhere during
the proposal stage, however, the idea of client/server came to the
forefront. As a manager recalls, “One of the vendors had a client/
server product he wanted to sell.”
When the first vendor was ready with a proposal, they presented
it to the ad hoc group. Phase One of the project (for which the vendor
assumed all costs) was scheduled to last five days. The first two days
would be used to download data to the servers; a rehearsed script of
tests was to be conducted on the last three days.
Loading the data onto the servers took two weeks. Problems
occurred for several reasons: programs written to convert packed
fields did not work correctly, some key players were not available
during the scheduled cutover week, and the software did not perform
as expected. The test results were not spectacular, as processing on
the servers was not faster than on the mainframe. Nonetheless, the
group believed that the move to client/server was justified because
mainframe savings were projected to be two million dollars per year.
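One of the pilot's stumbling blocks was converting packed fields, the compressed numeric format common in COBOL mainframe files. As an illustration of what such a conversion involves (this is our own generic sketch, not the vendor's or LB's conversion program), a packed-decimal field can be decoded as follows.

```python
# Illustrative decoder for an IBM packed-decimal (COMP-3) field, the kind of
# "packed field" conversion that tripped up the pilot. A generic sketch only.
def unpack_comp3(raw: bytes, scale: int = 0) -> float:
    """Decode packed decimal: two BCD digits per byte, sign in the last nibble."""
    digits, sign = [], 1
    for i, byte in enumerate(raw):
        high, low = byte >> 4, byte & 0x0F
        digits.append(high)
        if i == len(raw) - 1:
            sign = -1 if low == 0x0D else 1      # 0xD marks a negative value
        else:
            digits.append(low)
    if any(d > 9 for d in digits):
        raise ValueError("invalid BCD digit in packed field")
    value = 0
    for d in digits:
        value = value * 10 + d
    return sign * value / (10 ** scale)


# 0x12 0x34 0x5C packs +12345; with two implied decimal places that is 123.45
assert unpack_comp3(bytes([0x12, 0x34, 0x5C]), scale=2) == 123.45
```
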
The outsourcing contract, however, was not favorable to the
change. Due to the sliding billing scale, the loss of the ad hoc group’s
mainframe cycles each month actually resulted in an overall cost
increase for LB. The group then tried to justify the new system not
as a savings but as a wise idea because it meant a move to newer
technology. This attempt failed because the system would have
solved only the ad hoc group’s problems, and there was trouble in the
production group as well. The COBOL–based system locked all files
during any processing, and the production group often could not gain
access to the data. Consequently, every general ledger account was
stored in three files –– in the general ledger itself, in the production
system, and in the ad hoc system. Maintaining common, current
versions on three systems was a nightmare.
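The billing arithmetic that undercut the cost case is worth a brief illustration. The numbers below are invented, but they show one way a volume-banded contract like LB's sliding scale can price all remaining cycles at a higher rate once one group's usage disappears, so that a smaller workload produces a larger bill.

```python
# Hypothetical illustration (all numbers invented) of a volume-banded
# "sliding scale" contract: the rate of the band the monthly total reaches
# is applied to every cycle, so losing one group's cycles can raise the bill.

BANDS = [          # (minimum monthly cycles, price per cycle)
    (10_000_000, 0.010),
    (5_000_000, 0.013),
    (0, 0.016),
]


def monthly_bill(cycles: int) -> float:
    """Price all cycles at the rate of the band the total volume reaches."""
    for threshold, rate in BANDS:
        if cycles >= threshold:
            return cycles * rate
    return 0.0


before = monthly_bill(11_000_000)              # ad hoc group still on the mainframe
after = monthly_bill(11_000_000 - 2_500_000)   # its cycles moved to the servers
print(before, after)   # 110000.0 110500.0 -> fewer cycles, larger bill
```
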
Ultimately, upper management decided to invest the time and
money needed to develop a new system that would solve the problems
in both groups, and a development project is currently underway at
LB. As to whether the new system will use client/server technology,
the manager said, “People don’t really care if it’s client/server or not.
All they want is to have all the functions they need. No one cares
about the technical architecture.” Yet, the new system will most
likely be a client/server system for several reasons. LB’s information
systems management staff believe that a client/server system would
keep up with technology and provide a “cohesive, integrated plat-
form.” They also believe that a major advantage of a client/server
system is its flexibility. Users will have more flexible tools to build
applications that will do what needs to be done. These tools will be
easier to use and information will be easier to obtain. Rekeying will
be eliminated, and downloading will be simplified. Both functions
are currently critical to the financial area. As the manager stated,
“We ... produce all this data and about 80% of it is either downloaded
or rekeyed into a spreadsheet so they (the users) can add a font, a
shading and a bold, and that’s all they care about.”
Although the official position is that the new system will provide
cost savings, no one really believes that is likely.
Vendors have been asked back to demonstrate products in three
areas: database management, consolidation reporting systems, and
development and ad hoc reporting tools. Later this year, a single
database system, a specific hardware platform, and a variety of tools will
be selected for testing. Managers and staff have six months to “play
with the tools and see how they work and if they like using them.”
Although the senior management at LB is committed to this effort,
managers in the three groups are currently managing their on–going
projects while researching the new systems.
The client/server system is not expected to change the information
flow or the decision making process at LB. Although the
manager acknowledges that the new system will offer more on–line
tools and reporting, no one really expects senior management to use
them. The present culture of LB is such that upper management
doesn’t log on to computers; their staff runs reports for them.

Case Summary
It is clear that Large Bank’s primary intention in developing the
new system was to save money. While giving users more flexibility
and information was important, the driving force behind the change
was financial. Although the most recent look at client/server
systems is motivated by more than money—replacing a legacy
system and embracing new technology—the project retains a pre-
dominantly Stage One focus.
There is some indication, however, that LB’s senior manage-
ment does recognize the need to change the manner in which work
is accomplished, and is teetering on the brink of Stage Two. That is,
they are positioning themselves to reengineer their business pro-
cesses. While some work has begun in this area, it is largely
unrelated to the client/server project, and the new technology is
expected to have only a minimal effect on the BPR effort. In fact, the
manager expressed little hope that even the BPR efforts would
change the management structure and the way things are done at
LB.

Case Two: Large Collection Agency (LCA)

The system in this case was originally developed in 1977, and
evolved over time to suit the needs of the business. To understand
the system, it is necessary to examine the organization at the time the
legacy system was first developed and implemented. In the mid–70s,
LCA maintained several hundred geographically dispersed collection
offices, each of which followed its own collection procedures, most of
which were manual. In general, the collection process began with a
customer’s request to collect an invoice. A manual file folder was
created, and all transactions were recorded on the jacket. All
collections took place in the office that received the request, irrespec-
tive of where the delinquent customer was located. This often
resulted in large telephone costs and delays in mailings.
Each local manager determined the collection cycle. Some placed
several direct phone calls before mailing any letters; others mailed
letters first, then placed personal calls. In some cases, attorney
involvement was threatened quickly; in others this was looked upon
as a last resort. Thus, a customer with delinquent accounts in two
different areas of the country could be treated differently by each
collection office.
In the late 70s, LCA went through a major restructuring and
merged many of its offices. The corporate office, in an effort to
establish and enforce policy (in the chief technology consultant’s
words, “to homogenize service”), developed the initial information
system. As time passed, subsystems were added to the legacy
system, until in 1985 the composite system consisted of seven
different applications.
In 1985, the economy was such that revenues in the organiza-
tion needed a boost. Previously, LCA had grown largely from
efficiency (i.e., replacing manual work with computers), but it is
difficult to repeat those gains from year to year. Instead of looking
further at the bottom line for more efficiency gains, the focus shifted
to the top line. As the technology consultant stated, “The bottom line
gives only once; the top line gives many times.” LCA developed a
number of information–based products for the customers and ac-
quired new firms that they promptly brought under the same
system, thus presenting a unique identity in the marketplace.
With the major downturn in the economy, however, the busi-
ness went into a tailspin, prompting another major restructuring. In
the consultant’s words, “When business is slow, there are no sales;
consequently, there are no invoices, and ultimately, no overdue
invoices for us to collect.” The economy was not the only negative
impact on the firm. Other smaller collection agencies, and the
customers themselves, began using computers to improve their
collection efforts. Thus, both the quality of the claim (i.e., the
chances of collecting it) and the number of claims were lower.
The regional office personnel were instructed to get closer to the
customer and provide them with what they needed, but that was
extremely difficult with the information system in use. The earlier
monolithic systems were fine at producing standard requests, but
when customers needed specialized information, the systems could
not provide it. Changes to the system usually took one year; the
competition, being smaller, could change much more rapidly. Man-
agement decided that what was really needed was a way to empower
associates to respond to the customer’s needs. A different tool set is
needed when the objective is to push customer service to the point
of customer contact. This, according to the consultant, was “a real
disconnect with a centralized system that enforces policy.” Managers
could not make more decisions without the tools to provide needed
information.
Thus, LCA had come full circle in the way they viewed their
information systems. At the start, field offices handled collections
however they wanted. When LCA was first computerized, people in
the field could only do what the system let them do. Now, the
objective was to allow people in the field to do what they needed to do,
and to decide for themselves what that was. The firm also needed a
system that would support a rapidly changing business environ-
ment; thus, continuing in the traditional mainframe arena was ruled
out because of the mainframe’s lack of flexibility.
To determine what the new system would be, LCA first rightsized
their operations over the last three years. This process involved senior
managers from all departments. Through close examination of the
business processes (in effect, a reengineering of the processes), LCA
realized they needed a decentralized management structure, but
their highly centralized information system could not support such
a structure. Tools that could be used to distribute information from
the mainframe were either not available, or not practical. Client/
server technology was selected based on the belief that it would
enable the reengineered management recently put into place to work
better. In effect, the firm wanted to distribute management, but then
needed an information system to handle the redistributed authority.
The conversion to client/server, currently underway, has a budget of
nearly two million dollars and a time frame of one year.

Case Summary
The discussion above indicates that Large Collection Agency
(LCA) has moved into Stage Three of the information era. They have
reengineered their business processes; they have reengineered and
restructured their management. Work has been organized in new
ways and the lines of communication have been restructured. Now,
they seek an information system to support and enable this new
organizational form.

Case Three: Small Bank (SB)

When SB opened seven years ago, management made the
decision to use service bureaus for all their data processing needs
rather than purchasing their own mainframe systems. For a start–
up company, this decision made sense because it represented a lower
short–term investment in information processing power. SB contin-
ues to use these service bureaus today. Currently, one service
bureau handles their credit card business, another handles their
traditional banking business (commercial loans, savings accounts,
equity loans, etc.), and a third firm supplies processing for their non–
banking trust accounts. The bank has only a single branch office;
most business is conducted via phone, facsimile, or direct mail.
Initially, users had dumb terminals through which they could
connect to the service bureau of their choice and obtain needed
information. For the most part, the service bureaus provided
consistent service with very scalable costs, but they were inflexible.
Customized reporting, especially for a smaller customer such as SB,
was too expensive. Yet, as SB began to grow, the need for such
customization became obvious.
SB needed more management information, in a more usable
format, than what was available from the service bureaus. To
accommodate management needs, analysts would receive a multi-
tude of reports from the bureaus, take a few lines of data from each,
and rekey the data into spreadsheet applications to produce the
necessary information. This method was time consuming, tedious,
and expensive, and it greatly underutilized the analysts’ talent. The
goal, therefore, was to provide flexible reporting without high costs
in a way that would enhance, rather than ignore, SB’s analytical
resources. The general ledger was the area first targeted.
The bank selected client/server technology for several rea-
sons. As a start–up company, they didn’t want to make a huge
investment in mainframe computers. They also wanted to empower
their employees, giving them all the information they needed to make
decisions. The decision was made to start out with small machines
that could grow as the firm (and its profits) did. The decision to move
horizontally to multiple servers instead of one large server reflects
these considerations since failure of a single system does not mean
that every user is off–line.
The solution was to install a database server to which general
ledger data were downloaded each morning from the mainframe. As
the data were written to the server, they were processed to obtain
information not available directly from the mainframe, such as
account average balances. Users (clients) were given access to the
data on the local server via intelligent workstations equipped with
sufficient software so that they could manipulate and massage the
data as they needed. The users build queries to see what they need
to see. Queries are ad hoc; users can ask any question using a point–
and–click scheme for ease of use.
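The derived figures mentioned above, such as account average balances, can be computed as each extract loads. The sketch below is our illustration of that idea, with invented accounts and field names rather than SB's actual layout or code.

```python
# Hypothetical sketch of the kind of derivation SB's database server performed
# as each morning's general ledger extract was loaded: a month-to-date average
# balance per account, a figure the mainframe extract itself did not carry.
from collections import defaultdict

# Three mornings of (invented) extract rows: one closing balance per account per day.
EXTRACTS = [
    [{"account": "1001", "closing_balance": 2400.00},
     {"account": "2002", "closing_balance": 150.00}],
    [{"account": "1001", "closing_balance": 2600.00},
     {"account": "2002", "closing_balance": 90.00}],
    [{"account": "1001", "closing_balance": 2500.00}],
]


def average_balances(extracts) -> dict:
    """Derive each account's average daily balance across the loaded extracts."""
    totals, days = defaultdict(float), defaultdict(int)
    for daily_rows in extracts:
        for row in daily_rows:
            totals[row["account"]] += row["closing_balance"]
            days[row["account"]] += 1
    return {acct: totals[acct] / days[acct] for acct in totals}


print(average_balances(EXTRACTS))   # {'1001': 2500.0, '2002': 120.0}
```

Ad hoc queries of the kind the users build would then run against these derived figures on the server rather than against the raw mainframe extract.
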
There are presently six file servers in use at SB. Once general
ledger data were made available, users quickly asked for trust
account, commercial loan, savings account, credit card, and home
equity data. The Information Systems personnel at SB believe that
several more servers will be added in the next few years. One
manager stated that either the users or the IS department always
finds a need for another server. The most recent addition is a server
dedicated to tracking telecommunications costs. The data switch
talks directly to the server for every connection made.
The information systems department at SB consists of twenty–
two people and services three separate floors of users on three
distinct LANs. Many of the staff are systems and financial analysts
who are working together to develop new client/server applications
in the financial reporting area.
The advent of client/server computing has changed the manner
in which many of SB’s employees perform their jobs; it has reengineered
the work processes. In the past, month–end closing would take up
to eight working days; it can now be done in two. If, for some reason,
the database servers aren’t working, there is no need for some
employees to report to work. The IS manager’s comments on this are
interesting:

“Organizations tend to structure themselves to fit the main-
frame. The mainframe spits out report A or B, someone gets
them, then rekeys lines 10 and 30, and then an army of analysts
replays the information into something the mainframe didn’t
provide. In the old world, that’s what things looked like... The
value of the client/server structure became so valuable that they
(the users) ended up, without prompting from us, reengineering
their work around its capabilities.”

A systems analyst also had some telling comments:

“We don’t give our users a hole; we give them a shovel. We no
longer build systems for them; we build tools for them to use. We
have become toolmakers. The organization has moved decision
making further and further down, in data processing we have
moved it to the end user.”

Another process that has been completely reengineered is closing an
account. What once took several days of processing through
numerous approval channels now takes about twenty seconds. This
has happened through a combination of three factors. First, the
process itself has been reengineered; unnecessary steps are gone.
Next, client/server enabled the new process to take place. Finally,
customer service representatives (CSRs) were empowered to make
account–level decisions.
IS management at the bank believes that client/server technol-
ogy is no more expensive for them than buying and maintaining
mainframe systems would have been; in fact, they feel it may be less
expensive. The focus of client/server for SB, however, is not on costs.
It is on user satisfaction with performance and information
availability. As the Data Base Administrator said:

“When it comes right down to it are the costs (for client/server)
going to be significantly cheaper, I don’t know. Is it going to be
cheaper, I believe so. But better than any of that, it’s (client/
server) going to be better. If you get better for the same price,
you’ve still got a better deal. You have a better front–end for the
users, you’ve got a better implementation of everything. But
we’re not doing client/server because it’s cheaper or because the
users like the GUIs. We’re doing it because it solves business
problems.”

No one in the IS department at SB sees the mainframe going
away completely. All agree that for number crunching and transac-
tion processing, the mainframes and service bureaus will remain.
Yet, they continue to develop new client/server applications to
replace mainframe functions they find unacceptable. Thus, while SB
never faced the legacy system problems that the large bank and large
collection agency face, they do have systems that no longer work well
for them.
The focus at SB has always been on empowering their users.
When prospective programmer/analysts are interviewed, they are
asked by the COO how they feel about empowering users –– writing
tools and not systems for them. Thus, the idea of empowering users
filters down from the top. Indeed, senior management is intimately
involved with the IS projects at the bank, and IT plays a central role
in the strategic plan.
SB’s president has always focused on two issues: the quality of
the customer service representatives and the quality of the informa-
tion systems that support the bank. Since customers don’t walk into
the bank to make transactions, the person on the phone becomes all–
important to them. The voice, attitude, and demeanor of the CSR are
critical, but also important is the information the CSR can offer the
customer. Thus, the corporate culture doesn’t see IT as an expense
but rather as a “keeper of the business.”
Client/server technology has changed several things within SB.
First, the flow of information has been streamlined. Formerly,
several different processes updated the mainframe sequentially and
long delays resulted from an overnight wait for each update. The
general ledger system processing has been reduced from more than
three days to less than one day.
The sources of information for many bank employees have also
changed. Managers just below the senior vice–president level once
called on analysts to run reports; now, they can get the information
themselves very easily. Information is also more readily available to
clerical employees. Thus, the client/server systems have distributed
information up and down the organization.
Decision making has moved down the organization so that now
CSRs and clerical employees are given the information and authority
to make decisions. There are fewer layers of management for each
decision to travel through; thus, the organizational structure has
flattened.

Case Summary
Work is organized in new ways, communication paths are
simplified and redefined, and authority relationships have changed.
These all indicate that an organizational transformation has
occurred. Thus, SB seems to be in Stage Three of the information era.

The New Organizational Structure


References to the new organization, the organization of the
future, and new organizational forms and structures abound through-
out this chapter and in much of the current literature. What do all
these catchphrases mean?
One vision of the organization of the future is a move away from
the hierarchical structure to a flatter one comprised of small
business units focused by product or market segment. In these
firms, reengineered for quick response to a changing environment,
information flows freely and processes are streamlined (Ligus 1993).
Autonomous business teams are critical in these new information–
based organizations (Tapscott and Caston 1993b). Employees are
empowered and encouraged to participate in the decision making
process (Davenport 1993).
The organization of the future will be knowledge–based and have
fewer levels of management (Drucker 1988). It will be a learning
organization —one that is constantly enhancing its capacity to create
(Senge 1990). By building a shared vision of an organization’s future,
the organization will become a learning rather than a controlling one, and the
behavior of people within the organization will be fundamentally
different. The new organization will also be better able to respond to
opportunities – whether they be in new technologies, new revenue
sources, or new market segments (Berquist, 1993). Another term
often used to characterize the new organization is “boundarylessness”
(Tichy 1993). As used by Jack Welch, the CEO of General Electric, the
term connotes an organization that de–emphasizes hierarchical and
functional boundaries and focuses instead on partnerships, alli-
ances and teams. Critical to this new organization is the free flow of
information among its employees.
There are several influences compelling organizations to become
“more flexible, far–sighted, and able to learn continuously” (Watkins
and Marsick, 1993, p. 5). These include global turmoil and compe-
tition, the rise of self–directed teams, participatory management, and
time compression. Managers, however, may find it difficult to
respond to these influences because the infrastructure of their firms
does not support such response (Rummler and Brache 1990).
Thus, the organization of the future will be characterized by
empowered associates who have the responsibility, knowledge, and
authority to make decisions. Information Technology (IT) will play a
key role in enabling this organizational form.

Conclusion
We have put forth a three–stage model to explain the changes in
the automation and the management of work that occurred during
the industrial era and are occurring during the information era. Our
model suggests that BPR is primarily a second stage phenomenon
that initiates new organizational forms based on collaborative teams
and empowered employees. Indeed, there is evidence that some firms
are already abandoning the constraints of the M–form for less rigid
structures. Thus, while in 1985 Williamson could maintain that the
M–form organization was “the most significant organizational inno-
vation of the twentieth century,” six years later Bettis was calling it
an “organizational fossil” (Hoskisson, Hill, and Kim, 1993). In a
world of increasing complexity, rapid change, virtual corporations,
and global competition, Bettis may indeed be correct.
We propose that client/server technology is a major enabling
technology for every stage of the information era. It has this
capability because it can be used to automate existing processes,
simplifying an interface or providing information more easily. It may
also enable informating the organization as it reengineers its busi-
ness processes because it provides users with a friendly interface,
tools for extracting and manipulating data, and access to a range of
servers.
As a Stage Three enabler, client/server technology deregulates
information access. The monopoly era of the mainframe as the sole
provider of information is replaced by a free market of multi-servers
and user self-service. Users are empowered because they have
choice, tools, and personal processing resources. Users can choose
what data they want from what is available.
As marketing discovered decades ago, customers like to serve
themselves. Users want to have the tools to manipulate data and
change its format to meet their personal preferences and decision
style. Customers want customized products, even if they have to do
the tailoring themselves. Users have desktop computers with
sufficient processing power to quickly and easily process data
without being reliant on the IS custodians of processor cycles.
Customers want independence and control over their destinies. A
marketing perspective on client/server illustrates that it results in
satisfied users who can better serve their customers.
The three case studies illustrate how some firms are using
client/server technology as a means of empowerment. The small
bank and the large collection agency both perceive client/server
technology as a medium for transferring power from the IS depart-
ment to the users. Furthermore, it is clear that this movement is a
deliberate response to a competitive business environment that
requires a more agile and maneuverable organization. Clearly, these
organizations are in the third stage of the information revolution.
In contrast, the large bank seems to be teetering between Stages
One and Two. Client/server is initially seen as a way of rearranging
processing to reduce costs; it is just a cheaper way of automating
work. This is not surprising because the impetus for change is not
external market forces but internal cost saving measures. Competi-
tion is not promoting the change; rather, some cost accountant is
tallying the score. There is some indication, however, that the bank
also sees client/server as a way of redesigning work since it is now
looking into a large–scale cutover to a client/server environment.
The cases support our argument that client/server technology
enables organizations to empower employees to create a new organi-
zational form. While two cases directly support this contention, one
supports it indirectly. The large bank has no compelling reason to
undertake the massive organizational upheaval and realignment of
power that is necessary for a new organizational form. It is far easier
to stay in the automation phase than endure the rigors of transition
to another phase.
There is little doubt that organizations must change to survive
into the 21st century. Hammer and Champy argue that simple gains
from automation are no longer enough and that radical advances
must be made. Evidence suggests that even these radical gains from
reengineering business processes are not enough. What is needed for
survival is a totally transformed organization, one that is able to learn
and grow continually. We believe that client/server technology can
enable this new organizational form.

References

Berquist, W. (1993). The postmodern organization: Mastering the art of
irreversible change. San Francisco: Jossey-Bass.
Chandler, A., Jr. (1962). Strategy and structure: Chapters in the
history of the American industrial enterprise. Cambridge, MA: MIT Press.
Daft, R. (1991). Management (2nd ed.). Fort Worth: Dryden.
Davenport, T., (1993). Process innovation: Reengineering work through
information technology. Boston: Harvard Business School Press.
Davenport, T., and Short, J. (1990, Summer). The new industrial
engineering: Information technology and business process redesign. Sloan
Management Review, 31(4), 11–27.
Dodge, M. (1993, November 15). Client server may be ragged, but it’s
not on the ropes. Computerworld, p. 37.
Drucker, P. (1988, January–February). The coming of the new
organization. Harvard Business Review, 66(1), 45–53.
From Client to Server and Back Again. (1994, April). Datamation, p. S–
19.
Gillin, P. (1993, September 27). Valley of dearth. Computerworld, p. 36.
Ginzberg, E. (1982, September). The mechanization of work. Scientific
American, 247(3), 67–75.
Goss, T., Pascale, R., and Athos, A. (1993, November–December).
Risking the present for a powerful future. Harvard Business Review, 71(6),
97–108.
Hammer, M. (1990, July–August). Reengineering work: Don’t auto-
mate, obliterate. Harvard Business Review, 68(4), 104–112.
Hammer, M. and Champy, J. (1993). Reengineering the corporation: A
manifesto for business revolution. New York: HarperCollins.
Hoskisson, R., Hill, C., and Kim, H. (1993). The multidivisional
structure: Organizational fossil or source of value? Journal of Management,
19(2), 269–298.
Huff, S. (1992, Summer). Client–server computing. Business Quar-
terly, pp. 30–35.
Hunt, V. (1993). Reengineering. Essex Junction, VT: Oliver Wright
Publications.
Kelly, G. and Daniel, L. (1994). Applying electronic meeting systems to
the strategic planning process. University of Georgia Working Paper.
Ligus, R. (1993, January). Methods to help reengineer your company
for improved agility. Industrial Engineering, 25(1), 58–59.
Morris, D., and Brandon, J., (1993). Reengineering Your Business. New
York: McGraw–Hill.
Pine, B. (1993). Mass Customization. Boston: Harvard Business School
Press.
Radding, A. (1992, December). Banking’s new technology revolution:
From mainframes to PC power. Bank Management, pp. 35–40.
Rummler, G., and Brache, A. (1990). How to manage the white space on
the organization chart. San Francisco: Jossey–Bass.
Schein, E. (1994) Innovative cultures and organizations. In T.J. Allen
and M.S. Scott Morton, (Eds.), Information Technology and the Corporation
of the 1990s, New York: Oxford University Press, 125–146.
Schrage, M. (1993, September 27). No frills, fewer tangles.
Computerworld, p. 37.
Senge, P. (1990). The Fifth Discipline. New York: Doubleday.
Stoner, J., and Freeman, R. (1989). Management (4th ed.). Englewood
Cliffs, NJ: Prentice Hall.
Tapscott, D., and Caston, A. (1993a). Paradigm Shift. New York: McGraw–
Hill.
Tapscott, D., and Caston, A. (1993b, Summer). The new promise of
information technology. Business Quarterly, pp. 51–60.
Taylor, J., and Felton, D. (1993). Performance by Design. Englewood
Cliffs, NJ: Prentice Hall.
Tichy, N. (1993, December). Revolutionize your company. Fortune, pp.
114–118.
Watkins, K. and Marsick, V. (1993). Sculpting the learning organization.
San Francisco: Jossey–Bass.
Zuboff, S. (1988). In the age of the smart machine. New York: Basic
Books.
