by
A DISSERTATION
IN
HIGHER EDUCATION
DOCTOR OF EDUCATION
Approved
Accepted
December, 1999
Copyright 1999, Jan P. McCathern
ACKNOWLEDGMENTS
I have only finished this project with the support and encouragement of many
people. I am indebted to Dr. John Murray for joining me and guiding me on the last
part of this journey. His editing, suggestions, and support are appreciated. I also wish to
thank my other two committee members, Dr. Robert Ewalt and Dr. Brent Cejda, for their
input and suggestions.
Most of all I must thank my husband, Glenn, who has always supported and
encouraged me in my educational endeavors and in life. Few women are blessed to
have a soul mate who seeks their good and growth as a person as much as my husband
does. I thank God for him. I am also grateful to my two sons, Levi and Kurtis, who
have stood by their mom, encouraging me and believing that I could achieve this goal.
A few others have been very instrumental in supporting me during this process.
Dr. Bud Joyner always believed that it was a good study, and his encouragement is
greatly appreciated. My prayer partners, Vicki Hooker, Betsy Armstrong, Tonya
Canada, Ellen Rumpff, and Betty Thompson, have offered words, notes, and passages of
encouragement. For these and their friendships, I am eternally grateful.
I finished this task only with the support and encouragement of my Lord and
Savior, Jesus Christ, to whom belongs all honor and glory forever.
TABLE OF CONTENTS
ACKNOWLEDGMENTS ii
ABSTRACT vi
CHAPTER
I. INTRODUCTION 1
Statement of Problem 1
Decreasing Resources 2
Increasing Accountability Demands 2
Available Information 4
Changes 5
Purpose of Study 9
Research Question 10
Justification for the Study 11
Definitions 12
Delimitations and Limitations of Study 14
Significance of Study 15
Organization of the Document 16
II. REVIEW OF LITERATURE 18
Theoretical Background 18
Philosophy of TQM 20
TQM Application 21
TQM in Higher Education 25
TQM Continuous Improvement 28
TQM Benchmarking 28
Business Benchmarking Background 29
Benchmarking in the Baldrige Award 30
Definitions of Benchmarking 32
Types of Benchmarking 34
Higher Education Benchmarking 35
Examples of Benchmark Studies 36
Summary 39
III. METHODOLOGY 40
Background for Study 40
Sample Selection 40
Instrumentation 43
Support Secured 43
Survey Development 43
Content Areas 45
Pilot Study 45
Data Collection 45
Data Analysis 46
Follow-Up Survey 46
Summary 47
IV. FINDINGS 48
Sub-Question 1 48
Essential Elements 48
Study Guidelines 49
Designing the Instrument 53
Sub-Question 2 54
Functional Areas 55
Internal Baseline 57
Sub-Question 3 58
Sub-Question 4 59
Sub-Question 5 62
Sub-Question 6 65
Summary 65
V. CONCLUSIONS 67
Purpose of Study 67
Conclusions 68
Sub-Question 1 68
Sub-Question 2 71
Sub-Question 3 72
Sub-Question 4 72
Sub-Question 5 74
Sub-Question 6 74
Recommendations 76
Future Investigations 77
Summary 78
BIBLIOGRAPHY 79
APPENDICES
A - BENCHMARKING SURVEY INSTRUMENT 89
B - TASK FORCE BENCHMARKING HANDOUTS 159
C - TASK FORCE POSSIBLE BENCHMARK AREAS 171
D - FINAL BENCHMARKING REPORT 176
E - BENCHMARK FOLLOW-UP SURVEY 282
F - BENCHMARK STUDY ORIGINAL TIMELINE 287
G - SAMPLES OF LEAD COLLEGE GAP ANALYSIS 289
H - COVER LETTER 317
ABSTRACT
survey instrument designed by a researcher and participants, (3) a year's commitment to
the study, and (4) collection, analysis, and access to the data by decision makers in each
area surveyed. Five basic types of improvements were made based on the benchmark
data: further research in weak institutional areas, improvements in technology equipment
and usage, making existing processes more efficient, upgrading weak areas, and
developing new services.
The study concluded that support and involvement of the president and senior
leadership, training of decision-makers and a Task Force on each campus, a gap
analysis of each institution after receiving the data, and making the data available to
key personnel are all necessary for improvements to be made from a benchmark study.
LIST OF TABLES
D.26 Noncomputerized labs - Institution 1 245
D.27 Noncomputerized labs - Institution 2 247
D.28 Noncomputerized labs - Institution 5 248
D.29 Noncomputerized labs - Institution 6 248
D.30 Non-listed positions 255
F.1 Time line for resource management study 288
G.1 Contact hours, employees, students 295
G.2 Percentage of developmental classes taught 296
G.3 Per contact hour cost 301
G.4 Contact hour study 301
G.5 Instructional administrators' compensation and load 302
G.6 Librarians and counselors 304
CHAPTER I
INTRODUCTION
Statement of Problem
Three major problems are facing higher education today: decreasing resources,
increasing demands for accountability, and greater amounts of available information.
Decreasing Resources
Increasing operating costs coupled with decreasing and volatile revenue sources
have created both internal and external pressures for higher education. Reductions in
state and federal appropriations and slower-growing tuition rates limit the funds available
to colleges; thus, costs must be controlled and reliable sources of additional revenue
found merely to keep from losing ground (DeCosmo, Parker, & Heverly, 1991). Since
resources are not as plentiful as they once were, cost and efficiency are major issues for
colleges. While building such resource efficiency, colleges and universities must deal
with environmental changes such as changing demographics, increasing demands for
accountability from government, governing boards and the media, and public
disillusionment with the value of higher education.
Available Information
This knowledge and technology explosion, or as Wolverton (1994) labels it, this
"exponentially exploding knowledge base" (p. vi), is the third problem facing higher
education today. In discussing this knowledge explosion, Parston (1986) observed that
more than ever, institutions now need to incorporate both lifelong learning practices and a
mass system of post-school education while continuing to produce value on limited
resources. In explaining why certain traditions were unsustainable in higher education
today, Susan Weil (1994) stated, "The nature and rhythms of knowledge creation and
dissemination have changed dramatically, supported by the revolution in technology..."
(pp. 22-23). This higher volume of available information, predicts Alstete (1995), is
changing "the methods of how institutions of higher education operate in the mid-1990s"
(p. iii). Similarly, Peter Drucker (September-October 1992) predicted that in the "next 50
years, schools and universities will change more and more drastically than they have
since they assumed their present form more than 300 years ago when they reorganized
themselves around the printed book" (p. 97). He goes on to say that these changes will
be forced in part by "new technology, such as computers, videos, and telecasts via
satellite" and in part by "the demands of a knowledge-based society in which organized
learning must become a lifelong" (p. 97).
In a survey of over 350 college and university senior administrators reported in
the Chronicle of Higher Education Almanac for September 1989, 42% ranked facilities
and technology and 39% ranked adequate resources as two of the top three challenges
facing higher education (p. 24). Carr, Hard, and Trahant (1996), listing core changes in
organizations, placed information technology as a key change for the 1990s. One of a
number of conditions affecting colleges and universities today, Alstete (1995) explains, is
that "not only is the knowledge base of many areas in higher education changing rapidly,
awareness concerning these changes is so widespread that the stake holders of higher
education are increasingly dissatisfied with the status quo" (p. xi). Thus, these three
problems are demanding changes in higher education.
Changes
Some of these changes can be controlled by effective resource management.
Since the 1860s, institutions of higher education have continually changed due to
intellectual, philosophical, and functional changes in society, observes Veysey (1965);
such adaptation must continue to take place as we face a new millennium. The stresses
discussed above face not only four-year colleges and universities but community colleges
as well. Moreover, added to these pressures, community colleges face the struggle to
once again define their role in higher education. Business and industry, communities,
government, and four year institutions all desire to shape two year schools for their own
benefit, but community colleges must choose for themselves how to relate to each of these
service areas. In doing so community colleges must define their role in
continuing/lifelong learning, workforce development, vocational certification, distance
learning, and four year transfer.
Sir Winston Churchill once said, "There is nothing wrong with change if it is in
the right direction. To improve is to change, so to be perfect is to have changed often"
(quoted in Jablonski, 1992, p. 25). Alstete (1995) agrees that to stay viable, a new way
of thinking that creates more efficient operations and a desire for continual learning by
students must be integrated into higher education. Rush (1994) summarized the problem
by stating that the challenge higher education faces is "reorienting their thinking around
customers, processes, and a different set of measurements. Institutions not only need to
rethink what they do, but how they do it and how they measure themselves" (p. 88).
Benchmarking
One of the latest TQM techniques for continuous improvement being used by
business and industry is benchmarking. Benchmarking is being used and is effective in
higher education for several reasons. First, it focuses on outputs and quality, as
explained by Alstete (1995): "Traditionally, institutions have been primarily concerned
with inputs and costs, where improved quality can only be achieved from greater
expenditures. Benchmarking is different, because it focuses on the outputs and quality of
services, not the inputs" (p. 41). Second, benchmarking is easy to understand and
implement by all levels of employees in an organization for all kinds of processes
(Spendolini, 1992). Benchmarking is currently being used successfully in colleges and
universities (Alstete, 1995) because it is a part of a learning process within an organization,
and the key to productivity improvement lies in performing better through
continuous learning. "Since institutions of higher education profess learning and value
hard data, using benchmarking to improve our processes is a natural extension of what we
provide to students in the classroom" (p. 38).
Third, many companies, such as Xerox, Motorola, IBM, and others, have been
using it for years, thus establishing a pattern to adapt to colleges and universities
(Spendolini, 1992). Fourth, benchmarking uses reliable research techniques, such as
surveys and interviews, which can "provide external and objective measurements for goal-
setting, and for improvement tracking over time" (Alstete, 1995, p. 27). Fifth, it is one
method that aids institutions in staying competitive. Because today's students are more
demanding and tend to "shop" for the college they will attend, institutions of higher
education are using benchmarking to improve by "comparing performance (both
administratively and academically) with comparable or peer institutions, 'best-in-class,'
and even world-class organizations outside of higher education" (Alstete, 1995, p. 4). Sixth,
Tucker (1996) found that benchmarking can be used to rapidly address specific strategic
goals and prevent "wheel reinvention" by an institution, as well as learning how other
organizations have addressed the issues and problems they are facing. Seventh, Detrick
and Pica (1995) reported that one real value of a benchmarking study is institutional
introspection, because it forces participants to go inside their own institution, collect
information, and raise questions.
Higher education institutions have begun to use benchmarking to compare
themselves to peer or competitive institutions and have found several benefits have
accrued. A study conducted for the American Assembly of Collegiate Schools of
Business (AACSB) and the Graduate Management Admission Council (GMAC) by Pica
and Detrick working with schools in the Big Ten Conference in 1993 found that
participating institutions reported the benchmark study was a positive experience, and has
been used in self-analysis, in preparing budget requests, and in getting additional
resources, (Alsete, 1995, pp, 43-44).
Responding to inquiries from the CQI-L web site on what benchmarking studies
have accomplished in colleges and universities, Janice Dossey-Terrell at the University of
Central Florida states: "Thus far, our benchmarking efforts have been very helpful, and
have prevented us from making some decisions we would have regretted in the future"
(Dossey-Terrell, 1995, p. 4). A Pennsylvania State University respondent states that they
have been using benchmarking initiatives in a wide variety of areas: "We continue to
profit from our corporate partners who keep us on track and emphasize the importance of
benchmarking processes for improvement, and not just the collection of data to prove
how good we are" (Sandmeyer, 1995, p. 4).
Another respondent, Gerry Shaw from Babson College, which used
benchmarking to discover best practices, replies: "We have learned where and what to
avoid as we move along, how others have dealt with resistance along the way, how
technology can be used to better enable what we are trying to do, and how to achieve a
stronger customer focus" (1995, pp. 4-5). A University of Maryland respondent reported
that benchmarking successfully reduced the processing time for surplus property requests
(Schnell, 1995, p. 6).
Further, Ray Carlson from Dalhousie University believes that benchmarking is a
much needed comparative analysis technique:
From my perspective, benchmarking is probably the key to CQI being
useful on campus. At the same time, benchmarking relies on some form
of outcome measurement—as one can measure outcomes in a valid and
reliable way, it becomes possible to identify processes that seem more
effective, and then try to isolate factors that might be responsible, and test
whether introduction of these factors leads to better results. (Alstete, 1995,
p. 56)
Benchmarking in higher education is also being conducted at several European
institutions, Alstete (1995) reports.
In addition to gathering data for process improvement, benchmarking is used by
college and university leaders
for strategic planning and forecasting, because it develops knowledge of
the competition, views state of the art practices, examines trends in
product/service development, and observes patterns of customer behavior.
Benchmarking is a source for new ideas, process comparisons, and goal-
setting. It enables the... practitioner to see the organizational functions
from an external point-of-view, and not be limited to the traditional method of
developing ideas and objectives internally. (Alstete, 1995, p. 11)
Higher education uses benchmarking by identifying competitors' performance
standards and then comparing one's own performance to those standards. Such information
can produce greater efficiency as both college boards and administrators become aware of
ways of doing things they had not thought possible. Alstete (1995) explains that
benchmarking has emerged as a useful, easily understood, and effective tool for staying
competitive and developing adept practices. Understanding how another institution
operates can provide both new ideas and new methods to bring about improved
performance. The prestigious American prize for quality, the Malcolm Baldrige Award,
has incorporated benchmarking into its application process. Such inclusion confirms that
benchmarking has become a recognized tool of the TQM continuous improvement
process in institutions of higher education.
Purpose of Study
The purpose of this study is to evaluate the feasibility of using the TQM technique
of benchmarking in mid-sized community colleges in Texas. Mid-sized colleges were
selected for this peer benchmarking case study because they experience the same
pressures as all community colleges, but because of size, they have fewer resources than
larger institutions and larger student populations to serve than smaller institutions.
Selecting such homogeneous institutions provided the necessary environment to examine
the methods and design of the benchmarking process. Based on published research and
input received during the study from the participating colleges, these selected institutions
were surveyed in ten educational areas: business operations, computer technology,
developmental instruction, distance education, institutional effectiveness, instructional,
outsourcing, personnel, scheduling instruction, and workforce development. After
information was compiled from the six institutions selected, the raw data was distributed
to the institutions involved in the study. Then, a gap analysis of the data was calculated
for the lead college, and benchmarks were also established as either trends or the best
practices in each area of study. Two years later, the institutions were contacted again and
asked to complete a questionnaire to determine how the benchmark study had been
utilized by each institution. These surveys were then evaluated to determine if
improvements were made as a result of using benchmarking data. If improvement had
been made because of the benchmarking study, this would indicate that the process could
be used by community colleges to make improvements.
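The gap-analysis step described above amounts to a simple per-measure comparison between the lead college's figures and the values established as benchmarks. The sketch below is illustrative only; the measure names and figures are hypothetical and are not data from the study.

```python
# Illustrative sketch of a gap analysis: for each surveyed measure, compute
# the difference between the lead college's value and the peer benchmark.
# Measure names and figures below are hypothetical, not from the study.

def gap_analysis(lead, benchmarks):
    """Return {measure: lead_value - benchmark_value} for shared measures."""
    return {m: lead[m] - benchmarks[m] for m in lead if m in benchmarks}

lead_college = {"cost_per_contact_hour": 6.10, "students_per_counselor": 900}
peer_benchmark = {"cost_per_contact_hour": 5.40, "students_per_counselor": 750}

gaps = gap_analysis(lead_college, peer_benchmark)
for measure, gap in sorted(gaps.items()):
    direction = "above" if gap > 0 else "at or below"
    print(f"{measure}: {gap:+.2f} ({direction} peer benchmark)")
```

A positive gap flags an area where the lead college exceeds the peer figure, which, depending on the measure, may indicate either a strength or an area needing improvement.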
Research Question
To guide the evaluation of the benchmarking process in mid-sized community
colleges, a main research question and a series of sub-questions were developed. The
main question for the study, "What are the conditions necessary for benchmarking to be
used by mid-sized community colleges in Texas to improve community college
operations?", was developed to evaluate the process of benchmarking.
The six sub-questions to be answered were as follows.
1. What planning steps should be taken by the lead college in beginning a benchmark
study for an educational institution?
2. Can a benchmark study be designed to ensure that each of the participating institutions
feels included and derives information useful in making system improvements?
3. What time period is reasonable for conducting a benchmark study in community
colleges?
4. What procedures can be used to collect, analyze and distribute benchmark data so that
it can be utilized by individual institutions to improve processes in community
colleges?
5. Can the benchmark information be used at individual institutions to improve processes
in community colleges?
6. What are the characteristics of a college that utilizes benchmark data for process
improvement?
It should be noted that, based on the literature on the use of benchmarking in
business, answering the six sub-questions regarding planning, designing, conducting,
collecting and analyzing, and distributing and utilizing benchmark data will provide an
adequate evaluation of the benchmarking process's effectiveness in producing operational
improvements in community colleges.
Justification for the Study
Much has been written about the need for change and improvement in higher
education, and new paradigms of operation are being proposed to address change. Alstete
(1995) explains:
...quality improvement techniques developed in the business world such as TQM,
reengineering, the Baldrige Award, and benchmarking are being used as tools in
these new paradigms or ways of thinking for colleges and universities.
Benchmarking has become part of the lexicon of continuous quality improvement
in the 1990s. (p. 16)
One change needed in colleges and universities is to place less emphasis on the input of
resources to academic departments and more emphasis on the work processes and outputs
of those departments. Thus, institutions should "determine what the expertise, skills, and
materials they invest produce, how they produce it, what it costs to produce it, and how
well it is produced" (Rush, 1994, p. 88). One way to efficiently shift to this output
emphasis is to benchmark. Another change needed in higher education is how decisions
are made and innovations selected; these are usually done without the active
involvement of those who will implement the changes and without data to support the
investments the innovations take (Tucker, 1996, p. x). Benchmarking directly addresses
these needs as well.
Thus, because benchmarking is a new method that has proven successful in many
areas, six presidents of mid-sized community colleges in Texas participated in
conducting a benchmark study among their institutions.
Definitions
TQM or Total Quality Management is defined as the theory of customer-centered,
quality-driven, participatory management developed by W. Edwards Deming during his
work in Japan. Deming bases his theory of management on fourteen points which, if
followed, are designed to improve quality, productivity, and competitive position by
continuous improvement in customer satisfaction through an integrated system of tools,
techniques, and training.
12
CQA or Continuous Quality Assurance is defined as the aim of an organization, in
this study a mid-sized community college, to adapt TQM to its culture, including
improving quality through continuous change.
Continuous Improvement is defined as a "constantly, never ending striving to do
better" (Hammons, 1994, p. 337).
Mid-sized community college is defined as a community college with annual
student headcount between 5,000 and 10,000, contact hours between 2 million and
6 million, appropriations between 9.5 million and 13 million dollars, and a primary
service area population of between 75,000 and 200,000.
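The four numeric criteria in this definition can be read as a simple membership test: an institution is mid-sized only if it falls inside all four ranges. The following sketch is illustrative; the sample institution's figures are hypothetical.

```python
# Minimal sketch of the "mid-sized community college" criteria defined above.
# The sample figures passed in below are hypothetical.

def is_mid_sized(headcount, contact_hours, appropriations, service_population):
    """True only if an institution meets all four mid-sized criteria."""
    return (5_000 <= headcount <= 10_000
            and 2_000_000 <= contact_hours <= 6_000_000
            and 9_500_000 <= appropriations <= 13_000_000
            and 75_000 <= service_population <= 200_000)

# A hypothetical college inside every range qualifies as mid-sized.
print(is_mid_sized(7_200, 3_400_000, 11_000_000, 120_000))  # → True
```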
Benchmarking is defined as the TQM technique used to facilitate continuous
improvement by analytically comparing ongoing processes in an effort to manage change.
Peer Benchmarking is defined as the process of analytically comparing peer
organizations in order to discover current trends and show institutional gaps in
performance where improvement can occur. This definition combines business's
definitions of competitive benchmarking, as defined by Spendolini (1992) as measuring
against two or three direct competitors with the same customer base, and industry
benchmarking, as defined by Leibfried and McNair (1992) as looking for general trends
across a larger group of related firms who have similar interests and technologies. But
the goal of both is to provide external standards for internal measurement and
institutional improvement.
Process is defined as a series of actions or operations that leads to a particular
result. Jablonski (1992) defines a process in TQM "as a series of operations linked
together to provide a result that has increased value" (p. 52). Key processes are "critical
functions that determine how well an organization succeeds (e.g., admissions, advising,
teaching, and placement)" (Hammons, 1994, p. 338).
Trend is defined as a common approach to development, operation, or growth
being utilized by a majority of the colleges studied, thus indicating a degree of efficiency
in performance.
Malcolm Baldrige National Quality Award is defined as the annual United States
national quality award established by the Malcolm Baldrige National Quality
Improvement Act of 1987, signed by President Reagan and administered by the
Department of Commerce. The purposes of the award are to promote quality awareness,
to recognize quality achievement of United States companies, and to publicize successful
quality strategies (Jablonski, 1992). In 1995, the Malcolm Baldrige National Quality
Award Education Pilot was launched; "it implies improvement compared to peer
institutions and appropriate benchmarks" (Wolverton, 1994, p. 14).
Significance of Study
Tradition and intuition, used often as a basis for making changes, are really no
replacement for objective external comparison and analysis, or benchmarking.
Benchmarking is emerging in leading-edge companies as a tool for obtaining the
information needed to support continuous improvement and gain competitive advantage.
Recent surveys across major countries indicate that over two thirds of leading companies
are conducting benchmarking on a regular basis (Cook, 1995, p. 11). Leibfried and
McNair (1992, p. 3) illustrate that management is looking beyond the basics for ways to
improve upon existing products and services, and benchmarking provides an excellent
way to do this and survive. The marketplace in both business and higher education today
expects, and is receiving, constantly improving products and services at an ever-
decreasing cost per function. Such reports suggest that benchmarking is becoming a key
management technique for the 1990s and beyond because it provides a new vision and
perspective on traditional management concerns.
Peterson (1993) predicts that colleges and universities "can never again expect to
receive as much of their funding from tax sources as in the past and must now convince
private funding sources that higher education is a worthwhile investment" (p. 12). If
community colleges begin to search for funding from local businesses and industries,
they must justify those contributions in terms of what the college can do for them. He
further notes that in fulfilling the community college mission, "it is essential that college
services of all types, as well as curriculum and instruction, be continuously assessed and
enhanced" (p. 15). This study has the potential of aiding community colleges in
developing a continuous improvement tool that will lead to better organizational
effectiveness.
The challenge facing community colleges, and indeed all colleges, is how best to
bring about needed and continual changes in areas of rapid transformation such as
operational efficiency, technology and an increased knowledge base, demands for
accountability, and life-long learning opportunities. Quality management principles and
processes such as TQM (Total Quality Management), CQI (Continuous Quality
Improvement), and CQA (Continuous Quality Assurance) have already provided the way
and the language to meet these challenges in some higher education institutions. But
effective methods of bringing about continuous improvement, such as benchmarking, are
still needed. Benchmarking has provided the means for making comparisons to assist in
meeting the challenges of these transformations. For example, benchmarking benefits
higher education in the area of accountability by encouraging the institutions to rethink
and reorient processes to improve the quality and reduce costs.
Thus, this study of the development of a benchmarking process in mid-sized
community colleges in Texas will provide college administrators an evaluation of both
the design and methods of benchmarking which can be used in institutions of higher
education to make operational improvements. It also provided the community colleges
who participated in the study with data and benchmarks of practices in like institutions,
as well as providing other colleges with knowledge of existing practices in mid-sized
community colleges in Texas. This study is important because
• it is the first of its type to be conducted among community colleges; therefore, it will
provide valuable information on how benchmarking can be used in community
college educational settings;
• it will furnish guidelines for community colleges in applying the process of
benchmarking to their operations;
• it will test the usefulness of benchmarking for community colleges;
• it will add to the research and literature in the field of higher education;
• it will produce examples of efficient practices being used in community colleges;
• it will provide standards for participating mid-sized Texas community colleges to
measure themselves against.
selection, the instrumentation, the pilot study, the data collection, the data analysis, and
the follow-up survey are reviewed. In Chapter IV, the findings that answer the research
question and the six sub-questions are presented. In Chapter V, the conclusions,
recommendations, and implications for further study are discussed.
CHAPTER II
REVIEW OF LITERATURE
Theoretical Background
Total Quality Management (TQM) is the theory being adopted to improve both
American industry and higher education. The theory's primary founder was Dr. W.
Edwards Deming, who joined with Dr. Joseph Juran after World War II. They introduced
the Statistical Quality Control (SQC) concept of management, a statistical theory
originated by Sir Ronald Fisher but used during World War II by Walter Shewhart, a
Bell Laboratories physicist, to develop the zero-defects approach to producing
telephones. Deming, who had worked with Shewhart, developed his own version of SQC,
which he introduced and refined in Japan (Jablonski, 1992, p. 29). Joseph M. Juran's
main contribution was defining and teaching about how to create customer-oriented
organizational systems (Sashkin & Kiser, 1993, pp. 37-38). Both Deming and Juran saw
quality as a management function that could be systematically improved by using
statistical tools, consumer research, goal-setting, team work, problem solving, human
resource management, and strategic planning (Seymour, 1993, pp. viii-ix).
In Japan the two were joined by Kaoru Ishikawa, the early Keidanren leader who
helped Japanese executives identify the need for quality and who invited Deming to teach
Japanese business executives how to achieve quality; Ishikawa is known for his efforts to
spread the use of TQM techniques through his technical books that describe TQM tools
and his new tool, the Fishbone Diagram. The term "total quality control" was actually
coined by another pioneer of the theory, Armand V. Feigenbaum (Sashkin & Kiser,
1993, p. 37). Deming and the others emphasize that quality enhancement
throughout an organization is the basis for profit. This concept was first recognized by
the Japanese as the strategy needed to solve their economic crisis after World War II, and
thus, Japanese companies made the first total commitments to quality enhancement
(Peterson, 1993, p. 5).
Since the work in Japan, Deming (1986) has refined his theory of management for
the improvement of quality, productivity, and competitive position into fourteen points:
1. Create constancy of purpose toward improvement of product and service,
with the aim to become competitive and to stay in business and to provide
jobs.
2. Adopt the new philosophy. We are in a new economic age.
3. Cease dependence on inspection to achieve quality. Eliminate the need for
inspection on a mass basis by building quality into the product in the first
place.
4. End the practice of awarding business on the basis of price tag alone.
Instead, minimize total cost by moving toward a single supplier.
5. Improve constantly and forever the system of production and service, to
improve quality and productivity, and thus constantly decrease costs.
6. Institute training on the job.
7. Institute leadership.
8. Drive out fear, so that everyone may work effectively for the company.
9. Break down barriers between departments.
10. Eliminate slogans, exhortations, and targets for the work force asking for zero
defects and new levels of productivity.
11. Eliminate numerical quotas for the work force and numerical goals for
management.
12. Remove barriers that rob people of pride in workmanship by abolishment of
the annual or merit rating and of management by objective.
13. Institute a vigorous program of education and self-improvement.
14. Put everybody in the company to work to accomplish the transformation.
(pp. 23-24)
He adds that application of his theory produces a chain reaction: First, quality improves; second, costs decrease because of less rework, fewer mistakes, and fewer delays and snags; third, materials are used better; fourth, productivity improves; fifth, the market is captured with better quality and lower prices; and sixth, the organization stays in business and provides jobs (Sashkin & Kiser, 1993). Deming (1986) further argues that the
consumer is most important; and therefore, quality should be aimed at the consumer's
needs both present and future. He predicts that companies that adopt constancy of
purpose for quality, productivity, and service, and go about it with intelligence and
perseverance, have a chance to survive in the global market.
This fourteen-point theory is complemented by Deming's seven deadly diseases: lack of constancy of purpose, an emphasis on short-term profits, individual performance evaluations, managers who are highly mobile, use by management of numbers and figures that are visible and available, excessive medical costs, and excessive legal liability costs. Sashkin and Kiser (1993) found that these diseases in management "refer to beliefs, policies, and practices so firmly entrenched that many, perhaps most, American managers regard them as basic truths" (pp. 34-35). Such basic truths for a time hindered American industry from competing successfully worldwide. Thus, the roster of founders of the quality approach includes Deming, Kaoru Ishikawa, Joseph M. Juran, and Armand V. Feigenbaum.
Philosophy of TQM
This new philosophy emphasizes a few guiding principles and applies to both
large and small institutions. TQM allows an organization to set expectations higher than
in the past, to recognize and remove barriers to change, and to enable high-level
managers to solicit the opinions and ideas of their associates and do something with those
good ideas. Supporting the philosophy of TQM are both qualitative and quantitative
tools which allow a better understanding of the way business is conducted. Such tools allow for measuring the improved quality required by TQM's continuous improvement, as well as for recognizing when the improved productivity, performance, efficiency, and quality goals are achieved (Jablonski, 1992).
Jablonski (1992) reports that the U. S. General Accounting Office conducted a
review of the 20 highest-scoring applicants for the Malcolm Baldrige National Quality
Award in business over a two-year period to evaluate the impact of TQM practices on
their organizations. In nearly all cases, those using TQM techniques achieved better
employee relations, greater customer satisfaction, higher productivity, improved
profitability, and increased market share.
TQM Application
TQM is a "cooperative form of doing business that relies on the talents and
capabilities of both labor and management to continually improve quality and
productivity using teams" explains Jablonski (1992, p. 71). He further declares that three ingredients are necessary for TQM to flourish in any company: participative management, continuous process improvement, and the use of teams. Participative management is defined as "arming your people with the skills and support to better understand how they do business, identifying opportunities for improvement, and making change happen..." (p. 71). Continuous process improvement (CPI), a major component of TQM theory, is defined as "accepting small, incremental gains as a step in the right direction toward total quality....Continuous process improvement reinforces a basic tenet of TQM—long-term focus" (p. 72). TQM also involves teams, which should
include a "cross-section of members who represent some part of the process under study"
(p. 43). Business and industry have successfully applied these principles to their
operations.
In the 1980s, higher education institutions began to apply the TQM principles.
The early higher education institutions that tried the switch to quality management were
a mixture of community colleges, four-year colleges and universities, and four-year
public schools. Although distinctively different, each institution found the quality
principles appropriate to its situation. A number of institutions have been surveyed and
have provided in-depth information about why they became interested in the quality
movement (American Association, 1994; Seymour, 1993; Seymour & Collett, 1991).
Freed, Klugman and Fife (1997) provided a list of institutions that participated in such
studies: Alabama State University, Arkansas State University, Arkansas Tech University,
Babson College, Belmont University, Brazosport College, Central Connecticut State
University, Clemson University, Colorado State University, Cornell University, Dallas
County Community College District, Delaware County Community College, Duke
University, El Camino Community College, Fordham University, Fox Valley Technical
College, George Mason University, Georgia Institute of Technology, Grand Rapids Junior
College, Lamar Community College, Maricopa County Community College District,
Marietta College, Northwest Missouri State University, Oregon State University, The
Pennsylvania State University, St. John Fisher College, Samford University, Southern
College of Technology, United States Air Force Academy, University of Chicago,
University of Illinois-Chicago, University of Maryland, University of Minnesota-Duluth,
University of Minnesota-Twin Cities, University of Montevallo, University of
Pennsylvania, University of Tampa, University of Wisconsin-Madison, University of
Wyoming, and Winona State University (p. 15).
In response to this increased interest in the quality movement during 1990 to
1993, the American Association for Higher Education (AAHE) and the William C. Norris
Institute collaborated in January of 1993 to establish the Academic Quality Consortium
(AQC) with the purpose of providing campuses committed to implementing quality in
higher education "with the opportunity to exchange information, build on one another's
experiences, expand on assessment practices already used, and share the results of their
work with the wider higher education community" (Freed, Klugman & Fife, 1997, p.
29). One way the AQC shares its experiences and knowledge of quality practices is
through the AAHE Continuous Quality Improvement (CQI) Project, an e-mail discussion
list, CQI-L, open to all educators interested in quality improvement in higher education.
The group began with approximately 50 members and had grown to approximately 750 by fall 1996.
Calek (1995) has also documented the growing interest in quality principles in
the annual Quality in Education Survey by Quality Progress. Since 1991, Quality
Progress has conducted a survey to determine how many community colleges and four-
year public and private colleges and universities offer courses in quality improvement and
whether they apply the quality principles in managing their institutions. The number of
participants has steadily increased over the five years: community colleges from 14 to 83
and four-year colleges and universities from 78 to 220. Eighty-eight percent of the four-
year colleges and universities that responded in 1995 reported using the quality principles
to manage their administrations, 55 percent offer quality-related certificates, minors, or
degrees, and 42 percent do both. Among community colleges, 91 percent of the
respondents use quality principles to manage their administrations, 66 percent offer quality-related certificates, minors, or degrees, and 52 percent do both.
The American Association for Higher Education (1994) noted the growth of the
total quality management movement in colleges and universities in the late 1980s and
early 1990s. At that time, institutions questioned whether the quality movement was appropriate for education. Just a few years later, administrators did not ask whether the quality
movement was appropriate, but how the quality principles could be made relevant and
worthwhile on their campuses. Freed, Klugman and Fife (1994) recorded the growth of
the quality movement in higher education in a national survey on TQM on campus. In this survey of over 400 institutions that were identified as having shown interest in the quality principles, 25 percent of the responding institutions reported that they had begun implementation of the principles in 1990 or before, 50 percent reported that they began in 1991 or 1992, and 25 percent reported that they began in 1993 or early 1994.
But in implementing TQM, many institutions have experienced failure. Creating a culture for academic excellence by implementing the quality principles is not easy in higher education institutions, "as the strong historical traditions in higher education make any kind of change extremely difficult" (Freed, Klugman, & Fife, 1997, p. 9). These
authors further explain,
Most institutions have missions, but most are not accustomed to
measuring the outcomes of their processes. Traditionally, constituencies
in higher education institutions act independently rather than
interdependently. Leaders are usually not trained in the tools and
techniques used to improve systems and processes. Developing
management skills and knowledge is not the norm in higher education.
Professional development is more often discipline- and person-specific
rather than oriented toward developing members who can collectively
improve institutional processes. Although data are collected for a variety
of purposes in directing higher education institutions, the quality
principles emphasize the systematic collection of data to be used in
making academic and administrative decisions. Committees in academe
are common, but collaborating and working as teams for common
purposes are not. (pp. v-vi)
Carr and Johansson (1995, p. 16) found that TQM is process-focused, emphasizes
teams and mutual values, and uses shared tools for problem-solving. In practice, TQM
tries to improve processes and includes the continuous drive for quality improvement as a
part of the fabric and culture of an entire organization (Parfett, 1994, p. 160). Thus the process, methods, and language of total quality management were adopted by U.S. industry to develop a method of quickly finding and implementing practices that would achieve world-class results, and higher education has adapted these techniques to its institutions.
Brown, Hitchcock, and Willard (1994) studied TQM failures and reported that if there had been a failure, it was not one of philosophy but one of implementation. They cite three phases of TQM implementation: start-up, alignment, and integration. They also state, "It's no wonder that somewhere between one-half and three-quarters of the organizations implementing TQM drop their initiatives within the first two years" (p. 1).
Then they list reasons for possible failure in each phase. In the start-up phase the authors cite "lack of management commitment, poor timing and pacing, wasted education and training, and lack of short-term bottom-line results" (p. 2). In the alignment phase they point to divergent strategies, such as employees thinking that quality is separate from work, as well as "inappropriate measures, outdated appraisal methods, and inappropriate rewards" (p. 71). And in the integration phase, they list "failing to transfer true power to employees, maintaining outmoded management practices, poor organization and job design, outdated business systems and failing to manage learning and innovation diffusion" (pp. 138-139).
Applying the principles of TQM to higher education has not been foolproof, but it has proven successful in many institutions.
of the liberal arts, and the steady climb of tuition, have caused legislative commissions
and assessment programs to begin investigating the value of the college degree (Seymour, 1993). The public and governments are scrutinizing higher education very closely and
are demanding greater accountability in areas of student retention, especially retention of
minority students, and the need for remedial assistance in order to succeed in college-
level courses (DeCosmo, Parker & Heverly, 1991). The standard response has been to
revisit time-worn cost containment and policy options. Institutions have become
accomplished experts in crisis management rather than simply asking, "Is there a better
way to manage higher education?" (Seymour, 1993, p. viii).
Hubbard (1993) points to the first three institutions that became involved in the
quality movement: Northwest Missouri State University in 1984, Fox Valley Technical
College in 1986, and Oregon State University in 1989. Although these three institutions
were different in size, location, and type of degrees offered, they all had presidents who
developed an interest in the quality movement, became champions of quality within their
own institutions, and applied the quality principles with a long-term commitment
(Hubbard, 1993). Many other colleges and universities have now agreed with Seymour
(1993) who said that TQM is "too well-grounded in a scientific approach to problem
solving, and it has been tested, scrutinized, and revised in thousands of organizations over
a period of three years. Bottom line: It works" (p. ix). He goes on to conclude that
"quality can be actively and aggressively managed into our colleges and universities" (p.
x), because quality is being managed into operations and attitudes in higher education
"with a sense of purpose, urgency and excitement" (p. 3). Then he cites examples of
higher education institutions which since 1985 have adapted TQM principles with the
concept of continuous, systematic improvement to their campuses: the University of
Wisconsin-Madison, North Dakota University System, Oregon State University, and
Delaware County Community College.
Because the goals of private industry and universities differ from those of community colleges, Peterson (1993) observes that the strategic planning models, such as Management by Objectives, that almost every institution of higher education implemented during the 1980s did not fit very well into community colleges' structure and mission.
However, since then community colleges have recognized the importance of long-range
planning in maintaining their mission as opportunity colleges. During this time they have
evolved from junior to community colleges and their mission has become more diverse.
Due to this evolution, systematic long-range planning and improvement have become more necessary to maintain community colleges' effectiveness. Peterson (1993) writes
that it is "apparent that the basic tenets of this concept [TQM] are applicable to non-profit
organizations" (p. 3) such as colleges and universities, but institutions of higher education
are unique in many ways and must develop their own methodology in implementation.
He calls this new methodology Continuous Quality Assurance (CQA) which is TQM
adapted to colleges and universities.
DeCosmo, Parker and Heverly (1991) agree that community colleges can provide
"an affordable avenue to excellent programs and services" (p. 14) by adopting the principles and methods embodied in TQM on their campuses. These authors conclude:
Solving today's problems requires the intelligence and hard work of
everyone. TQM offers a paradigm that encourages effective participation.
To achieve giant strides in both quality and efficiency, everyone must be freed to pursue quality and accept responsibility for it, to work together in teams, and to listen to internal and external customers. (p. 14)
Seymour (1993) suggests that TQM is a structural system that can help create a true
learning organization which involves learning how to improve the registration of students
and learning how to improve campus maintenance (p. 31).
A TQM environment calls for an unrelenting desire for quality, teamwork,
long-term thinking, rewards for results, and empowered employees. In
many colleges, there is a willingness to settle for less,
compartmentalization and competitiveness between individuals and units,
short-term thinking, poor and nonexistent reward systems, and external
pressures for accountability, bureaucratic processes and "layered"
administration....TQM is not for everyone....I am thoroughly convinced
that TQM is an extremely worthwhile approach to managing a community
college. In my view the long-term benefits of TQM far outweigh its costs.
(Hammons, 1994, p. 344)
Schauerman (1994) describes El Camino College's struggle to overcome resistance to
change and implement TQM on his campus, and then concludes that improved service
processes, more productivity by assigning monetary and human resources more wisely
and getting people to work together and to respect each other's strengths are benefits of
having a TQM culture. In writing of her experience in implementing TQM on her
campus, Thor (1994) also chronicles the barriers to change that must be overcome in
implementing TQM. She summarizes by saying, "the process of continuous change has
begun to make its mark on American education, and employees and students nationwide
are benefitting from that beginning" (p. 367). In relating the establishment of TQM at
Jackson Community College, LeTarte and Schwinn (1994) relate that a basic belief in
TQM is a belief in positive change.
TQM Benchmarking
Benchmarking focuses on the processes used by competitors, as well as industry
trends, to identify opportunities for continuous improvement, not just for price cutting, observed Leibfried and McNair (1992). They further explain that benchmarking
transforms the change process into a continuous improvement effort focused on process
learning at the organizational level, which leads to continued innovation and change.
Because it is based on clear, two-way communication and participation between partners,
benchmarks cannot be established without a comprehensive understanding of current
practice within an organization, desired results, and the recognition and acceptance of the
changes that will need to occur to meet and exceed those goals.
Leibfried and McNair (1992) also show that benchmarking is a dynamic, ongoing
effort by management and workers alike that contains the seed of organizational and
cultural change that must occur if survival, let alone competitive excellence, is to be
achieved. Thus, the goal is to put benchmarking to work to achieve world-class
competitive capability. Tucker (1996) found that the foundation of benchmarking is
"education or transferring learning from one group to another" (p. xiv).
the Xerox Corporation in response to increased competition and a rapidly declining
market share. Cook (1995) reports that Xerox used benchmarking whenever it found
something that someone else did better, by adopting the new level of performance as a
new base standard in its own operations. The guiding principle was "Anything anyone
else can do better, we should aim to do at least equally well" (p. 14). Xerox
Reprographics Manufacturing Group president, Charles Christ, sent a team to Japan
because
I needed a benchmark, something that I can measure myself against to
understand where we have to go from here. This competitive benchmarking
resulted in specific performance targets rather than someone's guess or
intuitive feel of what needs to be done-which is the real power of the
process. (Leibfried & McNair, 1992, p. 20)
The use of benchmarking grew rapidly in industry throughout the US during the
1980s as many companies recognized the need to improve the quality of their output and
business performance. This development naturally followed from TQM's emphasis on
process. Benchmarking became a recognized tool in the development of a continuous
improvement process. Other companies such as FedEx and IBM, along with health care organizations and hotels, began to use benchmarking. Tucker (1996) declares that almost all of the Fortune
500 companies have done at least one benchmarking project, and many use it on a regular
basis.
UK companies with US connections began benchmarking during the early 1990s
(Cook, 1995). Organizations such as Xerox, Digital Equipment Company and Milliken Industrials used and continue to use this technique as an integral part of their corporate strategy. Cook (1995) further explains that in 1991 the first benchmarking seminar was held by the British Quality Association. Almost all of these organizations expect to increase their investment in benchmarking over the next five years.
publicize successful quality strategies" (Spendolini, 1992, p. 5). Organizations
competing for it are required to document their benchmarking projects in several sections
of the application. Thus, one issue raised by the Baldrige award is that of external comparisons. Wolverton (1994) further adds that "internally, this means year-to-year improvement; externally, it implies improvement compared to peer institutions and appropriate benchmarks" (pp. 14-15). For example, The Malcolm Baldrige National Quality Award: A Yardstick for Quality Growth by Heaphy and Gruska (1995) describes category 2.0 of the award, Information and Analysis, as the guideline which examines the scope and use of data as well as the benchmarking activities of the company. Under the heading Competitive Comparisons and Benchmarks, the right benchmark data must be
chosen for collection and analysis, and then used to guide decisions. Benchmarking,
examined in this category, is an approach for learning what other organizations are doing
and for getting new ideas. This section, 2.2, is worth fifteen points and includes two areas which directly address benchmarking.
In 1995 after two years of research and input by educators and organizations such
as the Association for Supervision and Curriculum Development and the American
Association of School Administrators, information about the upcoming Baldrige Quality Award for Education was released by the U.S. Department of Commerce for the Malcolm Baldrige National Quality Award Education Pilot. In this new Baldrige education pilot criteria, all seven categories relate to information and analysis that support overall
mission-related performance excellence. This includes benchmarking and peer
comparison (p. 49). Many leaders such as Tucker (1996, p. 6) expect that this will
radically change the way we assess school effectiveness.
Definitions of Benchmarking
Benchmarking corresponds to the human learning experience in teaching an organization to improve, and continuous improvement brings needed changes that keep the institution competitive. David Kearns, CEO of Xerox Corporation, the American pioneer of benchmarking, defines benchmarking as "the continuous process of measuring our products, services, and practices against our toughest competitors or those companies known as leaders" (Leibfried & McNair, 1992, p. 10). Tom Peters in Thriving on Chaos uses the term "creative swiping" to describe "the process of looking beyond your own organization for other ideas...that may be adapted and enhanced to fit your special circumstances" (p. 25).
Robert Camp (1995) shares the dictionary definition of a benchmark: "a standard against which something can be measured. A survey mark of previously determined position used as a reference point" (p. 18). He further defines benchmarking as a 10-step process which is "a structured way of looking outside to identify, analyze, and adopt the best in the industry or function" (p. 18). Benchmarking may be descriptive and be
converted into performance measurements that can show the effect of adopting the
practice.
Kempner (1993) defines benchmarking more concisely as an ongoing, systematic
process for measuring and comparing the work processes of one organization to those of
another, by bringing an external focus to internal activities, functions or operations.
Leibfried and McNair (1992) agree that benchmarking brings an external focus, "a
movement away from a concern with cost reduction and budgets, to an understanding of
what activities customers value and what level of performance they expect" (p. 2). Shafer
and Coate (1992) define benchmarking as a "positive process that provides objective
measurements for baselining (setting the initial values), goal-setting, and improvement
tracking, which can lead to dramatic innovations" (pp. 28-35). Tucker (1996) defines
benchmarking as the process used to "achieve superior performance through a team
research and data-driven process by which learning and innovation trigger fundamental
breakthroughs in thinking and practice" (p. ix).
Dertouzos, Lester, Solow, and the MIT Commission on Industrial Productivity (1989) define benchmarking as the "ongoing, systematic process for measuring and comparing the work processes" of one organization to another (p. 84). Cook (1995) defines benchmarking as:
the process of identifying, understanding and adapting outstanding
practices from within the same organization or from other businesses
to help improve performance. This involves a process of comparing
practices and procedures to identify ways in which an organization can
improve. Thus new standards and goals can be set which, in turn, will
help better satisfy the customer's requirements for quality, cost, product
and service. (p. 13)
Thus through benchmarking an organization can add value for its customers and distinguish itself from its competitors.
Superintendent's Quality Challenge (1994) defines benchmarking as an
"improvement process in which an organization compares its performance against best-
in-class organizations, determines how those organizations achieved their performance
levels, and uses the information to improve its own performance by benchmarking strategies, products, programs, services, operations, processes and/or procedures" (p. 29).
Regardless of how benchmarking is defined, all agree that it is an analytical process of
comparison used as a tool for continuous improvement in an effort to manage change.
The purpose of the benchmarking process is to provide managers or administrators with
external points of reference or standards for evaluating the quality and cost of their
organization's internal activities, practices, and processes. Rush (1994) affirms that
benchmarking strives to provide an institution with a
demonstrable cost or quality advantage by replacing management intuition
or 'gut feel' with facts and analysis that foster better operational practices.
It attempts to answer the following questions: How well are we doing
compared to others? How good do we want to be? Who's doing the best?
How do they do it? How can we adapt what they are doing to our
institution? How can we be better than the best? (pp. 84-85)
Furthermore, Alstete (1995) explains that the goal of benchmarking is to "provide key
personnel, in charge of processes, with an external standard for measuring the quality and cost of internal activities, and to help identify where improvement opportunities exist" (p. iii).
Types of Benchmarking
Research shows that six basic types of benchmarking studies can be conducted, depending upon the data needed: internal; competitive, including peer; industry; noncompetitive; best practice; and functional/generic. Internal benchmarking is defined by Cook (1995) as the "measuring and comparing of organizational data on similar practices from different departments of the same organization" (p. 18). Alstete (1995) applies this definition to higher education by explaining that internal benchmarking can "best be conducted at large, decentralized institutions where there are several departments or units that conduct similar processes" (pp. iv-v).
Competitive benchmarking as defined by Spendolini (1992) is "measuring an
organization against direct competitors who are selling to the same customer base" (p.
17). Rush (1994) adds that the intent is to find an "extemal standard against which the
institution can compare itself" (pp. 89-90). Such benchmarking is used to establish performance standards and to detect trends in the competitive environment, and Leibfried and McNair (1992) found that it usually includes two or three of a firm's closest
competitors.
Industry benchmarking as defined by Rush (1994) is "identifying trends and
providing insights into how those trends are being set" (p. 90). Leibfried and McNair (1992) show that it looks for general trends across a much larger group of related firms than competitive benchmarking, using a "more general procedure that compares a firm
against companies with similar interests and similar technologies, attempting to identify
product and service trends rather than current market share rankings" (p. 116). This is
similar to competitive benchmarking, but differs in that the scope is broader to allow
looking at more institutions. But the goal is again to provide external standards for internal measurement and to identify the best operational practices that could be adopted
or adapted at an institution.
Noncompetitive benchmarking as defined by Cook (1995) is the process of
measuring and comparing "a related process in a noncompetitive organization, a related
process in a different industry or an unrelated process in a different industry" (p. 19).
Best practice/world class benchmarking is defined also by Cook (1995) as the process of
"learning from best practice/world-class organizations, leaders of the process being
benchmarked" (p. 20).
This type of benchmarking is also called functional/generic benchmarking by
Spendolini (1992, p. 17). Rush (1994) differentiates between functional and generic
benchmarking by saying that "functional benchmarking uses a large group of competitors
and is more broadly defined while generic uses the broadest application of data collection
from different industries to find the best operations practices available" (p. 11). Alstete
(1995) reveals that the selection of the benchmarking type depends on the "processes
being analyzed, the availability of data, and the available expertise at the institutions" (pp.
iv-v).
1992; Seymour, 1993a, 1993b, 1994, 1996; Teeter & Lozier, 1993; Williams, 1993), but
little research has been documented about benchmarking (Kempner, 1993; Shafer and
Coate, 1993; Shaughnessy, 1993).
Evidence of growing interest in benchmarking is found in the National Postsecondary Education Cooperative (NPEC) Council Meeting Proceedings of December 9-10, 1996, where a
Research Associate with the State Higher Education Executive Officers began a
discussion with an overview of benchmarking including its definition, processes, and
current national efforts. The discussion was to center on a possible benchmarking study
by the council. However, most participants expressed views that many other
organizations could be more effective than NPEC, such as institutional departments,
individual institutions, or postsecondary systems. Some authors claim that benchmarking
used with other improvement initiatives has the capacity to move business processes in
higher education forward at a faster pace than other approaches used alone (Douglas,
1997; Shafer & Coate, 1992). Tucker (1996) supports the idea that individual institutions
can be more effective than large organizations at addressing benchmarking studies. She
explains that what's missing from school reform efforts is a "careful analysis of what is
and isn't working in the individual current system and an infusion of ideas and
documented practices that work in similar settings" (p. x).
Some examples of benchmark studies do exist in higher education. Alstete (1995) reports that graduate business schools, professional associations such as the
National Association of College and University Business Officers (NACUBO), and the
Association for Continuing Higher Education (ACHE), independent data sharing
consortia, private consulting companies, and individual institutions are all conducting
organizational benchmarking projects. The broad-based NACUBO benchmarking
program (NACUBO, 1995) was begun in late 1991, in collaboration with Coopers &
Lybrand, to establish a national database of key benchmark data for a variety of financial
and administrative functions within four-year higher education institutions. It seeks to
provide participants with an objective basis for improved operational performance by
offering the best practices of other organizations. Between 1991 and 1995, 344
institutions participated in the NACUBO study (Haack, 1998). The current project
analyzes core functions of colleges and universities, such as accounting, admissions,
development, payroll, purchasing, student housing, and others. The information does not
yet include academic department benchmarks, but it does provide data for 38 functional
areas. The purpose of the NACUBO benchmarking project is to "provide colleges and
universities with current operational information and benchmarks as they consider
various restructuring and cost-reduction initiatives" (Rush, 1994, pp. 94-95). Haack
(1998) found that institutions involved with this study of benchmarking reported positive
changes in higher education business practices. The approach of using the benchmark
data in a non-traditional manner, not to justify more money from states, students,
or donors but to cut costs and increase quality, is unique to higher
education (Blumenstyk, 1993; Massy & Meyerson, 1994).
Reporting on his experience with benchmarking, Edwin J. Merck, Vice President
for Finance and Operations at Wheaton College, says:
We had a lot of good ideas for improvements internally, but we got
a fuller spectrum by looking for best practices....We contacted some
similar sized schools to see how they handled the coordination issues [of
the financial aid letter and student's bills], and learned that all had
grappled with this problem; there were a wide range of possible
approaches, and no one school was completely satisfied with its solution.
Taking some comfort in the universality of this problem, we built upon the
ideas of others, which enabled larger improvements at a quicker pace. In
turn, we shared our results; several of the originally 'benchmarked'
schools have found our solution a useful enhancement to implement or
adapt. It's really not necessary for all of us to reinvent the wheel with
every problem. If we aggregate creative resources, we'll all be able to
spend less time solving each problem, thus allowing capacity for tackling
more problems with limited resources. (Kempner, 1993, p. 22)
The Association for Continuing Higher Education (ACHE) and graduate business
schools have also conducted specialized benchmarking studies that focus on the
"processes and practices concerning their institutional departments" (AACSB & Alstete,
p. 45). According to Wolverton (1994), the Colleges of Business at Arizona, Northwest
Missouri, and Samford all benchmark. Arizona has tried to gather data from peer
institutions on college specific priorities, but such benchmarking at the undergraduate
institutional level remains limited because "the collection of relevant data about peer or
better baccalaureate programs has not been accomplished" (COB, 1994, p. 34).
Colleges and universities can use benchmarking in bringing about changes needed
for school improvement. Benchmarking, explains Tucker (1996), is not a mechanism for
reducing a school's budget; instead, it is a
mechanism for deploying resources in the most effective manner to
achieve customer satisfaction. It is not a cookbook program that requires
only a recipe for success. Instead, it is a discovery and learning process
that can be used over and over again to achieve different goals. It is a way
of working and thinking to achieve continuous improvement in a college. (p. 3)
Benchmarking can be used to address specific college strategic goals more rapidly.
Tucker (1996) believes that it can be used to "prevent 'wheel reinvention' and learn
how other organizations have addressed the issues they are facing" (p. 8). She also states
that federal and state mandates can be implemented very efficiently through a benchmark
study, and that while conducting a benchmark study an institution can practice one of the
fundamental improvement strategies required in the Malcolm Baldrige Quality Award
criteria.
Benchmarking is being used for improving administrative processes as well as
instructional models at colleges and universities, by examining processes and models at
other schools and adapting their techniques and approaches (Chaffee & Sherr, 1992;
Clark, December 5, 1993). Such a project can provide "an objective basis for improved
operational performance measurement, a tool for change within the institution, a 'pointer'
to the best practices of other organizations, and a means to bring about change more
quickly" (Rush, 1994, p. 96). Benchmarking is a practical approach for a college or
university to use to assess how its resources are being spent. It can be a "vehicle for
promoting substantive, change-oriented action within an institution by providing
compelling evidence of the need to change" (Rush, 1994, p. 96). A benchmark study
offers the following advantages to an institution: "It sets performance goals. It helps
accelerate and manage change. It improves processes. It allows individuals to think 'outside
the box'. It generates an understanding of world-class performance" (Cook, 1995, p. 1).
Due to its reliance on hard data and research methodology, benchmarking is
especially suited for institutions of higher education where these types of studies are
usually very familiar to faculty and administrators. Practitioners at colleges and
universities have found that "benchmarking helps overcome resistance to change,
provides a structure for external evaluation, and creates new networks of communication
between schools where valuable information and experiences can be shared" (AACSB,
1994, pp. 16-17).
Summary
This review of literature has included discussions of the theoretical background of
Total Quality Management, TQM use in higher education, TQM's continuous
improvement concept, TQM benchmarking, and higher education's benchmarking.
CHAPTER III
METHODOLOGY
Sample Selection
Because benchmarking is useful for evaluating several like processes at once and for
producing institutional trends only when similar institutions are compared, mid-sized
Texas community colleges were selected. Community colleges were selected for this study
primarily by considering student and community populations and the appropriations or funds
available for operation. In order to select the colleges to be studied, statistics were
gathered on student headcount, semester contact hours generated, community population
base including both the city size and the surrounding county sizes, and appropriations.
Thus, the level of money available for operation and the population served by higher
education operations were the primary considerations in trying to equate schools with
comparable means.
Mid-sized community colleges were defined as community colleges with student
headcount between 5,000 and 11,000, contact hours between 2.5 million and 5.5 million, and
appropriations between $9.5 million and $13 million, with a primary service area population
of between 70,000 and 210,000. Colleges fitting these criteria are found in Table 3.1.
Nine colleges fell into the mid-range of statistics being used for selection.
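The four numeric criteria above amount to a simple screening rule. As a minimal sketch of that rule (the college names and statistics below are invented for illustration, not the figures from Table 3.1), the selection could be expressed as:

```python
# Hypothetical statistics for candidate colleges; the real figures
# appear in the study's Table 3.1.
colleges = [
    {"name": "A", "headcount": 7200, "contact_hours": 3.1e6,
     "appropriations": 10.4e6, "population": 120_000},
    {"name": "B", "headcount": 15000, "contact_hours": 7.0e6,
     "appropriations": 22.0e6, "population": 450_000},
]

def is_mid_sized(c):
    """Apply the study's four selection criteria for mid-sized colleges."""
    return (5_000 <= c["headcount"] <= 11_000            # student headcount
            and 2.5e6 <= c["contact_hours"] <= 5.5e6     # semester contact hours
            and 9.5e6 <= c["appropriations"] <= 13e6     # appropriations ($)
            and 70_000 <= c["population"] <= 210_000)    # primary service area

selected = [c["name"] for c in colleges if is_mid_sized(c)]
print(selected)  # college A meets all four criteria; B fails every one
```

Applying all four thresholds jointly, rather than any one alone, is what equates schools on both the money available for operation and the population served.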
Table 3.2. Revenue statistics considered on community colleges defined as mid-sized.

College | Tax Base 1995 | M&O Rate 1995-96 | M&O taxes per contact hour | Unrestricted tuition & fee revenue per contact hour | Tax value per contact hour
C1 | $4,384,591,839 | .1233 | $1.70 | $1.38 | $1339
C2 | $954,237,592 | .0452 | NA | NA | $240
Instrumentation
A survey design was selected for this study because it allowed for the efficient
collection of benchmarking data from each institution and for efficient follow-up
procedures. The survey design also allowed for rapid turnaround in data collection, the
ability to make comparisons and identify practices and gaps across institutions, and the
ability to determine use of the benchmark data for improvement.
Support Secured
Adopting Peterson's (1993) belief that the college president is the key leader in a
higher education system, the presidents of the six selected colleges were given an
explanation of the project by the lead college's president and asked for their institution's
cooperation in the project. This explanation included a guarantee of confidentiality and
rationale for the use of the data collected. Each president designated an institutional
study director or contact person responsible for helping decide what data was to be
collected for benchmarking by giving input on the design of the study, seeing that the
questionnaire was completed and returned in a timely manner, and reporting the survey
information results to their institution. Therefore, each contact person assumed
responsibility for approving the items in the scope of the survey instrument and then
contacting appropriate personnel on their campus to complete the survey form.
Survey Development
After the facilitator conducted research on benchmarking, she and the Task
Force of the lead college took the initiative in developing the survey instrument.
Realizing, as Rush (1994) points out, that before any benchmarking is undertaken "the
ultimate objective or target has to be very clearly understood and the potential cost and
benefits assessed," the Task Force set the objectives for the study: to collect information
from like community colleges in key areas of resource management to determine what
processes at their institution could be improved. The participating community colleges
were to use the data for similar purposes.
Combining Leibfried and McNair's (1992) suggestion that a benchmarking
process follow three steps, "identification of core issues, internal baseline data
collection, and external information data collection" (p. 45), with Rush's (1994)
guidelines for a benchmarking questionnaire, the process of structuring the survey
instrument was begun. The lead community college held Task Force and executive staff
meetings to select areas that would be most beneficial to benchmark. Each major area of
college processes was considered as a possible area of benchmarking.
The lead benchmarking college was guided in establishing the questionnaire by
the belief that critical to the "external data collection process is the exchange of
information." Because a benchmarking organization expects to give and receive
information, Leibfried and McNair (1992) advise, "do not ask for something that you are
not willing to provide to others" (pp. 44-45). The Task Force, with the assistance of the
facilitator, selected the college personnel from the lead institution most knowledgeable about
key processes, and the other five institutional leaders were contacted for input on what
key process areas should be selected for benchmarking. This procedure followed Rush's
(1994) guidelines of "clearly identifying the department, activity, or process to be
benchmarked so that apples-to-oranges comparisons are avoided...and keeping the study
simple" (p. 96). Benchmarking is intended to point to trends or best practices and to allow
for measuring the performance of one's own institution in light of customer expectations
and the performance of other organizations; thus, the five other schools were continually
consulted about departments, activities, and processes that they desired to benchmark.
Once the content areas were designated, knowledgeable people in each area were
selected to work with the facilitator in developing questions and items for the scope of the
survey. These lead college experts met for brainstorming sessions with the facilitator to
select areas or topics of comparison for the survey. Then the facilitator formulated
questions for the survey instrument, and the expert groups again met for review and
revision sessions. During this time the other institutional leaders were contacted by e-
mail and telephone to gather their input on areas they wished to be benchmarked. Eight
areas were originally targeted; however, after meeting with the administrative committee
and the various process area experts, and gaining input from the other participating
community colleges, ten areas were chosen by the facilitator for the survey. After the key
areas were chosen, the facilitator used the input from each expert group to formulate
questions for the survey instrument. These area surveys were then distributed to the
professionals for refinement and feedback.
Once the personnel at the lead college approved of the survey, it was distributed
to the other five participating institutions for their input and suggestions. After receiving
input from the participating institutions, further improvements were made to the survey
instrument.
Content Areas
The ten educational areas chosen for the study were:
• business, or the processes involved in the finances of an institution,
• computer technology, or the information systems utilized by the institution,
• developmental instruction, or the remediation pre-college program of the institution,
• distance education, or the instruction delivered via technology,
• institutional effectiveness, or the measurement, resource, and planning area of a community college,
• instructional, or the educational programs of the institution,
• outsourcing, or the process areas which are contracted outside of the institution's operation,
• personnel, or the administrative, faculty, and staff employees of the institution,
• scheduling instruction, or the arrangement of educational classes at the institution, and
• workforce education, or the continuing education and customized training programs of the institution for business and industry and the community (see Appendix A for a copy of the survey instrument).
Following Cook's (1995) warning that the outcome of the benchmarking process is
"only as good as the preparation which goes into it" (p. 23) and adopting the idea of
continuous improvement, the preparation was detailed and results were expected which
would produce areas for improvement.
Pilot Study
After the survey instrument was finalized, a pilot of the instrument was conducted
at the lead college. Each of the ten key area surveys was distributed to the division chair
or the administrator primarily responsible for that process area to complete. As these
surveys were returned, further refinement was made to the area surveys to remove any
ambiguities.
Data Collection
The revised survey instruments were then sent by mail and e-mail to the
designated contact person at each community college to be distributed, completed, and
returned for analysis. During this collection period, responses and correspondence were
handled by mail, e-mail, fax, and telephone in order to keep costs at a minimum and for
the convenience of participants, as recommended by Tucker (1996) and Leibfried
and McNair (1992).
As expected, some colleges finished the surveys and returned them promptly.
Others had to be contacted numerous times and encouraged to complete the project. All
six institutions eventually completed the survey instrument.
Data Analysis
After the information was returned, each college was assigned a code number.
This method was chosen for reasons of legality, exchange, and confidentiality. The raw
quantitative data was entered into a spreadsheet, and qualitative data was entered in
narrative format using a word processor. If there was a question about a community
college's response, the contact person was contacted for clarification. Next, the range,
median, and mean were calculated for each item on the survey requiring a numerical
response. When the responses sought were descriptive in nature, narrative responses
were recorded with indications as to how many institutions gave like narrative responses.
The data was then analyzed to determine trends. If at least four community colleges
reported similar responses to survey items, a general trend in mid-sized community
colleges in Texas was noted. Each participating institution then received a final copy
which included these conclusions. Next, the lead college had the researcher perform a gap
analysis of its institution using the raw data; a gap analysis identifies the performance of
the community college doing the analysis and compares it with the other benchmarking
partners. Cook (1995) defines this gap analysis process by explaining, "Where the
internal standard is higher than the target performance this is termed a 'positive' gap.
Where the performance levels in place in an organization are lower than the target
performance...this is called a negative gap....there are 'pockets of excellence' within an
organization...build on what is good" (p. 123). The other participating institutions were
encouraged to do their own gap analysis.
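The analysis just described (descriptive statistics per numerical item, a trend noted when at least four of the six colleges respond alike, and a gap analysis against a target) can be sketched as follows. The response values and the 5% similarity threshold are hypothetical illustrations; the study does not specify how "similar responses" were operationalized.

```python
from statistics import mean, median

# Hypothetical survey item: a cost figure reported by the six coded
# colleges (C1..C6); values are illustrative only.
responses = {"C1": 4.10, "C2": 3.80, "C3": 4.05, "C4": 3.95, "C5": 4.00, "C6": 5.60}
values = list(responses.values())

# Descriptive statistics calculated for each numerical survey item.
print("range:", min(values), "-", max(values))
print("median:", median(values))
print("mean:", round(mean(values), 2))

# Trend rule from the study: note a general trend when at least four
# colleges give similar responses (here, within 5% of the median;
# that threshold is an assumption for illustration).
similar = [c for c, v in responses.items()
           if abs(v - median(values)) / median(values) <= 0.05]
if len(similar) >= 4:
    print("trend noted among:", similar)

# Gap analysis (after Cook, 1995): a positive gap when the internal
# standard exceeds the target performance, negative when it falls short.
def gap(internal: float, target: float) -> str:
    if internal > target:
        return "positive gap"
    if internal < target:
        return "negative gap"
    return "no gap"

print("C1 vs. median target:", gap(responses["C1"], median(values)))
```

Each participating institution could run the same gap computation against its own coded row of the spreadsheet, which is what the other colleges were encouraged to do.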
Follow-Up Survey
Two years later, a follow-up survey was prepared to determine the use of the
benchmark data in each participating institution in order to evaluate the design and
methods of peer benchmarking in higher education. This survey instrument asked about
study procedures, data presentation, and usage or results. This descriptive data was
recorded and analyzed, and conclusions were drawn about the usefulness of a benchmark
study in peer institutions in higher education.
Summary
This chapter examined the methods and procedures of a case study in
benchmarking among mid-sized community colleges in Texas, including a discussion of
the background, the sample selection, the instrumentation (which consisted of the survey
development and selection of content areas), the administration of a pilot study, the
collection of data, the analysis of the data, and the administration and analysis of a follow-
up survey on the process of conducting a benchmark study.
CHAPTER IV
FINDINGS
This chapter will report and analyze the data derived from the two questionnaires,
the original benchmark study and the follow-up study. All six participating community
colleges completed and returned the original benchmark study, and four (66%) of the
participating institutions responded to the follow-up survey.
The basic research question was "What are the conditions necessary for
benchmarking to be used by mid-sized community colleges in Texas to improve
community college operations?" In order to answer this question, six sub-questions were
posed.
Sub-Question 1
1. What planning steps should be taken by the lead college in beginning a benchmark
study for an educational institution?
Four basic steps were taken by the lead college at the beginning of the benchmark
study: recognizing the need for a comparative study; providing the essential study
elements, leaders and money; establishing study guidelines such as selecting the
parameters, the type, and the partners for the benchmark study; and assuming the task of
overseeing the design and pilot study of the survey instrument. The first basic step taken
was to recognize the need for conducting a comparative study. The president of the lead
college had already come to think that a benchmark study might provide meaningful data
for continuous improvement at his community college.
Essential Elements
The second basic step was to provide the essential elements needed for such a
study: leaders and money. Therefore, the president of the lead college decided to form
an ad hoc Resource Management committee, establish funding for the development,
distribution, collection, and analysis of the survey instrument, and name a dean of
institutional effectiveness to explore the feasibility of conducting a benchmark study and
the hiring of a researcher. The president had already set aside budget monies to cover the
cost of hiring a researcher and providing work space and equipment such as a computer
with an e-mail account, postage, telephone expenses, and visits to the various campuses if
they were deemed necessary. Because the means of communication proved sufficient to
provide all the information needed, no campus visits were ever made.
Once it was determined that a benchmark study appeared feasible, the Resource
Management committee hired a facilitator to administer the study because it was
determined that a facilitator would increase objectivity, help to maintain focus, keep the
project on track by adherence to a time line, furnish benchmarking method research
skills, and provide the confidentiality necessary to gain support from other mid-sized
community colleges in Texas. Confidentiality was believed to be critical because
institutions would be sharing processes and quantitative data that might be sensitive.
Adherence to a time line was thought necessary in order to have the results when the
budget for the next biennium needed to be drafted. The Resource Management
committee decided to seek a researcher with knowledge of community college operations,
statistics, group communication skills, and a knowledge of the benchmarking process,
because they believed such traits would allow them access to the information they
deemed necessary to make decisions about what data was needed, how it could best be
collected, and how potential findings could be implemented to bring about improvements.
Expertise in collecting and analyzing the data was necessary in order that the findings be
presented in a usable form for making decisions about how changes could be best
accomplished.
Study Guidelines
The third basic step taken was to select the parameters of the study, select the type
of benchmark study, and choose the benchmark partners to be included in the study.
Parameters of Study
One guideline established by the Task Force at the lead college was to select the
parameters of the study. Since the creation of the Resource Management Committee, its
charge had been to discover ways to better utilize resources at the lead college, so the
Resource Management Committee now assumed the role of a Task Force to supervise the
study. In order to select the parameters of the study, the Task Force needed to better
understand the technique of benchmarking and its function in this study. The facilitator
was asked to research the qualifications needed to be a member of a benchmarking team,
definitions of benchmarking, how to start the benchmarking process (including how
benchmarking relates to the National Baldrige Award), how benchmarking can be used
in school improvement, and the different types of benchmarking studies, and to report to the
Task Force members. First, the Task Force considered material from Spendolini (1992)
to review the characteristics needed by a quality benchmarking team. The Benchmarking
Book lists the qualifications for members of a benchmarking team as:
• functional expertise and a demonstrated level of job skills, or work-related
performance;
• sufficient credibility in the institution, as judged by subject-matter knowledge,
employment history, and the level of position(s) held;
• above average communication skills, in order to communicate well with other team
members;
• the need to have a high level of team spirit, including a sense of cooperation, effective
listening skills, an ability to reach a consensus, and respect for the opinions of others.
(p. 54) (See Appendix B for handout.)
Second, after reviewing and discussing several definitions of benchmarking (see
Appendix B), the facilitator's recommended definition of benchmarking was accepted.
Benchmarking for the purpose of this study was defined as the TQM technique used to
facilitate continuous improvement by analytically comparing ongoing processes in an
effort to manage change. This definition satisfied the purpose of this study which was to
evaluate the usefulness of the process of benchmarking in mid-sized community colleges
in Texas.
Third, the research material demonstrated the use and importance of
benchmarking by presenting its use in the Baldrige Award (see Appendix B) as an
approach for learning what other organizations are doing and for getting new ideas to
help drive improvement and to improve planning and institutional performance
(Spendolini, 1992). Next, the Task Force examined the seven points that Belmont
University's Quality Team (see Appendix B) stated should be addressed before beginning
a benchmark study, as listed by Alstete (1995):
• Is there already a focus in your work area or department around service, employees, and continuous improvement of processes?
• Is benchmarking the right strategy in this situation?
• What should you benchmark?
• What should you measure?
• What organization(s) should you benchmark?
• How should you collect data?
• How can you implement what you learned? (p. 63)
Type of Study
Next, the Task Force considered the differing types of benchmark studies to
establish another guideline for the study. Combining the research on benchmarking, the
facilitator presented six basic types of benchmarking studies that can be conducted
depending upon the data needed: internal, competitive (including peer), industry, non-
competitive, best-practice, and functional/generic. The Task Force concluded that
information from other similar community colleges was needed to make operational
comparisons and encourage improvements in resource use; thus, it decided to conduct
a peer institution benchmark study. This choice furnished the Task Force and the
institution with data that has been used in several different ways to bring about
improvements. The study was still being referenced at the lead college in the annual
budget hearings two years after the study. Peer benchmarking is defined as the process of
analytically comparing peer organizations in order to discover current trends and to show
institutional gaps in performance where improvement can occur. This definition
combines business's definitions of competitive benchmarking (Spendolini, 1992), as
measuring against two or three direct competitors with the same customer base, and
industry benchmarking, as looking for general trends across a larger group of related
firms who have similar interests and technologies (Leibfried & McNair, 1992). The goal
of both is to provide external standards for internal measurement and institutional
improvement. This type of benchmark study was chosen because the desire of the Task
Force was to have external standards for internal management and institutional
improvement.
Thus, the Task Force understood the qualifications for a benchmarking team, how
to start the benchmarking process, the definitions of benchmarking, how benchmarking is
used in the Baldrige Award, how to benchmark for school improvement, and the types of
benchmarking studies that can be conducted to facilitate continuous improvement at their
organization. The other participating colleges were not provided this information.
It was determined that six peer institutions besides the lead college would be asked to participate in the
study; one institution declined to participate, resulting in six institutions participating fully
in the benchmark study.
The president of the lead college then contacted the presidents of these institutions
and obtained the cooperation of the other five community colleges for the study. Each of
the presidents agreed to participate in the study and to appoint a contact person
responsible for conducting the study on their campus. The names of the contact persons
were then given to the researcher; however, no training was done to prepare the contact
persons for participating in the study and no Task Force was appointed to support the
study at any of the peer institutions except the lead college. It was recommended that the
contact persons at the participating community colleges select key personnel to be
involved in designing and completing the survey instrument. All of the schools used
principal personnel in the areas covered by the study. It was also recommended that the
contact person take the necessary steps to prepare and educate their institution's
participants in benchmarking and the benefits of such a study. Whether this was done or
not is unknown.
During this stage the Task Force also requested that knowledgeable personnel provide input into the
various survey areas for further development.
The facilitator took the list of potential items to be included in the survey
instrument, developed by the Task Force, to the expert personnel in each designated area,
which included both instructional division chairs and administrators. The key personnel were
asked to add components to the survey that they felt were needed in order to improve
operations in their areas. This expanded rough draft of questions was then taken back to
the Task Force for evaluation, and changes were made. As the last planning step before
the pilot study, the key personnel from each functional area, which by now included ten
sections, were asked to further refine the questions in their section of the survey instrument.
Therefore, four major planning steps were taken by the lead college in beginning the
benchmark study in mid-sized community colleges in Texas: recognizing the need;
providing the essential study elements, leaders and money; establishing the parameters,
the type, and the benchmark partners for the study; and assuming the task of overseeing
the design and pilot study.
Sub-Question 2
2. Can a benchmark study be designed to ensure that each of the participating institutions
feels included and derives information useful in making system improvements?
The facilitator and the Resource Management Task Force of the lead college
began developing the survey instrument. As discussed above, the Task Force began by
brainstorming functional areas at their college that were aligned with the institutional
mission statement and could possibly be improved. Later, with the facilitator, they
discussed what areas should be measured in order to generate comparative performance
data that might be used to bring about institutional improvement. The facilitator also met
with the executive staff of the lead college to brainstorm and select areas that they
determined would be beneficial to benchmark. As a result of these brainstorming
sessions, the facilitator recommended the requests be grouped into eight areas of
community college operations for the benchmark study.
Once the content areas to be benchmarked were designated, the key personnel at
the lead college were identified by the executive staff and the Task Force and these area
experts were asked to develop the scope of the survey. These knowledgeable people in
each designated area at the lead college were selected to work with the facilitator in
developing concerns and questions. The other five institutions were also asked by e-mail
and letter to contact their selected key personnel to be a part of the survey development;
several revisions of the survey instrument came from this input. Included on the list of
pivotal personnel from the lead college were division chairs, so meetings were held with
these leaders to brainstorm specific processes that should become survey items.
Additionally, some of the selected key personnel from the lead college were interviewed
one-on-one to determine what information would be helpful in making comparisons of
operational processes for improvement. Throughout the entire development of the
instrument, the other participating institutional leaders were continually consulted by e-
mail and telephone to gather their input about departments, activities, and processes that
they desired to be benchmarked. Input was sought from all the participants because the
instrument was intended to benefit all participating community colleges. Except in one
instance, when it was recommended that scheduling of instruction not be included in the
instructional area but be a separate section, the areas of information sought by
participants matched those presented by the lead college.
Functional Areas
After meetings with the administrative committee, the division chairs, and the various
process area experts, and after gaining input from the other participating community
colleges, the facilitator decided on ten functional areas for organizing the questions so as
to make the data useful for making system improvements in all of the participating
institutions. This process yielded the following areas:
• business, or the processes involved in the finances of an institution,
• computer technology, or the information systems utilized by the institution,
• developmental instruction, or the remedial pre-college program of the institution,
• distance education, or the instruction delivered via technology,
• institutional effectiveness, or the measurement, resource, and planning area of a
college,
• instructional, or the professional educational programs of the institution,
• outsourcing, or the contracting to outside sources of services that the college
normally performs,
• personnel, or the administrative, faculty, and staff employees of the institution,
• scheduling of instruction, or the arrangement of educational classes at the institution,
and
• workforce education, or the continuing education and customized training
programs of the institution for business, industry, and the community.
The rationale for selecting these areas was established by each functional area.
Each area was then allowed to design its own section of the survey instrument. The
business area believed that they needed to know how other mid-sized community colleges
in Texas distribute their funds, attempt to cut costs, prioritize their spending, allocate
money for travel, use purchasing groups, and deal with early retirement issues (see
Appendix A). The computer technology group determined that they needed to
understand how other community colleges are using mainframe computer technology,
using personal computers on campus, paying computer-related personnel, establishing
computer usage policies, budgeting for computer technology, training employees to use
computer technology, and servicing their computer technology (see Appendix A).
The developmental education area sought to ascertain what other mid-sized community
colleges in Texas are using to place students into developmental education, how they are
teaching developmental classes, and how they are funding those classes
(see Appendix A). The distance education area desired to appraise what mid-sized
community colleges in Texas are offering by distance education, what technology they
are using to deliver distance education, and how they are staffing, budgeting, and
equipping distance education departments (see Appendix A). How other institutions deal
with institutional grants and operate foundation offices was important to the institutional
effectiveness area (see Appendix A). The instructional sector requested information
on how other mid-sized community colleges in Texas compare to each other in cost per
contact hour spent on instruction, classify instructional administrators, figure load hours,
staff instruction in classrooms and in labs, govern instruction-related travel, provide
instructional equipment, handle instructional purchasing, and provide instructional
professional development (see Appendix A). Administrators wanted to determine what
services are outsourced and what type of agreements govern services which are
outsourced, sold, or provided to other mid-sized community colleges (see Appendix A).
Personnel departments sought to understand how other community colleges in Texas are
defining personnel levels, using part-time employees at each level, and evaluating each
employee group (see Appendix A). A new and growing area for community colleges,
workforce education, sought to learn how other mid-sized community colleges in Texas
are offering workforce education, workforce training, and services, and organizing for
workforce development (see Appendix A). These areas were considered essential by
those involved in this benchmark study when the instrument was developed.
Following the selection of these functional areas, the facilitator used the input
from each expert group at the lead college and suggestions from the other colleges to
draft questions for the survey instrument. Then another round of meetings and interviews
occurred at the lead community college to allow the experts in each area to review and
refine the survey instrument. The survey questions were then revised to reflect the added
input and distributed to the professionals of the lead college for further refinement and
feedback. Once the personnel at the lead college approved of the survey, it was
distributed to the other five participating institutions for their suggestions on what
basic process areas they would like to have included in the study. After receiving
recommendations from the participating institutions, further improvements were made to
the survey instrument.
Internal Baseline
Furthermore, the Task Force agreed to establish their own institution's internal
baseline performance measures after the instrument was finalized but before it was sent
out to participating institutions. So a pilot study was conducted using the newly designed
survey instrument within the lead college. Each area was asked to record the survey
information and then comment on the survey instrument itself in the areas of clarity and
ease of completion. Again, after the internal pilot study, revisions were made to the
survey instrument.
Using input from all participants several times during the formatting process and
designing a survey instrument that utilized this input allowed the participating institutions
to take ownership in the process and derive information that they deemed necessary to
make system improvements. Leibfried and McNair (1992) indicate that an
understanding of the benchmarking questionnaire is important in gaining support for a
study; therefore, much effort was expended in helping all of the participating institutions
understand what was being benchmarked, who was to be involved, and what would be the
value of the study.
In the follow-up study, the participants were asked to evaluate the method used to
construct the survey instrument, that is, allowing the lead college to organize the
preliminary survey while allowing all participants to give input and make changes to the
survey instrument so that it included their areas of concern and need (see Appendix E).
Four respondents (66%) indicated that they felt incorporated into the process and that
their concerns were included in the final survey instrument, and two (34%) failed to return the
follow-up survey because the original contact person was no longer employed at that
community college. Seventy-five percent of the responding institutions disclosed that
they derived information from the benchmark report that was useful in making system
improvements. One follow-up survey states that their institution liked having one college
organize the survey questions with input from others. Another commented that it was "a
lot of work by the main college."
Sub-Question 3
3. What time period is reasonable for conducting a benchmark study in community
colleges?
The original projected time period of the study was six months (see Appendix F),
beginning with the hiring of the facilitator in April and ending with the final report to the
peer institutions in September. Afterwards, the facilitator was to continue working with
the lead college's Task Force to help facilitate the use of the data and the change
processes at this institution during the school year. Four of the projected six months were
designated for the creation of the survey instrument, one month for participating
institutions to complete and return the instrument, and one month for analysis and
distribution of the data. These predictions were made based on the professional judgment
of the facilitator and discussions with the designated contact persons from the
participating community colleges.
The actual time taken to complete the study was one year. Because the
instrument design meetings fell during the summer, it was difficult to carry out the work
in a timely fashion due to vacations and summer school demands; thus, finishing the
survey instrument required longer than anticipated. The time needed for institutions to
complete the survey instrument was also longer than anticipated because it fell at the
beginning of the school year. Because the institutions involved aspired to give accurate
and complete information on the survey, the time constraints of beginning a new
semester caused the reporting stage to also take longer than expected.
Sub-Question 4
4. What procedures can be used to collect, analyze and distribute benchmark data so that
it can be utilized by individual institutions to improve processes in community colleges?
A hard copy and an e-mail copy of the survey instrument were sent to each
participating institution for completion. A cover letter was also included explaining the
time frame for response and noting that any questions or responses could be e-mailed,
faxed, telephoned, or sent by mail. During the data collection, correspondence between
the participating institutions and the researcher took place often to clarify or explain
desired responses on the survey. All six institutions eventually completed and returned
the survey instrument.
When the raw data was received, it was recorded in the computer question by
question, under the college numbers assigned, to produce a raw data report. Then both
quantitative and qualitative reports were generated from the raw data. The Task Force
and the facilitator decided that calculating the range, the median, and the mean
for numerical responses would supply the necessary information to make comparisons,
and that a compilation of the narrative responses and an established trends list would aid
in comparing the institutions. The researcher recorded both the quantitative and the
qualitative data and then did a trends analysis of the responses. These analyses made up
the final report (see Appendix D); a copy of this final report was then sent to each
community college.
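The comparison statistics described above, a range, a median, and a mean for each numerical survey item across the responding colleges, can be sketched as follows. This is a minimal illustration only; the college code numbers and response figures are hypothetical, not taken from the study's data:

```python
from statistics import mean, median

# Hypothetical numerical responses to one survey item,
# keyed by the code number assigned to each college.
responses = {1: 54.0, 2: 77.0, 3: 80.0, 4: 100.0, 5: 65.0, 6: 89.6}

def summarize(values):
    """Return the range, median, and mean used for institutional comparisons."""
    return {
        "range": (min(values), max(values)),
        "median": median(values),
        "mean": round(mean(values), 1),
    }

summary = summarize(list(responses.values()))
print(summary)  # {'range': (54.0, 100.0), 'median': 78.5, 'mean': 77.6}
```

Running the same summary question by question yields the quantitative portion of the raw data report; the narrative responses would still be compiled by hand into the qualitative report and trends list.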
This final report was reviewed by the Task Force at the lead college. Afterwards,
the lead college had the data further analyzed by the researcher using a gap analysis
technique, which takes the quantitative data from the peer institutions and compares
their college's response to the others (Table 4.1). When such an analysis indicates that an
institution is located at the top or bottom of the range, or is well above or below the
average or median, then a "gap" in performance exists between that institution and the
others surveyed. The purpose of such an analysis is to determine those areas in which an
institution may lag behind other institutions and thus need to make improvements.
Table 4.1. Gap analysis example from the developmental area of the survey instrument

Area        Full-Time Instructors    Part-Time Instructors
Writing     R = 54%-100%             R = 1.7%-46%
            Avg. = 77.6%             Avg. = 25.5%
            M = 80%                  M = 28%
            LCC* = 65%               LCC* = 35%
            GAP = -15%               GAP = +7%
Reading     R = 5.6%-100%            R = 12%-94.4%
            Avg. = 61.4%             Avg. = 46%
            M = 69%                  M = 38%
            LCC* = 88%               LCC* = 12%
            GAP = +19%               GAP = -26%
*LCC = Lead Community College
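The gap computation illustrated in Table 4.1 can be sketched as follows, with the gap taken as the lead college's value minus the peer-group median, matching the table's convention. The peer values and the flagging threshold here are illustrative assumptions, not figures from the study:

```python
from statistics import mean, median

def gap_analysis(peer_values, lead_value, threshold=10.0):
    """Compare a lead college's value against the peer-group median
    and flag the area when the gap exceeds a chosen threshold."""
    med = median(peer_values)
    gap = lead_value - med
    return {
        "range": (min(peer_values), max(peer_values)),
        "avg": round(mean(peer_values), 1),
        "median": med,
        "lead": lead_value,
        "gap": round(gap, 1),
        "flagged": abs(gap) >= threshold,  # large gap: candidate for improvement review
    }

# Hypothetical percentages of writing classes taught by full-time instructors.
result = gap_analysis([54.0, 76.0, 80.0, 84.0, 100.0], lead_value=65.0)
print(result["gap"], result["flagged"])  # -15.0 True
```

A negative gap places the lead college below the group median, which is how the Task Force identified areas, such as the writing example above, where the institution might target improvements.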
The gap analysis (see Appendix G) was then given to each Task Force member in a
comparison notebook to review before a meeting.
Then, the Task Force assessed the gap analysis to determine areas where a gap
existed between their institution and the peer institutions. Cook's (1995) gap decisions
were followed: after identifying "both the size of gap in performance and potential causes,
the next step is to identify and prioritize areas of change and to draw up a plan for
improvements...." (p. 127). Then the lead college had to decide whether they should seek to
match or better the peer institutions' practice, and whether the resources and capability to
achieve the desired improvement existed at their college. Several change options for
improvement were considered: potential changes were brainstormed, scenarios for
possible future versions of a process were developed, alternatives for improving a
process were discussed, and target areas were selected for future improvements (Cook,
1995). Each area where a gap was found between the lead college and the other peer
institutions was then considered by the Task Force for possible improvements; finally, the
Task Force targeted and prioritized the gap areas that indicated their college was not
performing as efficiently as others. A copy of the benchmark data, including the raw data,
the quantitative and qualitative reports, and the gap analysis for each area, was then sent to
the appropriate division chairs and administrators. The various area experts then
reviewed the benchmark data, especially the gap analyses and the trends lists, which
show whether the institution is addressing each process like the majority of other institutions
or is falling behind in making operational changes. For example, an administrator from
the lead college reported, "The raw data and the gap analysis was a starting point to
target our weak processes." Only at the lead college was the final report further analyzed
by using the gap analysis technique.
Even though a gap analysis was not done at all of the institutions, the final report
was analyzed for comparison purposes. One institution which used only the final report
stated in the follow-up survey that the results were reviewed to compare "our
performance to see if we were on the right track." Another follow-up survey comment
said, "We are still referencing the study's benchmark report in our budget hearings this
year regarding a technology fee." Thus, the other community colleges did evaluate the
final report and use the data to make institutional changes.
Distribution of the benchmark final report was different for each participating
institution. Two colleges sent copies of the report in each area to key personnel and
administrators, and one community college circulated the sections of the material to the
persons who completed the survey. When the complete survey results were distributed at
the lead college, a cover letter (Appendix H) explained the locations of each of the five
complete copies of the entire final report and the gap analysis, and then each area was
sent a copy of the section of the survey which applied directly to its operations.
Another institution reported placing the raw data, the summary report, and the trends lists
in a central location, the institutional effectiveness office, and sending memos to
the various divisions announcing the location of the data. The fourth responding
institution does not recall what was done with the final report, other than that it was not
distributed.
Sub-Question 5
5. Can the benchmark information be used at individual institutions to improve
processes in community colleges?
The follow-up survey revealed that colleges utilized the benchmark data in several
areas, including business affairs, computer technology, distance education, institutional
effectiveness, instruction, outsourcing, personnel, and scheduling of instruction. Basically,
the benchmark data was used to spur further research in several areas, to
compare technology equipment and usage, to improve existing processes, to upgrade
weak areas, and to develop new services.
The lead college's Task Force, division chairs and executive committee served as
teams for reviewing the final benchmark report including the trends list and the local gap
analysis. By using this information, experts in the various areas at the lead college began
to identify areas where improvement was needed. The data was further examined to
determine what elements were making other institutions more effective than the lead
community college. Then plans for change, which included both fiirther study
recommendations and definite instmctions with proposed time lines, were developed at
the lead college
One result of this review was a call for further research. The Task Force
appointed a subcommittee to develop a survey instrument and to survey each employee
group, including administrators, faculty, and classified staff, about the performance gaps
found. These internal surveys asked for specific input regarding each area where a gap
was identified, as to what improvements should be made and how these improvements
could be made. Following the results of this internal survey, the Task Force
recommended two further studies, one in the area of instructional administration and one
on retirement options. Thus, data from the instructional section of the benchmark study
became the starting point for two different committees. An ad hoc instructional
administration committee was created because the benchmark data showed that all five of
the other community colleges considered the division chair position an administrator
rather than faculty employed under a 12-month contract with 100% release time; this
committee was to conduct further study on compensation formulas and duties of division
chairs. Another ad hoc committee was appointed to consider retirement options. This
retirement subcommittee is still looking at various retirement options, such as plans to
allow retiring faculty to serve part-time in master teacher positions. Another community
college, after reviewing the benchmark data and conducting further research, instituted a
change through a committee that recommended replacing full-time faculty with part-time
faculty.
Another way the benchmark study was used was to compare technology
equipment and usage with other peer institutions. Two institutions used the computer
technology section of the benchmark study to begin improvements in computer utilization
and efficient usage of computer labs. When comparing the technology data at one
community college, the need for additional faculty computers became apparent, and in
the next budget period all faculty and staff were approved for "the purchase of high-end
computers." One institution used the benchmark study information to allocate more
funds to develop new programs in technology.
Several institutions used the benchmark data to make improvements to existing
processes; for example, the survey data was used at one community college for updating
their institutional strategic plans. Another community college reports that as a result of
evaluating the institutional effectiveness data, a new strategic plan which better "closes
the loop on all ventures of the college from intent to result" was developed. One
community college used the institutional effectiveness time lines to establish a local
"time line for operation, planning and publishing" of their strategic plan; the results of the
study also spurred the development of a new institutional effectiveness planning process.
After examining the benchmark data regarding costs, one college determined that by
upgrading the climate control system for its heating and air conditioning, it could
operate as efficiently as the peer institutions. Still another institution used the benchmark
information to improve summer school operations by treating all summer
school offerings as a single semester when figuring academic standing.
Some institutions used the benchmark study to upgrade weak areas of operation.
The business section of the benchmark survey was used to upgrade one institution's
professional development. It reports that the survey highlighted their "need to increase
investment in employee development." This community college declares that as a result
of the benchmark survey, it has made "substantial investment in professional
development." Two institutions reported using the benchmark data to make compensation
and salary changes; for example, one institution updated its faculty salaries as a result of
the survey. Another reported it made compensation changes for division chairs and
department heads based on recommendations from an instructional administration
committee which was formed as a result of the benchmark comparisons. One institution
used the survey to redefine full-time faculty because their Vice President of Instruction
questioned the accuracy of their existing definition after reviewing the benchmark study.
Another college reports that the data arrived as a subcommittee was being formed to
investigate a tenure and faculty salary schedule; thus, the data was used by that
committee to bring about improvements in personnel salaries.
Finally, institutions can use the benchmark information to develop new services
and processes. One community college felt the need to investigate Internet-based
instruction. An institutional effectiveness office at one institution recognized the
importance of establishing a time line for its institutional effectiveness activities. Another
institution was encouraged by the benchmark data to plan and offer an Internet course.
One year after the study results were received, it offered its first Internet course. At the
time of the follow-up study, the college was offering twenty Internet courses.
Sub-Question 6
6. What are the characteristics of a college that utilizes benchmark data for process
improvements?
Participating colleges that utilized the benchmark data seemed to share common
characteristics: a president and a contact person who supported the benchmark process
and entered into it with plans to use the information generated; a willingness to invest
time and perhaps funds into the benchmark study process; a contact person and other
expert personnel, perhaps in the form of a committee or team, who are knowledgeable of
the benchmark process and take an interest in designing the survey instrument; a definite
interest in applying the benchmark data to their institution, such as doing a gap analysis;
a definite plan for distribution and presentation of the benchmark data to the various areas;
a culture which embraces the TQM philosophy; and an understanding of how to utilize
the benchmark data to encourage further research, create new committees to plot
necessary changes, and make adaptations to create institutional improvements.
The follow-up survey instrument asked if each participating institution would
participate in another peer institutional benchmark study, and 75% of those responding
replied yes. They were also asked if their institution would consider sponsoring a peer
institutional study; 50% of those responding replied yes. One institution responding
to the follow-up survey indicated that the contact person was angered by the study and
that his institution "participated only because our President is a friend of your President
and, out of friendship, our President insisted that we participate." Obviously, this
institution did not distribute or use the benchmark data.
Summary
This chapter reviewed the research question: What are the conditions necessary for
benchmarking to be used by mid-sized community colleges in Texas to improve
community college operations? The findings to the six sub-questions answered this main
research question. Findings from question one revealed that four basic steps were taken
by the lead college in beginning the benchmark study: recognizing the need for a
comparative study; providing the essential study elements, leaders, and money;
establishing the parameters, the type, and the benchmark partners for the study; and
assuming the task of overseeing the design and pilot study of the survey instrument.
Research question two found that a benchmark study can be designed to ensure that each
of the participating institutions feels included and derives information useful in making
system improvements. Led by the facilitator and the Task Force of the lead college, all
of the participating community colleges were involved in developing the survey
instrument. The lead community college conducted an internal pilot study to test the
clarity, ease of completion, and value of the questions included in the instrument; and
respondents indicated that they felt included in the survey instrument design and that they
derived information from the benchmark report that was useful in making system
improvements. The third research question found that a year is a reasonable period of
time to complete a benchmark study with all participants involved in both the design and
completion of a survey instrument.
Research question four found that sending both a hard copy and an e-mail copy of
the survey instrument generated responses from all of the participating benchmark
institutions. Producing both a raw data report containing quantitative and qualitative
analysis and a trends list supplied the necessary information for community colleges to
make comparisons, even though this final report was distributed differently at each institution.
Answers to the fifth research question revealed that benchmark data was used in eight of
the ten areas surveyed for five basic reasons: to spur further research, to compare
technology equipment and usage, to improve existing processes, to upgrade weak areas of
operation, and to develop new services. The sixth research question revealed that several
characteristics existed in the three community colleges which made improvements based
on the benchmark report. These included a campus culture which accepts the TQM
philosophy of continuous improvement; support from leadership in time, money, and
influence; and a team of knowledgeable persons to oversee the design, collection,
analysis, institutional distribution, review, and utilization of the benchmark data to bring
about institutional changes.
CHAPTER V
CONCLUSIONS
The purpose of this chapter is to review the purpose of the study, draw
conclusions from the study's findings, make recommendations about the benchmarking
process in community colleges, and propose future investigations in applying the
technique of benchmarking to the higher education setting.
Purpose of Study
The reason for conducting this study was to evaluate the feasibility of using the TQM
technique of benchmarking to bring about operational improvements in mid-sized
community colleges in Texas. Homogeneous mid-sized community colleges were
selected for this study because they provide the necessary environment and the interest to
examine the benchmarking process in an educational setting. Benchmarking was selected
because of its successful use in leading-edge companies as a way to continuously
improve operational processes and gain competitive advantage.
Community colleges face the challenges of making needed and continual changes
in order to achieve resource efficiency, meet demands of accountability, stay abreast
of technology, utilize an increasing knowledge base, and provide life-long learning
opportunities. Although quality management processes such as TQM, CQI, and CQA
have already provided the way and the language to meet these challenges, effective
methods of bringing about continuous improvement, such as benchmarking, are still not
thoroughly tested in higher education. Because benchmarking provides a means for
making comparisons that might assist colleges in making needed changes, it was applied
to mid-sized community colleges in Texas to observe the feasibility of its use in
educational settings.
Thus, because this is the first benchmark study of its type to be conducted among
community colleges, it provides valuable information not currently available on the use
of benchmarking in the community college educational setting. It also provides
comparison material for the community colleges which participated in the study and for
any institution or administrator seeking to improve the services of their college.
Conclusions
The question which guided this study was: What are the conditions necessary for
benchmarking to be used by mid-sized community colleges in Texas to improve
community college operations? The conclusion of this study is that benchmarking can be
a valuable tool for making continuous improvements in a community college setting.
Conclusions are drawn by answering the six sub-questions that support this main
conclusion.
Sub-Question 1
1. What planning steps should be taken by the lead college in beginning a benchmark
study for an educational institution?
Seven planning steps should be taken by a lead college in beginning a benchmark study.
a. Recognizing the need for and the potential benefits of a comparative study is the first
step to a successful study. The lead college in this investigation had administrators and
other personnel who had acquired a good understanding of benchmarking, its processes,
and its benefits, and thus realized that their institution needed comparative data to
continue to make operational improvements. A good understanding would include
what a benchmark study should entail, how the study should be conducted, how the
benchmark data can be utilized, and how improvements can be made from the data, as well
as an understanding of the cost in time that is required to produce usable data. A
superficial understanding of benchmarking seems a sure guarantee of frustration and
failure. Therefore, failure to allow sufficient time to adequately train personnel on
benchmarking, design the instrument, complete the survey, and utilize the benchmark
data is one of the major reasons for failure. The only community college that reported
not using the benchmark data to make improvements participated only because their
president required them to do so due to his friendship with the president of the lead
college. The implication from this study is that involved personnel and administrators
must understand and believe in the benchmarking process and outcomes for an
institution to use the data in making operational improvements.
b. Providing the essential elements of money and leaders is the second necessary
planning step a community college must take in beginning a benchmark study.
Financial support is needed to cover study expenses. Money in this study was
provided by the lead college for hiring a researcher and for providing work space,
technology, copying, postage, and telephone. Leaders such as a supportive president, a
Task Force to serve as the benchmark team, and key administrators and personnel to help
design, complete, and evaluate the survey instrument are necessary for an institution to
derive benefits from a benchmark study. Two major criteria were used in selecting the
Task Force at the lead college: peer respect and the belief that improvement was possible.
Task Force members were selected because they were respected by their peers in the
institution; this allowed them to lead others in designing the instrument,
answering the survey, and examining the data to find areas for improvement.
They believed that improvements could be made by identifying areas where funding
could be improved; this led to a committee that was eager to evaluate the data and
move on to adapting practices from other community colleges to improve their own
operations. It appears that the amount of change and the number of improvements that
occur from a benchmark study are in direct relationship to the belief of the
institution's leaders in the value of the benchmark data.
c. Hire an outside researcher works well when conducting a benchmarking study. The
choice by the lead college to employ a facilitator to research and educate the Task
Force is believed to be one reason for their successfiil utilization of the study. Each
college needs a researcher or an appoint contact person to help plan and conduct the
local benchmark study. Such a researcher should have knowledge of the educational
environment, statistics, the benchmarking process and group dynamics. This kind of
person can provide objectivity, adherence to a time line, and breadth needed for a
successful study. The result of this choice was that Task Force members understood
the qualifications they needed to be an effective benchmarking team member, the
scope of the benchmarking process, the steps necessary to start the benchmarking
process, and how benchmarking data could be used in school improvement. If a
researcher is not hired to oversee the entire benchmark process, a consultant in the
process of benchmarking or a facilitator from within the campus structure with
benchmarking knowledge should at least be designated to train the Task Force of an
institution embarking on a benchmark study.
d. Selecting, training, and working through a Task Force representative of the various
areas and levels of a community college is the fourth planning step necessary for
productive use of the benchmark data. Doing this provides the different perspectives
and the grass-roots involvement needed for designing a benchmark study. The group
process through brainstorming and discussion provides valuable support throughout
the benchmarking process. The fact that the lead college had a Task Force to establish
benchmark study guidelines such as the parameters, the type, partners, survey design
and pilot study is probably one of the reasons that the benchmark study proved to be
so useful to their community college.
e. Training the Task Force in the technique of benchmarking and in the results that
could be produced by such a study is the fifth planning step; it helps ensure peak
performance by the Task
Force. It may be that a result of the training and direct guidance from the researcher
was the impetus for the interest and expectancy that existed at the lead college about
the improvements that could be made from the data. Understanding the value of a gap
analysis explained by the researcher helped the lead college in targeting areas for
improvement. Likewise, understanding and selecting the type of benchmarking
study, a peer-institution comparison, helped the committee interpret the
benchmark study data for their institution.
f. Selecting peer institutions, comparable in size and resources, as benchmark partners
and obtaining support from their presidents is the sixth important planning step,
necessary for involvement in the survey design and data collection from all of the
community colleges. As shown by the follow-up survey, these choices furnished the
Task Force and the lead institution with data that has been used in several different
ways to bring about improvements. It also provided two other participating
institutions with data that was used for making system improvements. Since the
community college with the most knowledge and training regarding benchmarking
made the most improvements, it appears that more improvements could have resulted
from the data had the peer institutions been better trained and sold on the
benchmarking technique.
g. Involving all of the participating institutions in the survey design and having one of
the community colleges conduct a pilot study is the seventh planning step. Multiple
sources of input produced data that several areas in all three of the colleges that
utilized the study used to bring about system improvements. The pilot study is
necessary to refine the survey instrument through application, and it also helps the
lead college establish internal benchmarks to compare the data against later.
Sub-Question 2
2. Can a benchmark study be designed to ensure that each of the participating institutions
feels included and derives information useful in making system improvements?
The method of allowing one sponsoring or lead community college to conduct the
study with suggestions from other participants worked well. All six community colleges
were allowed to submit key processes and areas of interest to them, and each institution
was given the opportunity to examine the final questionnaire before they were asked to
complete it. Thus, if any of the institutions had a particular area of interest, they were
allowed to include it in the survey instrument. This method of constructing the
questionnaire to contain usable data is efficient because it allows each participating
institution to select the type of benchmark information it feels it needs collected.
Because they were included in the designing of the instrument, the colleges all cooperated
in completing the survey in order to gain data that they sought for comparison purposes.
Allowing each area to design their own particular section allowed them to design
questions to produce the information that they needed to make equal comparisons. For
example, the chart designed by the computer area to compare positions and pay helped
them compare like situations. The follow-up survey showed that this method of
designing an instrument made participating institutions feel included and allowed them to
obtain information that was useful in making system improvements. This method was
lengthy, but did provide an instrument that yielded results that the participating
institutions could use, as evidenced by the follow-up survey. It appears that the amount of
input suggested by an institution directly related to the amount of use the data received at
each participating institution.
Sub-Question 3
3. What time period is reasonable for conducting a benchmark study in community
colleges?
A year appears to be sufficient time for the collection, analysis and use of the
benchmark information. It should be noted that time commitment is important in a
benchmark study because time is needed to educate people throughout the college about
working as teams and the benchmarking process before beginning the design, completion
and implementation stages of the study. Sufficient time is also needed for the benchmark
team to engage in meaningful thought, reflection, and dialogue before overseeing design
and usage of the benchmark data. It seems unlikely that a benchmark study will be used
to accomplish improvements if the purpose, meaning and potential system applications
are not endorsed by a team that has had time to participate in the team process.
Sub-Question 4
4. What procedures can be used to collect, analyze and distribute benchmark data so that
it can be utilized by individual institutions to improve processes in community colleges?
It is recognized that collection, analysis and distribution of benchmark data
definitely influence the usage of the information by individual institutions to make
improvements. Data collection from a hard copy and an electronic copy of the survey
instrument was somewhat effective because of convenience; however, a personal contact
such as a telephone call after the cover letter and questionnaire were received by
participating community colleges might have helped compliance within the desired
response time frame. Keeping open lines of communications between the institutional
contacts during the survey completion time demonstrated that e-mail, fax, telephone, or
mail were sufficient to keep responses unified on the questionnaire. Contacting the
presidents of each institution to assure the return of the survey instrument did result in all
six institutions eventually completing and returning the entire survey.
All the colleges received a final report to use for analysis. More institutional
changes were produced from the benchmark data at the lead college than any of the
others, probably due to the data provided by the added information in the gap analysis
that aided them in making institutional comparisons. No other participating institution
performed a complete gap analysis; this probably contributed to limited use of the data.
The existence of the Task Force with the authority to take action or make
recommendations and the data provided by the gap analysis appears to have provided
decision makers with the foundation necessary to begin making system changes in several
areas. The fact that the raw data report, both quantitative and qualitative reports, and
trends lists were used by three community colleges to make system improvements,
however, demonstrates that some changes can be made without an official gap analysis.
Still, for maximum use of benchmark data, a gap analysis appears essential.
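The gap analysis at the heart of these findings can be sketched in a few lines of code. The metric names, values, and the `gap_analysis` helper below are hypothetical illustrations, not figures from the study; the sketch simply signs each gap so that a positive value flags an area where a peer benchmark outperforms the local institution.

```python
# A minimal sketch of a benchmark gap analysis. All metrics and values
# here are hypothetical examples, not data from the study.

def gap_analysis(local, peer, higher_is_better):
    """Return each metric's gap, signed so that a positive gap means the
    peer benchmark outperforms the local value (an area to target)."""
    gaps = {}
    for metric, local_value in local.items():
        diff = peer[metric] - local_value
        # For metrics where lower is better (e.g., cost), flip the sign.
        gaps[metric] = diff if higher_is_better[metric] else -diff
    return gaps

local = {"completion_rate": 0.78, "cost_per_credit_hour": 96.0}
peer = {"completion_rate": 0.85, "cost_per_credit_hour": 88.0}
direction = {"completion_rate": True, "cost_per_credit_hour": False}

gaps = gap_analysis(local, peer, direction)
targets = sorted(m for m, g in gaps.items() if g > 0)
```

The direction flag matters because benchmark metrics mix "higher is better" rates with "lower is better" costs; without it, a cheaper peer would wrongly appear as a strength rather than a target for improvement.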
How the complete survey results are distributed also influences the usage of the
benchmark data. Placing the data in easily accessible locations and notifying key
decision makers of the location of the information seemed to influence the amount and
quality of improvements occurring from the benchmark final report; the community
college which sent key persons a copy of the data for their areas made the most system
improvements. Whether the data is used as a starting point to target weaknesses or to
check and see if local performance is on the right track, the availability of the data
influences the usage of the information. As noted, the college that did not use the data
for improvement did not distribute the results to decision makers.
Sub-Question 5
5. Can the benchmark information be used at individual institutions to improve processes
in community colleges?
With time and effort expended to evaluate the benchmark data, system
improvements in many areas of college campus operations can be made. Respondents
that used the benchmark information reported process improvements in eight of the ten
areas surveyed: business affairs, computer technology, distance education, institutional
effectiveness, instruction, outsourcing, personnel, and scheduling of instruction.
Evaluating the benchmark data produced changes in 75% of the colleges responding to
the follow up survey, and 50% of those responding indicated that they would participate
in another benchmark study. Therefore, benchmark data can be used at community
colleges for operational improvements.
When the need for a change is established by the benchmark data, the change is
best planned and executed by the people nearest the weak process area because
adaptations from one system to another are best made by the people who perform the task
in question. At times more detailed information is needed to outline an effective plan for
change; thus, at times another committee should be appointed to gather more specific
information before the change process is detailed. It seems unlikely that an institution
will achieve any improvements from benchmark data until personnel understand the way
to use the information and are given the freedom to explore possibilities for adaptation of
practices surveyed.
Sub-Question 6
6. What are the characteristics of a college that utilizes benchmark data for process
improvement?
When benchmarking is actively supported in an institution with a TQM or CQI
culture, it is a beneficial tool in bringing about change. Community colleges with such
cultures are already committed to continuous improvement and are looking for efficient
ways to make system improvements. Although not all of the participating institutions
committed time to educate people throughout the college about working as teams and the
benchmarking process, an initial commitment of time from administrators and key
personnel was demonstrated by all six of the community colleges in designing and
completing the survey instrument. However, not all of the participating institutions
continued the commitment in reviewing and using the benchmark data. One college chose
not to take the time to have the benchmark data available on campus; three institutions
chose not to commit to doing a gap analysis for their college. It might be that these
community colleges could have used the benchmark data more effectively had they been
more committed to using the benchmark data. It seems that the stronger this commitment
was in the community colleges involved, the more data was used to begin improvements.
Initiative by the president of an institution appears to be a vital characteristic of a
college in successfully conducting a benchmark study and using the data. Strong
encouragement and high expectations from such a leader create productive results in
organizational cultures seeking continuous quality improvement. An attitude of "What
can we do better?" by both the president and the Task Force is essential in utilizing
benchmark data to make campus improvements. Such an attitude usually exists in
institutions which have adopted TQM fundamentals. The three institutions which utilized
the results of the study the most had presidents who were strongly supportive of the
project and at least two have strong TQM cultures. Administrators also must be active in
supporting the benchmark study especially to share the gathered data with key decision
makers and encourage improvements to occur.
Because benchmarking is a team project, a critical factor in its successful use is
for several key people in each institutional setting to understand its meaning, purpose and
potential applications to their system. In this regard although all six community colleges
report that the review and answering of the survey instrument was done by several people
in different areas of expertise, only one indicated that it was done by committees or
teams. Teams with their pooling of ideas and dispersal of responsibilities are more able
to undertake efforts to adapt the practices of the peer institutions into improvements on
their campus. Active efforts to capitalize on the benchmark data by making comparisons,
doing a gap analysis, evaluating performance and trends, continuing with further
research, and communicating raw data create support for change in the decision-making
process. Thus, a close relationship exists between campus and administrative support and
the number of changes which occur as a result of an educational benchmark study.
Recommendations
Based on the analysis of the process of benchmarking in mid-sized community
colleges in Texas, several recommendations can be made. It is recommended that:
• Each participating institution have a basic TQM culture where continuous
improvement, teamwork, employee empowerment and other TQM principles are
in place. Benchmarking fits naturally into this culture as a continuous
improvement technique.
• Each participating institution have a trained Task Force to oversee the benchmark
process in their community college because extensive planning is necessary on the
part of the lead college at the beginning of a benchmark study. This planning
appears to work best if spearheaded by a president who is involved with and
supportive of the benchmark study.
• Training be provided and support of administrators and central personnel be
obtained before beginning the benchmark process in order for an institution to
reap comprehensive benefits and bring about improvements from benchmark data.
An efficient way to conduct training would be to have the facilitator train
administrators, key personnel, or the benchmark Task Force between the time the
institution agrees to participate and the designing of the survey instrument. An
institution could continue to reap the benefits of training for several benchmark
studies without retraining and allow for such studies to be conducted more often
to spur continuous improvement.
• Every effort be made to ensure that each participating institution feels included.
Any survey instrument used by peer institutions in a benchmark study should be
designed by those educational organizations participating, so that each school has
some ownership in the results of the benchmark study.
• At least one or two campus visits to each institution be made by the researcher.
recommended. It is now believed that a visit to the campus for training purposes
and for follow-up after the data was distributed would have aided the peer
77
institutions in using the data. If a face-to-face meeting is held and training is done at
the beginning, it seems that better cooperation will exist between institutional
contacts thus accelerating the process.
• At least a year be committed to conducting and utilizing
benchmark data before another study is begun. This time period allows for the
necessary training, input and evaluation before improvements can be
implemented. It is also suggested that at least three months are needed to
adequately answer the survey instrument if much detail is required.
• Each participating institution conduct a gap analysis from the final report. The
lead college that did a gap analysis made the most improvements; therefore, for
maximum use of the benchmark data an overall gap analysis is recommended.
• Each decision-making person who has been trained in how to use benchmark data
in a community college be involved in evaluating the final report in their area.
It is also recommended that the final report be made available to all personnel on
campus and that they be informed of its location and contents in a general
memorandum.
• The areas of the benchmark survey be selected by the participating community
colleges to ensure utilization of the information in making improvements, and
• A pilot study be conducted by one of the participating colleges because it allows
for internal benchmarks to be set and refinements in the instrument design to be
made that enable the final report to be understandable.
Future Investigations
Other benchmark studies need to be conducted and analyzed in other institutions
of higher education, both two-year and four-year. As data banks are produced
from other benchmark studies in higher education, data can be used for comparisons in
process areas; thus, as a follow-up to this study, individually sponsored studies could be
more focused. Less comprehensive benchmark studies could be conducted within areas
in a shorter length of time. A benchmark study with training provided for all institutions
should be conducted to determine if training increases usage of the data. Research on the
role of the facilitator versus other means of conducting the study, e.g., a research
committee, would help establish the importance to institutions of employing such
outside consultants. A study with face-to-face communication between the college
contacts would reveal improvements that this could bring to conducting a benchmark
study.
Once a college has realized the value of benchmark data in making system
improvements, other types of benchmark studies should be conducted, such as
benchmarking against other types of organizations. Best practice benchmark studies are needed
between colleges and businesses in those areas that perform the same functions, such as
the business area. Such studies should serve to upgrade basic operational functions in
education. Benchmarking classroom techniques and processes within and between
institutions could produce data that would improve classroom instruction on college
campuses.
Summary
This chapter reviewed the purpose of the study, drew conclusions from the study's
findings, made recommendations about the benchmarking process in community
colleges, and proposed future investigations in applying the technique of benchmarking
to the higher education setting.
APPENDIX A
Business
Institution
Budgeting
Please attach a copy of your 1995-96 budget and answer the following budget questions.
3. Budget expenditures: Please give the dollar amount and the percentage of your
institution's budget that each category represents.
Category Dollar Amount Percentage of Total Budget
equipment (other than technology)
expendable supplies
maintenance (physical plant operations)
personnel (including fringe benefits)
personnel salary (excluding fringe benefits)
personnel fringe benefits
professional development
technology equipment
travel
utilities
technology?
personnel?
travel?
maintenance?
Travel
Definitions:
Instructional travel—travel for the purpose of teaching classes, i.e., off campus or intercampus
Institutional travel—travel required for business, i.e., committee service, new regulations training, coordinating board requirements, agency requirements
Professional travel—travel for improving professional skills
Student travel—travel of student representing the college
7. What policies does your institution have governing travel? (may attach policy)
For administration?
For faculty?
For staff?
For students?
10. In the first column check the areas for which your institution provides travel money;
in the second column give the percentage of the travel budget allocated for each area
checked; in the third column make any explanatory comments that might be needed to
clarify your answers.
Area as defined above Percentage travel budget Comments
Instructional
Institutional
Professional
Student
Other (specify)
11. In what areas and how does your institution allocate money for travel? (Under method, check all that apply; then explain how the method is applied in each of the three areas: faculty, administration and classified.)
Method Faculty Administration Classified
formula (please
identify formula for
each area)
individual
budgets
extra/
additional requests
other
(Please explain)
Purchasing
13. Does your institution group purchase by combining with outside institutions or
businesses? yes no
If so, what types of institutions?
15. Does your institution self insure any form of coverage? yes no
If yes, what coverages are self insured?
16. Does your institution undertake a bid process to select your liability and/or worker's
compensation coverage?
yes no
If yes, how often?
17. What kinds of insurance coverage are purchased by the institution for college
employees?
Computer Technology 2
Institution
Mainframe Questions
1. Regarding your record keeping and accounting functions, please answer the following
questions.
What database is being used?
2. Please complete the following table with the indicated codes (use as many codes in
each cell as are necessary to properly define conditions):
Administrative software: H=Homegrown, P=Proprietary (give brand name if P)
Ad hoc reporting: N=Not possible, D=Outsourced DP Staff, L=Local (College) DP Staff, O=College Non-DP Staff
Accessible by: Q=Query language, R=Report generator, D=Downloading files, O=Open Data Base Connectivity
Area
Prospective Students
Registration
Transcripts
Personnel
Payroll
General Ledger
Accounts Payable
Accounts Receivable
Financial Aid
Inventory
Library
Purchasing
PC Questions
3. How is your computer equipment maintained and serviced? (Place an "X" in each
appropriate cell)
Equipment In house Service Contracts Other Outsource
Printers
PCs
LAN
WAN
4. How is your computer equipment supported? (Place an "X" in each appropriate cell)
Equipment In house Service Contracts Other Outsource
Printers
PCs
LAN
WAN
5. Please indicate the number of micro-computers currently in use at your institution, and
the number which are connected to a local area network:
Personnel Questions
6. Please list, in the table below, the job titles for each individual position whose primary
purpose is support of computer technology at your institution. If the relative starting
salary for a secretary with no experience is indexed at 100, please give the relative
starting salary for each position, assuming that the individual holding the position has the
minimum experience you require.
For example, if a beginning secretary with no experience really makes $7.50 per hour,
and a beginning programmer-analyst with 3 years' experience (the stated minimum on
a job description) makes $18.00 per hour (or monthly equivalent), you would show
the index for the programmer-analyst to be 240. The math is: ($18.00 / $7.50) x 100 = 240.
The index will allow us to compare the relative value each college places on its
computer personnel while discounting regional salary variations. Please also indicate
whether the individual is paid on an hourly or monthly basis, and the percent time the
person is employed in support of computer technology (in the example below, the
programmer-analyst only works 75% time as a programmer-analyst; the individual
also works 25% time teaching CIS courses).
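The index arithmetic described above can be sketched in a few lines (a minimal illustration using only the questionnaire's example figures; the function name is ours, not part of the survey):

```python
def salary_index(position_rate: float, secretary_rate: float) -> float:
    """Relative starting salary, with a no-experience secretary indexed at 100."""
    return position_rate / secretary_rate * 100

# The questionnaire's example: secretary at $7.50/hr, programmer-analyst at $18.00/hr.
print(round(salary_index(18.00, 7.50)))  # 240
```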
Policy Questions
Definition:
Technology—Telecommunications services such as telephone and data services,
instructional equipment, such as audio-visual equipment, and computer equipment.
Does your institution charge lab fees which include technology usage
separate from a technology fee? yes no
If yes, in what areas?
9. Do you provide computers at the institution's expense for school use by the
following: (Placing an "X" on the line means yes computers are provided for that
group; for each group checked, please indicate on the right side what method is used
to determine who gets these computers.)
students
faculty
administration
clerical
10. What is your replacement policy for microcomputers (e.g., every so many years, as
needed, etc.)? (may attach a copy of the policy)
In labs/classrooms?
In offices?
11. What is your ratio of service providers (e.g., full-time equivalent persons dedicated to
micro-computer support) to machines?
15. Do you require students to sign contracts before they are allowed to use your
institution's technology systems? yes no
If yes, which of the following does it cover? (Please attach copy of
contract)
network use
E-mail
copyrights
security
student privacy rights
other (please explain)
16. Does your institution have a policy regarding usage of the Internet?
yes no
If yes, please attach a copy of the policy.
Training
18. What is your institution's procedure for training employees on hardware and
software?
Budget
28. As you define it, what does your technology budget include:
telephones
instructional equipment services (audio-visual equipment)
computers
LAN
WAN
Service
30. What approximate time lapse occurs between a service request and service when
dealing with the following:
hardware problems
printer problems
software problems
connections to LAN
connections to WAN
31. In what areas has your institution experienced problems? (check all that apply)
software updates
software installation
hardware updates
hardware installation
32. How have you attempted to resolve the problems in the areas checked?
Software updates:
Software installation:
Hardware updates:
Hardware installation:
Developmental/Tutoring Instruction 3 - 1
Institution
Developmental Instruction
2. Please give the contact hours for the fall of 1995 for each developmental area, the
number of employees in the area for fall 1995, and the number of students enrolled in
each area for fall 1995:
Developmental Area Contact Hours Number of Employees Number of Students
7. How many FTE support staff are employed in the noncourse offerings?
Reading:
Math:
10. How is a student charged for developmental classes and labs? (Please check the
appropriate line.)
Regular tuition rate
Tuition plus lab fee
Lab fee only
Clock hour fee
Other
(Please explain)
12. What percentage of classes (based on contact hours generated in the fall of 1995) is
taught by each? (Please fill in the percentages to equal 100% for each area.)
Area Full Time Instructors Part Time Instructors
Writing
Reading
Math
Other (specify)
Tutoring
16. How are tutoring services funded? (Check all that apply.)
Institutional funds
Workstudy funds
Charge to student
Grant funds
Describe:
Other-
Describe:
Learning Center
18. In your learning center or developmental labs, which do you use: (Check all that
apply.)
lab assistants
lab supervisors
full time instructors
part time instructors
partial load for full time instructors
other (specify)
19. Are classes taught in a developmental lab rather than having lab work supplemental
to instruction? yes no
If yes, is instruction by (Check appropriate response)
instructor lab assistant lab supervisor other (specify)
Distance Education 4 - 1
Institution
Definition:
Distance Education—a planned teaching/learning experience that uses a wide
spectrum of technologies to reach learners at a distance and is designed to
encourage learner interaction and certification.
2. For which of the following does your institution use its delivery systems:
credit courses
noncredit courses (e.g., continuing education, seminars, etc.)
instruction within the college system
instruction outside the college system
intercampus meetings
other (specify)
3. For the above, does your institution create its own course content or purchase
courses?
create purchase both
10. Name the positions of support personnel in distance education in the first column;
then in the second column please indicate if they have other responsibilities outside of
distance education.
Position job title Other responsibilities
11. Who teaches the courses offered by your delivery systems? (Check all that apply.)
Full time faculty as part of load
Full time faculty as overload
Part time faculty
Contracted services (If checked, please list what services are contracted.)
Other (specify)
12. How are services such as registration and advising provided for distance
education?
13. Indicate your institution's future in distance education by listing the top three
priority activities planned.
Budgeting
14. How are your delivery systems budgeted? (Check all that apply.)
Institutional budget
Grants
Fee Structure
Other (Please explain)
15. What percentage of the college budget goes to distance education activities?
18. Do you charge any additional fees for distance education services?
yes no
If yes, please attach a policy.
If yes, how much?
Equipment
Please attach a list of the equipment that your institution uses in operating your delivery
systems. (Should include brand, type of equipment and model #.)
19. What kind of dedicated space do you have for the operation of your delivery
systems?
designated shared other (specify)
Please offer any additional comments on equipment for delivery systems here.
Agreements
20. With how many other institutions (e.g., four year, two year, high schools) do
you have distance education agreements?
Please list these institutions.
DELIVERY SYSTEMS
Please answer the section of questions that apply to those systems checked above.
Broadcast
1. Which broadcast systems does your institution use?
Satellite
Closed Circuit for campus
Compressed video
Radio
Low power television
Other (specify)
6. What course titles are offered through broadcast? (may attach a brochure)
8. How are the course offerings for the broadcast system determined (e.g., by
departments, by distance education department, etc.)?
9. Check how you publicize your broadcast classes: (check all that apply)
Inside campus
Focused advertising to a group
Outside campus
Separate as individual classes
Together
Paid advertising
Other (specify)
11. Does your institution have an agreement with a public television station?
yes no
If yes, what kind of agreement?
12. Does your institution have an agreement with a local cable system?
yes no
If yes, what kind of agreement?
13. In this area of broadcast, does the cost of offering such classes exceed the
benefits?
yes no
Please explain.
Please offer any additional comments about broadcast delivery systems here.
1. What size physical area does your ITFS or microwave system serve?
2. How long have you been providing ITFS and microwave service?
6. How are course offerings for the ITFS and microwave system determined (e.g., by
departments, by distance education department, etc.)?
Please offer any additional comments about ITFS and microwave delivery systems here:
6. Is your institution using the Internet as a resource for additional classroom
information? yes no
If yes, how?
12. Does the cost of offering such Internet classes exceed the benefits?
yes no
Please explain.
13. What course titles are offered through the Internet? (may attach a brochure)
16. How are the course offerings for the Internet determined (e.g., by departments, by
distance education department, etc.)?
18. In your community college setting, what are the legal issues you are or must address
(e.g., receiving funding if class attendants are outside the area)?
19. What equipment ramifications should be addressed when offering distance education
classes over the Internet (e.g., sufficient disc space from too many enrollees)?
20. What are the advantages of using the Internet as a delivery system for distance
education?
21. What are the disadvantages of using the Internet as a delivery system for
distance education?
Please offer any additional comments about Internet and computer systems here.
Video
6. What equipment is provided at college expense in these classrooms? (may attach list)
7. What course titles are offered through video? (may attach a brochure)
9. How are the course offerings for the video system determined (e.g., by departments,
by distance education department, etc.)?
11. Does the cost of offering video classes exceed the benefits of using video?
yes no
Please explain.
12. What are the advantages of using video as a delivery system of distance education?
13. What are the disadvantages of using video as a delivery system of distance
education?
Please offer any additional comments about video delivery systems here.
Institutional Effectiveness 5
Institution
1. Please attach an organizational chart of your institution and, if possible, attach brief
descriptions of job functions related to administrative positions that conduct research,
planning or institutional effectiveness duties.
Research
Planning
Institutional Effectiveness
Other (specify)
3. Describe your yearly procedure or attach a time line or policy for institutional
planning and effectiveness.
5. How does your institution use these accountability measures, success indicators or
objectives and outcomes in the decision-making process?
Please offer additional comments regarding institutional research and effectiveness here
(particularly identify innovative or useful approaches you are using):
Resource Development
Grants
6. What percentage of your institution's annual budget is raised through grants and
contracts?
7. What is your institution's income per contact hour from federal sources?
10. Generally, what has been your institution's obligation to personnel and resources after
the grant is concluded?
Foundation
other (specify)
14. Do college personnel serve as voting members on your college foundation board?
yes no
If yes, which ones?
15. What policies govern the foundation's distributable funds? (may attach policy)
16. Does your institution hire outside managers for managing the foundation fund?
yes no
Instructional 6 - 1
Institution
Please attach an instructional organization chart and answer the following questions about
your credit instructional programs.
Cost Study
On your 1994-95 cost per contact hour study submitted to the state, fill in the chart below
to indicate your cost per contact hour for the following:
Area State Median Lead College Your Institution
Institutional Support 0.83 0.82
Student Services 0.44 0.39
Staff Benefits 0.29 0.28
Library 0.19 0.28
Other
Please continue to fill out the chart below from your 1994-95 contact hour study.
Academic- Academic- Academic- Voc/Tech Voc/Tech Voc/Tech
Area State Lead Your State Lead Your
Median College Institution Median College Institution
Instructional Administration 0.21 0.21 0.32 0.28
Organized Activities Related to Instruction 0.08 0.00 0.07 0.05
Instructional Administration
Definition:
Instructional administrators—Employees who supervise faculty and maintain
instructional load.
1. If your institution defines instructional administrators differently, please give your
definition.
4. In the first column list the titles of the various levels of your instructional
administration; in the second column define the major duties of each level; and in the
third column explain how each level is compensated, i.e., pay, release time or both.
Titles Define Duties Compensation-pay, time,
both or other
5. In the first column again list the titles of your instructional administrators; in the
second column indicate how many departments that title supervises; in the third column
indicate how many different major areas are supervised; in the fourth column indicate
what the teaching load responsibility is for that title; and in the fifth column indicate the
length of contract of that title, i.e., 9 month, 10 month, 11 month or 12 month.
6. Once more please list the titles of your instructional administrators in column one; in
columns two, three and four please give the number of employees supervised per semester
by that instructional administrative level.
Titles Full time faculty Part time faculty Classified
supervised supervised supervised
Load Considerations
Physical activity class load (Please include how many hours per week
physical education activity classes meet)?
Clinical hours load (Define each clinical type instruction and then explain
how the responsibility relates to faculty load)?
Internship loads?
Lecture/lab loads?
9. Do full time instructors receive load credit for lab instruction? yes no
10. Are load reductions given for school responsibilities outside instruction? yes no
If yes, what responsibilities?
Extra curricular activities
Projects/grants
Job-related activities
Professional organizational service
Committee service
11. Has the compensation either by pay or release time proved to be cost effective?
yes no
Explain.
12. Has the compensation proved to be beneficial, i.e., program growth, prestige to your
institution? yes no
Explain.
14. How do you calculate load for credit adult vocational classes?
15. Do you have mixed classes (i.e., semester hour credit students and adult vocational
students in the same class)? yes no
If yes, how do you calculate load on these classes?
16. How does your institution calculate load credit on small classes (i.e., under 10 in
enrollment) in the transfer area? (Give an example)
17. How does your institution calculate load credit on small classes (i.e., under 10 in
enrollment) in the vocational/technical area? (Give an example.)
19. How is the faculty compensated for these extra large classes?
not compensated
extra load credit
monetary compensation
both of the above
other (Please explain)
Instructional Personnel
21. What percentage of your part time employees are part time faculty?
22. What percentage of your full time faculty is on the following assignments?
hourly basis
9 month
10 month
11 month
12 month
28. What is the full time counselor ratio per FTE (15 credit hours) students at your
institution?
29. What is the full time librarian ratio per FTE (15 credit hours) students at your
institution?
32. Describe your academic advising system. (Check the appropriate system.)
centralized, i.e., counseling center
decentralized, i.e., departmental faculty
integrated (explain)
other (explain)
35. What percentage of your contact hours were generated by part time faculty for the last
fall semester?
36. On the left list the departments/divisions that used part time faculty and on the right
give the percentage of faculty for that department/division that is part time or attach a
printout of the information.
38. What is your load percentage goal for the level of use of part time
faculty?
39. What is the maximum percentage goal for part time faculty?
40. How do you decide to replace part time with full time faculty?
41. How do you ensure quality of instruction in using part time instructors?
43. Are the various department computer labs separate or consolidated in one area of the
campus? separate consolidated
Academic
Technical
Other (specify)
Please offer any additional comments on computer labs and staffing here:
47. Does your institution differentiate between lab instructors, lab assistants and lab
supervisors? yes no
If yes, define:
lab instructor
lab assistant
lab supervisor
48. Are lab assistants, excluding student assistants, considered classified personnel?
yes no
50. If you use assistants, excluding student assistants, how is their pay figured?
55. What credentials (e.g., work, degree, qualifications) must a lab instructor possess?
56. What credentials (e.g., work, degree, qualifications) must a lab assistant possess?
57. What credentials (e.g., work, degree, qualifications) must a lab supervisor possess?
58. If you use lab instructors, how is their load for labs calculated?
62. How are technical labs organized? competency based customer driven
Other (explain)
Travel considerations
For administration?
For faculty?
For staff?
For students?
66. Are there policy differences between departments regarding travel? yes no
If so, what differences apply?
Budgeting
Renting policy
Leasing policy
Purchasing policy
81. Does the technical faculty receive release time for training in industry? yes no
If yes, how is this handled?
Outsourcing 7 - 1
Institution
Definition:
Outsourcing—the contracting of services to outside sources that the college normally
performs
Services sold—the providing of services by the college for money to sources outside the
college
1. Check the services that are outsourced and the services that are sold by your institution;
then for each outsourced service please explain the type of agreement that you have.
Services
Outsourced Sold by
Services Institution Service Explanation of Agreement
Accounting
Academic advising
Administrative Computer
Advertising
Bookstore
Building Maintenance
Career Counseling
Child Care
Clerical
Computer Services
Computer Software
Courier Services
Custodial
Services
Outsourced Sold by
Services Institution Service Explanation of Agreement
Distance Education
Energy Maintenance
Food Services
Grounds
Instructional areas
Housing
Job Placement
Library
Mailing
Maintenance of
Personal computers
Network computer
Parking
Payroll
Personal Counseling
Police
Printing Services
Professional staff
Development Training
Services
Outsourced Sold by
Services Institution Service Explanation of Agreement
Publications
Purchasing
Records Management
Security
Telephone
Testing
Travel Arrangements
Vehicle Maintenance
Vehicle use
Vending Machines
Other (Specify)
Other (Specify)
Personnel 8 - 1
Institution
PERSONNEL QUESTIONS
Definition:
Full time faculty—those who are eligible for TRS or ORP and state insurance benefits
Part-time faculty—those faculty who are not eligible for TRS or ORP and state insurance benefits
Administration—those personnel who are not faculty but typically have eligible consideration for ORP
Part-time administrators—those personnel who are not faculty but would be eligible for ORP consideration if the position was full time
Classified—those clerical employees in support positions who are usually paid on an hourly basis, or technical and professional non-faculty positions which do not have the responsibility of the administrative positions
Part-time classified employees—those support employees not eligible for TRS or ORP due to assignment of hours (i.e., 19 or less) or length of assignment
1. If your definition of full time faculty is different from the one above, what is your
definition?
3. If your definition of an administrator is different from the one above, what is your
definition?
4. If your definition of classified is different from the one above, what is your definition?
5. If you have employment categories not listed above, please list and define them
below:
Category Definition
Part-time employees
11. Does your review process differ for full time faculty after the first few years?
yes no
If yes, how?
12. Are salary increases tied to this review process for full time faculty?
yes no
If yes, how?
13. What kind of review or evaluation process do you use for part-time faculty? (May
attach example or copy of evaluation instrument)
15. Does your review process differ for part time faculty after the first few years?
yes no
If so, how?
16. Are salary increases for part time faculty tied to this review process?
yes no
If yes, how?
17. What kind of performance evaluation process do you use for classified/clerical?
(may attach example or copy of evaluation instrument)
18. Is this evaluation process different after the first few years of service?
yes no
If yes, how?
21. What kind of performance/evaluation process do you use for administration? (may
attach example or copy of evaluation instrument)
23. Is your process different after the first few years of service?
yes no
Retirement
Please attach copies of any retirement policies that your institution is now using.
28. Does your institution offer a separate retirement plan from ORP/TRS?
yes no
If yes, answer the following:
Describe your plan:
29. Does your institution participate in Social Security other than Medicare?
yes no
If yes, is Social Security paid for your employees? yes no
If yes, is the employee share paid by the institution? yes no
If yes, what percentage is paid by the institution?
31. Does your institution use flex time scheduling of hours for some non-instructional
positions?
If yes, please attach a copy of your policy governing flex time scheduling and
answer the following question.
What deviations are allowed in scheduling work hours?
Volunteers
In what areas?
Instruction
33. What does your institution consider to be the size of an average lecture class?
Instructional Scheduling 9
Institution
INSTRUCTIONAL SCHEDULING
2. Please list the number of sections in an average semester for each area.
Area Average number of sections
Academic
Technical
Continuing Education
3. How does your institution schedule night classes (e.g., number of hours per meeting, and
days of the meetings, etc.)?
5. Are full time faculty required to teach evening classes as a part of their regular load?
yes no
6. Can a person obtain a degree from your institution by attending only evening classes?
yes no
7. Within these evening academic classes, are other students allowed to take the same class
for continuing education credit? yes no
If so, how are they charged for the class?
8. Are any evening classes scheduled at a company site or at an unusual time (midnight, etc)
for business or industry? yes no
If yes, please explain:
10. How many classes (total number of all sections) were offered at your institution for the
1996 summer?
11. What was the average number from the last three years of classes (total number of all
sections) offered in summer terms?
12. When are these classes offered? (check all the answers that apply)
Six week sessions
Eight week sessions
Ten week sessions
Eleven week sessions
Twelve week sessions
Other (Please explain)
13. Describe the summer classes offered by checking applicable times in column one;
then fill in the other two columns for each time checked, noting the total contact hours
generated and the cost versus revenue evaluation of each type of class.
Time Total Contact Hours Did you generate enough enrollment to be cost effective? (Y or N)
Academic Technical Academic Technical
One long session
per summer
14. Why do you feel that your course scheduling meets your students' needs?
15. Within these summer academic classes, are other students allowed to take the same class
for continuing education credit? yes no
If so, how are they charged for the class?
Nontraditional Scheduling
Definitions:
Mini-semester—a semester of credit offered in times between regular semesters
Weekend classes—credit classes which meet only on weekends (a weekend begins
at 5 p.m. on Friday and ends at midnight on Sunday)
Block classes within a long semester—credit classes offered for short condensed
periods of time rather than the 15 weeks of a semester or a summer
term
Extension center—an off campus location that offers college credit classes
excluding dual credit classes offered at high schools
Dual credit classes—Classes offered with both high school credit and college class
credit being granted at the same time
Concurrent classes—Classes that students still attending high school may
attend for college credit only.
1. Check the types of nontraditional class offerings based on the above definitions that
your institution offers and then answer the questions in the section that corresponds to
those programs marked.
Mini-semesters
Weekend sections
Block classes
Extension center
Dual/Concurrent
2. Has your institution found optimal scheduling periods other than those listed
above which are cost effective? yes no
If yes, please explain.
Mini-semester scheduling
3. Please fill in the blanks below to answer the following questions: When are your
mini-semesters scheduled? On the average, how many students enroll per semester?
How long do they meet? What is the average number of classes offered in a mini-
semester?
Time # Students Length # of Classes
Between spring and summer sessions
Between fall and spring sessions
Other (please explain)
4. Do you select instructors for mini-semesters based on cost efficiency (e.g., part time
vs. full time)? yes no
6. How do you calculate the cost to your institution to offer such classes?
7. For 1995 what percentage of the students enrolled in mini-semesters did not attend
standard 15 week classes as well?
10. Within these academic classes being taken for credit, are other students allowed to
take the same classes for continuing education credit? yes no
If so, how are these students charged for the class?
11. How do you package weekend classes, i.e., what days, times of those days and what
durations are your weekend classes offered?
12. Do you compact any classes into single weekend block classes allowing a student to
complete an academic class in a shorter time than usual?
yes no
If yes, what classes are offered?
13. Do you compact any classes into single weekend block classes allowing a student to
complete a continuing education class in a short time? yes no
If yes, what classes are offered?
16. Can a person obtain a degree from your institution by attending only weekend classes?
yes no
23. Do these block classes balance cost with the revenue generated? yes no
26. Generally how many students are required per class for the class to be offered?
in an academic class
in a technical class
in a continuing education class
27. Can a student receive a degree by taking classes offered only at your extension centers?
yes no
If campus faculty are used at extension centers, please answer the following:
Is it considered part of their regular load? yes no
Is it considered overload? yes no
Are faculty required to teach at extension centers? yes no
Are they compensated for time spent traveling? yes no
If yes, how are they compensated?
29. If employees of the main campus must travel to the extension centers, what type of
travel arrangements are used?
college vehicle
mileage paid (Please offer reimbursement rate)
meals
lump sum per semester
other (please explain)
32. How many students are required in order to offer these classes?
35. What credentials must the academic instructors possess to be qualified to teach dual
credit classes for your institution?
36. What credentials must the technical instructors possess to be qualified to teach dual
credit classes for your institution?
37. Who receives compensation for teaching these dual credit classes?
high school system
high school faculty doing the teaching
college faculty doing the teaching
other (please explain)
38. On the average how many continuing education classes are offered at your institution
during the summer?
Type 1st Quarter 2nd Quarter 3rd Quarter 4th Quarter
Funded
Nonfunded
40. How are funds generated for nonfunded continuing education classes?
41. How are funds generated for funded continuing education classes?
42. Is it your goal to make a profit from the funded classes or break even?
profit break even
43. Is it your goal to make a profit from the nonfunded classes or break even?
profit break even
46. Are any of your continuing education noncredit courses offered simultaneously for
academic credit? yes no
If yes, which ones?
Workforce 10
Institution
WORKFORCE DEVELOPMENT
Definitions:
Workforce Development — includes continuing education programs and customized
contract training for business and industry.
Workforce Continuing Education Programs — includes adult vocational education
courses offered by the College and approved by the Texas Higher
Education Coordinating Board.
Customized Contract Training — includes training and services which are
customized and contracted with business and industry to meet specific training
needs.
If you checked either of the above, please attach a list of your program classes and answer the
following questions.
The following questions are divided into the two categories defined above.
Please answer each column separately following the definitions given above.
How are the needs for such classes How are the needs for such activities
determined? determined?
How are such classes staffed? How are such activities staffed?
Full-time faculty Full-time faculty
Part-time faculty Part-time faculty
External consultants External consultants
Other (explain please) Other (explain please)
If you use full-time faculty, are these If you use full-time faculty, are these
classes considered part of their load? activities considered part of their load?
yes no yes no
If no, how are they considered? If no, how are they considered?
How are such classes funded? How are such activities funded?
Student tuition Student tuition
Special student fees Special student fees
Contracted with business Contracted with business
Other (explain please) Other (explain please)
Do you receive contact hour funding Do you receive contact hour funding
for these classes? yes no for these activities? yes no
What do the students receive for What do the students receive for
these classes? these activities?
academic credit academic credit
CEU credit CEU credit
Other (please explain) Other (please explain)
Do you strive to break even or make a Do you strive to break even or make a
profit in offering these classes? profit in offering these activities?
break even profit break even profit
Do you allow both continuing education Do you allow both continuing education
credit and academic credit to be given credit and academic credit to be given in
these classes? yes no in these activities? yes no
Identify the position which administers Identify the position which administers
the scheduling and staffing of these the scheduling and staffing of these
classes. activities.
How are such classes advertised? How are such activities advertised?
College catalog College catalog
Separate catalog or brochure Separate catalog or brochure
Mailouts Mailouts
Newspapers Newspapers
Contract with business Contract with business
Other (specify) Other (specify)
2. Based on the definitions given at the beginning, describe how your workforce
development efforts are administered (i.e., what the organizational chart looks like, how
the workforce development staff relates to the rest of the College, etc.) Please attach
workforce development organizational charts.
3. Describe how your workforce development efforts are evaluated. Please provide
samples of the evaluation tools if possible.
APPENDIX B
Qualifications for a Benchmarking Team
Starting a Benchmarking Process
In Belmont University's 1993 Quality Team Manual on Benchmarking, seven
points are listed for consideration before beginning to benchmark; they are paraphrased as
follows:
3. What should you benchmark: Choose those processes that align with the
organizational mission and contribute to the organization's long-term success.
6. How should you collect data? First, establish internal baseline performance
measures. Then be creative in tracking down other sources of data.
7. How can you implement what you learned? Determine the variances between
your processes and those benchmarked. Separate out, if necessary, factors unique
either to the benchmarked organization or to higher education. Then, develop a
mission statement for the process, and set clear goals and action plans (4, p. 63).
Baldrige Award Explanation
Benchmarking for School Improvement
Tucker, Sue (1996). Benchmarking: A Guide for Educators. Thousand Oaks, CA: Corwin
Press, p. 8.
Robert Camp's Formal 10-Step Process for a Benchmark Study
3. Plan and conduct the investigation—Determine what data are needed and how
to conduct the investigation. Document the best practices found.
4. Determine the current performance gap—...decide how much better the best
practices are than the current work methods.
5. Project future performance levels—Decide how much the performance gap will
narrow or widen in the near future and what repercussions this has for the
organization.
Definitions of Benchmarking
Benchmarking is the search for best practices that lead to superior performance.
Camp, R. C. (1989). Benchmarking: The search for industry best practices that lead to
superior performance. Milwaukee, WI: Quality Press (ASQC), p. 12.
• Benchmarking is the ongoing, systematic process for measuring and comparing
the work processes of one organization to those of another.
Dertouzos, M., Lester, R., Solow, R., and the MIT Commission on Industrial
Productivity (1989). Made in America: Regaining the Productive Edge. Cambridge, MA:
The MIT Press, p. 84.
• Benchmarking is an improvement process in which an organization compares its
performance against best-in-class organizations, determines how those
organizations achieved their performance levels, and uses the information to
improve its own performance; the subjects that can be benchmarked include
strategies, products/programs/services, operations, processes, and procedures.
Pinellas County Schools (1994). Superintendent's quality challenge. Largo, FL: Author,
p. 29.
Tucker, Sue (1996). Benchmarking: A Guide for Educators. Thousand Oaks, CA: Corwin
Press, p. ix.
• Benchmarking is not a cookbook program that requires only a recipe for success.
Instead, it is a discovery and learning process that can be used over and over again
to achieve different goals. It is a way of working and thinking in the school to
achieve continuous improvement.
Tucker, Sue (1996). Benchmarking: A Guide for Educators. Thousand Oaks, CA: Corwin
Press, p. 3.
• Benchmarking does not mean cloning, without thought, the success of other
companies. What is best practice in one organization cannot readily be transferred
to another without a thorough understanding of the learning that has gone into
achieving the standard, and recognition of the impact of the process on the culture
of the organization, in terms of both customer and employee reactions.
Possible Types of Benchmark Studies
The four types of benchmarking are internal, competitive, functional, and generic.
Benchmarking's rationale requires that the necessary effort be made to identify expert
and reliable benchmark resources. Limiting a benchmarking activity to a smaller population
of potential organizations greatly reduces the quantity of information generated.
Scope of benchmarking activity: one-time event, periodic, continual.
Chris Millard, Logistics Director at the Rover Body and Pressings plant, recognizes that
having a clear focus is vital: "If you only benchmark an operation in total terms, you will
miss. You examine and establish benchmarks for the processes which are the drivers in
achieving the targets set for the overall operation."
APPENDIX C
POSSIBLE AREAS FOR BENCHMARKING
Peterson's AGB Strategic Indicators Survey Major Headings
Revenue
Tuition and fee income
Government appropriations
Government grants and contracts
Private gifts, grants and contracts
Endowment support for operations
Sales/service—educational activities
Sales/service—auxiliary enterprises
Sales/service—hospital
Other sources
Physical Plant
Financial:
Beginning of year value
Depreciation for the year
Retirement of plant
Additions to plant
End of year value
Plant inventory and condition
Estimate deferred maintenance
Endowments
Beginning of year market value
Return on investment
Other additions to endowment
Subtractions from endowment
Normal support for operations
Special uses
End of year market value
Students
Fall enrollments (headcount & FTE, by level)
Fall FTE enrollment, by EEOC & level
Fall FTE enrollment by gender & level
Fall FTE enrollment by field of study & level
Degrees awarded by level
Admissions data for the full year:
Number of applications
Number of offers of admission
Number of matriculants
Geographical dispersion of entering students by level
Number of states represented
Student head count from home state
Students from outside the U. S. and Canada
Tuition and financial aid
Published charges
Financial aid headcounts by type of aid
Financial aid dollars by type of aid
Faculty and Staff
Faculty numbers (full and PT by rank)
Regular faculty FTE
By field & rank
By EEOC category & rank
By gender & rank
Percent faculty over 60 years old
Faculty gains and losses for year
By rank
Headcount at the beginning of the year
In-hire
Voluntary termination
Termination by death or disability
Termination by the institution
Change, i.e. non-tenured to tenured
Headcount at the end of the year
Sponsored research
Expenditures for organized research
U. S. Government (by major agency)
State & local government agencies
Domestic corporations & foundations
Other domestic private foundations
Foreign governments & corporations
Bequests and gifts from living individuals
Other outside sponsors
Institutional funds
Academic year faculty salary offsets
% of regular faculty members who are principal investigators on sponsored projects
Research proposal and award statistics
Proposals sent to potential outside sponsors
Awards received from outside sponsors
Fundraising
Dollars raised during the year by source
Dollars raised during the year by use
Designated or restricted dollars for current operations
Designated or restricted dollars for student aid
Designated or restricted dollars for plant
% of living alumni who are active donors
APPENDIX D
BUSINESS OFFICE RESULTS
(Six colleges reporting)
Dollar Amount Percentage of Budget
Efficient lamps and fixtures
Computerized heating controls
Computerized electrical system
Revised computerized water irrigation system
Consolidated courses to fewer buildings
Closed Friday in summer
Technology
Technology Master Plan
$10 fee per student per semester
Budget limits for functional areas based on revenue limitations
Purchase through competitive bid process
Group all technology equipment together for better purchasing price
Budget monitoring and review by fiscal area for purchases
Thorough research of most productive/economical needs/goals
Personnel
Position justification
Budget limits for functional areas
Eliminated 18 positions for 95-96
Require two weeks lapse before replacing terminated employees
Hire all new employees at no more than three steps above entry level
Review faculty positions when faculty leave or retire
Replace full-time faculty with part-time
Flatter organizational structure
Travel
Zero budgeting; no budget increases
Budget limits for functional areas based on revenue limitations
Require employees and students to travel together
Require use of rental vehicle when cost is less than personal vehicle
Budget monitoring and review by fiscal area
All out-of-state travel requires presidential approval
Percentage cuts during budget development
Maintenance
Preventive maintenance program
No increase in budgets
Comprehensive facilities survey to identify deferred maintenance costs
Eliminated maintenance contracts; hired three computer technicians and one
vehicle mechanic
Set up maintenance on time and materials budget
Reduce full-time staff and replace with part-time staff and student
workstudy
Increased preventive maintenance
Computerized preventive scheduling
Monitoring in-house maintenance cost vs. "outsourcing costs"
Travel
For faculty:
Formula=$300 per area/$140 per faculty/$1000 per division
Supervisor approval
See attached policies in study notebook
For staff:
• Zero budgeting
• Supervisor approval
• See attached policies in study notebook
For students:
• Zero budgeting
• Supervisor approval
• See attached policies in study notebook
Two colleges:
• Allocate by those responsible for performance
• Application for professional development and institutional monies
For administration:
Five colleges allocate by individual budgets:
• Using zero budgeting
• Budget requests
• As approved by dean/administration within budget guidelines
Two colleges:
• Approve administrative travel annually
• Allow for application for professional development and institutional
monies
For classified:
Four colleges allocate travel money by individual budgets as
• Requested by chair
• In department annual budget
• With dean/administrative approval within budget guidelines
Purchasing
Five colleges combine purchasing with
ISDs
Local city governments
Hospitals
County governments
Consortiums
State General Services Commission
Six colleges are interested in further group purchasing especially in the area of
technology.
Self-insure
Four colleges self-insure
• Three in the area of Workers' Compensation
• One only in the deductible portion
COMPUTER TECHNOLOGY RESULTS
(Six colleges reporting)
Programming language:
• Four colleges use COBOL
• One college each uses
• 4GL Report Writer (QUIZ)
• Unibasic
• DATATRIEVE
• FOCUS
Table D.1. Administrative computing
Key: Administrative software (H=Homegrown; P=Proprietary, brand name given if P);
Ad hoc reporting (N=Not possible; D=Outsourced DP staff; L=Local college DP staff;
O=College non-DP staff); Accessible by (Q=Query language; R=Report generator;
D=Downloading files; O=Open Data Base Connectivity).

Prospective Students
Administrative software: two colleges, H; one college each, P-POISE, Datatel Colleague, P-SCT
Ad hoc reporting: five colleges use local DP staff; one college uses college non-DP staff
Accessible by: four colleges access by downloading files; three by query language; two by
report generator; one by Open Data Base Connectivity

Registration
Administrative software: two colleges, H; one college each, P-POISE, P-Defunct,
Datatel Colleague, P-SCT
Ad hoc reporting: six colleges use local DP staff; two colleges use college non-DP staff
Accessible by: five colleges access by downloading files; four by query language; three by
report generator; one by Open Data Base Connectivity

Transcripts
Administrative software: two colleges, H; one college each, P-POISE, P-Defunct,
Datatel Colleague, P-SCT
Ad hoc reporting: six colleges use local DP staff; two colleges use college non-DP staff
Accessible by: four colleges access by downloading files and report generator; three by
query language; two by Open Data Base Connectivity
Table D.1. Continued

Personnel
Administrative software: two colleges, H; one college each, P-Defunct, Datatel Colleague, P-SCT
Ad hoc reporting: five colleges use local DP staff; one college uses college non-DP staff
Accessible by: three colleges access by downloading files and report generator; two by
query language and Open Data Base Connectivity

Payroll
Administrative software: two colleges, H; one college each, P-POISE, P-Defunct,
Datatel Colleague, P-SCT
Ad hoc reporting: six colleges use local DP staff; two colleges use college non-DP staff
Accessible by: four colleges access by downloading files, report generator, and query
language; one by Open Data Base Connectivity

General Ledger
Administrative software: one college each, H, P-CUFS, IAFRS, P-SCT IA+,
Datatel Colleague, P-SCT
Ad hoc reporting: six colleges use local DP staff; three colleges use college non-DP staff
Accessible by: five colleges access by downloading files; three by report generator; two by
query language; one by Open Data Base Connectivity
Table D.1. Continued

Accounts Payable
Administrative software: one college each, H, P-CUFS, IAFRS, P-SCT IA+,
Datatel Colleague, P-SCT
Ad hoc reporting: six colleges use local DP staff; three colleges use college non-DP staff
Accessible by: five colleges access by downloading files; three by report generator; two by
query language; one by Open Data Base Connectivity

Accounts Receivable
Administrative software: one college each, H, P-CUFS, P-SCT IA+, Datatel Colleague, P-SCT
Ad hoc reporting: five colleges use local DP staff; three colleges use college non-DP staff
Accessible by: five colleges access by downloading files; three by report generator; two by
query language; one by Open Data Base Connectivity

Financial Aid
Administrative software: one college each, H, P-POISE, SAFE ACT, P-Defunct,
Datatel Colleague, P-SCT
Ad hoc reporting: six colleges use local DP staff; two colleges use college non-DP staff
Accessible by: five colleges access by downloading files; four by query language; three by
report generator; one by Open Data Base Connectivity

Inventory
Administrative software: five colleges, H; one college, Datatel Colleague
Ad hoc reporting: five colleges use local DP staff; one college uses college non-DP staff
Accessible by: three colleges access by downloading files and query language; one by
report generator; one college reports access is not possible
Table D.1. Continued

Library
Administrative software: one college each, P (data res.), H, DRA, Dynix, P-CLSI
Ad hoc reporting: three colleges report it is not possible; two colleges use local DP staff;
one college uses outsourced DP staff
Accessible by: two colleges access by report generator; one by downloading files

Purchasing
Administrative software: one college each, H, IA, P-SCT IA+, Datatel Colleague, P-APS
Ad hoc reporting: five colleges use local DP staff; two colleges use college non-DP staff
Accessible by: three colleges access by downloading files; two by report generator and
query language

Bookstore
Administrative software: two colleges, N/A; one college, P-RAYCEE
Ad hoc reporting: two colleges, N/A; one college uses downloading files
Accessible by: two colleges, N/A; one college accesses by report generator

Academic Advising
Administrative software: two colleges, H; one college each, P-POISE, P-Defunct,
Datatel Colleague
Ad hoc reporting: five colleges use local DP staff; two colleges use college non-DP staff
Accessible by: four colleges access by query language and downloading files; three by
report generator
Table D.2. Maintenance and service: (Quest. 4) (R=6)
Equipment In house Service contracts Other outsource
Printers Six colleges Three colleges
PCs Six colleges Two colleges
LAN Six colleges Two colleges One college
WAN Four colleges One college One college
Personnel Questions
Table D.6. Computer personnel breakdown
Percent Hourly Pay H=Hourly
Job Title Time Index M=Monthly
Secretary (example) 100% 100 H
Programmer-analyst (example) 75% 240 M
Network Administrator 100% 272 M
Network Analyst 100% 222 M
Secretary I 100% 137.5 M
Operator I 100% 159.5 M
Operator II 100% 175 M
Micro Specialist 100% 193.3 M
Programmer/Analyst 100% 202.9 M
Programmer/Analyst I 100% 213.3 M
Asst. Syst. Programmer 100% 213.3 M
Systems Programmer 100% 224.2 M
Director, Computer Services 100% M
Systems Analyst 100% M
Programmer Analyst 100% M
Computer Programmer 100% M
Coordinator, Network Services 100% M
PC Supervisor/LAN Manager 100% M
PC Specialist 100% M
Policy Questions
Definition:
Technology—Telecommunications services such as telephone and data services,
instructional equipment, such as audio-visual equipment, and computer equipment.
General technology fee: (Quest. 7) (R=6)
Four colleges have general technology fees:
• Two colleges, $10 per semester
• Three colleges, $3 per semester hour
These fees give students access to
• Computers and/or technical equipment
• Technology on campus
• Nothing; the fee is used to keep current with new computer technology in computer
labs
Three colleges dedicate the fees to the purchase or support of technology.
No college encumbers the fee for support personnel.
No college encumbers the fee for maintenance/upgrades.
Two colleges charge lab fees which include technology usage separate from a technology
fee.
• One of these colleges uses the fees for English, Office Occupations, Information
Systems
Network accounts for students: (Quest. 8) (R=6)
Four colleges provide students with network accounts including:
• Four E-mail
• Three WWW
• Two Telnet
Three colleges tie these accounts to
• Courses
• Generic
Computers provided at the institution's expense: (Quest. 9) (R=6)
Five colleges provide student computers; usage is provided by
• Budget
• In labs and classrooms
• Vocational-Technical Majors
• Open labs - first come, first served basis;
• Capital Equipment Task Force
Six colleges provide faculty computers; usage is provided by
• Two colleges in the budget
• One college each
• Being full-time faculty
• Six machines on loan for three weeks or less, first come, first served; budget requests
• Based on need and availability of funds;
• Capital Equipment Task Force
Six colleges provide administrative computers; usage is determined by
• Two colleges by budget
• One college each by
• Job Description
• Being full-time faculty
• Based on need and availability of funds
Six colleges provide clerical computers; usage is determined by
• Two colleges determine by budget
• One college each determines by
• Being full-time faculty
• Based on need and availability of funds
• Need based on job description
Replacement policy for microcomputers: (Quest. 10) (R=6)
In labs/classrooms:
• 20%/year
• As funds allow
• No policy in place
• As needed
• Software drives replacement
• Still under development
In offices:
• 25%/year
• As funds allow
• No policy in place
• As needed
• Software drives replacement
• Still under development
Ratio of service providers: (Quest. 11) (R=6)
Two colleges 1:100
One college each
• 1:248
• 1:260
• 1:300
• 1:350
Remote access to college's network: (Quest. 12) (R=6)
Only one college provides remote access to the college's network at no charge, for
faculty and staff but not for students.
Computer operating systems: (Quest. 13) (R=6)
Six colleges provide Windows 3, 3.1, 95
Four colleges provide DOS
Two colleges provide UNIX
One college each provides
Macintosh
MS-DOS
Apple IIOS
VMS
VSE
VM
Hewlett Packard MPE/XL
DEC open VMS
Windows NT
Novell 4.x, 3.x
Novell
Solaris
X86
VAX
DOS 6.22
Network operating systems: (Quest. 14) (R=6)
• Four colleges support NOVELL
• Three colleges support
• UNIX
• Windows NT
• Two colleges support
• Netware 3x & 4x
• One college each supports
• OS/2
• Solaris X86
Training
Budget
Technology budget: (Quest. 22) (R=5)
• R=$200,000-$3,000,000
• Avg.=$1,554,600
• M=$1,642,031
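The R (range), Avg. (mean), and M (median) figures reported throughout these results can be computed as in the following sketch. The budget values in the example are hypothetical, chosen only to illustrate the notation; they are not the colleges' reported data.

```python
# Illustrative sketch of the R/Avg./M summary statistics used in these
# results: R is the low-high range, Avg. the arithmetic mean, M the median.
from statistics import mean, median


def summarize(values):
    """Return ((low, high), mean, median) for a list of reported figures."""
    return (min(values), max(values)), mean(values), median(values)


# Hypothetical technology budgets for five responding colleges.
budgets = [200_000, 750_000, 1_500_000, 2_300_000, 3_000_000]
(low, high), avg, med = summarize(budgets)
print(f"R=${low:,}-${high:,}  Avg.=${avg:,.0f}  M=${med:,.0f}")
```

With an even number of responses, `statistics.median` averages the two middle values, which is one reason the reported mean and median can differ noticeably.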
Service
• Software installation by:
• Centralized software installation
• Implement software standards
Additional comments on computer service:
• QUESTION #31—We experience problems in all areas. We fix the problems as they
become known and have that knowledge when we face a similar situation. We try
to implement network and hardware standards and are currently
implementing a help desk program.
• Our problem is numbers of machines versus numbers of technicians
DEVELOPMENTAL LABS, INSTRUCTION & TUTORING QUESTIONS
Developmental Instruction
(Six colleges reporting)
Developmental department/division: (Quest. 8) (R=6)
Five colleges have developmental classes that are offered within the various
departments/divisions (English and math).
Two colleges have separate departments/divisions for developmental classes.
One college each has
• A Developmental division which coordinates developmental advising and
houses ESL, a Learning Center, and Reading Remediation
• A Developmental Education Department which coordinates reading, study
skills, and ESOL.
Developmental class size limit: (Quest. 13) (R=6)
Writing
R=8-23
Avg.=19
M=20
Reading
R=12-25
Avg.=22
(Three colleges indicated a limit of 20 students on the 1st level and 25 students on
the 2nd level.)
Math
R=22-30
Avg.=27.8
ESL: One college indicated a limit of 12 in ESL classes.
Tutoring
Tutoring service funding: (Quest. 16) (R=6)
All seven colleges fund tutoring with institutional funds; one indicated a supplemental
instruction fee to students.
Five colleges use
• Workstudy funds
• Grant funds
Four colleges use Carl Perkins Grant
Three colleges use Student Support Services Grant
One college uses volunteers from the college and the community to provide
tutoring.
Learning Center
Learning center staffing: (Quest. 18) (R=6)
Seven colleges use full-time instructors in their learning centers.
Five colleges use
• Lab assistants
• Lab supervisors
• Part-time instructors
Three colleges use full-time instructors on a partial load
One college each uses
• Volunteers
• Student Support Services tutors/coordinators
DISTANCE EDUCATION RESULTS
(Six colleges reporting)
Organizational Structure: (Quest. 6-7) (R=5)
Two colleges' distance learning entities report to Vice Presidents of
• Information Technology
• Instruction and Student Services
• Administrative Services (Computing Services)
One college each reports its distance learning entity to
• Respective Deans
• Instructional Administrators
• College President
Table D.10. Titles and responsibilities of distance education support personnel
Coordinator of Distance Education (two colleges; at one college this person also teaches
Chemistry)
Director of Media Services
Associate VP of Information Technology: Media Services, Computer Services,
Telecommunications, Academic Computing
Webmaster: WWW activity
Director of Computer Labs: manages personnel & structure of labs
Distance Learning Technician: operates DL classroom
Media Services Coordinator: media production
Systems Programmer: computing support services
Technical Branch Librarian: reference/media support in Tech library
Dean of Learning Resources: coordinates Library/Media Centers; Faculty/Staff
Development
Director, Multimedia Access & Prod. (1/3 time): manages campus distribution of AV
equipment/media production
Video Production Technician (1/2 time): campus video production support
Systems Technician (1/4 time): maintaining campus equipment/systems
Learning Resources Secretary (1/2 time): maintaining division records/correspondence
Media Tech: campus-wide media services
Registration and Advising: (Quest. 12) (R=5)
All five colleges have distance education students go through traditional registration.
• Four colleges have students receive traditional advising.
• One college does off-campus bulk advising.
Budgeting
(Five colleges reporting)
Income/expenses: (Quest. 16-17) (R=5)
• One college's income/expenses is in the instructional budget.
• One college is limited to contact hour reimbursement.
• One college each makes
$650,000 approx.
$25,000 (Fall 1996)
Comments:
• Distance learning is very expensive; priorities and expectations must be determined
before implementation.
• Hard costs for operating run about $400,000; capital outlays vary on a year-by-
year basis; administrative costs need to be added to figures.
• Distance learning equipment and the cost of telecourses are separate accounts;
personnel costs are incorporated into instructional and support services
departments.
Equipment
(Five colleges reporting)
Comments:
At present the college can offer only point-to-point interactive video conferencing. The
college hopes to be able to find equipment for multipoint delivery.
Telecourses broadcast via local cable with compressed video/audio are in the near future
and will be shared with the local technical college.
We have a distance learning classroom acquired through a federal grant. It is equipped
with two robotic cameras, a presentation camera, a 12-student capacity with audio,
multimedia computer capability, an Aladdin Pinnacle Media Printer, and various other
production equipment.
Agreements
(Five colleges reporting)
Comments:
• Negotiated on a semester by semester basis
• Agreements are currently being negotiated with other institutions.
• Lease nine telecourses from DCCCD and Tarrant Co. Junior College so that the
telecourses may be broadcast over a PBS station. The signal is downlinked by
the cable companies within our three counties.
DELIVERY SYSTEMS
Broadcast
(Three colleges reporting)
For classroom equipment and course titles see attachments in study data notebook.
No college reports that the cost of offering broadcast classes exceeds the benefits.
ITFS and Microwave
(One college reporting)
Profit: (Quest. 8) (R=1)
No; such classes are offered because they are a service to the students and are beneficial
to them. We started these classes because they seemed to be the wave of the future, and we
are in the business of educating and should stay abreast of future technology.
Systems: (Quest. 1) (R=1)
3 colleges: e-mail with list servers
3 colleges: WWW
Providers and cost: (Quest. 2) (R=2)
Two colleges—Texas A & M
Two colleges—The Net
Services provided: (Quest. 8) (R=4)
Four colleges provide WWW for employees and students
Publicity: (Quest. 18) (R=4)
Two colleges publicize outside of campus
One college publicizes by
• Inside campus
• Inside and outside together
• Paid advertising
Disadvantages: (Quest. 21) (R=4)
• Student participation requires heavy commitment and motivation
• Still can't carry a lot of information/video files
• High drop rate
• Students lose campus experience and physical presence of the classroom
• Not all course work can be taught effectively through this medium
• Costs for equipment and upgrades
• Training and support time
• Costs for developing effective classroom techniques and presentations by
faculty
Additional comments:
• LRC access to curriculum support materials and instruction on research
techniques.
• Use of online resources is a critical element in this medium.
• "Browsing the stacks" for relevant information is no longer an option.
• Online Reference Helpdesk is a must.
• Storage and/or access for full text materials online should be an integral
part of the planning. Document request and delivery services must be
addressed.
Video
215
Length: (Quest. 4) (R=4)
• 5 years
• 10 years
• 10 years for downlinks, 6 years for cassettes
• 30 years
216
Cost/expenses: (Quest. 11) (R=4)
• Two colleges report cost does not exceed expenses.
• Two colleges do not know if cost exceeds expenses.
217
RESEARCH/PLANNING/INSTITUTIONAL EFFECTIVENESS RESULTS
(Six colleges reporting)
Institutional Effectiveness (R=0-2, Avg.=.6)
Job titles:
• Dean of Program Development and Institutional Research
• Secretary
• Director of Institutional Research, Oversight
• Other (specify)
• Division Senior Secretary-Handles duties related to research, planning, institutional effectiveness, and resource development
• All of the above together - Director, Planning & Institutional Research
• Research Associate
• Office Assistant (classified employee)
218
Early procedure for institutional planning and effectiveness: (Quest. 3) (R=6)
Four colleges review operations yearly for revisions.
One college
Evaluates and plans on a 3-year cycle
Is revising the planning process but intends that the new time line be fairly
flexible
219
Resource Development
Grants
Job titles:
Dean of Institutional Advancement
Dean of Program Development and Institutional Research
Director of Financial Services
Other deans/directors, faculty, staff as appropriate for each grant
Dean of Occ/Ed
Fin. Aid Director
Business Office
Grant developer
Assistant to President for Industrial Development
Grants/Auxiliary Coordinator
Planning & Grants Coordinator
Director, Foundation
Grant Acct.
Grant Monitor
Grant Directors
220
Approval by the Executive Administration and, if matching funds are required, approval by the Board of Trustees.
Approval from immediate supervisor, then division director, dean and appropriate
vice-president.
Approval by Dean, Director of Development, V.P. of Administration, President
Foundation
221
Management fees paid by foundation funds:
R=5%-33%
Avg.=17%
M=15%
One college reports that 100% investment management fees are paid by the foundation.
222
INSTRUCTIONAL SURVEY RESULTS
(Six colleges reporting)
Cost Study
Instructional Administration
Definition:
Instructional administrators-Employees who supervise faculty, oversee programs
and may or may not teach.
223
Table D.13. Instructional administration load and compensation

Colleges Have | Level and title | Compensation | Considered | Load | Contract
6 | Vice president (6-for instruct., 1-acad. affrs.) | | 6-Admin. | 0 | 12 mon.
6 | Dean/Div. Chr. | Salary | 5-Admin., 1-Fac. | | 6-12 mon.
5 | Dept. Chairs | R=$100-$3600, Avg.=$1611 | 4-Fac. | R=3-18 hrs. | R=9-12 mon.
6 | Prog. Directors/Discipline Coord. | R=$500-$3500, Avg.=$1868 | 5-Fac. | R=9-24 hrs. | M=9 mon.
3 | Prog. Coord. | R=$0-$1800 | 3-Fac. | | M=9 mon.
*All responding colleges attached job descriptions defining the duties of each position.
224
Calculating compensation: (Quest. 8) (R=5)
Two colleges use a formula for calculating compensation for instructional administrators.
One policy attached
Formula: SHE release: 3 (1-6 fac); 6 (7-12 fac); 9 (13+ fac); 12
Stipend: Based on number of FT fac ($200 each) and adjunct ($100 each)
supervised
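The tiered release and stipend formula above reduces to simple arithmetic. A minimal sketch under the reported tiers and rates (function names are illustrative; the final "12" release tier is omitted because its faculty range is not given in the response):

```python
def release_hours(full_time_supervised):
    """SHE release hours by number of full-time faculty supervised,
    per the reported tiers: 3 for 1-6, 6 for 7-12, 9 for 13 or more."""
    if full_time_supervised <= 6:
        return 3
    if full_time_supervised <= 12:
        return 6
    return 9


def stipend(full_time_supervised, adjunct_supervised):
    """Stipend: $200 per full-time and $100 per adjunct faculty supervised."""
    return 200 * full_time_supervised + 100 * adjunct_supervised


# A chair supervising 8 full-time and 5 adjunct faculty:
print(release_hours(8))  # → 6
print(stipend(8, 5))     # → 2100
```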
225
Physical activity class load (including how many hours per week p.e. classes meet)
Hours per week = Load hours
• 3 hours per week = 2.5 load hours
• 2 hours per week = 1 load hour
• 3 hours per week = Full load is 6 physical activity classes per week
• 3 contact hours (1 lecture, 2 lab) equate to two SHE
• 3 hours per week = 2.14 load (credit = 3 hours contact)
226
Coop instruction load
• .15 load hour per student
• One load hour for five hours
• # students/6
• See attachment listing load equivalents
Intemship load
• .30 load hour per student
• One load hour for five hours
• # students/6
• See attachment listing load equivalents
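The per-student conventions above can be computed directly. A minimal sketch (the rates are those reported: .15 per coop student, .30 per internship student, or one load hour per six students; function names are illustrative):

```python
def per_student_load(num_students, rate):
    """Load hours at a per-student rate
    (.15 reported for coop, .30 for internship)."""
    return round(num_students * rate, 2)


def load_by_sixths(num_students):
    """Alternative reported convention: one load hour per six students."""
    return round(num_students / 6, 2)


# A 12-student section under each convention:
print(per_student_load(12, 0.15))  # → 1.8 (coop)
print(per_student_load(12, 0.30))  # → 3.6 (internship)
print(load_by_sixths(12))          # → 2.0
```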
Lecture/lab load
Two colleges calculate: Lecture is one load hour per one class hour/Lab is one load hour
per two lab hours.
• One college has a listing of load equivalents
227
One college for
• Curriculum development
• Dual credit program coordinator
• Honors program coordinator
• Some lab coordinators
Cost effective
Two colleges
• Don't know if pay compensation has been cost effective.
• Say no, pay compensation has not been cost effective.
One college each says
• Pay compensation is cost effective because it is paid at the part-time rate.
• Full implementation is planned for 96-97.
Three colleges say yes, teaching load reductions or release time have been cost effective
by explaining that typically faculty spend more time on the project than they
would on their regular classes, and the released class is covered at the part-time
rate.
Two colleges say no, teaching load reductions or release time have not been cost effective, explaining that one course reduction is the equivalent of the college paying 1/10 of salary for the activity, plus adjunct cost.
None of the five colleges have found a more effective way than pay or load reductions to accomplish such administrative duties.
Two colleges offer mixed classes (i.e., semester hour credit students and adult vocational
students in the same class) calculating load
• On basis of the load for the credit class
• By counting adult vocational students toward the total required to make a
class, since they pay the same costs.
228
Average lecture class: (Quest. 21) (R=5)
R= 19.52-35
Avg.= 28.4
M=30
Small transfer classes (i.e., under 10 in enrollment) load credit: (Quest. 22-23) (R=5)
Four colleges use prorata formula based on number of students
Examples:
• 3 credit, 3 lecture hour class with 5 students equals 1.5 load hours
• Based on 15; if class has 10 students, salary = 10/15
• Dependent on intention of offering for proration or full credit
• Sometimes a full-time faculty (like adjuncts) may teach a class on a pro-rated basis with 12 being the base number.
One college
• Decides to run a class under 12 if it is taught by a full-time faculty for part of a 15 hour load
• Is in process of developing policy
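As a worked illustration of the prorata conventions above (the base of 10 is inferred from the reported example of a 3-credit class with 5 students earning 1.5 load hours; function names are illustrative):

```python
def prorated_load(credit_hours, num_students, base=10):
    """Prorata load: scale the class's credit hours by enrollment
    over the base class size (5 students in a 3-credit class with
    a base of 10 gives 1.5 load hours)."""
    return credit_hours * num_students / base


def prorated_salary_fraction(num_students, base=15):
    """Other reported convention: pay fraction = enrollment / 15."""
    return num_students / base


print(prorated_load(3, 5))           # → 1.5
print(prorated_salary_fraction(10))  # → 0.6666666666666666
```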
Small vocational/technical classes (i.e., under 10 in enrollment) load credit: (Quest. 23)
(R=4)
Three colleges use a prorata formula
• Based on the number of students—3 credit, 3 lecture hour class with 5
students equals 1.5 load hours
• Based on 15 except those programs with accreditation stipulations on class
size
• Decisions are made to run a class under 12 if it is taught by a full-time faculty for part of a 15 hour load
• Sometimes a full-time faculty (like adjuncts) may teach a class on a pro-rated basis with 12 being the base number.
One college
• Gives full credit
• Is in process of development
229
Two colleges offer faculty of extra large classes help in grading.
• By using SCANTRON with lead faculty grading essay portions of exams
• By permitting student help
Instructional Personnel
Percentage of your full-time employees who are full-time faculty: (Quest. 27) (R=4)
R= 38-50
Avg. 45.7%
M=47.5%
Percentage of your continuous part-time employees who are part-time credit faculty: (Quest. 28) (R=4)
R=67%-95%
Avg.=85.75%
M=90.5%
230
Faculty office hours: (Quest. 31) (R= 5)
All five colleges require faculty to maintain office hours, with the minimum requirement of
Two colleges: 10 hours per week
One college each:
• Whatever is necessary to consult with students
• 5 hours per week or one hour per day
• One hour a week for each three hours taught is our requirement.
Counselor credentials:
• Two colleges require a Master's Degree in guidance and counseling
• One college requires L. P. C.
231
Four colleges do not increase counselor salaries with credentials such as LPC.
Two colleges select the mentors from the same work area
Professional Development
232
Academic professional development opportunities: (Quest. 42) (R=3)
Numerous workshops, seminars, conferences, travel
Institutional level
Program level—All relate to goals and objectives
Individual level
Mini-grants
Return to industry
Return to university
Faculty development leave
No-cost tuition for classes
233
Classified professional development opportunities: (Quest. 46) (R=5)
Two colleges offer no-cost tuition classes according to work schedule.
One college offers
• Numerous workshops, seminars, conferences and travel
• Computer software, human relations
• Local workshops, on-campus seminars
• One college has a separate non-faculty professional development committee
234
Required professional development activities: (Quest. 52) (R=5)
Three colleges require some professional development activities.
Part-time faculty
235
Percentages of credit contact hours generated by part-time faculty for fall 1995:
R=10%-50%
Avg. 27.6 %
Percentage of noncredit contact hours generated by part-time faculty for fall 1995:
R=95%-100%
Avg.=98.3%
Departments/divisions use of part-time faculty in the fall of 1995: (Quest. 57) (R=5)
236
Table D.19. Use of part-time faculty-Institution 2 - Listed by deans
Department/Divisions | Percentage of semester hours taught by part-time faculty | Percentage of contact hours taught by part-time faculty | Percentage of noncredit classes taught by part-time faculty
237
Maximum percentage goal (% of classes) of part-time faculty: (Quest. 60) (R=4)
• Two colleges have not defined this percentage.
• Two colleges report 35% as a maximum percentage for part-time faculty.
238
Discipline-specific computer labs: (Quest. 66) (R=5)
239
Table D.22. Discipline-specific computer labs—Institution 2:
Discipline/Department Number of What other disciplines use these labs?
labs
Accounting 2
Graphic Arts 2
Office 2
CADD 2
Electronics 2
Micro-computers 1
Computer Science 8
SOS/Engineering 3
Journalism 1
Medical Records 1
Open labs 2
RTDC-(Continuing Ed.) 6
240
Table D.23. Discipline-specific computer labs—Institution 3:
Discipline/Department Number of What other disciplines use these labs?
labs
Art 2
Communications 2
English, Philosophy & 4
Reading
Legal Professions 2
Restaurant Management 1
Distribution & 3
Marketing
Trade & Industry 3
Special Law 1
Enforcement
Industrial Education 2
Technical Education 3
Dental Assisting 1
Allied Health 2
241
Table D.24. Discipline-specific computer labs—Institution 4:
Discipline/Department Number What other disciplines use these labs?
of labs
Office Technology 5 Business/Accounting
CISY/Electronics 5 Electronics/Accounting
English 3 Read/Cont. Ed.
Math 2 Cont. Ed.
Reading 1 English
Learning Center 3 College-wide
Foreign Language 1 Spanish/French/ESL
242
Five colleges have these systems accessed by
• Administration
• Academic
• Technical
One college has clerical access to input files.
243
Lab supervisor and student work: (Quest. 78) (R=4)
Four colleges use lab supervisors to aid students in the lab.
Two colleges use lab supervisors to teach or tutor students in the lab.
One college uses lab supervisors to evaluate/grade student work in the lab.
Pay for lab assistants (excluding student assistants): (Quest. 80) (R=5)
Two colleges pay on an hourly rate.
One pays by
• Pay grade determined by personnel officer
• Minimum wage
• On the classified grade and step scale
244
Technical labs organized: (R=4)
Four colleges organize technical labs on competency-based criteria.
One college organizes technical labs on a customer-driven service basis.
Pharmacy Technology 1
Allied Health 2
Radiologic Technology 21
Respiratory Care 10
Surgical Technology 2
Veterinarian Assistant 1
Accounting 14
245
Table D.26. Continued:
Department Number
Management 14
Business 50
Real Estate 8
Fire Protection Technology 8
Criminal Justice 5
Automotive Technology - TDC 2
Automotive Technology 6
Commercial Service Technology 8
Electronics Service Technology 13
Industrial Technology 8
Environmental Health Technology 6
Hazardous Material Technology 4
Art 21
Dance 1
English 50
Journalism 9
Mass Communications 31
French 9
Music 31
Photography 12
Radio Television 10
Speech Communication 2
Theater Art 4
Vocational Nursing 11
Architecture 2
246
Table D.26. Continued:
Department Number
Biology 30
Electronics 13
Engineering 7
Math 59
Astronomy 1
Chemistry 13
Geology 6
Physics 6
Physical Science 3
TOTAL 726
247
Table D.28. Noncomputerized labs-Institution 5:
Department Number
CHDV
Arts
Office Technology
Management
Foreign Language
TOTAL 144
Travel Considerations
248
Equipment
249
Disposal of equipment policy: (Quest. 92) (R=5)
Four colleges dispose of equipment at a surplus sale/auction.
One college disposes of equipment by deactivation.
250
OUTSOURCING RESULTS
(Six colleges reporting)
Outsourced Services:
The six institutions reporting record 17 different areas which are outsourced.
251
Maintenance of personal computers-Agreement is for annual maintenance service
Printing-Agreement is for color printing, brochures and catalog
Vehicle use-Consortium pricing/lowest quotes
252
PERSONNEL QUESTIONS
(Six colleges reporting)
Definition:
Full-time faculty-those who are eligible for TRS or ORP and state insurance benefits
Part-time faculty-those faculty who are not eligible for TRS or ORP and state insurance benefits
Administration-those personnel who are not faculty but typically have eligible consideration for ORP
Part-time administrators-those personnel who are not faculty but would be eligible for ORP consideration if the position was full time
Classified-those clerical employees in support positions who are usually paid on an hourly basis, or technical and professional non-faculty positions which do not have the responsibility of the administrative positions
Part-time classified employees-those support employees not eligible for TRS or ORP due to assignment of hours (i.e., 19 or less) or length of assignment
253
Administrator
Two agreed with the definition.
Other definitions include:
• Positions which report to a dean, vice president or the president and have budgetary control over a specific area
• An employee exempt from FLSA minimum wage and overtime provisions. Non-manual or office work directly related to management policies or general business operations or performance of administrative functions. Administrative positions above grade 7 are eligible for ORP.
• An employee who has management and/or supervisory responsibilities related to specific college programs or services. These employees are paid from the Administrative and Professional salary schedule.
• An administrator is any supervisory personnel in salary range 8 or above. They may or may not meet all ORP eligibility requirements.
Classified
Only one school agreed with the definition.
One reports not even using the term; instead they refer to these employees as exempt and non-exempt support and technical staff.
Other definitions were as follows:
• Classified staff are non-contractual, at-will employees (secretarial, support staff and maintenance) who do not meet the eligibility requirements for participation in ORP. Professional staff are contractual employees but do not meet the eligibility requirements for ORP.
• Clerical and support employees (i.e., maintenance, skill crafts personnel) not exempt from overtime pay
• Employees who are hired and paid from the Classified Staff salary schedule
• A classified employee is an employee who is covered by the Fair Labor Standards Act (FLSA) and the Fair Labor Standards Amendments of 1985 and is paid on an hourly basis. These employees are referred to as non-exempt employees.
254
Table D.30. Non-listed positions
Categoty Definition
C2. Professional Professionals, requiring a degree in their field
of expertise, who are given a one year
contract, but do not meet ORP requirements as
administrative
C4. Teaching Assistants (sub- TA's have annual contracts, no rights to tenure
category of faculty) or promotion, and are paid as a proportion of
the faculty pay scale.
C6. Professional staff An employee that is exempt from the FLSA,
receives an annual contract, and one whose
employment must be approved by the
college's Board of Trustees. Some of these
employees such as administrators are eligible
for ORP, but some are not eligible for ORP.
Examples of positions which are not eligible
for ORP are: Assistant Directors,
Coordinators, Admission Specialists, Financial
Aid Specialists, etc.
C8. Non-exempt support and Personnel who fall under the Fair Labor
technical staff Standards Act and who submit monthly time
sheets.
Part-time employees
255
• In small departments that cannot justify a full-time person and large departments who need extra help during rush periods.
• In clerical and the physical plant (maintenance); still another uses them in business services, student services, and instruction.
Some of the schools added classroom observations and appraisal planning forms to the evaluation process.
Four schools' evaluations differ primarily in frequency after the first few years.
256
Part-time faculty: (Quest. 13-16) (R=6)
All six schools evaluate part-time faculty, all but one using the same process as is used with full-time faculty.
Four schools evaluate part-time faculty annually
One school adds a classroom visit by the department chair for evaluation.
One evaluates every semester.
One evaluates each semester for the first year and then annually.
In only one college does the process change after the first few years.
None of the schools tie part-time salary increases to the evaluations.
One evaluates administration every six months the first year and then once a year.
257
Retirement
They are obtained through a training program with state agencies and those who come to
school and volunteer.
One rewards them with work experience; the other doesn't have a plan of reward.
258
Instruction
259
INSTRUCTIONAL SCHEDULING
(Seven colleges reporting)
Technical
R=20-500
Avg,=207
M=187
260
Faculty required to teach night classes: (Quest. 5) (R=6)
• Three colleges require faculty to teach evening classes
• Three colleges do not require faculty to teach evening classes
• One college requires it only if their day load does not make.
• In three community colleges, students cannot take credit classes for continuing education credit
• Three colleges do allow students to take credit classes for continuing education credit
261
• One school each offered:
• A nine week session
• A five week session
• A 2-5 week session
Nontraditional Scheduling
Definitions:
Mini-semester—a semester of credit offered in times between regular semesters
Weekend classes—credit classes which meet only on weekends (a weekend runs from 5 p.m. Friday until midnight Sunday)
Block classes within a long semester—credit classes offered for short condensed
periods of time rather than the 15 weeks of a semester or a summer
Extension center—an off campus location that offers college credit classes excluding
dual credit classes offered at high schools
Dual credit classes—Classes offered with both high school credit and college class
credit being granted at the same time
Concurrent classes—Classes that students still attending high school may attend for
college credit only.
262
Only one school had found another optimal scheduling time; it uses "flexible entry" and "fast-track" classes.
Mini-semester scheduling
263
Weekend class scheduling
• One reports that load and pay are calculated the same as for other classes.
• One school runs an eight week block simultaneously with the two long terms.
264
Extension center scheduling
(Three colleges report having)
• Two offer continuing education courses requiring at least 12 for the class
to make.
• Three of the schools use part-time faculty in the vicinity of the center.
• One school requires that faculty teach if local community people cannot be
found.
• Two others pay mileage between 24-28 cents per mile for using personal
vehicles.
265
WORKFORCE DEVELOPMENT RESULTS
(Six colleges reporting)
All six colleges receive contact hour funding for continuing education classes.
266
• Six colleges give CEU credit for continuing education classes.
• Two colleges give certificates of completion.
• One college gives academic credit for completion.
267
Activities are staffed by
• Four colleges use part-time faculty
• Three colleges use fiill-time faculty.
• One college uses external consultants.
All five colleges reporting receive contact hour funding for the training.
268
Marketing: (Quest. 11-12) (R=4)
Four colleges advertise by
• Separate catalogs or brochures
• Contracts with business
Three colleges advertise by
• Mailouts
• Newspapers
Two colleges advertise by
• College catalog
• Word of mouth
• Radio
269
Evaluations are used
• As a management tool
• In quality improvement
• As part of the Institutional Effectiveness Plan
• In establishing goals and objectives
• In budget requests
• For corrective action
• For program and course improvements
270
TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS
Business Offices
2) allocate about 11% of the total budget to maintenance or physical plant operations, including equipment, supplies, personnel, and travel
4) prioritize the budget by categories with existing personnel being the most important category
Purchasing
7) use group purchasing for supplies
271
Computer Technology
4) Provide network accounts to students with e-mail but without remote access
6) Do not require students to sign contracts before using institution's technology system
7) Have no set procedures for training employees on hardware and software, and no mandatory hardware or software training
11) Provide service in eight hours or less after the request for assistance
12) Have experienced technology problems in both software updates and software installation.
272
Developmental Education
4) charging students for developmental classes and labs by applying regular tuition rates
and lab fees
11) placing students in remediation based on placement test scores such as TASP.
Tutoring
12) offering peer tutoring by labs, one-on-one instruction and small groups.
13) offering professional tutoring by labs, one-on-one instruction and small groups.
14) funding tutoring services by institutional funds, workstudy funds, and grant funds.
Learning Centers
15) offering learning centers which offer basic skills remediation, GED preparation, TASP remediation, tutoring, and college level computer assisted instruction.
16) using lab supervisors, full-time instructors, and part-time instructors in the learning centers and developmental labs.
273
Distance Education
3) having distance education classes taught by full time faculty as part of their load.
4) registering and assisting distance education students using the same methods as other
classes use such as telephone and on-campus registration.
Budgeting
6) budgeting delivery systems through institutional budgets and grants.
Agreements
8) in distance education agreements with one other institution.
Delivery Systems
9) using cable television, the Internet and video to deliver distance education.
Internet
17) not calculating whether the offering of video classes exceeds the expense of using video.
274
Research/Planning/lnstitutional Effectiveness
3) try to absorb personnel and resources after a grant has been concluded.
Foundation
7) have 85% or more of the foundation management costs paid by the institution.
275
Instructional
1) on the average exceed the state median in per contact hour cost in institutional support and library.
2) on the average are below the state median in per contact hour cost in instructional administration.
Instructional administration
3) allow full-time instructors to receive load credit for lab instruction.
4) grant teaching load reduction or release time for school responsibilities outside of instruction such
5) use prorata formula based on number of students to figure load in small transfer classes.
Instructional personnel
Professional development
9) offer professional development to part-time faculty.
10) offer professional development to technical/vocational faculty to maintain proficiency within their fields, which includes training for new technology which the institutions fund.
Part-time faculty
276
16) decide to replace full-time with part-time faculty based on enrollment decreases plus other factors
20) have campus-wide computer systems for administration, academic, and technical areas which can be accessed by all three areas.
23) consider student workers in labs to assist students and provide tutorial and clerical
assistance.
27) separate lectures and labs with faculty being responsible for the lecture.
Equipment
30) use cost as a consideration in obtaining equipment by rental, lease or purchase.
277
Outsourcing
1) food service
2) publications
3) vending machines
278
Personnel
7) vary the review process in the area of frequency after the first few years.
8) use an annual evaluation process for part-time faculty which duplicates the full-time faculty process.
10) use annual performance evaluations by supervisors for classified personnel after the
first year of work.
11) tie salary increases to the performance evaluations for classified personnel.
18) have a core time from 8 a.m.-5 p.m. Monday through Friday that all non-instructional employees work.
279
Instructional Scheduling
Night classes
3) require full-time faculty to teach evening classes as a part of their regular load.
5) do not allow students to take credit classes for continuing education credit.
6) schedule evening classes at company sites such as medical facilities, banks and Army depots.
Summer Term
7) offer summer classes in six week, eight week and twelve week sessions.
8) generate enough enrollment to be cost effective in both academic and technical classes offered in two short summer sessions.
Nontraditional
9) offer mini-semester classes, weekend sections of classes, extension center classes and
dual/concurrent classes.
10) do not compact classes into single weekend blocks allowing a student to complete an academic class in a shorter time frame than usual.
280
Workforce Development
1) offer both workforce continuing education programs and customized contract training classes.
4) fund continuing education classes by students' tuition, special student fees and contracts with business.
7) do not allow both continuing education and academic credit for the same classes.
8) advertise continuing education credit through local newspapers, the college catalog,
special brochures or catalogs, and mailouts.
Customized Contract Training
9) assess the need for customized training classes by requests from business.
11) fund customized training by contracting with business and students' tuition.
281
APPENDIX E
282
Benchmark Study Follow-up Survey
Study Procedures:
Please share your thoughts about the way the survey was:
Constructed? (Allowing the sponsor school to organize the preliminary survey, but allowing all participants to make changes to the survey instrument to include their areas of concern and need)
Used? (In your institution to compare yourselves with other institutions in the study?)
283
Check in the left column any of the ten benchmarked areas which used the information collected for self-analysis of processes and procedures. Then in the other two columns please indicate which processes or procedures were compared or analyzed and, if different from the above chart, please indicate any improvements being made or already made.
Benchmarked Area | Processes or Procedures Analyzed | Improvements
Business Affairs
Computer Technology
Developmental Instruction
Distance Education
Institutional Effectiveness
Instructional
Outsourcing
Personnel
Scheduling Instruction
Workforce Development
Check any of the ten areas benchmarked in which the area's participants have established contacts with others in their area from the survey for networking purposes.
Business
Computer Technology
Distance Education
Developmental Education
Institutional Effectiveness
Instructional
Outsourcing
Personnel
Scheduling Instruction
Workforce Development
284
Data Presentation:
Comment on how the data was used after the final report was sent by sharing:
Where was the data located?
Did your school do a gap analysis of the data to compare your institution's performance?
yes no other
If so, check which areas:
Business
Computer Technology
Distance Education
Developmental Education
Institutional Effectiveness
Instructional
Outsourcing
Personnel
Scheduling Instruction
Workforce Development
Results:
Check any areas in which you found the material useful for making future decisions for your institution.
Business
Computer Technology
Distance Education
Developmental Education
Institutional Effectiveness
Instructional
Outsourcing
Personnel
Scheduling Instruction
Workforce Development
What improvements have been started as a result of analyzing the data from the
benchmark study?
Share two things that you learned about your institution when it was compared to others.
Are any steps being taken to improve your weak areas of comparison from the study?
yes no
285
What do you think should have been done with the data from the study that was not done
at your institution?
Comment on how any of the data was used after the final report was sent that hasn't been covered by these questions:
286
APPENDIX F
287
TIME LINE FOR RESOURCE MANAGEMENT STUDY
288
APPENDIX G
289
BUSINESS OFFICE GAP ANALYSIS
290
Personnel (excluding fringe benefits)
Dollar amount: Range: $11,159,114-$26,632,590; Average: $16,965,000; Median: $14,227,829; LCC: $19,601,637
Percentage: Range: 57.81%-75.58%; Average: 65.2%; Median: 65.26%; LCC: 65.26%
291
Dollar amount spent on travel
Range: $72,920-$400,000
Average: $254,186
LCC: $349,204
292
COMPUTER TECHNOLOGY GAP ANALYSIS
Policy Questions
Budget
293
Percentage that is computer equipment: (Quest. 25)
R=2%-30%
Avg.=14%
M=12%
LCC=5%
Service
Printer problems:
R=1/2 hr.-days (One college reports hours to days)
Avg.=5 hrs.
LCC=6.3
Software problems:
R=1/2 hr.-days (One college reports hours to days)
Avg.=3.6 hrs.
LCC=6.3
Connections to LAN:
R=1/2 hr.-16 hrs. (One college reports hours to days)
Avg.=5.8 hrs.
LCC=No reply
Connections to WAN:
R=1/2 hr.-days
LCC=No reply
294
DEVELOPMENTAL LABS, INSTRUCTION AND TUTORING GAP ANALYSIS
Developmental Instruction
295
FTE support staff in the noncourse offerings: (Quest. 7)
R=0-2.25
Avg.=l
LCC=2.25
296
Developmental lab size limit: (Quest. 20)
Writing
R=20-24
Avg.=21.5
M=20
LCC=20
Reading
R= 15-30
Avg=22.8
M=24
LCC=25
Math
R=24-30
Avg.=28.5
LCC=30
297
DISTANCE EDUCATION GAP ANALYSIS
298
RESEARCH/PLANNING/INSTITUTIONAL EFFECTIVENESS GAP ANALYSIS
Institutional Effectiveness (R=0-2, Avg.=.6, LCC=0)
Job titles:
• Dean of Program Development and Institutional Research
• Secretary
• Director of Institutional Research, Oversight
Resource Development
Grants
299
Income per contact hour from federal sources: (Quest. 7)
R=0-$1.30
Avg.=$.92
M=$1.12
LCC=$1.12
Foundation
Management fees paid by foundation funds:
R=5%-33%
Avg.=17%
M=15%
LCC=15%
300
INSTRUCTIONAL SURVEY GAP ANALYSIS
Cost Study
301
Instructional Administration
Two colleges offer mixed classes (i.e., semester hour credit students and adult vocational students in the same class) calculating load
• On basis of the load for the credit class
• By counting adult vocational students toward the total required to make a class, since they pay the same costs.
LCC does not have mixed classes.
302
Average lecture class: (Quest. 21)
R= 19.52-35
Avg.= 28.4
M=30
LCC=19.52
Instructional Personnel
Percentage of continuous part-time employees who are part-time credit faculty: (Quest. 25)
R=67%-95%
Avg.=85.75%
M=90.5%
LCC=90%
303
Rank system: (Quest. 29)
Two colleges have rank systems
LCC has a rank system
304
Counselor credentials:
Two colleges require a Master's Degree in guidance and counseling
• One college requires L.P.C.
LCC requires a Master's Degree in guidance and counseling.
Professional Development
305
Part-time faculty professional development: (Quest. 47)
Four colleges offer professional development to part-time faculty.
One college offers these
Same as other groups-numerous workshops, seminars, conferences and travel
• At the beginning of each semester
LCC offers professsional development.
• In four colleges the college pays for technical faculty to receive training in new
technology
LCC does not papy fr technical faculaty to receive fraining in new technology,
• In two colleges the college pays for technical faculty to receive additional
certifications.
LCC does not pay for technical faculty to receive additional certifications.
Part-time faculty (Quest. 55)
Percentages of credit contact hours generated by part-time faculty for fall 1995:
R=10%-50%
Avg.=27.6%
M=24%
LCC=29.4%
Percentage of noncredit contact hours generated by part-time faculty for fall 1995:
R=95%-100%
Avg.=98.3%
LCC=95%
Lab assistants considered: (Quest. 70-71)
Four colleges consider lab assistants, excluding students, as classified personnel.
• Three colleges' lab assistants, excluding students, are evaluated by their
supervisors.
• One college evaluates the same as faculty.
LCC considers lab assistants as classified who are evaluated by their supervisors.
Equipment
Institutional budget percentage for equipment: (Quest. 93)
R=2.65%-18%
Avg.=7.81%
M=5%
LCC=2.65%
OUTSOURCING GAP ANALYSIS
Outsourced Services:
LCC outsources:
Some advertising
Food services
Some publications
Travel
Vending machines
The six reporting institutions record 17 different areas which are outsourced.
One college outsources:
Child care-Agreement offers services to students through a Perkins Voc/Tech
Grant
Mailing-Agreement is with local vendor for large bulk mailing projects only
Maintenance of personal computers-Agreement is for annual maintenance service
Printing-Agreement is for color printing, brochures and catalog
-Consortium pricing/lowest quotes
INSTRUCTIONAL SCHEDULING GAP ANALYSIS
Continuing education, degrees and sites (Quest. 6-8)
• At four of the institutions a degree may be obtained by attending only
evening classes
• Two do not allow a degree to be obtained at night
At LCC a degree may be obtained by attending only evening classes.
• At three community colleges students cannot take credit classes for
continuing education credit
• Three colleges do allow students to take credit classes for continuing
education credit
LCC allows students to take credit classes for continuing education credit.
Contact hours generated (Quest. 13) (R=6)
• The most contact hours are generated by the two short summer sessions,
both academically and technically.
• Only two colleges generated any contact hours by offering:
• One long session per summer
• One long night session per summer
• Two short night sessions per summer
One college reported that it had conducted a cost-effectiveness report and found
that the cost of the course was primarily determined by "whether the course is
taught by full-time or part-time faculty."
Nontraditional Scheduling
(R=6)
Mini-semester scheduling
(R=4)
Extension center scheduling
(Three colleges report having)
APPENDIX H
COVER LETTER
March 17, 1997
Dear
Enclosed you will find the results in your area of the final
report from the benchmark study. Notice that the final report
includes both quantitative and qualitative data as well as a trends
list. The other report for your area is a gap analysis showing where
XXXXXXX falls in comparison to the peer institutions. The gap
analysis should be used to evaluate areas where XXXXXXX is
performing better than our peer institutions and where we may be
falling behind and need to make changes.
If you have any questions, you may access the entire report
by viewing or checking out a copy at the library on any of the
four campuses or in the president's office, or by contacting
XXXXXXXXXX, the benchmark director.
Sincerely yours,
XXXXXXXXX
Benchmark Project Director