BENCHMARKING: A CASE STUDY OF THE PROCESS

IN MID-SIZED COMMUNITY COLLEGES IN TEXAS

by

E. JAN PORTER McCATHERN, B.S., M.A.

A DISSERTATION

IN

HIGHER EDUCATION

Submitted to the Graduate Faculty


of Texas Tech University in
Partial Fulfillment of
the Requirements for
the Degree of

DOCTOR OF EDUCATION

Approved

Accepted

December, 1999
Copyright 1999, Jan P. McCathern
ACKNOWLEDGMENTS

I have only finished this project with the support and encouragement of many
people. I am indebted to Dr. John Murray for joining me and guiding me on the last
part of this journey. His editing, suggestions, and support are appreciated. I also wish to
thank my other two committee members, Dr. Robert Ewalt and Dr. Brent Cejda, for their
input and suggestions.
Most of all I must thank my husband, Glenn, who has always supported and
encouraged me in my educational endeavors and in life. Few women are blessed to
have a soul mate who seeks their good and growth as a person as much as my husband
does. I thank God for him. I am also grateful to my two sons, Levi and Kurtis, who
have stood by mom encouraging and believing that I could achieve this goal.
A few others have been very instrumental in supporting me during this process.
Dr. Bud Joyner always believed that it was a good study, and his encouragement is
greatly appreciated. My prayer partners, Vicki Hooker, Betsy Armstrong, Tonya
Canada, Ellen Rumpff, and Betty Thompson have offered words, notes, and passages of
encouragement. For these and their friendships, I am eternally grateful.
I finished this task only with the support and encouragement of my Lord and
Savior, Jesus Christ, to whom belongs all honor and glory forever.

TABLE OF CONTENTS

ACKNOWLEDGMENTS ii

ABSTRACT vi

LIST OF TABLES viii

CHAPTER
I. INTRODUCTION 1
Statement of Problem 1
Decreasing Resources 2
Increasing Accountability Demands 2
Available Information 4
Changes 5
Purpose of Study 9
Research Question 10
Justification for the Study 11
Definitions 12
Delimitations and Limitations of Study 14
Significance of Study 15
Organization of the Document 16

II. REVIEW OF LITERATURE 18
Theoretical Background 18
Philosophy of TQM 20
TQM Application 21
TQM in Higher Education 25
TQM Continuous Improvement 28
TQM Benchmarking 28
Business Benchmarking Background 29
Benchmarking in the Baldridge Award 30

Definitions of Benchmarking 32
Types of Benchmarking 34
Higher Education Benchmarking 35
Examples of Benchmark Studies 36
Summary 39

III. METHODOLOGY 40
Background for Study 40
Sample Selection 40
Instrumentation 43
Support Secured 43
Survey Development 43
Content Areas 45
Pilot Study 45
Data Collection 45
Data Analysis 46
Follow-Up Survey 46
Summary 47

IV. FINDINGS 48
Sub-Question 1 48
Essential Elements 48
Study Guidelines 49
Designing the Instrument 53
Sub-Question 2 54
Functional Areas 55
Internal Baseline 57
Sub-Question 3 58
Sub-Question 4 59
Sub-Question 5 62

Sub-Question 6 65
Summary 65

V. CONCLUSIONS 67
Purpose of Study 67
Conclusions 68
Sub-Question 1 68
Sub-Question 2 71
Sub-Question 3 72
Sub-Question 4 72
Sub-Question 5 74
Sub-Question 6 74
Recommendations 76
Future Investigations 77
Summary 78

BIBLIOGRAPHY 79

APPENDICES
A - BENCHMARKING SURVEY INSTRUMENT 89
B - TASK FORCE BENCHMARKING HANDOUTS 159
C - TASK FORCE POSSIBLE BENCHMARK AREAS 171
D - FINAL BENCHMARKING REPORT 176
E - BENCHMARK FOLLOW-UP SURVEY 282
F - BENCHMARK STUDY ORIGINAL TIMELINE 287
G - SAMPLES OF LEAD COLLEGE GAP ANALYSIS 289
H - COVER LETTER 317
ABSTRACT

The problem facing higher education today is threefold: decreasing resources,
increasing demands for accountability, and an increasing volume of available information. These
pressures are forcing changes in higher education, and colleges and universities are
seeking efficient ways to meet these challenges. As this desire for quality has become a
part of the culture of higher education, many institutions have accepted and adapted the
Total Quality Management (TQM) theory to their campuses. Part of TQM is continuous
improvement; thus, an institution must have an understanding of its current practices and
of ways that current practice can be ameliorated. Both of these needs are met by the
TQM technique of benchmarking.
Benchmarking is defined as the ongoing analytical process of comparison used as
a tool for continuous improvement in an effort to manage change. This peer
benchmarking study was conducted in six mid-sized community colleges in Texas to
answer the question, what are the conditions necessary for benchmarking to be used by
mid-sized community colleges in Texas to improve community college operations?
Using a survey instrument designed by the six participant colleges, ten key areas of
community college operations were evaluated. The lead college employed the researcher
to construct the survey instrument, conduct the internal pilot study, gather and analyze the
data, establish current trends, and produce a gap analysis. Two years after the study was
conducted, a follow-up study was conducted to determine how the survey data had been
used in making changes and what conditions existed at each community college that
enabled changes based on the survey information. Thus, this study applied the TQM technique of
benchmarking to the community college setting to determine what conditions were
needed to use benchmarking data in beginning quality improvements.
The study found that the conditions necessary for benchmarking to be used for
community college improvements were (1) seven planning steps including recognizing
the need for a study; providing the essential elements for the study; selecting a leader;
selecting, training and utilizing a task force; selecting the peer institutions; establishing
study guidelines; and overseeing the design and pilot study of the survey instrument, (2) a

survey instrument designed by a researcher and participants, (3) a year's commitment to
the study, and (4) collection, analysis and access to the data by decision makers in each
area surveyed. Five basic types of improvements were made based on the benchmark
data: further research in weak institutional areas, improvements in technology equipment
and usage, making existing processes more efficient, upgrading weak areas, and
developing new services.
The study concluded that support and involvement of the president and senior
leadership are necessary for improvements to be made from a benchmark study; that
training of decision makers and a task force on each campus is necessary in a
benchmark study; and that conducting a gap analysis of the institution after receiving the
data and making the data available to key personnel are necessary for making
improvements from benchmark data.

LIST OF TABLES

3.1 Statistics on community colleges considered to be mid-sized 37


3.2 Revenue statistics on community colleges defined as mid-sized 38
4.1 Gap analysis examples from developmental area of survey instrument 56
D.1 Administrative computing 186
D.2 Maintenance and service 190
D.3 Microcomputers primarily for students 190
D.4 Microcomputers for faculty 190
D.5 Microcomputers for staff 190
D.6 Computer personnel breakdown 191
D.7 Contact hours, employees, students 199
D.8 Percentage of developmental classes taught 200
D.9 Types of tutoring offered 201
D.10 Titles and responsibilities of distance education support personnel 206
D.11 Per contact hour cost 223
D.12 Contact hour cost study 223
D.13 Instructional administration load and compensation 224
D.14 Instructional administration supervision 224
D.15 Rank systems for instructional personnel 230
D.16 Librarians and counselors 231
D.17 Examples of activities for each employee group 235
D.18 Use of part-time faculty—Institution 1 236
D.19 Use of part-time faculty—Institution 2 237
D.20 Use of part-time faculty—Institution 3 237
D.21 Discipline-specific computer labs—Institution 1 239
D.22 Discipline-specific computer labs—Institution 2 240
D.23 Discipline-specific computer labs—Institution 3 241
D.24 Discipline-specific computer labs—Institution 4 242
D.25 Discipline-specific computer labs—Institution 5 242

D.26 Noncomputerized labs—Institution 1 245
D.27 Noncomputerized labs—Institution 2 247
D.28 Noncomputerized labs—Institution 5 248
D.29 Noncomputerized labs—Institution 6 248
D.30 Non-listed positions 255
F.1 Time line for resource management study 288
G.1 Contact hours, employees, students 295
G.2 Percentage of developmental classes taught 296
G.3 Per contact hour cost 301
G.4 Contact hour study 301
G.5 Instructional administrators' compensation and load 302
G.6 Librarians and counselors 304

CHAPTER I
INTRODUCTION

As a result of higher education institutions continually striving for operational


excellence, many management concepts and practices developed in the business world
have been adopted by colleges and universities. Examples of these concepts and
practices include establishing multiple levels of supervision with a formal chain of
command; generating forms, reports, and data; managing by objective; implementing
multiple checks and approvals; using complicated formal evaluations; instituting
competitive bidding; engaging in strategic planning; and establishing complex planning systems to set
direction and allocate resources. Yet these practices have been largely unsuccessful in
achieving their designed objectives in the corporate world and are believed to be among
the reasons why many U.S. businesses are not competing well in today's global economy
(Sherr & Lozier, 1991); higher education has struggled to achieve excellence in
using them as well.
Several authors (Bell, 1973; Naisbitt, 1982; Toffler, 1980), writing about
societal changes, claim that society is moving from an industrial age to an information
age. This change results from a technology explosion, greater consumer awareness,
increased liability imposed by the courts, taxpayer revolts, and increased knowledge of
competition, all of which combine to mandate fundamental changes in all aspects of society,
including higher education (Peterson, 1993). Because of these shifts in society, Apps (1988)
and Boyer (1990) believe that higher education is experiencing another major transitional
change and must find new methods of adaptation.

Statement of Problem
Three major problems are facing higher education today: decreasing resources,
increasing demands for accountability, and greater amounts of available information.
Decreasing Resources
Increasing operating costs coupled with decreasing and volatile revenue sources
have created both internal and external pressures for higher education. Reductions in
state and federal appropriations and slower-growing tuition rates limit the funds available
to colleges; thus, costs must be controlled and reliable sources of additional revenue
found merely to keep from losing ground (DeCosmo, Parker, & Heverly, 1991). Since
resources are not as plentiful as they once were, cost and efficiency are major issues for
colleges. While building such resource efficiency, colleges and universities must deal
with environmental changes such as changing demographics, increasing demands for
accountability from government, governing boards and the media, and public
disillusionment with the value of higher education.

Increasing Accountability Demands


Peterson (1993) notes that this call for accountability seems to have begun with
the mid-1980s reports of the American Council on Education, the National Commission
on Excellence, the National Institute of Education Study Group, and the Association of
American Colleges, which cited examples of college and university graduates whose
communication and critical thinking skills were below expectations. Ruppert (1994)
reports that ten states—Colorado, Florida, Illinois, Kentucky, New York, South Carolina,
Tennessee, Texas, Virginia, and Wisconsin—have developed new accountability policies
originating either in their state legislatures or in the state higher education agency.
Gaither, Nedwek, and Neal (1994) identified three major economic and societal
changes of the last three decades that are primarily responsible for this public demand
for quantifiable evidence that higher education is worth its cost and time. First, a
higher education is considerably more important to securing and keeping a good-paying
job than it was 30 years ago. Second, consumers have become more sophisticated; they
are no longer willing to accept that faculty members' professional expertise should allow
them to make decisions that affect students' lives without some evidence of the validity
and reliability of those decisions. Third, the cost of higher education is greater than
ever. For these reasons, colleges and universities are being held more accountable for
the success of their graduates.
Gaither, Nedwek, and Neal (1994) further state that three types of performance
indicators are being used to judge the quality of any higher education process. The first
indicator is measured results, including the percentage of students graduating, the
number of degrees, the percentage of students going on to graduate school, or the number of
publications by faculty; and measured value, such as how many students went to the best
graduate schools, their average starting salary compared with that at comparable schools, or the
number of faculty publications cited in scholarly publications. The second indicator is
the quality of inputs, such as the SAT and ACT scores of entering students, their rank in class, and
the rank of faculty members' graduate degrees. The last indicator is the measurement of
critical processes, such as the realization that 50% of starting students never achieve their
educational goals. They conclude, "Higher education institutions must start developing
better ways to judge more adequately how well they are doing" (Gaither, Nedwek, &
Neal, 1994, pp. ix-x). They therefore believe that higher education is being held up to
scrutiny through such indicators and that this trend toward accountability is irreversible.
Performance indicators have emerged partly in response to pressures for colleges and
universities to demonstrate desired results for the resources provided. This shift to a new
notion of quality, which incorporates assessment, is helping to meet the public's and
policy makers' demands for more accountability.
Higher education ostensibly must prepare for a three-tier system of accountability
that includes national baseline standards, state performance standards, and accreditation
standards for recognition of overall excellence. Reports of findings are likely to become
public, communicating institutional strengths and weaknesses to a broad segment of the
public, and thereby, acting as demands for change (Gaither, Nedwek, & Neal, 1994;
DeCosmo, Parker, & Heverly, 1991). This assessment movement is a shift from a
formula funding model toward funding based on outcomes, results, and performance.
Such a focus on performance, "using funding incentives as motivators, helps encourage
policy makers and academic communities to explore the use of a system of indicators to
improve the efficiency and effectiveness of higher education" (Gaither, Nedwek, & Neal,
1994, p. iii). For example, as early as 1987 the state of Texas moved toward
consideration of incentive funding, and later performance funding, as part of a "good
practices" effort (Ashworth, 1994; Gaither, 1993). As a result, the Texas Coordinating
Board for Higher Education developed, and the legislature adopted as one of its
performance indicators, a semester-by-semester report to the Legislative Budget Board
on the percentage of time full-time faculty spend with lower-division courses (Gaither,
Nedwek, & Neal, 1994).
Other examples of this trend of accountability have been noted in a study by the
Education Commission of the States revealing that 40 states were actively promoting
assessment in 1992 (Ewell, Finney, & Lenth, 1992). By 1986, all 50 states and the
District of Columbia had developed initiatives to improve undergraduate education
according to Boyer and McGuinness, Jr. (February 1986). An example of a national effort
toward accountability is the expansion, although largely symbolic, of the Malcolm
Baldridge National Quality Award, handed out annually by the U.S. Department of
Commerce, to include schools and colleges by 1996 (Fuchsberg, 1993). Such attempts at
reforms externally imposed and politically based have a number of common themes:
"input versus output, excellence versus quality, cost versus productivity"; all of them
indicate a concern for the quality of undergraduate instruction (Gaither, Nedwek, & Neal,
1994, pp. 17-18). These pressures of decreasing resources and increasing demands for
accountability are occurring at the same time that greater volumes of information are
readily available for processing by our knowledge-based society, that organized learning
as a lifelong process is being sought, that new theories about how humans learn are being
developed, and that new modes of technological operation and governance are
emerging.

Available Information
This knowledge and technology explosion, or as Wolverton (1994) labels it, this
"exponentially exploding knowledge base" (p. vi), is the third problem facing higher
education today. In discussing this knowledge explosion, Parston (1986) observed that
more than ever, institutions now need to incorporate both lifelong learning practices and a
mass system of post-school education while continuing to produce value on limited
resources. In explaining why certain traditions were unsustainable in higher education
today, Susan Weil (1994) stated, "The nature and rhythms of knowledge creation and
dissemination have changed dramatically, supported by the revolution in technology..."
(pp. 22-23). This higher volume of available information, Alstete (1995) predicts, is
changing "the methods of how institutions of higher education operate in the mid-1990s"
(p. iii). Similarly, Peter Drucker (September-October 1992) predicted that in the "next 50
years, schools and universities will change more and more drastically than they have
since they assumed their present form more than 300 years ago when they reorganized
themselves around the printed book" (p. 97). He goes on to say that these changes will
be forced in part by "new technology, such as computers, videos, and telecasts via
satellite" and in part by "the demands of a knowledge-based society in which organized
learning must become a lifelong" (p. 97).
In a survey of over 350 college and university senior administrators reported in
the Chronicle of Higher Education Almanac for September 1989, 42% ranked facilities
and technology and 39% ranked adequate resources as two of the top three challenges
facing higher education (p. 24). Carr, Hard, and Trahant (1996), listing core changes in
organizations, placed information technology as a key change for the 1990s. One of a
number of conditions affecting colleges and universities today, Alstete (1995) explains,
is that "not only is the knowledge base of many areas in higher education changing rapidly,
awareness concerning these changes is so widespread that the stakeholders of higher
education are increasingly dissatisfied with the status quo" (p. xi). Thus, these three
problems are demanding changes in higher education.

Changes
Some of these changes can be controlled by effective resource management.
Since the 1860s, institutions of higher education have continually changed in response to
intellectual, philosophical, and functional changes in society, observes Veysey (1965);
such adaptation must continue to take place as we face a new millennium. The stresses
discussed above face not only four-year colleges and universities but community colleges as
well. Moreover, added to these pressures, community colleges face the struggle to
once again define their role in higher education. Business and industry, communities,
government, and four-year institutions all desire to shape two-year schools for their own
benefit, but community colleges must choose for themselves how to relate to each of these
constituencies. In doing so, community colleges must define their role in
continuing/lifelong learning, workforce development, vocational certification, distance
learning, and four-year transfer.
Sir Winston Churchill once said, "There is nothing wrong with change if it is in
the right direction. To improve is to change, so to be perfect is to have changed often"
(quoted in Jablonski, 1992, p. 25). Alstete (1995) agrees that to stay viable, higher
education must integrate a new way of thinking that creates more efficient operations
and a desire for continual learning by students. Rush (1994) summarized the problem
by stating that the challenge higher education faces is "reorienting their thinking around
customers, processes, and a different set of measurements. Institutions not only need to
rethink what they do, but how they do it and how they measure themselves" (p. 88).

Total Quality Management


One change theory and method being adapted from the world of business is Total
Quality Management (TQM). Jablonski (1992) explains that TQM "provides an avenue for
coping with change and directing it toward a positive outcome for the future" (p. 23). To
constantly strive for change and to create an institution's willingness to
participate in change is to practice TQM, with its team and participatory approaches. "An
organization positioned for change will succeed and in doing so, redefine the standards
for its competitors. In contrast, those unwilling or unprepared for change will be left
behind—victims of the change process" (Jablonski, 1992, p. 25). One paradigm in TQM
is the concept of continuous improvement.

Benchmarking
One of the latest TQM techniques for continuous improvement being used by
business and industry is benchmarking. Benchmarking is being used and is effective in
higher education for several reasons. First, it focuses on outputs and quality. As Alstete
(1995) explains, "Traditionally, institutions have been primarily concerned
with inputs and costs, where improved quality can only be achieved from greater
expenditures. Benchmarking is different, because it focuses on the outputs and quality of
services, not the inputs" (p. 41). Second, benchmarking is easy for all levels of
employees in an organization to understand and implement for all kinds of processes
(Spendolini, 1992). Benchmarking is currently being used successfully in colleges and
universities (Alstete, 1995) because it is part of a learning process within an organization,
and the key to productivity improvement lies in performing better through
continuous learning. "Since institutions of higher education profess learning and value
hard data, using benchmarking to improve our processes is a natural extension of what we
provide to students in the classroom" (p. 38).
Third, many companies, such as Xerox, Motorola, and IBM, have been
using it for years, thus establishing a pattern that colleges and universities can adapt
(Spendolini, 1992). Fourth, benchmarking uses reliable research techniques, such as
surveys and interviews, which can "provide external and objective measurements for goal-
setting, and for improvement tracking over time" (Alstete, 1995, p. 27). Fifth, it is one
method that aids institutions in staying competitive. Because today's students are more
demanding and tend to "shop" for the college they will attend, institutions of higher
education are using benchmarking to improve by "comparing performance (both
administratively and academically) with comparable or peer institutions, 'best-in-class,'
and even world-class organizations outside of higher education" (Alstete, p. 4). Sixth,
Tucker (1996) found that benchmarking can be used to address specific strategic
goals rapidly and to prevent "wheel reinvention" by an institution, while learning how other
organizations have addressed the issues and problems they face. Seventh, Detrick
and Pica (1995) reported that one real value of a benchmarking study is institutional
introspection, because it forces participants to go inside their own institution, collect
information, and raise questions.
Higher education institutions have begun to use benchmarking to compare
themselves to peer or competitive institutions and have found that several benefits
accrue. A study conducted for the American Assembly of Collegiate Schools of
Business (AACSB) and the Graduate Management Admission Council (GMAC) by Pica
and Detrick, working with schools in the Big Ten Conference in 1993, found that
participating institutions reported the benchmark study to be a positive experience; its
results have been used in self-analysis, in preparing budget requests, and in obtaining
additional resources (Alstete, 1995, pp. 43-44).
Responding to inquiries from the CQI-L web site on what benchmarking studies
have accomplished in colleges and universities, Janice Dossey-Terrell at the University of
Central Florida states: "Thus far, our benchmarking efforts have been very helpful, and
have prevented us from making some decisions we would have regretted in the future"
(Dossey-Terrell, 1995, p. 4). A Pennsylvania State University respondent states that the
university has been using benchmarking initiatives in a wide variety of areas: "We continue to
profit from our corporate partners who keep us on track and emphasize the importance of
benchmarking processes for improvement, and not just the collection of data to prove
how good we are" (Sandmeyer, 1995, p. 4).
Another respondent, Gerry Shaw from Babson College, which used
benchmarking to discover best practices, replies: "We have learned where and what to
avoid as we move along, how others have dealt with resistance along the way, how
technology can be used to better enable what we are trying to do, and how to achieve a
stronger customer focus" (1995, pp. 4-5). A University of Maryland respondent reported
that benchmarking successfully reduced the processing time for surplus property requests
(Schnell, 1995, p. 6).
Further, Ray Carlson from Dalhousie University believes that benchmarking is a
much-needed comparative analysis technique:
From my perspective, benchmarking is probably the key to CQI being
useful on campus. At the same time, benchmarking relies on some form
of outcome measurement—as one can measure outcomes in a valid and
reliable way, it becomes possible to identify processes that seem more
effective, and then try to isolate factors that might be responsible, and test
whether introduction of these factors leads to better results. (Alstete, 1995,
p. 56)
Benchmarking in higher education is also being conducted at several European
institutions, Alstete (1995) reports.
In addition to gathering data for process improvement, benchmarking is useful to
college and university leaders
for strategic planning and forecasting, because it develops knowledge of
the competition, views state of the art practices, examines trends in
product/service development, and observes patterns of customer behavior.
Benchmarking is a source for new ideas, process comparisons, and goal-
setting. It enables the... practitioner to see the organizational functions
from an external point-of-view, and not be limited to the traditional method of
developing ideas and objectives internally. (Alstete, 1995, p. 11)
Higher education uses benchmarking by identifying competitors' performance
standards and then comparing one's own performance with those standards. Such information
can produce greater efficiency as both college boards and administrators become aware of
ways of doing things they had not thought possible. Alstete (1995) explains that
benchmarking has emerged as a useful, easily understood, and effective tool for staying
competitive and developing adept practices. Understanding how another institution
operates can provide both new ideas and new methods to bring about improved
performance. The prestigious American prize for quality, the Malcolm Baldridge award,
has incorporated benchmarking into its application process. Such inclusion confirms that
benchmarking has become a recognized tool of the TQM continuous improvement
process in institutions of higher education.

Purpose of Study
The purpose of this study is to evaluate the feasibility of using the TQM technique
of benchmarking in mid-sized community colleges in Texas. Mid-sized colleges were
selected for this peer benchmarking case study because they experience the same
pressures as all community colleges, but because of size, they have fewer resources than
larger institutions and larger student populations to serve than smaller institutions.
Selecting such homogeneous institutions provided the necessary environment to examine
the methods and design of the benchmarking process. Based on published research and
input received during the study from the participating colleges, these selected institutions
were surveyed in ten educational areas: business operations, computer technology,
developmental instruction, distance education, institutional effectiveness, instructional,
outsourcing, personnel, scheduling instruction, and workforce development. After
information was compiled from the six institutions selected, the raw data was distributed
to the institutions involved in the study. Then, a gap analysis of the data was calculated
for the lead college, and benchmarks were also established as either trends or the best
practices in each area of study. Two years later, the institutions were contacted again and
asked to complete a questionnaire to determine how the benchmark study had been
utilized by each institution. These surveys were then evaluated to determine whether
improvements had been made as a result of using benchmarking data. If improvements had
been made because of the benchmarking study, this would indicate that the process could
be used by community colleges to make improvements.
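The gap analysis described above can be sketched in code. The following is only an illustrative sketch, not the study's actual procedure: the metric names and figures are hypothetical (the study's real data appear in the appendices), and the peer benchmark is assumed here to be the peer-group average, though a benchmark could equally be a best-in-class value or a trend.

```python
# Minimal sketch of a peer-benchmark gap analysis.
# Assumptions: metrics are numeric, and the benchmark for each metric
# is the average across peer institutions. All names and values below
# are hypothetical, not taken from the study.

def gap_analysis(own, peers):
    """For each metric, compare the lead college's value against the
    peer benchmark and report the gap (own value minus benchmark)."""
    report = {}
    for metric, own_value in own.items():
        values = [p[metric] for p in peers if metric in p]
        benchmark = sum(values) / len(values)
        report[metric] = {
            "own": own_value,
            "benchmark": round(benchmark, 2),
            "gap": round(own_value - benchmark, 2),
        }
    return report

# Hypothetical survey data for the lead college and two peers.
own_college = {"per_contact_hour_cost": 6.10, "pct_developmental": 14.0}
peer_colleges = [
    {"per_contact_hour_cost": 5.50, "pct_developmental": 12.0},
    {"per_contact_hour_cost": 5.90, "pct_developmental": 16.0},
]

result = gap_analysis(own_college, peer_colleges)
```

Running `gap_analysis` yields, for each surveyed metric, the lead college's value, the peer benchmark, and the gap between them: the raw material for deciding which areas warrant further research or upgrading.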

Research Question
To guide the evaluation of the benchmarking process in mid-sized community
colleges, a main research question and a series of sub-questions were developed. The
main question for the study, what are the conditions necessary for benchmarking to be
used by mid-sized community colleges in Texas to improve community college
operations? was developed to evaluate the process of benchmarking.
The six sub-questions to be answered were as follows.
1. What planning steps should be taken by the lead college in beginning a benchmark
study for an educational institution?
2. Can a benchmark study be designed to ensure that each of the participating institutions
feels included and derives information useful in making system improvements?
3. What time period is reasonable for conducting a benchmark study in community
colleges?
4. What procedures can be used to collect, analyze and distribute benchmark data so that
it can be utilized by individual institutions to improve processes in community
colleges?
5. Can the benchmark information be used at individual institutions to improve processes
in community colleges?
6. What are the characteristics of a college that utilizes benchmark data for process
improvement?
It should be noted that, based on the literature on the use of benchmarking in
business, answering the six sub-questions regarding planning, designing, conducting,
collecting and analyzing, and distributing and utilizing benchmark data will permit an
adequate evaluation of the benchmarking process's effectiveness in producing operational
improvements in community colleges.

Justification for the Study


The last 30 years have seen a remarkable awakening of awareness of the need for
organizational change in business and industry, and the adoption of TQM has led to thinking
in terms of quality, processes, and continuous improvement. Within the framework of
TQM is a management technique for achieving continuous improvement, labeled
benchmarking. Dertouzos, Lester, Solow, and the MIT Commission on Industrial
Productivity (1989) found that benchmarking and TQM are vehicles for "accomplishing
change, sometimes radical change, particularly in administrative functions" (p. 7). Camp
(1995) stated that the most "efficient way to promulgate effective change is by learning
from the positive experience of others. One learns because another learned first and was
willing and able to share that knowledge" (p. xiv). Thus, the TQM technique of
benchmarking has become an efficient way to ensure the success of business change
initiatives that improve operations. Nicklin (1995, January 27) reports on an
industry benchmarking study conducted by Exxon asking these questions:
In what direction is the industry going? Is there anything that our
competitors are doing that we could leam from? What types of
applications have they developed for managing various parts of their
business? How sophisticated are their applications? (Leibfried and
McNair, 1992, p. 187)

Answering these same questions would benefit higher education institutions.

Much has been written about the need for change and improvement in higher
education, and new paradigms of operation are being proposed to address change. Alstete
(1995) explains:
...quality improvement techniques developed in the business world such as TQM,
reengineering, the Baldrige Award, and benchmarking are being used as tools in
these new paradigms or ways of thinking for colleges and universities.
Benchmarking has become part of the lexicon of continuous quality improvement
in the 1990s. (p. 16)

One change needed in colleges and universities is to place less emphasis on the input of
resources to academic departments and more emphasis on the work processes and outputs
of those departments. Thus, institutions should "determine what the expertise, skills, and
materials they invest produce, how they produce it, what it costs to produce it, and how
well it is produced" (Rush, 1994, p. 88). One way to shift efficiently to this output
emphasis is to benchmark. Another change needed in higher education is in how decisions
are made and innovations selected; these are usually done without the active
involvement of those who will implement the changes and without data to support the
investments the innovations require (Tucker, 1996, p. x). Benchmarking directly addresses
these needs as well.
Thus, because benchmarking is a new method that has proven to be successful in
many areas, six presidents of mid-sized community colleges in Texas participated in
conducting a benchmark study between their institutions.

Definitions
TQM or Total Quality Management is defined as the theory of customer-centered,
quality-driven, participatory management developed by W. Edwards Deming during his
work in Japan. Deming bases his theory of management on fourteen points that, if
followed, are designed to improve quality, productivity, and competitive position by
continuous improvement in customer satisfaction through an integrated system of tools,
techniques, and training.

CQA or Continuous Quality Assurance is defined as the aim of an organization, in
this study a mid-sized community college, to adapt TQM to its culture, including
improving quality through continuous change.
Continuous Improvement is defined as a "constant, never-ending striving to do
better" (Hammons, 1994, p. 337).
Mid-sized community college is defined as a community college with an annual
student headcount between 5,000 and 10,000, contact hours between 2 million and 6 million,
appropriations between $9.5 million and $13 million, and a primary service area
population between 75,000 and 200,000.
Benchmarking is defined as the TQM technique used to facilitate continuous
improvement by analytically comparing ongoing processes in an effort to manage change.
Peer Benchmarking is defined as the process of analytically comparing peer
organizations in order to discover current trends and show institutional gaps in
performance where improvement can occur. This definition combines business's
definitions of competitive benchmarking, as defined by Spendolini (1992) as measuring
against two or three direct competitors with the same customer base, and industry
benchmarking, as defined by Leibfried and McNair (1992) as looking for general trends
across a larger group of related firms who have similar interests and technologies. But
the goal of both is to provide external standards for internal measurement and
institutional improvement.
Process is defined as a series of actions or operations that leads to a particular
result. Jablonski (1992) defines a process in TQM "as a series of operations linked
together to provide a result that has increased value" (p. 52). Key processes are "critical
functions that determine how well an organization succeeds (e.g., admissions, advising,
teaching, and placement)" (Hammons, 1994, p. 338).
Trend is defined as a common approach to development, operation, or growth
being utilized by a majority of the colleges studied, thus indicating a degree of efficiency
in performance.
Malcolm Baldrige National Quality Award is defined as the annual United States
national quality award established by the Malcolm Baldrige National Quality
Improvement Act of 1987, signed by President Reagan and presented by the Department
of Commerce. The purposes of the award are to promote quality awareness, to recognize
quality achievement of United States companies, and to publicize successful quality
strategies (Jablonski, 1992). In 1995, the Malcolm Baldrige National Quality Award
Education Pilot was launched; "it implies improvement compared to peer institutions and
appropriate benchmarks" (Wolverton, 1994, p. 14).

Delimitations and Limitations of Study


A delimitation of this study is the purposive sampling of six mid-sized
community colleges. By limiting the number and size of community colleges studied,
the generalizability of the benchmarks to other institutions of higher education is
decreased.
A second delimitation of this study was allowing the institutions to design the
survey instrument for their needs. This too decreases the generalizability of the
benchmarks to other institutions.
A third delimitation of this study was that the surveys were sent to benchmarking
contacts on each campus; thus, the information given on the survey was a result of their
instructions and interpretation of the data sought.
Another delimitation of this study was that not all processes used at institutions of
higher education were included. The ten functional areas and processes included in the
study were chosen by the participating institutions as key areas of operation. Such
key areas were considered in order to provide specific and comparable information to aid
the participating institutions in making judgments about needed changes. Any
more data would have been cumbersome and overwhelming.
A limitation of the study is the self-report format of the questionnaire, which poses the
problem of honesty in reporting. However, since the questionnaire was composed with
input from each participating institution, honesty was assumed.

Significance of Study
Tradition and intuition, used often as a basis for making changes, are really no
replacement for objective external comparison and analysis, or benchmarking.
Benchmarking is emerging in leading-edge companies as a tool for obtaining the
information needed to support continuous improvement and gain competitive advantage.
Recent surveys across major countries indicate that over two-thirds of leading companies
conduct benchmarking on a regular basis (Cook, 1995, p. 11). Leibfried and
McNair (1992, p. 3) illustrate that management is looking beyond the basics for ways to
improve upon existing products and services, and benchmarking provides an excellent
way to do this and survive. The marketplace in both business and higher education today
expects, and is receiving, constantly improving products and services at an ever-
decreasing cost per function. Such reports suggest that benchmarking is becoming a key
management technique for the 1990s and beyond because it provides a new vision and
perspective on traditional management concerns.
Peterson (1993) predicts that colleges and universities "can never again expect to
receive as much of their funding from tax sources as in the past and must now convince
private funding sources that higher education is a worthwhile investment" (p. 12). If
community colleges begin to seek funding from local businesses and industries,
they must justify those contributions in terms of what the college can do for the
contributors. He further notes that in fulfilling the community college mission, "it is
essential that college services of all types, as well as curriculum and instruction, be
continuously assessed and enhanced" (p. 15). This study has the potential of aiding
community colleges in developing a continuous improvement tool that will lead to better
organizational effectiveness.
The challenge facing community colleges, and all colleges, is determining the best way
to bring about needed and continual change in areas of rapid transformation such as
operational efficiency, technology and an increasing knowledge base, demands for
accountability, and life-long learning opportunities. Quality management principles and
processes such as TQM (Total Quality Management), CQI (Continuous Quality
Improvement), and CQA (Continuous Quality Assurance) have already provided the way
and the language to meet these challenges in some higher education institutions. But
effective methods of bringing about continuous improvement, such as benchmarking, are
still needed. Benchmarking has provided the means for making comparisons to assist in
meeting the challenges of these transformations. For example, benchmarking benefits
higher education in the area of accountability by encouraging institutions to rethink
and reorient processes to improve quality and reduce costs.
Thus, this study of the development of a benchmarking process in mid-sized
community colleges in Texas will provide college administrators an evaluation of both
the design and methods of benchmarking that can be used in institutions of higher
education to make operational improvements. It also provided the community colleges
that participated in the study with data and benchmarks of practices in like institutions,
as well as providing other colleges with knowledge of existing practices in mid-sized
community colleges in Texas. This study is important because
• it is the first of its type to be conducted among community colleges; therefore, it will
provide valuable information on how benchmarking can be used in community
college educational settings;
• it will furnish guidelines for community colleges in applying the process of
benchmarking to their operations;
• it will test the usefulness of benchmarking for community colleges;
• it will add to the research and literature in the field of higher education;
• it will produce examples of efficient practices being used in community colleges;
• it will provide standards for participating mid-sized Texas community colleges to
measure themselves against.

Organization of the Document


The remainder of the dissertation is divided into the following four chapters. In
Chapter II, the relevant literature on Total Quality Management theory, its use in higher
education, its concept of continuous improvement, the technique of benchmarking, and
benchmarking's use in higher education is explored. Chapter III describes the
methodological framework for the study, including the background for the study, the sample
selection, the instrumentation, the pilot study, the data collection, the data analysis, and
the follow-up survey. In Chapter IV, the findings that answer the research
question and the six sub-questions are presented. In Chapter V, the conclusions,
recommendations, and implications for further study are discussed.

CHAPTER II
REVIEW OF LITERATURE

International competition forced business in America in the 1970s and 1980s to
adopt methods that would improve quality and cut costs based on the theory of Total
Quality Management; colleges and universities in the 1980s and 1990s have begun
adapting TQM theory to their institutions. Even though colleges and universities are
distinctly singular institutions, some of the TQM techniques for coping with change and
improving quality being used in American business today have also proven to be
successful in higher education. As complex organizations, colleges and universities have
many business characteristics, such as multi-million dollar payrolls, accounting and billing
departments, computer systems, and security guards, as well as a broad array of customers
including people who sit in classrooms, people who used to sit in classrooms, people in the
local community, employers who avail themselves of educational services, suppliers,
and legislators who want to ensure state funds are being used wisely. Many institutions
have diversified holdings such as restaurants, bookstores, living accommodations,
investments, and real estate (Seymour, 1993, p. vii). Myerson and Massy (1994)
observed that in many respects the challenges colleges face and the challenges corporate
America faces are more alike than different.

Theoretical Background
Total Quality Management (TQM) is the theory being adopted to improve both
American industry and higher education. The theory's primary founder was Dr. W.
Edwards Deming, who was joined by Dr. Joseph Juran after World War II. They introduced
the Statistical Quality Control (SQC) concept of management, a statistical theory
originated by Sir Ronald Fisher but used during World War II by Walter Shewhart, a
Bell Laboratories physicist, to develop the zero-defects approach to producing
telephones. Deming, who had worked with Shewhart, developed his own version of SQC,
which he introduced and refined in Japan (Jablonski, 1992, p. 29). Joseph M. Juran's
main contribution was defining and teaching how to create customer-oriented
organizational systems (Sashkin & Kiser, 1993, pp. 37-38). Both Deming and Juran saw
quality as a management function that could be systematically improved by using
statistical tools, consumer research, goal-setting, teamwork, problem solving, human
resource management, and strategic planning (Seymour, 1993, pp. viii-ix).
In Japan the two were joined by Kaoru Ishikawa, the early Keidanren leader who
helped Japanese executives identify the need for quality and who invited Deming to teach
Japanese business executives how to achieve quality; Ishikawa is known for his efforts to
spread the use of TQM techniques through his technical books that describe TQM tools
and his new tool, the Fishbone Diagram. The term "total quality control" was actually
coined by another pioneer of the theory, Armand V. Feigenbaum (Sashkin & Kiser,
1993, p. 37). Deming, Juran, and others emphasize that quality enhancement
throughout an organization is the basis for profit. This concept was first recognized by
the Japanese as the strategy needed to solve their economic crisis after World War II, and
thus Japanese companies made the first total commitments to quality enhancement
(Peterson, 1993, p. 5).
Since the work in Japan, Deming (1986) has refined his theory of management for
the improvement of quality, productivity, and competitive position into fourteen points:
1. Create constancy of purpose toward improvement of product and service,
with the aim to become competitive and to stay in business and to provide
jobs.
2. Adopt the new philosophy. We are in a new economic age.
3. Cease dependence on inspection to achieve quality. Eliminate their need for
inspection on a mass basis by building quality into the product in the first
place.
4. End the practice of awarding business on the basis of price tag alone.
Instead minimize total cost by moving toward a single supplier.
5. Improve constantly and forever the system of production and service, to
improve quality and productivity, and thus constantly decrease costs.
6. Institute training on the job.
7. Institute leadership.
8. Drive out fear, so that everyone may work effectively for the company.
9. Break down barriers between departments.
10. Eliminate slogans, exhortations, and targets for the work force asking for zero
defects and new levels of productivity.
11. Eliminate numerical quotas for the work force and numerical goals for
management.

12. Remove barriers that rob people of pride in workmanship by abolishment of
the annual or merit rating and of management by objective.
13. Institute a vigorous program of education and self-improvement.
14. Put everybody in the company to work to accomplish the transformation.
(pp. 23-24)

He adds that application of his theory produces a chain reaction: first, quality improves;
second, costs decrease because of less rework, fewer mistakes, and fewer delays and snags;
third, materials are used better; fourth, productivity improves; fifth, the market is
captured with better quality and lower prices; and sixth, the organization stays in business
and provides jobs (Sashkin & Kiser, 1993). Deming (1986) further argues that the
consumer is most important and that, therefore, quality should be aimed at the consumer's
needs, both present and future. He predicts that companies that adopt constancy of
purpose for quality, productivity, and service, and go about it with intelligence and
perseverance, have a chance to survive in the global market.
This fourteen-point theory is complemented by Deming's seven deadly diseases: lack of
constancy of purpose, an emphasis on short-term profits, individual performance
evaluations, managers who are highly mobile, use by management of numbers and
figures that are visible and available, excessive medical costs, and excessive legal liability
costs. Sashkin and Kiser (1993) found that these diseases in management "refer to
beliefs, policies, and practices so firmly entrenched that many, perhaps most, American
managers regard them as basic truths" (pp. 34-35). Such "basic truths" for a time hindered
American industry from competing successfully worldwide. Thus, the roster of founders
of the quality approach includes Deming, Kaoru Ishikawa, Joseph M. Juran, and Armand
V. Feigenbaum.

Philosophy of TQM
This new philosophy emphasizes a few guiding principles and applies to both
large and small institutions. TQM allows an organization to set expectations higher than
in the past, to recognize and remove barriers to change, and to enable high-level
managers to solicit the opinions and ideas of their associates and do something with those
good ideas. Supporting the philosophy of TQM are both qualitative and quantitative
tools that allow a better understanding of the way business is conducted. Such tools
allow for measuring improved quality which is required in TQM's continuous
improvement, as well as recognizing when the improved productivity, performance,
efficiency, and quality goals are achieved (Jablonski, 1992).
Jablonski (1992) reports that the U. S. General Accounting Office conducted a
review of the 20 highest-scoring applicants for the Malcolm Baldrige National Quality
Award in business over a two-year period to evaluate the impact of TQM practices on
their organizations. In nearly all cases, those using TQM techniques achieved better
employee relations, greater customer satisfaction, higher productivity, improved
profitability, and increased market share.

TQM Application
TQM is a "cooperative form of doing business that relies on the talents and
capabilities of both labor and management to continually improve quality and
productivity using teams," explains Jablonski (1992, p. 71). He further declares that three
ingredients are necessary for TQM to flourish in any company: participative
management, continuous process improvement, and the use of teams. Participative
management is defined as "arming your people with the skills and support to better
understand how they do business, identifying opportunities for improvement, and making
change happen..." (p. 71). Continuous process improvement (CPI), a major component of
TQM theory, is defined as "accepting small, incremental gains as a step in the right
direction toward total quality....Continuous process improvement reinforces a basic
tenet of TQM—long-term focus" (p. 72). TQM also involves teams, which should
include a "cross-section of members who represent some part of the process under study"
(p. 43). Business and industry have successfully applied these principles to their
operations.
In the 1980s, higher education institutions began to apply the TQM principles.
The early higher education institutions who tried the switch to quality management were
a mixture of community colleges, four-year colleges and universities, and four-year
public schools. Although distinctively different, each institution found the quality

21
principles appropriate to its situation. A number of institutions have been surveyed and
have provided in-depth information about why they became interested in the quality
movement (American Association 1994; Seymour 1993; Seymour & Collett 1991)
Freed, Klugman and Fife (1997) provided a list of institutions that participated in such
studies: Alabama State University, Arkansas State University, Arkansas Tech University,
Babson College, Belmont University, Brazosport College, Central Connecticut State
University, Clemson University, Colorado State University, Cornell University, Dallas
County Community College District, Delaware County Community College, Duke
University, El Camino Community College, Fordham University, Fox Valley Technical
College, George Mason University, Georgia Institute of Technology, Grand Rapids Junior
College, Lamar Community College, Maricopa County Community College District,
Marietta College, Northwest Missouri State University, Oregon State University, The
Pennsylvania State University, St. John Fisher College, Samford University, Southern
College of Technology, United States Air Force Academy, University of Chicago,
University of Illinois-Chicago, University of Maryland, University of Minnesota-Duluth,
University of Minnesota-Twin Cities, University of Montevallo, University of
Pennsylvania, University of Tampa, University of Wisconsin-Madison, University of
Wyoming, Winona State University (p. 15).
In response to this increased interest in the quality movement during 1990 to
1993, the American Association for Higher Education (AAHE) and the William C. Norris
Institute collaborated in January of 1993 to establish the Academic Quality Consortium
(AQC) with the purpose of providing campuses committed to implementing quality in
higher education "with the opportunity to exchange information, build on one another's
experiences, expand on assessment practices already used, and share the results of their
work with the wider higher education community" (Freed, Klugman & Fife, 1997, p.
29). One way the AQC shares its experiences and knowledge of quality practices is
through the AAHE Continuous Quality Improvement (CQI) Project, an e-mail discussion
list, CQI-L, open to all educators interested in quality improvement in higher education.
The group began with a membership of approximately 50 and had grown to
approximately 750 members by fall 1996.

Calek (1995) has also documented the growing interest in quality principles in
the annual Quality in Education Survey by Quality Progress. Since 1991, Quality
Progress has conducted a survey to determine how many community colleges and four-
year public and private colleges and universities offer courses in quality improvement and
whether they apply the quality principles in managing their institutions. The number of
participants has steadily increased over the five years: community colleges from 14 to 83
and four-year colleges and universities from 78 to 220. Eighty-eight percent of the four-
year colleges and universities that responded in 1995 reported using the quality principles
to manage their administrations, 55 percent offer quality-related certificates, minors, or
degrees, and 42 percent do both. Among community colleges, 91 percent of the
respondents use quality principles to manage their administrations, 66 percent offer
quality-related certificates, minors, or degrees, and 52 percent do both.
The American Association for Higher Education (1994) noted the growth of the
total quality management movement in colleges and universities in the late 1980s and
early 1990s. At that time institutions questioned whether the quality movement was appropriate
for education. Just a few years later, administrators did not ask whether the quality
movement was appropriate, but how the quality principles could be made relevant and
worthwhile on their campuses. Freed, Klugman and Fife (1994) recorded the growth of
the quality movement in higher education in a national survey on TQM on campus. In
this survey of over 400 institutions identified as having shown interest in the quality
principles, 25 percent of the responding institutions reported that they began
implementation of the principles in 1990 or before, 50 percent reported that they began in
1991 or 1992, and 25 percent reported that they began in 1993 or early 1994.
But in implementing TQM, many institutions have experienced failure. Creating a
culture for academic excellence by implementing the quality principles is not easy in
higher education institutions, "as the strong historical traditions in higher education make
any kind of change extremely difficult" (Freed, Klugman, & Fife, 1997, p. 9). These
authors further explain,

Most institutions have missions, but most are not accustomed to
measuring the outcomes of their processes. Traditionally, constituencies
in higher education institutions act independently rather than
interdependently. Leaders are usually not trained in the tools and
techniques used to improve systems and processes. Developing
management skills and knowledge is not the norm in higher education.
Professional development is more often discipline- and person-specific
rather than oriented toward developing members who can collectively
improve institutional processes. Although data are collected for a variety
of purposes in directing higher education institutions, the quality
principles emphasize the systematic collection of data to be used in
making academic and administrative decisions. Committees in academe
are common, but collaborating and working as teams for common
purposes are not. (pp. v-vi)

Carr and Johansson (1995, p. 16) found that TQM is process-focused, emphasizes
teams and mutual values, and uses shared tools for problem-solving. In practice, TQM
tries to improve processes and includes the continuous drive for quality improvement as
part of the fabric and culture of an entire organization (Parfett, 1994, p. 160). Thus the
process, methods, and language of total quality management were adopted by U. S.
industry to develop a method of quickly finding and implementing practices that would
achieve world-class results, and higher education has adapted these techniques to its
institutions.
Brown, Hitchcock, and Willard (1994) studied TQM failures and reported that if
there had been a failure, it was not one of philosophy but one of implementation. They
cite three phases of TQM implementation: start-up, alignment, and integration. They also
state, "It's no wonder that somewhere between one-half and three-quarters of the
organizations implementing TQM drop their initiatives within the first two years" (p. 1).
They then list reasons for possible failure in each phase. In the start-up phase the authors
cite "lack of management commitment, poor timing and pacing, wasted education and
training, and lack of short-term bottom-line results" (p. 2). In the alignment phase they
point to divergent strategies, such as employees thinking that quality is separate from work,
and "inappropriate measures, outdated appraisal methods, and inappropriate rewards" (p.
71). And in the integration phase, they list "failing to transfer true power to employees,
maintaining outmoded management practices, poor organization and job design, outdated
business systems and failing to manage learning and innovation diffusion" (pp. 138-139).
Applying the principles of TQM to higher education has not been fool-proof, but it has
proven successful in many institutions.

TQM in Higher Education


Manufacturers and non-profit organizations are no longer separate entities;
everyone, including colleges and universities, is in the same business, that of service;
therefore, competition, costs, accountability, and a service orientation are the motivation
behind the search for quality across a broad set of organizations (Seymour, 1993, pp. 7-8).
Because of this, some colleges and universities are realizing that TQM values are more
compatible with higher education than most other management systems (Sherr & Lozier,
1991, p. 3).
Higher education has discovered several reasons for adapting TQM theory. The first is
survival in an increasingly competitive environment. Seymour (1993) explains that in
this "Age of Consumerism" in higher education, it is a buyer's market where every
survey asks, "Why did you choose this college?" illustrating that students apply to more
schools now than they did in the past. "Once they've been accepted, they compare
financial aid packages, facilities, and so on. Transfers are up as well....The result is a
dynamic marketplace" (pp. 2-3).
The second motivating reason is the escalation of the costs of doing business,
forcing institutions to conserve resources. Because resources are not as plentiful as they
once were, due to resistance to higher tuition levels and financial problems at the state and
federal levels, limited funds are now available to colleges. So colleges must control
costs and find reliable sources of additional revenue merely to keep resources from
declining (DeCosmo, Parker, & Heverly, 1991, pp. 13-14). Many institutions now realize
that eliminating waste, inefficiency, and rework leads to improvements in processes and
that such an approach is to be sought because it does not require new resources.
The third driving force is the trend to make organizations more accountable for
their actions and outcomes (p. 5). In higher education, concerns over the decline of
student performances on standardized exams and professional licensing exams, the dilution
of the liberal arts, and the steady climb of tuition have caused legislative commissions
and assessment programs to begin investigating the value of the college degree (Seymour,
1993). The public and governments are scrutinizing higher education very closely and
are demanding greater accountability in areas of student retention, especially retention of
minority students, and the need for remedial assistance in order to succeed in college-
level courses (DeCosmo, Parker & Heverly, 1991). The standard response has been to
revisit time-worn cost containment and policy options. Institutions have become
accomplished experts in crisis management rather than simply asking, "Is there a better
way to manage higher education?" (Seymour, 1993, p. viii).
Hubbard (1993) points to the first three institutions that became involved in the
quality movement: Northwest Missouri State University in 1984, Fox Valley Technical
College in 1986, and Oregon State University in 1989. Although these three institutions
were different in size, location, and type of degrees offered, they all had presidents who
developed an interest in the quality movement, became champions of quality within their
own institutions, and applied the quality principles with a long-term commitment
(Hubbard, 1993). Many other colleges and universities have now agreed with Seymour
(1993) who said that TQM is "too well-grounded in a scientific approach to problem
solving, and it has been tested, scrutinized, and revised in thousands of organizations over
a period of three years. Bottom line: It works" (p. ix). He goes on to conclude that
"quality can be actively and aggressively managed into our colleges and universities" (p.
x), because quality is being managed into operations and attitudes in higher education
"with a sense of purpose, urgency and excitement" (p. 3). Then he cites examples of
higher education institutions which since 1985 have adapted TQM principles with the
concept of continuous, systematic improvement to their campuses: the University of
Wisconsin-Madison, North Dakota University System, Oregon State University, and
Delaware County Community College.
Because the goals of private industry and universities differ from those of
community colleges, Peterson (1993) observes that the strategic planning models, such
as Management by Objectives, implemented by almost every institution of higher
education during the 1980s did not fit very well into community colleges' structure and mission.

However, since then community colleges have recognized the importance of long-range
planning in maintaining their mission as opportunity colleges. During this time they have
evolved from junior to community colleges, and their mission has become more diverse.
Due to this evolution, systematic long-range planning and improvement have become
more necessary to maintain community colleges' effectiveness. Peterson (1993) writes
that it is "apparent that the basic tenets of this concept [TQM] are applicable to non-profit
organizations" (p. 3) such as colleges and universities, but institutions of higher education
are unique in many ways and must develop their own methodology in implementation.
He calls this new methodology Continuous Quality Assurance (CQA), which is TQM
adapted to colleges and universities.
DeCosmo, Parker, and Heverly (1991) agree that community colleges can provide
"an affordable avenue to excellent programs and services" (p. 14) by adapting the
principles and methods embodied in TQM to their campuses. These authors conclude:
Solving today's problems requires the intelligence and hard work of
everyone. TQM offers a paradigm that encourages effective participation.
To achieve giant strides in both quality and efficiency, everyone must be
freed to pursue quality and accept responsibility for it, to work together in
teams, and to listen to internal and external customers. (p. 14)

Seymour (1993) suggests that TQM is a structural system that can help create a true
learning organization which involves learning how to improve the registration of students
and learning how to improve campus maintenance (p. 31).
A TQM environment calls for an unrelenting desire for quality, teamwork,
long-term thinking, rewards for results, and empowered employees. In
many colleges, there is a willingness to settle for less,
compartmentalization and competitiveness between individuals and units,
short-term thinking, poor and nonexistent reward systems, and external
pressures for accountability, bureaucratic processes, and "layered"
administration....TQM [is] not for everyone....I am thoroughly convinced
that TQM is an extremely worthwhile approach to managing a community
college. In my view the long-term benefits of TQM far outweigh its costs.
(Hammons, 1994, p. 344)
Schauerman (1994) describes El Camino College's struggle to overcome resistance to
change and implement TQM on his campus, and then concludes that a TQM culture
yields improved service processes, greater productivity through wiser allocation of
monetary and human resources, and people who work together and respect each other's
strengths. In writing of her experience in implementing TQM on her campus, Thor
(1994) also chronicles the barriers to change that must be overcome in implementing
TQM. She summarizes by saying, "the process of continuous change has begun to make
its mark on American education, and employees and students nationwide are benefitting
from that beginning" (p. 367). In relating the establishment of TQM at Jackson
Community College, LeTarte and Schwinn (1994) note that a basic belief in TQM is a
belief in positive change.

TQM Continuous Improvement


A major component of TQM is the process of continuous improvement. Kantrov
(1984) points out that "neither blind resistance to change nor wholesale junking of the
past makes for sensible development. Gradual adaptation does" (p. 177). This gradual
adaptation is called Kaizen, which is defined as gradual, unending improvement, doing
little things better, and setting and achieving ever-higher standards (Kantrov, 1984). He
further points out that Kaizen means ongoing improvement involving everyone, including
both managers and workers, and that the Kaizen philosophy assumes life, be it work life,
social life, or home life, can be constantly improved.
Benchmarking is an excellent vehicle for developing a Kaizen culture in a
company because it is a continuous process of "evaluating current performance, setting
goals for the future, and identifying areas for improvement and change" (Leibfried &
McNair, 1992, p. 13). Rush (1994) further observed that benchmarking can play a pivotal
role in helping to reshape an institution of higher education by facilitating the rethinking
and reorientation necessary for continuous improvement.

TQM Benchmarking
Benchmarking focuses on the processes used by competitors, as well as industry
trends, to identify opportunities for continuous improvement, not just for price cutting,
observed Leibfried and McNair (1992). They further explain that benchmarking
transforms the change process into a continuous improvement effort focused on process
learning at the organizational level, which leads to continued innovation and change.
Because benchmarking is based on clear, two-way communication and participation
between partners, benchmarks cannot be established without a comprehensive
understanding of current practice within an organization, the desired results, and the
recognition and acceptance of the changes that will need to occur to meet and exceed
those goals.
Leibfried and McNair (1992) also show that benchmarking is a dynamic, ongoing
effort by management and workers alike that contains the seed of the organizational and
cultural change that must occur if survival, let alone competitive excellence, is to be
achieved. Thus, the goal is to put benchmarking to work to achieve world-class
competitive capability. Tucker (1996) found that the foundation of benchmarking is
"education or transferring learning from one group to another" (p. xiv).

Business Benchmarking Background


Many benchmarkers consider Sun Tzu, the fourth-century B.C. Chinese author of
The Art of War, to be the patron saint of the process (Camp, 1995). Cook (1995) reports
that primitive benchmarking "first emerged in America in the 1950s when standards were
used to measure business performance in terms of cost/sales and investment ratios. This
allowed businesses within particular industries to see how they compared with their peers
and to identify their strengths and weaknesses" (p. 15). However, these benchmarking
techniques did not investigate the practices or procedures which led to achieving
increased improvements in performance. She further explains that the growth of the
computer industry throughout the 1960s and 1970s brought further developments in the
use of benchmarking due to the proliferation of suppliers and systems, which allowed the
purchaser to measure and compare performance systematically.
But benchmarking as a change tool did not evolve in America until the early
1980s, when global competition forced American industry to find methods of rapidly
improving products and services or else face extinction. At this time total quality
management was already spreading rapidly, and it provided managers with new
principles, tools, and methods for analyzing the way people and processes worked in their
businesses (Tucker, 1996). Benchmarking was actually pioneered in the early 1980s by
the Xerox Corporation in response to increased competition and a rapidly declining
market share. Cook (1995) reports that whenever Xerox found something that someone
else did better, it adopted that level of performance as a new base standard in its own
operations. The guiding principle was "Anything anyone else can do better, we should
aim to do at least equally well" (p. 14). Xerox
Reprographics Manufacturing Group president, Charles Christ, sent a team to Japan
because
I needed a benchmark, something that I can measure myself against to
understand where we have to go from here. This competitive benchmarking
resulted in specific performance targets rather than someone's guess or
intuitive feel of what needs to be done, which is the real power of the
process. (Leibfried & McNair, 1992, p. 20)

The use of benchmarking grew rapidly in industry throughout the US during the
1980s as many companies recognized the need to improve the quality of their output and
business performance. This development naturally followed from TQM's emphasis on
process, and benchmarking became a recognized tool in the development of a continuous
improvement process. Other companies such as FedEx and IBM, along with health care
organizations and hotels, began to use benchmarking. Tucker (1996) declares that almost
all of the Fortune 500 companies have done at least one benchmarking project, and many
use it on a regular basis.
UK companies with US connections began benchmarking during the early 1990s
(Cook, 1995), and organizations such as Xerox, Digital Equipment Company, and
Milliken Industrials used and continue to use this technique as an integral part of their
corporate strategy. Cook (1995) further explains that in 1991 the first benchmarking
seminar was held by the British Quality Association, and that almost all of these
companies expect to increase their investment in benchmarking over the next five years.

Benchmarking in the Baldrige Award


Benchmarking is a part of the prestigious American prize for quality, the Malcolm
Baldrige National Quality Award (1994). The stated purpose of this award is "to
promote quality awareness, to recognize quality achievements in U.S. companies, and to
publicize successful quality strategies" (Spendolini, 1992, p. 5). Organizations
competing for it are required to document their benchmarking projects in several sections
of the application. Thus, one issue raised by the Baldrige award is that of external
comparisons. Wolverton (1994) further adds that "internally, this means year-to-year
improvement; externally, it implies improvement compared to peer institutions and
appropriate benchmarks" (pp. 14-15). For example, The Malcolm Baldrige National
Quality Award: A Yardstick for Quality Growth by Heaphy and Gruska (1995) describes
category 2.0 of the award, Information and Analysis, as the guideline which examines the
scope and use of data as well as the benchmarking activities of the company. Under the
heading Competitive Comparisons and Benchmarks, the right benchmark data must be
chosen for collection and analysis, and then used to guide decisions. Benchmarking,
examined in this category, is an approach for learning what other organizations are doing
and for getting new ideas. This section, 2.2, is worth fifteen points and includes two areas
which directly address benchmarking:
which directly address benchmarking:

a) How competitive comparisons and benchmarking information and data
are selected and used to help drive improvement of overall company
performance, or specifically describing how needs and priorities are
determined, how criteria for seeking appropriate information and data
from within and outside the company's industry are chosen, how the
information and data are used within the company to improve
understanding of processes and process performance, and how the
information and data are used to set stretch targets and/or encourage
breakthrough approaches; b) How the company evaluates and improves
its overall process for selecting and using competitive comparisons and
benchmarking information and data to improve planning and overall
company performance by using data obtained from other organizations
such as customers or suppliers through sharing, obtained from the open
literature, obtained by testing and evaluation by the company itself, and/or
obtained by testing and evaluation by independent organizations. (p. 54)

In 1995, after two years of research and input by educators and organizations such
as the Association for Supervision and Curriculum Development and the American
Association of School Administrators, information about the upcoming Baldrige Quality
Award for education was released by the U.S. Department of Commerce as the Malcolm
Baldrige National Quality Award Education Pilot. In this new Baldrige education pilot
criteria, all seven categories relate to information and analysis that support overall
mission-related performance excellence. This includes benchmarking and peer
comparison (p. 49). Many leaders, such as Tucker (1996, p. 6), expect that this will
radically change the way we assess school effectiveness.

Definitions of Benchmarking
Benchmarking corresponds to the human learning experience in teaching an
organization to improve, and continuous improvement brings needed changes that keep
the institution competitive. David Kearns, CEO of Xerox Corporation, the American
pioneer of benchmarking, defines benchmarking as "the continuous process of measuring
our products, services, and practices against our toughest competitors or those companies
known as leaders" (Leibfried & McNair, 1992, p. 10). Tom Peters in Thriving on Chaos
uses the term "creative swiping" to describe the process of "looking beyond your own
organization for other ideas... that may be adapted and enhanced to fit your special
circumstances" (p. 25).
Robert Camp (1995) shares the dictionary definition: benchmarking is "a standard
against which something can be measured. A survey mark of previously determined
position used as a reference point" (p. 18). He further defines benchmarking as a 10-step
process which is "a structured way of looking outside to identify, analyze, and adopt the
best in the industry or function" (p. 18). Benchmarking may be descriptive and be
converted into performance measurements that can show the effect of adopting the
practice.
Kempner (1993) defines benchmarking more concisely as an ongoing, systematic
process for measuring and comparing the work processes of one organization to those of
another, bringing an external focus to internal activities, functions, or operations.
Leibfried and McNair (1992) agree that benchmarking brings an external focus, "a
movement away from a concern with cost reduction and budgets, to an understanding of
what activities customers value and what level of performance they expect" (p. 2). Shafer
and Coate (1992) define benchmarking as a "positive process that provides objective
measurements for baselining (setting the initial values), goal-setting, and improvement
tracking, which can lead to dramatic innovations" (pp. 28-35). Tucker (1996) defines
benchmarking as the process used to "achieve superior performance through a team
research and data-driven process by which learning and innovation trigger fundamental
breakthroughs in thinking and practice" (p. ix).
Dertouzos, Lester, Solow, and the MIT Commission on Industrial Productivity
(1989) define benchmarking as the "ongoing, systematic process for measuring and
comparing the work processes" of one organization to another (p. 84). Cook (1995)
defines benchmarking as:
the process of identifying, understanding and adapting outstanding
practices from within the same organization or from other businesses
to help improve performance. This involves a process of comparing
practices and procedures to identify ways in which an organization can
improve. Thus new standards and goals can be set which, in turn, will
help better satisfy the customer's requirements for quality, cost, product
and service. (p. 13)

Thus, through benchmarking an organization can add value for its customers and
distinguish itself from its competitors.
The Superintendent's Quality Challenge (1994) defines benchmarking as an
"improvement process in which an organization compares its performance against best-in-class
organizations, determines how those organizations achieved their performance
levels, and uses the information to improve its own performance by benchmarking
strategies, products, programs, services, operations, processes and/or procedures" (p. 29).
Regardless of how benchmarking is defined, all agree that it is an analytical process of
comparison used as a tool for continuous improvement in an effort to manage change.
The purpose of the benchmarking process is to provide managers or administrators with
external points of reference or standards for evaluating the quality and cost of their
organization's internal activities, practices, and processes. Rush (1994) affirms that
benchmarking strives to provide an institution with a
benchmarking strives to provide an institution with a

demonstrable cost or quality advantage by replacing management intuition
or 'gut feel' with facts and analysis that foster better operational practices.
It attempts to answer the following questions: How well are we doing
compared to others? How good do we want to be? Who's doing the best?
How do they do it? How can we adapt what they are doing to our
institution? How can we be better than the best? (pp. 84-85)

Furthermore, Alstete (1995) explains that the goal of benchmarking is to "provide key
personnel, in charge of processes, with an external standard for measuring the quality and
cost of internal activities, and to help identify where improvement opportunities exist"
(p. iii).
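The comparison these definitions describe reduces to a small calculation: measure an internal metric against an external best-in-class standard and quantify the gap. The sketch below is purely illustrative; the function name, the sample metric, and all figures are invented, not drawn from the sources cited.

```python
# Illustrative benchmark-gap calculation (all names and figures hypothetical).

def benchmark_gap(own_value, best_in_class, higher_is_better=True):
    """Percent gap between our performance and the best-in-class standard."""
    if higher_is_better:
        return (best_in_class - own_value) / best_in_class * 100
    return (own_value - best_in_class) / best_in_class * 100

# e.g., cost to process one admission application (lower is better)
gap = benchmark_gap(own_value=48.0, best_in_class=32.0, higher_is_better=False)
print(f"{gap:.1f}% above best-in-class cost")  # prints: 50.0% above best-in-class cost
```

A positive gap flags an improvement opportunity; a zero or negative gap means the process already meets or beats the external standard.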

Types of Benchmarking
Research shows six basic types of benchmarking studies can be conducted,
depending upon the data needed: Internal, Competitive (including peer), Industry,
Noncompetitive, Best practice, and Functional/Generic. Internal benchmarking is defined by
Cook (1995) as the "measuring and comparing of organizational data on similar practices
from different departments of the same organization" (p. 18). Alstete (1995) applies this
definition to higher education by explaining that internal benchmarking can "best be
conducted at large, decentralized institutions where there are several departments or units
that conduct similar processes" (pp. iv-v).
Competitive benchmarking, as defined by Spendolini (1992), is "measuring an
organization against direct competitors who are selling to the same customer base" (p.
17). Rush (1994) adds that the intent is to find an "external standard against which the
institution can compare itself" (pp. 89-90). Such benchmarking is used to establish
performance standards and to detect trends in the competitive environment, and Leibfried
and McNair (1992) found that it usually includes two or three of a firm's closest
competitors.
Industry benchmarking, as defined by Rush (1994), is "identifying trends and
providing insights into how those trends are being set" (p. 90). Leibfried and McNair
(1992) show that it looks for general trends across a much larger group of related firms
than competitive benchmarking, using a "more general procedure that compares a firm
against companies with similar interests and similar technologies, attempting to identify
product and service trends rather than current market share rankings" (p. 116). This is
similar to competitive benchmarking, but differs in that the scope is broader to allow
looking at more institutions. But the goal is again to provide external standards for
internal measurement and to identify the best operational practices that could be adopted
or adapted at an institution.
Noncompetitive benchmarking, as defined by Cook (1995), is the process of
measuring and comparing "a related process in a noncompetitive organization, a related
process in a different industry or an unrelated process in a different industry" (p. 19).
Best practice/world-class benchmarking is defined, also by Cook (1995), as the process of
"learning from best practice/world-class organizations, leaders of the process being
benchmarked" (p. 20).
This type of benchmarking is also called functional/generic benchmarking by
Spendolini (1992, p. 17). Rush (1994) differentiates between functional and generic
benchmarking by saying that "functional benchmarking uses a large group of competitors
and is more broadly defined while generic uses the broadest application of data collection
from different industries to find the best operations practices available" (p. 11). Alstete
(1995) reveals that the selection of the benchmarking type depends on the "processes
being analyzed, the availability of data, and the available expertise at the institutions" (pp.
iv-v).

Higher Education Benchmarking


Traditionally, college and university operational costs have been functional and
organizational, with institutional funds being budgeted on a department-by-department
basis and a clear emphasis on inputs. Thus, Rush (1994) maintains that past
benchmarking in higher education has been input driven (e.g., instructional dollars spent
per student, students per faculty member, etc.), with few measures being output driven.
But with the adoption of TQM and the concept of continuous improvement, interest in
benchmarking is growing in institutions of higher education. Many studies have been
done on total quality management in higher education (Coate, 1990, 1993; Cornesky,
1992; Seymour, 1993a, 1993b, 1994, 1996; Teeter & Lozier, 1993; Williams, 1993), but
little research has been documented about benchmarking (Kempner, 1993; Shafer &
Coate, 1993; Shaughnessy, 1993).
Evidence of growing interest in benchmarking appears in the National
Postsecondary Education Cooperative (NPEC) Council Meeting Proceedings of
December 9-10, 1996, where a Research Associate with the State Higher Education
Executive Officers opened a discussion with an overview of benchmarking, including its
definition, processes, and current national efforts. The discussion was to center on a
possible benchmarking study by the council. However, most participants expressed the
view that many other bodies, such as institutional departments, individual institutions, or
postsecondary systems, could be more effective than NPEC. Some authors claim that
benchmarking used with other improvement initiatives has the capacity to move business
processes in higher education forward at a faster pace than other approaches used alone
(Douglas, 1997; Shafer & Coate, 1992). Tucker (1996) supports the idea that individual
institutions can be more effective than large organizations at addressing benchmarking
studies. She explains that what's missing from school reform efforts is a "careful analysis
of what is and isn't working in the individual current system and an infusion of ideas and
documented practices that work in similar settings" (p. x).

Examples of Benchmark Studies


Within higher education, benchmarking is a relatively new phenomenon.
Although the field has abounded with comparative data for many years, very little of it
has focused on key operational processes and the outputs of those processes.
Traditionally, comparative data have been used to justify budget requests or legislative
appropriations, to satisfy the ratio requirements of rating agencies, or to keep managers
and boards informed about institutional revenue and expense patterns. These traditional
comparative data most often took the form of balance sheet ratios, revenues and
expenditures per student, and other similar measures. "In almost all cases these measures
have been used to defend institutional growth, program expansion, or budgeting
increases" (Rush, 1994, p. 86).

Some examples of benchmark studies do exist in higher education. Alstete
(1995) reports that graduate business schools, professional associations such as the
National Association of College and University Business Officers (NACUBO) and the
Association for Continuing Higher Education (ACHE), independent data-sharing
consortia, private consulting companies, and individual institutions are all conducting
organizational benchmarking projects. The broad-based NACUBO benchmarking
program (NACUBO, 1995) was begun in late 1991, in collaboration with Coopers &
Lybrand, to establish a national database of key benchmark data for a variety of financial
and administrative functions within four-year higher education institutions. It seeks to
provide participants with an objective basis for improved operational performance by
offering the best practices of other organizations. Between 1991 and 1995, 344
institutions participated in the NACUBO study (Haack, 1998). The current project
analyzes core functions of colleges and universities, such as accounting, admissions,
development, payroll, purchasing, student housing, and others. The information does not
yet include academic department benchmarks, but it does provide data for 38 functional
areas. The purpose of the NACUBO benchmarking project is to "provide colleges and
universities with current operational information and benchmarks as they consider
various restructuring and cost-reduction initiatives" (Rush, 1994, pp. 94-95). Haack
(1998) found that institutions involved with this study of benchmarking reported positive
changes in higher education business practices. The approach of using the benchmark
data in a nontraditional manner, not to justify more money from states, students, or
donors but to cut costs and increase quality, is unique to higher education (Blumenstyk,
1993; Massy & Meyerson, 1994).
Reporting on his experience with benchmarking, Edwin J. Merck, Vice President
for Finance and Operations at Wheaton College, says:
We had a lot of good ideas for improvements internally, but we got
a fuller spectrum by looking for best practices....We contacted some
similar sized schools to see how they handled the coordination issues [of
the financial aid letter and student's bills], and learned that all had
grappled with this problem; there were a wide range of possible
approaches, and no one school was completely satisfied with its solution.
Taking some comfort in the universality of this problem, we built upon the
ideas of others, which enabled larger improvements at a quicker pace. In
turn, we shared our results; several of the originally 'benchmarked'
schools have found our solution a useful enhancement to implement or
adapt. It's really not necessary for all of us to reinvent the wheel with
every problem. If we aggregate creative resources, we'll all be able to
spend less time solving each problem, thus allowing capacity for tackling
more problems with limited resources. (Kempner, 1993, p. 22)

The Association for Continuing Higher Education (ACHE) and graduate business
schools have also conducted specialized benchmarking studies that focus on the
"processes and practices concerning their institutional departments" (AACSB & Alstete,
p. 45). The Colleges of Business at Arizona, Northwest Missouri, and Samford all
benchmark, according to Wolverton (1994). Arizona has tried to gather data from peer
institutions on college-specific priorities, but such benchmarking at the undergraduate
institutional level remains limited because "the collection of relevant data about peer or
better baccalaureate programs has not been accomplished" (COB, 1994, p. 34).
Colleges and universities can use benchmarking to bring about the changes needed
for school improvement. Benchmarking, explains Tucker (1996), is not a mechanism for
reducing a school's budget; instead, it is a
mechanism for deploying resources in the most effective manner to
achieve customer satisfaction. It is not a cookbook program that requires
only a recipe for success. Instead, it is a discovery and learning process
that can be used over and over again to achieve different goals. It is a way
of working and thinking to achieve continuous improvement in a college.
(p. 3)

Benchmarking can be used to address specific college strategic goals more rapidly.
Tucker (1996) believes that it can be used to "prevent 'wheel reinvention' and learn
how other organizations have addressed the issues they are facing" (p. 8). She also states
that federal and state mandates can be implemented very efficiently through a benchmark
study, and that while conducting a benchmark study an institution can practice one of the
fundamental improvement strategies required in the Malcolm Baldrige Quality Award
criteria.

Benchmarking is being used to improve administrative processes as well as
instructional models at colleges and universities by examining processes and models at
other schools and adapting their techniques and approaches (Chaffee & Sherr, 1992;
Clark, December 5, 1993). Such a project can provide "an objective basis for improved
operational performance measurement, a tool for change within the institution, a 'pointer'
to the best practices of other organizations, and a means to bring about change more
quickly" (Rush, 1994, p. 96). Benchmarking is a practical approach for a college or
university to use to assess how its resources are being spent. It can be a "vehicle for
promoting substantive, change-oriented action within an institution by providing
compelling evidence of the need to change" (Rush, 1994, p. 96). A benchmark study
offers the following advantages to an institution: "It sets performance goals. It helps
accelerate and manage change. It improves processes. It allows individuals to [think]
'outside the box'. It generates an understanding of world-class performance" (Cook,
1995, p. 1).
Due to its reliance on hard data and research methodology, benchmarking is
especially suited for institutions of higher education, where these types of studies are
usually very familiar to faculty and administrators. Practitioners at colleges and
universities have found that "benchmarking helps overcome resistance to change,
provides a structure for external evaluation, and creates new networks of communication
between schools where valuable information and experiences can be shared" (AACSB,
1994, pp. 16-17).

Summary
This review of literature has included discussions of the theoretical background of
Total Quality Management, TQM use in higher education, TQM's continuous
improvement concept, TQM benchmarking, and higher education's benchmarking.

CHAPTER III
METHODOLOGY

This chapter will discuss the background, sample selection, instrumentation
(including its development and content), the pilot study, data collection, data analysis,
and the follow-up survey used to determine institutional implementation of the study
results.

Background for Study


Because of its interest in and willingness to invest in the project, a community
college was selected as the lead college, or the sponsoring base of operations, for this
study. This lead college had already created a Resource Management Task Force within
its Total Quality Management system, with representative and diverse membership from
the governing board, administration, faculty, and staff. This Task Force, led by the college
president, was considering areas and methods of strategic change necessary for the future
success of the institution. Thus, the Task Force hired the researcher to serve as an outside
facilitator and a resource person throughout the study. The researcher was contracted to
conduct the benchmarking study following the example of the Exxon Chemical and Janssen
Pharmaceutica companies (Rush, 1994), who both used an outside facilitator to "increase
objectivity and breadth" (pp. 195-196) in their benchmark studies. The researcher's
community college background, statistical training, and knowledge of the benchmarking
process added breadth to the process. The outside researcher also provided the
objectivity and confidentiality necessary to gain support from other mid-sized community
colleges in Texas, as well as the time needed to keep the project on track.

Sample Selection
Because benchmarking is useful for evaluating several like processes at once and for producing institutional trends only when similar institutions are compared, mid-sized Texas community colleges were selected. Community colleges were chosen for this study primarily by considering student and community populations and the appropriations or funds available for operation. In order to select the colleges to be studied, statistics were gathered on student headcount, semester contact hours generated, community population base (including both the city size and the surrounding county sizes), and appropriations. Thus, the level of money available for operation and the population served by higher education operations were the primary considerations in trying to equate schools with comparable means.
Mid-sized community colleges were defined as community colleges with a student headcount between 5,000 and 11,000, contact hours between 2.5 million and 5.5 million, and appropriations between $9.5 million and $13 million, with a primary service area population of between 70,000 and 210,000. Colleges fitting these criteria are found in Table 3.1.
Nine colleges fell into the mid-range of statistics being used for selection.

Table 3.1. Statistics on community colleges considered to be mid-sized.

College   Appropriation   Service Population   Student Headcount   Contact Hours
C1        $10,997,430     113,060              7,016               3,235,161
C2        $12,090,000     1,914,903            9,319               3,972,995
C3        $12,724,000     1,562,544            10,200              3,988,118
C4        $12,902,013     206,279              10,377              5,457,133
C5        $10,480,000     73,684               6,887               3,250,983
C6        $ 9,661,000     21,035               5,561               2,944,409
C7        $ 9,482,034     78,073               4,590               2,336,355
C8        $11,987,000     134,642              7,582               3,867,756
C9        $ 9,787,000     155,393              5,652               3,004,090
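The screening described above amounts to an interval check for each criterion. The sketch below is a hypothetical illustration of that screening (the original study applied these ranges by hand, not with code); because some candidate colleges met only most of the criteria, the check reports which ranges a given college falls inside.

```python
# Hypothetical sketch of the "mid-sized" screening ranges described above;
# not part of the original study, which applied the ranges manually.

RANGES = {
    "headcount": (5_000, 11_000),              # student headcount
    "contact_hours": (2_500_000, 5_500_000),   # semester contact hours
    "appropriation": (9_500_000, 13_000_000),  # dollars available for operation
    "service_population": (70_000, 210_000),   # primary service area population
}

def criteria_met(college):
    """Return the names of the mid-sized ranges this college falls inside."""
    return [name for name, (low, high) in RANGES.items()
            if low <= college[name] <= high]

# College C1 from Table 3.1 satisfies all four ranges.
c1 = {"headcount": 7016, "contact_hours": 3_235_161,
      "appropriation": 10_997_430, "service_population": 113_060}
print(len(criteria_met(c1)))  # 4
```

A college such as C2, whose service population falls outside the 70,000-210,000 range, satisfies only three of the four ranges, mirroring the text's observation that the nine candidates met the qualifications to varying degrees.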

To further ensure that the selected institutions were comparable in available resources, other statistics were considered, as shown in Table 3.2. Revenue figures considered included annual budget data such as the tax base, the maintenance and operation (M&O) tax rate, M&O taxes assessed per contact hour, unrestricted tuition and fee revenue per contact hour, and tax value per contact hour for 1994-95. These statistics show that comparable resources were available and used.

Table 3.2. Revenue statistics considered on community colleges defined as mid-sized.

College   Tax Base 1995     M&O Rate   M&O Taxes per   Unrestricted Tuition &    Tax Value per
                            1995-96    Contact Hour    Fee Revenue per           Contact Hour
                                                       Contact Hour
C1        $ 4,384,591,839   .1233      $1.70           $1.38                     $1,339
C2        $   954,237,592   .0452      NA              NA                        $  240
C3        $17,040,410,727   .0674      $2.63           $1.50                     $4,326
C4        $ 8,461,063,860   .1749      $2.53           $1.11                     $1,551
C5        $ 3,062,748,240   .1598      $1.70           $1.48                     $1,031
C6        $ 4,304,865,606   .0722      $ .95           $1.14                     $1,357
C7        $ 3,846,087,476   .1984      NA              NA                        $1,646
C8        $ 4,433,455,292   .1223      $1.33           $1.71                     $1,085
C9        $ 2,008,967,337   .2310      $1.52           $1.57                     $  659

Nine Texas community colleges met these "mid-sized" qualifications in four or more areas. One of these institutions was eliminated from the study because it failed to meet the definition of a mid-sized community college in two critical areas, student headcount and contact hours, and it was experiencing financial difficulties at the time. Another institution was eliminated from consideration because it was receiving support from a four-year university system, and this unique situation would have skewed the results. One college elected not to participate in the study due to internal turmoil.
Thus, six peer community colleges that fit the definition of a mid-sized community college were selected and agreed to participate; it was also felt that a small number of institutions would keep the amount of data manageable. Including institutions that were not comparable in size, budget, and environment would not have enhanced the desired results, because organizations utilizing more or less money to serve more or fewer students would not allow the participating institutions to apply practices and trends to their own situations in a peer benchmarking setting.

Instrumentation
A survey design was selected for this study because it allowed for the efficient collection of benchmarking data from each institution and in follow-up procedures. The survey design also allowed for rapid turnaround in data collection, the ability to make comparisons and identify practices and gaps across institutions, and a way to determine use of the benchmark data for improvement.

Support secured
Adopting Peterson's (1993) belief that the college president is the key leader in a higher education system, the presidents of the six selected colleges were given an explanation of the project by the lead college's president and asked for their institutions' cooperation. This explanation included a guarantee of confidentiality and a rationale for the use of the data collected. Each president designated an institutional study director or contact person responsible for helping decide what data was to be collected for benchmarking by giving input on the design of the study, seeing that the questionnaire was completed and returned in a timely manner, and reporting the survey results to their institution. Therefore, each contact person assumed responsibility for approving the items in the scope of the survey instrument and then contacting appropriate personnel on their campus to complete the survey form.

Survey development
After the facilitator conducted research on benchmarking, she, along with the Task Force of the lead college, took the initiative in developing the survey instrument. Recognizing, as Rush (1994) points out, that before any benchmarking is undertaken "the ultimate objective or target has to be very clearly understood and the potential cost and benefits assessed," the Task Force set the objective for the study: to collect information from like community colleges in key areas of resource management in order to determine which processes at their institution could be improved. The participating community colleges were to use the data for similar purposes.

Combining Leibfried and McNair's (1992) suggestion that a benchmarking process follow three steps, "identification of core issues, internal baseline data collection, and external information data collection" (p. 45), with Rush's (1994) guidelines for a benchmarking questionnaire, the process of structuring the survey instrument was begun. The lead community college held Task Force and executive staff meetings to select the areas that would be most beneficial to benchmark. Each major area of college processes was considered as a possible area for benchmarking.
The lead benchmarking college was guided in establishing the questionnaire by the belief that the "exchange of information" is critical to the external data collection process. Because a benchmarking organization expects to both give and receive information, Leibfried and McNair (1992) advise, "do not ask for something that you are not willing to provide to others" (pp. 44-45). The Task Force, with the assistance of the facilitator, selected the college personnel from the lead institution most knowledgeable about key processes, and the other five institutional leaders were contacted for input on which key process areas should be selected for benchmarking. This procedure followed Rush's (1994) guidelines of "clearly identifying the department, activity, or process to be benchmarked so that apples-to-oranges comparisons are avoided... and keeping the study simple" (p. 96). Benchmarking is intended to point to trends or best practices and to allow an institution to measure its own performance in light of customer expectations and the performance of other organizations; thus, the five other schools were continually consulted about the departments, activities, and processes that they desired to benchmark.
Once the content areas were designated, knowledgeable people in each area were selected to work with the facilitator in developing questions and items for the scope of the survey. These lead college experts met in brainstorming sessions with the facilitator to select areas or topics of comparison for the survey. The facilitator then formulated questions for the survey instrument, and the expert groups met again for review and revision sessions. During this time, the other institutional leaders were contacted by e-mail and telephone to gather their input on the areas they wished to see benchmarked. Eight areas were originally targeted; however, after meetings with the administrative committee and the various process area experts, and after input from the other participating community colleges, ten areas were chosen by the facilitator for the survey. After the key areas were chosen, the facilitator used the input from each expert group to formulate questions for the survey instrument. These area surveys were then distributed to the professionals for refinement and feedback.
Once the personnel at the lead college approved the survey, it was distributed to the other five participating institutions for their input and suggestions. After receiving input from the participating institutions, further improvements were made to the survey instrument.

Content Areas
The ten educational areas chosen for the study were:
• business, or the processes involved in the finances of an institution;
• computer technology, or the information systems utilized by the institution;
• developmental instruction, or the remedial pre-college program of the institution;
• distance education, or the instruction delivered via technology;
• institutional effectiveness, or the measurement, resource, and planning area of a community college;
• instructional, or the educational programs of the institution;
• outsourcing, or the process areas which are contracted outside of the institution's operation;
• personnel, or the administrative, faculty, and staff employees of the institution;
• scheduling instruction, or the arrangement of educational classes at the institution; and
• workforce education, or the continuing education and customized training programs of the institution for business and industry and the community (see Appendix A for a copy of the survey instrument).
Following Cook's (1995) warning that the outcome of the benchmarking process is "only as good as the preparation which goes into it" (p. 23), and adopting the idea of continuous improvement, the preparation was detailed, and results were expected that would identify areas for improvement.

Pilot Study
After the survey instrument was finalized, a pilot of the instrument was conducted at the lead college. Each of the ten key area surveys was distributed to the division chair or the administrator primarily responsible for that process area. As these surveys were returned, further refinements were made to the area surveys to remove any ambiguities.

Data Collection
The revised survey instmments were then sent by mail and e-mail to the
designated contact person at each community college to be distributed, completed and
retumed for analysis. During this collection period, responses and correspondence were
handled by mail, e-mail, FAX and telephone in order to keep costs at a minimum and for
the convenience of participants which is recommended by Tucker (1996) and Leibfiied
and McNafr (1992).
As expected some colleges finished the surveys and retumed them promptly.
Others had to be contacted numerous times and encouraged to complete the project. All
six institutions eventually completed the survey instmment.

Data Analysis
After the information was returned, each college was assigned a code number; this method was chosen for reasons of legality, exchange, and confidentiality. The raw quantitative data was entered into a spreadsheet, and the qualitative data was entered in narrative format using a word processor. If there was a question about a community college's response, the contact person was contacted for clarification. Next, the range, median, and mean were calculated for each item on the survey requiring a numerical response. When the responses sought were descriptive in nature, narrative responses were recorded with indications of how many institutions gave like narrative responses. The data was then analyzed to determine trends: if at least four community colleges reported similar responses to a survey item, a general trend in mid-sized community colleges in Texas was noted. Each participating institution then received a final copy which included these conclusions. Next, the lead college had the researcher perform a gap analysis of its institution using the raw data; a gap analysis identifies the performance of the community college doing the analysis and makes comparisons with the other benchmarking partners. Cook (1995) defines this gap analysis process by explaining, "Where the internal standard is higher than the target performance this is termed a 'positive' gap. Where the performance levels in place in an organization are lower than the target performance...this is called a negative gap....there are 'pockets of excellence' within an organization...build on what is good" (p. 123). The other participating institutions were encouraged to do their own gap analyses.
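The analysis steps described above (summary statistics per numeric item, a trend flagged when at least four of the six colleges respond alike, and a signed performance gap) can be sketched as follows. This is an illustrative reconstruction, not the spreadsheet work actually performed in the study, and the example values are hypothetical.

```python
# Illustrative reconstruction (not the study's actual spreadsheet) of the
# analysis described above: range/median/mean per numeric survey item, a
# "general trend" when at least four colleges respond alike, and a signed gap.

from collections import Counter
from statistics import mean, median

def summarize(values):
    """Range, median, and mean for one numeric survey item."""
    return {"range": (min(values), max(values)),
            "median": median(values),
            "mean": mean(values)}

def general_trend(responses, threshold=4):
    """Most common response if at least `threshold` colleges agree, else None."""
    answer, count = Counter(responses).most_common(1)[0]
    return answer if count >= threshold else None

def gap(own_value, target_value):
    """Positive gap: own performance exceeds the target; negative: it lags."""
    return own_value - target_value

item_values = [7016, 9319, 10200, 10377, 6887, 5561]  # example numeric item
print(summarize(item_values)["median"])                # 8167.5
print(general_trend(["yes", "yes", "no", "yes", "yes", "no"]))  # yes
```

In a gap analysis, comparing the lead college's value for an item against the partners' mean (or best value) yields Cook's positive or negative gap directly from the sign of `gap()`.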

Follow-Up Survey
Two years later, a follow-up survey was prepared to determine the use of the benchmark data in each participating institution, in order to evaluate the design and methods of peer benchmarking in higher education. This survey instrument asked about study procedures, data presentation, and usage or results. This descriptive data was recorded and analyzed, and conclusions were drawn about the usefulness of a benchmark study in peer institutions in higher education.

Summary
This chapter examined the methods and procedures of a case study in benchmarking among mid-sized community colleges in Texas, including a discussion of the background, the sample selection, the instrumentation (which consisted of the survey development and the selection of content areas), the administration of a pilot study, the collection of data, the analysis of the data, and the administration and analysis of a follow-up survey on the process of conducting a benchmark study.

CHAPTER IV
FINDINGS

This chapter will report and analyze the data derived from the two questionnaires, the original benchmark study and the follow-up study. All six participating community colleges completed and returned the original benchmark study, and four (66%) of the participating institutions responded to the follow-up survey.
The basic research question was: "What are the conditions necessary for benchmarking to be used by mid-sized community colleges in Texas to improve community college operations?" In order to answer this question, six sub-questions were posed.

Sub-Question 1
1. What planning steps should be taken by the lead college in beginning a benchmark study for an educational institution?
Four basic steps were taken by the lead college at the beginning of the benchmark study: recognizing the need for a comparative study; providing the essential study elements, leaders and money; establishing study guidelines, such as selecting the parameters, the type, and the partners for the benchmark study; and assuming the task of overseeing the design and pilot study of the survey instrument. The first basic step was to recognize the need for conducting a comparative study. The president of the lead college had already come to think that a benchmark study might provide meaningful data for continuous improvement at his community college.

Essential Elements
The second basic step was to provide the essential elements needed for such a study: leaders and money. Therefore, the president of the lead college decided to form an ad hoc Resource Management committee, establish funding for the development, distribution, collection, and analysis of the survey instrument, and name a dean of institutional effectiveness to explore the feasibility of conducting a benchmark study and the hiring of a researcher. The president had already set aside budget monies to cover the cost of hiring a researcher and providing work space and equipment, such as a computer with an e-mail account, postage, telephone expenses, and visits to the various campuses if they were deemed necessary. Because the means of communication proved sufficient to provide all the information needed, no campus visits were ever made.
Once it was determined that a benchmark study appeared feasible, the Resource Management committee hired a facilitator to administer the study, because it was determined that a facilitator would increase objectivity, help to maintain focus, keep the project on track by adherence to a time line, furnish benchmarking research skills, and provide the confidentiality necessary to gain support from other mid-sized community colleges in Texas. Confidentiality was believed to be critical because the institutions would be sharing processes and quantitative data that might be sensitive. Adherence to a time line was thought necessary in order to have the results when the budget for the next biennium needed to be drafted. The Resource Management committee decided to seek a researcher with knowledge of community college operations, statistics, group communication skills, and a knowledge of the benchmarking process, because they believed such traits would give them access to the information they deemed necessary to make decisions about what data was needed, how it could best be collected, and how potential findings could be implemented to bring about improvements. Expertise in collecting and analyzing the data was necessary in order that the findings be presented in a usable form for making decisions about how changes could best be accomplished.

Study Guidelines
The third basic step taken was to select the parameters of the study, select the type of benchmark study, and choose the benchmark partners to be included in the study.

Parameters of study
One guideline established by the Task Force at the lead college was to select the parameters of the study. Since the creation of the Resource Management Committee, its charge had been to discover ways to better utilize resources at the lead college; the Resource Management Committee now assumed the role of a Task Force to supervise the study. In order to select the parameters of the study, the Task Force needed to better understand the technique of benchmarking and its function in this study. The facilitator was asked to research and report to the Task Force members on the qualifications needed to be a member of a benchmarking team, definitions of benchmarking, how to start the benchmarking process (including how benchmarking relates to the Malcolm Baldrige National Quality Award), how benchmarking can be used in school improvement, and the different types of benchmarking studies. First, the Task Force considered material from Spendolini (1992) to review the characteristics needed by a quality benchmarking team. The Benchmarking Book lists the qualifications for members of a benchmarking team as:
• functional expertise and a demonstrated level of job skills or work-related performance;
• sufficient credibility in the institution, as judged by subject-matter knowledge, employment history, and the level of position(s) held;
• above-average communication skills, in order to communicate well with other team members;
• a high level of team spirit, including a sense of cooperation, effective listening skills, an ability to reach a consensus, and respect for the opinions of others. (p. 54) (See Appendix B for handout.)
Second, after reviewing and discussing several definitions of benchmarking (see Appendix B), the facilitator's recommended definition of benchmarking was accepted. Benchmarking, for the purpose of this study, was defined as the TQM technique used to facilitate continuous improvement by analytically comparing ongoing processes in an effort to manage change. This definition satisfied the purpose of this study, which was to evaluate the usefulness of the process of benchmarking in mid-sized community colleges in Texas.
Third, the research material demonstrated the use and importance of benchmarking by presenting its use in the Baldrige Award (see Appendix B) as an approach for learning what other organizations are doing, getting new ideas to help drive improvement, and improving planning and institutional performance (Spendolini, 1992). Next, the Task Force examined the seven points that Belmont University's Quality Team (see Appendix B) stated should be addressed before beginning a benchmark study, as listed by Alsete (1995):
• Is there already a focus in your work area or department around service, employees, and continuous improvement of processes?
• Is benchmarking the right strategy in this situation?
• What should you benchmark?
• What should you measure?
• What organization(s) should you benchmark?
• How should you collect data?
• How can you implement what you learned? (p. 63)

Fourth, in order to better understand the value of benchmarking to educational institutions, the Task Force scrutinized Tucker's benchmarking benefits (see Appendix B). She describes the following educational benefits that come from conducting a benchmark study:
• Can be used to rapidly address specific strategic goals of an institution.
• Can be used to prevent "wheel reinvention" and to learn how other organizations have addressed the issues and problems one is facing.
• Can be used to improve curricula, instruction, learning outcomes, and programs faster and more effectively when adjusting to the higher standards of achievement and accountability required by both state and federal law.
• Can more efficiently implement federal and state mandates.
• Can increase the benchmarking team's understanding of quality principles and tools.
• Can be used to practice one of the fundamental improvement strategies required in the Malcolm Baldrige Quality Award criteria involving quality assessment. (p. 8)

Type of study
Next, the Task Force considered the differing types of benchmark studies to establish another guideline for the study. Combining the research on benchmarking, the facilitator presented six basic types of benchmarking studies that can be conducted depending upon the data needed: internal, competitive (including peer), industry, non-competitive, best-practice, and functional/generic. The Task Force concluded that information from other similar community colleges was needed to make operational comparisons and encourage improvements in resource use; thus, it decided to conduct a peer-institution benchmark study. This choice furnished the Task Force and the institution with data that has been used in several different ways to bring about improvements; the study was still being referenced in the lead college's annual budget hearings two years later. Peer benchmarking is defined as the process of analytically comparing peer organizations in order to discover current trends and to show institutional gaps in performance where improvement can occur. This definition combines business's definitions of competitive benchmarking (Spendolini, 1992), measuring against two or three direct competitors with the same customer base, and industry benchmarking, looking for general trends across a larger group of related firms with similar interests and technologies (Leibfried & McNair, 1992). The goal of both is to provide external standards for internal measurement and institutional improvement. This type of benchmark study was chosen because the Task Force desired external standards for internal measurement and institutional improvement.
Thus, the Task Force understood the qualifications for a benchmarking team, how to start the benchmarking process, the definitions of benchmarking, how benchmarking is used in the Baldrige Award, how to benchmark for school improvement, and the types of benchmarking studies that can be conducted to facilitate continuous improvement at their organization. The other participating colleges were not provided this information.

Partners for the study


The last guideline set by the lead college Task Force determined which peer institutions would be selected to participate in the study. In order to select the colleges to be studied, statistics were gathered on student headcount, semester contact hours generated, community population base (including both the city size and the surrounding county sizes), and appropriations to determine the amount of money available for operation. Institutions that fit the following criteria were considered peer institutions: similar student headcount, between 5,000 and 11,000; similar contact hours, between 2.5 million and 5.5 million; similar appropriations, between $9.5 million and $13 million; and similar service area populations, between 70,000 and 210,000. The researcher provided the Task Force with data on peer institutions falling within these guidelines. It was determined that six peer institutions besides themselves would be asked to participate in the study; one institution declined to participate, resulting in six institutions participating fully in the benchmark study.
The president of the lead college then contacted the presidents of these institutions
and obtained the cooperation of the other five community colleges for the study. Each of
the presidents agreed to participate in the study and to appoint a contact person
responsible for conducting the study on their campus. The names of the contact persons
were then given to the researcher; however, no training was done to prepare the contact
persons for participating in the study and no Task Force was appointed to support the
study at any of the peer institutions except the lead college. It was recommended that the
contact persons at the participating community colleges select key personnel to be
involved in designing and completing the survey instrument. All of the schools used
principal personnel in the areas covered by the study. It was also recommended that the
contact person take the necessary steps to prepare and educate their institution's
participants in benchmarking and the benefits of such a study. Whether this was done or
not is unknown.

Designing the Instrument


The fourth basic step taken by the lead college was assuming the task of overseeing the design of the survey instrument. Early in the process, before the researcher was hired, the Resource Management committee had held several meetings to brainstorm and discuss which areas at the lead college they felt should be compared to other institutions. After the facilitator came on board, she gave the Task Force NACUBO's functional areas list of possible topics to include and Peterson's (Taylor et al., 1993) AGB Strategic Indicators major headings list (see Appendix C). With her, the Task Force discussed and edited its original list of concerns from the brainstorming sessions; this edited list was later used to develop the sections of the benchmark instrument. For example, using the edited list, the outsourcing section of the study was designed almost exclusively by the Task Force over a period of about a month. After this stage, the Task Force also requested that knowledgeable personnel provide input into the various survey areas for further development.
The facilitator took the list of potential items developed by the Task Force to the expert personnel in each designated area, which included both instructional division chairs and administrators. The key personnel were asked to add components to the survey that they felt were needed in order to improve operations in their areas. This expanded rough draft of questions was then taken back to the Task Force for evaluation, and changes were made. As the last planning step before the pilot study, the key personnel from each functional area, of which there were by now ten, were asked to further refine the questions in their section of the survey instrument. Therefore, four major planning steps were taken by the lead college in beginning the benchmark study in mid-sized community colleges in Texas: recognizing the need; providing the essential study elements, leaders and money; establishing the parameters, the type, and the benchmark partners for the study; and assuming the task of overseeing the design and pilot study.

Sub-Question 2
2. Can a benchmark study be designed to ensure that each of the participating institutions feels included and derives information useful in making system improvements?
The facilitator and the Resource Management Task Force of the lead college began developing the survey instrument. As discussed above, the Task Force began by brainstorming functional areas at their college that were aligned with the institutional mission statement and could possibly be improved. Later, with the facilitator, they discussed which areas should be measured in order to generate comparative performance data that might be used to bring about institutional improvement. The facilitator also met with the executive staff of the lead college to brainstorm and select areas that they determined would be beneficial to benchmark. As a result of these brainstorming sessions, the facilitator recommended that the requests be grouped into eight areas of community college operations for the benchmark study.

Once the content areas to be benchmarked were designated, the key personnel at the lead college were identified by the executive staff and the Task Force, and these area experts were asked to develop the scope of the survey. These knowledgeable people in each designated area at the lead college were selected to work with the facilitator in developing concerns and questions. The other five institutions were also asked, by e-mail and letter, to have their selected key personnel take part in the survey development; several revisions of the survey instrument came from this input. Because the list of pivotal personnel from the lead college included division chairs, meetings were held with these leaders to brainstorm specific processes that should become survey items. Additionally, some of the selected key personnel from the lead college were interviewed one-on-one to determine what information would be helpful in making comparisons of operational processes for improvement. Throughout the entire development of the instrument, the other participating institutional leaders were continually consulted by e-mail and telephone to gather their input about the departments, activities, and processes that they desired to be benchmarked. Input was sought from all the participants because the instrument was intended to benefit all participating community colleges. Except in one instance, when it was recommended that scheduling of instruction not be included in the instructional area but be a separate section, the areas of information sought by participants matched those presented by the lead college.

Functional Areas
After meetings with the administrative committee, the division chairs, and the various process area experts, and after gaining input from the other participating community colleges, the facilitator decided on ten functional areas for organizing the questions so as to make the data useful for making system improvements in all of the participating institutions. Following this detailed process yielded the following areas:
• business, or the processes involved in the finances of an institution;
• computer technology, or the information systems utilized by the institution;
• developmental instruction, or the remedial pre-college program of the institution;
• distance education, or the instruction delivered via technology;
• institutional effectiveness, or the measurement, resource, and planning area of a college;
• instructional, or the professional educational programs of the institution;
• outsourcing, or the contracting to outside sources of services that the college normally performs;
• personnel, or the administrative, faculty, and staff employees of the institution;
• scheduling instruction, or the arrangement of educational classes at the institution; and
• workforce education, or the continuing education and customized training programs of the institution for business and industry, and the community.
The rationale for selecting these areas was established by each functional area, and
each area was then allowed to design its own section of the survey instrument. The
business area believed that it needed to know how other mid-sized community colleges
in Texas distribute their funds, attempt to cut costs, prioritize their spending, allocate
money for travel, use purchasing groups, and deal with early retirement issues (see
Appendix A). The computer technology group determined that it needed to
understand how other community colleges are using mainframe computer technology,
using personal computers on campus, paying computer-related personnel, establishing
computer usage policies, budgeting for computer technology, training employees to use
computer technology, and servicing their computer technology (see Appendix A).
The developmental education area sought to ascertain what other mid-sized community
colleges in Texas are using to place students into developmental education, how they are
teaching developmental classes, and how they are funding developmental classes (see
Appendix A). The distance education area desired to appraise what mid-sized
community colleges in Texas are offering by distance education, what technology they
are using to deliver distance education, and how they are staffing, budgeting and
equipping distance education departments (see Appendix A). How other institutions deal
with institutional grants and operate foundation offices was important to the institutional
effectiveness domain (see Appendix A). The instructional sector requested information
on how other mid-sized community colleges in Texas compare to each other in cost per
contact hour spent on instruction, classify instructional administrators, figure load hours,
staff instruction in classrooms and in labs, govern instruction-related travel, provide
instructional equipment, handle instructional purchasing, and provide instructional
professional development (see Appendix A). Administrators wanted to determine what
services are outsourced and what types of agreements govern services that are
outsourced, sold or provided to other mid-sized community colleges (see Appendix A).
Personnel departments sought to comprehend how other community colleges in Texas are
defining personnel levels, using part-time employees at each level, and evaluating each
employee group (see Appendix A). A new and growing area for community colleges,
workforce education, aspired to know how other mid-sized community colleges in Texas
are offering workforce education, workforce training and services, and organizing for
workforce development (see Appendix A). These areas were considered essential by
those involved in this benchmark study when the instrument was developed.
Following the selection of these functional areas, the facilitator used the input
from each expert group at the lead college and suggestions from the other colleges to
draft questions for the survey instrument. Then another round of meetings and interviews
occurred at the lead community college to allow the experts in each area to review and
refine the survey instrument. The survey questions were then revised to reflect the added
input and distributed to the professionals of the lead college for further refinement and
feedback. Once the personnel at the lead college approved of the survey, it was
distributed to the other five participating institutions for their suggestions on what
basic process areas they would like to have included in the study. After receiving
recommendations from the participating institutions, further improvements were made to
the survey instrument.

Internal Baseline
Furthermore, the Task Force agreed to establish their own institution's internal
baseline performance measures after the instrument was finalized but before it was sent
out to participating institutions. So a pilot study was conducted using the newly designed
survey instrument within the lead college. Each area was asked to record the survey
information and then comment on the survey instrument itself in the areas of clarity and
ease of completion. Again after the internal pilot study, revisions were made to the
survey instrument.
Using input from all participants several times during the formatting process and
designing a survey instrument that utilized this input allowed the participating institutions
to take ownership of the process and derive information that they deemed necessary to
make system improvements. Leibfried and McNair (1992) indicate that an
understanding of the benchmarking questionnaire is important in gaining support for a
study; therefore, much effort was expended in helping all of the participating institutions
understand what was being benchmarked, who was to be involved, and what would be the
value of the study.
In the follow-up study, the participants were asked to evaluate the method used to
construct the survey instrument, that is, allowing the lead college to organize the
preliminary survey while allowing all participants to give input and make changes to the
survey instrument so that it included their areas of concern and need (see Appendix E).
Four respondents (66%) indicated that they felt incorporated into the process and that
their concerns were included in the final survey instrument, and two (34%) failed to return the
follow-up survey because the original contact person was no longer employed at that
community college. Seventy-five percent of the responding institutions disclosed that
they derived information from the benchmark report that was useful in making system
improvements. One follow-up survey states that their institution liked having one college
organize the survey questions with input from others. Another commented that it was "a
lot of work by the main college."

Sub-Question 3
3. What time period is reasonable for conducting a benchmark study in community
colleges?
The original projected time period of the study was six months (see Appendix F),
beginning with the hiring of the facilitator in April and ending with the final report to the
peer institutions in September. Afterwards, the facilitator was to continue working with
the lead college's Task Force to help facilitate the use of the data and the change
processes at this institution during the school year. Four of the projected six months were
designated for the creation of the survey instrument, one month for participating
institutions to complete and return the instrument, and one month for analysis and
distribution of the data. These predictions were made based on the professional judgment
of the facilitator and discussions with the designated contact persons from the
participating community colleges.
The actual time period taken to complete the study was one year. Because the
instrument design meetings fell during the summer, it was difficult to carry out the work
in a timely fashion due to vacations and summer school demands; thus, it took longer
than anticipated to finish the survey instrument. The time needed for institutions to complete
the survey was also longer than anticipated because this stage occurred at the
beginning of the school year. Because the institutions involved aspired to give accurate
and complete information on the survey, the time constraints of beginning a new
semester caused the reporting stage to also take longer than expected.

Sub-Question 4
4. What procedures can be used to collect, analyze and distribute benchmark data so that
it can be utilized by individual institutions to improve processes in community colleges?
A hard copy and an e-mail copy of the survey instrument were sent to each
participating institution for completion. A cover letter was also included explaining the
time frame for response and noting that any questions or responses could be e-mailed,
faxed, telephoned or sent by mail. During the data collection, correspondence between the
participating institutions and the researcher took place often to clarify or explain desired
responses on the survey. All six institutions eventually completed and returned the
survey instrument.
When the raw data was received, it was recorded in the computer question by
question, by the college numbers assigned, to produce a raw data report. Then both
quantitative and qualitative reports were generated from the raw data. The Task Force
and the facilitator decided that calculating the range, the median, and the mean
for numerical responses would supply the necessary information to make comparisons,
and that a compilation of the narrative responses and an established trends list would aid
in comparing the institutions. The researcher recorded both the quantitative and the
qualitative data and then did a trends analysis of the responses. These analyses made up
the final report (see Appendix D); a copy of this final report was then sent to each
community college.
This final report was reviewed by the Task Force at the lead college. Afterwards,
the lead college had the data further analyzed by the researcher using a gap analysis
technique, which takes the quantitative data from the peer institutions and then compares
the college's response to the others (Table 4.1). When such an analysis indicates an
institution is located at the top or bottom of the range, or is well above or below the
average or median, then a "gap" in performance exists between that institution and the
others surveyed. The purpose of such an analysis is to determine those areas in which an
institution may fall below other institutions and thus need to make improvements.

Table 4.1. Gap analysis example from the developmental area of the survey instrument

Area        Full-Time Instructors    Part-Time Instructors
Writing     R = 54%-100%             R = 1.7%-46%
            Avg. = 77.6%             Avg. = 25.5%
            M = 80%                  M = 28%
            LCC* = 65%               LCC* = 35%
            GAP = -15%               GAP = +7%
Reading     R = 5.6%-100%            R = 12%-94.4%
            Avg. = 61.4%             Avg. = 46%
            M = 69%                  M = 38%
            LCC* = 88%               LCC* = 12%
            GAP = +19%               GAP = -26%
*LCC = Lead Community College
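To illustrate the computation behind Table 4.1, the following Python sketch derives the range, average, median, and gap (the institution's own value minus the peer median, the sign convention implied by the Writing row) for a single survey item. The peer percentages below are hypothetical, chosen only so that the summary reproduces the full-time Writing column; they are not the actual survey responses.

```python
from statistics import mean, median

def gap_analysis(peer_values, own_value):
    """Summarize peer responses for one survey item and compute the
    performance gap between one institution and its peers (Table 4.1)."""
    peer_median = median(peer_values)
    return {
        "range": (min(peer_values), max(peer_values)),
        "average": mean(peer_values),
        "median": peer_median,
        "own": own_value,
        # Gap convention implied by the table: own value minus peer median.
        "gap": own_value - peer_median,
    }

# Hypothetical peer responses (% of writing courses taught by full-time
# instructors), chosen to reproduce the Writing column of Table 4.1.
peers = [54, 66, 80, 88, 100]
result = gap_analysis(peers, own_value=65)
print(result["range"], result["average"], result["median"], result["gap"])
# (54, 100) 77.6 80 -15
```

A negative gap flags a process where the institution sits below its peers' typical practice and may warrant further attention; a positive gap may indicate a relative strength.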

The gap analysis (see Appendix G) was then given to each Task Force member in a
comparison notebook to review before a meeting.

Then, the Task Force assessed the gap analysis to determine areas where a gap
existed between their institution and the peer institutions. Cook's (1995) gap decisions
were followed: having identified "both the size of gap in performance and potential causes,
the next step is to identify and prioritize areas of change and to draw up a plan for
improvements..." (p. 127). Then the lead college had to decide if it should seek to
match or better the peer institutions' practice, and if the resources and capability to
achieve the desired improvement existed at their college. Several change options for
improvement were considered: potential changes were brainstormed, scenarios for
possible future versions of a process were developed, alternatives for improving a
process were discussed, and target areas were selected for future improvements (Cook,
1995). Each area where a gap was found between the lead college and the other peer
institutions was then considered by the Task Force for possible improvements; finally, the
Task Force targeted and prioritized the gap areas that indicated their college was not
performing as efficiently as others. A copy of the benchmark data, including the raw data,
the quantitative and qualitative reports, and the gap analysis for each area, was then sent to
the appropriate division chairs and administrators. The various area experts then
reviewed the benchmark data, especially the gap analyses and the trends lists, which
show whether the institution is addressing each process like the majority of other institutions or
whether it is falling behind in making operational changes. For example, an administrator from
the lead college reported, "The raw data and the gap analysis was a starting point to
target our weak processes." Only at the lead college was the final report further analyzed
by using the gap analysis technique.
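The trends-list comparison described above, checking whether an institution handles each process like the majority of the surveyed colleges, can be sketched as follows. The survey item, the college labels, and the responses are hypothetical illustrations, not the actual study data.

```python
from collections import Counter

def trends_flags(responses):
    """For each surveyed process, find the majority practice among the
    institutions and flag the institutions that differ from it."""
    flags = {}
    for process, answers in responses.items():
        # Most common answer across institutions is the "trend".
        majority, _ = Counter(answers.values()).most_common(1)[0]
        differing = sorted(c for c, a in answers.items() if a != majority)
        flags[process] = {"majority": majority, "differing": differing}
    return flags

# Hypothetical yes/no responses from the six colleges on one survey item.
responses = {
    "division chair classified as administrator": {
        "College 1": "yes", "College 2": "yes", "College 3": "yes",
        "College 4": "yes", "College 5": "yes", "College 6": "no",
    },
}
print(trends_flags(responses))
# College 6 is flagged as differing from the majority practice.
```

An institution appearing repeatedly in the differing lists is, in the language of the study, falling behind the operational changes the majority of its peers have already made.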
Even though a gap analysis was not done at all of the institutions, the final report
was analyzed for comparison purposes. One institution which used only the final report
stated in the follow-up survey that the results were canvassed to compare "our
performance to see if we were on the right track." Another follow-up survey comment
said, "We are still referencing the study's benchmark report in our budget hearings this
year regarding a technology fee." Thus, the other community colleges did evaluate the
final report and use the data to make institutional changes.

Distribution of the benchmark final report was different for each participating
institution. Two colleges sent copies of the report in each area to key personnel and
administrators, and one community college circulated the sections of the material to the
persons who completed the survey. When the complete survey results were distributed at
the lead college, a cover letter (Appendix H) explained the locations of each of the five
complete copies of the entire final report and the gap analysis and then the areas were
sent a copy of the section of the survey which applied directly to their operations.
Another institution reports having the raw data, the summary report, and the trends lists
placed in a central location, the institutional effectiveness office, and sending memos to
the various divisions announcing the location of the data. The fourth institution
responding does not recall what was done with the final report but reports that it did not
distribute the final report.

Sub-Question 5
5. Can the benchmark information be used at individual institutions to improve
processes in community colleges?
The follow-up survey revealed that colleges utilized the benchmark data in several
areas, including business affairs, computer technology, distance education, institutional
effectiveness, instructional, outsourcing, personnel, and scheduling instruction. Basically,
the benchmark data was used to spur further research in several areas, to
compare technology equipment and usage, to improve existing processes, to upgrade
weak areas, and to develop new services.
The lead college's Task Force, division chairs and executive committee served as
teams for reviewing the final benchmark report, including the trends list and the local gap
analysis. By using this information, experts in the various areas at the lead college began
to identify areas where improvement was needed. The data was further examined to
determine what elements were making other institutions more effective than the lead
community college. Then plans for change, which included both further study
recommendations and definite instructions with proposed time lines, were developed at
the lead college.

One result of this review was a call for further research. The Task Force
appointed a subcommittee to develop a survey instrument and to survey each employee
group, including administrators, faculty and classified staff, about the performance gaps
found. These internal surveys asked for specific input regarding each area where a gap
was identified as to what improvements should be made and how these improvements
could be made. Following the results of this internal survey, the Task Force
recommended two further studies, one in the area of instructional administration and one
on retirement options. Thus, data from the instructional section of the benchmark study
became the starting point for two different committees. An ad hoc instructional
administration committee was created because the benchmark data showed that all five of
the other community colleges considered the division chair position as an administrator
instead of faculty employed under a 12-month contract with 100% release time; thus, this
committee was to conduct further study on compensation formulas and duties of division
chairs. Another ad hoc committee was appointed to consider retirement options. This
retirement subcommittee is still looking at various retirement options, such as plans to
allow retiring faculty to serve in master teacher positions part-time. Another community
college, after reviewing the benchmark data and conducting further research, instituted a
change through a committee that recommended replacing full-time faculty with part-time
faculty.
Another way the benchmark study was used was to compare technology
equipment and usage with other peer institutions. Two institutions used the computer
technology section of the benchmark study to begin improvements in computer utilization
and efficient usage of computer labs. When comparing the technology data at one
community college, the need for additional faculty computers became apparent, and in
the next budget period all faculty and staff were approved for "the purchase of high-end
computers." One institution used the benchmark study information to allocate more
funds to develop new programs in technology.
Several institutions used the benchmark data to make improvements to existing
processes; for example, the survey data was used at one community college for updating
their institutional strategic plans. Another community college reports that, as a result of
evaluating the institutional effectiveness data, a new strategic plan which better "closes
the loop on all ventures of the college from intent to result" was developed. One
community college used the institutional effectiveness time lines to establish a local
"time line for operation, planning and publishing" of their strategic plan; the results of the
study also spurred the development of a new institutional effectiveness planning process.
After examining the benchmark data regarding costs, one college determined that by
upgrading the climate control system in its heating and air conditioning plant, it could
operate as efficiently as the peer institutions. Still another institution used the benchmark
information to improve operations in summer school by considering all summer
school offerings together as one semester in figuring academic standing.
Some institutions used the benchmark study to upgrade weak areas of operation.
The business section of the benchmark survey was used to upgrade one institution's
professional development. It reports that the survey highlighted their "need to increase
investment in employee development." This community college declares that, as a result
of the benchmark survey, it has made "substantial investment in professional
development." Two institutions tell of using the benchmark data to make compensation
and salary changes; for example, one institution updated its faculty salaries as a result of
the survey. Another reported it made compensation changes for division chairs and
department heads based on recommendations from an instructional administration
committee which was formed as a result of the benchmark comparisons. One institution
used the survey to redefine full-time faculty because their Vice President of Instruction
questioned the accuracy of their existing definition after reviewing the benchmark study.
Another college reports that the data arrived as a subcommittee was being formed to
investigate a tenure and faculty salary schedule; thus, the data was used by that
committee to bring about improvements in personnel salaries.
Finally, institutions can use the benchmark information to develop new services
and processes. One community college felt the need to investigate Internet-based
instruction. An institutional effectiveness office at one institution recognized the
importance of establishing a time line for its institutional effectiveness activities. Another
institution was encouraged by the benchmark data to plan and offer an Internet course.
One year after the study results were received, it offered its first Internet course. At the
time of the follow-up study, the college was offering twenty Internet courses.

Sub-Question 6
6. What are the characteristics of a college that utilizes benchmark data for process
improvements?
Participating colleges that utilized the benchmark data seemed to have common
characteristics: a president and a contact person who supported the benchmark process
and entered into it with plans to use the information generated; a willingness to invest
time and perhaps funds into the benchmark study process; a contact person and other
expert personnel, perhaps in the form of a committee or team, who are knowledgeable of
the benchmark process and take an interest in designing the survey instrument; a definite
interest in applying the benchmark data to their institution, such as doing a gap analysis; a
definite plan for distribution and presentation of the benchmark data to the various areas;
a culture which embraces the TQM philosophy; and an understanding of how to utilize
the benchmark data to encourage further research, create new committees to plot
necessary changes, and make adaptations to create institutional improvements.
The follow-up survey instrument asked if each participating institution would
participate in another peer institutional benchmark study, and 75% of those responding
replied yes. They were also asked if their institution would consider sponsoring a peer
institutional study; 50% of those responding replied yes. One institution responding
to the follow-up survey indicated that the contact person was angered by the study and
that his institution "participated only because our President is a friend of your President
and, out of friendship, our President insisted that we participate." Obviously, this
institution did not distribute or use the benchmark data.

Summary
This chapter reviewed the research question: What are the conditions necessary for
benchmarking to be used by mid-sized community colleges in Texas to improve
community college operations? The findings for the six sub-questions answered this main
research question. Findings from question one revealed that four basic steps were taken
by the lead college in beginning the benchmark study: recognizing the need for a
comparative study; providing the essential study elements, leaders and money;
establishing the parameters, the type, and the benchmark partners for the study; and
assuming the task of overseeing the design and pilot study of the survey instrument.
Research question two found that a benchmark study can be designed to ensure that each
of the participating institutions feels included and derives information useful in making
system improvements. Led by the facilitator and the Task Force of the lead college, all
of the participating community colleges were involved in developing the survey
instrument. The lead community college conducted an internal pilot study to test the
clarity, ease of completion, and value of the questions included in the instrument, and
respondents indicated that they felt included in the survey instrument design and that they
derived information from the benchmark report that was useful in making system
improvements. The third research question found that a year is a reasonable period of
time to complete a benchmark study with all participants involved in both the design and
completion of a survey instrument.
Research question four found that sending both a hard copy and an e-mail copy of
the survey instrument generated responses from all of the participating benchmark
institutions. Producing a raw data report containing both quantitative and qualitative
analysis along with a trends list supplied the necessary information for community colleges to
make comparisons, even though this final report was distributed differently at each institution.
Answers to the fifth research question revealed that benchmark data was used in eight of
the ten areas surveyed for five basic reasons: to spur further research, to compare
technology equipment and usage, to improve existing processes, to upgrade weak areas of
operation, and to develop new services. The sixth research question revealed that several
characteristics existed in the three community colleges which made improvements based
on the benchmark report. These included a campus culture which accepts the TQM
philosophy of continuous improvement; support from leadership in time, money and
influence; and a team of knowledgeable persons to oversee the design, collection,
analysis, institutional distribution, review and utilization of the benchmark data to bring
about institutional changes.

CHAPTER V
CONCLUSIONS

The purpose of this chapter is to review the purpose of the study, draw
conclusions from the study's findings, make recommendations about the benchmarking
process in community colleges, and propose future investigations in applying the
technique of benchmarking to the higher education setting.

Purpose of Study
The reason for conducting this study was to evaluate the feasibility of using the TQM
technique of benchmarking to bring about operational improvements in mid-sized
community colleges in Texas. Homogeneous mid-sized community colleges were
selected for this study because they provide the necessary environment and the interest to
examine the benchmarking process in an educational setting. Benchmarking was selected
because of its successful use in leading-edge companies as a way to continuously
improve operational processes and gain competitive advantage.
Community colleges face the challenges of making needed and continual changes
in order to achieve resource efficiency, to meet demands of accountability, to stay abreast
of technology, to utilize an increasing knowledge base, and to provide life-long learning
opportunities. Although quality management processes such as TQM, CQI, and CQA
have already provided the way and the language to meet these challenges, effective
methods of bringing about continuous improvement, such as benchmarking, are still not
thoroughly tested in higher education. Because benchmarking provides a means for
making comparisons that might assist colleges in making needed changes, it was applied
to mid-sized community colleges in Texas to observe the feasibility of its use in
educational settings.
Thus, because this is the first benchmark study of its type to be conducted among
community colleges, it provides valuable information not currently available on the use
of benchmarking in the community college educational setting. It also provides
comparison material for those community colleges which participated in the study and for
any institution or administrator seeking to improve the services of their college.

Conclusions
The question which guided this study was: What are the conditions necessary for
benchmarking to be used by mid-sized community colleges in Texas to improve
community college operations? The conclusion of this study is that benchmarking can be
a valuable tool in making continuous improvements in a community college setting.
Conclusions are drawn by answering the six sub-questions that support this main
conclusion.

Sub-Question 1
1. What planning steps should be taken by the lead college in beginning a benchmark
study for an educational institution?
Seven planning steps should be taken by a lead college in beginning a benchmark study.
a. Recognizing the need and the potential benefits of a comparative study is the first step
toward a successful study. The lead college in this investigation had administrators and
other personnel who acquired a good understanding of benchmarking, its processes and
its benefits, and thus realized that their institution needed comparative data to
continue to make operational improvements. A good understanding would include
what a benchmark study should entail, how the study should be conducted, how the
benchmark data can be utilized, and how improvements can be made from the data, as well
as an understanding of the cost in time that is required to produce usable data. A
superficial understanding of benchmarking seems a sure guarantee of frustration and
failure. Therefore, failure to allow sufficient time to adequately train personnel on
benchmarking, design the instrument, complete the survey, and utilize the benchmark
data is one of the major reasons for failure. The only community college that reported
not using the benchmark data to make improvements participated only because their
president required them to do so due to his friendship with the president of the lead
college. The implication from this study is that involved personnel and administrators
must understand and believe in the benchmarking process and outcomes for an
institution to use the data in making operational improvements.
b. Providing the essential elements of money and leaders is the second necessary
planning step a community college must take in beginning a benchmark study.
Financial support is needed to cover study expenses. Money in this study was
provided by the lead college for hiring a researcher and providing work space, technology,
copying, postage, and telephone service. Leaders such as a supportive president, a Task Force
to serve as the benchmark team, and key administrators and personnel to help design,
complete, and evaluate the survey instrument are necessary for an institution to derive
benefits from a benchmark study. Two major criteria were used in selecting the Task
Force at the lead college: peer respect and belief that improvement was possible.
Task Force members were selected because they were respected by their peers in the
institution; this allowed them to lead others participating in designing the instrument,
answering the survey, and examining the data to find areas in which to make improvements.
They believed that improvements could be made by identifying areas where funding
could be improved; this led to a committee that was eager to evaluate the data and
move on to adapting practices from other community colleges to improve their own
operations. It appears that the amount of change and the number of improvements that
occur from a benchmark study are in direct relationship to the belief of the
institution's leaders in the value of the benchmark data.
c. Hiring an outside researcher works well when conducting a benchmarking study. The
choice by the lead college to employ a facilitator to research and educate the Task
Force is believed to be one reason for their successful utilization of the study. Each
college needs a researcher or an appointed contact person to help plan and conduct the
local benchmark study. Such a researcher should have knowledge of the educational
environment, statistics, the benchmarking process and group dynamics. This kind of
person can provide the objectivity, adherence to a time line, and breadth needed for a
successful study. The result of this choice was that Task Force members understood
the qualifications they needed to be effective benchmarking team members, the
scope of the benchmarking process, the steps necessary to start the benchmarking
process, and how benchmarking data could be used in school improvement. If a
researcher is not hired to oversee the entire benchmark process, a consultant in the
process of benchmarking or a facilitator from within the campus structure with
benchmarking knowledge should at least be designated to train the Task Force of an
institution embarking on a benchmark study.
d. Selecting, training and working through a Task Force representative of the various
areas and levels of a community college is the fourth planning step necessary for
productive use of the benchmark data. Doing this provides the different perspectives
and the grass-roots involvement needed for designing a benchmark study. The group
process, through brainstorming and discussion, provides valuable support throughout
the benchmarking process. The fact that the lead college had a Task Force to establish
benchmark study guidelines such as the parameters, the type, the partners, the survey design
and the pilot study is probably one of the reasons that the benchmark study proved to be
so useful to their community college.
e. Training the Task Force in the technique of benchmarking and the results that such a
study can produce is the fifth planning step, one that helps ensure peak performance by
the Task Force. The training and direct guidance from the researcher may have been
the impetus for the interest and expectancy that existed at the lead college about
the improvements that could be made from the data. Understanding the value of a gap
analysis, as explained by the researcher, helped the lead college target areas for
improvement. Likewise, understanding and selecting the type of benchmarking
study, a peer institution comparison, helped the committee interpret the
benchmark study data for their institution.
f. Selecting peer institutions, comparable in size and resources, as benchmark partners
and obtaining support from their presidents is the sixth important planning step,
necessary to secure involvement in the survey design and data collection from all of the
community colleges. As shown by the follow-up survey, these choices furnished the
Task Force and the lead institution with data that has been used in several different
ways to bring about improvements. They also provided two other participating
institutions with data that was used for making system improvements. Since the
community college with the most knowledge and training regarding benchmarking
made the most improvements, it appears that more improvements could have resulted
from the data had the peer institutions been better trained and sold on the
benchmarking technique.
g. Involving all of the participating institutions in the survey design and having one of
the community colleges conduct a pilot study is the seventh planning step. Multiple
sources of input produced data that several areas in all three of the colleges that
utilized the study used to bring about system improvements. The pilot study is
necessary to refine the survey instrument through application, and it also helps the
lead college establish internal benchmarks against which to compare the data later.

Sub-Question 2
2. Can a benchmark study be designed to ensure that each of the participating institutions
feels included and derives information useful in making system improvements?
The method of allowing one sponsoring or lead community college to conduct the
study with suggestions from other participants worked well. All six community colleges
were allowed to submit key processes and areas of interest to them, and each institution
was given the opportunity to examine the final questionnaire before being asked to
complete it. Thus, if any of the institutions had a particular area of interest, they were
allowed to include it in the survey instrument; this method of constructing the
questionnaire to contain usable data is efficient because it allows each participating
institution to select the type of benchmark information it needs collected.
Because they were included in designing the instrument, the colleges all cooperated
in completing the survey in order to gain the data they sought for comparison purposes.
Allowing each area to design its own section let it craft questions that produced
the information needed to make equal comparisons. For example, the chart designed
by the computer area to compare positions and pay helped them compare like
situations. The follow-up survey showed that this method of designing an instrument
made participating institutions feel included and allowed them to obtain information
that was useful in making system improvements. The method was lengthy, but it did
produce an instrument that yielded results the participating institutions could use, as
evidenced by the follow-up survey. It appears that the amount of input suggested by an
institution was directly related to the amount of use the data received at each
participating institution.

Sub-Question 3
3. What time period is reasonable for conducting a benchmark study in community
colleges?
A year appears to be sufficient time for the collection, analysis, and use of the
benchmark information. It should be noted that time commitment is important in a
benchmark study because time is needed to educate people throughout the college about
working as teams and about the benchmarking process before beginning the design,
completion, and implementation stages of the study. Sufficient time is also needed for
the benchmark team to engage in meaningful thought, reflection, and dialogue before
overseeing the design and use of the benchmark data. It seems unlikely that a benchmark
study will be used to accomplish improvements if its purpose, meaning, and potential
system applications are not endorsed by a team that has had time to participate in the
team process.

Sub-Question 4
4. What procedures can be used to collect, analyze, and distribute benchmark data so that
it can be utilized by individual institutions to improve processes in community colleges?
It is recognized that the collection, analysis, and distribution of benchmark data
definitely influence the use of the information by individual institutions to make
improvements. Data collection from a hard copy and an electronic copy of the survey
instrument was somewhat effective because of convenience; however, a personal contact
such as a telephone call after the cover letter and questionnaire were received by
participating community colleges might have improved compliance within the desired
response time frame. Keeping open lines of communication between the institutional
contacts during the survey completion period demonstrated that e-mail, fax, telephone,
or mail were sufficient to keep responses unified on the questionnaire. Contacting the
presidents of each institution to assure the return of the survey instrument did result in all
six institutions eventually completing and returning the entire survey.
All the colleges received a final report to use for analysis. More institutional
changes were produced from the benchmark data at the lead college than at any of the
others, probably because of the added information provided by the gap analysis,
which aided them in making institutional comparisons. No other participating institution
performed a complete gap analysis; this probably contributed to limited use of the data.
The existence of the Task Force, with the authority to take action or make
recommendations, and the data provided by the gap analysis appear to have given
decision makers the foundation necessary to begin making system changes in several
areas. The fact that the raw data report, the quantitative and qualitative reports, and the
trends lists were used by three community colleges to make system improvements,
however, demonstrates that some changes can be made without an official gap analysis.
Nevertheless, for maximum use of benchmark data, a gap analysis is essential.
How the complete survey results are distributed also influences the use of the
benchmark data. Placing the data in easily accessible locations and notifying key
decision makers of the location of the information seemed to influence the amount and
quality of improvements arising from the benchmark final report; the community
college that sent key persons a copy of the data for their areas made the most system
improvements. Whether the data is used as a starting point to target weaknesses or to
check whether local performance is on the right track, the availability of the data
influences its use. As noted, the college that did not use the data
for improvement did not distribute the results to decision makers.
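The gap analysis referred to throughout this discussion can be illustrated with a brief, hypothetical sketch: each local measure is compared against the corresponding value from the benchmark partners, and the largest shortfalls mark the areas to target for improvement. The metric names and figures below are invented for illustration only and are not drawn from the study's data.

```python
# Hypothetical sketch of a gap analysis: compare local metrics
# against peer-institution benchmarks and rank the shortfalls.
# All names and numbers are invented for illustration.

local = {"course completion rate": 78.0,
         "staff per 1000 students": 4.2,
         "registration wait (days)": 5.0}

peer_best = {"course completion rate": 85.0,
             "staff per 1000 students": 5.1,
             "registration wait (days)": 2.0}

# For "higher is better" metrics the gap is peer minus local;
# for "lower is better" metrics it is local minus peer.
lower_is_better = {"registration wait (days)"}

gaps = {}
for metric, value in local.items():
    if metric in lower_is_better:
        gaps[metric] = value - peer_best[metric]
    else:
        gaps[metric] = peer_best[metric] - value

# The largest positive gaps are the strongest candidates for improvement.
for metric, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{metric}: gap of {gap:+.1f}")
```

In this sketch the course completion rate shows the widest gap, so it would be the first area a Task Force examined for practices worth adapting from the peer institutions.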

Sub-Question 5
5. Can the benchmark information be used at individual institutions to improve processes
in community colleges?
With time and effort expended to evaluate the benchmark data, system
improvements in many areas of college campus operations can be made. Respondents
who used the benchmark information reported process improvements in eight of the ten
areas surveyed: business affairs, computer technology, distance education, institutional
effectiveness, instruction, outsourcing, personnel, and scheduling of instruction.
Evaluating the benchmark data produced changes in 75% of the colleges responding to
the follow-up survey, and 50% of those responding indicated that they would participate
in another benchmark study. Therefore, benchmark data can be used at community
colleges for operational improvements.
When the need for a change is established by the benchmark data, the change is
best planned and executed by the people nearest the weak process area, because
adaptations from one system to another are best made by the people who perform the task
in question. At times more detailed information is needed to outline an effective plan for
change; thus, another committee may need to be appointed to gather more specific
information before the change process is detailed. It seems unlikely that an institution
will achieve any improvements from benchmark data until personnel understand how
to use the information and are given the freedom to explore possibilities for adapting the
practices surveyed.

Sub-Question 6
6. What are the characteristics of a college that utilizes benchmark data for process
improvement?
When benchmarking is actively supported in an institution with a TQM or CQI
culture, it is a beneficial tool in bringing about change. Community colleges with such
cultures are already committed to continuous improvement and are looking for efficient
ways to make system improvements. Although not all of the participating institutions
committed time to educate people throughout the college about working as teams and
about the benchmarking process, an initial commitment of time from administrators and
key personnel was demonstrated by all six of the community colleges in designing and
completing the survey instrument. However, not all of the participating institutions
continued that commitment in reviewing and using the benchmark data. One college chose
not to take the time to make the benchmark data available on campus; three institutions
chose not to commit to doing a gap analysis for their college. These
community colleges might have used the benchmark data more effectively had they been
more committed to it. It seems that the stronger this commitment
was in the community colleges involved, the more the data was used to begin improvements.
Initiative by the president of an institution appears to be a vital characteristic of a
college that successfully conducts a benchmark study and uses the data. Strong
encouragement and high expectations from such a leader create productive results in
organizational cultures seeking continuous quality improvement. An attitude of "What
can we do better?" on the part of both the president and a Task Force is essential in utilizing
benchmark data to make campus improvements. Such an attitude usually exists in
institutions that have adopted TQM fundamentals. The three institutions that utilized
the results of the study the most had presidents who were strongly supportive of the
project, and at least two have strong TQM cultures. Administrators also must be active in
supporting the benchmark study, especially in sharing the gathered data with key decision
makers and encouraging improvements to occur.
Because benchmarking is a team project, a critical factor in its successful use is
for several key people in each institutional setting to understand its meaning, purpose, and
potential applications to their system. In this regard, although all six community colleges
reported that the review and answering of the survey instrument was done by several people
in different areas of expertise, only one indicated that it was done by committees or
teams. Teams, with their pooling of ideas and dispersal of responsibilities, are better able
to undertake efforts to adapt the practices of the peer institutions into improvements on
their campus. Active efforts to capitalize on the benchmark data by making comparisons,
doing a gap analysis, evaluating performance and trends, continuing with further
research, and communicating raw data create support for change in the decision-making
process. Thus, a close relationship exists between campus and administrative support and
the number of changes that occur as a result of an educational benchmark study.

Recommendations
Based on the analysis of the process of benchmarking in mid-sized community
colleges in Texas, several recommendations can be made. It is recommended that:
• Each participating institution have a basic TQM culture in which continuous
improvement, teamwork, employee empowerment, and other TQM principles are
in place. Benchmarking fits naturally into this culture as a continuous
improvement technique.
• Each participating institution have a trained Task Force to oversee the benchmark
process in its community college, because extensive planning is necessary on the
part of the lead college at the beginning of a benchmark study. This planning
appears to work best if spearheaded by a president who is involved with and
supportive of the benchmark study.
• Training be provided and the support of administrators and central personnel be
obtained before beginning the benchmark process, in order for an institution to
reap comprehensive benefits and bring about improvements from benchmark data.
An efficient way to conduct training would be to have the facilitator train
administrators, key personnel, or the benchmark Task Force between the time the
institution agrees to participate and the designing of the survey instrument. An
institution could continue to reap the benefits of training for several benchmark
studies without retraining, allowing such studies to be conducted more often
to spur continuous improvement.
• Every effort be made to ensure that each participating institution feels included.
Any survey instrument used by peer institutions in a benchmark study should be
designed by the educational organizations participating, so that each school has
some ownership in the results of the benchmark study.
• At least one or two campus visits to each institution be made by the researcher.
It is now believed that a visit to each campus for training purposes
and for follow-up after the data was distributed would have aided the peer
institutions in using the data. If a face-to-face meeting is held and training done at
the beginning, better cooperation seems likely between institutional
contacts, thus accelerating the process.
• At least a year be committed to conducting and utilizing a benchmark study
before another study is begun. This time period allows for the
necessary training, input, and evaluation before improvements can be
implemented. It is also suggested that at least three months are needed to
adequately answer the survey instrument if much detail is required.
• Each participating institution conduct a gap analysis from the final report. The
lead college, which did a gap analysis, made the most improvements; therefore, for
maximum use of the benchmark data, an overall gap analysis is recommended.
• Each decision-making person who has been trained in how to use benchmark data
in a community college be involved in evaluating the final report in his or her area.
It is also recommended that the final report be made available to all personnel on
campus and that they be informed of its location and contents in a general
memorandum.
• The areas of the benchmark survey be selected by the participating community
colleges to ensure utilization of the information in making improvements, and
• A pilot study be conducted by one of the participating colleges, because it allows
internal benchmarks to be set and refinements to be made in the survey
instrument that make the final report understandable.

Future Investigations
Other benchmark studies need to be conducted and analyzed in other institutions
of higher education, both two-year and four-year. As data banks are produced
from other benchmark studies in higher education, the data can be used for comparisons in
process areas; thus, as a follow-up to this study, individually sponsored studies could be
more focused. Less comprehensive benchmark studies could be conducted within specific
areas in a shorter length of time. A benchmark study with training provided for all institutions
should be conducted to determine whether training increases use of the data. Research on the
role of the facilitator, versus other means of conducting the study such as a research
committee, would help establish the value to institutions of employing such
outside consultants. A study with face-to-face communication between the college
contacts would reveal the improvements this could bring to conducting a benchmark
study.
Once a college has realized the value of benchmark data in making system
improvements, other types of benchmark studies should be conducted, such as
benchmarking against other types of organizations. Best practice benchmark studies are
needed between colleges and businesses in those areas that perform the same functions,
such as the business area. Such studies should serve to upgrade basic operational functions in
education. Benchmarking classroom techniques and processes within and between
institutions could produce data that would improve classroom instruction on college
campuses.

Summary
This chapter reviewed the purpose of the study, drew conclusions from the study's
findings, made recommendations about the benchmarking process in community
colleges, and proposed future investigations in applying the technique of benchmarking
to the higher education setting.

Sheer & D, J. Teeter, (Eds.). Total quality management in higher education. San
Francisco: Jossey-Bass Inc.

Sherr, L. A., & Teetor, D. J. (1991). Total quality management in higher education. New
Directions for Institutional Research No. 71. San Francisco: Jossey-Bass

87
Sork, T. J., & Caffarella, R. S, (1990), Planning programs for adults. In B. B. Meriam & P. M.
Cunningham, (Eds.), Handbook of adult and continuing education (pp. 233-245). San
Francisco: Jossey-Bass.

Spendolini, M. J. (1992), The benchmarking book. New York: American Management


Association.

Stark, B. (1995). Benchmarking: Wish Lists/Benchmarkability (p. 3) HEPROC CQI-L:


American Association for Higher Education,

Taylor, B, E., Meyerson, J. W,, & Massy, W. (1993). Strategic indicators for higher educaiton:
Improving performance Princeton, NJ: Peterson's Guides.

Teeter, D. L., & Lozier, G. G. (1993). Pursuit of quality in higher education: Case .sttidies in
total quality management New Directions for Institutional Research. San Francisco,
CA: Jossey-Bass.

Thor, L. M. (1994, July-August). Introducing the human side of total quality management into
educational institutions. In D, B, Lumsden (Ed.). Community College .Toumal of
Research and Practice, 18 (4), 345-358,

Toffler, A. (1980). The third wave. New York: Morrow.

Tucker, S. (1996). Benchmarking: A guide for educators. Thousand Oaks, CA: Cowin Press,
Inc.

Veysey, L.R. (1965). The emergence of the American university. Chicago: The University of
Chicago Press.

Watson, G. H. (1993). Strategic benchmarking: How to rate your company's performance


improvement. Portland, OR: Productivity Press.

Weaver, C.N. (1995). Managing the Four stages of TQM. Milwaukee, WI: ASQC Quality
Press.

Weil, S. (Ed.). (1994). Introducing change 'from the top' in universities and colleges. London:
Kogan Page Limited.

Williams, G. (1993). Total quality management in higher education: Panacea or placebo?


Higher Education, 25. 229-237.

Wolverton, M. (1994). A new alliance continuous quality and classroom effectiveness. ASHE-
ERIC Higher Education Report No. 6. Washington, D. C : The George Washington
University, School of Education and Human Development.

88
Zemsky, R., & Massy, W. F. (1995). Toward an understanding of our current predicaments.
Change, 9 7 ( 6 ) 41-4Q

89
APPENDIX A

BENCHMARKING SURVEY INSTRUMENT

Business
Institution

BUSINESS OFFICE QUESTIONS

Budgeting

Please attach a copy of your 1995-96 budget and answer the following budget questions.

1. What is your total unrestricted educational and general budget for 1995-96?
2. What was your total unrestricted 1994-95 audited financial statement?

3. Budget expenditures: Please give the dollar amount and the percentage of your
institution's budget that each category represents.
Category Dollar Amount Percentage of Total Budget
equipment (other than
technology)
expendable supplies
maintenance (physical plant
operations)
personnel (including fringe
benefits)
personnel salary (excluding
fringe benefits)
personnel fringe benefits
professional development
technology equipment
travel
utilities

4. List all other areas of the budget with 5% or more allocated.


5, Do you prioritize the following categories of your budget? yes no


If yes, indicate how they are prioritized by ranking the following categories from
1-8, with 1 being most important.
equipment
expendable supplies
fringe benefits
personnel (existing)
personnel (new positions)
travel
technology
maintenance

6. What efforts has your institution taken to cut

utility costs?

training and equipment costs?

expendable supplies costs?

technology?

personnel?

travel?

maintenance?

Please offer any additional comments regarding budgeting here:


Travel

Definitions:
Instructional travel—travel for the purpose of teaching classes, e.g., off campus,
intercampus
Institutional travel—travel required for business, e.g., committee service, new
regulations training, coordinating board requirements,
agency requirements
Professional travel—travel for improving professional skills
Student travel—travel of students representing the college

7. What policies does your institution have governing travel? (may attach policy)
For administration?

For faculty?

For staff?

For students?

8. What percentage of your institution's budget is spent on travel?_

9. What dollar amount of your institution's budget is spent on travel?_

10. In the first column check the areas for which your institution provides travel money;
in the second column give the percentage of the travel budget allocated for each area
checked; in the third column make any explanatory comments that might be needed to
clarify your answers.
Area as defined above Percentage travel budget Comments
Instructional
Institutional
Professional
Student
Other (specify)

11. In what areas and how does your institution allocate money for travel? (Under
method check all that apply; then explain how the method is applied in each of the three
areas: faculty, administration and classified)
Method Faculty Administration Classified
formula (please
identify formula for
each area)
individual
budgets
extra/
additional requests
other
(Please explain)

Please offer any additional comments about travel here:


Purchasing

12. Does your institution use group purchasing? yes no


If so, in what areas?

13. Does your institution group purchase by combining with outside institutions or
businesses? yes no
If so, what types of institutions?

If so, what areas are affected by these agreements?

14. Is your institution interested in other group purchasing combinations?


yes no
If yes, what areas?

Please offer any additional comments about purchasing here:

Insurance other than UGTP

15. Does your institution self insure any form of coverage? yes no
If yes, what coverages are self insured?

16. Does your institution undertake a bid process to select your liability and/or worker's
compensation coverage?
yes no
If yes, how often?

17. What kinds of insurance coverage are purchased by the institution for college
employees?

Please offer any additional comments on insurance here:

Computer Technology 2
Institution

COMPUTER TECHNOLOGY QUESTIONS

Mainframe Questions

1. Regarding your record keeping and accounting functions, please answer the following
questions.
What database is being used?

What programming language is being used?

2. Please complete the following table with the indicated codes (use as many codes in
each cell as are necessary to properly define conditions):
Administrative Ad hoc reporting: Accessible by:
software: N=Not possible Q=Query language
H=Homegrown D=Outsourced DP Staff R=Report generator
P=Proprietary L=Local (College) DP Staff D=Downloading files
Area (give brand name if O=College Non DP Staff O=Open Data Base
P) Connectivity

Prospective Students
Registration
Transcripts
Personnel
Payroll
General Ledger

Accounts Payable
Accounts Receivable
Financial Aid
Inventory
Library
Purchasing

Bookstore
Academic advising
Other (specify)

PC Questions

3. How is your computer equipment maintained and serviced? (Place an "X" in each
appropriate cell)
Equipment In house Service Contracts Other Outsource
Printers
PCs
LAN
WAN

4. How is your computer equipment supported? (Place an "X" in each appropriate cell)
Equipment In house Service Contracts Other Outsource
Printers
PCs
LAN
WAN


5. Please indicate the number of micro-computers currently in use at your institution, and
the number which are connected to a local area network:

Computers Primarily for Students


# of computers 8088/286 386 486 Pentium Apple II Macintosh PowerMac
Total
LAN Connected

Computers Primarily for Faculty


# of computers 8088/286 386 486 Pentium Apple II Macintosh PowerMac
Total
LAN Connected

Computers Primarily for Staff


# of computers 8088/286 386 486 Pentium Apple II Macintosh PowerMac
Total
LAN Connected

Personnel Questions

6. Please list, in the table below, the job titles for each individual position whose primary
purpose is support of computer technology at your institution. If the relative starting
salary for a secretary with no experience is indexed at 100, please give the relative
starting salary for each position, assuming that the individual holding the position has the
minimum experience you require.

For example, if a beginning secretary with no experience actually makes $7.50 per hour,
and a beginning programmer-analyst with 3 years' experience (the stated minimum on
a job description) makes $18.00 per hour (or monthly equivalent), you would show
the index for the programmer-analyst to be 240. The math is:

Index = (100 * Computer Person $) / Secretary $ = (100 * $18.00) / $7.50 = 240


The index will allow us to compare the relative value each college places on its
computer personnel while discounting regional salary variations. Please also indicate
whether the individual is paid on an hourly or monthly basis, and the percent time the
person is employed in support of computer technology (in the example below, the
programmer-analyst only works 75% time as a programmer-analyst; the individual
also works 25% time teaching CIS courses).

Job Title                      Percent Time   Pay Index   H=Hourly / M=Monthly
Secretary (example)            100%           100         H
Programmer-analyst (example)   75%            240         M
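The index arithmetic described above can be sketched as a small helper (a hypothetical illustration only; the function name is mine and is not part of the survey instrument):

```python
def salary_index(position_pay: float, secretary_pay: float) -> float:
    """Relative starting salary, indexed so that a beginning secretary = 100.

    Both pay figures must be in the same units (e.g., dollars per hour).
    """
    return 100 * position_pay / secretary_pay

# Survey example: a programmer-analyst at $18.00/hr versus a secretary at $7.50/hr.
print(salary_index(18.00, 7.50))  # -> 240.0
```

Because both figures come from the same institution, the ratio cancels out local pay scales, which is why the survey asks for an index rather than raw salaries.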

Policy Questions

Definition:
Technology—Telecommunications services such as telephone and data services,
instructional equipment, such as audio-visual equipment, and computer equipment.

7. Does your institution have a general technology fee? yes no


If yes, please answer the following questions.
How much is it?

To what services does this fee give the student access?

99
Computer Technology 2 - 5
Institution

Are the fees dedicated to purchase or support of technology?


Yes No

What percent of the technology fee budget is encumbered for


support personnel?

What percent of the technology fee budget is encumbered for


maintenance/upgrades?

Does your institution charge lab fees which include technology usage
separate from a technology fee? yes no
If yes, in what areas?

8. Are network accounts provided to students? yes no


If yes, please place an "X" in the space provided next to each service.
E-mail
Telnet
WWW
Other(specify)

If yes, what are these accounts?


course specific generic other (specify)

9. Do you provide computers at the institution's expense for school use by the
following: (Placing an "X" on the line means yes computers are provided for that
group; for each group checked, please indicate on the right side what method is used
to determine who gets these computers.)
students

faculty

administration

clerical

10. What is your replacement policy for microcomputers (e.g., every so many years, as
needed, etc.)? (may attach copy of the policy)
In labs/classrooms?


In offices?

11. What is your ratio of service providers (e.g., full-time equivalent persons dedicated to
micro-computer support) to machines?

12. Do you provide remote access to your college's network? yes no


If yes, for whom?
faculty
staff
students
If yes, do you charge separately for this access? yes no

13. What computer operating systems do you support?

14. What network operating systems do you support?

15. Do you require students to sign contracts before they are allowed to use your
institution's technology systems? yes no
If yes, which of the following does it cover? (Please attach copy of
contract)
network use
E-mail
copyrights
security
student privacy rights
other (please explain)

16. Does your institution have a policy regarding usage of the Internet?
yes no
If yes, please attach a copy of the policy.

17. How many machines are on your primary network?

Please offer any additional comments on computer policies here:


Training

18. What is your institution's procedure for training employees on hardware and
software?

19. Is training mandatory on hardware? _yes no


If yes, when is the training required? before usage anytime

20. Is training mandatory on software? yes no


If yes, when is the training required? before usage anytime

21. Are the end-users being trained? yes no


If yes, how?

Budget

22. What is the dollar amount of your technology budget?_

23. What percentage of your technology budget is telecommunications?

24. What percentage of your technology budget is instructional equipment?_

25. What percentage of your technology budget is computer equipment?

26. What percentage of your technology budget is maintenance?

27. How is your technology budget listed?


centralized as a line item
included with all other equipment items
distributed among departments
other (specify)

28. As you define it, what does your technology budget include:
telephones
instructional equipment services (audio-visual equipment)
computers
LAN
WAN


Please offer any additional comments on computer training here:

Service

29. Do you call a central location on campus for service assistance?


yes no

30. What approximate time lapse occurs between a service request and service when
dealing with the following:
hardware problems
printer problems
software problems
connections to LAN
connections to WAN

31. In what areas has your institution experienced problems? (check all that
apply)
software updates
software installation
hardware updates
hardware installation

32. How have you attempted to resolve the problems in the areas checked?
Software updates:

Software installation:

Hardware updates:

Hardware installation:

Please offer any additional comments on computer service here:

Developmental/Tutoring Instruction 3 - 1
Institution

DEVELOPMENTAL LABS, INSTRUCTION AND TUTORING QUESTIONS

Developmental Instruction

1, Does your institution offer remediation through developmental classes?


yes no
(If yes, please answer the following.)

2. Please give the contact hours for the fall of 1995 for each developmental area, the
number of employees in the area for fall 1995, and the number of students enrolled in
each area for fall 1995:
Developmental Area Contact Hours Number of Employees Number of Students

3. How many FTE faculty are in your developmental area?_

4. How many FTE support staff are in your developmental area?_

5. Estimate the percentage of remediation done through noncourse offerings.

6. How many FTE faculty are employed in the noncourse offerings?

7. How many FTE support staff are employed in the noncourse offerings?

8. Does your institution have a separate department/division for developmental classes or


are these classes offered within other departments/divisions?
separate within other (explain)

9. Are graders or assistants used in developmental classes? yes no


If yes, give the numbers used in each area?
writing reading math
If yes, describe what they grade in each area:
Writing:

Reading:

Math:


10. How is a student charged for developmental classes and labs? (Please check the
appropriate line.)
Regular tuition rate
Tuition plus lab fee
Lab fee only
Clock hour fee
Other
(Please explain)

11. How is load figured for your developmental classes?


by contact hours by credit hours by clock hours other (please explain)

12. What percentage of classes (based on contact hours generated in the fall of 1995) is
taught by each? (Please fill in the percentages to equal 100% for each area.)
Area Full Time Instructors Part Time Instructors
Writing
Reading
Math
Other (specify)

13. What is the class size limit on your developmental classes?

writing reading math

14. How are students placed in developmental classes?

If tests are used, please check those used:


In house Pre TASP TASP ASSET COMPASS
Other (Please explain)

Please offer any other comments on developmental classes here:


Tutoring

15. Does your institution offer peer and/or professional tutoring?


yes no

If yes, please check the types of tutoring offered.


Type Tutoring Labs One on one tutoring Small group tutoring
Peer
Professional

16. How are tutoring services funded? (Check all that apply.)
Institutional funds
Workstudy funds
Charge to student
Grant funds
Describe:
Other-
Describe:

Please offer any additional comments on tutoring here:

Learning Center

17. Do you have a learning center? yes no


If yes, please check all activities offered through your center.
Basic skills remediation
GED preparation
TASP remediation
TAAS remediation
Competency based high school credit
Tutoring
College level computer assisted instruction
Other (Describe)

18. In your learning center or developmental labs, which do you use: (Check all that
apply.)
lab assistants
lab supervisors
full time instructors
part time instructors
partial load for full time instructors
other (specify)

19. Are classes taught in a developmental lab rather than having lab work supplemental
to instruction? yes no
If yes, is instruction by (Check appropriate response)
instructor lab assistant lab supervisor other (specify)

20. What is the lab size limit on your developmental labs?


writing reading math

Please offer any additional comments on learning centers/developmental labs here:

Distance Education 4 - 1
Institution

DISTANCE EDUCATION QUESTIONS

Definition:
Distance Education—a planned teaching/learning experience that uses a wide
spectrum of technologies to reach learners at a distance and is designed to
encourage learner interaction and certification.

1. Is your institution involved in distance education? yes no


If yes, please complete the rest of this questionnaire.

2. For which of the following does your institution use its delivery systems:
credit courses
noncredit courses (e.g., continuing education, seminars, etc.)
instmction within the college system
instmction outside the college system
intercampus meetings
other (specify)

3. For the above, does your institution create its own course content or purchase
courses?
create purchase both

4. Do you sell the services of any of your delivery systems to outside businesses?


yes no
If yes, check services offered:
Interview hookups
Teleconferencing
Intemet
Training
Other (please specify)

5. What departments/divisions supply support for distance education?

6. To what position do these entities from departments/divisions report?

7. To what position do distance education support personnel report?


8. What is the number of your FTE faculty dedicated to distance learning?_

9. What is the number of your FTE support personnel dedicated to distance


education?

10. Name these positions of support personnel in distance education in the first column;
then in the second column please indicate if they have other responsibilities outside of
distance education.
Position job title Other responsibilities

11. Who teaches the courses offered by your delivery systems? (Check all that apply.)
Full time faculty as part of load
Full time faculty as overload
Part time faculty
Contracted services (If checked, please list what services are contracted.)

Other (specify)

12. How are services such as registration and advising provided for distance
education?

13. Indicate your institution's future in distance education by listing the top three
priority activities planned.

Please offer any additional comments about distance education here:


Budgeting

14. How are your delivery systems budgeted? (Check all that apply.)
Institutional budget
Grants
Fee Structure
Other (Please explain)

15. What percentage of the college budget goes to distance education activities?

16. What is your income from distance education?

17. What are your expenses for distance education?

18. Do you charge any additional fees for distance education services?
yes no
If yes, please attach a policy.
If yes, how much?

If yes, to what groups?

Please offer any additional comments on budgeting distance education here:

Equipment

Please attach a list of the equipment that your institution uses in operating your delivery
systems. (Should include brand, type of equipment and model #.)

19. What kind of dedicated space do you have for the operation of your delivery
systems?
designated shared other (specify)

Please offer any additional comments on equipment for delivery systems here.


Agreements

20. With how many other institutions, (e.g., four year, two year, high schools) do
you have distance education agreements?
Please list these institutions.

Please offer any additional comments about agreements here.

DELIVERY SYSTEMS

21. Check the types of delivery systems used at your institution.


Broadcast
ITFS and microwave
Cable television
Internet and computers
Video
Other (specify)

Please answer the section of questions that apply to those systems checked above.

Broadcast
1. Which broadcast systems does your institution use?
Satellite
Closed Circuit for campus
Compressed video
Radio
Low power television
Other (specify)

2. What is the coverage area (mile radius) of your broadcast system?

3. How long have you been providing broadcast service?

4. How many classrooms are connected to send and receive broadcasts?


5. What equipment is provided in these classrooms? (may attach list)

6. What course titles are offered through broadcast? (may attach a brochure)

7. Are there enrollment limits on these classes? yes no


If yes, what are they?

8. How are the course offerings for the broadcast system determined (e.g., by
departments, by distance education department, etc.)?

9. Check how you publicize your broadcast classes: (check all that apply)
Inside campus
Focused advertising to a group
Outside campus
Separate as individual classes
Together
Paid advertising
Other (specify)

10. Does your institution have uplink capability? yes no


If yes, describe.

11. Does your institution have an agreement with a public television station?
yes no
If yes, what kind of agreement?

12. Does your institution have an agreement with a local cable system?
yes no
If yes, what kind of agreement?


13. In this area of broadcast, does the cost of offering such classes exceed the
benefits?
yes no
Please explain.

14. What are the advantages of using broadcast to deliver education?

15. What are the disadvantages of using broadcast to deliver education?

Please offer any additional comments about broadcast delivery systems here.

ITFS and Microwave

1. What size physical area does your ITFS or microwave system serve?

2. How long have you been providing ITFS and microwave service?_

3. How many students do you serve with this system?

4. What course titles are offered through ITFS and microwave?_

5. Are there enrollment limits on these classes? yes no


If yes, what are they?

6. How are course offerings for the ITFS and microwave system determined (e.g., by
departments, by distance education department, etc.)?


7. Check how you publicize your broadcast classes:


Inside campus
Focused advertising to a group
Outside campus
Separate as individual classes
Together
Paid advertising
Other (specify)

8. Do you make money offering classes by ITFS and microwave?


yes no
Explain.

Please offer any additional comments about ITFS and microwave delivery systems here:

Internet and Computers

1. What Internet and computer systems does your institution offer?


E-mail with list servers
List servers
WWW
Other (specify)

2. Who is your Internet provider?

3. How much do you pay for the Internet service?_

4. Is your institution using the Internet as a teaching tool? yes no


If yes, how?

5. Is your institution using the Internet as a learning tool? yes no


If yes, how?

6. Is your institution using the Internet as a resource for additional classroom
information? yes no
If yes, how?

7. Do you have a Web page? yes no


If yes, do you maintain your own Web page? yes no
If yes, how? as an institution
by division
by department
other (specify)

8. What Internet services does your institution provide?

9. To what groups are these Internet services offered?

If offered to students, how many?


on campus
off campus

10. How long have you been providing Internet service?_

11. What equipment is provided for Internet connection?

12. Does the cost of offering such Internet classes exceed the benefits?
yes no
Please explain.

13. What course titles are offered through Internet? (may attach a brochure)


14. What level of classes is offered through Internet?

15. Are there enrollment limits on these Internet classes? yes no


If yes, what are they?

16. How are the course offerings for Internet determined (e.g., by departments, by
distance education department, etc.)?

17. Check how you publicize your Internet classes:


Inside campus
Focused advertising to a group
Outside campus
Separate as individual classes
Together
Paid advertising
Internet
Other (specify)

18. In your community college setting, what are the legal issues you are or must address
(e.g., receiving funding if class attendants are outside the area)?

19. What equipment ramifications should be addressed when offering distance education
classes over the Internet (e.g., sufficient disc space from too many enrollees)?

20. What are the advantages of using the Internet as a delivery system for distance
education?

21. What are the disadvantages of using the Internet as a delivery system for
distance education?


Please offer any additional comments about Internet and computer systems here.

Video

1. Check which video systems your institution offers:


Two way interactive
Traditional Teleconferencing
Video Teleconferencing
Cassettes
Other (specify)

2. To what statewide video networks do you subscribe?


VIDNET
Starlink
TEA
Other (Please explain)

3. What is the number of connections your video system serves?

4. How long have you been providing video service?

5. How many classrooms do you have connected to video?


on campus
off campus

6. What equipment is provided at college expense in these classrooms? (may attach list)

7. What course titles are offered through video? (may attach a brochure)


8. Are there enrollment limits on your video classes? yes no


If yes, what are they?

9. How are the course offerings for the video system determined (e.g., by departments,
by distance education department, etc.)?

10. Check how you publicize your video delivery classes:


Inside campus or group
Outside campus
Separate as individual classes
Together
Paid advertising
Other (specify)

11. Does the cost of offering video classes exceed the benefits?
yes no
Please explain.

12. What are the advantages of using video as a delivery system of distance education?

13. What are the disadvantages of using video as a delivery system of distance
education?

Please offer any additional comments about video delivery systems here.

Institutional Effectiveness 5
Institution

RESEARCH/PLANNING/INSTITUTIONAL EFFECTIVENESS QUESTIONS

1. Please attach an organizational chart of your institution and, if possible, attach brief
descriptions of job functions related to administrative positions that conduct research,
planning or institutional effectiveness duties.

2. How many classified employees work in these offices?


Give the numbers and briefly explain the job title and function of each employee.
Number Office Job Title and Function

Research

Planning

Institutional
Effectiveness

Other (specify)

3. Describe your yearly procedure or attach a time line or policy for institutional
planning and effectiveness.

4. What are your institutional accountability measures, success indicators, or objectives


and outcomes? (may attach copy of measures)

5. How does your institution use these accountability measures, success indicators or
objectives and outcomes in the decision-making process?

Please offer additional comments regarding institutional research and effectiveness here:
(particularly identify innovative or useful approaches you are using)


Resource Development
Grants

6. What percentage of your institution's annual budget is raised through grants and
contracts?

7. What is your institution's income per contact hour from federal sources?

8. How many employees are involved in pursuing or overseeing grant money?_


Please list these by function and/or attach job descriptions.

9. Do you have a grant approval process? yes no


If yes, please attach or below describe your form or process.

10. Generally, what has been your institution's obligation to personnel and resources after
the grant is concluded?

11. Do you have an exit plan for each grant? yes no


If yes, please explain.

Please offer any additional comments on grants here:


Foundation

12. Do you have a college foundation? yes no


If yes, please answer the following:
What is the dollar amount of your foundation?

Describe your foundation management structure.

13. What percentage of your foundation management costs is paid by each?


by institution
by foundation funds
by a combination of the above (Please give percentage of each.)

other (specify)

14. Do college personnel serve as voting members on your college foundation board?
yes no
If yes, which ones?

15. What policies govern the foundation's distributable funds? (may attach policy)

16. Does your institution hire outside managers for managing the foundation fund?
yes no

17. What is your investment policy for foundation funds?


(may attach policy)

18. What percentage of the distributable funds goes for


scholarships
institutional support

Please offer any additional comments on foundations here:

121
Instructional 6 - 1
Institution

INSTRUCTIONAL QUESTIONS

Please attach an instructional organization chart and answer the following questions about
your credit instructional programs.

Cost Study

On your 1994-95 cost per contact hour study submitted to the state, fill in the chart below
to indicate your cost per contact hour for the following:
Area State Median Lead College Your Institution
Institutional Support 0.83 0.82
Student Services 0.44 0.39
Staff Benefits 0.29 0.28
Library 0.19 0.28
Other

Please continue to fill out the chart below from your 1994-95 contact hour study.
Academic- Academic- Academic- Voc/Tech Voc/Tech Voc/Tech
Area State Lead Your State Lead Your
Median College Institution Median College Institution
Instructional 0.21 0.21 0.32 0.28
Administration

Organized
Activities 0.08 0.00 0.07 0.05
Related to
Instruction

Instructional Administration

Definition:
Instructional administrators—Employees who supervise faculty and maintain
instructional load.
1. If your institution defines instructional administrators differently, please give your
definition.

122
Instructional 6 - 2
Institution

2. How are your instructional administrators considered?


faculty administrators other (specify)

3. Are load reductions given for school responsibilities outside instruction?


yes no
If yes, what responsibilities?
Extra curricular activities
Projects/grants
Job-related activities
Professional organizational service
Committee service

4. In the first column list the titles of the various levels of your instructional
administration; in the second column define the major duties of each level; and in the
third column explain how each level is compensated, i.e., pay, release time or both.
Titles Define Duties Compensation-pay, time,
both or other

123
Instructional 6-3
Institution

5. In the first column again list the titles of your instructional administrators; in the
second column indicate how many departments that title supervises; in the third column
indicate how many different major areas are supervised; in the fourth column indicate
what the teaching load responsibility is for that title; and in the fifth column indicate the
length of contract of that title, i.e., 9 month, 10 month, 11 month or 12 month.

Title # of Depts. # Major Areas Teaching Length of


Load Contract

6. Once more please list the titles of your instructional administrators in column one; in
column two, three and four please give the number of employees supervised per semester
by that instructional administrative level.
Titles Full time faculty Part time faculty Classified
supervised supervised supervised

Please offer any additional comments about instructional administration here:

124
Instructional 6 - 4
Institution
Load Considerations

7. How is a full time semester faculty load calculated at your institution?

8. How does this definition apply across departments in figuring


Science lab load?

Computer lab load?

Physical activity class load (Please include how many hours per week
physical education activity classes meet)?

Clinical hours load (Define each clinical type instruction and then explain
how the responsibility relates to faculty load)?

Learning labs load?

Technical lab loads?

English learning lab loads?

Private music lesson loads?

Performance (music and theatre arts) ensemble loads?

Coop instruction loads?

Internship loads?

Lecture/lab loads?

9. Do full time instructors receive load credit for lab instruction? yes no

125
Instructional 6-5
Institution

10. Are load reductions given for school responsibilities outside instruction? yes no
If yes, what responsibilities?
Extra curricular activities
Projects/grants
Job-related activities
Professional organizational service
Committee service

11. Has the compensation either by pay or release time proved to be cost effective?

yes no
Explain.

12. Has the compensation proved to be beneficial, i.e. program growth, prestige to your
institution? yes no
Explain.

13. Do adult vocational classes count as faculty load? yes no


If yes, how are they applied to the faculty status?

14. How do you calculate load for credit adult vocational classes?

15. Do you have mixed classes (i.e., semester hour credit students and adult vocational
students in the same class)? yes no
If yes, how do you calculate load on these classes?

16. How does your institution calculate load credit on small classes (i.e., under 10 in
enrollment) in the transfer area? (Give an example)

17. How does your institution calculate load credit on small classes (i.e., under 10 in
enrollment) in the vocational/technical area? (Give an example.)

126
Instructional 6 - 6
Institution

18. Are faculty compensated for extra large classes? yes no


If yes, define an extra large class.

19. How is the faculty compensated for these extra large classes?
not compensated
extra load credit
monetary compensation
both of the above
other (Please explain)

Please offer any additional load consideration comments here:

Instmctional Personnel

20. What percentage of your full time employees are full time faculty?_

21. What percentage of your part time employees are part time faculty?

22. What percentage of your full time faculty is on the following assignments
hourly basis
9 month
10 month
11 month
12 month

23. What percentage of your classified personnel is on the following assignments


hourly basis
9 month
10 month
11 month
12 month

24. What percentage of your administration is on the following assignments


hourly basis
9 month
10 month
11 month
12 month

127
Instructional 6 - 7
Institution

25. Does your institution have a rank system? yes no


If yes, what is the value of each step?
instructor
assistant professor
associate professor
professor
other (specify)

26. Are faculty required to maintain office hours? yes no


If yes, what is the minimum requirement?

27. Are counselors and librarians considered faculty or non-faculty?


Faculty Non-faculty
Counselors
Librarians
If nonfaculty, how are they classified?

If nonfaculty on what scale are they paid?

28. What is the full time counselor ratio per FTE (15 credit hours) students at your
institution?

29. What is the full time librarian ratio per FTE (15 credit hours) students at your
institution?

30. What credentials must your counselors possess and why?

31. Do counselor salaries increase with credentials such as LPC, etc?


yes no

32. Describe your academic advising system. (Check the appropriate system)
centralized, i.e., counseling center
decentralized, i.e., departmental faculty
integrated (explain)

other (explain)

128
Instructional 6 - 8
Institution

33. Do you use computer assisted instruction to save on personnel cost?


yes no
If yes, in what areas?

Please offer any other personnel comments here:

Part time faculty

34. How does your institution define part time faculty?

35. What percentage of your contact hours were generated by part time faculty for the last
fall semester?

36. On the left list the departments/divisions that used part time faculty and on the right
give the percentage of faculty for that department/division that is part time or attach a
printout of the information.

Department Percentage of Instruction


by part time

129
Instmctional 6 - 9
Institution

37. How are part time instructors compensated?

38. What is your load percentage goal for the level of use of part time
faculty?

39. What is the maximum percentage goal for part time faculty?

40. How do you decide to replace part time with full time faculty?

41. How do you ensure quality of instruction in using part time instructors?

Please offer any additional part time instruction comments here:

Computer labs and staffing

42. How many computer labs does your institution have?

43. Are the various department computer labs separate or consolidated in one area of the
campus? separate consolidated

44. List discipline specific computer labs.


Discipline Number of labs

130
Instructional 6-10
Institution

45. Does your institution have a centralized computer system? yes no


If so, in what areas: administrative academic technical
other

If so, who accesses each system?


Administrative

Academic

Technical

Other (specify)

46. How are these labs staffed?


lab instructor (faculty)
lab assistant
lab supervisor
other (specify)

Please offer any additional comments on computer labs and staffing here:

Other Labs (e.g., science, math, medical, business)

47. Does your institution differentiate between lab instructors, lab assistants and lab
supervisors? yes no
If yes, define:
lab instructor

lab assistant

lab supervisor

48. Are lab assistants excluding student assistants considered classified personnel?
yes no

49. How are lab assistants excluding students evaluated?

131
Instructional 6-11
Institution

50. If you use assistants excluding student assistants, how is their pay figured?

51. Give the percentage of assistants excluding students employed as each.


full time part time

52. Do you use student workers in your labs? yes no


How?

53. What does a lab assistant do with student work?


aid students teach students evaluate student work

54. What does a lab supervisor do with student work?


aid student teach student evaluate student work

55. What credentials (e.g., work, degree, qualifications) must a lab instructor possess?

56. What credentials (e.g., work, degree, qualifications) must a lab assistant possess?

57. What credentials (e.g., work, degree, qualifications) must a lab supervisor possess?

58. If you use lab instructors, how is their load for labs calculated?

59. If you use lab assistants, how is their pay calculated?

60. If you use lab supervisors, how is their pay calculated?

61. Do you separate lecture and labs? yes no

If yes, what position is responsible for lecture?

If yes, what position is responsible for lab?

132
Instructional 6-12
Institution

62. How are technical labs organized? competency based customer driven
Other (explain)

63. Do you have departmental labs? yes no


If yes, on the left list the departments that use labs, in the middle give the number
of lab sections used per department and on the right give the number of computers
in computerized labs per department.

Department Number Computerized

If no, how do you accommodate computerization?

Please offer any additional comments on academic or technical labs here:

Travel considerations

64. How do you allocate funds for travel?


by institutional budget
by department/division budget
by individual budget
grant
other (specify)

133
Instructional 6-13
Institution

65. What policies does your institution have governing travel?

For administration?

For faculty?

For staff?

For students?
66. Are there policy differences between departments regarding travel? yes no
If so, what differences apply?

67. Who handles the arrangements for travel at your institution?


centralized college position
office of individual traveling
contract with external agency
other (explain)

Please offer any additional comments on travel here:

Budgeting

68. What position is responsible for equipment inventory, restocking, etc?

69. How do you usually provide equipment for your institution?


rent
lease
purchase

70. What is your policy for obtaining equipment?

Renting policy

Leasing policy

Purchasing policy
134
Instructional 6-14
Institution

71. How do you discontinue equipment?

72. What percentage of your institutional budget is allocated for equipment?_

73. What percentage of your annual equipment budget is instructional equipment?_

74. What percentage of your equipment budget is general/operational equipment?_

75. What percentage of your budget is grant money?

Please offer any additional comments on budgeting here.

Instructional Professional Development

76. What position is responsible for professional development at your institution?

What is this position? part time full time

How is this position compensated? release time money

77. Describe the academic (faculty) professional development process at your


institution?

78. Describe the administrative professional development process at your institution?

79. Describe the classified professional development process at your institution?

135
Instructional 6-15
Institution

80. Does the technical faculty receive professional development? yes no


If yes, please answer
Is it separate from the academic? yes no
Is it for training in new technology? yes no
Is it for additional certifications? yes no
Does the college pay for technical faculty to receive training in new
technology? yes no
Does the college pay for technical faculty to receive additional
certifications? yes no

81. Does the technical faculty receive release time for training in industry? yes no
If yes, how is this handled?

82. How is professional development funded for your institution?


institution wide fund
department/division budgets
other (specify)

83. What percentage of the budget is allocated for instructional professional


development?

84. Are there any required professional development activities? yes no


If yes, list examples for each employee group?

Please offer any additional comments on instructional professional development here:

136
Outsourcing 7 - 1
Institution

OUTSOURCING-BUSINESS OFFICE QUESTIONS

Definition:
Outsourcing-the contracting of services to outside sources that the college normally
performs
Services sold-the providing of services by the college for money to sources outside
college

1. Check the services that are outsourced and the services that are sold by your institution;
then for each outsourced service please explain the type of agreement that you have.

Services
Outsourced Sold by
Services Institution Service Explanation of Agreement

Accounting

Academic advising

Administrative Computer

Advertising

Bookstore

Building Maintenance

Career Counseling

Child Care

Clerical

Computer Services

Computer Software

Courier Services

Custodial

137
Outsourcing 7- 2
Institution

Services
Outsourced Sold by
Services Institution Service Explanation of Agreement

Distance Education

Energy Maintenance

Food Services

Grounds

Instructional areas

Health & Medicine

Housing

Job Placement

Library

Mailing

Maintenance of
Personal computers

Network computer

Parking

Payroll

Personal Counseling

Police

Printing Services

Professional staff
Development Training

138
Outsourcing 7- 3
Institution

Services
Outsourced Sold by
Services Institution Services Explanation of Agreement

Publications

Purchasing

Records Management

Security

Telephone

Testing

Travel Arrangements

Vehicle Maintenance

Vehicle use

Vending Machines

Other (Specify)

Other (Specify)

139
Personnel 8 - 1
Institution

PERSONNEL QUESTIONS

Definition:
Full time Faculty-those who are eligible for TRS or ORP and state insurance
benefits
Part-time faculty-those faculty who are not eligible for TRS or ORP and state
insurance benefits
Administration—those personnel who are not faculty but typically have eligible
consideration for ORP
Part-time administrators—those personnel who are not faculty but would be eligible
for ORP consideration if the position was full time
Classified—those clerical employees in support positions who are usually paid on
an hourly basis or technical and professional non-faculty
positions which do not have the responsibility of the
administrative positions
Part-time classified employees—those support employees not eligible for TRS or
ORP due to assignment of hours (i.e. 19 or less) or length of
assignment

1. If your definition of full time faculty is different from the one above, what is your
definition?

2. What is your definition of full-time faculty load?

3. If your definition of an administrator is different from the one above, what is your
definition?

4. If your definition of classified is different from the one above, what is your definition?

140
Personnel 8-2
Institution

5. If you have employment categories not listed above, please list and define them
below:
Category Definition

Part-time employees

6. Does your institution use part-time classified staff?


yes no
If yes, please answer the following questions.
In what areas?

How are they hired?

How are they paid?


Structured pay schedule
Specific clock hour rate
Other (please explain)

7. Does your institution use part-time administrators?


yes no
If yes, please answer the following questions.
In what areas?

141
Personnel 8 - 3
Institution

How are they located and hired?

How are they paid?


Structured pay schedule
Specific clock hour rate
Negotiated (If negotiated, on what basis?)
Other (please explain)

Please offer any additional comments on part-time employees here:

Review or evaluation process

8. What kind of review or evaluation process do you use for full-time faculty? (may


attach example or copy of evaluation instrument)

9. Is the evaluation process a standard for all full-time faculty?


yes no

10. How often do you review or evaluate full-time faculty?

11. Does your review process differ for full time faculty after the first few years?
yes no
If yes, how?

12. Are salary increases tied to this review process for full time faculty?
yes no
If yes, how?

13. What kind of review or evaluation process do you use for part-time faculty? (May
attach example or copy of evaluation instrument)

14. How often do you review or evaluate part-time faculty?_

142
Personnel 8-4
Institution

15. Does your review process differ for part time faculty after the first few years?
yes no
If so, how?

16. Are salary increases for part time faculty tied to this review process?
yes no
If yes, how?

17. What kind of performance evaluation process do you use for classified/clerical?
(may attach example or copy of evaluation instrument)

18. Is this evaluation process different after the first few years of service?
yes no
If yes, how?

19. Are salary increases tied to this process? yes no


If yes, how?

20. How often do you evaluate classified/clerical staff performance?_

21. What kind of performance/evaluation process do you use for administration? (may
attach example or copy of evaluation instrument)

22. How often do you conduct these evaluations of administration?

23. Is your process different after the first few years of service?
yes no

24. Are salary increases tied to this process? yes no


If yes, how?

143
Personnel 8-5
Institution

25. Does your institution award merit pay? yes no


If yes, to what groups?

If yes, how is it determined?

Please offer any additional comments on review and evaluation here:

Retirement

Please attach copies of any retirement policies that your institution is now using.

26. Has or does your institution offer early retirement packages?


yes no
If so, what was your purpose in offering such packages?

If so, has the retirement of the faculty resulted in an increase or a decrease


in the educational credentials of the overall faculty?
increase decrease
If so, describe the budget impact of these retirement packages?

27. Has or does your institution offer retirement bonuses or programs?


yes no
If yes, briefly describe your offerings.

28. Does your institution offer a separate retirement plan from ORP/TRS?
yes no
If yes, answer the following:
Describe your plan:

144
Personnel 8 - 6
Institution

To what degree does your institution finance the plan?

When is an employee vested in the plan?

29. Does your institution participate in Social Security other than Medicare?
yes no
If yes, is Social Security paid for your employees? yes no
If yes, is the employee share paid by the institution? yes no
If yes, what percentage is paid by the institution?

Please offer any additional comments about retirement here:

Non-instructional work hours

30. Is there a core time for all non-instructional employees to work?


yes no
If so, what are those hours?

31. Does your institution use flex time scheduling of hours for some non-instructional
positions?
If yes, please attach a copy of your policy governing flex time scheduling and
answer the following question.
What deviations are allowed in scheduling work hours?

Please offer any additional comments about work hours here:

145
Personnel 8-7
Institution

Volunteers

32. Does your institution use volunteers? yes no


If yes, please answer the following questions.

In what areas?

How are they obtained?

How are they managed?

How are they rewarded?

Please offer any additional comments on volunteers here:

Instruction

33. What does your institution consider to be the size of an average lecture class?

Please offer any additional comments on instruction here:

146
Instructional Scheduling 9 - 1
Institution

INSTRUCTIONAL-SCHEDULING

Fall/Spring Evening Class Scheduling

1. Does your institution offer evening classes? yes no


If answered yes, please also answer the following questions.

2. Please list the number of sections in an average semester for each area.
Area Average number of sections
Academic
Technical
Continuing Education

3. How does your institution schedule night classes (e.g., number of hours per meeting, and
days of the meetings, etc.)?

4. What percentage of these classes are taught by full-time faculty?_


by part-time faculty?_

5. Are full time faculty required to teach evening classes as a part of their regular load?
yes no

6. Can a person obtain a degree from your institution by attending only evening classes?
yes no

7. Within these evening academic classes, are other students allowed to take the same class
for continuing education credit? yes no
If so, how are they charged for the class?

8. Are any evening classes scheduled at a company site or at an unusual time (midnight, etc)
for business or industry? yes no
If yes, please explain:

Please offer any additional comments on evening class scheduling here:

147
Instructional Scheduling 9 - 2
Institution

Summer Term Scheduling

9. Does your institution offer summer classes? yes no


If answered yes, please answer the following questions.

10. How many classes (total number of all sections) were offered at your institution for the
1996 summer?

11. What was the average number from the last three years of classes (total number of all
sections) offered in summer terms?

12. When are these classes offered? (check all the answers that apply)
Six week sessions
Eight week sessions
Ten week sessions
Eleven week sessions
Twelve week sessions
Other (Please explain)

13. Describe the summer classes offered by checking applicable times in column one; then,
for each time checked, fill in the other two columns to note the total contact hours generated
and the cost versus revenue evaluation of each type of class.
Did you generate enough
Time Total Contact Hours enrollment to be cost
Academic Technical effective? (Y or N)
Academic Technical
One long session
per summer

Two short sessions


per summer

One long night


session per summer
Two short night
sessions per
summer
Other (specify)

148
Instructional Scheduling 9 - 3
Institution

14. Why do you feel that your course scheduling meets your students' needs?

15. Within these summer academic classes, are other students allowed to take the same class
for continuing education credit? yes no
If so, how are they charged for the class?

Please offer any additional comments on summer scheduling here:

Nontraditional Scheduling

Definitions:
Mini-semester—a semester of credit offered in times between regular semesters
Weekend classes—credit classes which meet only on weekends (a weekend runs
from 5 p.m. Friday until midnight Sunday)
Block classes within a long semester—credit classes offered for short condensed
periods of time rather than the 15 weeks of a semester or a summer
term
Extension center—an off campus location that offers college credit classes
excluding dual credit classes offered at high schools
Dual credit classes—Classes offered with both high school credit and college class
credit being granted at the same time
Concurrent classes—Classes that students still attending high school may
attend for college credit only.

1. Check the types of nontraditional class offerings based on the above definitions that
your institution offers and then answer the questions in the section that corresponds to
those programs marked.
Mini-semesters
Weekend sections
Block classes
Extension center
Dual/Concurrent

149
Instructional Scheduling 9 - 4
Institution

2. Has your institution found other optimal scheduling periods other than those listed
above which are cost effective? yes no
If yes, please explain.

Mini-semester scheduling

3. Please fill in the blanks below to answer the following questions. Check when your
mini-semesters are scheduled. On the average how many students enroll per semester?
How long do they meet? What is the average number of classes offered in a
mini-semester?
Time # Students Length # of Classes
Between spring and summer sessions
Between fall and spring sessions
Other (please explain)

4. Do you select instructors for mini-semesters based on cost efficiency (e.g., part time
vs. full time)? yes no

5. How do you calculate instructor pay for mini-semester classes?

6. How do you calculate the cost to your institution to offer such classes?

7. For 1995 what percentage of the students enrolled in mini-semesters did not attend
standard 15 week classes as well?

8. What methods do you use to evaluate these compacted courses?

9. Do you do a cost vs. revenue comparison for these courses? yes no


If yes, how is it done?

150
Instructional Scheduling 9 - 5
Institution

10. Within these academic classes being taken for credit, are other students allowed to
take the same classes for continuing education credit? yes no
If so, how are these students charged for the class?

Please offer any additional comments on mini-semester here:

Weekend class scheduling

11. How do you package weekend classes, i.e., what days, times of those days and what
durations are your weekend classes offered?

12. Do you compact any classes into single weekend block classes allowing a student to
complete an academic class in a shorter time than usual?
yes no
If yes, what classes are offered?

13. Do you compact any classes into single weekend block classes allowing a student to
complete a continuing education class in a short time? yes no
If yes, what classes are offered?

14. How is credit given for these classes?


Academic for degree
Continuing education
Technical

15. Have such weekend classes proved to be cost effective? yes no

16. Can a person obtain a degree from your institution by attending only weekend classes?
yes no

151
Instructional Scheduling 9 - 6
Institution

Please offer any additional comments on weekend classes here:

Block Class Scheduling Within a Long Semester (other than mini-terms):

17. In what disciplines do you offer block classes?

18. What groups of students are targeted for such classes?


adult
high school
18-22 year olds

19. What percentage of these classes is taught by


Full-time faculty Part-time faculty

20. How is load calculated for these classes?

21. How is instructor pay calculated for these classes?

22. Are any of these block classes offered at night? yes no

23. Do these block classes balance cost with the revenue generated? yes no

Please offer any additional comments on block scheduling here:

Extension center scheduling

24. How many extension centers do you have?


Please list your extension centers here:

152
Instructional Scheduling 9 - 7
Institution

25. Kind of courses offered at these centers:


academic
technical
continuing education

26. Generally how many students are required per class for the class to be offered?
in an academic class
in a technical class
in a continuing education class

27. Can a student receive a degree by taking classes offered only at your extension centers?
yes no

28. Who teaches these extension center classes?


full-time faculty from campus
part-time faculty from campus
part-time faculty in the vicinity of the center
other (specify)

If campus faculty are used at extension centers, please answer the following:
Is it considered part of their regular load? yes no
Is it considered overload? yes no
Are faculty required to teach at extension centers? yes no
Are they compensated for time spent traveling? yes no
If yes, how are they compensated?

Are they compensated for travel mileage?


yes no
If yes, how is the compensation calculated?

29. If employees of the main campus must travel to the extension centers, what type of
travel arrangements are used?
college vehicle
mileage paid (Please offer reimbursement rate)
meals
lump sum per semester
other (please explain)

153
Instmctional Scheduling 9 - 8
Institution

30. Do you pay an on site coordinator at each extension center? yes no


If yes, how much?
If yes, what are the responsibilities of this coordinator?

31. Has maintaining extension centers proved to be cost effective? yes no

Please offer any additional comments on extension centers here:

Dual/Concurrent Enrollment Classes

32. How many students are required in order to offer these classes?

33. Where is physical space for these classes provided?

34. Who teaches these classes?


Full-time college faculty
Full-time high school faculty
Part-time college faculty
Part-time high school faculty
If full-time college faculty is used, please answer these questions:
Are these classes part of their regular load? yes no
Are they required to teach such classes? yes no
Are these classes considered overload? yes no

35. What credentials must the academic instructors possess to be qualified to teach dual
credit classes for your institution?

36. What credentials must the technical instructors possess to be qualified to teach dual
credit classes for your institution?

154
Instructional Scheduling 9 - 9
Institution

37. Who receives compensation for teaching these dual credit classes?
high school system
high school faculty doing the teaching
college faculty doing the teaching
other (please explain)

Please offer any additional comments about dual/concurrent classes here:

Continuing Education Scheduling

38. On the average how many continuing education classes are offered at your institution
during the summer?
Type 1st Quarter 2nd Quarter 3rd Quarter 4th Quarter
Funded
Nonfunded

39. What is the process for scheduling these?

40. How are funds generated for nonfunded continuing education classes?

41. How are funds generated for funded continuing education classes?

42. Is it your goal to make a profit from the funded classes or break even?
profit break even

43. Is it your goal to make a profit from the nonfunded classes or break even?
profit break even

44. What number of students is required to make a nonfunded class?_

45. What number of students is required to make a funded class?

155
Instructional Scheduling 9-10
Institution

46. Are any of your continuing education noncredit courses offered simultaneously for
academic credit? yes no
If yes, which ones?

Please offer any additional comments on continuing education classes here:

156
Workforce 10-1
Institution

WORKFORCE DEVELOPMENT

Definitions:
Workforce Development — includes continuing education programs and customized
contract training for business and industry.
Workforce Continuing Education Programs — includes adult vocational education
courses offered by the College and approved by the Texas Higher
Education Coordinating Board.
Customized Contract Training — includes training and services which are
customized and contracted with business and industry to meet specific training
needs.

1. Which programs are offered at your institution?


Workforce Continuing Education Programs
Customized Contract Training

If you checked either of the above, please attach a list of your program classes and answer the
following questions.
The following questions are divided into the two categories defined above.
Please answer each column separately following the definitions given above.

Workforce Continuing Education Customized Contract Training

How are the needs for such classes How are the needs for such activities
determined? determined?

How are such classes staffed? How are such activities staffed?
Full-time faculty Full-time faculty
Part-time faculty Part-time faculty
Extemal consultants Extemal consultants
Other (explain please) Other (explain please)

If you use full-time faculty, are these If you use full-time faculty, are these
classes considered part of their load? activities considered part of their load?
yes no yes no
If no, how are they considered? If no, how are they considered?


Workforce Continuing Education Customized Contract Training

How are such classes funded? How are such activities funded?
Student tuition Student tuition
Special student fees Special student fees
Contracted with business Contracted with business
Other (explain please) Other (explain please)

Do you receive contact hour funding Do you receive contact hour funding
for these classes? yes no for these activities? yes no

What do the students receive for What do the students receive for
these classes? these activities?
academic credit academic credit
CEU credit CEU credit
Other (please explain) Other (please explain)

Do you strive to break even or make a Do you strive to break even or make a
profit in offering these classes? profit in offering these activities?
break even profit break even profit

Do you allow both continuing education Do you allow both continuing education
credit and academic credit to be given credit and academic credit to be given
in these classes? yes no in these activities? yes no

Identify the position which administers Identify the position which administers
the scheduling and staffing of these the scheduling and staffing of these
classes? activities?

How are such classes advertised? How are such activities advertised?
College catalog College catalog
Separate catalog or brochure Separate catalog or brochure
Mailouts Mailouts
Newspapers Newspapers
Contract with business Contract with business
Other (specify) Other (specify)

158
Workforce 10-3
Institution

Workforce Continuing Education Customized Contract Training


What percentage of this program's What percentage of this program's
budget is spent for marketing? budget is spent for marketing?

2. Based on the definitions given at the beginning, describe how your workforce
development efforts are administered (i.e., what the organizational chart looks like, how
the workforce development staff relates to the rest of the College, etc.) Please attach
workforce development organizational charts.

3. Describe how your workforce development efforts are evaluated. Please provide
samples of the evaluation tools if possible.

4. How is this evaluation used?

APPENDIX B

TASK FORCE BENCHMARKING HANDOUTS

Qualifications for a Benchmarking Team

1) functional expertise and a demonstrated level of job skills, or work-related
performance

2) the employee should have sufficient credibility in the institution, as judged by
subject-matter knowledge, employment history, and the level of position(s) held

3) have above-average communication skills, in order to communicate well with
other team members

4) need to have a high level of team spirit, including a sense of cooperation,
effective listening skills, an ability to reach a consensus, and respect for the
opinions of others

Spendolini, M. J. (1992). The Benchmarking Book. New York: American Management
Association, p. 54.

Starting a Benchmarking Process
In Belmont University's Quality Team Manual on Benchmarking in 1993, seven
points are listed for consideration before beginning to benchmark, paraphrased as
follows:

1. Is there already a focus in your work area or department around service,
employees, and continuous improvement of processes?

2. Is benchmarking the right strategy in this situation? (According to the
International Quality Study, world-class benchmarking is only suitable for already
high-performing organizations. Competitive or peer benchmarking is more
appropriate for low- or medium-performing organizations.)

3. What should you benchmark? Choose those processes that align with the
organizational mission and contribute to the organization's long-term success.

4. What should you measure? You are attempting to generate comparative
performance data; you are observing how they achieved those results.

5. What organization(s) should you benchmark?

6. How should you collect data? First, establish internal baseline performance
measures. Then be creative in tracking down other sources of data.

7. How can you implement what you learned? Determine the variances between
your processes and those benchmarked. Separate out, if necessary, factors unique
either to the benchmarked organization or to higher education. Then, develop a
mission statement for the process, and set clear goals and action plans (p. 63).

Alstete, J. W. (1995). Benchmarking in Higher Education: Adapting Best Practices To
Improve Quality. ASHE-ERIC Higher Education Report No. 5. Washington, D.C.: The
George Washington University Graduate School of Education and Human Development,
ERIC Clearinghouse on Higher Education, p. 63.

Baldrige Award Explanation

Benchmarking, examined in this category, is an approach for learning what other
organizations are doing and for getting new ideas. This section, 2.2, is worth fifteen points
and includes two areas which directly address benchmarking:

a) How competitive comparisons and benchmarking information and data
are selected and used to help drive improvement of overall company
performance, or specifically describing how needs and priorities are
determined, how criteria for seeking appropriate information and data
from within and outside the company's industry are chosen, how the
information and data are used within the company to improve
understanding of processes and process performance, and how the
information and data are used to set stretch targets and/or encourage
breakthrough approaches.

b) How the company evaluates and improves its overall process for selecting and
using competitive comparisons and benchmarking information and data to
improve planning and overall company performance, by using data obtained from
other organizations such as customers or suppliers through sharing, obtained from
the open literature, obtained by testing and evaluation by the company itself,
and/or obtained by testing and evaluation by independent organizations (p. 54).

Spendolini, M. J. (1992). The Benchmarking Book. New York: American Management
Association, p. 54.

Benchmarking for School Improvement

1. Can be used to rapidly address specific strategic goals of an institution.

2. Can be used to prevent "wheel reinvention" and to learn how other
organizations have addressed the issues and problems one is facing.

3. Can be used to improve curricula, instruction, learning outcomes, and
programs faster and more effectively when adjusting to higher standards of
achievement and accountability required by both state and federal law.

4. Can more efficiently implement federal and state mandates.

5. Can increase the benchmarking team's understanding of quality principles
and tools.

6. Can be used to practice one of the fundamental improvement strategies required
in the Malcolm Baldrige Quality Award criteria involving quality assessment.

Tucker, S. (1996). Benchmarking: A Guide for Educators. Thousand Oaks, CA: Corwin
Press, p. 8.

Robert Camp's Formal 10-Step Process for a Benchmark Study

1. Decide what to benchmark.

2. Identify whom to benchmark.

3. Plan and conduct the investigation—Determine what data are needed and how
to conduct the benchmarking investigation. Document the best practices found.

4. Determine the current performance gap—Decide how much better the best
practices are than the current work methods.

5. Project future performance levels—Decide how much the performance gap will
narrow or widen in the near future and what repercussions this has for the
organization.

6. Communicate benchmark findings and gain acceptance—Communicate the
findings to all those who have a need to know in order to gain acceptance and
commitment.

7. Revise performance goals—Convert findings into operational statements.

8. Develop action plans—Create specific implementation plans, measurements,
assignments, and timetables for taking action.

9. Implement specific actions and monitor progress—Implement the plan and
report progress to key process owners and management.

10. Recalibrate the benchmark—Continue to benchmark and update work
practices to stay current with ongoing industry changes.

Camp, R. C. (1995). Business Process Benchmarking. Milwaukee, WI: ASQC
Quality Press, pp. 19-22.
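Camp's step 4 can be made concrete with a small numeric sketch. The function and metric values below are hypothetical illustrations, not drawn from this study or from Camp's text; they simply show one common way a performance gap is expressed, as the percentage by which a benchmark partner's result exceeds one's own:

```python
# Hypothetical sketch of Camp's step 4 (determining the current performance gap).
# The metric name and values are invented for illustration only.

def performance_gap(own, benchmark):
    """Percentage by which the benchmark result exceeds our own result."""
    return (benchmark - own) / own * 100.0

# Example metric: registrations processed per staff member per day.
own_rate = 40.0
best_practice_rate = 55.0

gap = performance_gap(own_rate, best_practice_rate)
print("Performance gap: %.1f%%" % gap)  # prints "Performance gap: 37.5%"
```

A positive gap signals how far current work methods trail the best practice identified in step 3; step 5 then asks how that number is likely to move in the near future.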

Definitions of Benchmarking

• Benchmarking is the process of identifying someone or some organization that is
doing something better than you are doing, studying how they are doing it, and
adapting those procedures that would be most useful to reach your desired
outcome. The essence of modeling oneself against certain standards or processes
in order to produce a desired result is what benchmarking is all about. (p. xi)

• Benchmarking is analyzing performance, practices, and processes within and
between organizations and industries, to obtain information for self-improvement.

Alstete, J. W. (1995). Benchmarking in Higher Education: Adapting Best Practices To
Improve Quality. ASHE-ERIC Higher Education Report No. 5. Washington, D.C.: The
George Washington University Graduate School of Education and Human Development,
ERIC Clearinghouse on Higher Education, p. 20.
************

• Benchmarking is the continuous process of measuring our products, services, and
practices against our toughest competitors or those companies known as leaders.
(Xerox definition)

• Benchmarking is a standard against which something can be measured; a survey
mark of previously determined position used as a reference point. (dictionary
definition)

• Benchmarking is a structured way of looking outside to identify, analyze, and
adopt the best in the industry or function.

Camp, R. C. (1995). Business Process Benchmarking. Milwaukee, WI: ASQC
Quality Press, p. 18.

• Benchmarking is the search for best practices that lead to superior performance.

Camp, R. C. (1989). Benchmarking: The search for industry best practices that lead to
superior performance. Milwaukee, WI: Quality Press (ASQC), p. 12.

• Benchmarking is the ongoing, systematic process for measuring and comparing
the work processes of one organization to the operations of other organizations.

Dertouzos, M., Lester, R., Solow, R., and the MIT Commission on Industrial
Productivity. (1989). Made in America: Regaining the Productive Edge. Cambridge,
MA: The MIT Press, p. 84.

• Benchmarking is one method or tool to help create a new way of thinking, or
"paradigm," for higher education to make a substantial and sustainable change in
efficiency.

Keeton, M., & Mayo-Wells, B. (1994). Benchmarking for Efficiency in Learning.
AAHE Bulletin, 46(8), pp. 9-13.

• Benchmarking is an external focus on internal activities, functions, or operations
in order to achieve continuous improvement. Starting from an analysis of existing
activities and practices within the firm, the objective is to understand existing
processes, or activities, and then to identify an external point of reference, or
standard, by which that activity can be measured or judged (pp. 1-2).

Benchmarking transforms the change process into a continuous improvement
effort focused on the process, utilizing the knowledge of existing employees to
drive the change. It supports learning at the organizational level, which leads to
continued innovation and change (p. 115).

Benchmarking definitions suggest an external focus, a movement away from a concern
with cost reduction and budgets to an understanding of what activities customers value
and what level of performance they expect (p. 2).

Leibfried, K. H. J., & McNair, C. J. (1992). Benchmarking. New York: Harper Collins
Publishers.

• Benchmarking is a continuous, systematic process for evaluating the products,
services, and work processes of organizations that are recognized as representing
best practices, for the purpose of organizational improvement (p. 9).

• Benchmarking is a continuous process of "investigation that provides valuable
information...[by] learning from others [in] a time-consuming, labor-intensive
process requiring discipline [that produces] useful information for improving
virtually any business activity."

Spendolini, M. J. (1992). The Benchmarking Book. New York: American Management
Association, p. 33.

• Benchmarking is an improvement process in which an organization compares its
performance against best-in-class organizations, determines how those
organizations achieved their performance levels, and uses the information to
improve its own performance; the subjects that can be benchmarked include
strategies, products/programs/services, operations, processes, and procedures.

Pinellas County Schools. (1994). Superintendent's quality challenge. Largo, FL: Author,
p. 29.

• Benchmarking is the process...used to achieve superior performance—a team
research and data-driven process by which learning and innovation trigger
fundamental breakthroughs in thinking and practice.

Tucker, S. (1996). Benchmarking: A Guide for Educators. Thousand Oaks, CA: Corwin
Press, p. ix.

• Benchmarking is "the process by which organizations learn, modeled on the human
learning process" (as defined by a benchmarking program manager).

Watson, G. H. (1993). Strategic Benchmarking: How to rate your company's
performance improvement. Portland, OR: Productivity Press, p. 2.

What benchmarking is not:

• Benchmarking is not a quick review or a "look and see tour" of another school's
good program. It is a study of everything that contributes to the positive
outcomes in a school that has the very best outcomes in a particular educational
arena.

• Benchmarking is not a mechanism for reducing the school budget. Instead, it is a
mechanism for deploying resources in the most effective manner to achieve
customer (stakeholder) satisfaction.

• Benchmarking is not a cookbook program that requires only a recipe for success.
Instead, it is a discovery and learning process that can be used over and over again
to achieve different goals. It is a way of working and thinking in the school to
achieve continuous improvement.

Tucker, S. (1996). Benchmarking: A Guide for Educators. Thousand Oaks, CA: Corwin
Press, p. 3.
************

• Benchmarking does not mean cloning, without thought, the success of other
companies. What is best practice in one organization cannot readily be transferred
to another without a thorough understanding of the learning that has gone into
achieving the standard, and recognition of the impact of the process on the culture
of the organization, in terms of both customer and employee reactions.

Cook, S. (1995). Practical Benchmarking. London: Kogan Page, p. 24.

************

Possible Types of Benchmark Studies
The four types of benchmarking: internal, competitive, functional, generic.

Alstete, J. W. (1995). Benchmarking in Higher Education: Adapting Best Practices To
Improve Quality. ASHE-ERIC Higher Education Report No. 5. Washington, D.C.: The
George Washington University Graduate School of Education and Human Development,
ERIC Clearinghouse on Higher Education, p. 10.
************

Three types of benchmarking: internal, competitive, and functional.

Competitive—direct competitors selling to the same customer base.
Advantages: information relevant to business results, comparable
practices/technologies, history of information gathering.
Disadvantages: data collection difficult, ethical issues, antagonistic
attitudes.
Spendolini, M. J. (1992). The Benchmarking Book. New York: American Management
Association, p. 17.
************

Four types of benchmarking: internal, competitive, non-competitive, and best practice.

Internal—measuring and comparing company data on similar practices from other
parts of an organization.
Competitive—benchmarking against direct competitors.
Non-competitive—a process of measuring and comparing a related process in a
non-competitive organization, a related process in a different industry, and
an unrelated process in a different industry.
Best practice/world class—involves learning from best-practice or world-class
organizations—the leaders of the process being benchmarked.
Cook, S. (1995). Practical Benchmarking. London: Kogan Page, pp. 18-20.
************

Types of benchmarking: internal, competitive, functional (generic).

Types of information: services, work processes, support functions, organizational
performance like costs (expenses) and income (revenue), and strategy.
Uses of information: strategic planning, forecasting, new ideas, product/process
comparisons, goal setting. The intended use of the benchmarking information
affects the amount of effort required to identify and collect that information. The
uses also significantly affect your choice of benchmarking partners and the types of
questions you ask them.
Quality of information: Most benchmarking studies do not rely on the report of a
single individual to reflect the workings of an entire organization.
Benchmarking's rationale is that the effort necessary to identify expert and reliable
benchmark resources be made. Limiting a benchmarking activity to a smaller population
of potential organizations greatly reduces the quantity of information generated.
Scope of benchmarking activity: one-time event, periodic, continual.

Spendolini, M. J. (1992). The Benchmarking Book. New York: American Management
Association, pp. 62-65.
************

Chris Millard, Logistics Director at the Rover Body and Pressings plant, recognizes that
having a clear focus is vital: "If you only benchmark an operation in total terms, you will
miss. You examine and establish benchmarks for the processes which are the drive in
achieving the targets set for the overall operation."

Cook, S. (1995). Practical Benchmarking. London: Kogan Page, pp. 18-20.

APPENDIX C

TASK FORCE POSSIBLE BENCHMARK AREAS

POSSIBLE AREAS FOR BENCHMARKING

NACUBO Benchmarking Program Functional Areas

Academic affairs Accounts payable Intercollegiate athletics
Alumni relations Bookstore Career planning & placement center
Collections Central stores Central budget department
Development office Facilities Environmental health and safety
Financial aid Food services Human resources: Benefits Admin,
Admissions Legal affairs Human resources: General
Library Mail Human resources: Hiring
Parking Payroll Intramural & recreational sports
Police/security Purchasing Multi-campus system administration
Registrar Sponsored projects Student counseling
Student affairs Student health services Telecommunications
Student housing Students accounts receivable/student billing

Peterson's AGB Strategic Indicators Survey Major Headings

Revenue
Tuition and fee income
Government appropriations
Government grants and contracts
Private gifts, grants and contracts
Endowment support for operations
Sales/service—educational activities
Sales/service—auxiliary enterprises
Sales/service—hospital
Other sources

Current expenditures by function

Instruction
Research
Public service
Academic support
Student services
Institutional support
Plant operations and maintenance
Expenditures—auxiliaries
Expenditures—hospital
Expenditures—independent operations
Student financial aid
Total expenditures

Current expenditures by object


Wages and salaries (by employee type)
Fringe benefits
Interest paid for outside entities

Balance sheet: Assets


Current funds
Endowment book value
Plant and equipment
Other assets
Total assets

Balance sheet: Liabilities & Fund Balances


Current liabilities
Short-term debt to outside entities
Long-term debt to outside entities
Fund balances
Total liabilities and fund balances

Physical Plant
Financial:
Beginning of year value
Depreciation for the year
Retirement of plant
Additions to plant
End of year value
Plant inventory and condition
Estimate deferred maintenance

Libraries & Information Resources


Library holdings:
Books and monograph volumes
Journal subscriptions
Information sources:
Microcomputers for student use

Endowments
Beginning of year market value
Return on investment
Other additions to endowment
Subtractions from endowment
Normal support for operations
Special uses
End of year market value

Students
Fall enrollments (headcount & FTE, by level)
Fall FTE enrollment, by EEOC & level
Fall FTE enrollment by gender & level
Fall FTE enrollment by field of study & level
Degrees awarded by level
Admissions data for the full year:
Number of applications
Number of offers of admission
Number of matriculants
Geographical dispersion of entering student by level
Number of states represented
Student head count from home state
Students from outside the U. S. and Canada
Tuition and financial aid
Published charges
Financial aid headcounts by type of aid
Financial aid dollars by type of aid

Faculty and Staff
Faculty numbers (full and PT by rank)
Regular faculty FTE
By field & rank
By EEOC category & rank
By gender & rank
Percent faculty over 60 years old
Faculty gains and losses for year
By rank
Headcount at the beginning of the year
In-hire
Voluntary termination
Termination by death or disability
Termination by the institution
Change, i.e. non-tenured to tenured
Headcount at the end of the year
Sponsored research
Expenditures for organized research
U. S. Government (by major agency)
State & local government agencies
Domestic corporations & foundations
Other domestic private foundations
Foreign governments & corporations
Bequests and gifts from living individuals
Other outside sponsors
Institutional funds
Academic year faculty salary offsets
% of regular faculty members who are principal investigators on sponsored projects
Research proposal and award statistics
Proposals sent to potential outside sponsors
Awards received from outside sponsors
Fundraising
Dollars raised during the year by source
Dollars raised during the year by use
Designated or restricted dollars for current operations
Designated or restricted dollars for student aid
Designated or restricted dollars for plant
% of living alumni who are active donors

APPENDIX D

FINAL BENCHMARKING REPORT

BUSINESS OFFICE RESULTS
(Six colleges reporting)

Unrestricted monies: (Quest. 1-2) (R=6)


Total unrestricted 1995-96 educational and general budget
Range: $16,815,107-$39,990,027
Average: $26,297,000
Median: $23,790,697

Total unrestricted 1994-95 audited financial statement


Range: $16,606,345-$38,558,402
Average: $23,162,000
Median: $21,398,102
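Throughout this appendix, each questionnaire item is summarized by its range, average, and median across the reporting colleges. As a hedged illustration of how those three figures are derived, the sketch below uses six hypothetical budget values (not the study's data, though chosen to bracket the reported figures):

```python
# How Range/Average/Median summaries of this kind are computed.
# The six budget figures below are invented examples, not the study's data.

def summarize(values):
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    # Median: middle value for odd n, mean of the two middle values for even n.
    median = ordered[mid] if n % 2 else (ordered[mid - 1] + ordered[mid]) / 2
    return {
        "range": (ordered[0], ordered[-1]),
        "average": sum(ordered) / n,
        "median": median,
    }

budgets = [16815107, 18500000, 22000000, 25581394, 35000000, 39990027]
stats = summarize(budgets)
print(stats["range"])   # (16815107, 39990027)
print(stats["median"])  # 23790697.0
```

With an even number of reporting colleges, the median is the mean of the two middle values, which is why the reported medians can fall between any two individual colleges' figures.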

Budget Expenditures: (Quest. 3-4) (R=6)


Dollar Amount Percentage of Budget

Equipment Range: $100,000-$1,140,372 Range: .4%-3.23%
Average: $481,086 Average: 1.44%
Median: $186,362 Median: .78%

Technology Range: $83,532-$647,605 Range: .5%-3.1%
Equipment Average: $383,525 Average: 1.55%
Median: $400,000 Median: 1.7%

Expendable Range: $359,000-$2,832,369 Range: 1.7%-6.68%
Supplies Average: $1,133,993 Average: 3.87%
Median: $883,432 Median: 3.4%

Maintenance Range: $1,005,276-$3,826,943 Range: 4.5%-11.6%
(Physical plant Average: $2,521,000 Average: 9.94%
operations) Median: $2,488,507 Median: 11.18%

Personnel Range: $12,161,114-$29,608,590 Range: 60.87%-76.83%
(including Average: $19,841,000 Average: 70%
fringe benefits) Median: $21,474,420 Median: 72.53%

Personnel Range: $11,159,114-$26,632,590 Range: 57.81%-75.58%
(excluding Average: $16,965,000 Average: 65.2%
fringe benefits) Median: $14,227,829 Median: 65.26%

Personnel Range: $21,741-$2,985,000 Range: 1.25%-8.1%
fringe benefits Average: $1,425,000 Average: 5.33%
Median: $1,081,084 Median: 6%


Professional Range: $19,000-$238,965 Range: .1%-1%
Development Average: $82,149 Average: .36%
Median: $35,316 Median: .16%

Travel Range: $25,033-$359,663 Range: .01%-1.2%
Average: $187,925 Average: .71%
Median: $149,452 Median: .43%

Utilities Range: $737,870-$2,147,079 Range: 3.18%-5.6%
Average: $1,166,515 Average: 4.41%
Median: $956,276 Median: 4.1%

Other areas of budget with 5% or more allocated:


Performance Grants
Other Operating Expenses
Debt Service
General Administration—8.79%
Student Services—6.77%
Instruction—61.64%
Physical plant—10.79%

Budget Priorities: (Quest. 5) (R=6)


Category Mode Rank # Colleges Average Rank
for that Rank
equipment 3 2 3.8
expendable supplies 3 2 4.8
fringe benefits 2 2 3.5
personnel (existing) 1 4 1
personnel (new positions) 2 2 4

travel 8, 5 2 each 6.5
technology 6, 4 2 each 4.2
maintenance 5 2 5

Cost-cutting efforts: (Quest. 6) (R=6)


Utility costs
• Energy management system
• Comprehensive management program
• New thermal storage system with industrial heat pump
• Computerized air conditioning controls

Efficient lamps and fixtures
Computerized heating controls
Computerized electrical system
Revised computerized water irrigation system
Consolidated courses to fewer buildings
Closed Friday in summer

Training and equipment costs


Consolidated computer labs
Budget limits for functional areas based on revenue limitations
Purchasing department contracts for items at lowest cost for quality
needed
Budget monitoring and review by fiscal area of purchases in all areas
Equipment must be prioritized; no increase given
Full justification of need before purchasing
Require cost analysis of repair vs. replacement purchase
Redistribute old technology equipment when appropriate
Finding grants and private sources to fund
Participation in county consortium
Closely monitored bidding
Use of in-house personnel for staff development/training

Expendable supplies costs


• Budget limits for functional areas based on revenue limitations
• Purchasing department contracts for items at lowest cost efficiency
• Budget monitoring and review by fiscal area of purchases
• No increase in budgets
• Purchases in larger quantities when possible
• Require competitive price quotes
• Cut off current year spending around June 1
• Using e-mail for campus bulletins and communications
• Using no colored paper
• Participation in county consortium
• Closely monitored bids/quotes
• Recycling

Technology
Technology Master Plan
$10 fee per student per semester
Budget limits for functional areas based on revenue limitations
Purchase through competitive bid process
Group all technology equipment together for better purchasing price
Budget monitoring and review by fiscal area for purchases
Thorough research of most productive/economical needs/goals

Personnel
Position justification
Budget limits for functional areas
Eliminated 18 positions for 95-96
Require two weeks lapse before replacing terminated employees
Hire all new employees at no more than three steps above entry level
Review faculty positions when faculty leave or retire
Replace full-time faculty with part-time
Flatter organizational structuring

Travel
Zero budgeting; no budget increases
Budget limits for functional areas based on revenue limitations
Require employees and students to travel together
Require use of rental vehicle when cost is less than personal vehicle
Budget monitoring and review by fiscal area
All out of state travel require Presidential approval
Percentage cuts during budget development

Maintenance
Preventive maintenance program
No increase in budgets
Comprehensive facilities survey to identify deferred maintenance costs
Eliminate maintenance contracts; hired three computer technicians and one
vehicle mechanic
Set up maintenance on time and materials budget
Reduce full-time staff and replace with part-time staff and student
workstudy
Increased preventive maintenance
Computerized preventive scheduling
Monitoring in-house maintenance cost vs. "outsourcing costs"

Travel

Policies (Quest. 7) (R=6)


For administration:
• Zero budgeting
• Supervisor approval
• See attached policies in study notebook

For faculty:
Formula=$300 per area/$140 per faculty/$1000 per division
Supervisor approval
See attached policies in study notebook

For staff:
• Zero budgeting
• Supervisor approval
• See attached policies in study notebook

For students:
• Zero budgeting
• Supervisor approval
• See attached policies in study notebook

Institutional Budget (Quest. 8-9) (R=6)


Percentage of budget spent on travel
Range: 2%-.43% Average: 1.67%

Dollar Amount spent on travel


Range: $400,000-$72,920 Average: $254,186

Percentage of travel budget by area:


Area Range Average
Instructional 30%-.082% 8.76%
Institutional 26%-.359% 9.83%
Professional 92%-26% 56.5%
Student 18%-.2% 11.4%

Travel Money Allocation (Quest. 10) (R=6)


For faculty:
Two colleges allocate by use of a formula:
• $300 per area/$140 per faculty/$1000 per division
• $190 per faculty, but not all faculty travel

Four colleges allocate by individual budgets as requested by:


• Chair
• Department
• Dean/administration within budget guidelines

Three colleges consider extra/additional requests:


• As approved by VP
• As required
• As requested through the administration

Two colleges:
• Allocate by those responsible for performance
• Application for professional development and institutional monies

For administration:
Five colleges allocate by individual budgets:
• Using zero budgeting
• Budget requests
• As approved by dean/administration within budget guidelines

Three colleges consider extra/additional requests by:


• VP approval
• Requirement
• Request through administrative channels

Two colleges:
• Approve administrative travel annually
• Allow for application for professional development and institutional
monies

For classified:
Four colleges allocate travel money by individual budgets as
• Requested by chair
• In department annual budget
• With dean/administrative approval within budget guidelines

Three colleges consider extra/additional requests for travel as:


• Approved by VP
• Required
• Requested through administrative channels

At one college all travel is considered professional development.

Purchasing

Group purchasing (Quest. 12-14) (R=5)


Five colleges group purchase in
• Supply areas
• Medical supply area
• Computer equipment area
• Paper products
• Chemical products
• A consortium

Five colleges combine with
ISD's
Local city governments
Hospitals
County governments
Consortiums
State General Services Commission

Six colleges are interested in further group purchasing, especially in the area of
technology.

Insurance other than TIGIP


(Quest. 15-17) (R=6)

Self insure
Four colleges self insure
• Three in the area of Worker's Compensation
• One only in the deductible portion

Bid process for liability and/or worker's compensation


All six colleges undertake the bid process at these varying times
• Every three years for three colleges
• Every ten years for one college
• Annually for one college
• Contract 1 year with option to renew for an additional 2 years, not to exceed 3
years

Insurance purchased for college employees


Five colleges purchase some other types of insurance coverage for their employees:
• General liability—Errors & Omissions
• Long-term disability
• Health Insurance
• Dental Insurance
• Medical malpractice for Health Sciences
• Supplement premium for employee dependents' health and life insurance
• Cancer, term-life
• Educator's legal liability
• Public employee dishonesty/forgery or alteration (Crime Policy)
• Unemployment compensation
• Worker's compensation

COMPUTER TECHNOLOGY RESULTS
(Six colleges reporting)

Record keeping and accounting: (Quest. 1) (R=5)


Database:
Indexed files
VSAM FILE STRUCTURE
Hewlett-Packard Turbo-Image—Student and Human Resources Records
DEC Open VMS ISAM-Financial Records
Unidata
RMS

Programming language:
• Four colleges use COBOL
• One college each uses
• 4GL Report Writer (QUIZ)
• Unibasic
• DATATRIEVE
• FOCUS

Table D.1. Administrative computing

Key:
• Administrative software: H=Homegrown; P=Proprietary (brand name given if P)
• Ad hoc reporting: N=Not possible; D=Outsourced DP staff; L=Local (college) DP staff; O=College non-DP staff
• Accessible by: Q=Query language; R=Report generator; D=Downloading files; O=Open Data Base Connectivity

Prospective Students
• Software: two colleges H; one college each P-POISE, Datatel Colleague, P-SCT
• Ad hoc reporting: five colleges use Local DP staff; one college uses College non-DP staff
• Accessible by: four colleges Downloading files; three Query language; two Report generator; one Open Data Base

Registration
• Software: two colleges H; one college each P-POISE, P-Defunct, Datatel Colleague, P-SCT
• Ad hoc reporting: six colleges use Local DP staff; two use College non-DP staff
• Accessible by: five colleges Downloading files; four Query language; three Report generator; one Open Data Base

Transcripts
• Software: two colleges H; one college each P-POISE, P-Defunct, Datatel Colleague, P-SCT
• Ad hoc reporting: six colleges use Local DP staff; two use College non-DP staff
• Accessible by: four colleges Downloading files and Report generator; three Query language; two Open Data Base

Personnel
• Software: two colleges H; one college each P-Defunct, Datatel Colleague, P-SCT
• Ad hoc reporting: five colleges use Local DP staff; one uses College non-DP staff
• Accessible by: three colleges Downloading files and Report generator; two Query language and Open Data Base

Payroll
• Software: two colleges H; one college each P-POISE, P-Defunct, Datatel Colleague, P-SCT
• Ad hoc reporting: six colleges use Local DP staff; two use College non-DP staff
• Accessible by: four colleges Downloading files, Report generator, and Query language; one Open Data Base

General Ledger
• Software: one college each H; P-CUFS; IAFRS; P-SCT IA+; Datatel Colleague; P-SCT
• Ad hoc reporting: six colleges use Local DP staff; three use College non-DP staff
• Accessible by: five colleges Downloading files; three Report generator; two Query language; one Open Data Base

Accounts Payable
• Software: one college each H; P-CUFS; IAFRS; P-SCT IA+; Datatel Colleague; P-SCT
• Ad hoc reporting: six colleges use Local DP staff; three use College non-DP staff
• Accessible by: five colleges Downloading files; three Report generator; two Query language; one Open Data Base

Accounts Receivable
• Software: one college each H; P-CUFS; P-SCT IA+; Datatel Colleague; P-SCT
• Ad hoc reporting: five colleges use Local DP staff; three use College non-DP staff
• Accessible by: five colleges Downloading files; three Report generator; two Query language; one Open Data Base

Financial Aid
• Software: one college each H; P-POISE; SAFE ACT; P-Defunct; Datatel Colleague; P-SCT
• Ad hoc reporting: six colleges use Local DP staff; two use College non-DP staff
• Accessible by: five colleges Downloading files; four Query language; three Report generator; one Open Data Base

Inventory
• Software: five colleges H; one college Datatel Colleague
• Ad hoc reporting: five colleges use Local DP staff; one uses College non-DP staff
• Accessible by: three colleges Downloading files and Query language; one Report generator; one reports Not possible

Library
• Software: one college each P (data res.); H; DRA; Dynix; P-CLSI
• Ad hoc reporting: three colleges Not possible; two use Local DP staff; one uses Outsourced DP staff
• Accessible by: two colleges Report generator; one Downloading files

Purchasing
• Software: one college each H; IA; P-SCT IA+; Datatel Colleague; P-APS
• Ad hoc reporting: five colleges use Local DP staff; two use College non-DP staff
• Accessible by: three colleges Downloading files; two Report generator and Query language

Bookstore
• Two colleges: N/A in all three columns
• Software: one college P-RAYCEE
• Ad hoc reporting: one college uses Downloading files
• Accessible by: one college Report generator

Academic Advising
• Software: two colleges H; one college each P-POISE, P-Defunct, Datatel Colleague
• Ad hoc reporting: five colleges use Local DP staff; two use College non-DP staff
• Accessible by: four colleges Query language and Downloading files; three Report generator

Table D.2. Maintenance and service: (Quest. 4) (R=6)
Equipment   In-house        Service contracts   Other/outsourced
Printers    Six colleges    Three colleges
PCs         Six colleges    Two colleges
LAN         Six colleges    Two colleges        One college
WAN         Four colleges   One college         One college

Table D.3. Microcomputers primarily for students: (Quest. 5) (R=4)

Total:
• 8088/286: R=10-350, Avg.=124
• 386: R=50-197, Avg.=129
• 486: R=265-600, Avg.=363
• Pentium: R=99-260, Avg.=155
• Apple II: R=30, Avg.=30 (2 coll.)
• Mac: R=4-60, Avg.=270
• Power Mac: R=0-10, Avg.=6.6
LAN connected:
• 8088/286: R=0-150, Avg.=45
• 386: R=35-110, Avg.=69
• 486: R=150-400, Avg.=233
• Pentium: R=99-260, Avg.=149
• Apple II: R=0-30, Avg.=15 (2 coll.)
• Mac: R=0-60, Avg.=24 (2 coll.)
• Power Mac: R=0-10, Avg.=5

Table D.4. Microcomputers for faculty:

Total:
• 8088/286: R=34-100, Avg.=67
• 386: R=32-35, Avg.=33.5
• 486: R=17-180, Avg.=99
• Pentium: R=21-60, Avg.=42
• Apple II: 4
• Mac: R=2-20, Avg.=7.5
• Power Mac: R=2-5, Avg.=3.5
LAN connected:
• 8088/286: 20
• 386: R=35, Avg.=35
• 486: R=60, Avg.=113
• Pentium: R=21-55, Avg.=35
• Apple II: 0
• Mac: R=3-20, Avg.=9
• Power Mac: 5

Table D.5. Microcomputers for staff:

Total:
• 8088/286: R=2-50, Avg.=24
• 386: R=0-25, Avg.=7
• 486: R=75-200, Avg.=111
• Pentium: R=10-65, Avg.=37
• Apple II: 0
• Mac: R=1-2, Avg.=1.5
• Power Mac: 0
LAN connected:
• 8088/286: 20
• 386: R=2-25, Avg.=13.5
• 486: R=20-90, Avg.=65.5
• Pentium: R=10-65, Avg.=31
• Apple II: 0
• Mac: 2
• Power Mac: 0
Personnel Questions

Table D.6. Computer personnel breakdown
(Columns: Job Title, Percent Time, Hourly Pay Index, Pay Period: H=Hourly, M=Monthly)
Secretary (example) 100% 100 H
Programmer-analyst (example) 75% 240 M
Network Administrator 100% 272 M
Network Analyst 100% 222 M
Secretary I 100% 137.5 M
Operator I 100% 159.5 M
Operator II 100% 175 M
Micro Specialist 100% 193.3 M
Programmer/Analyst 100% 202.9 M
Programmer/Analyst I 100% 213.3 M
Asst. Syst. Programmer 100% 213.3 M
Systems Programmer 100% 224.2 M

Asst. Director 100% 224.2 M

Director 100% 253.5 M

Secretary 100% 100 Semi-monthly

Director 100% 295 Semi-monthly

Systems Analyst 100% 248 Semi-monthly

Operations Manager 100% 197 Semi-monthly

Programmer/Analyst 100% 180 Semi-monthly

Computer Operator 100% 123 Semi-monthly

Microcomputer Support Spl. 100% 156 Semi-monthly

Microcomputer Repair Tech 100% 148 Semi-monthly

Data Network Spl. 100% 148 Semi-monthly

Director, Computer Services 100% M
Systems Analyst 100% M
Programmer Analyst 100% M
Computer Programmer 100% M
Coordinator, Network Services 100% M
PC Supervisor/LAN Manager 100% M
PC Specialist 100% M

Programmer Analyst II 100% 227 H


Programmer I 100% 159 H
Telecom Tech II 100% 144 H
Telecom Specialist 100% 148 H
Hardware/Software Tech I 100% 144 H
Hardware/Software Tech II 100% 157 H
LAN Technician 100% 175 H

Policy Questions
Definition:
Technology—Telecommunications services such as telephone and data services,
instructional equipment, such as audio-visual equipment, and computer equipment.
General technology fee: (Quest. 7) (R=6)
Four colleges have general technology fees.
• Two colleges: $10 per semester
• Three colleges: $3 per semester hour
The fees give students access to:
• Computers and/or technical equipment
• Technology on campus
• Nothing; the fee is used to stay current with new computer technology in computer labs.
Three colleges dedicate the fees to the purchase or support of technology.
No college encumbers the fee for support personnel.
No college encumbers the fee for maintenance/upgrades.
Two colleges charge lab fees, which include technology usage, separate from a technology
fee.
• One of these colleges uses the fees for English, Office Occupations, and Information
Systems
Network accounts for students: (Quest. 8) (R=6)
Four colleges provide students with network accounts, including:
• Four: E-mail
• Three: WWW
• Two: Telnet
Three colleges tie these accounts to
• Courses
• Generic accounts
Computers provided at the institution's expense: (Quest. 9) (R=6)
Five colleges provide student computers; usage is provided by
• Budget
• In labs and classrooms
• Vocational-Technical Majors
• Open labs on a first-come, first-served basis
• Capital Equipment Task Force
Six colleges provide faculty computers; usage is provided by
• Two colleges in the budget
• One college each
• Being full-time faculty
• Six machines on 3-week-or-less loan, first come, first served; budget requests
• Based on need and availability of funds
• Capital Equipment Task Force
Six colleges provide administrative computers; usage is determined by
• Two colleges by budget
• One college each by
• Job Description
• Being full-time faculty
• Based on need and availability of funds
Six colleges provide clerical computers; usage is determined by
• Two colleges determine by budget
• One college each determines by
• Being full-time faculty
• Based on need and availability of funds
• Need based on Job Description
Replacement policy for microcomputers: (Quest. 10) (R=6)
In labs/classrooms:
• 20%/year
• As funds allow
• No policy in place
• As needed
• Software drives replacement
• Still under development
In offices:
• 25%/year
• As funds allow
• No policy in place
• As needed
• Software drives replacement
• Still under development
Ratio of service providers: (Quest. 11) (R=6)
Two colleges 1:100
One college each
• 1:248
• 1:260
• 1:300
• 1:350
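The ratios above express how many machines each support-staff member covers. As a minimal sketch of how such a "1:N" figure is derived (the machine and technician counts below are hypothetical examples, not the surveyed colleges' data):

```python
# Sketch of deriving a "1:N" service-provider ratio (machines per
# support technician). The counts passed in below are hypothetical,
# not figures reported by the surveyed colleges.
def support_ratio(machines: int, technicians: int) -> str:
    """Return the staff-to-machines ratio as a '1:N' string."""
    if technicians <= 0:
        raise ValueError("need at least one technician")
    return f"1:{round(machines / technicians)}"

print(support_ratio(496, 2))  # two technicians covering 496 machines -> 1:248
print(support_ratio(300, 1))  # -> 1:300
```

The rounding step reflects that the reported ratios are whole-number approximations.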
Remote access to college's network: (Quest. 12) (R=6)
Only one college provides remote access to the college's network, at no charge, for both
faculty and staff but not students.
Computer operating systems: (Quest. 13) (R=6)
Six colleges provide Windows 3, 3.1, 95
Four colleges provide DOS
Two colleges provide UNIX
One college each provides
Macintosh
MS-DOS
Apple IIOS
VMS
VSE
VM
Hewlett Packard MPE/XL
DEC open VMS
Windows NT
Novell 4.x, 3.X;
Novell
Solaris
X86
VAX
DOS 6.22
Network operating systems: (Quest. 14) (R=6)
• Four colleges support NOVELL
• Three colleges support
• UNIX
• Windows NT
• Two colleges support
• Netware 3x & 4x

• One college each supports
• OS/2
• Solaris X86

Student contracts: (Quest. 15) (R=6)


Two colleges require students to sign contracts before using the institution's technology
system.
• The contract allows them to use
• Two colleges
• Network
• E-mail
• One college each
• Copyrights
• Security
• Student privacy rights

Internet policy: (Quest. 16) (R=6)


Three colleges have policies regarding usage of the Internet.
(One college attached a copy of their policy.)

Machines on primary network: (Quest. 17) (R=6)


• R=250-1200
• Avg.=796
• M=900

Training

Training procedures for employees: (Quest. 18) (R=6)


One on one when time available
Small groups in Alpha center
Periodic classes
In-house workshops
Tuition-free instruction is provided on a voluntary basis
As needed

Mandatory training: (Quest. 19-20) (R=6)


No college has mandatory training on hardware or software.

End-users training: (Quest. 21) (R=6)


Six colleges are training end-users by
Small groups
One on one as needed
With periodic classes
Consulting
In-house training is provided to all users of the college's host systems
In-house workshops
Internally through staff development classes

Budget
Technology budget: (Quest. 22) (R=5)
• R=$200,000—$3,000,000
• Avg.=$1,554,600
• M=$1,642,031
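Throughout this appendix, R denotes the range of responses, Avg. the mean, and M the median. A minimal sketch of how such a summary line is produced (the budget figures below are hypothetical placeholders, not the colleges' reported values):

```python
# Sketch of the R (range), Avg. (mean), and M (median) summary
# statistics used throughout this appendix. The budget list below is
# a hypothetical placeholder, not the surveyed colleges' data.
from statistics import mean, median

def summarize(values):
    """Format a list of dollar amounts as R/Avg./M summary strings."""
    return {
        "R": f"${min(values):,}-${max(values):,}",
        "Avg.": f"${round(mean(values)):,}",
        "M": f"${round(median(values)):,}",
    }

budgets = [200_000, 900_000, 1_600_000, 2_100_000, 3_000_000]  # hypothetical
print(summarize(budgets))
# -> {'R': '$200,000-$3,000,000', 'Avg.': '$1,560,000', 'M': '$1,600,000'}
```

With an odd number of responses the median is the middle value; the same three statistics apply to the percentage and contact-hour summaries elsewhere in the appendix.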

Percentage is telecommunications: (Quest. 23) (R=4)


• R=.85%-40%
• Avg.=14.6%
• M=8.2%

Percentage instructional equipment: (Quest. 24) (R=5)


• R=15%-50%
• Avg.=29%
• M=28%

One college reports:


None of the Computing Services Dept. budget is for instructional equipment. We
don't have a "technology budget"; deans and depts. have equipment budgets which
include funds for computers/technology.

Percentage is computer equipment: (Quest. 25) (R=5)


• R=2%-30%
• Avg.=14%
• M=12%

Percentage is maintenance: (Quest. 26) (R=5)


R=11%-20%
Avg.=16%
M=16.5%

Technology budget listed: (Quest. 27) (R=5)


• Four colleges list the technology budget as distributed among departments.
• Two colleges list the technology budget as
• A centralized line item
• Included with all other equipment items

Technology budget includes: (Quest. 28) (R=5)


Five colleges' technology budgets include:
• Computers
• LAN

Four colleges' technology budgets include:


• Telephones
• Instructional equipment services (audio-visual equipment)
• WAN

Service

Central location for service assistance: (Quest. 29) (R=5)


Four colleges call a central location on a campus for service assistance.

Approximate time lapses: (Quest. 30) (R=5)


Hardware problems: (R=5)
• R=1/2 hr.-days
• Avg.=3.6 hrs.
• One college reports hours to days.

Printer problems: (R=5)


• R=1/2 hr.-days
• Avg.=5 hrs.
• One college reports hours to days.

Software problems: (R=5)


• R=1/2 hr.-days
• Avg.=3.6 hrs.
• One college reports hours to days.

Connections to LAN: (R=3)


• R=1/2 hr.-16 hrs.
• Avg.=5.8 hrs.
• One college reports hours to days.

Connections to WAN: (R=2)


• R=1/2 hr.-days

Experienced problems: (Quest. 31) (R=4)


• Four colleges experience problems in
• Software updates
• Software installation

• Three colleges experience problems in hardware updates

• Two colleges experience problems in hardware installation

Resolving the problems: (Quest. 32) (R=2)


One college each resolves problems in
• Software updates by
• Centralized software installation
• Implement software standards

• Software installation by
• Centralized software installation
• Implement software standards

Hardware updates by centralized hardware installation

Additional comments on computer service:
• QUESTION #31: We experience problems in all areas. We fix the problems as they
become known and have that knowledge when we face a similar situation. We try
to implement network and hardware standards and are currently
implementing a help desk program.
• Our problem is numbers of machines versus numbers of technicians.

DEVELOPMENTAL LABS, INSTRUCTION & TUTORING QUESTIONS

Developmental Instruction
(Six colleges reporting)

Table D.7. Contact hours, employees, students: (Quest. 2) (R=6)

English/Writing:
• Contact hours: R=15,984-104,528, Avg.=47,510.9
• Number of employees: R=5+-69, Avg.=24
• Number of students: R=313-2307, Avg.=1024
Math:
• Contact hours: R=45,344-119,744, Avg.=77,034.6
• Number of employees: R=14-37, Avg.=24.8
• Number of students: R=821-3025, Avg.=1938
Reading:
• Contact hours: R=7,424-67,376, Avg.=34,330.7
• Number of employees: R=3-18, Avg.=11
• Number of students: R=185-1088, Avg.=615
ESL:
• Contact hours: R=7,168-14,880, Avg.=11,029
• Number of employees: R=5-13, Avg.=9
• Number of students: R=92-224, Avg.=151

FTE faculty in developmental area: (Quest. 3) (R=6)


R=13-45
Avg.=25

FTE support staff in developmental area: (Quest. 4) (R=6)


R=0-5
Avg.=2.6

Percentage of remediation done through noncourse offerings: (Quest. 5) (R=6)


R=0%-10%
Avg.=7.3%

FTE faculty in the noncourse offerings: (Quest. 6) (R=6)


R=0-3
Avg.=l

FTE support staff in the noncourse offerings: (Quest. 7) (R=6)


R=0-2.25
Avg.=l

Developmental department/division: (Quest. 8) (R=6)
Five colleges have developmental classes that are offered within the various
departments/divisions (English and math).
Two colleges have separate departments/divisions for developmental classes.
One college each has
• A Developmental division which coordinates developmental advising and
houses ESL, a Learning Center, and Reading Remediation
• A Developmental Education Department which coordinates reading, study
skills, and ESOL.

Assistants in developmental classes: (Quest. 9) (R=6)


Two colleges use graders or assistants in developmental classes
• Both in math (2 and 5, respectively) to grade homework papers.
• One in reading (1) to occasionally score homework assignments.

Charge for developmental classes and labs: (Quest. 10) (R=6)


• Six colleges use tuition plus lab fees as developmental class charges.
Five colleges use the regular tuition rate charge for developmental classes.
• One college has no charge for noncourse based remediation.

Developmental class loads: (Quest. 11) (R=6)


• Seven colleges figure developmental class loads based on contact hours.
• Two colleges figure developmental class loads based on credit hours.
• One college figures ESL course load by a combination of credit hours and contact
hours.

Table D.8. Percentage of developmental classes taught: (Quest. 12) (R=6)

Writing:
• Full-time instructors: R=54%-100%, Avg.=77.6%, M=80%
• Part-time instructors: R=1.7%-46%, Avg.=25.5%, M=28%
Reading:
• Full-time instructors: R=5.6%-100%, Avg.=61.4%, M=69%
• Part-time instructors: R=12%-94.4%, Avg.=46%, M=38%
Math:
• Full-time instructors: R=21%-100%, Avg.=60.2%, M=60%
• Part-time instructors: R=14%-79%, Avg.=46.4%, M=45%
Other (ESL):
• Full-time instructors: R=42.9%-69%, Avg.=51.6%
• Part-time instructors: R=31%-57.1%, Avg.=48.4%

Developmental class size limit: (Quest. 13) (R=6)
Writing
R=8-23
Avg.=19
M=20
Reading
R=12-25
Avg.=22
(Three colleges indicated a limit of 20 students on the 1st level and 25 students on
the 2nd level.)
Math
R=22-30
Avg.=27.8
ESL: One college indicated a limit of 12 in ESL classes.

Student placement in developmental classes: (Quest. 14) (R=6)


All seven colleges place students in developmental classes based on test scores:
Three colleges use
• TASP scores
• Pre TASP scores
Two colleges use in house placement scores
One college each uses
ASSET
• Nelson-Denny for nursing majors and to determine grade
equivalency
CELT for ESOL

Tutoring

Peer and/or professional tutoring: (Quest. 15) (R=6)


All seven colleges offer tutoring.

Table D.9. Types of tutoring offered


Type Tutoring Labs One on one tutoring Small group tutoring
Peer All 7 colleges offer All 7 colleges offer All 7 colleges offer
Professional Five colleges offer Six colleges offer Six colleges offer

Tutoring service funding: (Quest. 16) (R=6)
All seven colleges fund tutoring by institutional funds; one indicating a supplemental
instruction fee to students.
Five colleges use
• Workstudy funds
• Grant funds
Four colleges use Carl Perkins Grant
Three colleges use Student Support Services Grant
One college uses volunteers from the college and the community to provide
tutoring.

Additional comments on tutoring:


One college is a Level I Nationally Certified CRLA Training Program for tutors.

Learning Center

Have learning center: (Quest. 17) (R=6)


All seven colleges have learning centers
Six colleges offer
• Basic skills remediation
• GED preparation
• TASP remediation
Five colleges offer
• Tutoring
• College level computer assisted instruction
Three colleges offer TAAS remediation
One college offers competency based high school credit

Other activities of the learning centers include:


Multimedia labs
Workshops
Seminars
Math lab
Writing lab
Alternative Learning Ctr.

Learning center staffing: (Quest. 18) (R=6)
Seven colleges use full-time instructors in their learning centers.
Five colleges use
• Lab assistants
• Lab supervisors
• Part-time instructors
Three colleges use full-time instructors at partial load
One college each uses
• Volunteers
• Student Support Services tutors/coordinators

Classes or supplemental lab work: (Quest. 19) (R=6)


Four colleges offer classes in developmental labs taught by
• Three colleges use instructors
• Two colleges use lab assistants
• One college uses a lab supervisor

Developmental lab size limit: (Quest. 20) (R=2)


Writing
R=20-24
Avg.=21.5
M=20
Reading
R= 15-30
Avg=22.8
M=24
Math
R=24-30
Avg.=28.5

DISTANCE EDUCATION RESULTS
(Six colleges reporting)

Distance Education—a planned teaching/learning experience that uses a wide


spectrum of technologies to reach learners at a distance and is designed to
encourage learner interaction and certification.

Of the six schools reporting, five are involved in distance education.

Distance education usage: (Quest. 2) (R=5)


All five of the colleges offer credit courses via distance education.
• Four colleges offer noncredit (e.g., continuing education, seminars, etc.)
via distance education.
• Three colleges offer instruction within the college system via distance
education.
• Only one college offers intercampus meetings via distance education.

Course Content: (Quest. 3) (R=5)


Three colleges
Both create and purchase their course content
Two colleges
Purchase their course content

Sell Services (Quest. 4) (R=6)


Three colleges sell the services of their delivery systems to outside businesses:
• All three sell teleconferencing
• One sells Internet service
• One sells training
• One sells courseware

Dis. Ed. Support (Quest. 5) (R=6)


Two colleges find support from:
• A broadcast area or media services
• Learning Resource Departments
• Instructional divisions, i.e., Arts & Sciences and Business & Technology

One college each finds support from:


• Computing Services
• Center for Professional Development
• Distance Education area

Organizational Structure: (Quest. 6-7) (R=5)
Two colleges' distance learning entities report to Vice Presidents of
• Information Technology
• Instruction and Student Services
• Administrative Services (Computing Services)
One college's distance learning entity each reports to
• Respective Deans
• Instructional Administrators
• College President

Two colleges' support personnel report to


• Deans of Learning Resources or Academic and Vo-Tech
One college's support personnel each report to
• Divisional midmanagement
• VP of Instruction or VP of Administrative Services
• Director of Media Services

Personnel (Quest. 8-11) (R=5)


Number of FTE faculty dedicated to distance learning:
• 2—Two colleges
2.5
8
25

Number of FTE support personnel dedicated to distance learning:


On an as-needed basis
Less than 1
1
2
6

Table D.10. Titles and responsibilities of distance education support personnel

Job Title: Responsibilities
• Coordinator of Distance Education (two colleges; this person also teaches Chemistry in one college)
• Director of Media Services
• Associate VP of Information Technology: Media Services, Computer Services, Telecommunications, Academic Computing
• Webmaster: WWW activity
• Director of Computer labs: Manages personnel & structure of labs
• Distance Learning Technician: Operates DL classroom
• Media Services Coordinator: Media Production
• Systems Programmer: Computing Support Services
• Technical Branch Librarian: Reference/Media Support in Tech library
• Dean of Learning Resources: Coordinates Library/Media Centers; Faculty/Staff Development
• 1/3 time—Director Multimedia Access & Prod.: Manages Campus Distribution of AV equipment/media production
• 1/2 time—Video Production Technician: Campus Video Production Support
• 1/4 time—Systems Technician: Maintaining Campus Equipment/Systems
• 1/2 time—Learning Resources Secretary: Maintaining Division Records/Corres.
• Media Tech: Campus-wide media services

Instructors: (Quest. 11) (R=5)


Five colleges use full-time faculty for distance learning courses, counting it as part of their
load.
Three colleges
• Use full-time faculty for distance learning, counting it as overload.
• Use part-time faculty

Registration and Advising: (Quest. 12) (R=5)
All five colleges have distance education students go through traditional registration.
• Four colleges have students receive traditional advising.
• One college does off-campus bulk advising.

Priorities (Quest. 13) (R=5)


Four colleges prioritize:
• Offering courses via Internet
• Offering more courses with interactive video/audio between two locations

Two colleges prioritize:


• Consortiums or coordination with others to expand distance learning
courses
• Expanding the offerings on distance learning for larger portions of degree
plan
• Establishing 2-way video systems with area high schools
One college each prioritizes:
• Development of institutional plan
• Funding analysis
• Increasing availability of courses to the home
• Increase instructional support of Internet
• Continue offering teleconferences
• Remote Access

Budgeting
(Five colleges reporting)

Budgeting delivery systems: (Quest. 14) (R=5)


Five colleges use grants
• Four colleges use institutional budgets
• Two colleges use a fee structure

Percentage of college budget: (Quest. 15) (R=5)


Three colleges do not maintain a separate budget for distance education.
One college each receives
• Less than 2%
1%
One college indicates that distance education is in the instructional budget.

Income/expenses: (Quest. 16-17) (R=5)
• One college's income/expenses is in the instructional budget.
• One college is limited to contact hour reimbursement.
• One college each makes
• $650,000 approx.
• $25,000 (Fall 1996)

• One college each reports expenses of


• Same as other classes
• $500,000-$700,000
• $8,616 (Fall 1996)

Fees: (Quest. 18) (R=5)


Only one college charges additional fees ($25) to students taking courses via TV.

Comments:
• Distance learning is very expensive; must determine priority and expectations
before implementation.
• Hard costs for operating run about $400,000; capital outlays vary on a year-by-
year basis; administrative costs need to be added to figures.
• Distance learning equipment and the cost of telecourses are separate accounts;
personnel costs are incorporated into instructional and support services
departments.

Equipment
(Five colleges reporting)

Lists of equipment in attachments in data notebooks.

Dedicated space: (Quest. 19) (R=5)


• Two—dedicated space
• Two—shared space
• Two —other space

Comments:
• At present can do only point-to-point interactive video conferencing. The college hopes
to be able to find equipment for multi-point delivery.
• Telecourses broadcast via local cable compressed video/audio is in the near future and
will be shared with the local technical college.
• We have a distance learning classroom acquired through a federal grant. It is equipped
with two robotic cameras, a presentation camera, 12-student capacity with audio,
capability of multi-media computer, an Aladdin Pinnacle Media Printer, and various other
production equipment.

Agreements
(Five colleges reporting)

Four colleges have distance education agreements with other institutions:


• One college with 3 other institutions
• Three colleges with 1 other institution

Comments:
• Negotiated on a semester by semester basis
• Agreements are currently being negotiated with other institutions.
• Lease nine telecourses from DCCCD and Tarrant Co. Junior College so that the
telecourses may be broadcast over a PBS station. The signal is downlinked by
the cable companies within our three counties.

DELIVERY SYSTEMS

Types of delivery systems used by institutions


4—Broadcast
2—ITFS and microwave
5—Cable television
4—Internet and computers
5—Video

Broadcast
(Three colleges reporting)

Broadcast system: (Quest. 1) (R=3)


1—Satellite
1—Closed Circuit for campus
1—Compressed video
1—Low power television

Time and coverage: (Quest. 2-3) (R=4)


Mile radius:
15-20 miles with wireless; further with cable
80 miles
10 mile approx.
Cable closed circuit

Time of broadcast service:


30 years
10 years
8 years
2 years

For classroom equipment and course titles see attachments in the study data notebook.

Class limits: (Quest. 7) (R=4)


Two colleges place no limits on class enrollment.
Two colleges place limits; one is over 50.

Course selections: (Quest. 8) (R=4)


Three colleges have departments determine offerings.
One college has divisions determine offerings.

Publicity: (Quest. 9) (R=4)


4—Inside campus
3—Outside campus
2—Together
3—Paid advertising

No college reporting has uplink capability.

Agreements with local cable: (Quest. 12) (R=4)


All four colleges have agreements with local cable.
• To carry the broadcast signal—38 cable stations
• Franchise agreement between cable system and the city provides for up to
four educational access channels in the city to be shared among the public
schools and higher education. All buildings on campus also have a cable
connection at no cost.
• Shared access channel serves 55,000 homes. Programs originate from
DMC Media Center
• One free educational channel shared with 3 institutions in a 24-hour day

No college reports that the cost of offering broadcast classes exceeds the benefits.

Advantages to broadcast: (Quest. 14) (R=4)


Three colleges report reaching more students, esp. those who would not otherwise take the
courses.
Two colleges report student convenience, audience saturation, and program marketing,
because students can tape the broadcast and view and review it as convenient and as
needed.
One college reports standard accepted philosophy.

Disadvantages: (Quest. 15) (R=4)


One college each reports
• Limited time to station
• Limited service area
• Students miss interaction with other students
• Standard accepted philosophy

ITFS and Microwave
(One college reporting)

Physical area: (Quest. 1) (R=1)


15-20 mile radius from transmitter

Length of service: (Quest. 2) (R=1)


6 years; hold license to 4 ITFS channels which are leased to a wireless cable
system.

Students served: (Quest. 3) (R=1)


15,000 subscribe; in 1995-96, we had 704 enrollments in telecourses.

Course titles: (Quest. 4) (R=1)


All telecourse offerings

Enrollment limits: (Quest. 5) (R=1)


Yes; what the limits are depends on the class.

Offerings determined: (Quest. 6) (R=1)


By academic division

Publicize broadcast classes: (Quest. 7) (R=1)


• Inside campus
• Outside campus
• Together
• Paid advertising

Profit: (Quest. 8) (R=1)
No; we offer such classes because they are a service to the student and are beneficial
to them. We started these classes because it seemed to be the wave of the future, and we
are in the business of educating and should stay abreast of future technology.

Internet and Computers


(Four colleges reporting)

Systems: (Quest. 1) (R=1)
3 colleges—e-mail with list servers
3 colleges—WWW

Providers and cost: (Quest. 2) (R=2)
Two colleges—Texas A & M
Two colleges—The Net

Two colleges pay $2400 per year


One college pays
• $2500 a year—excluding live web
• $7980 per year

Teaching tool: (Quest. 4) (R=4)


Four colleges use the Internet as a teaching tool in
• Freshman English classes
• Credit classes and noncredit classes
• Instruction in Internet services and protocols
• Instruction

Learning tool: (Quest. 5) (R=4)


Four colleges use the Internet as a learning tool
Three colleges in
• Research
One college in
• Credit classes and noncredit classes
• Student access to notes/assignments

Additional classroom information: (Quest. 6) (R=4)


Four colleges use the Internet for additional classroom information for
Subject research
Faculty home pages
Students can upload and download assignments
Student access to college publications
LRC teaches bibliographic instruction for online resources
Instructors' assignments require the Internet
Some telecourse instructors interact with students through e-mail and
postings on web servers.

WEB page: (Quest. 7) (R=4)


Four colleges have a WEB page.

Four colleges maintain their own web pages as an institution

Three colleges maintain their own web pages


• By division
• By department

One college maintains its web page by a WEB Page Committee

Services provided: (Quest. 8) (R=4)
Four colleges provide WWW for employees and students

Three colleges provide e-mail to employees and students

One college provides


• Dial-up access to everyone
• Telnet to everyone
• FTP to everyone
• SMTP to everyone
• Internet instruction to students
• Distance education courses by Internet to students

Length: (Quest. 10)(R=4)


2 years
2 years to staff, 1 year to students
4½ years

Equipment: (Quest. 11) (R=4)


• Four colleges provide PCs for Internet connection
• Two colleges provide Macintosh network links through Macs

Cost balance: (Quest. 12) (R=4)


• Two colleges do not feel that costs exceed benefits because students like it
• Two colleges do not know if costs exceed benefits because there is not enough
history or they do not offer classes via the Internet.

Course titles and levels: (Quest. 13-14)


Freshman Composition
English
Computer Science
• Two colleges offer Freshman level courses
• One college offers Sophomore level courses

Enrollment limits: (Quest. 16) (R=4)


Only one college sets enrollment limits on Internet classes, at standard section size.

Offerings determined: (Quest. 17) (R=4)


• Two colleges' offerings are determined by departments
• One college's offerings are determined by coordination around instructional units

Publicity: (Quest. 18) (R=4)
Two colleges publicize outside of campus
One college publicizes by
• Inside campus
• Inside and outside together
• Paid advertising

Legal issues: (Quest. 19) (R=4)


Maintain instructional rights
Policy for use of the network

Equipment ramifications: (Quest. 19) (R=4)


Remote access to network
Management software for bulletin board activity and course management
Call in capability from home
Sufficient bandwidth
Altemative access to accommodate the medium and provide reliable
service
Shadowing/Tandem processing for speed, reliability, servicing and access
Router capable of handling the traffic
Phone service/helpdesk for technical issues and assistance
CDROM tower(s) and services for resource materials
FAX/Scanners, document and video cameras and other multimedia
equipment available for instructors to use to prepare class materials
appropriate for this medium
• FAXing capabilities
• Software availability and upgrades.

Advantages of Internet: (Quest. 20) (R=4)


Access to materials not normally available
Self-paced enrollment
Easily updated
Goes directly to the student's home
Interaction is more regular and intense
Convenience for the students and instructors
An effective learning medium
Potential for wide variety of views and experiences for class discussions
Potential for offering courses to fit specialized needs by specialists in the
field
Greater potential market
May relieve some strain on campus space and operating budget.

Disadvantages: (Quest. 21) (R=4)
Student participation requires heavy commitment and motivation
• Still can't carry a lot of information/video files
• High drop rate
• Students lose campus experience and physical presence of the classroom
• Not all course work can be taught effectively through this medium
• Costs for equipment and upgrades
• Training and support time
• Costs for developing effective classroom techniques and presentations by
faculty

Additional comments:
• LRC access to curriculum support materials and instruction on research
techniques.
• Use of online resources is a critical element in this medium.
• "Browsing the stacks" for relevant information is no longer an option.
• Online Reference Helpdesk is a must.
• Storage and/or access for full text materials online should be an integral
part of the planning. Document request and delivery services must be
addressed.

Video

Video systems: (Quest. 1) (R=4)


Four colleges offer:
• Traditional Teleconferencing
• Cassettes
Two colleges offer:
• Two way interactive
• Video Teleconferencing

Statewide networks: (Quest. 2) (R=4)


• Four colleges subscribe to Starlink.
• One college subscribes to VIDNET.

Number of connections: (Quest. 3) (R=4)


5
• 10± classrooms are wired to receive
• Videotelecom is downlink only

Length: (Quest. 4) (R=4)
• 5 years
• 10 years
• 10 years for downlinks, 6 years for cassettes
• 30 years

Classrooms: (Quest. 5) (R=4)


• 1 and all on-campus classrooms
• 10 off campus

Equipment: (Quest. 6) (R=4)


• College provides monitors and receiving equipment for teleconferences
• Distance learning equipment provided by grants
• See attached. Cost for classroom equipment is split with high schools
• Picture Tel Venue 2000 with auxiliary graphics monitor
• 25" monitors, VCR

Course titles: (Quest. 7) (R=4)


See attached brochures
Telecourses by cassettes
• Organic Chemistry
• Calculus I, II, and III
• Physics course
Cable Telecourses
Govt.
Eng.
Hist.
• Gen. Psyc.

Enrollment limits: (Quest. 8) (R=4)


Only one college has an enrollment limit per class of 24 students per two-way video site

Course offerings determined: (Quest. 9) (R=4)


• Three colleges determine course offerings by department.
• One college determines course offerings by faculty and instructional deans, based
on student requirements/available courseware

Publicity: (Quest. 10) (R=4)


Three colleges publicize
• Together—inside and outside campus and in the video group
• Paid advertising
Two colleges publicize inside campus or the video group.
One college publicizes
• Outside campus
• Separate as individual classes

Cost/expenses: (Quest. 11) (R=4)
• Two colleges report cost does not exceed expenses.
• Two colleges do not know if cost exceeds expenses.

Advantages: (Quest. 12) (R=4)


• Meets specific needs in isolated areas
• Teleconferencing meets short-term training needs
• Student convenience, isochronous delivery of telecourses
• Two-way video allows more course offerings
• Small groups at each of two colleges make one group large enough to
offer the course.
• Standard accepted philosophy

Disadvantages: (Quest. 13) (R=4)


Two colleges report cost can be high.
One college reports
• None except increased technical support required
• Schedule of both institutions
• Laboratory offerings
• Standard accepted philosophy

RESEARCH/PLANNING/INSTITUTIONAL EFFECTIVENESS RESULTS
(Six colleges reporting)

Classified employees: (Quest. 2) (R=6)


R= 1/2-4
Avg.=2.2
M=3

Job Titles and Functions in Institutional Effectiveness:

Research Job Titles


R=0-3 Director of Institutional Research and Records
Avg.=1.4 Management
M=2 Secretary of Institutional Research
Research Assistants
Coordinator of Evaluation
Program Evaluation Research Associate
Student Tracking Director of Institutional Research,
Oversight, General Research

Planning Job Titles


R=0-2 Vice President of College
Avg.=.7 Secretary
M=l Secretarial
Director of Institutional Research
Chief Planning Officer

Institutional
Effectiveness Job Titles
R=0-2 Dean of Program Development and Institutional Research
Avg.=.6 Secretary
Director of Institutional Research, Oversight

Other (specify)
Division Senior Secretary-Handles duties related to research, planning,
institutional effectiveness, and resource development
All of the above together - Director, Planning & Institutional Research
Research Associate
Office Assistant (classified employee)

Yearly procedure for institutional planning and effectiveness: (Quest. 3) (R=6)
Four colleges review operations yearly for revisions.
One college
Evaluates and plans on a 3-year cycle
Is revising the planning process but intends that the new time line be fairly
flexible

Institutional accountability: (Quest. 4) (R=6)


Four colleges attached copies of objectives, strategies and master plans.
One college stated that
The main accountability measures are those identified by the THECB.
• The new measures being developed are associated with the seven strategic
goals developed by the President's Advisory Council.

Use of accountability measures: (Quest. 5) (R=6)


Two colleges tie accountability directly to the budget process:
• Updated and reviewed annually prior to budget planning to feed into
budget decision making for department/division/unit requests
• New initiatives are tied to the Strategic Plan, and the need for
improvement in outcomes is cited when funds for new initiatives are
sought.
One college reports
• Final institutional effectiveness yearly report includes evaluations of
outcome measures and plans for improvement.
• Outcomes are incorporated into the Continuous Improvement Process for
the development of future goals and objectives.
• Once performance measures are developed, the college's Council for
Planning and Institutional Effectiveness will monitor the performance
measures and submit an annual report to the President's Advisory Council
on the Institution's performance relative to its strategic plan.

Additional comments regarding institutional research and effectiveness:


• Have comprehensive data files which allow useful statistical studies and follow-
up reports to be conducted. Have superior developmental studies follow-up
system. Have unique and very accountable means of tracking results as reported
in Strategic Plan Trends and Status column of Attachment 7.
• Laredo Community College is undergoing a restructuring of its organization and
the process will be modified as required.

Resource Development
Grants

Percentage of annual budget from grants and contracts: (Quest. 6) (R=6)


R=0%-14%
Avg.=8.33%
M=10%

Income per contact hour from federal sources: (Quest. 7) (R=6)


R=0-$1.30
Avg.=$.92
M=$1.12

Employees involved in pursuing or overseeing grant money: (Quest. 8) (R=6)


R=l-6
Avg.=2.8
M=2.5

Job titles:
Dean of Institutional Advancement
Dean of Program Development and Institutional Research
Dfrector of Financial Services
Other deans/directors, faculty, staff as appropriate for each grant
Dean of Occ/Ed
Fin. Aid Director
Business Office
Grant developer
Assistant to President for Industrial Development
Grants/Auxiliary Coordinator
Planning & Grants Coordinator
Director, Foundation
Grant Acct.
Grant Monitor
Grant Directors

Grant approval process: (Quest. 9) (R=6)


Six colleges have a grant approval process which includes
• Following Preliminary Grant Approval Process
• Having the application reviewed and approved by the Dean of Program Development
and Institutional Research, after which the Director of Financial Services reviews
budget implications and the President approves.
• President signs all grant applications upon review of the Vice President of
Business and Finance

Approval by the Executive Administration and if matching funds are required,
approval by the Board of Trustees.
Approval from immediate supervisor, then division director, dean and appropriate
vice-president.
Approval by Dean, Director of Development, V.P. of Administration, President

After grant is concluded: (Quest. 10) (R=6)


Six colleges either discontinue the grant services and personnel or absorb them
into the institution.
Six colleges consider the grant agreement when planning to continue services.

Exit plan for each grant: (Quest. 11) (R=6)


Three colleges have exit plans for each grant.
One college
• Has a formal exit plan spelled out in its approval form.
• Depends upon type of grant and the grant covenant
• Has the Project Director work with the Grants/Auxiliary Coordinator and
Planning & Grants Coordinator on fiscal and narrative reports

Foundation

College foundation: (Quest. 12) (R=6)


Five colleges have college foundations worth
R=$500,000-$11,091,978
Avg.=$4,325,744
M=$2,436,740
AC=$11,091,978

Foundation management structure: (Quest. 12) (R=6)


• 20-member volunteer board of directors; executive director and an administrative
secretary are the only paid staff.
• Appointed Board: President, Vice President, Secretary, Treasurer
• Foundation Board; Foundation Executive Committee; Foundation Investment
Committee; Executive Director of Foundation
• Board of directors nominated by the Board at large, with two members from the Board
of Trustees and two members from the faculty and staff. The Executive Director
reports to the President and Foundation Board.

Percentage of foundation management costs: (Quest. 13) (R=5)


by institution:
R=66%-100%
Avg.=86%
M=85%

by foundation funds:
R=5%-33%
Avg.=17%
M=15%

One college reports that 100% investment management fees are paid by the foundation.

Voting members on foundation board: (Quest. 14) (R=5)


Two colleges have college personnel who serve as voting members on the foundation
board.
• Two include the President as a voting member
• One college each includes
• Business Manager
• Two Representatives from Faculty/Staff

Foundation's distributable funds: (Quest. 15) (R=5)


• At two colleges, the majority of funds are scholarship based and distributed at discretion
• Two colleges have policy and procedure manuals governing distribution.
• One college determines distribution by its Foundation Board, as recommended by the
Executive Committee or as determined by donors' wishes

Outside managers: (Quest. 16) (R=5)


Two colleges hire outside managers to manage the foundation fund.

Investment policy: (Quest. 17) (R=5)


Five colleges attached copies of policies.

Percentage of distributable funds for scholarships: (Quest. 18) (R=5)


R=40%-100%
Avg.=74.6%
M=78%

Percentage of distributable funds for institutional support: (R=4)


R=12%-40%
Avg.=24.3%

INSTRUCTIONAL SURVEY RESULTS
(Six colleges reporting)

Cost Study

Table D.11. Per contact hour cost: (Quest. 1)

Area                    State Median    LCC*    Range       Average
Institutional Support   0.83            0.82    1.13-.65    0.88
Student Services        0.44            0.39    .55-.39     0.44
Staff Benefits          0.29            0.28    .56-.08     0.29
Library                 0.19            0.28    .26-.19     0.24

Table D.12. Contact hour study: (Quest. 2)

Area          Acad.    Acad.    Acad.     Acad.    Vo/Tec   Vo/Tec   Vo/Tec    Vo/Tec
              State    LCC*     Instit.   Instit.  State    LCC*     Instit.   Instit.
              Median   Median   Range     Avg.     Median   Median   Range     Avg.

Inst. Admin.  0.21     0.21     .48-.04   .20      0.32     0.28     .53-.13   0.28

Organ.
Activities    0.08     0.00     0.00      0.00     0.07     0.05     0.00      0.01
Related to
Instruction

*Lead Community College used as example

Instructional Administration

Definition:
Instructional administrators: employees who supervise faculty, oversee programs,
and may or may not teach.

Load reductions: (Quest. 3)


Three colleges give load reductions for responsibilities outside of instruction.
• Three for projects and grants
• Two for
• Extra curricular activities
• Job-related activities
• Committee service, one including faculty senate officers

Table D.13. Instructional administration load and compensation
Colleges Level and title Compensation Considered Load Contract
Have
6 Vice president 6-Admin. 0 12 mon.
6-for instruct.
1-acad. affrs.
6 Dean/Div. Chr. Salary 5-Admin. 6-12 mon.
1-Fac.
5 Dept. Chairs R=$100-$3600 4-Fac. R=3-18 R=9-12
Avg.=$1611 hrs. hrs.
6 Prog. Directors/ R=$500-$3500 5—Fac. R=9-24 M=9 mon.
Discipline Avg.=$1868 hrs.
Coord.
3 Prog. Coord. R=$0-$1800 3—Fac. M=9 mon.
*All responding colleges attached job descriptions defining the duties of each position.

Table D.14. Instructional administration supervision


# Level #of #Maj. # Students Super. Super. Super.
and title Depts FTFac PTFac Class.

Vice 4(1)* 7400(1)*


6 president

Dean/ R=1-25 R=1-38 R=64-3000 R=8- R=7- R=1-11


6 Div. Chr. Avg=8.3 Avg=17.8 Avg=1645 160 135 Avg=4.8
M=7 M=16 M=1050 Avg=43 Avg=53 M=5
M=32 M=40

6 Dept. R=1-8 R=1-8 R=600- R=9-31 R=8-31 R=1-3


Chr/ Avg=3.1 Avg=4.1 5780 Avg=24 Avg=18 Avg=2.2
Prog. Dir Avg=2773

3 Prog. 1 40
Coord.

Calculating compensation: (Quest. 8) (R=5)
Two colleges use a formula for calculating compensation for instructional administrators.
One policy attached
Formula: SHE release: 3 (1-6 fac); 6 (7-12 fac); 9 (13+ fac); 12
Stipend: Based on number of FT fac ($200 each) and adjunct ($100 each)
supervised
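The two-part formula above (tiered SHE release plus per-faculty stipends) can be sketched as a minimal calculation. The function names are illustrative, the survey does not say whether adjuncts count toward the release tiers, and the trailing "12" in the reported formula is omitted because its meaning is unclear.

```python
def she_release(faculty_supervised: int) -> int:
    """SHE release tiers reported by one college:
    3 SHE for 1-6 faculty, 6 for 7-12, 9 for 13 or more."""
    if faculty_supervised <= 0:
        return 0
    if faculty_supervised <= 6:
        return 3
    if faculty_supervised <= 12:
        return 6
    return 9

def stipend(full_time: int, adjunct: int) -> int:
    """Stipend: $200 per full-time and $100 per adjunct faculty supervised."""
    return 200 * full_time + 100 * adjunct

# A hypothetical chair supervising 8 full-time and 5 adjunct faculty:
print(she_release(8))   # release tier for 8 full-time faculty -> 6 SHE
print(stipend(8, 5))    # -> 2100
```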

Equity: (Quest. 9) (R=5)


Two colleges do not attempt to ensure equity.
One college
• Formula in process of revision
• Follow formula
• Release time for projects must be approved by the Instructional Council.
• Operates on a classification system

Faculty Load Considerations

Faculty load calculated: (Quest. 10) (R=5)


Three colleges require 15 credit hours for a full load
One college requires
• 15 load hours with load hours determined by formula
• Minimum of 21 contact hours
• Ratio is as follows: lecture 1:1, lab 1:2; clinical .75:1

Department variations in load calculations were explained on attached policies.
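For illustration only, the ratios reported by one college above (lecture 1:1, lab 1:2, clinical .75:1) can be expressed as a small load calculator; the function name and the schedule format are assumptions, not anything the colleges reported.

```python
# Load-hour ratios reported by one college: each lecture contact hour
# counts as 1 load hour, each lab hour as 1/2, each clinical hour as .75.
RATIOS = {"lecture": 1.0, "lab": 0.5, "clinical": 0.75}

def load_hours(schedule):
    """schedule: list of (kind, weekly contact hours) pairs."""
    return sum(RATIOS[kind] * hours for kind, hours in schedule)

# A hypothetical schedule of 9 lecture, 6 lab, and 4 clinical contact hours:
print(load_hours([("lecture", 9), ("lab", 6), ("clinical", 4)]))  # -> 15.0
```

Under these ratios the hypothetical schedule reaches the 15-load-hour full load that three of the colleges require.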

Specific lab calculations: (Quest. 11) (R=5)


Science lab load:
Three colleges calculate one load hour for two hours of lab instruction.
One college
• 1 SHE per contact hour
• See attachment listing load equivalents; they vary depending on the course

Computer lab load:


Three colleges calculate one load hour for two hours of lab instruction.
One college
• 1 SHE per contact hour
• See attachment listing load equivalents; they vary depending on the course

Physical activity class load (including how many hours per week p.e. classes meet)
Hours per week Load hours
3 hours per week = 2.5 load hours
2 hours per week = 1 load hour
3 hours per week = Full load 6 physical activity classes per week
3 contact hours = 1 lecture, 2 lab equate to two SHE
3 hours per week = 2.14 load credit=3 hours contact

Learning lab load


• Not equated to load but based on regular 40 hour work week
• One load hour for two hours of lab
• Five 3 semester hour classes or 21 contact hours is a full-time load
See attachment listing load equivalents

Technical lab load


Two colleges have one load hour for two hours of lab instruction
One college has a special listing of load equivalents

English/writing learning lab load


Two colleges have one load hour for two hours of lab instruction
One college
• Uses lab assistants
• Has a special listing of load equivalents

Basic skills lab load


Two colleges have one load hour for two hours of lab
One college
• Based on regular 40 hour work week
• Has a special attachment listing load equivalents

Private music lesson load


.6 load hour per hour of instruction
• 5 clock hours=3 contact hour class
100 level: .31=1 student; 200 level: .62=1 student
• See attachment listing load equivalents

Performance (music and theater arts) ensemble load


• Five load hours for a performing ensemble.
• No class reduction for ensemble—computed on lecture/lab basis
• Three load hours for a theater production
• See attachment listing load equivalents

Coop instruction load
• .15 load hour per student
• One load hour for five hours
• # students/6
• See attachment listing load equivalents

Internship load
• .30 load hour per student
• One load hour for five hours
• # students/6
• See attachment listing load equivalents

Lecture/lab load
Two colleges calculate: Lecture is one load hour per one class hour/Lab is one load hour
per two lab hours.
• One college has a listing of load equivalents

Clinical hours load


Two colleges include attachments
One college calculates
One load hour for 2½ hours lab
Nursing: .75 SHE per 1 contact hour
Allied Health: .25 per 1 contact hour
Rad Tech: 14.50 per clinical hour
See attachment listing load equivalents

Instruction: (Quest. 12-13) (R=5)


All five colleges allow full-time instructors to receive load credit for lab instruction.
• Three colleges calculate such load on a one load hour for two lab hours
ratio
• One college varies calculation according to the lab.

No college uses computer-assisted instruction to save on personnel cost.

School Responsibilities: (Quest. 14-17) (R=5)


All five colleges grant teaching load reductions or release time for school responsibilities
outside of instruction for
• Projects/grants by five colleges
• Extra curricular activities by three colleges
Two colleges for
• Job-related activities
• Committee service
• Course development

One college for
• Curriculum development
Dual credit program coordinator
• Honors program coordinator
• Some lab coordinators

Cost effective
Two colleges
• Don't know if pay compensation has been cost effective.
• Say pay compensation has not been cost effective.
One college each says
• Pay compensation is cost effective because it is paid at the part-time rate.
• Full implementation is planned for 96-97.

Three colleges say teaching load reductions or release time have been cost effective,
explaining that typically faculty spend more time on the project than they
would on their regular classes, and the released class is covered at the part-time
rate.

Two colleges say teaching load reductions or release time have not been cost effective,
explaining that one course reduction is the equivalent of the college paying 1/10
of salary for the activity, plus adjunct cost.

None of the five colleges have found a more effective way than pay or load reductions to
accomplish such administrative duties.

Vocational Classes: (Quest. 18-20) (R=5)


Three colleges do not count vocational classes as faculty load.
Two colleges do count vocational classes as faculty load calculating
• A determined class by class basis equivalence with a credit class
• Formula based on contact hours

Three colleges calculate load for credit adult vocational classes


• Same as any credit hour class
• Basis of 15 students enrolled then case by case
• Formula based on contact hours

Two colleges offer mixed classes (i.e., semester hour credit students and adult vocational
students in the same class) calculating load
• On basis of the load for the credit class
• By counting adult vocational students toward the total required to make a
class, since they pay the same costs.

Average lecture class: (Quest. 21) (R=5)
R= 19.52-35
Avg.= 28.4
M=30

Small transfer classes (i.e., under 10 in enrollment) load credit: (Quest. 22-23) (R=5)
Four colleges use prorata formula based on number of students
Examples:
3 credit, 3 lecture hour class with 5 students equals 1.5 load hours
Based on 15; if class has 10 students, salary = 10/15
• Dependent on intention of offering for proration or full credit
• Sometimes a full-time faculty (like adjuncts) may teach a class on a pro-
rated basis with 12 being the base number.
One college
• Decides to run a class under 12 if it is taught by a full-time faculty for part
of a 15 hour load
• Is in process of developing policy
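The pro-rata rules above differ by college (one uses a base of 15 students, while another's example of 5 students earning 1.5 load hours on a 3-hour class implies a base of 10), but they share one shape, sketched here with an illustrative function name:

```python
def prorated_load(full_load_hours: float, enrollment: int, base: int) -> float:
    """Scale load (or salary) by enrollment relative to a base class size,
    capped at full credit once the base is reached."""
    return full_load_hours * min(enrollment, base) / base

print(prorated_load(3, 10, 15))  # base of 15, 10 students -> 2.0 load hours
print(prorated_load(3, 5, 10))   # base of 10, 5 students  -> 1.5 load hours
```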

Small vocational/technical classes (i.e., under 10 in enrollment) load credit: (Quest. 23)
(R=5)
Three colleges use a prorata formula
• Based on the number of students—3 credit, 3 lecture hour class with 5
students equals 1.5 load hours
• Based on 15 except those programs with accreditation stipulations on class
size
• Decisions are made to run a class under 12 if it is taught by a full-time
faculty for part of a 15 hour load
• Sometimes a full-time faculty (like adjuncts) may teach a class on a pro-
rated basis with 12 being the base number.
One college
• Gives full credit
• Is in process of development

Extra large lecture classes: (Quest. 24-26) (R=5)


One college does not offer extra large lecture classes.
Four colleges do offer extra large lecture classes:
• Biology sections of 80-90
• Telecourses of 60-70 students in one section
• Greater than 45
• Count of above 50

Three colleges offer compensation for extra large classes.


• Two colleges give extra load credit
• One college pays $25/student over class maximum as compensation

Two colleges offer faculty of extra large classes help in grading.
By using SCANTRON with lead faculty grading essay portions of exams
• By permitting student help

Instructional Personnel

Percentage of full-time employees who are full-time faculty: (Quest. 27) (R=4)
R=38%-50%
Avg.=45.7%
M=47.5%

Percentage of continuous part-time employees who are part-time credit faculty: (Quest.
28) (R=4)
R=67%-95%
Avg.=85.75%
M=90.5%

Rank system: (Quest. 29) (R=4)


Two colleges have rank systems.

Table D.15. Rank systems for instructional personnel


Ranks Have Compensation Amount Minimum Requirements
Instructor—2 colleges Both = -0- BA/BS degree

Assist. professor— $750 & $1800* Attachment File 2.6.1.1


2 colleges BA/BS degree

Assoc. professor— $1,150 & $4,200* Attachment File 2.6.1.1


2 colleges BA/BS degree

Professor—2 colleges $1,800 & $8,200* Attachment File 2.6.1.1


MA/MS degree
*The amounts in the compensation column are the amounts paid for each rank, not the
total compensation

Movement in rank: (Quest. 30) (R=2)


• By policy on attachment File 2.6.1.1
• By merit-See pages attached, 6-8. 301 to 30c

Faculty office hours: (Quest. 31) (R= 5)
All five colleges require faculty to maintain office hours with the minimum requirement
of
Two colleges 10 hours per week
One college
• Whatever is necessary to consult with students
• 5 hours per week or one hour per day
• One hour a week for each three hours taught is our requirement.

Table D.16. Librarians and counselors: (Quest. 32-38) (R=4)


Title Considered faculty Considered Nonfaculty # By This Title
Counselors R=9-10 R=5 R=0-10
Avg.=9.5 Avg.=5 Avg.=6
Librarians R=1-7 R=2 R=2-7
Avg.=3.25 Avg.=2 Avg.=3.75
M=3

Ratio of full-time librarians per FTE (15 credit hours) students:


1:1754
1:1786
1:852
1:1160

Ratio of full-time librarians per student (unduplicated head count):


1:3282.5
1:2598
1:1484
1:1775

Ratio of full-time counselors per FTE (15 credit hours) students:


1:389.7
1:1071
1:596

Ratio of full-time counselors per student (unduplicated head count):


1:729.4
1:1559
1:1039

Counselor credentials:
• Two colleges require a Master's Degree in guidance and counseling
• One college requires an L.P.C.

Four colleges do not increase counselor salaries with credentials such as LPC.

Academic advising system: (Quest. 39) (R=5)


Three colleges have integrated counseling systems
With two colleges having majors advised by faculty and undeclared/liberal
arts students advised by counselors
One college has both individual faculty advising and scheduled "drop-in"
sessions where faculty advise large numbers
Two colleges have decentralized counseling by departmental faculty
One college has a centralized counseling center

Mentoring system for new personnel: (Quest. 40) (R=5)


Three colleges have mentoring systems for new personnel.
• That are full-time faculty
• That are part-time faculty
One college has a mentoring system for classified staff.
One college compensates the faculty mentors $50 per mentee per semester

Two colleges select the mentors from the same work area

Professional Development

Responsibility for academic professional development: (Quest. 41) (R=4)


Two colleges have Professional Development coordinators or directors
One college has the
• Dean of LRC/Instructional Dean/Faculty Senate oversee professional
development
• Professional Development Committee composed of faculty oversees
development.
• In two colleges the position is part-time.
In two colleges the position is full-time.
• In two colleges the position is administration.
• In two colleges the position is faculty.

One college compensates the position with


• Release time
• Both release time and money
• Being part of the position.

Academic professional development opportunities: (Quest. 42) (R=3)
Numerous workshops, seminars, conferences, travel
Institutional level
Program level—All relate to goals and objectives
Individual level
Mini-grants
Return to industry
Return to university
Faculty development leave
No-cost tuition for classes

Responsibility for technical professional development: (Quest. 43) (R=4)


• Two colleges have the same position responsible for both academic and technical
development: a Professional Development Coordinator or Director
• One college has the Dean of LRC/Instructional Deans/Faculty Senate
responsible for technical development

Two colleges have this position as part-time.


Two colleges have this position as full-time.

One college compensates the position with


Release time
Money, Job range-$33,960-$53,532
Both money and release time
Neither money nor release time
Being part of the position

Technical professional development opportunities: (Quest. 44) (R=5)


Numerous workshops, seminars, conferences and travel
E-mail, Intemet, Multi-media, software training
Travel, on-campus seminars, faculty grants
Technical training; CEU compensation
Mini-grants, return to industry, return to university
Faculty development leave, no-cost tuition for one college's classes

Administrative professional development opportunities: (Quest. 45) (R=4)


Numerous workshops, seminars, conferences and travel
Conferences to fit needs of individuals
Travel, on-campus seminars, faculty grants
Mini-grants, return to industry, return to university
Faculty development leave, no-cost tuition at one college for classes

Classified professional development opportunities: (Quest. 46) (R=5)
Two colleges offer no-cost tuition classes according to work schedule.
One college offers
• Numerous workshops, seminars, conferences and travel
• Computer software, human relations
• Local workshops, on-campus seminars
• One college has a separate non-faculty professional development committee

Part-time faculty professional development: (Quest. 47) (R=5)


Four colleges offer professional development to part-time faculty.
One college offers these
• Same as other groups-numerous workshops, seminars, conferences and travel
• At the beginning of each semester

Professional development to maintain field proficiency: (Quest. 48) (R=5)


All five colleges offer professional development to technical/vocational faculty to
maintain proficiency within their fields.
• In three colleges such training is separate from the academic.
• In five colleges it is for training in new technology.
• In three colleges it is for additional certifications
• In four colleges the college pays for technical faculty to receive training in new
technology
• In two colleges the college pays for technical faculty to receive additional
certifications.

Technical faculty receive release time: (Quest. 49) (R= 5)


Three colleges grant release time for industry training
One college grants release time based on
• A policy, but no one has done it.
• Professional development funds
Faculty application to the Professional Development Committee.
(Proposal is then reviewed and funded/not funded.)

Professional development funding: (Quest. 50) (R=5)


Four colleges provide institution-wide funds for professional development.
• Three colleges provide funds through department/division budgets.
One college uses Title III, Strengthening Institutions Grant, funds.

Budget allocation: (Quest. 51) (R=3)


$25,000 in an institution-wide fund directed by the Faculty Prof. Dev. Committee;
$91,461 across instructional departments
5% of budget
Separate Professional Development Account

Required professional development activities: (Quest. 52) (R=5)
Three colleges require some professional development activities.

Employee Group Examples of required professional development


activities
Administration Technology training
Licensure updates
Dependent upon evaluation system
Full-time faculty Technology training
Licensure updates
Teaching techniques
Dependent upon evaluation system
Full-time faculty members must attend school
or other approved activity at least every
four years and complete a minimum of
three semester hours or equivalent.
Part-time faculty Technology training
Licensure updates
Teaching techniques
Dependent upon evaluation system
Classified Technology training
Licensure updates
Dependent upon evaluation system

Part-time faculty

Definition of part-time faculty: (Quest. 53) (R= 5)


Faculty teaching less than one-half time
Non-contractual faculty
Individual hired to work 2 course loads or less
11 SHE or less
60% load or less

Use of part-time faculty: (Quest. 54-56) (R=6)


All six colleges use part-time faculty.

Percentages of credit contact hours generated by part-time faculty for fall 1995:
R=10%-50%
Avg.=27.6%

Percentage of noncredit contact hours generated by part-time faculty for fall 1995:
R=95%-100%
Avg.=98.3%

Departments/divisions use of part-time faculty in the fall of 1995: (Quest. 57) (R=5)

Department/              Percentage of           Percentage of          Percentage of
Divisions                semester hours taught   contact hours taught   noncredit classes
                         by part-time faculty    by part-time faculty   taught by part-
                                                                        time faculty

Allied Health            20.85%                  18.58%
Behavioral Studies       15.03%                  20.71%
Business                 25.04%                  29.02%
Industrial Technology    16.74%                  15.17%
Law Enforcement          100%                    100%
Lang/Comm/FA             28.40%                  27.24%
Nursing                  21.89%                  21.89%
Sciences & Engineering   33.76%                  32.75%
Access                   17.45%                  21.14%
Institutional            29.23%                  24.40%

Table D.19. Use of part-time faculty-Institution 2 - Listed by deans
Department/                 Percentage of           Percentage of          Percentage of
Divisions                   semester hours taught   contact hours taught   noncredit classes
                            by part-time faculty    by part-time faculty   taught by part-
                                                                           time faculty

Liberal and Fine Arts       30-35%                  30-35%                 90%
Business & Technology       30-35%                  30-35%                 90%
Health & Natural Sciences   30-35%                  30-35%                 90%

Table D.20. Use of part-time faculty-Institution 3—See attached tables:

Department/              Percentage of           Percentage of          Percentage of
Divisions                semester hours taught   contact hours taught   noncredit classes
                         by part-time faculty    by part-time faculty   taught by part-
                                                                        time faculty

Liberal Arts             20%                     25%
Fine Arts                .07%                    3%
Math, Science, & HPE     11%                     3%
Developmental Ed.        36%                     48%

Advertise part-time faculty positions: (Quest. 58) (R=5)

• Five colleges advertise in local newspapers.
• One college uses:
• Professional contacts or word of mouth
• Applications for full-time positions

Part-time instructors' compensation: (Quest. 59) (R=5)

• Four colleges pay by a specific load hour rate:
• Formula based on experience (attachment A)
• $400 lecture hour, $200 lab hour
• See attachment 6-13, 59a-b
• $465/SHE
• One college pays on a clock hour rate at $14.50/contact hour.
• One college pays based on educational attainment on a part-time schedule.
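As an illustration, the two pay models reported above can be compared for a single hypothetical course. The course shape (3 lecture hours and 2 lab hours per week) and the 16-week semester length are assumptions chosen for the sketch, not survey data:

```python
# Load-hour model (four colleges): $400 per lecture hour, $200 per lab hour.
def load_hour_pay(lecture_hours, lab_hours):
    return lecture_hours * 400 + lab_hours * 200

# Clock-hour model (one college): $14.50 per contact hour taught.
def clock_hour_pay(contact_hours_per_week, weeks):
    return contact_hours_per_week * weeks * 14.50

# Hypothetical course: 3 lecture + 2 lab hours per week over a 16-week term.
print(load_hour_pay(3, 2))    # 1600 per semester under the load-hour rates
print(clock_hour_pay(5, 16))  # 1160.0 under the clock-hour rate
```

Under these assumed numbers the load-hour model pays more for this course; a longer term or more weekly contact hours would shift the comparison toward the clock-hour rate.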

Maximum percentage goal (% of classes) of part-time faculty: (Quest. 60) (R=4)
• Two colleges have not defined this percentage.
• Two colleges report 35% as a maximum percentage for part-time faculty.

Replacement of part-time: (Quest. 61) (R=5)

Five colleges decide to replace part-time with full-time faculty based on enrollment,
including % of sections needed and enrollment history per discipline.
One college also considers:
• Availability of part-time faculty
• Nature of discipline
• Day or evening need
• Growth within program

Replacement of full-time: (Quest. 62) (R=5)

All five colleges base replacement decisions on enrollment decreases, also considering:
• Employee retirement
• Full-time faculty leaving
• Availability of part-time faculty
• Nature of discipline
• Day and evening need
• Program deactivation

Part-time instructor evaluation: (Quest. 63) (R=5)

• All five colleges use student evaluations for part-time evaluation.
• Three colleges use supervisor observations for part-time evaluation.
• Two colleges use supervisor evaluations for part-time evaluation.
• One college uses:
• Student performance in courses and follow-up courses
• Self-evaluation
• Peer evaluations

Instructional Computer Labs and Staffing

Number of computer labs: (Quest. 64) (R=5)


R=5-53
Avg.=29
M=32
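Throughout these results, R denotes the range of responses, Avg. the arithmetic mean, and M the median. As a minimal sketch of how these summary statistics are computed (the response values below are hypothetical, chosen only so that they reproduce the figures reported above; they are not the actual survey responses):

```python
from statistics import mean, median

# Hypothetical per-college lab counts (illustrative only, not the
# actual responses from the five reporting colleges).
lab_counts = [5, 20, 32, 35, 53]

r_low, r_high = min(lab_counts), max(lab_counts)  # R = 5-53
avg = mean(lab_counts)                            # Avg. = 29
m = median(lab_counts)                            # M = 32

print(f"R={r_low}-{r_high}  Avg.={avg}  M={m}")
```

The same three statistics are reported for every numeric question in this appendix.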

Computer labs separate or consolidated in one area: (Quest. 65) (R=5)

• Five colleges report separate computer labs.
• One reports a consolidated computer lab.

Discipline-specific computer labs: (Quest. 66) (R=5)

Table D.21. Discipline-specific computer labs—Institution 1:


Discipline/department Number of What other disciplines use these labs?
labs
Access 1
Accounting 1 Business Administration, Economics,
Management, Travel and Tourism, &
Real Estate
Electronics 1
CIS 4 Office Technology
Commercial Art 2 Journalism
Math 2 Computer Science, Computer
Information Systems
English 2
Modern Language 1
Music 1
Nursing Service 1
Office Technology 3

239
Table D.22. Discipline-specific computer labs—Institution 2:
Discipline/Department Number of What other disciplines use these labs?
labs
Accounting 2
Graphic Arts 2
Office 2
CADD 2
Electronics 2
Micro-computers 1
Computer Science 8
SOS/Engineering 3
Journalism 1
Medical Records 1
Open labs 2
RTDC-(Continuing Ed.) 6

240
Table D.23. Discipline-specific computer labs—Institution 3:
Discipline/Department Number of What other disciplines use these labs?
labs
Art 2
Communications 2
English, Philosophy & 4
Reading

ESOL & Reading 4


Math/Physics 3
Natural Sciences 3
Business Administration 3
Business Technology 5
Computer Science 7

Legal Professions 2

Restaurant Management 1

Distribution & 3
Marketing
Trade & Industry 3

Special Law 1
Enforcement

Industrial Education 2

Technical Education 3

Dental Assisting 1

Allied Health 2

Registered Nurse Ed. 1

Vocational Nurse Ed. 1

Table D.24. Discipline-specific computer labs—Institution 4:
Discipline/Department Number of What other disciplines use these labs?
of labs
Office Technology 5 Business/Accounting
CISY/Electronics 5 Electronics/Accounting
English 3 Read/Cont. Ed.
Math 2 Cont. Ed.
Reading 1 English
Learning Center 3 College-wide
Foreign Language 1 Spanish/French/ESL

Table D.25. Discipline-specific computer labs—Institution 5:


Discipline/Department Number What other disciplines use these labs?
of labs
Math 1 None
Academic Success Center 1 All—Combination of Student
services labs and developmental labs
English (computerized 2 None
classrooms)
Open Access 1
Accounting 1
Nursing 1

Lab staffing: (Quest. 67) (R=5)

• Five colleges use:
• Faculty lab instructors for labs
• Lab assistants
• Two colleges use lab supervisors for labs.

Campus-wide computer system: (Quest. 68) (R=5)

All five colleges have campus-wide computer systems for:
• Administration
• Academic areas
Four colleges have campus-wide computer systems for the technical areas.

Five colleges have these systems accessed by
• Administration
• Academic
• Technical
One college has clerical access to input files.

Noncomputerized Labs and Staffing

Lab staffing differentiation: (Quest. 69) (R=5)

All five colleges differentiate between lab instructors, lab assistants and lab supervisors.
Two colleges define a lab instructor as faculty.
One college each defines a lab instructor as:
• A teaching assistant, able to set up and teach the lab
• A primary lead
• Full-time classified staff, with approximately 30 of 40 hours spent meeting
with computerized labs and group sessions

Two colleges define a lab assistant as:
• A student supervised by a full-time employee
• An assistant to the faculty
One college defines a lab assistant as a secondary aide.

One college each of the three reporting defines a lab supervisor as:
• Responsible for the lab
• Full-time staff
• A non-classified professional staff contract

Lab assistants considered: (Quest. 70-72) (R=4)

• Four colleges consider lab assistants, excluding students, as classified personnel.
• Three colleges' lab assistants, excluding students, are evaluated by their
supervisors.
• One college evaluates them the same as faculty.

Only one college reported the percentage of lab assistants employed:
75% as full-time, 25% as part-time

Student workers in labs: (Quest. 73-74) (R=5)

• Four colleges use student workers in labs to assist students and provide tutorial
and clerical assistance.
• Five colleges use lab assistants to aid students.

Lab supervisor and student work: (Quest. 75) (R=4)
• Four colleges use lab supervisors to aid students in the lab.
• Two colleges use lab supervisors to teach or tutor students in the lab.
• One college uses lab supervisors to evaluate/grade student work in the lab.

Credentials of a lab instructor: (Quest. 76) (R=5)

• All five colleges require a lab instructor to have a Bachelor's degree.
• Two colleges also require a Master's degree.
• One college accepts professional certification for a lab instructor.

Credentials of lab assistant: (Quest. 77) (R=3)


One college each requires a lab assistant to have
• Knowledge of computer software
• Technical credentials
• Course work in the subject

Credentials of a lab supervisor: (Quest. 78) (R=2)


One college each requires
• BA or BS degree
• Master's degree

Load for lab instructors calculated: (Quest. 79) (R=3)

Two colleges calculate load for lab instructors on a one load hour per 2 lab hours basis.
One college each calculates load for lab instructors:
• As possibly a full-time position
• As hourly employees

Pay for lab assistants (excluding student assistants): (Quest. 80) (R=5)
Two colleges pay on an hourly rate.
One college each pays by:
• A pay grade determined by the personnel officer
• Minimum wage
• The classified grade and step scale

Pay for lab supervisors: (Quest. 81) (R=3)

Two colleges pay lab supervisors by a professional staff salary scale.

Separate lecture and labs: (Quest. 82) (R=5)

• Five colleges separate lectures and labs, with faculty being responsible for the
lecture.
• Four colleges have faculty also responsible for the lab.
• Two colleges also have teaching assistants and lab personnel partially responsible.

Technical labs organized: (Quest. 83) (R=4)
• Four colleges organize technical labs on competency-based criteria.
• One college organizes technical labs on a customer-driven service basis.

Departments that use noncomputerized labs: (Quest. 84) (R=4)

Table D.26. Noncomputerized labs-Institution 1:


Department Number
Basic Academic Skills 15
English as Second Language 21
Reading 3
Dental Assistant 1
Dental Hygiene 2
Medical Data Specialist 1
Medical Laboratory Technology 4
Occupational Therapy Assistant 2
Paramedicine Technology 10

Pharmacy Technology 1

Physical Therapy Assistant 4

Allied Health 2

Radiologic Technology 21

Respiratoty Care 10

Surgical Technology 2

Veterinarian Assistant 1

Child Development Assistant 6

Physical Education 103

Substance Abuse Counseling 2

Accounting 14

Computer Information Systems 56

Table D.26. Continued:
Department Number
Management 14
Business 50
Real Estate 8
Fire Protection Technology 8
Criminal Justice 5
Automotive Technology - TDC 2
Automotive Technology 6
Commercial Service Technology 8
Electronics Service Technology 13
Industrial Technology 8
Environmental Health Technology 6
Hazardous Material Technology 4
Art 21
Dance 1
English 50
Journalism 9
Mass Communications 31
French 9
Music 31
Photography 12
Radio Television 10
Speech Communication 2
Theater Art 4
Vocational Nursing U
Architecture 2

Table D.26. Continued:
Department Number
Biology 30
Electronics 13
Engineering 7
Math 59
Astronomy 1
Chemistry 13
Geology 6
Physics 6
Physical Science 3
TOTAL 726

Table D.27. Noncomputerized labs-Institution 2:


Department Number
Biology 77
Chemistry 24
Engineering Drafting 2
Calculus 4
Geology/Earth Science 8
Physics/Astronomy 8
Health Science 30
Music 15
TOTAL 168

Table D.28. Noncomputerized labs-Institution 5:
Department Number
CHDV

Arts

Office Technology

Management

Foreign Language

Table D.29. Noncomputerized labs-Institution 6:


Department Number
Writing (English) 60
Science 28
Math 56

TOTAL 144

Travel Considerations

Process for travel approval: (Quest. 85) (R=5)

Two colleges have a chain of command that must approve requests, e.g., approval by the
department head, the division head, and the appropriate VP.
One college each:
• Submits a professional request to the budget manager
• Submits a travel request
• Submits a travel form to the division director
• Submits a proposal to a faculty committee that administers the professional
development fund

Travel policy differences: (Quest. 86) (R=5)

One college has policy differences between departments regarding travel, formulated by
accreditation standards.

Equipment

Equipment policies in study notebook: (Quest. 87) (R=2)

Position for equipment: (Quest. 88) (R=5)


Two colleges have a Director of Purchasing responsible for inventory, restocking, etc.
One college each has:
• A Business Office Accountant
• The immediate supervisor of the discipline or program
• An inventory clerk
• A Sr. Accounting Manager

Percentage of equipment provision: (Quest. 89) (R=5)


One college rents 4% of its equipment.

Two colleges receive equipment by donation:
R=1
Avg.=1

Three colleges lease equipment:
R=1%-5%
Avg.=2.3%

Three colleges purchase equipment:
R=90%-99%
Avg.=95.6%

Determination of action: (Quest. 90) (R=4)

Four colleges use cost as a consideration in obtaining equipment by rental, lease or
purchase.
• Two colleges consider useful life expectancy in obtaining equipment.
• One college considers:
• Each acquisition on a case-by-case basis
• Maintenance costs
• Utilization
• Effectiveness of purchase vs. other options

Removal of equipment from inventory: (Quest. 91) (R=5)

Three colleges have equipment retirement/inventory control forms.
One college each has:
• A deactivation process
• A central location for removal

Disposal of equipment policy: (Quest. 92) (R=5)
• Four colleges dispose of equipment at a surplus sale/auction.
• One college disposes of equipment by deactivation.

Institutional budget percentage for equipment: (Quest. 93) (R=5)


R=2.65%-18%
Avg.=7.81%
M=5%

Annual equipment budget instructional equipment percentage: (Quest. 94) (R=4)


R=8%-75%
Avg.=50%

Equipment budget percentage that is general/operational equipment: (Quest. 95) (R=3)

R=2%-25%
Avg.=14%

OUTSOURCING RESULTS
(Six colleges reporting)

Outsourced Services

The six institutions reporting record 17 different areas that are outsourced.

All six colleges outsource:


• Food Services—Agreements range from a minimum guaranteed profit to a
commission on sales arranged by contract and bid process.
• Vending Machines—Agreements include commission, contract and
revenue.

Four colleges outsource:


• Publications—Including catalog and other items; bids are taken from
vendors.

Three colleges outsource:


• Advertising—Various items to various media, including radio
• Travel Arrangements—Agreements include contracts with travel agencies
with no financial commitment, or consortium pricing/lowest quotes

Two colleges outsource:


• Bookstore—Agreements include an annual renewable contract (5 yr. option)
to a commission on sales with Barnes and Noble
• Building Maintenance—Agreements for a physical plant manager only, and
contracts and bids with issuance of purchase orders in addition to in-house
• Courier Services—Agreement with Purolator for daily deposits to bank,
and consortium pricing/in-house mail and open POs on an "as needed" basis
• Custodial Services—Agreement for custodial manager only, and contract
purchase orders issued 1 yr. w/option to renew up to 3 years (also in-house)
• Grounds—Agreement for physical plant manager only, and use of
temporary labor through bid process w/temp agencies as needed
• Security—Agreement with local police department for after-hrs. security
using bid process; agreement for grounds and buildings
• Telephone—Agreements include contracts with Southwestern Bell and
MCI
• Vehicle Maintenance—Agreement is a contract with a local auto repair
shop, and dealer warranties

One college outsources:


• Child care—Agreement offers services to students through a Perkins
Voc/Tech Grant
• Mailing—Agreement is with a local vendor for large bulk mailing projects
only

• Maintenance of personal computers—Agreement is for annual maintenance
service
• Printing—Agreement is for color printing, brochures and catalog
• Vehicle use—Consortium pricing/lowest quotes

Services Sold by Institution


Only three colleges sold services.

Two colleges sell


• Child Care—Makes child care available to community clients as well as
students and employees
• Housing services—Agreements include rent houses available to the public,
and faculty housing and student dormitories

One college sells services:


• Bookstore services—Agreement allows the public to purchase items at
bookstore
• Instructional training—Agreement includes contract training in certain
areas
• Printing—Agreement is limited to educational and civic entities
• Professional Staff Development Training—Agreement is contract training
• Testing—Testing center administers tests for the cost of the exam
• Other—Fitness Center available to community members for an annual fee

PERSONNEL QUESTIONS
(Six colleges reporting)

Definitions:
Full-time faculty—those who are eligible for TRS or ORP and state insurance
benefits
Part-time faculty—those faculty who are not eligible for TRS or ORP and state
insurance benefits
Administration—those personnel who are not faculty but typically have eligible
consideration for ORP
Part-time administrators—those personnel who are not faculty but would be
eligible for ORP consideration if the position was full time
Classified—those clerical employees in support positions who are usually paid on an
hourly basis, or technical and professional non-faculty positions
which do not have the responsibility of the administrative
positions
Part-time classified employees—those support employees not eligible for TRS or
ORP due to assignment of hours (i.e., 19 or less) or length of
assignment

Definitions (Quest. 1-5) (R=6)


Full-time faculty:
Two schools agreed with the definition.
Four gave their own definitions:
• We split full-time into two categories, "full-time" and "limited full-
time"—to full-time we add that they are assigned at 100%. Limited
full-time would be those benefits-eligible faculty that are 50% or
more, but less than the full-time 100%.
• Three tests besides being benefit eligible must be met to be full-
time faculty: 1) teaching a half load or more, 2) for a period of one
long semester or 4.5 months, and 3) earning a rate comparable to a
full-time faculty member placed on the salary scale.
• Two defined full-time faculty using teaching assignments:
• A faculty member is a person whose primary responsibility
is to teach, with a minimum of six semester equivalents or
fifteen clock hours.
• The other defines full-time faculty as "...one whose
teaching assignment is at least 61% of the fifteen semester
hour teaching load during the contract period involved."

Full-time faculty load:

• Four colleges define a full-time faculty load as 15 load hours per semester
• One added 15-15.5 semester hours and 10 office hours per week
• Another defined between 15-18 instructional units per long semester

Administrator:
Two agreed with the definition.
Other definitions include:
• Positions which report to a dean, vice president or the president and have
budgetary control over a specific area
• An employee exempt from FLSA minimum wage and overtime
provisions, performing non-manual or office work directly related to
management policies, general business operations, or administrative
functions. Administrative positions above grade 7 are eligible for ORP.
• An employee who has management and/or supervisory responsibilities
related to specific college programs or services. These employees are paid
from the Administrative and Professional salary schedule.
• An administrator is any supervisory personnel in salary range 8 or above.
They may or may not meet all ORP eligibility requirements.

Classified:
Only one school agreed with the definition.
One reports not even using the term; instead they refer to these employees as
exempt and non-exempt support and technical staff.
Other definitions were as follows:
• Classified staff are non-contractual, at-will employees (secretarial,
support staff and maintenance) who do not meet the eligibility
requirements for participation in ORP. Professional staff are
contractual employees but do not meet the eligibility requirements
for ORP.
• Clerical and support employees (i.e., maintenance, skill crafts
personnel) not exempt from overtime pay
• Employees who are hired and paid from the Classified Staff salary
schedule
• A classified employee is an employee that is covered by the Fair Labor
Standards Act (FLSA) and the Fair Labor Standards Amendments
of 1985 and is paid on an hourly basis. These employees are
referred to as non-exempt employees.

Table D.30. Non-listed positions

Category                       Definition
C2. Professional               Professionals, requiring a degree in their field
                               of expertise, who are given a one-year contract,
                               but do not meet ORP requirements as
                               administrative
C4. Teaching Assistants        TAs have annual contracts, no rights to tenure
(sub-category of faculty)      or promotion, and are paid as a proportion of
                               the faculty pay scale.
C6. Professional staff         An employee that is exempt from the FLSA,
                               receives an annual contract, and whose
                               employment must be approved by the
                               college's Board of Trustees. Some of these
                               employees, such as administrators, are eligible
                               for ORP, but some are not. Examples of
                               positions which are not eligible for ORP are:
                               Assistant Directors, Coordinators, Admission
                               Specialists, Financial Aid Specialists, etc.
C8. Non-exempt support and     Personnel who fall under the Fair Labor
technical staff                Standards Act and who submit monthly time
                               sheets.
Exempt support and             Personnel in positions exempt from the FLSA
technical staff                without broad administrative responsibilities.

Part-time employees

Part-time classified staff (Quest. 6) (R=6)

All six colleges use part-time classified staff.
Areas where they are used:
• Two colleges use part-time help in all areas of the college—including the
President's office, College Relations, Financial Services, Human
Resources, Dean of Students Office, Vaughan Library, Media Services,
Instructional Computing, the Biology/Chemistry/Geology/Physical Sciences
office, the Health and Physical Education Center, and the Government/Computer
Science/Graphic Arts/VNE/AND departments
• In small departments that cannot justify a full-time person and large
departments that need extra help during rush periods
• In clerical and the physical plant (maintenance); still another uses them in
business services, student services, and instruction

Ways people are hired:

• By advertising, screening, and interviewing at most schools
• Sometimes through the personnel office
• One college sometimes uses personal knowledge for hiring

Part-time classified are paid:

• Four schools pay on a structured pay schedule
• Two schools pay by a specific clock hour rate

Part-time administrators (Quest. 7) (R=6)

Only one school uses part-time administrators.
They are usually a present employee who is appointed as an "interim"
administrator and is compensated for the position for a specified length of time.
These part-time administrators are hired through formal advertising, screening and
interviewing and are paid on a structured pay schedule.

Review or evaluation process

Full-time Faculty (Quest. 8-12) (R=6)


All schools reporting use an evaluation process including:
• Student surveys
• Self-evaluation
• Supervisor assessment

Some of the schools added classroom observations and appraisal planning forms
to the evaluation process

All six schools consider the evaluation process a standard for full-time faculty.

• Three schools evaluate full-time faculty annually
• One school only evaluates full professors every three years
• One school evaluates full-time faculty once per year for the first four
years, and fifth-year and tenured faculty every other year
• Another school evaluates tenured faculty once every three years and
probationary or tenure-track faculty each semester for the first two years
and once annually thereafter

Four schools' evaluations differ primarily in frequency after the first few years.

Three schools tie salary increases to the evaluations.

Part-time faculty (Quest. 13-16) (R=6)
All six schools evaluate part-time faculty, all but one using the same process as is used
with full-time faculty.
• Four schools evaluate part-time faculty annually
• One school adds a classroom visit by the department chair for evaluation
• One evaluates every semester
• One evaluates each semester for the first year and then annually
In only one college does the process change after the first few years.
None of the schools tie part-time salary increases to the evaluations.

Classified personnel (Quest. 17-20) (R=6)

All six of the schools have performance evaluations by the supervisors for classified
personnel.
• Three schools report this process differs after the first few years of service.
• One college evaluates classified staff quarterly the first year, then every six
months for two more years, and then once a year from the fourth year on
• One school evaluates at 30, 60, and 90 days and 6 months the first year and then
annually
• One school evaluates every six months the first year and then once a year
• Two schools tie evaluations to salary increases.

Administration (Quest. 21-24) (R=6)

• One college surveyed does not evaluate administration
• Two use performance appraisal instruments similar to classified
• Three use peer and subordinate surveys, self-evaluations, supervisor
assessments and anonymous evaluations from faculty and staff

Four colleges annually evaluate administration.

One evaluates administration each six months the first year and then once a year.

Two schools tie these evaluations to salary increases.

Merit Pay (Quest. 25) (R=6)

Only one college still uses merit pay, and it is linked to supervisory evaluations.

Retirement

Packages and plans (Quest. 26-29) (R=6)


Only one institution offers an early retirement package, and it is a new program.
No college offers retirement bonuses.
Only two offer retirement plans separate from ORP/TRS.
• One college reports an in-house retirement package
• Another school reports that "Part-time employees who are currently
contributing to TRS/ORP, are not retirees of TRS/ORP, or are not
full-time students at MCC contribute 7.6% of total wages to a
retirement plan." Both of these plans are vested immediately.
Two colleges still participate in Social Security:
• One college pays 62% of the employee's share
• The other institution pays 50% of the employee's share
One college reports that they have a faculty pre-retirement program where faculty
members 55 years of age or older may take a reduced load.

Non-instructional work hours

Work hours (Quest. 30-31) (R=6)

Only one college does not consider 8 a.m. until 5 p.m. to be core hours of work.
This college reports some longer days, 7:30 a.m. until 5:00 p.m., and 4- or 4½-day
work weeks for non-instructional personnel.
Two colleges allow for flex-time scheduling in non-instructional positions, with the
decisions about hours being made by supervisors.

Volunteers (Quest. 32) (R=6)

Three colleges use volunteers:
• One college uses them for an information desk
• One college uses them throughout the school
• One college uses them primarily in clerical functions

They are obtained through a training program with state agencies and from those who
come to the school and volunteer.

They are managed by the designated supervisor/trainers.

One rewards them with work experience; another has no plan of reward.

Instruction

Class size (Quest. 33) (R=3)


The "average lecture class" size:
• 40 students
• 30 students
• By department as follows: Social Science, 29-30; English, 21; Business,
18; and Math, 26

INSTRUCTIONAL SCHEDULING
(Seven colleges reporting)

Fall/Spring Evening Class Scheduling

Numbers of classes and meeting times (Quest. 1-3) (R=6)


Seven institutions surveyed have evening classes.

The number of sections ranges as follows:

Academic areas
R=20-1549
Avg.=1709.6
M=500

Technical
R=20-500
Avg.=207
M=187

Continuing education area
R=38-363
Avg.=166.5
M=65

Scheduling of night classes (Quest. 3) (R=6)

Most of these classes are scheduled:
• One night a week for 3 to 3.5 hours, or
• Two nights a week for 1.5 hours
• One college offers:
• 4-hour classes per week
• 3.5-hour classes per week (no indication of number of weeks
classes are offered)

Instructors for evening classes (Quest. 4-5) (R=6)

Use of full-time faculty to teach these classes:
R=20%-75%
Avg.=49%
M=40%

Use of part-time faculty to teach evening classes:
R=25%-80%
Avg.=51%
M=50%

Faculty required to teach night classes (Quest. 5) (R=6)
• Three colleges require faculty to teach evening classes
• Three colleges do not require faculty to teach evening classes
• One college requires it only if the instructor's day load does not make

Continuing education, degrees and sites (Quest. 6-8) (R=6)

• At four of the institutions a degree may be obtained by attending only
evening classes
• Two do not allow a degree to be obtained at night

• At three community colleges students cannot take credit classes for
continuing education credit
• Three colleges do allow students to take credit classes for
continuing education credit

• Four of the institutions offer evening classes at off-campus sites such as an
Army depot, medical facilities, banks and other company locations
• Two offer no such classes.

Summer Term Scheduling


(R=6)

Summer class numbers (Quest. 9-14) (R=6)


All six of the colleges returning surveys have summer classes.
Number of class sections offered in summer 1996
R=7-1089
Avg.=433
M=348

Average number of sections offered over the last three years


R=7-973
Avg.=401
M=302

Session length (Quest. 12) (R=5)

Five schools offered six-week sessions
Five schools offered eight-week sessions
Two schools offered ten-week sessions
One school offered an eleven-week session
Four schools offered twelve-week sessions

• One school each offered:
• A nine-week session
• A five-week session
• A 2-5 week session

Contact hours generated (Quest. 13) (R=6)

The most contact hours are generated by the two short summer sessions, both
academically and technically.
Only two colleges generated any contact hours by offering:
• One long session per summer
• One long night session per summer
• Two short night sessions per summer
One college reported that it had conducted a cost-effectiveness report and found that the
cost of a course was primarily determined by "whether the course is taught by full-time
or part-time faculty."

Continuing Education Credit (Quest. 15) (R=6)


• Three of the colleges grant continuing education credit for summer academic
credit and charge the in-district tuition rate for those students
• Two of the colleges do not allow continuing education credit and academic credit
from the same course.

Nontraditional Scheduling

Definitions:
Mini-semester—a semester of credit offered in times between regular semesters
Weekend classes—credit classes which meet only on weekends (a weekend runs from
5 p.m. on Friday until midnight on Sunday)
Block classes within a long semester—credit classes offered for short condensed
periods of time rather than the 15 weeks of a semester or a summer
Extension center—an off campus location that offers college credit classes excluding
dual credit classes offered at high schools
Dual credit classes—Classes offered with both high school credit and college class
credit being granted at the same time
Concurrent classes—Classes that students still attending high school may attend for
college credit only.

Class offerings (Quest. 1-2) (R=6)


# Nontraditional offerings
4 Mini-semesters
6 Weekend sections of classes
3 Block classes
5 Extension centers
6 Dual/concurrent classes

Only one school had found another optimal scheduling time; it uses "flexible
entry" and "fast-track" classes.

Mini-semester scheduling

Classes (Quest. 3) (R=4)


Four colleges offer mini-semesters between spring and summer sessions and between fall
and spring sessions.
Attendance:
R=50-986 in 2-22 classes offered in spring/summer
R=45-601 in 2-10 classes offered in winter
Avg.=397 in spring/summer
Avg.=317 in winter

Instruction (Quest. 4-5) (R=4)


Only one college selects a part-time or full-time instructor for its mini-semesters based
on cost efficiency.
• When calculating instructor pay:
• One college pays the same as for regular classes because contact hours are
the same
• One pays mini-semester instructors at the part-time rate
• One considers it part of a load or pays class overload rates

Cost (Quest. 6) (R=4)


• Two schools do not calculate mini-semester costs separately from regular classes
• One school calculates the cost based on salaries and utilities, while another adds
administrative costs to its calculations
• Only one school has done a cost vs. revenue comparison for these courses

Students and evaluations (Quest. 7-10) (R=2)


One found that 52% of the students enrolled in mini-semesters did not attend
standard 15-week classes.
• The other found that only 10% did not attend standard classes.

• Only one college allows continuing education students to attend these academic
classes at the same time, and it charges the in-county credit rate for the class
for all students.

• Two schools use student evaluations to review the class


One school uses the number of classes that make, average class
size and need for graduation as criteria for evaluation.
• One school adds faculty assessment to the student evaluations.

Weekend class scheduling

Class setup (Quest. 11) (R=6)


• Two colleges offer once-a-week Saturday classes.
One adds Friday, Saturday and Sunday classes to its Saturday-only offerings.
• Three colleges offer Friday, Saturday and Sunday classes.

Single weekend blocks (Quest. 12-13) (R=6)


Two schools compact classes into single weekend blocks for real estate classes
• Three schools offer continuing education block classes ranging from one day
weekend classes to three-week weekend classes.

Credit for weekend blocks (Quest. 14) (R=6)


• Four schools offer weekend block classes for academic degree credit
Three schools offer continuing education credit and technical credit in weekend
block classes.

Cost effectiveness (Quest. 15) (R=6)


• Five of the schools report that weekend classes have proven to be cost effective.

Degree (Quest. 16) (R=6)


• No school offers a degree from attending weekend classes only.

Block Class Scheduling Within a Long Semester (other than mini-terms):


(Two colleges reporting)

Students and disciplines (Quest. 17-18) (R=2)


• Two colleges target adult students for block classes
One school targets 18-22 year old students
• One college targets high school students for block classes.

• Two schools report block classes in Real Estate


One school reports classes in Industrial Management, Computer Science, Physical
Education, Eating Disorders, Health Science, Marketing, Office Administration
and Drama.

Class setup (Quest. 19-23) (R=2)


• No school responded to the question about what percentage of these classes was
taught by full-time vs. part-time faculty.

• Both schools report that block classes are offered at night

• One reports that load and pay are calculated the same as for other classes.
• One school runs an eight-week block simultaneously with the two long terms.

Extension center scheduling
(Three colleges report having)

Centers (Quest. 24) (R=3)


Three schools have extension centers:
R=2-5 centers
Avg.=4.7 centers
M=3 centers

Classes (Quest. 25-27) (R=3)


• All three schools offer academic and technical courses requiring at least 10
students for a class to make

• Two offer continuing education courses requiring at least 12 for the class
to make.

• No school offers a complete degree program through the centers.

Staff (Quest. 28) (R=3)


• Two schools use full-time faculty from the campus to teach some of the
extension classes, considering the classes to be part of the regular load or
an overload.

• Three of the schools use part-time faculty in the vicinity of the center.

• One school requires that faculty teach if local community people cannot be
found.

• Only one school pays an on-site coordinator to coordinate the creation of
the schedule, including classroom assignments, locating faculty, textbook
sales and promotion of programs.

Travel (Quest. 28-31) (R=3)


• One school provides a college vehicle and meals for travel to extension
centers.

• Two others pay mileage of 24-28 cents per mile for using personal
vehicles.

• No school compensates for time spent traveling; however, travel mileage
is granted by both schools.

WORKFORCE DEVELOPMENT RESULTS
(Six colleges reporting)

Workforce Continuing Education


Classes (Quest. 1-2) (R=6)
Need for classes is determined by:
Requests from business/industry
Contacts with business/industry
Popular topics
Required by the government
Observation
Surveys
Needs assessments
Sales calls

Class staffing (Quest. 3) (R=6)


Classes are staffed at
Four colleges by
Part-time faculty
Two colleges by
• Full-time faculty
• External consultants
One college by
Continuing Education Adjunct Faculty

Faculty load (Quest. 4) (R=6)


• Two colleges do not count the continuing education classes as faculty
load.
• Two colleges sometimes count the classes as load; one noted that some
classes have combined continuing education and credit; therefore those
classes are a part of the faculty's load.
• One college reports that when the classes are not part of the faculty's load,
they are paid for their services.

Funding (Quest. 5-8) (R=6)


All six colleges charge the students tuition for the continuing education classes.
• Five colleges contract with business to fund classes.
• Four colleges charge special student fees.
• One college relies on state funds.

All six colleges receive contact hour funding for continuing education classes.

• Six colleges give CEU credit for continuing education classes.
• Two colleges give certificates of completion.
• One college gives academic credit for completion.

• Three colleges strive to break even on continuing education classes.


• Three colleges strive to make a profit on continuing education classes.

Credit (Quest. 9) (R=6)


• Four colleges do not allow both continuing education credit and academic
credit in the same class.
• Two colleges do allow both continuing education credit and academic
credit in the same class.

Administration (Quest. 10) (R=6)


Three colleges use directors to schedule and staff these classes.
• Three colleges use deans to schedule and staff these classes.

Marketing (Quest. 11-12) (R=6)


Five colleges advertise in newspapers.
Four colleges advertise in the
• College catalog
• Separate catalog or brochure
• Mailouts
Three colleges advertise by contacts with business
One college also uses the radio for advertisement

Two colleges use 0% of their continuing education budget for marketing.


One college uses 4% of its budget for marketing.
One college uses 10% of its budget for marketing.

Customized Contract Training


Classes (Quest. 1-2) (R=5)
Need for training is determined by
Requests from business/industry and government
One-on-one marketing leads; sales calls; meeting with clients
Training needs assessment
Anticipated company needs
Popular topics
Surveys

Activities are staffed as follows:
• Four colleges use part-time faculty.
• Three colleges use full-time faculty.
• One college uses external consultants.

Faculty load (Quest. 4) (R=5)


• Two colleges do not consider continuing education classes part of faculty load.
• Two colleges do consider them load in some cases—
if continuing education credit and academic credit are given for the same
class.
• One college tries to employ faculty on an adjunct basis for contract training.
• One college pays faculty for their time at a contract price.

Funding (Quest. 5-8) (R=5)


• Five colleges contract with business to fund the training.
• Four colleges charge student tuition for the training.

• Three colleges assess special student fees for the training.

All five colleges reporting receive contact hour funding for the training.

• Four colleges give CEU credits for the training.


• Two colleges give academic credit for the training.
• One college gives a certificate for the training.
• Four colleges seek to make a profit from such training.
• One college seeks to break even from such training.

Credit (Quest. 9) (R=5)


• Three colleges allow both continuing education and academic credit to be
given for training.
• Two colleges do not allow both continuing education and academic credit
to be given for training.

Administration (Quest. 10) (R=5)


• Two colleges have directors to administer these training activities.
• Two colleges have deans to administer these training activities.
One college has a Workforce Training Coordinator to administer these
training activities.

Marketing (Quest. 11-12) (R=4)
Four colleges advertise by
• Separate catalogs or brochures
• Contracts with business
Three colleges advertise by
• Mailouts
• Newspapers
Two colleges advertise by
• College catalog
• Word of mouth
• Radio

Two colleges report marketing budget percentages (R=2):


• One uses 4% of its program budget on marketing.
• One uses 5% of its program budget on marketing.

Organizational Charts (Quest. 12) (R=6)


• Most of the colleges attached organizational charts.
• Colleges market their institutional education services by turning to internal
sources first and then to external forces.
• Colleges seek to meet the needs of business/industry, including small
businesses.

Evaluations (Quest. 13-14) (R=6)


Four colleges evaluate the classes and training by
• Student/participant evaluations
• Administrative evaluations
• Faculty and staff evaluations
• Director of Continuing Education evaluates staff and courses
• Employer evaluations

Two colleges evaluate the entire program by


• An internal strategic plan
• Percentage of repeat business
• Percentage of growth within the division
• Contracts initiated
• Clients served
• Revenues generated

Evaluations are used:
As a management tool
In quality improvement
As part of the Institutional Effectiveness Plan
In establishing goals and objectives
In budget requests
In corrective action
In program and course improvements

TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS
Business Offices

Mid-sized community colleges in Texas:


Budgeting
1) allocate less than 1% of the total budget to equipment other than technology
equipment

2) allocate about 11% of the total budget to maintenance or physical plant operations
including equipment, supplies, personnel, and travel

3) allocate less than 1% of the total budget to travel

4) prioritize the budget by categories, with existing personnel being the most important
category

5) have developed energy management systems to cut utility costs

6) allocate travel money through departmental/division budgets

Purchasing
7) use group purchasing for supplies

8) combine with outside institutions or businesses for group purchasing

9) are open to more groups purchasing combinations

Insurance other than UGIP


10) undertake a bid process to select liability and workers' compensation coverage
11) purchase general liability insurance for college employees.

TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS
Computer Technology

Mid-size community colleges in Texas:

1) Use the COBOL programming language

2) Maintain and service their computer equipment with in-house staff

3) Have general technology fees

4) Provide network accounts to students with e-mail but without remote access

5) Most often support NOVELL as the network operating system

6) Do not require students to sign contracts before using the institution's technology system
7) Have no set procedures for training employees on hardware and software, and no
mandatory hardware or software training

8) Distribute technology budgets among department budgets

9) Include telephones, instructional equipment, computers, LAN and WAN in
technology budgets

10) Have a central location on campus for technology service.

11) Provide service in eight hours or less after the request for assistance
12) Have experienced technology problems in both software updates and software
installation.

TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS
Developmental Education

Mid-sized community colleges in Texas are

1) offering remediation through developmental courses.

2) offering 10% of remediation through noncourse offerings.

3) not using graders or assistants in developmental classes.

4) charging students for developmental classes and labs by applying regular tuition rates
and lab fees

5) figuring load for developmental classes by contact hours

6) teaching the majority of developmental writing classes with full-time instructors.

7) teaching the majority of developmental reading classes with full-time instructors.

8) limiting developmental writing class size to 20.

9) limiting developmental reading class size to 20-25.

10) limiting developmental math class size to 30.

11) placing students in remediation based on placement test scores such as TASP.

Tutoring

12) offering peer tutoring by labs, one-on-one instruction and small groups.

13) offering professional tutoring by labs, one-on-one instruction and small groups.

14) funding tutoring services by institutional funds, work-study funds, and grant funds.
Learning Centers
15) offering learning centers which offer basic skills remediation, GED preparation,
TASP remediation, tutoring, and college-level computer-assisted instruction.

16) using lab supervisors, full-time instructors, and part-time instructors in the learning
centers and developmental labs.

TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS
Distance Education

Mid-sized community colleges in Texas are

1) involved in distance education.

2) delivering both credit and noncredit courses by distance education.

3) having distance education classes taught by full-time faculty as part of their load.
4) registering and assisting distance education students using the same methods as other
classes use such as telephone and on-campus registration.

5) planning to expand distance education by establishing 2-way interactive systems and
Internet class offerings

Budgeting
6) budgeting delivery systems through institutional budgets and grants.

7) not charging distance education students any additional fees.

Agreements
8) in distance education agreements with one other institution.

Delivery Systems
9) using cable television, the Internet and video to deliver distance education.

Internet

10) using the Internet as both a teaching and learning tool.

11) using the Internet as a resource for additional classroom information.

12) displaying and maintaining their own Web page as an institution.

13) offering Internet services to both personnel and students.

14) offering Internet connection through PCs.


Video
15) using traditional teleconferencing and cassettes to deliver video distance education.

16) subscribing to the Starlink video network.

17) not calculating whether offering video classes exceeds the expense of using video.

TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS
Research/Planning/Institutional Effectiveness

Mid-sized community colleges in Texas:

Resource Development Grants


1) receive more than $1 of their institution's income per contact hour from federal
sources.

2) use a grant approval process.

3) try to absorb personnel and resources after a grant has been concluded.

4) do not use an exit plan for grants.

Foundation

5) have college foundations.

6) manage their foundations through boards.

7) have 85% or more of the foundation management costs paid by the institution.

8) distribute 75% or more of the funds in scholarships.

TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS
Instructional

Mid-sized community colleges in Texas:

1) on the average exceed the state median in per contact hour cost in institutional
support and library.

2) on the average are below the state median in per contact hour cost in instructional
administration cost.

Instructional administration
3) allow full-time instructors to receive load credit for lab instruction.

4) grant teaching load reduction or release time for school responsibilities outside of
instruction.

5) use a pro rata formula based on the number of students to figure load in small transfer
classes.

6) do not offer extra-large lecture classes.

Instructional personnel

7) require faculty to maintain office hours with minimum requirements.

8) do not increase counselor salaries with credentials such as LPC.

Professional development
9) offer professional development to part-time faculty.
10) offer professional development to technical/vocational faculty to maintain proficiency
within their fields, which includes training for new technology which the institutions fund.

11) provide institution-wide funds for professional development.

Part-time faculty

12) use part-time faculty

13) advertise part-time faculty positions in local newspapers

14) compensate part-time instructors by a specific load hour rate


15) decide to replace part-time with full-time faculty based on enrollment, including the % of
sections needed and enrollment history per discipline

16) decide to replace full-time with part-time faculty based on enrollment decreases plus
other factors

17) use student evaluations for part-time instructor evaluations.

Instructional computer labs and staffing


18) have separate area computer labs.

19) use both lab instructors and lab assistants.

20) have campus-wide computer systems for administrative, academic, and technical
areas which can be accessed by all three areas.

Noncomputerized labs and staffing


21) differentiate between lab instructors, lab assistants and lab supervisors.

22) consider lab assistants, excluding students, as classified personnel.

23) use student workers in labs to assist students and provide tutorial and clerical
assistance.

24) use lab assistants to aid students.

25) use lab supervisors to aid students in labs.

26) require lab instructors to have a Bachelor's degree.

27) separate lectures and labs with faculty being responsible for the lecture.

28) have faculty responsibility for the labs as well.

29) organize technical labs on competency-based criteria.

Equipment
30) use cost as a consideration in obtaining equipment by rental, lease or purchase.

31) dispose of equipment at surplus sales/auctions.

TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS

Outsourcing

Mid-sized community colleges in Texas outsource:

1) food service

2) publications

3) vending machines

TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS
Personnel

Mid-sized community colleges in Texas:

1) consider a full-time faculty load as 15 hours per semester.

2) use part-time classified staff

3) pay part-time classified staff on a structured pay schedule.

4) do not use part-time administrators.


5) use an evaluation process for fiill-time faculty that includes student surveys, self-
evaluations and supervisor assessments.

6) consider the evaluation process a standard for full-time faculty.

7) vary the review process in the area of frequency after the first few years.

8) use an annual evaluation process for part-time faculty which duplicates the full-time
faculty process.

9) evaluate part-time faculty annually.

10) use annual performance evaluations by supervisors for classified personnel after the
first year of work.

11) tie salary increases to the performance evaluations for classified personnel.

12) use an annual evaluation process for administrators.

13) do not award merit pay.

14) do not offer early retirement packages.

15) do not offer retirement bonuses or programs.

16) do not offer separate retirement plans from ORP/TRS.

17) do not participate in Social Security other than Medicare.

18) have a core time from 8 a.m.-5 p.m. Monday through Friday that all non-instructional
employees work.

TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS
Instructional Scheduling

Mid-sized community colleges in Texas:

Night classes

1) offer evening classes.

2) offer evening classes that meet one night a week for three hours.

3) require full-time faculty to teach evening classes as a part of their regular load.

4) grant degrees to students who take only evening classes.

5) do not allow students to take credit classes for continuing education credit.

6) schedule evening classes at company sites such as medical facilities, banks and Army depots.

Summer Term
7) offer summer classes in six-week, eight-week and twelve-week sessions.
8) generate enough enrollment to be cost effective in both academic and technical classes
offered in two short summer sessions.

Nontraditional
9) offer mini-semester classes, weekend sections of classes, extension center classes and
dual/concurrent classes.

10) do not compact classes into single weekend blocks allowing a student to complete an
academic class in a shorter time frame than usual.

11) offer academic degree credit for weekend classes.

12) believe that weekend classes are cost effective.

13) do not grant degrees for weekend classes only.

TRENDS IN MID-SIZED COMMUNITY COLLEGES IN TEXAS
Workforce Development

Mid-sized community colleges in Texas:

1) offer both workforce continuing education programs and customized contract training
classes.

Workforce Continuing Education


2) assess the need for continuing education classes by both requests and contacts with
business.

3) staff continuing education classes mostly with part-time staff.

4) fund continuing education classes by student tuition, special student fees and
contracts with business.

5) receive contact hour funding for workforce continuing education.

6) grant CEU credit for continuing education classes.

7) do not allow both continuing education and academic credit for the same classes.

8) advertise continuing education credit through local newspapers, the college catalog,
special brochures or catalogs, and mailouts.
Customized Contract Training
9) assess the need for customized training classes by requests from business.

10) staff customized training classes with mostly part-time faculty.

11) fund customized training by contracting with business and student tuition.

12) receive contact hour funding for customized training classes.

13) grant CEU credit for customized training classes.

14) seek to make a profit from customized training classes.


15) advertise customized training classes through special catalogs or brochures and
contacts with business.

16) evaluate workforce education classes and training by student/participant evaluations,


administrative evaluations, faculty and staff evaluations, the directors of continuing
education and employer evaluations.

17) use evaluations as management and improvement tools.

APPENDIX E

BENCHMARK FOLLOW-UP SURVEY

Benchmark Study Follow-up Survey
Study Procedures:
Please share your thoughts about the way the survey was:
Constructed? (Allowing the sponsor school to organize the preliminary survey,
but allowing all participants to make changes to the survey instrument to include their
areas of concern and need)

Analyzed? (Raw data that was sent to each school)

Used? (In your institution, to compare yourselves with other institutions in the
study?)

Use of Study Data:


Check in the left column any of the ten areas benchmarked which used the information
collected to improve or alter the processes or procedures analyzed. Then, in the other
two columns, please indicate the process or procedure compared and the improvements
considered or made.
Benchmarked Area    Process or Procedure Compared    Improvements Considered or Instigated
Business Affairs
Computer Technology
Developmental Instruction
Distance Education
Institutional Effectiveness
Instructional
Outsourcing
Personnel
Scheduling Instruction
Workforce Development

Check in the left column any of the ten areas benchmarked which used the information
collected for self-analysis of the processes and procedures included. Then, in the other two
columns, please indicate which processes or procedures were compared or analyzed,
and, if different from the above chart, please indicate any improvements being made or
already made.
Benchmarked Area    Processes or Procedures Analyzed    Improvements
Business Affairs
Computer Technology
Developmental Instruction
Distance Education
Institutional Effectiveness
Instructional
Outsourcing
Personnel
Scheduling Instruction
Workforce Development

Check any of the ten areas benchmarked in which the area's participants have established
contacts with others in their area from the survey for networking purposes.
Business
Computer Technology
Distance Education
Developmental Education
Institutional Effectiveness
Instructional
Outsourcing
Personnel
Scheduling Instruction
Workforce Development

Data Presentation:
Comment on how the data was used after the final report was sent by sharing:
Where the data was located?

Who had access to the data?

Did your school do a gap analysis of the data to compare your institution's performance?
yes no other
If so, check which areas:
Business
Computer Technology
Distance Education
Developmental Education
Institutional Effectiveness
Instructional
Outsourcing
Personnel
Scheduling Instruction
Workforce Development

Results:
Check any areas in which you found the material useful for making future decisions for
your institution.
Business
Computer Technology
Distance Education
Developmental Education
Institutional Effectiveness
Instructional
Outsourcing
Personnel
Scheduling Instruction
Workforce Development

Please share how the material helped in the areas checked.

What improvements have been started as a result of analyzing the data from the
benchmark study?

Share two things that you learned about your institution when it was compared to others.

Are any steps being taken to improve your weak areas of comparison from the study?
yes no

What do you think should have been done with the data from the study that was not done
at your institution?

Would your institution participate in another peer institution benchmark study?


yes no

Would your institution consider sponsoring a peer institution benchmark study?


yes no

Comment on how any of the data was used after the final report was sent that hasn't
been covered by these questions:

APPENDIX F

BENCHMARK STUDY ORIGINAL TIMELINE

TIME LINE FOR RESOURCE MANAGEMENT STUDY

Table F.1. Time line for resource management study


Deadline Task

4/12/96 Survey coordinator begins study by reviewing past Task Force
actions and training the Task Force on benchmarking
4/18/96 Identify peer institutions, identify LCC* staff who will help, and
draft first questions for each area of the survey
4/19/96 Meet with Task Force to establish study parameters; contact
peer institutions for input on survey design
4/26/96 Submit first draft of questions to LCC staff for feedback on survey
instrument
5/9/96 Revise first drafts to reflect feedback from LCC staff/
administration and submit to Task Force at LCC for feedback
5/10/96 Revise draft to reflect feedback
5/28/96 Complete revision of first draft for feedback from LCC staff again
6/10/96 Submit second draft to LCC staff for feedback
6/17/96 Set up committee meetings in various areas for feedback on second
draft
6/18/96 Meet with scheduling staff for feedback on second draft; meet with
technology committee on first draft of survey instrument
6/19/96 Meet with personnel staff for feedback on second draft
6/20/96 Revise survey to reflect feedback from staff and submit to peer
institutions for feedback
6/25/96 Contact peer institutions for feedback on study drafts and areas of
inquiry additions
6/27/96 Continue meetings with LCC staff on second draft revisions;
continue receiving revisions from peer institutions; revise second
drafts
7/1/96 Submit survey instrument to Task Force Committee, to internal
areas and to peer institutions for final revisions
7/8/96 Receive survey and make final revisions by July 15
7/15/96 Send out final instrument to LCC areas for completion
8/1/96 Receive survey instrument back and record data to establish
internal guidelines and make survey revisions
8/17/96 Send out final instrument to peer institutions for completion
9/3/96 Finish recording LCC data from internal study
9/16/96 Begin recording peer institution data
9/30/96 Complete analysis of instrument data
10/1/96 Submit data to participating institutions for accuracy review
10/7/96 Clarification of data via correspondence or meeting or both
10/15/96 Complete final report and submit to peer institutions and LCC
Task Force
*LCC stands for the Lead Community College

APPENDIX G

SAMPLES OF LEAD COLLEGE GAP ANALYSIS

BUSINESS OFFICE GAP ANALYSIS

Unrestricted monies (Quest. 1-2)


Total unrestricted 1995-96 educational and general budget
Range: $16,815,107-$39,990,027
Average: $26,297,000
Median: $23,790,697
LCC:* $30,034,586

Total unrestricted 1994-95 audited financial statement


Range: $16,606,345-$38,558,402
Average: $23,162,000
Median: $21,398,102
LCC: $22,228,603

Budget Expenditures: (Quest. 3-4)


Dollar Amount Percentage of Budget
Equipment Range: $100,000-$1,140,372 Range: .4%-3.23%
Average: $481,086 Average: 1.44%
Median: $186,362 Median: .78%
LCC: $208,524 LCC: .69%

Technology Range: $83,532-$647,605 Range: .5%-3.1%


Equipment Average: $383,525 Average: 1.55%
Median: $400,000 Median: 1.7%
LCC: $586,488 LCC: 1.96%

Expendable Range: $359,000-$2,832,369 Range: 1.7%-6.68%


Supplies Average: $1,133,993 Average: 3.87%
Median: $883,432 Median: 3.4%
LCC: $2,832,369 LCC: 6.68%

Maintenance Range: $1,005,276-$3,826,943 Range: 4.5%-11.6%


(Physical plant Average: $2,521,000 Average: 9.94%
operations) Median: $2,488,507 Median: 11.18%
LCC: $3,357,620 LCC: 11.18%

Personnel Range: $12,161,114-$29,608,590 Range: 60.87%-76.83%


(including Average: $19,841,000 Average: 70%
fringe benefits) Median: $21,474,420 Median: 72.53%
LCC: $21,738,197 LCC: 72.53%

*LCC stands for lead community college.

Dollar Amount Percentage of Budget
Personnel Range: $11,159,114-$26,632,590 Range: 57.81%-75.58%
(excluding Average: $16,965,000 Average: 65.2%
fringe benefits) Median: $14,227,829 Median: 65.26%
LCC: $19,601,637 LCC: 65.26%

Personnel Range: $21,741-$2,985,000 Range: 1.25%-8.1%


fringe benefits Average: $1,425,000 Average: 5.33%
Median: $1,081,084 Median: 6%
LCC: $2,221,560 LCC: 7.4%

Professional Range: $19,000-$238,965 Range: .1%-1%


Development Average: $82,149 Average: .36%
Median: $35,316 Median: .16%
LCC: $34,480 LCC: .11%

Travel Range: $25,033-$359,663 Range: .01%-1.2%


Average: $187,925 Average: .71%
Median: $149,452 Median: .43%
LCC: $359,663 LCC: 1,2%

Utilities Range: $737,870-$2,147,079 Range: 3.18%-5.6%


Average: $1,166,515 Average: 4.41%
Median: $956,276 Median: 4.1%
LCC: $956,276 LCC: 3.18%

Budget Priorities: (Quest. 5) (R=7)


Category Mode Rank # Colleges Avg Rank LCC Rank
for that Rank
equipment 3 2 3.8 3
expendable supplies 3 2 4.8 5
fringe benefits 2 2 3.5 7
personnel (existing) 1 4 1 1
personnel (new positions) 2 2 4 2
travel 8, 5 2 each 6.5 8
technology 6, 4 2 each 4.2 4
maintenance 5 2 5 6

Institutional Budget (Quest. 8-9)


Percentage of budget spent on travel
Range: .43%-2%
Average: 1.67%
LCC: 1.16%

Dollar amount spent on travel
Range: $72,920-$400,000
Average: $254,186
LCC: $349,204

Percentage of travel budget by area:


Area Range Average LCC
Instructional .082%-30% 8.76% 11%
Institutional .359%-26% 9.83% 20%
Professional 26%-92% 56.5% 53%
Student .2%-18% 11.4% 16%

COMPUTER TECHNOLOGY GAP ANALYSIS
Policy Questions

General technology fee: (Quest. 7)


Four colleges have general technology fees.
• Two colleges charge $10 per semester
• Three colleges charge $3 per semester hour
LCC does not have a technology fee.

Network accounts for students: (Quest. 8)


Four colleges provide students with network accounts including:
• Four—E-mail
• Three—WWW
• Two—Telnet
LCC does not provide such access at this time.

Machines on primary network: (Quest. 17)


R=250-1200
Avg.=796
M=900
LCC=1200

Budget

Technology budget: (Quest. 22)


R=$200,000-$3,000,000
Avg.=$1,554,600
M=$1,642,031
LCC=$1,678,962

Percentage is telecommunications: (Quest. 23)


R=.85%-40%
Avg.=14.6%
M=8.2%
LCC=10.5%

Percentage is instructional equipment: (Quest. 24)


R=15%-50%
Avg.=29%
M=28%
LCC=17%

Percentage is computer equipment: (Quest. 25)
R=2%-30%
Avg.=14%
M=12%
LCC=5%

Percentage is maintenance: (Quest. 26)


R=ll%-20%
Avg.=16%
M=16.5%
LCC=20%

Service

Central location for service assistance: (Quest. 29)


Four colleges call a central location on campus for service assistance.
LCC calls a central location on campus for service assistance.

Approximate time lapses: (Quest. 30)


Hardware problems:
R=1/2 hr.-days (One college reports hours to days)
Avg.=3.6 hrs.
LCC=6.3

Printer problems:
R=1/2 hr.-days (One college reports hours to days)
Avg.=5 hrs.
LCC=6.3

Software problems:
R=1/2 hr.-days (One college reports hours to days)
Avg.=3.6 hrs.
LCC=6.3

Connections to LAN:
R=1/2 hr.-16 hrs. (One college reports hours to days)
Avg.=5.8 hrs.
LCC=No reply

Connections to WAN:
R=1/2 hr.-days
LCC=No reply

DEVELOPMENTAL LABS, INSTRUCTION AND TUTORING

GAP ANALYSIS

Developmental Instruction

Table G.1. Contact hours, employees, students: (Quest. 2)


Developmental Contact Hours Number of Employees Number of Students
Area

English/Writing R=15,984—104,528 R=5+-69 R=313-2307


Avg.=47,510.9 Avg.=24 Avg.=1024
LCC=39,552 LCC=16 LCC=412
Math R=45,344-119,744 R=14-37 R=821-3025
Avg.=77,034.6 Avg.=24.8 Avg.=1938
LCC=74,208 LCC=30 LCC=1210
Reading R=7,424-67,376 R=3-18 R=185-1088
Avg.=34,330.7 Avg.=11 Avg.=615
LCC=26,720 LCC=10 LCC=606

ESL R=7,168-14,880 R=5-13 R=92-224


Avg.=11,029 Avg.=9 Avg.=151
LCC=11,040 LCC=5 LCC=138
*LCC stands for Lead Community College

FTE faculty in developmental area: (Quest. 3)


R=13-45
Avg.=25
LCC=28

FTE support staff in developmental area: (Quest. 4)


R=0-5
Avg.=2.6
LCC=4.85

Percentage of remediation done through noncourse offerings: (Quest. 5)


R=0%-10%
Avg.=7.3%
LCC=10%

FTE faculty in the noncourse offerings: (Quest. 6)


R=0-3
Avg.=1
LCC=.75

FTE support staff in the noncourse offerings: (Quest. 7)
R=0-2.25
Avg.=1
LCC=2.25

Table G.2. Percentage of developmental classes taught: (Quest. 12)


Area Full Time Instructors Part Time Instructors
Writing R=54%—100% R=1.7%—46%
Avg.=77.6% Avg.=25.5%
M=80% M=28%
LCC=65% LCC=35%
Reading R=5.6%—100% R=12%—94.4%
Avg.=61.4% Avg.=46%
M=69% M=38%
LCC=88% LCC=12%
Math R=21%—100% R=14%—79%
Avg.=60.2% Avg.=46.4%
M=60% M=45%
LCC=21% LCC=79%

Other (specify) ESL R=42.9%—69% R=31%—57.1%


Avg.=51.6% Avg.=48.4%
LCC=43% LCC=57%

Developmental class size limit: (Quest. 13)


Writing
R=8-23
Avg.=19
M=20
LCC=20
Reading
R=12-25
Avg.=22
LCC=25
Math
R=22-30
Avg.=27.8
LCC=30

ESL-One college indicated a limit of 12 in ESL classes.

Developmental lab size limit: (Quest. 20)
Writing
R=20-24
Avg.=21.5
M=20
LCC=20
Reading
R= 15-30
Avg.=22.8
M=24
LCC=25
Math
R=24-30
Avg.=28.5
LCC=30

DISTANCE EDUCATION GAP ANALYSIS

Personnel (Quest. 8-11)


Number of FTE faculty dedicated to distance leaming:
• 2—Two colleges
• 2.5
• 8
• 25
LCC has 2.

Number of FTE support personnel dedicated to distance leaming:


• On an as-needed basis
• Less than 1
• 1
• 2
• 6
LCC is on an as-needed basis.

Only LCC has an agreement with a public television station.

RESEARCH/PLANNING/INSTITUTIONAL EFFECTIVENESS GAP ANALYSIS

Classified employees: (Quest. 2)


R= 1/2-4
Avg.=2.2
M=3
LCC=1

Job Titles and Functions in Institutional Effectiveness

Research    Job Titles


R=0-3 Director of Institutional Research and Records
Avg.=1.4 Management
M=2 Secretary of Institutional Research
LCC=0 Research Assistants
Coordinator of Evaluation
Program Evaluation Research Associate
Student Tracking Director of Institutional Research,
Oversight, General Research

Planning    Job Titles


R=0-2 Vice President of College
Avg.=.7 Secretary
M=1 Secretarial
LCC=0 Director of Institutional Research
Chief Planning Officer

Institutional
Effectiveness    Job Titles
R=0-2 Dean of Program Development and Institutional Research
Avg.=.6 Secretary
LCC=0 Director of Institutional Research, Oversight

Resource Development
Grants

Percentage of annual budget is grants and contracts: (Quest. 6)


R=0%-14%
Avg.=8.33%
M=10%
LCC=10%

Income per contact hour from federal sources: (Quest. 7)
R=0-$1.30
Avg.=$.92
M=$1.12
LCC=$1.12

Employees involved in pursuing or overseeing grant money: (Quest. 8)


R=l-6
Avg.=2.8
M=2.5
LCC=1

Foundation

College foundation: (Quest. 12)


Five colleges have college foundations worth
R=$500,000-$11,091,978
Avg.=$4,325,744
M=$2,436,740
LCC=$11,091,978

Percentage of foundation management costs: (Quest. 13)


by institution:
R=66%-100%
Avg.=86%
M=85%
LCC=85%

by foundation funds:
R=5%-33%
Avg.=17%
M=15%
LCC=15%

Percentage of distributable funds for scholarships: (Quest. 18) (R=5)


R=40%-100%
Avg.=74.6%
M=78%
LCC=78%

Percentage of distributable funds for institutional support: (R=4)


R=12%-40%
Avg.=24.3%
LCC=12%

INSTRUCTIONAL SURVEY GAP ANALYSIS

Cost Study

Table G.3. Per contact hour cost-1994-95: (Quest. 1)


Area State Median LCC* Range Average
Institutional 0.83 0.82 1.13-.65 0.88
Support

Student Services 0.44 0.39 .55-.39 0.44


Staff Benefits 0.29 0.28 .56-.08 0.29
Library 0.19 0.28 .26-.19 0.24

Table G.4. Contact hour study-1994-95: (Quest. 2)


Area Acad. Acad. Acad. Acad. Vo/Tec Vo/Tec Vo/Tec Vo/Tec
State LCC* Instit. Instit. State LCC* Instit. Instit.
Median Median Range Avg. Median Median Range Avg.

Inst. 0.21 0.21 .48-.20 0.32 0.28 .53-.13 0.28


Admin. .04
Organ.
Activities 0.08 0.00 0.00 0.00 0.07 0.05 0.00 0.01
Related to
Instruction
*Lead Community College used as example

Instructional Administration

Table G.5. Instructional administrators compensation and load


#Have Level and title Compensation Considered Load Contract
6 Vice president Salary & 100% 5-Admin. 0 6-12 mon.
6-for instruct. release time 1-Fac.
1-acad. affrs. LCC pays LCC-Fac.
6 Dean/Div. Chr. $2700; 80%
release time
*All responding attached job descriptions for defining duties of each position.

Calculating compensation: (Quest. 8) (R=5)


Two colleges use a formula for calculating compensation for instructional administrators.
LCC uses a formula for calculating compensation.

Faculty Load Considerations

Faculty load calculated: (Quest. 10) (R=5)


Three colleges require 15 credit hours for a full load
One college requires
• 15 load hours with load hours determined by formula
• Minimum of 21 contact hours
• Ratio is as follows: lecture 1:1, lab 1:2; clinical .75:1
LCC requires 15 credit hours.

Vocational Classes: (Quest. 18-20) (R=5)


Three colleges do not count vocational classes as faculty load.
Two colleges do count vocational classes as faculty load calculating
• A determined class by class basis equivalence with a credit class
• Formula based on contact hours
LCC does not count classes as faculty load.

Two colleges offer mixed classes (i.e., semester hour credit students and adult vocational
students in the same class) calculating load
• On basis of the load for the credit class
• By counting adult vocational students toward the total required to make a
class, since they pay the same costs.
LCC does not have mixed classes.

Average lecture class: (Quest. 21)
R= 19.52-35
Avg.= 28.4
M=30
LCC=19.52

Extra large lecture classes: (Quest. 24-26)


Four colleges do offer extra large lecture classes.
• Biology sections of 80-90
• Telecourses of 60-70 students in one section
• Greater than 45
• Count of above 50
Three colleges offer compensation for extra large classes.
• Two colleges give extra load credit
• One college pays $25 per student over class maximum as compensation
Two colleges offer faculty of extra large classes help in grading.
• By using SCANTRON with lead faculty grading essay portions of exams
• By permitting student help
LCC does not offer extra large lecture classes.

Instructional Personnel

Percentage of full-time employees who are full-time faculty: (Quest. 27)


R=38%-50%
Avg.=45.7%
M=47.5%
LCC=38%

Percentage of continuous part-time employees who are part-time credit faculty: (Quest. 25)
R=67%-95%
Avg.=85.75%
M=90.5%
LCC=90%

Rank system: (Quest. 29)
Two colleges have rank systems
LCC has a rank system

Table G.6. Librarians and counselors: (Quest. 32-38)


Title Considered faculty Considered # By This Title
Nonfaculty
Counselors R=9-10 R=5 R=0-10
Avg.=9.5 Avg.=5 Avg.=6
LCC=all faculty
Librarians R=l-7 R=2 R=2-7
Avg.=3.25 Avg.=2 Avg.=3.75
LCC=all librarians M=3

Ratio of full-time librarians per FTE (15 credit hours) students:


1:1754
1:1786
1:852
1:1160
LCC=1:1754

Ratio of full-time librarians per student (unduplicated head count):


1:3282.5
1:2598
1:1484
1:1775
LCC= 1:3283

Ratio of full-time counselors per FTE (15 credit hours) students:


1:389.7
1:1071
1:596
LCC= 1:390

Ratio of full-time counselors per student (unduplicated head count):


1:729.4
1:1559
1:1039
LCC=1:729

Counselor credentials:
Two colleges require a Master's Degree in guidance and counseling
• One college requires L.P.C.
LCC requires a Master's Degree in guidance and counseling.

Academic advising system: (Quest. 39)


Three colleges have integrated counseling systems
• With two colleges having major advised by faculty and undeclared/liberal
arts advised by counselors
• One college has both individual faculty advising and scheduled "drop-in"
sessions where faculty advise large numbers
Two colleges have decentralized counseling by departmental faculty
LCC has a centralized counseling center.

Mentoring system for new personnel: (Quest. 40)


Three colleges have mentoring systems for new personnel.
• That are full-time faculty
• That are part-time faculty
• One college has a mentoring system for classified.
• One college compensates the faculty mentors $50 per mentee per semester
• Two colleges select the mentors from the same work area
LCC has a mentoring system for full-time faculty that provides compensation.

Professional Development

Responsibility for academic professional development: (Quest. 41)


Two colleges have Professional Development coordinators or directors
One college has the
• Dean of LRC/Instructional Dean/Faculty Senate oversee professional
development
• Professional Development Committee composed of faculty oversee
development.
• In two colleges the position is part-time.
• In two colleges the position is full-time.
• In two colleges the position is administration.
• In two colleges the position is faculty.
LCC has a part-time faculty professional development coordinator.

Part-time faculty professional development: (Quest. 47)
Four colleges offer professional development to part-time faculty.
One college offers these
• Same as other groups—numerous workshops, seminars, conferences and travel
• At the beginning of each semester
LCC offers professional development.

Professional development to maintain field proficiency: (Quest. 48)


All five colleges offer professional development to technical/vocational faculty to
maintain proficiency within their fields.
• In three colleges such training is separate from the academic.
LCC is not separate from the academic.

• In three colleges it is for additional certifications


LCC is not for additional certification.

• In four colleges the college pays for technical faculty to receive training in new
technology
LCC does not pay for technical faculty to receive training in new technology.

• In two colleges the college pays for technical faculty to receive additional
certifications.
LCC does not pay for technical faculty to receive additional certifications.

Technical faculty receive release time: (Quest. 49)


Three colleges grant release time for industry training
One college grants release time based on
• A policy, but no one has done it.
• Professional development funds
• Faculty application to the Professional Development Committee.
(Proposal is then reviewed and funded/not funded.)
LCC does grant release time for industry training.

Professional development funding: (Quest. 50)


Four colleges provide institution-wide funds for professional development.
• Three colleges provide funds through department/division budgets.
• One college uses Title III, Strengthening Institutions Grant, funds.
LCC provides through institution-wide funds.

Required professional development activities: (Quest. 52)


Three colleges require some professional development activities.
LCC does not require any professional development activities.

Part-time faculty

(Quest. 55)
Percentages of credit contact hours generated by part-time faculty for fall 1995:
R=10%-50%
Avg.=27.6%
M=24%
LCC=29.4%

Percentage of noncredit contact hours generated by part-time faculty for fall 1995:
R=95%-100%
Avg.=98.3%
LCC=95%

Maximum percentage goal (% of classes) of part-time faculty: (Quest. 60)


• Two colleges have not defined this percentage.
• Two colleges report 35 % as a maximum percentage for part-time faculty.
LCC maximum of 35%.

Instructional Computer Labs and Staffing

# computer labs: (Quest. 64)


R=5-53
Avg.=29
M=32
LCC=36

Noncomputerized Labs and Staffing

Lab staffing differentiation: (Quest. 69)


All five colleges differentiate between lab instructors, lab assistants and lab supervisors.
Two colleges define a lab instructor as faculty.
One defines as
• Termed teaching assistant—able to set up, teach lab
• Primary lead
• Full-time classified staff, with approximately 30 hours of 40 meeting
with computerized labs and group sessions
LCC defines as faculty.

Lab assistants considered: (Quest. 70-71)
Four colleges consider lab assistants, excluding students as classified personnel.
• Three colleges' lab assistants, excluding students, are evaluated by their
supervisors.
• One college evaluates the same as faculty.
LCC considers lab assistants as classified who are evaluated by their supervisors.

Equipment

Position for equipment: (Quest. 88)


Two colleges have a Director of Purchasing responsible for inventory, restocking, etc.
One college each has
• Business Office Accountant
• Immediate supervisor of discipline or program
• Inventory clerk
• Sr. Accounting Manager
LCC has a Business Office Accountant.

Two colleges receive equipment by donation.


R=1
Avg.=1
LCC=0

Three colleges lease equipment:


R=l%-5%
Avg.=2.3%
LCC=1%

Three colleges purchase equipment:


R=90-99%
Avg.=95.6%
LCC=99%

Removal of equipment from inventory: (Quest. 91)


Three colleges have equipment retirement/inventory control forms
One college has a
• Deactivation process
• Central location for removal
LCC completes an equipment retirement form.

Institutional budget percentage for equipment: (Quest. 93)
R=2.65%-18%
Avg.=7.81%
M=5%
LCC=2.65%

Annual equipment budget instmctional equipment percentage: (Quest. 94) (R=4)


R=8%-75%
Avg.=50%
LCC=not available

Equipment budget percentage is general/operational equipment: (Quest. 95) (R=3)


R=2%-25%
Avg.=14%
LCC=not available

OUTSOURCING GAP ANALYSIS

Outsourced Services:
LCC outsources:
Some advertising
Food services
Some publications
Travel
Vending machines

The six reporting institutions record 17 different areas that are outsourced.

All six colleges outsource:


Food Services—Agreements range from a minimum guaranteed profit to a
commission on sales arranged by contract and bid process.
Vending Machines—Agreements include commission, contract and revenue

Four colleges outsource:


Publications-Including catalog and other items; bids are taken from vendors

Three colleges outsource:


Advertising—Various items to various media including radio
Travel Arrangements—Agreements include contracts with travel agencies with no
financial commitment or consortium pricing/lowest quotes

Two colleges outsource:


Bookstore—Agreements include annual renewable contract (5 yr. option) to a
commission on sales with Barnes and Noble
Building Maintenance—Agreements for a physical plant manager only, and
contracts and bids with issuance of purchase orders in addition to in-house
Courier Services—Agreement with Purolator for daily deposits to bank, and
consortium pricing/in-house mail and open POs on an "as needed" basis
Custodial Services—Agreement for custodial manager only, and contract purchase
orders issued 1 yr. with option to renew up to 3 years (also in-house)
Grounds—Agreement for physical plant manager only, and use of temporary labor
through bid process with temp agencies as needed
Security—Agreement with local police department for after-hours security using bid
process; agreement for grounds and buildings
Telephone—Agreements include contracts with Southwestern Bell and MCI
Vehicle Maintenance—Agreement is a contract with a local auto repair shop, and
dealer warranties

One college outsources:
Child care-Agreement offers services to students through a Perkins Voc/Tech
Grant
Mailing-Agreement is with local vendor for large bulk mailing projects only
Maintenance of personal computers-Agreement is for annual maintenance service
Printing—Agreement is for color printing, brochures and catalog;
consortium pricing/lowest quotes

Services Sold by Institution:


LCC sells these services:
Bookstore
Childcare
Housing
Instructional contract training
Professional staff and development contract training

Only three colleges sold services.

Two colleges sell


Child Care—Make child care available to community clients as well as students
and employees
Housing services—Agreements include rent houses available to the public, and
faculty housing and student dormitories

One college sells services:


Bookstore services—Agreement allows the public to purchase items at bookstore
Instructional training—Agreement includes contract training in certain areas
Printing—Agreement is limited to educational and civic entities
Professional Staff Development Training-Agreement is contract training
Testing—Testing center administers tests for the cost of the exam
Other-Fitness Center available to community members for an annual fee

INSTRUCTIONAL-SCHEDULING GAP ANALYSIS

Fall/Spring Evening Class Scheduling

Numbers of classes and meeting times (Quest. 1-3)


Seven institutions surveyed have evening classes

The number of sections


R= 20-1549 in the academic area
Avg.=1709.6
M=500
LCC=191

R= 20-500 in the technical area


Avg.=207
M=187
LCC=201

R=38-363 in the continuing education area


Avg.=166.5
M=65
LCC=173

Instmctors for evening classes (Quest. 4-5)


Use of full-time faculty to teach these classes
R=20%-75%
Avg.=49%
M=40
LCC=75%

Use of part-time faculty to teach evening classes


R= 25%-80%
Avg.=51%
M=50
LCC=25%

Faculty required to teach night classes (Quest. 5)


• Three colleges do not require faculty to teach evening classes
• Two colleges require faculty to teach evening classes
• One college requires it only if their day load does not make.
LCC does not require faculty to teach evening classes.

Continuing education, degrees and sites (Quest. 6-8)
• At four of the institutions a degree may be obtained by attending only
evening classes
• Two do not allow a degree to be obtained at night
At LCC a degree may be obtained by attending only evening classes.

• At three community colleges students cannot take credit classes for
continuing education credit
• Three colleges do allow students to take credit classes for continuing
education credit
LCC allows students to take credit classes for continuing education credit.

• Four of the institutions offer evening classes at off-campus sites such as an


Army depot, medical facilities, banks and other company locations
• Two offer no such classes.
LCC offers classes at off-campus sites.

Summer Term Scheduling

Summer class numbers (Quest. 9-14)


All six of the colleges retuming surveys have summer classes.
Number of class sections offered in summer 1996
R= 7-1089
Avg.=433
M=348
LCC=436

Average number of sections offered over the last three years


R=7-973
Avg.=401
M=302
LCC=404

Session length (Quest. 12) (R=5)


Five schools offered six week sessions
Five schools offered eight week sessions
Two schools offered ten week sessions
One school offered an eleven week session
Four schools offered twelve week sessions
One school each offered:
• A nine week session
• A five week session
• A 2-5 week session
LCC offers six week, eight week, ten week, and eleven week sessions.

Contact hours generated (Quest. 13) (R=6)
• The most contact hours are generated by the two short summer sessions
both academically and technically.
• Only two colleges generated any contact hours by offering:
• One long session per summer
• One long night session per summer
• Two short night sessions per summer
One college reported that it had conducted a cost-effectiveness study and found
that the cost of the course was primarily determined by "whether the course is
taught by full-time or part-time faculty."

Continuing Education Credit (Quest. 15) (R=6)


• Three of the colleges grant continuing education credit for summer
academic credit and charge the tuition rate for those students
• Two of the colleges do not allow continuing education credit and
academic credit from the same course.
LCC does allow continuing education credit for summer academic classes.

Nontraditional Scheduling
(R=6)

Class offerings (Quest. 1-2) (R=6)


5 Nontraditional offerings
4 Mini-semesters
6 Weekend sections of classes
3 Block classes
5 Extension centers
6 Dual/concurrent classes
Only one school had found another optimal scheduling time; it uses "flexible
entry" and "fast-track" classes.
LCC offers all nontraditional offerings except the block classes.

Mini-semester scheduling
(R=4)

Classes (Quest. 3) (R=4)


Four colleges offer mini-semesters between spring and summer sessions and between fall
and spring sessions.
Attendance
R=50-986 in 2-22 classes offered in spring/summer
R=45-601 in 2-10 classes offered in winter
Avg.=397 in spring/summer
Avg.=317 in winter
LCC =350 students in 22 classes in spring/summer
LCC=120 students in 10 classes in winter

Instruction (Quest. 4) (R=4)


When calculating instructor pay:
• One college pays the same as for regular classes because contact hours are
the same
• One pays mini-semester instructors at the part-time rate
• One considers it part of a load or pays class overload rates
LCC has a summer overload formula for full-time instructors and a fixed rate for
part-time instructors.

Weekend class scheduling


(R=6)

Class setup (Quest. 11) (R=6)


• Two colleges offer once a week Saturday classes
• One adds to Saturday only classes Friday, Saturday and Sunday classes
• Three colleges offer Friday, Saturday and Sunday classes.
LCC offers Saturday classes once a week for the duration of the semester.

Single weekend blocks (Quest. 12-13) (R=6)


• Two schools compact classes into single weekend blocks for real estate classes
• Three schools offer continuing education block classes ranging from one day
weekend classes to three-week weekend classes.
LCC offers single weekend compact classes.

Extension center scheduling
(Three colleges report having)

Centers (Quest. 24) (R=3)


Three schools have extension centers
R=2-5 centers
Avg.=4.7
M=3
LCC=3

APPENDIX H

COVER LETTER

March 17, 1997

Dear

Enclosed you will find the results in your area of the final
report from the benchmark study. Notice that the final report
includes both quantitative and qualitative data as well as a trends
list. The other report for your area is a gap analysis showing where
XXXXXXX falls in comparison to the peer institutions. The gap
analysis should be used to evaluate areas where XXXXXXX is
performing better than our peer institutions and where we may be
falling behind and need to make changes.

As discussed, this information should highlight areas where


we can make improvements and become more efficient within your
area. This information has been purposely provided at this time
because you are making budget decisions.

If you have any questions, you may access the entire report
by viewing or checking out a copy at the library on any of the
four campuses or in the president's office, or by contacting
XXXXXXXXXX, the benchmark director.

Sincerely yours,

XXXXXXXXX
Benchmark Project Director
