
Defining the boundaries of public administration: Undisciplined mongrels versus disciplined purists


"Undisciplined mongrels" are faculty from public administration programs who publish in a wide
variety of journals. We expected that undisciplined mongrels would have more successful publishing
records than their counterparts, "disciplined purists," who publish exclusively in public administration
journals. This expectation is supported through an analysis of journal publications by a panel of 91
junior faculty members. We also expected that the methods that are currently used to rank public
administration programs would discard a massive body of publication activity by public
administration faculty. This expectation is also soundly supported. Findings indicate that from 1990
through 1997, a scant 18 percent of the articles published by the faculty panel were published in the
highly selective set of 11 journals that are currently used to rank public administration programs.
A crisis of identity has always hounded the field of public administration (Mainzer 1994; Waldo
1968). Some people worry that the field lacks focus and thus has no identity. Others worry that the
field is being "slowly nibbled to death" by the behavioral sciences (Fesler 1975, 117). The truth of the
matter is that public administration has always been a little schizophrenic about its identity.
Some public administration faculty members are trained in neighboring disciplines such as political
science, accounting, law, management, sociology, planning, and economics. Others are trained in
public administration. Some publish in a wide array of journals, including those from disciplines
where they earned their Ph.D.'s. Others exclusively publish in public administration journals. What is
the mix of the "disciplined purists" who were trained in public administration and tend to publish in
public administration journals versus the "undisciplined mongrels" who were trained outside of public
administration and tend to publish in non-public administration journals?
Our study answers this question by looking at the publishing activity of a junior faculty panel that had
appointments in public administration programs beginning in the fall of 1990. First, we look at the
number of faculty with degrees in public administration relative to other disciplines. Then we look at
the publication activity of our panel over an eight-year period and compare the number of their
publications in straight public administration journals with the number of publications in other
journals. A measure of journal quality is also used to evaluate whether the quality of non-public
administration journals is better or worse than public administration journals.
We hypothesize that undisciplined mongrels will publish more articles and will be cited more
frequently than disciplined purists. We offer this hypothesis for two reasons. First, many of the non-public administration journals are more widely read and more frequently cited than the straight public
administration journals, so articles published in those journals receive more notice. Second, we
believe that the complexity of problems confronted by public administration researchers requires the
creative perspectives that are more typically found with multi-disciplinary orientations.
A multidisciplinary orientation is a huge advantage to a field like public administration. Theories that
illuminate understanding tend to evolve by cross-fertilizing ideas across disciplines (Mosher 1975;
Waldo 1980). There has been more mingling of the hard with the soft sciences over the years and a
greater integration of diverse theoretical lenses (Farmer 1995; Frederickson 1980). This is as it should
be. Research that addresses technologically sophisticated policy questions simply cannot be contained
within the confined boundaries of traditional scientific disciplines (Birkhead and Carroll 1980;
Lindblom 1972; Stone 1990). Even specialties such as the biological sciences are an invaluable
resource to the field (Savage 1974). It stands to reason that public administration research is destined
to be intellectually chaotic (Stillman 1991) and conceptually untidy (Golembiewski 1977).
To summarize, the expository evidence suggests that a vast body of research in public administration
spans many different disciplines. Therefore, we expect that a significant body of work by faculty members from public administration programs will be published outside the highly restricted
domain of public administration journals. We also hypothesize that the products of multidisciplinary
research should yield more important contributions to the field.
If the research of undisciplined mongrels turns out to represent the bulk of publication activity by
faculty and is more frequently quoted than articles that are published in straight public administration
journals, we need to revisit the current methods that are used to evaluate and rank public
administration programs. Programs should not be evaluated on the basis of faculty publications in a
massively restricted set of public administration journals, but on the basis of publications in all
journals. We now turn to a consideration of the highly restrictive methods that are currently used to
rank public administration programs.
Ranking of Public Administration Programs
Research that is found in journals that are used to rank public administration (PA) programs is
undeniably one of the most visible and important statements about how the territory of public
administration is bounded. The currently accepted method that is used to rank PA programs recognizes
publications in only 10 journals (Legge and Devore 1987; Morgan et al. 1981; Morgan and Meier
1982). In the most recent study that ranked programs, Douglas (1996) added one new journal to the
original set of 10. Public administration programs with faculty that publish extensively in these 11
journals should be more likely to have high rankings. Programs with faculties who publish
extensively in other journals should, on the other hand, be more likely to have low rankings (or no
ranking whatsoever).
Establishing territorial boundaries has its advantages for a relatively new discipline like public
administration. Residents take more pride in being citizens. Fences can also be constructed to keep
intruders out. Staking out the territory, however, has effects that are entirely unintended.
Territorial Squabbles
Fixed territorial boundaries split any faculty into two groups: insiders (faculty who publish in the
sanctioned set of 11 journals) and outsiders (faculty who publish in other journals). Outsiders are
often extremely productive, yet their work does not contribute to the rankings of PA programs.
In a series of criticisms that followed publication of Douglas's (1996) recent study that ranked public
administration programs, various faculty members suggested that the highly selective set of journals
that have been used to rank PA programs should be expanded. Guyot (1997) wants to include the
American Political Science Review in addition to other basic, disciplinary-focused journals.
Rohrbaugh and Andersen (1997) offer a bolder recommendation. They point out that the faculty at
SUNY Albany publishes only one-fourth of their articles in the journals that have been used to rank
PA programs. They believe that "the productivity of our faculties should not be bounded but
boundless" (186). Golembiewski (1997) seconded Rohrbaugh and Andersen's plea, arguing we need
to "widen the circle of relevant sources used" to rank PA programs (464).
On the one hand, it is a useful exercise to haggle over the appropriate boundaries of the discipline.
After all, the territory defines our identity. On the other hand, haggling over territorial boundaries is
problematic in the sense that the boundaries are being continuously disputed (Keller and Spicer 1997).
The discipline has become the Israel of academic disciplines because we are always squabbling over
the precise (and priceless) boundary lines that define our identity.
Inconsistency of Rankings
We have argued that research by public administration faculty members is broader in scope and more
comprehensive in character than is reflected in the restricted set of journals that are used to rank PA programs. We will show that the proportion of publication activity by PA faculty that has been used to
rank programs is tiny. Studies that rank PA programs should therefore be terribly difficult to replicate. The most
recent study (Douglas 1996) acknowledges that the rankings have historically been very unstable.
Only two PA programs in Douglas's study, Georgia and Syracuse, achieved high rankings in previous
studies. "The remainder of the programs are less successful at holding their positions in the rankings.
These less productive schools tend to move up and down the rankings in an unpredictable manner"
(439).
There is a genuine crisis of identity when a university program in one study is determined to have a
high ranking and in a subsequent study, no ranking whatsoever. Douglas attributes the instability of
rankings to the mobility of a few, highly productive faculty members. But to the extent that the
method he and others used to rank programs relies on a tiny proportion of publications by faculty
members, the instability in rankings is more likely to be caused by the artifact of using severely
truncated samples of faculty publication activity (Farber, Powers, and Thompson 1984).
The crisis of identity is fueled by judgments about where scholars "should" publish. To achieve high
rankings, publication in the select list of public administration journals is required. Our identity as a
field, however, is simply what public administration faculty members publish and where we publish
it. Instead of offering prescriptions about what the discipline of public administration should become,
we present evidence that reveals what the field has become. This result sets the boundaries of the
field.
The current method of ranking PA schools has been criticized for discarding a massive body of
research by public administration faculty. This criticism is evaluated by analyzing publications that
were authored by a panel of public administration faculty. An explanation of the methods we used to
collect and code publication activity by the faculty panel follows.
The Panel
Ninety-one faculty members who were appointed in the fall of 1990 as new, tenure-track assistant professors in public administration programs composed the panel. Assistant professors have six years from the date of their appointment to be promoted to associate professor or be
tenured. Promotion decisions are based primarily on an individual's success with publishing articles in
academic journals. Therefore, members of the panel were motivated to choose journal outlets for their
research that are valued by the tenured faculty members of public administration programs.
Studies of academic productivity (Rodgers and Maranto 1989) typically select the names of faculty
members from faculty directories. Faculty who are not promoted or who resigned are excluded from
the sampling frame. This study overcomes this limitation by identifying all assistant professors who
began employment as faculty members during the same time period: the fall of 1990.
Panel members were identified over a six-month period beginning in August 1990. Phone calls were
initially made during the month of August to the directors or administrative secretaries of all 220
public administration programs in the 1986-87 NASPAA membership roster, which lists accredited
and non-accredited programs. The purpose of the phone calls was to obtain the names of all
individuals who had been offered appointments as tenure-track assistant professors and who were to
begin employment in the fall of 1990. Learning if new faculty had been hired and, if so, obtaining
their names required several calls to some programs. Only one institution refused to provide the
requested information. These phone calls yielded the names of 108 faculty members.
Success of this research project depended on verification of the appointment, correct spelling of each
person's full name, and background information about their specific areas of research. A personalized
letter was mailed to each faculty member in October 1990 requesting that the individual return a
current vita and complete a questionnaire that contained questions about his or her teaching commitments and research programs. Two weeks after the initial mailing, a follow-up request was
sent to all panel members who had not responded.
A third follow-up request was mailed in January 1991 to persons who had not responded to the first
two mailings, requesting return of the survey and a vita, or an indication of whether the faculty
member: (1) preferred to answer the survey questions over the phone; (2) needed additional
information about the study before responding; (3) wished to report he or she was not an assistant
professor; or (4) elected not to participate in the study. By January 1991, responses had been received
from 97 members of the initial sampling frame. Follow-up phone calls were made to the remaining
nonrespondents to make a final appeal and to confirm the status, name, and employment of the
individual named on the phone survey.
Seventeen persons out of the 108 new faculty members were senior faculty or part-time faculty and
were deleted from the panel. The final panel consisted of 91 assistant professors who were employed by 68 PA programs.
Publications in Journals
Publications by panel members were identified using computer-based data searches, a hard copy
review of certain journals, and, in selected cases, follow-up contact with the panel member. The
Social Science Citation Index (SSCI) was searched annually beginning in 1991 to identify
publications by members of the panel. A search for publications was also undertaken in 1997 using
UNCOVER, a computer database that includes a broader selection of journals than is covered by the SSCI.
Searches were run using both initials of each individual as well as the first initial alone, in case articles had been published under the first initial only. Confirmation that a publication was
authored by a member of the panel was made by verifying the university affiliation of the author.
Vitae were also helpful in confirming whether a member of the panel authored a publication.
In selected cases, articles were located and reviewed to verify that the author of the article was a bona
fide member of the panel. This was occasionally necessary for persons who had common
names. In a few instances, the panel member was contacted to confirm or deny authoring a
publication or to solicit a copy of an updated vita.
Publications were included in the listing regardless of the number of authors or the order of
authorship. Multiple-authored articles received the same recognition as single-authored articles.
Journal Quality
Impact Factors. Information about journals from the SSCI is used to construct indicators of journal
quality that are reported in the Journal Citation Reports (JCR). Using SSCI data, JCR constructs an
indicator of journal quality called the impact factor, which measures the frequency of citations by
other journals to articles published by a source journal. A journal publishing articles that are
frequently cited by other journals is recognized to have higher quality than a journal that is seldom
cited. Impact factors from the JCR were used to measure journal quality in this study. The impact
factor is calculated by dividing the total number of citations to articles that were published by the
source journal over two years by the number of articles published in the source journal over the same
period of time. According to the 1995 JCR, the average impact factor for all journals in the social
sciences was .97.
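The calculation lends itself to a worked example. The sketch below restates the two-year impact factor as described above; the function name and sample figures are ours, chosen so the result matches the panel mean of .70 reported in the results.

def impact_factor(citations_to_recent_articles: int, recent_articles: int) -> float:
    """Two-year impact factor as described above: citations received to
    articles a journal published over the prior two years, divided by the
    number of articles it published over that same period."""
    return citations_to_recent_articles / recent_articles

# Illustrative figures only: 42 citations to 60 articles gives .70.
print(impact_factor(42, 60))  # 0.7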
Panel members published in a number of journals that have no impact factor. Journals without impact
factors are not necessarily little cited by other journals. Inclusion of a journal in the SSCI database
and subsequent calculation of an impact factor by JCR depends on several criteria. The editor must submit a recent volume for review by the Institute for Scientific Information staff and agree to
forward future volumes of the journal in a timely fashion. Decisions to include a journal in the
database are a function of citations by other journals and a determination of the journal's
distinctiveness relative to other journals in the field. Treating a journal without an impact factor as
having zero quality would be misleading. Therefore, our measure of journal quality is based only on
journals that have impact factors.
Publications were divided into two groups: the set of journals that are used to rank PA programs and
the set of other journals. Publication activity for the two sets of journals was compared. Although
Public Administration Review (PAR) has only published rankings of PA programs that are based on
the set of 10 (and more recently, 11) journals, rankings using an expanded set of 26 journals were
recently published by Forrester (1996). Thus, analysis using Forrester's expanded list is also
presented. Fourteen of the 26 journals used to rank PA programs had no impact factors. Hard copy
indexes of these journals were examined from 1990-97 to guarantee that all articles by members of
the panel had been identified.
Disciplinary Orientations of Journal Publications. The disciplinary orientation of journals is coded
from a detailed breakdown of subject areas provided by the JCR. For example, the JCR reports that
the subject area for the Journal of Drug Issues is substance abuse; the subject area for the Journal of
Policy Modeling is economics, and so forth. Some journals had two subject areas; others had three.
We coded the subject area(s) for each journal article by each member of the panel. The subject area was automatically defined to be public administration for all publications in any of the 11 journals used to rank PA schools or the 15 additional journals in Forrester's expanded set. A comprehensive listing was built of 36
subject areas that were covered by journal publications of panel members.
Subject areas were subsequently linked with traditional academic disciplines. Twenty of the 36 subject
areas were already identical to academic disciplines. For example, the subject matter of law is
identical to the discipline of law. The remaining 16 subject areas were grouped according to academic
discipline. For example, health policy and services, and public health were grouped under the
discipline of Health Policy; medicine, medicine-legal, geriatrics, and psychiatry were grouped under
the discipline of Medicine; psychology, psychology-psychoanalysis, psychology-social, psychologyclinical, and applied psychology under Psychology; social sciences-interdisciplinary, social sciencesmath methods, and social sciencesbiomedical under Social Sciences; and political science and
international relations were grouped under the single discipline of Political Science. This grouping of
subject areas into disciplines yielded a final listing of 25 distinct disciplines.
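To make the grouping concrete, the sketch below encodes the examples named above as a lookup table. It is a minimal illustration, not the study's actual coding instrument: it covers only the subject areas mentioned in the text, and the function name is ours.

# Partial subject-area-to-discipline mapping, limited to the examples
# named in the text; the full study grouped 36 subject areas into 25
# disciplines.
SUBJECT_TO_DISCIPLINE = {
    "health policy and services": "Health Policy",
    "public health": "Health Policy",
    "medicine": "Medicine",
    "medicine-legal": "Medicine",
    "geriatrics": "Medicine",
    "psychiatry": "Medicine",
    "psychology": "Psychology",
    "psychology-psychoanalysis": "Psychology",
    "psychology-social": "Psychology",
    "psychology-clinical": "Psychology",
    "applied psychology": "Psychology",
    "social sciences-interdisciplinary": "Social Sciences",
    "social sciences-math methods": "Social Sciences",
    "social sciences-biomedical": "Social Sciences",
    "political science": "Political Science",
    "international relations": "Political Science",
}

def discipline_for(subject_area: str) -> str:
    """Grouped subject areas map to their discipline; the 20 subject areas
    that were already identical to a discipline (e.g., law) fall through
    unchanged."""
    return SUBJECT_TO_DISCIPLINE.get(subject_area, subject_area.title())

print(discipline_for("geriatrics"))  # Medicine
print(discipline_for("law"))         # Law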
Citations. Citations to an article are a key indicator of its contribution to knowledge (Farber et al.
1984). Frequently cited articles have a more profound influence on the thinking of others. The SSCI
database was searched annually to identify all citations to articles (or books) that were authored by a
member of the panel. Self citations (citations by a member of the panel to his or her own published or
unpublished work) were not counted. Citation searches also helped to identify additional journal
publications that were authored by panel members which had been missed using the source searches.
Results
Overall Analysis
Information on the Ph.D. degrees of 70 persons was reported on panel member vitae. Of the 70
faculty members, only 12 (17 percent) earned their Ph.D. (or DPA) in public administration. Other
panel members had Ph.D. degrees that spread across a staggering variety of disciplines, including
accounting, anthropology, city and regional planning, comparative politics, criminal justice, decision
sciences, economics, educational administration, information systems, international relations,
operations research, political science, public affairs, public policy and management, public service,
regional analysis and planning, sociology, and systematics and ecology.

A complete listing of all journal publications from 1990-97 by panel members is presented in the Appendix. Three hundred forty-two articles were published in 190 journals, an average of 3.8 articles per panel member over the eight-year period.
Consider now the overall quality of publications as reflected by impact factors from the Social
Science Citation Index. The quality of publications as measured by the mean impact factor was .70 (n
= 219). Publication quality by panel members turns out to be slightly better than is reflected in public administration journals taken as a whole (reported by the JCR to be only .54) but worse than the overall quality of all social science journals (which is .97).
Disciplinary Coverage
Our review of the literature suggested that research by panel members cuts across a wide range of
disciplines. Results support this expectation. Table 1 shows the coverage by discipline of articles in
journals that were authored by panel members. Twenty-four disciplines other than public
administration are represented, including traditional disciplines such as law, medicine, geography,
anthropology, and social work. A surprising 63 percent of the coverage fell outside public
administration. The most popular disciplines were economics (11 percent coverage), political science
(9 percent coverage), management (5 percent coverage), and the social sciences (4 percent coverage).
Moreover, members of the panel published in journals that address the audiences of virtually all social science disciplines.
The Success of Multidisciplinary Research
We also expected that undisciplined mongrels with multidisciplinary research programs would have more successful publishing records. Success should be shown by greater productivity (more publications), higher quality (better impact factors), and more visibility (more citations to published work). All expectations were supported.
Table 2 breaks down the publishing activity by panel members on the basis of publications in public
administration journals. Panel members are broken down into five distinct groups:
Complete coverage: all journal articles by the panel member are published in public administration (PA) journals.
Majority coverage: at least half of the articles are published in PA journals.
Minority coverage: less than half of the articles are published in PA journals.
No coverage: no articles are published in PA journals.
No journal publications: the panel member published no journal articles.
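A minimal sketch of this grouping rule follows, assuming the cutoffs operate exactly as the definitions read; the function name and label strings are ours.

def coverage_group(pa_articles: int, total_articles: int) -> str:
    """Assign a panel member to one of the five groups defined above,
    based on the share of his or her articles in PA journals."""
    if total_articles == 0:
        return "no journal publications"
    if pa_articles == 0:
        return "no coverage"
    if pa_articles == total_articles:
        return "complete coverage"
    if pa_articles * 2 >= total_articles:  # at least half
        return "majority coverage"
    return "minority coverage"

print(coverage_group(3, 3))  # complete coverage
print(coverage_group(2, 5))  # minority coverage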
First, consider the publication output of panel members who published one or more journal articles in
a public administration journal. As the proportion of coverage drops from complete to majority to
minority, the mean number of article publications increases from 3.0 to 4.8 to 6.8. The quality of
publications (as measured by impact factors) also increases from 0.7 to 1.5 to 3.6. Finally, overall
visibility increases as the mean number of citations jumps from 3.4 to 8.5 to 24.3. Scholars who were
most productive published a minority of their publications in public administration journals. As the
disciplinary coverage of publications by panel members spread, productivity was higher, quality was
better, and visibility was enhanced.
Second, consider the group of 21 panel members who had "no coverage" in PA journals. They
published, on average, 3.2 articles, about the same number of publications as panel members who published exclusively in PA journals. However, the quality of publications by authors with no publications in PA journals was much higher than that of those who published exclusively in PA journals.
Their work elicited almost four times as many citations. Part of the advantage here is due to the fact that journals outside PA are better established, have wider readership, and are more frequently cited
than are the less well-established PA journals.
Third, consider the largest group: the 24 panel members with no journal publications. They received
an average of 3.5 citations to their publications which consisted of eight books and several graduate
school publications. Interestingly, the number of citations to the work of panel members with no
journal publications is virtually identical to the number of citations to work by panel members who
published all of their articles in public administration journals.
Method of Ranking Public Administration Programs
We expected that the method of ranking public administration programs would discard a massive
body of publication activity by faculty members. This expectation was soundly supported. Table 3
compares the publications in journals that are used to rank programs with publications in other
journals. Recall that the method currently used to rank public administration programs recognizes
publications only in the select set of 11 journals. Application of this method results in the massive
discard of 82 percent of the publications by panel members in 94 percent of the journals. Inclusion of
the journals listed in the expanded list of PA journals improves this stark result somewhat, but 86
percent of the journals and 67 percent of the publications are still discarded.
How does the quality of publications compare across the two sets? The average quality of the
publications in journals used to rank PA programs had an impact factor of .60, while the mean impact
factor of publications in journals not used to rank programs was .76. Thus, publications in the select set of 11 journals reflected only 79 percent (.60/.76) of the average quality for publications in journals not
used to rank PA schools. This evidence backs up Golembiewski's contention (1997) that some of the
most talented and productive faculty members publish in journals that are not used to rank PA
programs.
Discussion
What is the character of the typical research program of an assistant professor? The faculty who were
the most productive members of the panel organized a program of research around a research question
of fundamental interest to policy makers or administrators. The question might tie into any aspect of
government, ranging from managing the quality of the environment to privatization. Doctoral
programs of assistant professors in the panel reflected a rich, multidisciplinary character. For example,
some faculty members integrated engineering with psychology; others combined the perspective of
political science with economics; still others integrated communications research with sociology and
organizational theory. Thus, faculty published in a highly specialized set of outlets that are read by
persons with a particular interest in the topic. The research is not organized around a discipline. Rather, it is undisciplined in the sense that it spans disciplines.
Origins of Public Administration Classics
Findings reflect the publishing activity of junior, nontenured faculty members. What are the
publication origins of work by the senior scholars of the field? In particular, what are the origins of
work that are considered classics? Shafritz and Hyde (1992) recognize 51 works to be classics, 38 of
which were published after the end of World War 11. Of these 38 articles, only 34 percent were
published in a journal that is used to rank public administration programs. Half of the classics were
originally published as books, not journal articles. Two-thirds of the classics in the field would not
have been recognized by the current method that is used to rank PA programs.
Panel members published a total of 20 books. This finding is somewhat unexpected since junior
faculty are expected to publish in refereed journal outlets to establish recognition as scholars and eligibility for tenure. To ignore the contributions of books in the rankings also results in the discard of
contributions to the field which have potentially higher exposure and readership than journal articles.
What Method Should Be Used to Rank PA Programs?
Critics of the method used to rank PA programs have differing opinions about the journals that should
be used to rank PA programs. They also have serious concerns about including journals that have low
quality and limited readership. Questions about which journals to add and which to exclude continue
to fester as the boundaries of public administration continue to expand.
Our findings show that the research activity of faculty members is, using Rohrbaugh and Andersen's
term, "boundless." Instead of employing a system that ranks PA programs by searching articles in a
selective set of 11 journals, rankings of PA programs should be tied to the research productivity of
faculty members affiliated with public administration programs. That is to say, the unit of analysis
should be the person, not the journal. Why did Morgan et al. (1981), the originators of the method used to rank programs, use journals (and not persons) as the preferred sampling frame? "To do
otherwise would require a complete list of PA faculty for all institutions, and then attempt to locate
each of their publications in a multitude of journals representing a large number of disciplines. This
procedure is clearly beyond the scope of this modest and initial effort" (668). Even Morgan et al.
(1981) recognized that the preferred method of ranking programs would have considered the research
productivity of all faculty members.
Rankings that are tied to a comprehensive recognition of all publishing activity are technologically
feasible today. Through sponsorship of NASPAA, universities could establish systems using
standardized formats for submitting electronic summaries of publication activity by all public
administration faculty members. A comprehensive system of benchmarking schools with one another
could be designed that recognizes the contribution of all forms of publication (journal articles as well
as books) to the scholarly literature. Quality of journal publications also can be recognized using such
indicators as impact factors.
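To show what person-level benchmarking might look like, here is a minimal sketch. The record format, names, and impact-factor weighting are our assumptions; the text proposes only that the person, not the journal, be the unit of analysis and that impact factors could recognize journal quality.

from collections import defaultdict

# Hypothetical submission records: (faculty member, program, impact
# factor of the outlet). The format is an assumption, not an existing
# NASPAA standard.
records = [
    ("A. Smith", "State U", 0.97),
    ("A. Smith", "State U", 0.54),
    ("B. Jones", "Capital U", 1.50),
]

def program_scores(records):
    """Rank programs by the summed, impact-weighted publication output of
    all affiliated faculty, so every publication by every person counts."""
    totals = defaultdict(float)
    for _author, program, impact in records:
        totals[program] += impact
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(program_scores(records))  # [('State U', 1.51), ('Capital U', 1.5)]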
Conclusion
Early explorers of the field, people like Woodrow Wilson, Herbert Simon, and Dwight Waldo, drove
the first stakes into the hunting ground of intellectual discourse to define the initial boundaries of
public administration. Boundaries of the field have gradually expanded over time as other scholars
have staked out new claims to uncharted territory. Research in the field today draws on the
knowledge, methods, approaches, and wisdom from a wide variety of disciplines. Many faculty
members are undisciplined mongrels who get a kick out of breaking down the territorial boundaries of
traditional disciplines. They are intellectual gypsies who thrive on the advances and developments of
other disciplines.
What, then, should define the boundaries of the discipline of public administration? The answer
suggested by the current system used to rank public administration programs focuses on a highly
selective set of publication outlets that have not been claimed by other disciplines. If another
discipline (for example, political science) has already staked out a claim on a journal (such as the
American Journal of Political Science), public administration has no right to claim ownership. The
focus of public administration as a discipline is thus defined by the residual of outlets that are not
already claimed by other disciplines. The field is left to feast on the leftovers.
The field is undisciplined. So what? It is possible for researchers to become more focused and
disciplined. But why? Any bounded definition of the field "...would be either so encompassing as to call forth the wrath or ridicule of others, or so limiting as to stultify its own disciples" (Mosher 1956,
177). The "overlapping and vague boundaries of public administration should be viewed as a
resource, even though they are irritating to some with orderly minds" (Mosher 1956, 177).

Undisciplined mongrels break down fences, challenge territorial boundaries, and discover new
horizons. There is power and influence in being an undisciplined mongrel. Mongrels may have the
advantage of discovery over disciplined purists who are more likely to be set in their ways. Mongrels
sniff in places that are denied to purists and visit any territory for any reason without invitations.
Most policy problems defy tidy solutions. Deriving satisfactory solutions requires much more than narrow-minded searches that optimize specific outcomes. Undisciplined research that cuts across
several disciplines is more likely to be useful to practitioners.
It is high time to stop hanging our heads in shame because we are an emerging discipline that lacks
focus. The glue that bonds together faculty members is the reality that all research is "for the public"
(Ventriss 1991). Focus will gently emerge only after we stop beating ourselves up for being too
unfocused.
We have proposed a method of evaluating public administration programs that defuses the crisis of
identity and infuses the field with a renewed sense of pride and purpose. Our mission as a field is to
steal, borrow, and beg from other disciplines to address the complex problems of a modern society.
Our identity must be broad in its scope: the United Nations of academic discourse. Why not yank the
stakes that have been set in concrete and declare our territory boundless?
Acknowledgments
The authors wish to thank Lisa Owen from the Institute for Scientific Information (ISI) for her
assistance in providing information about the current status of ISI review for public administration
journals, and Janet Robertson, the Journal Citation Reports project coordinator, who provided helpful
suggestions on an earlier version of the manuscript. Three reviewers and Irene Rubin provided
invaluable suggestions on an earlier draft of the manuscript.
References
Birkhead, Guthrie S., and James D. Carroll, eds. 1980. Foreword to Education for Public Service 1980.
Syracuse, NY: Maxwell School, Syracuse University.
Douglas, James W. 1996. Faculty, Graduate Student, and Graduate Productivity in Public
Administration and Public Affairs Programs, 1986-1993. Public Administration Review 56(5): 433-40.
Farber, Michael, Patricia Powers, and Fred Thompson. 1984. Assessing Faculty Research Productivity
in Graduate Public Policy Programs. Policy Sciences 16: 281-9.
Farmer, David J. 1995. The Language of Public Administration: Bureaucracy Modernity and
Postmodernity. Tuscaloosa, AL: University of Alabama Press.
Fesler, James W. 1975. Public Administration and the Social Sciences: 1946 to 1960. In American
Public Administration: Past, Present, Future, edited by Frederick C. Mosher. University, AL:
University of Alabama Press.
Forrester, John P. 1996. Public Administration Productivity: An Assessment of Faculty in PA
Programs. Administration and Society 27(4): 537-66.

Frederickson, H. George. 1980. New Public Administration. University, AL: University of Alabama
Press.
Golembiewski, Robert T. 1997. Letter to the Editor: Response to Rohrbaugh and Andersen. Public
Administration Review 57(5): 464.
. 1977. Public Administration as a Developing Discipline: Part 1: Perspectives on Past and Present. New
York: Marcel Dekker.
Guyot, James F. 1997. Whence Public Administration? Public Administration Review 57(3): 273.
Journal Citation Reports. 1994. 1995 Social Sciences Edition. Philadelphia: Institute for Scientific
Information, Inc.
Keller, Larry, and Mike Spicer. 1997. Political Science and American Public Administration: A
Necessary Cleft? Public Administration Review 57(3): 270-2.
Legge, Jerome S., and James Devore. 1987. Measuring Productivity in U.S. Public Administration
and Public Affairs Programs: 1981-1985. Administration and Society 19(2): 147-56.
Lindblom, Charles E. 1972. Integration of Economics and the Other Social Sciences through Policy
Analysis. In Integration of the Social Sciences through Policy Analysis, edited by James C.
Charlesworth, 1-14. Monograph 14, American Academy of Political and Social Science.
Mainzer, Lewis C. 1994. Public Administration in Search of a Theory. Administration and Society
26(3): 359-94.
Morgan, David R., Kenneth J. Meier, Richard C. Kearney, Steven W. Hays, and Harold B. Birch. 1981.
Reputation and Productivity Among U.S. Public Administration and Public Affairs Programs. Public
Administration Review 41(6): 666-73.
Morgan, David R., and Kenneth J. Meier. 1982. Reputation and Productivity of Public
Administration/Public Affairs Programs: Additional Data. Public Administration Review 42(2): 171-9.
Mosher, Frederick C. 1956. Research in Public Administration: Some Notes and Suggestions. Public
Administration Review 16(3): 169-78.
. 1975. Introduction: The American Setting. In American Public Administration: Past, Present, Future,
edited by Frederick C. Mosher, 1-10. University, AL: University of Alabama Press.
Rodgers, Robert C., and Cheryl C. Maranto. 1989. Causal Models of Academic Productivity in
Psychology. Journal of Applied Psychology 74(4): 636-49.
Rohrbaugh, John, and David F. Andersen. 1997. Letter to the Editor. Public Administration Review
57(2): 186.
Savage, Peter. 1974. Dismantling the Administrative State: Paradigm Reformulation in Public
Administration. Political Studies 22(2): 147-57.

Shafritz, Jay M., and Albert C. Hyde. 1992. Classics of Public Administration. Pacific Grove, CA:
Brooks/Cole Publishing Company.
Stillman, Richard J., II. 1991. Preface to Public Administration: A Search for Themes and Direction.
New York: St. Martin's Press.
Stone, Donald C. 1990. The Changing Public Service: Looking Back ... Moving Forward. Public
Administration Review 50(2): 200-8.
Ventriss, Curtis. 1991. Contemporary Issues in American Public Administration Education: The
Search for an Educational Focus. Public Administration Review 51(1): 4-14.
Waldo, Dwight. 1968. Scope of the Theory of Public Administration. In Theory and Practice of Public
Administration, edited by James C. Charlesworth, 1-26. Philadelphia: American Academy of Political
and Social Science.
. 1980. The Enterprise of Public Administration: A Summary View. Novato, CA: Chandler and Sharp.
Robert Rodgers, University of Kentucky
Nanette Rodgers, Lexington, Kentucky
Robert Rodgers confesses to being an undisciplined mongrel who is a professor in a public policy and administration school that was not ranked by Douglas (1996) as one of the top 54 public administration programs in the United States. Email: PUB708@ukcc.uky.edu
Nanette Rodgers was married to a faculty member from a public policy and administration school which, she was sad to report, is not ranked among the top 54 public administration programs in the country.
Copyright American Society for Public Administration Sep/Oct 2000
Rodgers, R., & Rodgers, N. (2000). Defining the boundaries of public
administration: Undisciplined mongrels versus disciplined purists. Public
Administration Review, 60(5), 435-445. Retrieved from
http://search.proquest.com/docview/197169446?accountid=160841
