Beneficence and the Expert Bureaucracy: Ethics for the Future of Big Data Governance
Sara R. Jordan
To cite this article: Sara R. Jordan (2014) Beneficence and the Expert Bureaucracy, Public
Integrity, 16:4, 375-394
Abstract
The future of public administration lies in its ethical knowledge work and expertise.
Government knowledge in the years ahead will rely on Big Data research and analysis.
Given the strong role that data analytics will play, three principles of research
ethics—beneficence, precaution, and refinement—are arguably the ideal ethical
principles for the future.
The expectation that government will serve as a repository for expert knowledge
and evidence-based expert decision-making has increased since Public Integrity
published its inaugural issue. Today, citizens expect that government agencies, from
the U.S. Food and Drug Administration to local police departments, will provide
expertly gathered information freely and with minimal barriers to its use. The public
also expects that bureaucracies will use information wisely to beneficently protect
the nation from nefarious external actors or internal threats.
A cursory perusal of the news stories that dominated the beginning of 2014 shows the tension inherent in public expectations of the uses of data and information technology. People cried out for large retail firms like Target and Neiman Marcus, and the attorneys general of their states, to “do something” to rectify data breaches due to hacking, and at the same time recoiled in horror at the revelation that the National Security Agency (NSA) engages in what appears to be indiscriminate worldwide gathering of text message data (Ball 2014; Finkle 2014; Prah 2014). The demand that government intervene to block criminal challenges to the “data integrity” of individuals,1 coupled with demands that government minimize its gathering and use of individuals’ publicly or privately traded data (e.g., Facebook and text messages), introduces a complex problem for officials. The problem is neatly summarized in the following question: How ought government protect the public against the use of their data but also protect the public through use of their data? This is the heart of the conundrum that government faces as it attempts to leverage the technological innovation of Big Data, already widely adopted by private organizations such as Amazon.com and even Target stores (Mayer-Schönberger and Cukier 2013; Sathi 2013; Siegel 2013).
Yet the important questions about Big Data are likely to be ethical, not technical.
When we ask what the government is obligated to do to protect the public through
the use of data, it is essential to understand that this question is an ethical one related
to beliefs about the relationship between government expertise, analytic techniques,
individual privacy, and public interest values. Given this, the purpose of the present
article is to answer the question of how the government can ethically use Big Data
analytics tools to protect the public against threats without at the same time, through
its use of these tools, threatening ethical values.
Anecdotes and rigorous analysis by social scientific researchers suggest that
trust in government, whether in general or in respect to the information government
reveals about itself, is declining (Chanley, Rudolph, and Rahn 2000; Tolbert and
Mossberger 2006). The paucity of public trust is most certainly related to the scale and magnitude of current scandals, such as the exposure of U.S. communication data sweeps like PRISM or DISHFIRE, but it is also likely to be related to the government’s inattention to the identification or public discussion of ethical principles for guiding the use of individual information in government decision-making
(Ball 2014). As the U.S. government lurches from one data-gathering or data-use
scandal to the next, there has been little discussion, scholarly or popular, of what
the government should do to police itself against future assaults on individual
data integrity. Larsen and Milakovich (2005), and other scholars, have described and prescribed ideal management techniques for e-government, but they do not attend directly to the ethics of electronic governance expertise.2 The lag in the
analysis of the ethics of governmental knowledge work, including information
technology development, is lamentable. In light of the recent scandals (which the revelations of Edward Snowden threaten to make commonplace), it is reasonable to expect a groundswell of concern to establish regulations to guide
administrators into the data-driven future, whether they are public officials or government contractors. As Berman (2013) cautions:
Current trends in the Big Data field would suggest that the next several decades
will be marked by abuses. Societal effects will be, in many cases, detrimental.
Those who stand to benefit most from Big Data will be the powers that create
and control the resources: corporations, data brokers, and governments. Many
of the best things to come from Big Data are long-term goals: personalized
medicine, complete and accurate electronic medical records, crime prevention,
and error reduction in industry, effective system safety protocols, global resource
management, rational food distribution, and universal human rights. The greatest
benefits from Big Data will fall upon a population that has not yet been born.
(Berman 2013, p. 226)
The obligations that we, as constitutive of both governors and governed, have
to future generations are similar in content to what we owe each other at present.
“Obligations to future generations are essentially an obligation to produce—or to
attempt to produce—a desirable state of affairs for the community of the future,
to promote conditions of good living for future generations” (Golding 1972, p.
86). Philosophical analysis of the problem of obligations to future generations would define our obligation as one under which the future has a presumptive right to make a claim against us for present actions taken strictly to benefit ourselves. Future generations, logically and temporally, may not be able to press
a claim against our self-interested actions, but if the future were somehow made
present, then its claim against us would stand as a block against our being able
to act without incurring moral culpability. In the case of the obligations of our
generation of governors to future recipients of programs crafted based upon Big
Data analytics, the obligation is to protect from harm and to create better out-
comes than would otherwise come to pass. While Big Data seems promising as a
solution to “wicked problems” in business and government, as Davis (2012) and
boyd and Crawford (2012) point out, the gathering and use of Big Data presents
ethical problems, not only for the privacy and data integrity of the individuals
whose data are used, but for the ethical integrity of the data analysts.
Figure 1. Beneficence, Precaution, and Refinement
Beneficence
Beneficence, or the mandate to maximize the good while minimizing harms, com-
pels scientific or governmental knowledge producers to question the public utility
of their research question, the risk-benefit ratios of their methods, and the utility of
the communication of their results. Beneficence is an applied philosophical principle
guiding behavior that benefits others irrespective of the benefit to self or that reduces
known or perceived harms to others.4 Frequently, but incorrectly, beneficence is
reduced to mean strictly that a policy or action has a favorable risk-benefit ratio.
Historically understood, however, beneficence does not compel beneficent individu-
als to be cost-benefit maximizers. Revisiting the classic definition of beneficence
makes this clear.
Aristotle (Nicomachean Ethics 1167b–1168a) describes beneficence as a characteristic of
a noble human nature that compels us to love the product of our own labor more
than the object created could love us in return (Aristotle 1908/2009). Persons with
beneficence actively love or seek to benefit others, irrespective of the benefits to
themselves or even to the universe of others. Beneficence is both a weak and a
strong principle. In its weak form, beneficence is interpreted as charity. To be
charitable requires that one ascertain the needs of others and respond appropriately
within one’s means. However, as noted by Aristotle, Cicero, and others, charity does
not require a beneficent intent, and charitable giving can be motivated by a desire
to accrue social esteem or alleviate guilt. Strong beneficence requires significant
time, effort, and, following Aristotle, craftsmanship to ensure that the recipient
of beneficent actions accrues genuine benefits. To perform truly beneficent acts,
then, requires knowledge work—the needs of the recipients must be ascertained
and analyzed, the knowledge gained must be applied, a decision must be made to
abide by such intent, and actions must be taken to fulfill the goal of performing
beneficent actions.5
Expertise
Expertise is a trait possessed by an individual deft in the application of knowl-
edge (the union of philosophical wisdom and practical wisdom) and of skill to
satisfy an end.6 Expertise requires an unusually high level of excellence in the
accumulation of brute facts and application of knowledge. Following the classic
Aristotelian formulation of arête (excellence) as the union of sophia (wisdom)
and phronesis (practical knowledge), expertise must be directed toward, or put
into the service of, an end. In the case of governmental expertise, knowledge
must be put to the ends established by citizens and their legitimate and authori-
tative representatives.
Knowledge
Knowledge is the union of philosophical and practical wisdom. To “know” philo-
sophically requires an appreciation of universal forms and particular ideas and the
difference between the two. Practical knowledge is won by gathering information
intelligently and by ethical and prudential use of information gathered through
experience. An individual who is philosophically knowledgeable can reason using
abstract concepts and principles divorced from personal experience of the phenom-
enon discussed. Practical wisdom requires the application of abstract forms to the
realm of sensation and experience. Application of wisdom includes three forms of
reasoning: investigating the goodness of an event or action (ethics), considering the
organization of power to satisfy abstract principles (politics), and making judgments
about the appropriate skill to use to satisfy a goal (techne).
In the context of this article, the primary concern is with the role of government
agents as experts in knowledge work—the technically excellent (i.e., expert) produc-
tion and management of knowledge. To produce knowledge means to speak, write,
or otherwise put forth, for intersubjective agreement or disagreement, materials that
communicate philosophical or applied knowledge to an audience. To manage knowl-
edge means to organize into discrete and explicable categories the communications
of other knowledge producers for the purpose of streamlining and simplifying the
process of gathering information for an audience seeking information. Governments
produce and manage knowledge both directly and indirectly. A matrix of the roles
of government in knowledge work is shown in Table 1.
Inherently Governmental
In addition to the conceptual clarification described above, the arguments presented
here rest upon a concept that may not have a well-known definition. In current
regulatory interpretations, inherently governmental activities are defined as actions
“so intimately related to the public interest as to require performance by [federal]
government employees” (Office of Federal Procurement Policy 2011, p. 56227).
TABLE 1
Government Roles in Knowledge Work

Direct:
• “The approval of agency responses to Freedom of Information Act requests (other than routine responses that, because of statute, regulation or agency policy, do not require the exercise of judgment in determining whether documents are to be released or withheld), and the approval of agency responses to administrative appeals of denials of Freedom of Information Act requests” (Appendix A, 18).
• “The conduct of administrative hearings to determine the eligibility of any person for a security clearance, or involving actions that affect matters of personal reputation or eligibility to participate in government programs” (Appendix A, 18).

Indirect:
• “Service as arbitrators or provision of alternative dispute resolution (ADR) services” (Appendix B, 5).
• “Provision of non-law-enforcement security activities that do not directly involve criminal investigations, such as prisoner detention or transport and non-military national security details” (Appendix B, 9).
Notably, for present purposes, Policy Letter 11–01.3(b)(1) states that the defi-
nition of an inherently governmental function “does not normally include—(1)
gathering information for providing advice, opinions, recommendations, or ideas
to Federal Government officials.” In other words, simply gathering information
is not an activity restricted to government employees. Gathering and analyzing
information may be performed by government contractors in either the public
or the nonprofit sector. This covers a broad range that includes data-gathering
activities performed by universities (university-affiliated research centers, or
UARCs), federally funded research-and-development centers (FFRDCs), or even
in the context of workaday reporting by nonprofit organizations, such as hospitals
reporting statistics on the use of their emergency care facilities for treatment of
influenza. What differentiates the inherently governmental from the contractor’s
“critical function” role in information is what these actors might permissibly
do with the information they gather. Governmental knowledge work consists
in the use of knowledge and skill to establish and maintain citizens’ authorized
purposes. Individuals and organizations that are not part of the government may
be instrumental for the accumulation of information useful to the propagation of
government purposes, but as instruments these contractors may not decide the ends to which they are put; those ends are legitimately determined by elected officials or their appointees.
Refinement
Gathering information from “clouds” and “[data] lakes” requires methodological
refinement on an unprecedented scale. The goal of refinement in data analysis is to
succeed in satisfying public interest principles. To mine data to improve national
security—for example, to succeed in “finding tiny nuggets of data that might turn out
to be a terrorist communication or signal, and that is a huge undertaking” (Konkel
2013a)—requires a level of data analysis and visualization that extends well beyond
the capacity of ordinary citizens. For example, the NSA has disclosed graph visualizations of unparalleled scale, spanning more than four trillion data points (Konkel 2013b). In the case
of such massive data work, refinement requires advances in supercomputing, but
also in intuitive threat and risk perception by individuals. Thus, refinement entails
hardware, software, and human resources improvements.
Refinement of Big Data methods includes adaptation of extraction platforms, security software, retrieval and management techniques, and data archive management,
to name a few, in order to match citizen-negotiated ends. However, regardless of its
level of refinement, if the ends to which a tool is put are ill-suited to the preferences of the population, it should not be used. The principle of beneficence requires
that there be some foreknowledge of the needs and requirements of the recipient of
beneficent acts. If, for example, extraction programs monitor social media access
in ways that could expose individuals’ relationships, preferences, and families to
unwanted public scrutiny, then regardless of the efficiency of the extraction, it should
not be done by governmental entities without explicit permission.
Also among the requirements of refinement for governmental use is disclosure of
systematic biases in data analysis (Berman 2013). Refinement of human bias may
be the least magnificent of the advances in Big Data knowledge work, but it is likely
to be the most difficult and imperative. Refinements of ordinary bias can come in
two ways. The first is insistence on a provisional and falsificationist approach to
the practice of knowledge production. The second is insistence on transparency in
the practice of knowledge management.
Falsificationism is the idea, advanced by Karl Popper (1963), that scientific inquiry
should be designed to disconfirm hypotheses rather than confirm them. In practice, a
falsificationist approach requires that analysts approach their data questioningly, as if
the data could show their assumptions to be wrong or demonstrate that the proposed
program will not work. This may include questioning whether an instrument should
be used at all, as in the case of “fixing” or “cleaning” messy data. Discovering and
disclosing bias requires that analysts avoid falling into the trap of intellectual narcissism, the belief that bigger data produce conclusive results, and that they not take as social gospel the statistical significance of all their findings.
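A minimal numerical sketch makes the hazard concrete. In the hypothetical Python example below (the data are simulated and the scenario is illustrative, not drawn from any program discussed here), two groups differ by a practically negligible amount, yet at Big Data sample sizes a standard significance test returns a vanishingly small p-value; a falsificationist analyst would therefore weigh the effect size alongside the test.

import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=42)

n = 5_000_000  # a Big Data-scale sample per group (hypothetical)
# Two populations whose true difference is trivially small: 0.01 standard deviations.
group_a = rng.normal(loc=0.00, scale=1.0, size=n)
group_b = rng.normal(loc=0.01, scale=1.0, size=n)

# A conventional significance test "confirms" a difference at any usual threshold...
t_stat, p_value = stats.ttest_ind(group_a, group_b)

# ...but the standardized effect size (Cohen's d) shows it is practically nil.
cohens_d = (group_b.mean() - group_a.mean()) / np.sqrt(
    (group_a.var() + group_b.var()) / 2
)

print(f"p-value:   {p_value:.2e}")
print(f"Cohen's d: {cohens_d:.4f}")

With samples this large, statistical significance is nearly guaranteed; the falsificationist discipline lies in asking whether the measured effect is large enough to matter before building a program upon it.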
The intermediate principle of refinement requires improving technical skills as
instruments useful for ends compatible with the public interest as revealed or as
forecast. While Big Data could be put to use by citizens and government clients to
rationalize their policy demands before presenting them in the political marketplace,
the risky near future means that barriers to citizen access to and use of Big Data
may need to be restrained by the ordinary processes of executive accountability to
legislative and judicial oversight.
On the governmental side, one key promise of Big Data is that it could allow
governments to forecast policy demands and administrative needs before citizens
even express their views in the voting booth. As Timothy Andrews reiterates:
One example of this is the Centers for Disease Control and Prevention (CDC)
BioSense2.0 and Special Bacteriology Reference Laboratory (SBRL). The SBRL
may enable researchers to define whether an emerging pathogen has the characteris-
tics of a substantial threat before its pathogenicity is known, so that harm reduction
can be achieved before harm occurs to humans or animals. Consequently, before
the public knows what it wants, the government can know what is in the public’s
interest. Such potential opens the door to paternalistic choices.
Refinement must be bolstered by another commitment—transparency. This is
particularly important to ensure that data are properly verified and user access agree-
ments are clear and nonprejudicial.7 Data verification should be refined to include all
of the information relevant for an outside individual to understand the provenance,
compilation, changes to (e.g., creation or weighting of variables), and user require-
ments for the data. User requirements may include strongly technical language, but
beyond necessary technical and legal barriers to entry, no further stipulations should
be made that restrict access to data. The rules governing Freedom of Information
Act (FOIA) requests are one possible set of reasonable rules for implementation
of access to Big Data produced by government. These rules establish regulatory
guidelines that protect national security and data privacy, but they do not rise to
meet the standard of an ethical principle, such as precaution, that outlines when
data should or should not be shared.
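What such a verification record might look like in practice can be sketched briefly. The following Python fragment is a hypothetical illustration, not an established schema; the field names simply track the elements named above, namely provenance, compilation, changes, and user requirements.

from dataclasses import dataclass, field
from typing import List

@dataclass
class DatasetVerificationRecord:
    """A hypothetical provenance summary accompanying a released data set."""
    resource_name: str
    provenance: str    # who collected the data, when, and under what protocol
    compilation: str   # how records were assembled, merged, or de-duplicated
    changes: List[str] = field(default_factory=list)            # e.g., variables created or reweighted
    user_requirements: List[str] = field(default_factory=list)  # access conditions, kept minimal

record = DatasetVerificationRecord(
    resource_name="Emergency-department influenza visits (illustrative)",
    provenance="Reported weekly by participating hospitals under an approved protocol",
    compilation="Hospital feeds merged on facility identifier; duplicate reports dropped",
    changes=["Created a visit-rate variable weighted by catchment population"],
    user_requirements=["De-identified extract only", "No attempts at re-identification"],
)
print(record)

Publishing a record of this kind alongside the data would let an outside reader judge the data’s fitness for a purpose without stipulations, beyond necessary technical and legal ones, that restrict access.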
Precaution
The best methods and most public-interested approach to data management and
analysis will not prevent all potential harms from government use of Big Data. De-
cisions made with even the most rigorously analyzed empirical data may still have
unforeseen ill consequences. Examples from healthcare research abound; the removal of vetted “blockbuster” drugs, such as Vioxx, from pharmacy shelves because of unanticipated risks of long-term use provides a reasonable cautionary tale.
Even for those not directly harmed by adverse decisions, there are knock-on effects,
such as reductions in service, increases in unwanted services, and loss of privacy,
that arise even from well-intentioned uses of Big Data. Instances of mistaken place-
ment on no-fly lists, mistaken denials of credit, and incorrect information on health
records are but a few examples. The National Academy of Sciences enumerated the
privacy risks of Big Data in the following way:
The rich digital record that is made of people’s lives today provides many benefits
to most people in the course of everyday life. Such data may also have utility
for counterterrorist and law enforcement efforts. However, the use of such data
for these purposes also raises concerns about the protection of privacy and civil
liberties. Improperly used, programs that do not explicitly protect the rights of
innocent individuals are likely to create second-class citizens whose freedoms
to travel, engage in commercial transactions, communicate, and practice certain
trades will be curtailed—and under some circumstances, they could even be
improperly jailed. (National Research Council 2008, sec. 3.2)
Outside the realm of analysis-based decision-making, the dangers of Big Data also
include “over-provisioning access, inadvertently exposing personally-identifiable
information and transferring data outside of a required geographical location”
(Robinson 2013). Taking the 2014 proposal to build a “Big Data to Knowledge”
(BD2K) initiative in the National Institutes of Health as an example, the dangers
become quite obvious if the beneficently intentioned initiatives are not constrained
by an attitude of precaution and an imperative to refinement. The BD2K program
proposes to:
• Facilitate the broad use and sharing of large, complex biomedical data sets through the development of policies, resources and standards;
• Develop and disseminate new analytical methods and software;
• Enhance training of data scientists, computer engineers, and bioinformaticians; and
• Establish Centers of Excellence to develop generalizable approaches that address important problems in biomedical analytics, computational biology, and medical informatics. (Kalil and Green 2013)
Conclusion

Over the years, the pages of Public Integrity have queried fads, scandals, and “tectonic shifts” as seemingly ponderous as Big
Data. Many of these fads evaporated, and the players in the scandals were relegated
to infrequently used case studies. What remained constant is the heart of this ar-
ticle and this journal—thinking openly and accessibly, through timeless virtues
and values, to find a solution to the problems of the day. In the mission statement
of Public Integrity (www.mesharpe.com/mall/results1.asp?ACR=pin/), the editors
reiterate that the “driving force [of modern public service and the journal] is the
notion of integrity that is so basic a part of democratic life.” Thus, it is asked here
whether a commitment to beneficence, precaution, and refinement in government
knowledge work could undermine the value of protecting the public interest against
government interests.
In the opening paragraphs of this article, the problem of Big Data was posed
in the form of a question: How ought government protect the public against the
use of their data but also protect the public through use of their data? In the sub-
sequent pages, I showed how the production and management of knowledge by
government employees is an “inherently governmental task” rooted in the basics
of political organization and “so intimately related to the public interest” that it
must be governed by principles designed to constrain the actions of individuals
in a relationship of legitimate and authoritative, but disproportionate, power to
the recipients of their intended beneficent acts. I posited that a comparable set
of ethical principles for inherently governmental Big Data work already exists
in the form of the ethical principles outlined for the use of government funds to
produce generalizable knowledge using human participants in research projects.
I outlined that the optimal guiding principle for both research and knowledge
work is beneficence, where the intent to act beneficently must be complemented
by mid-range ethical principles that inform how knowledge is gathered (refine-
ment) and how it is used (precaution). These suggestions were outlined to support
a conjecture that the biggest challenge for integrity in public administration for
the next decade (at least) is the management of decision-making guided by Big
Data–driven governance. Certainly government employees have an obligation
to “responsible risk-taking,” beneficence, and the public interest, but where do
these obligations end and the rise of a total-information, privacy-rights-denying technocracy begin?
Beneficence is a virtue for governing disparate power relationships, and even
a beneficent government will wield disproportionate power. If it exercises power
beyond the limits of its contract with the people, it threatens to become a pater-
nalistic overseer. If it does so by using tools of technological intervention that are
unavailable to any except the lucky few chosen to govern, then even a purportedly
beneficent government becomes a paternalist technocracy. Big Data use by the NSA is a clear illustration of this possibility.
Big Data use does not have to decline into technocracy or paternalism if it is con-
ducted with the ethics of genuine beneficence in mind, whereby an open discussion of
the principles and the practices of refinement of tools and implementation of precau-
tions become part of the public conversation on technology and governmental roles.
To the extent that governments do not undermine their own legitimacy by exceeding the authority contracted to them by the people, and ensure that their mandate to provide
security is tempered by the value of securing space for decisional autonomy and liberty,
the descent of Big Data governance into technocratic paternalism can be avoided.
NOTES
1. Data integrity refers to the idea that data are complete, accurate, unbiased, and
correctly correlated with the appropriate personal identifiers throughout an individual’s
lifetime.
2. For the purposes of this article, e-government is part of knowledge work. E-
government is “the use by the government of Web-based Internet applications and other
information technologies, to—(A) enhance the access to and delivery of government
information and services to the public, other agencies, and other government entities; or
(B) bring about improvements in government operations that may include effectiveness, efficiency, service quality or transformation” (E-Government Act of 2002, quoted in Larsen and Milakovich 2005, p. 57).
3. The meaning of societal benefits will not be contested here, but elsewhere I con-
test the vagrant uses of this term and related terms, such as benefits to society (Jordan
2014).
4. An analogy could be drawn between this definition of beneficence and the Islamic principle of al-amr bi-l-maʿrūf wa-n-nahy ʿan al-munkar (to enjoin the good and forbid evil). Although used as justification for restrictions on behavior, this principle, philosophically speaking, commands believers to beneficence in the same manner in which Christians might be enjoined to charity.
5. Note, for example, Cicero’s discussion in De Officiis (2.61–65) of “one-off”
charitable giving versus true charity as an example of weak versus strong beneficence.
6. This definition is restricted to individual expertise at the expense of developing
a further definition of collective expertise, as the problem of aggregating the level and
application of knowledge is more than can be well explicated here.
7. “Data verification is the process that ensures that the data was collected, annotated,
identified, stored and made accessible in conformance with a set of approved protocols
for the resource” (Berman 2013, p. 153).
REFERENCES
Aristotle. 1908/2009. Nicomachean Ethics. Translated by William David Ross. Oxford: Oxford University Press.
Ball, James. 2014. “NSA Collects Millions of Text Messages Daily in ‘Untargeted’
Global Sweep.” Guardian, January 16. Available at www.theguardian.com/
world/2014/jan/16/nsa-collects-millions-text-messages-daily-untargeted-global-
sweep, accessed January 16, 2014.
Barkin, Noah. 2014. “Spying Plunges U.S.-German Ties Lower Than Iraq War: Merkel Ally.” Reuters, January 16. Available at www.reuters.com/article/2014/01/16/us-germany-usa-spying-idUSBREA0F0TX20140116, accessed January 17, 2014.
Berman, Evan M., and Jonathan P. West. 1998. “Responsible Risk-Taking.” Public Administration Review 58, no. 4: 346–352.
Berman, Jules J. 2013. Principles of Big Data: Preparing, Sharing, and Analyzing
Complex Information. Waltham, Mass.: Morgan Kaufmann/Elsevier.
boyd, danah, and Kate Crawford. 2012. “Critical Questions for Big Data.” Informa-
tion, Communication and Society 15, no. 5: 662–679.
Carter, Lucy. 2007. “A Case for a Duty to Feed the Hungry: GM Plants and the Third
World.” Science and Engineering Ethics 13, no. 1: 69–82.
Chanley, Virginia A.; Thomas J. Rudolph; and Wendy M. Rahn. 2000. “The Origins
and Consequences of Public Trust in Government: A Time Series Analysis.” Public
Opinion Quarterly 64, no. 3: 239–256.
Davis, Kord. 2012. Ethics of Big Data: Balancing Risk and Innovation. Sebastopol,
Calif.: O’Reilly Media.
Department of Commerce. 2009. “Safe Harbor Privacy Principles.” Available at www.
export.gov/safeharbor/eu/eg_main_018475.asp, accessed July 1, 2014.
Finkle, Jim. 2014. “Exclusive: Cybercrime Firm Says Uncovers Six Active Attacks on U.S. Merchants.” Reuters, January 17. Available at www.reuters.com/article/2014/01/17/us-target-databreach-idUSBREA0G18P20140117, accessed January 17, 2014.
Golding, Martin P. 1972. “Obligations to Future Generations.” Monist 56, no. 1:
85–99.
Government Business Council. 2013. “Turning Optimism into Reality: How Big Data
Is Transforming Government—A Candid Survey of Federal Employees.” Available
at www.govexec.com/gbc/turning-optimism-reality-how-big-data-transforming-
government/61934, accessed August 13, 2013.
Jordan, Sara R. 2014. “Beneficence in Public Administration Research; or, Who Needs
the NSF Anyway?” Administration & Society 46, no. 1: 112–121.
Kalil, Tom. 2012. “Big Data Is a Big Deal.” Office of Science and Technology Policy
Weblog, March 29. Available at www.whitehouse.gov/blog/2012/03/29/big-data-
big-deal, accessed August 21, 2013.
Kalil, Tom, and Eric Green. 2013. “Big Data Is a Big Deal for Biomedical Research.”
Office of Science and Technology Policy Weblog, April 23. Available at www.
whitehouse.gov/blog/2013/04/23/big-data-big-deal-biomedical-research, accessed
August 21, 2013.
Kalil, Tom, and Fen Zhao. 2013. “Unleashing the Power of Big Data.” Office of
Science and Technology Policy Weblog, April 18. Available at www.white-
house.gov/blog/2013/04/18/unleashing-power-big-data/, accessed August 21,
2013.
Konkel, Frank. 2013a. “How to Spot a Data Scientist.” FCW: The Business of Federal
Technology, April 24. Available at http://fcw.com/Articles/2013/04/24/define-data-
scientist.aspx, accessed August 21, 2013.
———. 2013b. “NSA Shows How Big ‘Big Data’ Can Be.” FCW: The Business of
Federal Technology, June 13. Available at http://fcw.com/Articles/2013/06/13/NSA-
big-data.aspx, accessed August 21, 2013.
Larsen, Bettina, and Michael Milakovich. 2005. “Citizen Relationship Management
and E-Government.” In Electronic Government, edited by Maria A. Wimmer,
Roland Traunmüller, Åke Grönlund, and Kim V. Andersen, pp. 57–68. Berlin:
Springer.
Mayer-Schönberger, Viktor, and Kenneth Cukier. 2013. Big Data: The Revolution That Will Transform How We Live, Work and Think. New York: Houghton Mifflin.
National Research Council. 2008. Protecting Individual Privacy in the Struggle Against Terrorists: A Framework for Program Assessment. Washington, DC: National Academies Press.
Office of Federal Procurement Policy. 2011. “Publication of the Office of Federal
Procurement Policy (OFPP) Letter 11-01, Performance of Inherently Governmental
and Critical Functions.” Office of Management and Budget. 76 FR 176: 56227–
56242. Available at www.gpo.gov/fdsys/pkg/FR-2011-09-12/pdf/2011-23165.pdf.
Office of the Press Secretary, White House. 2014. “Remarks by the President on
Review of Signals Intelligence.” Available at www.whitehouse.gov/the-press-
office/2014/01/17/remarks-president-review-signals-intelligence/, accessed January
17, 2014.
Popper, Karl. 1963. Conjectures and Refutations: The Growth of Scientific Knowledge. London: Routledge.
Prah, Pamela M. 2014. “Target’s Data Breach Highlights State Role in Privacy.” USA Today. Available at www.usatoday.com/story/news/nation/2014/01/16/target-data-breach-states-privacy/4509749/, accessed January 17, 2014.
Robinson, Laura. 2013. “The Security Pitfalls of Mining Big Data.” USA Today, July 3. Available at www.usatoday.com/story/cybertruth/2013/07/03/cybersecurity-pitfalls-of-mining-big-data/2486783/, accessed August 21, 2013.
Sathi, Arvind. 2013. Big Data Analytics. Boise, Ida.: MC Press.
Siegel, Eric. 2013. Predictive Analytics: The Power to Predict Who Will Click, Buy, Lie
or Die. Hoboken, N.J.: John Wiley.
Tolbert, Caroline J., and Karen Mossberger. 2006. “The Effects of E-Government on
Trust and Confidence in Government.” Public Administration Review 66, no. 3:
354–369.