Data Ethics: Practical Strategies For Implementing Ethical Information Management and Governance 2nd Edition Katherine O'Keefe
‘I can’t think of a subject more relevant than data ethics. Given that we live in a
data-dependent world, the most important question is not “Can I do something
with data?” but “Should I do something with data?”. These questions should be
considered by teens learning to code, business people gathering and exploiting
customer data, scientists developing and releasing AI applications, and anyone
creating and using data. Katherine O’Keefe and Daragh O Brien provide excellent
groundwork for addressing these questions and give us the tools to think and act
with our data in a responsible way. Read their book, share it and apply it!’
Danette McGilvray, President and Principal, Granite Falls Consulting and
author of Executing Data Quality Projects
‘Ethics play an increasingly important role when considering how to collect and
use personal information. This updated edition of Data Ethics clearly explains
how to take ethics seriously and make it an integral part of business information
management and governance. The combination of sound and up-to-date legal
theories with practical tips and case studies makes it a useful handbook for
anyone working with data on a regular basis. The only disadvantage is the
realization my own to-do list grew a little longer.’
Paul Breitbarth, Senior Visiting Fellow, European Centre on Privacy and
Cybersecurity, Maastricht University
Second edition
Data Ethics
Practical strategies for implementing
ethical information management and
governance
Katherine O’Keefe
Daragh O Brien
Publisher’s note
Every possible effort has been made to ensure that the information contained in this book is
accurate at the time of going to press, and the publisher and authors cannot accept responsibility for any errors or omissions, however caused. No responsibility for loss or damage
occasioned to any person acting, or refraining from action, as a result of the material in this
publication can be accepted by the editor, the publisher or the authors.
First published in Great Britain and the United States as Ethical Data and Information Management:
Concepts, tools and methods in 2018 by Kogan Page Limited
Second edition published in 2023
Apart from any fair dealing for the purposes of research or private study, or criticism or review, as
permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms and licences issued by the CLA. Enquiries concerning reproduction outside these terms should be sent
to the publishers at the undermentioned addresses:
2nd Floor, 45 Gee Street, London EC1V 3RS, United Kingdom
8 W 38th Street, Suite 902, New York, NY 10018, USA
4737/23 Ansari Road, Daryaganj, New Delhi 110002, India
www.koganpage.com
The right of Katherine O’Keefe and Daragh O Brien to be identified as the authors of this work has
been asserted by them in accordance with the Copyright, Designs and Patents Act 1988.
ISBNs
Hardback 978 1 3986 1029 3
Paperback 978 1 3986 1027 9
Ebook 978 1 3986 1028 6
CONTENTS
Foreword
Index
FOREWORD
This is not a light topic. This is a serious and challenging effort. But
Daragh and Katherine are well-immersed and knowledgeable in this work
and have written another excellent edition of their ethics book.
I need to reinforce their discussion and urge the reader to embrace the
topic and the learning within, but not because ‘history is repeating’ or ‘this
is required for my organization’. The reader must be attentive because we
have entered the realm of anthropology.
Anthropology is the science of being human. Relating to the study of
human behaviour from an environmental, biological and societal perspective, anthropology observes all aspects of the human experience. In other
words, anything that is anthropological is PERSONAL. Given that data is
one of those movements that affect the human experience, it too is personal.
Anthropology shows us that humans either adapt to massive shifts in
society or suffer from the failure to adapt. Our unintended consequences
from the rise of data are the tip of the iceberg.
When we, as members of humanity, deal with data in an organizational
sense, we are compelled to behave. After all, this can affect career and livelihood – always a good motivator. When a business leader or politician says
‘data is important’, we easily move to the social or institutional behaviour.
But if this is all ‘anthropological’ then humanity needs to define the new
‘right things’ to do for individuals – at the workplace, and everywhere else.
What is the right thing to do with data? Rather than have a supervisor or
institution specify an institutional behaviour, maybe the discussion needs to
be reversed; what do you need to know about data behaviour before you
even start the job?
At the end of the day, when humans do new things, and then these new
things go off the rails, we usually find an answer in human behaviour. Data,
as a subject, is square in the centre of humans adapting to technology.
Daragh and Katherine provide excellent examples, justifications and
solutions.
The neat thing about anthropology is it is well studied. There are patterns
we can use to our advantage.
Organizations need to teach that they and their employees operate a data
supply chain with far-reaching consequences. The young child needs to learn that they generate data and should not share personal data, while at the same time knowing they need to share toys and kindness. The high-school
student needs to learn to be more judicious before downloading the new
app. Advanced institutions of learning need to offer and require dedicated
classes in organizations, data and ethics.
John Ladley
Introduction
Why write a book on data ethics?
Introduction
We live in interesting times. The pace of innovation and development in
various fields of information management and information technology continues to accelerate, with functionality and features common today that
would not have appeared out of place in science-fiction movies of even a few
years ago. We have been gathering and recording information in written and
pictographic forms for around 5,000 years. Cuneiform texts from ancient
Mesopotamia are among the oldest evidence of recorded history.
The advent of modern technologies means we are now recording, in a
year, more information than we have recorded in all of the preceding history of humankind. The pace of logging, recording and cataloguing of information continues to accelerate as our technical capabilities evolve. The
challenge we now face is whether our love affair with technology and technological innovation may have left us ill-prepared for the various ethical and moral issues that the uses of that technology increasingly throw at us on a day-to-day basis.
In the years since the first edition was published, there have been many developments, so while the time for a new edition was right, we also wanted to
make sure the title aligns with the content, which we feel is achieved with
‘Data Ethics’. The purpose of this book is to explore whether the fundamental ethical challenges we face are new variants of issues we have struggled with in all cultures for many thousands of years. While our technology capabilities advance at a phenomenal rate, the solutions needed to put ethical
principles into practice might be found in the fundamental disciplines of
information management and how we manage and lead our information
organizations.
In the past few years, we’ve seen both a lot of rapid change in the fields
related to data and information ethics and, at the same time, less change
than hoped. In 2018, we were seeing a groundswell in awareness of the need
for data ethics. As we were finalizing proofs for the first edition of our book,
scenarios that we had hypothesized might be the case from smaller and less-publicized whistleblowing were splashed across the headlines as major international scandals, causing crises in multiple governments and multinational
organizations. Between 2018 and 2023, some events have brought the importance of data ethics to very public attention and there has been some
movement towards attempting to put teeth into ethical standards at national
and supranational levels. The European Union took up the challenge to promote ethical AI put to them by the former European Data Protection
Commissioner and published guidelines for ethical AI, following up with a
move towards creating an EU Regulation for ethical AI.
At a national level, some countries have looked to implement and support
ethical frameworks for data use at different levels. Two examples we are
personally familiar with are Scotland and Ghana: in 2020, Scotland created
out more complex problems based on a function that learns the parameters of the required outcome from the data available), and deep learning (the development of mathematical models that break complex analysis into simpler discrete blocks that can in turn be adjusted to better predict final outcomes from the AI process).
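The parenthetical definition of machine learning above – a function that learns the parameters of the required outcome from the data available – can be illustrated with a minimal sketch. This is not from the book: it is a toy gradient-descent line fit in Python, with invented names and data.

```python
# Toy example of "learning parameters from data": fit y = w*x + b
# by gradient descent on mean squared error. Purely illustrative.

def fit_line(xs, ys, steps=5000, lr=0.01):
    """Learn slope w and intercept b that minimize squared error."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradients of mean squared error with respect to w and b
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]             # generated by y = 2x + 1
w, b = fit_line(xs, ys)
print(round(w, 2), round(b, 2))  # close to 2.0 and 1.0
```

Deep learning, in the sense used above, stacks many such adjustable blocks so that their parameters can be tuned jointly against the data.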
These tools, and the technology platforms they run on, are increasingly
powerful. Just as the world of data management software has evolved in
recent years to develop more powerful software tools and platforms for data
analytics and visualization, hardware manufacturers are now beginning to
develop processing chips for the next generation of smartphones, computers
and Internet of Things (IoT) devices to allow complex AI functions to be
deployed ‘on device’, rather than relying on servers in data centres or hosted
in cloud-based environments.
The ‘Quantified Self’ movement is a good example of the development of
our data-gathering and analytics capabilities over the past few years. Twenty
years ago, if we were tracking our exercise routines we would have used a
range of independent technologies such as stopwatches and heart-rate monitors and would have recorded our progress and performance indicators in
a physical notebook (because our computers were too big to bring to the
gym or too heavy to bring on a run). We might have kept a food diary in the
notebook as well. We might have manually tracked historic trends or used
group activities to compare our progress against a representative sample of
our peers.
Today, we wear lightweight fitness trackers that also track our location
and movement, recording our exercise performance in terms of distance, effort and other performance indicators. We might log the food we are eating
by taking photographs of our meals instead of writing in a notebook.
Further logging of our activities and actions happens automatically through our wearable technologies. We track our performance against peers through
pooled data that is shared via our applications.
Increasingly, our software tools can infer the calorie and nutrient content
of food we eat based on a machine-learning analysis of a photograph of our
meals. AI and analytics can enable the automatic tailoring of our exercise
regimes to our fitness levels, our ability and our progress. The same technologies can also predict health issues that might arise based on the data they ingest about us. Add to the mix more specialized additional technologies to read blood sugar, blood pressure or other aspects of physical health,
and our simple smartphone is the hub of a device that increasingly resembles
the tricorder in Star Trek.
targeting, a technology that requires the use of AI to analyse and filter vast
amounts of data about people from social media, has led to what the Online
Privacy Foundation has referred to as ‘the weaponized, artificially intelligent, propaganda machine’ (Revell, 2017). By gathering data in contexts where people are less likely to be misleading or provide inaccurate information or take other steps to mask or protect their data from being gathered or measured, and by combining data across multiple data sets and data points, it has apparently become possible to manipulate and undermine a democracy without firing a single shot (Neudert, 2017).
All too often, particularly in the context of big data and analytics processes, we can be faced with a significant short-term win or benefit from the
use of data in new or novel ways. Often there is a potential benefit to society
arising from the processing. Sometimes that societal benefit is substantial.
However, often the impact on individuals and the choices they might make
or the freedoms they may otherwise enjoy can be disproportionate to the
benefit to society. In these contexts, the data-driven dilemma is one of determining whether, even if we can do something fancy with data or data-related technologies, we should.
How organizations embrace and apply ethics and ethical principles in their information management practices will become a source of competitive advantage, both when seeking customers and when attracting and hiring employees (Jenkin, 2015).
Many organizations are attempting to innovate in this area through the
introduction of ethics forums or taking part in public ethics discussion groups
or conferences. Increasingly, organizations are turning to standards bodies
such as the Institute of Electrical and Electronics Engineers Standards
Association (IEEE) to define and develop standards for information ethics.
But standards frameworks can only really advise or give guidance on what your data ethics should be; they can only provide guidance on the types of ethical decisions you should be taking. Likewise, an internal data ethics forum
that is not linked to some form of operational governance will be unable to
manifest any sustainable change in information management practices.
affect the choices people make about their movements (impacting freedom
of movement) or who they meet where (impacting rights of assembly).
The type of trade-off we see in Figure 0.1 is at the heart of the balancing test
required under EU data protection laws when organizations are seeking to rely
on the fact that something is in their legitimate interests. It may be in the legitimate interest of the organization, but that interest needs to be balanced against the rights and interests of the individuals affected by the processing. It
also arises in other countries in the context of privacy impact assessments,
where many countries now require consideration of the impact of data processing on the choices people might make about interacting with services, particularly services offered by government agencies.
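As a purely illustrative sketch – not a legal test, and not from the book – the balancing exercise described above can be caricatured as weighing a benefit score against an individual-impact score. All function names, scores and thresholds here are invented.

```python
# Hypothetical caricature of a legitimate-interest balancing test.
# Scores are invented 0-10 ratings from an impact assessment.

def legitimate_interest_balance(benefit: int, individual_impact: int) -> str:
    """Weigh organizational/societal benefit against impact on individuals."""
    if individual_impact > benefit:
        return "do not proceed: impact outweighs the interest"
    if individual_impact > 0:
        return "proceed only with safeguards (minimization, transparency)"
    return "proceed"

print(legitimate_interest_balance(benefit=7, individual_impact=9))
```

The point of the caricature is only that the interest of the organization is never assessed in isolation: the impact on the individuals affected always sits on the other side of the scale.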
This type of trade-off and balancing decision is also at the heart of many
of the ethics frameworks that govern professions such as lawyers or doctors.
For example, the lawyer’s professional duty of confidentiality has high value
to society, as without it people who need legal advice or representation
might be afraid to seek it out (American Bar Association, 2020; Law Society
of Ireland, 2013). Likewise, medical practitioners, psychologists or counsellors all operate under an ethical presumption of confidentiality. This has
social value and minimizes the invasiveness or intrusion into the personal
and private life of the patient or others (American Medical Association,
2016; Psychological Society of Ireland, 2019). However, these ethical duties
can be overruled where there is a wider societal issue or where the disclosure
is in the interests of the individual.
Chapter summary
● The issues, risks and potential benefits that are presented by our increasingly pervasive tools and technologies for data capture and data analysis, much of which has evolved rapidly over the last decade.
Questions
Throughout the book we will end each chapter with some questions and
thoughts for practitioners and students. These are intended to trigger introspective learning and may not have an answer. In this chapter, we start off easily:
Further reading
In this section of each chapter we provide some hints for other related reading you
might want to consider relevant to the chapter. This will be in addition to the
references for each chapter. For this introductory chapter, we would suggest the
following further reading:
Floridi, L (2014) The Fourth Revolution: How the infosphere is reshaping human
reality, Oxford University Press, Oxford
Hasselbalch, G and Tranberg, P (2016) Data Ethics: The new competitive
advantage, PubliShare, Copenhagen
References
American Bar Association (2020) Rule 1.6 Confidentiality of Information: Client–
Lawyer Relationship, Model Rules of Professional Conduct, www.americanbar.
org/groups/professional_responsibility/publications/model_rules_of_professional_
conduct/rule_1_6_confidentiality_of_information.html (archived at https://perma.
cc/N8A7-BZNC)
American Medical Association (2016) Chapter 3: Opinions on privacy,
confidentiality & medical records, AMA Principles of Medical Ethics,
www.ama-assn.org/sites/default/files/media-browser/code-of-medical-ethics-
chapter-3.pdf (archived at https://perma.cc/7XBB-V6GL)
Arthur, C (2014) Facebook emotion study breached ethical guidelines, researchers
say, The Guardian, 30 June, www.theguardian.com/technology/2014/jun/30/
facebook-emotion-study-breached-ethical-guidelines-researchers-say (archived
at https://perma.cc/NW2W-EXFQ)
Buytendijk, F (2015) Think about digital ethics within continually evolving
boundaries, Gartner, 1 April, www.gartner.com/smarterwithgartner/think-about-
digital-ethics-within-continually-evolving-boundaries/ (archived at https://
perma.cc/LU6G-XH23)
Cao, L (2016) Data science and analytics: a new era, International Journal of Data
Science and Analytics, 1 (1), 1–2
Cukier, K and Mayer-Schönberger, V (2013) The dictatorship of data, MIT
Technology Review, 31 May, www.technologyreview.com/s/514591/the-
dictatorship-of-data/ (archived at https://perma.cc/5DXM-4KHF)
Davenport, T H and Patil, D (2012) Data Scientist: The sexiest job of the 21st
century, Harvard Business Review, October, hbr.org/2012/10/data-scientist-
the-sexiest-job-of-the-21st-century (archived at https://perma.cc/9VA4-2W3P)
European Data Protection Supervisor (2015) Towards a New Digital Ethics: Data,
dignity, and technology, 11 September, edps.europa.eu/sites/edp/files/
publication/15-09-11_data_ethics_en.pdf (archived at https://perma.cc/W7RA-
XUDV)
Google Scholar (nd) Luciano Floridi, scholar.google.com/citations?user=
jZdTOaoAAAAJ (archived at https://perma.cc/4L8Y-F748)
Helbing, D et al (2017) [accessed 1 August 2017] Will democracy survive
Big Data and artificial intelligence? Scientific American, 25 February,
www.scientificamerican.com/article/will-democracy-survive-big-data-and-
artificial-intelligence/ (archived at https://perma.cc/LK7X-R2C8)
Jenkin, M (2015) [accessed 1 August 2017] Millennials want to work for
employers committed to values and ethics, The Guardian, 5 May, www.
theguardian.com/sustainable-business/2015/may/05/millennials-employment-
employers-values-ethics-jobs (archived at https://perma.cc/TJ37-MQJR)
Law Society of Ireland (2013) A Guide to Good Professional Conduct of Solicitors,
3rd edn, Law Society of Ireland, Dublin
Lee, D (2014) Samaritans pulls ‘suicide watch’ Radar app, BBC News, 7 November,
www.bbc.com/news/technology-29962199 (archived at https://perma.cc/3WC2-
5SKH)
Lekach, S (2016) Privacy Panic? Snapchat Spectacles raise eyebrows, Mashable,
16 November, mashable.com/article/snapchat-spectacles-privacy-safety
(archived at https://perma.cc/4BB8-FSEZ)
Lifestream Blog (2011) Lifelogging, http://lifestreamblog.com/lifelogging/ (archived
at https://perma.cc/V94K-XAH4)
Neudert, L-M (2017) [accessed 1 August 2017] Computational Propaganda in
Germany: A cautionary tale, Oxford Internet Institute, University of Oxford,
docslib.org/doc/4608254/computational-propaganda-in-germany-a-cautionary-
tale (archived at https://perma.cc/9KCA-TGXH)
Novi, S (2017) Cambridge Analytica: psychological manipulation for Brexit and
Trump? Medium, 9 July, snovi.medium.com/cambridge-analytica-psychological-
manipulation-for-brexit-and-trump-2e73c2be5117 (archived at https://perma.cc/
J8ZU-MCFD)
Orme, J (2014) Samaritans pulls ‘suicide watch’ Radar app over privacy concerns,
The Guardian, 7 November, www.theguardian.com/society/2014/nov/07/
samaritans-radar-app-suicide-watch-privacy-twitter-users (archived at https://
perma.cc/8VDY-3A2B)
Pariser, E (2011) The Filter Bubble: How the new personalized web is changing
what we read and how we think, Viking, London
Pongnumkul, S, Chaovalit, P and Surasvadi, N (2015) Applications of smartphone-
based sensors in agriculture: A systematic review of research, Journal of Sensors,
www.hindawi.com/journals/js/2015/195308/ (archived at https://perma.cc/
FP89-9JQ2)
01 Ethics in the context of data management
What will we cover in this chapter?
In this chapter you will:
memory over the past several thousand years. In many ways, cultures with
long-developed literary traditions forgot the value of oral traditions in the
encoding and transmission of memory, dismissing deep historical and scientific knowledge of indigenous peoples as ‘merely’ myths and legends. More
recently, researchers and scientists have begun to identify the scientific and
historical knowledge in indigenous oral traditions (Terry et al, 2021).
Socrates’ words could be easily adapted to any of the emergent technologies in data management and would still be as relevant and provocative of debate as they were when first recorded 2,500 years ago. At their heart is a
fundamental truth that in any technology there are both benefits and risks.
Jump forward a couple of millennia, and philosophers are still talking
about the meaning of our relationships to technology. Martin Heidegger
argued that modern technology allowed us to have new relationships with
the world that were not previously possible (Heidegger, 1977). Heidegger
considered the ways of thinking that lie behind technology and saw technology as a means to an end, not an end in and of itself. But the advent of
technologies changed the relationships between people and the other objects
in nature that we interact with and, through the operation of technology, a
new nature of a thing can be revealed. For example, the use of mining technology can turn a beautiful hillside into a tract of land that produces coal or
iron ore. He also gave the example of technology being used to manipulate uranium to produce nuclear energy, but highlighted that this is a process that can have either a destructive or a beneficial use.
When discussing technology, we often use similar phrases interpreted in
a way that suggests that ‘technology is ethically neutral’ and that ‘it just
depends on how you use it’. But this interpretation does not consider the
many decision points that go into the design and development of any tool or
technology. At each stage, there is an assumption made as to what something should be and how something should work. Many of these decisions
have ethical impacts, and some tools are designed to be much better at
producing some outcomes than others.
For an example of technology much less fraught with explosive potential
than nuclear energy, think about a pair of scissors. A tool as simple as scissors requires a number of technologies and manufacturing capabilities, and decisions that have ethical impacts. The mining and metallurgical technologies that create the metal for the blades, the plastics moulding technologies
that produce the handles, and the manufacturing technologies that assemble
the scissors, all have their impacts on people and the environment they are
in. This is the greater social and environmental context of decisions and
ethical impacts that surround the design decision for a simple tool. However,
the design of the simple tool itself can also affect the dignity of the people who use it. Scissors aren’t a neutral technology if you are a left-handed person using a pair of scissors made with the design assumption that the user is right-handed. The assumptions underlying the decisions we make as to what information should be included in a data set or design specification, how technology should work, what is a ‘good’ result, and the characteristics of our default ‘user’ can have direct effects on people and their experience of
the tool. Whether scissors are designed as an ergonomic tool to be used by a
right-handed person, a left-handed person, or both, as an ambidextrous set
of scissors, can have an impact on individuals in terms of the usability of the
scissors. The thinking and considerations that are applied to the design and
application of technology affect (in this case in a literal way) how objects in
the real world are manipulated.1
Unfortunately, it’s not just the aching arm of a left-handed paper cutter
that might be manipulated by technology. The potential for people to be
manipulated using technology is also a concern, and one that is increasingly
topical given the growing concerns about the abuse of social media and associated technologies to influence people’s moods, market products and services, and influence elections. While there are potentially significant benefits
from the technologies in question, they create significant ethical questions
about their impact on individual choice and agency.
In the same time frame as Heidegger, Mario Bunge argued that ‘the technologist must be held not only technically but also morally responsible for
whatever he designs or executes: not only should his artifacts be optimally
efficient but, far from being harmful, they should be beneficial, and not only
in the short run but also in the long term’ (Bunge, 1977). This echoes the
sentiment in Recital 4 of the EU General Data Protection Regulation, which
says that ‘the processing of personal data should be designed to serve
mankind’ (European Union, 2016).
areas of ethical issues or ethical risk in data that we will introduce in this
chapter as examples of the range of ethical questions you may encounter.
against the constitutional order or the moral code’ (van der Sloot, 2015). In
this, ‘privacy’ is not just a negative right to be ‘left alone’, but a positive right
to be free to develop one’s personality as an autonomous human.
Building on the concepts of privacy as a right related to human dignity as
Warren and Brandeis framed it, Stanley I Benn defined privacy in the context
of respect for the person as an autonomous individual or ‘chooser’.
Essentially, Benn framed the violation of privacy as a failure to respect personhood (Benn, 1971; Hudson and Husack, 1979). This human rights focus
brings us back to first principles, with an understanding that privacy as a
right upholds the treatment of a human as an autonomous individual, a
‘chooser’ who must be treated as an end, not just a means. The conceptualization of the individual as ‘chooser’ directly relates to the need to be able to
actively and knowingly consent to the processing of one’s information and
the purposes for which it is processed.
In the wake of human rights violations perpetrated in the years leading
up to, during and after the Second World War, European nations have
adopted a strong fundamental rights approach to privacy, regarding privacy
as a necessary right fundamental to the respect for human dignity. This fundamental rights-based focus is reflected both in the institution of an overarching data protection directive, and in Articles 7 and 8 of the Charter of Fundamental Rights of the European Union, which has binding treaty power.
This rights-based understanding of privacy has a deep history in European
philosophy and ethics, which are based in philosophical understandings of
personhood and the individual, including Immanuel Kant’s formulations of
the categorical imperative.2 In tracing back our understanding of privacy to
first principles, we may uncover the foundations of an ethical framework for
new developments in technology and actions.
This ethical approach ultimately finds expression in many of the fundamental principles of data privacy and data protection laws, which attempt
to provide a legal framework to give effect to the ethical values of privacy as
a defined right.
reporting, and the result was the direct impacts of the emissions released on
the environment and the global climate crisis. We will discuss this in more depth as a case study in Chapter 4. For the moment, the software, computing and data analytics involved in the modern automobile industry are an
example of how the ethics of data processing and environmental concerns
are linked.
in machine learning are significant. These costs and the expense of training
models also affect the accessibility of resources to engage in research.
This aspect of the environmental concerns in data ethics has challenged
the business models of some of the biggest players in the tech world. This
was brought to public attention well beyond the usual audience for research
in AI ethics, when the publication of a paper by a group of researchers from
academia and tech ‘On the dangers of stochastic parrots: Can language
models be too big?’ (Bender et al, 2021) coincided with Google precipitously firing two of its star researchers in AI ethics who were co-authors on the paper, raising questions of ‘ethics washing’ and corporate ‘capture’ of academic research. This influential paper raises a number of ethical questions on the environmental and societal implications of the trend in NLP
research to move towards ever more massive language models.
tential mitigations to the energy costs. For example, cooling systems that
increase efficiency by using sensors to determine where energy would best be
spent to cool the system, or integrated development of a data centre that
uses excess heat in a community district heating system to heat buildings in
the area. However, the concentration of data centres in a location that has
been considered ideal due to climate, resources and infrastructure has
strained the country’s energy grid to its limits and impacted the country’s
ability to meet climate targets.
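The sensor-driven cooling mitigation mentioned above can be sketched as a toy allocation rule: spend a limited cooling budget on the racks whose sensors report the most excess heat. Everything here – the names, the 27°C setpoint, the kW-per-degree factor – is invented for illustration and is not how any real data centre control system works.

```python
# Toy sketch: direct a limited cooling budget where sensors say it is
# most needed, hottest racks first. All names and numbers are invented.

def allocate_cooling(sensor_temps, budget_kw):
    """Greedily allocate cooling (kW) to racks above a hypothetical setpoint."""
    allocation = {}
    for rack, temp in sorted(sensor_temps.items(), key=lambda kv: kv[1], reverse=True):
        if budget_kw <= 0:
            break
        excess = max(0, temp - 27)            # degrees above a 27°C setpoint
        spend = min(budget_kw, excess * 0.5)  # invented 0.5 kW per excess degree
        if spend > 0:
            allocation[rack] = spend
            budget_kw -= spend
    return allocation

print(allocate_cooling({"rack_a": 35, "rack_b": 29, "rack_c": 24}, budget_kw=5))
```

The design point is simply that sensing lets the system concentrate energy where it does the most good instead of cooling the whole hall uniformly.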
The environmental and ethical concerns and trade-offs regarding the resource consumption of data centres are an example of questions in data ethics that require structural engagement at political and community planning levels, not simply at the level of individual or organizational decision making. They also illustrate how organizational decision making may externalize costs or impacts, affecting individuals and communities that might not have been considered as stakeholders. This will require nuanced discussion, as decisions like Singapore’s 2019 ban on building new data centres may only displace effects and do not address the demand for data processing power (Mah, 2021).