
PRAISE FOR DATA ETHICS 2ND EDITION

‘In a world where AI is creating a growing wave of often dubious information, O’Keefe and O Brien’s book should be mandatory reading for everyone in IT, media, regulatory bodies and beyond. This new edition focuses on emerging topics of vital importance in a world where ethical decisions by IT may literally be, in the extreme, matters of life and death.’
Barry Devlin, Founder, 9sight Consulting and author of Business unIntelligence

‘I can’t think of a subject more relevant than data ethics. Given that we live in a
data-dependent world, the most important question is not “Can I do something
with data?” but “Should I do something with data?”. These questions should be
considered by teens learning to code, business people gathering and exploiting
customer data, scientists developing and releasing AI applications, and anyone
creating and using data. Katherine O’Keefe and Daragh O Brien provide excellent
groundwork for addressing these questions and give us the tools to think and act
with our data in a responsible way. Read their book, share it and apply it!’
Danette McGilvray, President and Principal, Granite Falls Consulting and
author of Executing Data Quality Projects

‘Reading Data Ethics gave me goosebumps. Impeccably researched, it is the definitive work on the topic. Simultaneously confronting and enlightening, it
challenged my own ethical framework and validated the principles I hold dear in
my practice as a data governance executive. The foreword by John Ladley is
delightful and sets the scene perfectly for what is to follow. I look forward to our
DAMA community, both here in Australia and internationally, having the
opportunity to share their experiences after reading this outstanding book on
data ethics.’
Andrew Andrews, Data Governance Manager, ANZ Banking Group and Vice
President Marketing, DAMA International

‘Ethics play an increasingly important role when considering how to collect and
use personal information. This updated edition of Data Ethics clearly explains
how to take ethics seriously and make it an integral part of business information
management and governance. The combination of sound and up-to-date legal
theories with practical tips and case studies makes it a useful handbook for
anyone working with data on a regular basis. The only disadvantage is the
realization my own to-do list grew a little longer.’
Paul Breitbarth, Senior Visiting Fellow, European Centre on Privacy and
Cybersecurity, Maastricht University

Second edition

Data Ethics
Practical strategies for implementing
ethical information management and
governance

Katherine O’Keefe
Daragh O Brien

Publisher’s note
Every possible effort has been made to ensure that the information contained in this book is accurate at the time of going to press, and the publisher and authors cannot accept responsibility for any errors or omissions, however caused. No responsibility for loss or damage occasioned to any person acting, or refraining from action, as a result of the material in this publication can be accepted by the editor, the publisher or the authors.

First published in Great Britain and the United States as Ethical Data and Information Management:
Concepts, tools and methods in 2018 by Kogan Page Limited
Second edition published in 2023

Apart from any fair dealing for the purposes of research or private study, or criticism or review, as permitted under the Copyright, Designs and Patents Act 1988, this publication may only be reproduced, stored or transmitted, in any form or by any means, with the prior permission in writing of the publishers, or in the case of reprographic reproduction in accordance with the terms and licences issued by the CLA. Enquiries concerning reproduction outside these terms should be sent to the publishers at the undermentioned addresses:

2nd Floor, 45 Gee Street, London EC1V 3RS, United Kingdom
8 W 38th Street, Suite 902, New York, NY 10018, USA
4737/23 Ansari Road, Daryaganj, New Delhi 110002, India

www.koganpage.com

Kogan Page books are printed on paper from sustainable forests.

© Katherine O’Keefe and Daragh O Brien, 2018, 2023

The right of Katherine O’Keefe and Daragh O Brien to be identified as the authors of this work has
been asserted by them in accordance with the Copyright, Designs and Patents Act 1988.

ISBNs
Hardback 978 1 3986 1029 3
Paperback 978 1 3986 1027 9
Ebook 978 1 3986 1028 6

British Library Cataloguing-in-Publication Data


A CIP record for this book is available from the British Library.

Library of Congress Control Number


2023933664

Typeset by Integra Software Services, Pondicherry


Print production managed by Jellyfish
Printed and bound by CPI Group (UK) Ltd, Croydon, CR0 4YY

CONTENTS

Foreword x

Introduction: Why write a book on data ethics? 1

01 Ethics in the context of data management 21


What will we cover in this chapter? 21
Ethics and the evolution of technology 22
Privacy and the environment: two examples of ethical questions in data 25
The evolution of privacy and technology 26
Data ethics and environmental concerns 27
Volkswagen’s ‘dieselgate’ scandal 28
Other ethical dilemmas 35
The drive for effective ethics in information management 40
Chapter summary 42
Questions 43
Notes 43
Further reading 43
References 44

02 Introduction to ethical concepts and frameworks 46


What will we cover in this chapter? 46
Introduction 47
Ethical theories you should know about 50
Relational ethics 59
Current concepts in data ethics: data colonialism and ethics
washing 66
Common elements of ethical frameworks 72
Chapter summary 75
Questions 75
Notes 76
Further reading 76
References 77

03 Ethical principles, standards and practice 81


What will we cover in this chapter? 81
Principles 82
Developing ethical data principles 84
Looking at data ethics principles in practice 91
Seeking the ‘ethic of society’ – where to look for ethical inspiration 95
Conclusion 104
Chapter summary 104
Questions 104
Notes 105
References 105

04 Ethics, privacy and analytics 107


What will we cover in this chapter? 107
Analytics and data science: an ethical challenge 108
What is data analytics? 109
The ethical issues in data analytics 109
Chapter summary 133
Questions 133
Notes 134
Further reading 134
References 134

05 Ethics and data management (including AI) 139


What will we cover in this chapter? 139
Introduction 140
Introducing data management 141
Introducing data governance 144
Introducing data quality management 149
Introducing data modelling 153
Introducing data warehousing and business intelligence 156
Big Data, AI and machine learning 158
The role of data literacy and data acumen in data ethics 164
The ethical question of data-enabled actions 165
Chapter summary 167
Note 167
Further reading 168
References 169

06 Developing an ethical architecture for information management 171
What will we cover in this chapter? 171
Data ethics and information architecture – what is the connection? 172
Implementing ethical architecture: common challenges 195
Zachman, ethical architecture and the long-term vision 197
Chapter summary 198
Questions 198
Note 199
Further reading 199
References 199

07 Introducing the Ethical Enterprise Information Management (E2IM) framework 202
What will we cover in this chapter? 202
Building a model for ethical enterprise information management 203
Ethics and Big Data 221
Conclusion 225
Chapter summary 227
Questions 227
Further reading 227
References 228

08 Information ethics as an information quality system 230
What will we cover in this chapter? 230
Ethics as a quality system 231
Applying quality management principles to ethical information management 234
Chapter summary 264
Questions 265
Notes 265
Further reading 265
References 266

09 Information ethics and data governance 269


What will we cover in this chapter? 269
Introduction 270
How data governance supports ethics 271
Principles and modes of governance 273
Communication of ethical values and supporting ethical values through stewardship and governance 287
Using storytelling to communicate values 292
Data review boards 295
Change management 301
Chapter summary 311
Questions 312
Note 312
Further reading 312
References 312

10 Information ethics and risk: Tools and methods for identifying and managing ethical risk 315
What will we cover in this chapter? 315
Introduction 315
Looking for parallel models 318
Privacy by Design 319
Ethics by Design 320
Privacy engineering 322
The E2IM ethical impact assessment model 325
Chapter summary 344
Questions 344
Note 344
Further reading 345
References 346

11 Data ethics: the bigger picture 348


What will we cover in this chapter? 348
Introduction 348
Tackling the wicked problem 350
Beyond data literacy: data acumen 351
Educating the ethic of the individual 356

Educating the ethic of the organization 357
Educating the ethic of society 359
Educating to mitigate ‘ethics washing’ 360
Education on ethics 361
Conclusion 362
Chapter summary 363
Note 363
References 363

And in conclusion... 366

Index 371

FOREWORD

I am privileged to pen this foreword because: a) the authors have excellent taste and b) a little-known aspect of my intellectual life is a significant amount of academic work in ethics, religion and philosophy. I am also privileged because: 1) I have known Daragh and Katherine a long time and 2) I have deep respect for their work. There is some incredibly valuable material in these forthcoming pages.
Humanity’s track record is sketchy when it comes to doing new things. Events and movements that create enormous benefits often suffer from unintended consequences. We sing the praises of the computer and the internet, yet accept a plethora of risks, inconveniences and inefficiencies. Enormous sums of money change hands, yet many societies are starting to wonder ‘is this really all we are going to get?’
If we examine the Industrial Revolution, we see similar themes. Big shifts in economies. Wondrous inventions. And unintended consequences of slums, disease, and banking and social systems that took decades to appear remotely moral or equitable. Over time, humanity adjusted, and the actions of people changed. So did our educational and government policies.
As this foreword is being written, the collapse in data systems of Southwest
Airlines during a weather event combined with the Christmas holidays is
grabbing a lot of attention. On top of that, the morning news has articles
about Google being fined (again), Facebook paying a fine (again) and cyber
threats growing exponentially (still).
Is all this relevant to the topic of data? Of course. When you work in the
realm of data you work in this realm of huge potential, and huge risk. Does
this book help address the clash between benefits and risks stemming from
the realm of data? Yes.
This book addresses unintended consequences (ethics) around a huge
change in the human experience (data). There is excellent research here,
strong examples and excellent guidance. It is crucial that the reader take this
content to heart. This cannot be emphasized enough.
Early on, Daragh and Katherine mention that humanity has run into ethics and data repeatedly. This builds their strong case that sincere organizational efforts are required around ethics.

This is not a light topic. This is a serious and challenging effort. But
Daragh and Katherine are well-immersed and knowledgeable in this work
and have written another excellent edition of their ethics book.
I need to reinforce their discussion and urge the reader to embrace the
topic and the learning within, but not because ‘history is repeating’ or ‘this
is required for my organization’. The reader must be attentive because we
have entered the realm of anthropology.
Anthropology is the science of being human. Relating to the study of human behaviour from an environmental, biological and societal perspective, anthropology observes all aspects of the human experience. In other words, anything that is anthropological is PERSONAL. Given that data is one of those movements that affect the human experience, it too is personal.
Anthropology shows us that humans either adapt to massive shifts in
society or suffer from the failure to adapt. Our unintended consequences
from the rise of data are the tip of the iceberg.
When we, as members of humanity, deal with data in an organizational sense, we are compelled to behave. After all, this can affect career and livelihood – always a good motivator. When a business leader or politician says ‘data is important’, we easily move to the social or institutional behaviour.
But if this is all ‘anthropological’ then humanity needs to define the new
‘right things’ to do for individuals – at the workplace, and everywhere else.
What is the right thing to do with data? Rather than have a supervisor or
institution specify an institutional behaviour, maybe the discussion needs to
be reversed; what do you need to know about data behaviour before you
even start the job?
At the end of the day, when humans do new things, and then these new
things go off the rails, we usually find an answer in human behaviour. Data,
as a subject, is square in the centre of humans adapting to technology.
Daragh and Katherine provide excellent examples, justifications and solutions.
The neat thing about anthropology is it is well studied. There are patterns
we can use to our advantage.
Organizations need to teach that they and their employees operate a data
supply chain with far-reaching consequences. The young child needs to learn
that they generate data, and not share personal data, while at the same time
needing to know they need to share toys and kindness. The high-school
student needs to learn to be more judicious before downloading the new
app. Advanced institutions of learning need to offer and require dedicated
classes in organizations, data and ethics.

Is this another fundamental change in how we treat fellow humans? I believe this to be true, but time will tell. Perhaps, like the Industrial Revolution, social norms should be viewed as subject to upheaval. For sure, what is in this book needs to appear well before professional exposure.
For most organizations, the reactions to date towards data ethics have
been externally driven (GDPR, CCPA, etc), i.e. responses that apply to bad
actors and organizations that have not considered an ethical context as they
race to cash in on data. Data conferences and forums feature topics on ‘How
to monetize data’ or ‘How to create products from data’. I do not see many
topics of ‘Should we create products with our customer’s data?’ We have not
scratched the surface as to what to do proactively.
Data is an anthropological issue. The economic doctrines of land, labour
and capital now have a new friend: data. We are ill-equipped for this new age.
There is a way to go but we are in good hands with this book. Daragh
and Katherine provide a solid foundation.

John Ladley

Introduction
Why write a book on data ethics?

What will we cover in this chapter?


In this chapter we introduce why data and information ethics are becoming
a critical issue for business, society and for all of us as individuals. We think
about topics such as:

● What are the implications of our data-gathering tools and technologies from an ethical perspective?
● What types of ethical dilemma might you face as your organization looks to become more ‘data driven’ and harness the various new and emerging sources of data and methods of data capture?
● Given that law makers and legislation increasingly lag behind the pace of technological innovation, is there value in taking a ‘principles approach’ to ethics for information management rather than waiting to be told what is the wrong thing to do?
● What is the importance of consciously assessing the trade-offs and balances that need to be struck in the adoption and development of new technologies and methods for putting information to use in organizations and society?

Introduction
We live in interesting times. The pace of innovation and development in various fields of information management and information technology continues to accelerate, with functionality and features common today that would not have appeared out of place in science-fiction movies of even a few years ago. We have been gathering and recording information in written and pictographic forms for around 5,000 years. Cuneiform texts from ancient Mesopotamia are among the oldest evidence of recorded history.
The advent of modern technologies means we are now recording, in a year, more information than we have recorded in all of the preceding history of humankind. The pace of logging, recording and cataloguing of information continues to accelerate as our technical capabilities evolve. The challenge we now face is whether our love affair with technology and technological innovation may have left us ill-prepared for the various ethical and moral issues that the uses of that technology increasingly throw at us on a day-to-day basis.
In the years since the first edition was published, there have been many developments, so while the time for a new edition was right, we also wanted to make sure the title aligns with the content, which we feel is achieved with ‘Data Ethics’. The purpose of this book is to explore whether the fundamental ethical challenges we face are new variants of issues we have struggled with in all cultures for many thousands of years. While our technology capabilities advance at a phenomenal rate, the solutions needed to put ethical principles into practice might be found in the fundamental disciplines of information management and how we manage and lead our information organizations.
In the past few years, we’ve seen both a lot of rapid change in the fields related to data and information ethics and, at the same time, less change than hoped. In 2018, we were seeing a groundswell in awareness of the need for data ethics. As we were finalizing proofs for the first edition of our book, scenarios that we had hypothesized might be the case from smaller and less-publicized whistleblowing were splashed across the headlines as major international scandals, causing crises in multiple governments and multinational organizations. Between 2018 and 2023, some events have brought the importance of data ethics to very public attention and there has been some movement towards attempting to put teeth into ethical standards at national and supranational levels. The European Union took up the challenge to promote ethical AI put to them by the former European Data Protection Commissioner and published guidelines for ethical AI, following up with a move towards creating an EU Regulation for ethical AI.
At a national level, some countries have looked to implement and support ethical frameworks for data use at different levels. Two examples we are personally familiar with are Scotland and Ghana: in 2020, Scotland created a National Expert Group and engaged in a series of public consultations as part of publishing a report on an ethical digital society to inform government decision making around data ethics. Organizations like the UN Global Pulse have engaged with countries in developing economies to help accelerate the development of regulatory frameworks for ethics in artificial intelligence and related fields through consultation and expert input.
The advent of a global pandemic put a spotlight both on how valuable for the common good our data capabilities can be and on the dangers of ‘tech solutionist’ thinking and the risks of surveillance, data misuse, and misinformation. Data dashboards became a daily reference in the personal lives of many, and the complexities of data quality and public health surveillance statistics became part of highly politicized public debate. ‘Proxy metrics’ became a news item as, in the absence of testing data, people charted indications of the prevalence of coronavirus through buyer reviews of scented candles. Responses to the pandemic drove home how vital transparency and trustworthiness are to public adoption of innovations with data and technology and ‘data-driven’ decision making, and the necessity of broad public education in how to communicate and interpret data and statistics and identify misuse of data. A sound ethical foundation and transparency around the governance of data collection and use is a key enabler for the adoption of successful projects, and for people to be willing to allow the collection and use of data.

The tools of data gathering


It is worth remembering that the Apple iPhone was introduced barely 15 years ago, but has triggered a revolution in hand-held computing that has placed the Apple-device ecosystem at the centre of a revolution in data collection through apps and device capabilities. Far from being simply a communications device, the smartphone has developed into a powerful personal computing platform, with increasingly powerful sensors either built into it directly or connecting to it via Bluetooth connections.
Google began as a search engine barely two decades ago, but it has grown into a data management behemoth, developing everything from email services to operating systems to home automation systems. Again, this places Google at the centre of our lives and gives it an unrivalled capability to gather or generate data about us and our interactions with the world around us.

These companies are not alone. Facebook has created a phenomenally powerful tool for connecting with friends and family. This was supplemented with its 2014 acquisition of WhatsApp, which gives it information about who is messaging whom, helping to develop a very detailed map of your personal connections and interactions – even if your family members or friends are not using Facebook themselves.
Other innovative companies gather and use information in a variety of ways. Some piggyback on the existing technology platforms such as smartphones through the development of new software applications. These applications are not limited to social networking applications. For example, there are smartphone applications using the sensors already built into the phones themselves to support agriculture (Pongnumkul, Chaovalit and Surasvadi, 2015). Smartphone apps have been created that use the camera on the phone to help measure the chlorophyll in plants. This is a good indirect indicator of the health of food crops such as rice.
Innovative methods for gathering and analysing data are emerging as researchers and companies develop or deploy new sensor technologies that transmit their data to online applications or smartphone applications. From the humble fitness tracker to wearable cameras to telematics devices in cars, to glasses frames capable of recording video and still images and uploading them to the internet – e.g. Google’s Google Glass (Wikipedia, nda), Snapchat’s Spectacles (Lekach, 2016) or Meta’s more recent collaboration with Ray-Ban, ‘Ray-Ban Stories’ – the modern information-gathering and information management ecosystem is increasingly complex and powerful.
The data generated or derived from mobile devices or other sensor technologies that we are using can increasingly be used for a range of things, from keeping semi-automated diaries of various aspects of our personal lives or health – known as ‘Lifelogging’ (Lifestream Blog, 2011) or the ‘Quantified Self’ (Wikipedia, ndb) – to tracking traffic bottlenecks on motorways, or the number of people passing by a particular place on a city street. The deployment of powerful data-gathering capabilities is now substantially cheaper and less invasive than it would have been even a few years ago.
To put this in context, when Daragh was a student in the 1990s he helped
a friend of his in the Faculty of Medicine with a term-paper experiment and
had to wear a heart-rate monitor for a few days. The device was the size of
three or four paperback books stacked on each other and had to be worn
strapped to his chest day and night. The monitor had to be given to one of
the local hospitals to be downloaded and analysed.

Today, as he writes this, he is wearing a wristband fitness tracker that sends data about his heart rate, movement and activity levels to his smartphone, integrating via an app on that device. Daragh could also choose to use an app to track his sleep patterns, using the microphone on his phone to record his breathing patterns and incorporating the heart-rate monitoring from his wrist-worn fitness tracker.
Rather than having to struggle with a bulky recording device, he uses two lightweight gadgets and some low-cost software running on his phone to record the same type of data his med-student friend was logging 20 years ago. Rather than having to wait a few hours or days for the data to be analysed and presented, when he wakes up Daragh can look at some graphical charts on his phone that tell him how well he has slept and what his heart rate is at any time of the day.
However, unlike the 1990s, when the data on Daragh’s heart-rate monitor was accessed only by the laboratory staff in the hospital and his med-student friend, today his data is exposed to a wide range of potential third parties: the app developers, the fitness tracker manufacturer, the sleep-tracking application developer, the manufacturer of his smartphone and its operating system and potentially other third parties that his data has been shared with or sold to.

The tools of data analytics


In parallel with the tools and technologies for gathering information, we
have witnessed a substantial increase in the ability of organizations, and
indeed individuals, to analyse data. The emergence of big data and data
management tools and technologies to support the analysis of large and
varied data sets has given rise to the emergence of job descriptions such as
‘Data Scientist’ in organizations.
In 2012 ‘Data Scientist’ was proclaimed the sexiest job of the 21st century by the Harvard Business Review (Davenport and Patil, 2012). Not unlike the smartphone, the term ‘data science’ has only been around for about 20 years (Cao, 2016). As a term, it is still evolving and includes a spectrum of technical disciplines over and above traditional statistical analysis or database querying.
Already, the field includes domains such as artificial intelligence (AI), natural language processing (the use of artificial intelligence to process text to figure out patterns or infer meaning), machine learning (the use of AI to figure out more complex problems based on a function that learns the parameters of the required outcome from the data available), and deep learning (the development of mathematical models that break complex analysis into simpler discrete blocks that can in turn be adjusted to better predict final outcomes from the AI process).
These tools, and the technology platforms they run on, are increasingly
powerful. Just as the world of data management software has evolved in
recent years to develop more powerful software tools and platforms for data
analytics and visualization, hardware manufacturers are now beginning to
develop processing chips for the next generation of smartphones, computers
and Internet of Things (IoT) devices to allow complex AI functions to be
deployed ‘on device’, rather than relying on servers in data centres or hosted
in cloud-based environments.
The ‘Quantified Self’ movement is a good example of the development of our data-gathering and analytics capabilities over the past few years. Twenty years ago, if we were tracking our exercise routines we would have used a range of independent technologies such as stopwatches and heart-rate monitors and would have recorded our progress and performance indicators in a physical notebook (because our computers were too big to bring to the gym or too heavy to bring on a run). We might have kept a food diary in the notebook as well. We might have manually tracked historic trends or used group activities to compare our progress against a representative sample of our peers.
Today, we wear lightweight fitness trackers that also track our location and movement, recording our exercise performance in terms of distance, effort and other performance indicators. We might log the food we are eating by taking photographs of our meals instead of writing in a notebook. Further logging of our activities and actions happens automatically through our wearable technologies. We track our performance against peers through pooled data that is shared via our applications.
Increasingly, our software tools can infer the calorie and nutrient content of food we eat based on a machine-learning analysis of a photograph of our meals. AI and analytics can enable the automatic tailoring of our exercise regimes to our fitness levels, our ability and our progress. The same technologies can also predict health issues that might arise based on the data they ingest about us. Add to the mix more specialized additional technologies to read blood sugar, blood pressure or other aspects of physical health, and our simple smartphone is the hub of a device that increasingly resembles the tricorder in Star Trek.

Of course, this data is useful and valuable. While the pencil-and-paper


lifeloggers of old may not have sat recording the speeds they drove at when
travelling from A to B, today’s lifeloggers are using the incredible informa-
tion-processing power of technologies that have relatively recently entered
the consumer space to record increasingly granular details about themselves,
which can then be made available to third parties such as insurance compa-
nies. Governments can use this data for planning health and social policy.
Insurers can use it at a macro level to refine their risk models, and at a micro
level to target and tailor the policies that they offer to us. Marketers can use
the data to identify whether we would be receptive to different methods of
advertising for the products they are promoting, and when we would be
most susceptible to being influenced by such marketing.
The tools of analysis, and the abundance of data sources that are now
available to organizations (and to individuals), create an environment that
is rich with potential and ripe with opportunity to develop insights into the
world at a level of detail and at a level of cost that previous generations
could only have dreamt about. However, with this potential we begin to see
the re-emergence of a perception of data and analytics as a panacea for all
ills, as if access to the right information will magically unlock the strategic
vision for a government or organization, or trigger a miraculous improve-
ment in physical well-being among fitness-tracking loggers of all of life’s
activities.
For example, the Samaritans in the United Kingdom do phenomenal
work supporting people who are stressed and depressed and helping them
find alternative methods to deal with their mental state. One of the methods
they deployed in 2014 was ‘Samaritans’ Radar’, an app that allowed you to
put the Twitter handle of a friend or family member into it and it would then
conduct sentiment and mood analysis on the person’s tweets to help identify
if they were in an up or down state of mind. All of this happened without
the knowledge or consent of the selected individual whose posts were being
analysed.
This is an innovative use of AI, machine learning and text analytics to
help people know when they need to reach out to offer support to a friend,
family member or colleague. However, the exact same tool, in the hands of
a malicious actor or bully, could be used to target harassing messages at a
vulnerable person more precisely and track the effectiveness of a harassment
campaign. Furthermore, the UK Information Commissioner’s Office ex-
pressed concerns that the application was, by its very nature, processing
sensitive categories of personal data relating to the physical or mental health
of individuals without their knowledge or consent (Orme, 2014). At the
time, the National Association of Data Protection Officers in the UK also
expressed concerns (Lee, 2014).
In this respect, we risk repeating the errors of previous generations when
faced with the potential for technology to automate the processing of data.
The Vietnam war was one of the first heavily analytics-driven wars in the era
of modern computing. From the Secretary of Defense Robert McNamara
down, there was an emphasis on and an obsession with quantitative analyt-
ics as a way of gauging the success or failure of operations. From not recog-
nizing the inability of computers of the time to measure intangible factors to
not factoring in the quality of the input data, horrendous decisions impact-
ing on the lives of people for many generations were made – because the
data said it was the right decision.
A key missing link in our rationalization and analysis of technology is the
motives of both the person creating the data and the person seeking to use
the data. In the Vietnam war, McNamara and the US military failed to consider
the motivation of the chain of command to filter information upwards, giv-
ing the ‘brass’ the answers they wanted, not the insights they needed.
McNamara should have been warned of this – when he was with Ford
Motor Company he had implemented a rigorous metrics-driven approach
to management that resulted in production staff simply dumping parts into
the river so they could work around metrics-driven rules on when new car
models could be introduced (Cukier and Mayer-Schönberger, 2013). Today,
we give fake names and email addresses to websites when registering for
services, or take other steps to control how our data is accessed or used by
others, such as using virtual private network (VPN) services or ad blockers
when surfing the internet. The organizations we interact with analyse and
use our data for their own purposes. The clarity and transparency of those
purposes to us as individuals or consumers is often lacking. Or the process-
ing may be essentially invisible to us, carried out below the surface of our
interactions with companies, products, services or government agencies for
aims and objectives that are unstated or unclear.
Of course, while we may not have learnt that information technology is
not always the panacea for our problems, it has evolved to a point where the
data gathered and processed by social media networks and other new tech-
nology platforms has the potential to influence people’s moods (Arthur,
2014), the information that they are presented with (Pariser, 2011) and the
decisions that they take (Samson, 2016). The emergence of psychographic
targeting, a technology that requires the use of AI to analyse and filter vast
amounts of data about people from social media, has led to what the Online
Privacy Foundation has referred to as ‘the weaponized, artificially intelli-
gent, propaganda machine’ (Revell, 2017). By gathering data in contexts
where people are less likely to be misleading or provide inaccurate informa-
tion or take other steps to mask or protect their data from being gathered or
measured, and by combining data across multiple data sets and data points,
it has apparently become possible to manipulate and undermine a democ-
racy without firing a single shot (Neudert, 2017).

With great power comes great responsibility


The issues raised above highlight one of the key challenges we face as the
potential and power of information management in our lives becomes more
pervasive, and one of the key reasons that we decided to write this book.
The pace of development of data-processing technology has been such in
recent years that we have scarcely had a chance as a society to draw breath
and consider the implications of the technologies we are developing and
deploying. Innovations make it from the research lab to the consumer
shelves in a fraction of the time a manufactured product might have taken in previous generations. Technologies that influence the mood, mental state or
behaviour of people are capable of being deployed, in many cases, with
minimal cost and negligible testing or oversight.
Of course, books take time to research and write. When we started on
this process with the first edition of this book, there were very few publica-
tions of any kind in the area. A PhD candidate conducting research would
have had a very short literature review in the context of information ethics.
Beyond some key pioneering research, including work in academia by
Luciano Floridi at Oxford University (Google Scholar, nd) and in industry
by Gartner Group (Buytendijk, 2015), and the under-recognized work of
several other academic and industry researchers and practitioners, this was
a niche area. We fully expect this to change rapidly over the coming years,
but a critical challenge will be to translate the discussion of information eth-
ics from abstract concepts to tangible methods and practices.
The emergence of a regulatory focus on information ethics from the
European Data Protection Supervisor in late 2015 (European Data
Protection Supervisor, 2015) is, in our view, a sea-change moment. Since
2016 there has been a growing stream of commentary in media, discussion
in standards bodies, and debate among practitioners. Discussion has ranged
from the security of IoT devices and whether it is ethical to go to market
with an unsecure technology (Wadell, 2017), to the use of data to influence
elections and behaviour (Helbing et al, 2017), to the ethical issues in AI.
At the heart of this evolution in focus on ethics in information manage-
ment is the realization, however trite it may sound, that ‘with great power
comes great responsibility’. The origin of this quote is traced variously to
Voltaire, the French Revolution, Winston Churchill or the pages of Marvel
comics. But, regardless of the origin, the sentiment is clear – we cannot
lightly introduce powerful technologies that have the potential to deliver
significant benefits to individuals or to society but equally have the potential
to inflict great harm. The complication we face in the Information Age is
that a failure to implement technologies with the appropriate balances in
place (e.g. easy-to-configure security in IoT devices, appropriate governance
in analytics planning and execution) has the potential to affect many thou-
sands, if not millions of people, directly or indirectly, before remedies can be
put in place. The alleged manipulation of voters through social media dur-
ing the UK Brexit referendum and the 2016 US presidential election is a
telling example of this (Novi, 2017).
The core issues to be addressed, however, go far deeper than the superfi-
cial issues of technology implementations and require us to consider the
question of what type of society we wish our technologies to enable, and
what controls or constraints we might wish to apply to the applications of
technologies and the development of such capabilities.
But we have been here before. Every generation has faced technological
changes or drives for social improvement through new technologies or in-
dustries that have promised huge potential. From the printing press, to the
development of the factory in the Industrial Revolution, to the development
of new medicines or medical procedures, industry after industry has had to
face ethical challenges. Often these have been addressed through regulatory
measures. For example, Lord Shaftesbury and others pioneered the Factory
Acts in the United Kingdom during the Industrial Revolution, which cur-
tailed the use of child labour and set minimum standards for schooling for
children working in factories. In other cases, industry regulators or over-
sight bodies act, such as in the case of the Tuskegee syphilis experiments
conducted in Alabama from 1932 to 1972, in which treatment for syphilis
was intentionally withheld from African American sharecroppers in rural
Alabama. This study, which we will look at in a later chapter of this book,
resulted in the publication of the Belmont report that established ethical
guidelines for biomedical and behavioural research.
As information management practitioners living through this informa-
tion revolution, increasingly we find ourselves facing a variety of dilemmas
and difficult decisions. These arise through the potential capabilities of the
tools, technologies and data sets that are, at least in theory, available to us.

The data-driven dilemma


The ‘data-driven dilemma’ is the label that Daragh applies to the position
that information management professionals and data analysts increasingly
find themselves in, given the potential analytics capabilities of software tools
and the increasing richness of the ‘data exhaust’ left behind by people as
they use common tools and services such as mobile phones.
For example, imagine for a moment you are tasked with developing sta-
tistical reports to support government investment in tourism policy. The
recommendations you make will have potentially significant impacts on the
livelihoods of many tourist businesses in remote areas of the country. You
know that mobile-phone operators have all the data you need regarding the
movements of visitors to the country who have mobile phones from other
operators in their home countries and are roaming on mobile networks in
your country.
However, you will also capture data about business travellers and diplomats. People have no mechanism for opting out. The data will not only show
you where each person has been travelling, but who was with them (or
rather, what devices might ultimately be linked to a person). The available
data is far beyond what you need for your stated analytics purpose. You will
also identify at a very granular level of detail the holiday activities of travel-
lers, including how long they spend at locations, and how quickly they get
from point A to point B when travelling. You will also get data about people
in your country and where they go (or where their devices go) on holidays.
This data is generated as a by-product of travellers using mobile-phone networks
in your country. It allows your analysis to be done quickly and at almost
negligible cost once you have the data, particularly when compared with the
cost and error rates inherent in doing surveys of travellers. Should you move
heaven and earth to get the mobile-phone network data? What are the other
implications and risks associated with this processing that might give rise to
ethical or legal issues? Should you seek forgiveness rather than permission?
All too often, particularly in the context of big data and analytics pro-
cesses, we can be faced with a significant short-term win or benefit from the
use of data in new or novel ways. Often there is a potential benefit to society
arising from the processing. Sometimes that societal benefit is substantial.
However, often the impact on individuals and the choices they might make
or the freedoms they may otherwise enjoy can be disproportionate to the
benefit to society. In these contexts, the data-driven dilemma is one of determining, even if we can do something fancy with data or data-related technologies, whether we should.

The innovator’s dilemma


Linked to the data-driven dilemma is the dilemma we face as the pace of technological change accelerates. The perception is that legislation is lagging further behind the pace of change. For example, the European Union's
General Data Protection Regulation (GDPR) came into force in 2018, a full
23 years after the original Data Protection Directive was enacted – a Directive that itself dates from roughly the same era as Sir Tim Berners-Lee's development of the World Wide Web.
Even within defined legal frameworks such as the GDPR, there is poten-
tially a wide range of ethical choices to make within the formal parameters
of what is legally acceptable. For example, the concept of ‘legitimate inter-
ests of the data controller’ as a basis for processing personal data of indi-
viduals under the GDPR requires decision makers in organizations to assess
the balance between the competing rights and interests of the affected data
subjects and the organization.
In this context, we increasingly find ourselves having to make evaluations
of the ethics of a course of action or a proposed application of technology
in information management. But traditional computer science degrees, busi-
ness management or law degrees leave us significantly ill-prepared for this
type of analysis. In addition, our standard business or technology innovation or implementation processes often do not allow for the consideration time necessary to reflect on ethics as an explicit quality characteristic
of our product or service development.
As innovation does not exist in a vacuum, we must also consider the organizational culture and governance frameworks we operate within and how
they can support or undermine ethical considerations. Increasingly the articu-
lation of these governance approaches and the rigour with which organiza-
tions embrace and apply ethics and ethical principles in their information man-
agement practices will become a source of competitive advantage, both when
when seeking customers and when attracting and hiring employees (Jenkin, 2015).
Many organizations are attempting to innovate in this area through the
introduction of ethics forums or taking part in public ethics discussion groups
or conferences. Increasingly, organizations are turning to standards bodies
such as the Institute of Electrical and Electronics Engineers Standards
Association (IEEE) to define and develop standards for information ethics.
But standards frameworks can only really advise on what your data ethics should be; they can only provide guidance on the types of ethical decisions you should be taking. Likewise, an internal data ethics forum
that is not linked to some form of operational governance will be unable to
manifest any sustainable change in information management practices.

Societal value versus individual rights – the root of our own dilemmas
The common thread between the data-driven dilemma and the innovator's dilemma is one of social value versus individual rights or agency. To
paraphrase the EU’s GDPR, the management and processing of information
must be designed for the benefit of humanity. The challenge for the informa-
tion management professional is to understand how to make the trade-offs
between competing value and benefits to people arising from processing.
If processing is significantly invasive or harmful to the rights or freedoms
of individuals but delivers an equally significant social benefit, it may well be
that there is a valid ethical trade-off to be made. If, however, the processing is
invasive to the individual but of limited value to society then the trade-off is
less compelling. Also, even in scenarios where the trade-off is skewed against
the individual, organizations might be able to take some action to redress that
balance through education, communication or other mechanisms.
Societal and social value needs to include or factor in the value to the
individual as well. Something may not improve the greater good, but may be
a benefit to individuals and support, or impinge on, their dignity or their
ability to exercise other rights. For example, CCTV may have a social ben-
efit in supporting the detection of crime. However, too much CCTV, or
CCTV that overlooks private as well as public locations, can be highly inva-
sive. It impacts the right to privacy of individuals on a mass scale, and could
affect the choices people make about their movements (impacting freedom
of movement) or who they meet where (impacting rights of assembly).
The type of trade-off we see in Figure 0.1 is at the heart of the balancing test
required under EU data protection laws when organizations are seeking to rely
on the fact that something is in their legitimate interests. It may be in the le-
gitimate interest of the organization, but that interest needs to be balanced
against the rights and interests of the individuals affected by the processing. It
also arises in other countries in the context of privacy impact assessments,
where many countries now require consideration of the impact of data pro-
cessing on the choices people might make about interacting with services, par-
ticularly services offered by government agencies.
This type of trade-off and balancing decision is also at the heart of many
of the ethics frameworks that govern professions such as lawyers or doctors.
For example, the lawyer’s professional duty of confidentiality has high value
to society, as without it people who need legal advice or representation
might be afraid to seek it out (American Bar Association, 2020; Law Society
of Ireland, 2013). Likewise, medical practitioners, psychologists or counsel-
lors all operate under an ethical presumption of confidentiality. This has
social value and minimizes the invasiveness or intrusion into the personal
and private life of the patient or others (American Medical Association,
2016; Psychological Society of Ireland, 2019). However, these ethical duties
can be overruled where there is a wider societal issue or where the disclosure
is in the interests of the individual.

Figure 0.1 The value/invasiveness matrix

(Vertical axis: social/societal value. Horizontal axis: individual impact/invasiveness.)

High value/low invasiveness:
●● What other ethical issues might arise in this processing?
●● If there is minimal impact on the individual or on society, what other constraints might arise?

High value/high invasiveness:
●● Is this processing we should be doing?
●● What controls should we be putting in place to mitigate risks?

Low value/low invasiveness:
●● What is the value of this processing?
●● Is this data processing an administrative function?

Low value/high invasiveness:
●● What is the value of this processing?
●● Is this data processing an administrative function?
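For practitioners who want to operationalize the matrix, its quadrant prompts can be sketched in code. This is purely an illustrative sketch of our own – the function name, the 'high'/'low' ratings and the lookup structure are not from the book – that maps a rating on each axis of Figure 0.1 to the questions in the corresponding quadrant:

```python
# Illustrative sketch only: encodes the Figure 0.1 quadrants as a simple
# triage helper. The function name and axis ratings are our own invention,
# not a method prescribed by the book.

def triage_questions(societal_value: str, invasiveness: str) -> list:
    """Return the Figure 0.1 prompt questions for a proposed processing
    activity, given 'high' or 'low' ratings on each axis."""
    quadrants = {
        ("high", "low"): [
            "What other ethical issues might arise in this processing?",
            "If there is minimal impact on the individual or on society, "
            "what other constraints might arise?",
        ],
        ("high", "high"): [
            "Is this processing we should be doing?",
            "What controls should we be putting in place to mitigate risks?",
        ],
        ("low", "low"): [
            "What is the value of this processing?",
            "Is this data processing an administrative function?",
        ],
        ("low", "high"): [
            "What is the value of this processing?",
            "Is this data processing an administrative function?",
        ],
    }
    return quadrants[(societal_value, invasiveness)]

# Example: highly valuable but highly invasive processing prompts the
# "should we be doing this at all?" conversation first.
print(triage_questions("high", "high")[0])
```

A real triage process would of course involve judgement and discussion rather than a lookup table; the point is only that the matrix gives a repeatable starting set of questions for any proposed processing activity.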
For example, many countries now require doctors, lawyers or psycholo-


gists to notify law enforcement where there are indications of child sexual
abuse in a matter they are dealing with. The ethical juggling act that must be
performed is unenviable – providing sufficient information to the authorities
to investigate while protecting the confidence of their client or patient.
Similarly, the disclosure of information for the purposes of seeking advice
on a matter, or a consultation on a case where this is in the interests of the
client or patient, also requires the balancing of the amount of information
disclosed (invasiveness) with the benefit that is being sought for the indi-
vidual (societal or individual value).
In mature professions that have struggled with ethics issues since their
inception, and whose practitioners have made some horrific errors of ethical
judgement over the years (we will read more about some of these later), the
ethical questions continue to be a constant and pervasive challenge on a day-
to-day basis. In information management we have been shielded, to an ex-
tent, from these complexities by the ethical codes and frameworks of the in-
dustries for whom we were implementing information management systems.
However, as technology for data gathering becomes more pervasive and
enters our pockets, our classrooms, our living rooms and our bedrooms, as
the volume of data we are recording about people and events continues to
grow, and as the technical capabilities to analyse and process that data be-
come both more accessible (low-cost analytics tools and cloud-based tech-
nologies) and more autonomous (machine learning and artificial intelligence),
the need for a maturing of ethical practices in information management is
more pressing than ever.

Introducing the rest of the book


The rest of this book is divided roughly into two thematic parts. The first part
is principles-focused and gives a crash course in the basics of ethical principles
and the types of ethical issues that modern information management profes-
sionals face. The second part focuses on tools and practices, looking at some
of the existing good practices in information management and exploring how
they can be applied to support ethical information management.
Our hope is that readers will develop an understanding of what principles we need to be implementing as part of an ethical enterprise information management philosophy, and will also become familiar with approaches to integrating ethical information management into their existing information management practices.
A note on the use of the terms ‘data’ and ‘information’ in this book
The first edition of this book was titled Ethical Data and Information
Management, and throughout this book we use the terms ‘data’ and ‘information’ somewhat interchangeably. Depending on the sector you are coming to this book from, this usage may run up against a desire for differentiation.
We recognize that there can be complexity in the usage of these two words,
but generally at the levels we are approaching both data and information in
this book, it is a distinction without a difference. A common industry de-
scription of the relation of data and information is that ‘information is data
in context’. This description is embedded in a taxonomic hierarchy of ‘data’,
‘information’, ‘knowledge’ and ‘wisdom’, and ‘data ethics’ or ‘information
ethics’ would be important for ‘wisdom’ in that taxonomy.
Regarding the differentiation between information ethics and data ethics
as philosophical sub-disciplines of applied ethics, we are of the opinion that
the differentiations between data ethics and information ethics (and other
related sub-disciplines such as computer ethics, digital ethics and AI ethics)
are more related to waves of academic interest and the specific ethical ques-
tions relating to the use of data or information at a specific time and place
than useful distinctions for practitioners. As you move beyond this book, we
highly recommend you take a holistic view of data ethics, information ethics,
computer ethics, digital ethics, technology ethics and other areas of applied
ethics you may find relevant to how you are approaching the use of data and
its potential impacts. It’s all data, and as soon as you have some context for
that data, it’s all information. What matters is the ethics.

Chapter summary

In this chapter we started to set out the reasons why it is increasingly
important for information management professionals to become more
‘ethically aware’ and what that might actually mean for our methods and
practices for managing information. Key topics we addressed include:

●● The issues, risks and potential benefits that are presented by our
increasingly pervasive tools and technologies for data capture and data
analysis, much of which has evolved rapidly over the last decade.
●● The importance of responsibility of action in the context of these
increasingly powerful data-processing technologies and the need for a
maturing in our approach to identifying and addressing the ethical issues
they give rise to.
●● The potential for information management as a profession to learn from
the successes and failures of other professional disciplines, such as
medicine, in defining methods and practices for addressing their ethical
challenges.
●● The importance of consciously and deliberatively considering the ethical
trade-offs that arise in the development of new technologies for data
capture and processing, particularly as information management matures
into a professional discipline in its own right and our technical
capabilities begin more and more to drive innovation in other areas,
rather than technology innovation being driven by the needs of disciplines
with existing models of ethical governance.

Questions
Throughout the book we will end each chapter with some questions and
thoughts for practitioners and students. These are intended to trigger in-
trospective learning and may not have an answer. In this chapter, we start
off easily:

1 Why is our approach to data ethics so flawed?
2 What kinds of data-processing activity are you or your organization
engaging in that raises ethical concerns or questions?
3 How would you know if proposed processing of data raised ethical
concerns?
4 If something in your organization’s approach to managing data raised an
ethical concern for you, how would you express that and who would you
express it to?
Further reading
In this section of each chapter we provide some hints for other related reading you
might want to consider relevant to the chapter. This will be in addition to the
references for each chapter. For this introductory chapter, we would suggest the
following further reading:
Floridi, L (2014) The Fourth Revolution: How the infosphere is reshaping human
reality, Oxford University Press, Oxford
Hasselbalch, G and Tranberg, P (2016) Data Ethics: The new competitive
advantage, PubliShare, Copenhagen

References
American Bar Association (2020) Rule 1.6 Confidentiality of Information: Client–
Lawyer Relationship, Model Rules of Professional Conduct, www.americanbar.
org/groups/professional_responsibility/publications/model_rules_of_professional_
conduct/rule_1_6_confidentiality_of_information.html (archived at https://perma.
cc/N8A7-BZNC)
American Medical Association (2016) Chapter 3: Opinions on privacy,
confidentiality & medical records, AMA Principles of Medical Ethics,
www.ama-assn.org/sites/default/files/media-browser/code-of-medical-ethics-
chapter-3.pdf (archived at https://perma.cc/7XBB-V6GL)
Arthur, C (2014) Facebook emotion study breached ethical guidelines, researchers
say, The Guardian, 30 June, www.theguardian.com/technology/2014/jun/30/
facebook-emotion-study-breached-ethical-guidelines-researchers-say (archived
at https://perma.cc/NW2W-EXFQ)
Buytendijk, F (2015) Think about digital ethics within continually evolving
boundaries, Gartner, 1 April, www.gartner.com/smarterwithgartner/think-about-
digital-ethics-within-continually-evolving-boundaries/ (archived at https://
perma.cc/LU6G-XH23)
Cao, L (2016) Data science and analytics: a new era, International Journal of Data
Science and Analytics, 1 (1), 1–2
Cukier, K and Mayer-Schönberger, V (2013) The dictatorship of data, MIT
Technology Review, 31 May, www.technologyreview.com/s/514591/the-
dictatorship-of-data/ (archived at https://perma.cc/5DXM-4KHF)
Davenport, T H and Patil, D (2012) Data Scientist: The sexiest job of the 21st
century, Harvard Business Review, October, hbr.org/2012/10/data-scientist-
the-sexiest-job-of-the-21st-century (archived at https://perma.cc/9VA4-2W3P)
European Data Protection Supervisor (2015) Towards a New Digital Ethics: Data,
dignity, and technology, 11 September, edps.europa.eu/sites/edp/files/
publication/15-09-11_data_ethics_en.pdf (archived at https://perma.cc/W7RA-
XUDV)
Google Scholar (nd) Luciano Floridi, scholar.google.com/citations?user=
jZdTOaoAAAAJ (archived at https://perma.cc/4L8Y-F748)
Helbing, D et al (2017) [accessed 1 August 2017] Will democracy survive
01 Ethics in the context of data management
What will we cover in this chapter?
In this chapter you will:

● Develop an understanding of how ethics and technology have evolved in tandem since the earliest days of human thought.
● Explore why it is important for data management professionals to have a grounding in fundamental concepts of ethics, particularly in the context of modern information management capabilities.
● Be introduced to some of the ethical discussions around technology and the responsibilities of the technologist when developing or integrating technologies.
● Explore data privacy as an example of data ethics in action.
● Explore questions of the environmental impacts of data processing as an example of data ethics.
● Be introduced to some fundamental ethical questions that you can ask when planning or performing any data-processing activity.
● Develop an understanding of the emerging importance of information ethics in the context of the regulatory oversight of data management.
22 Data Ethics

Ethics and the evolution of technology


Ethical issues relating to the application of data management technologies
and practices are gaining more prominence in mainstream news and politi-
cal discussion. There are calls for the development of new ethical frame-
works and approaches to help technologists deal with the complexity of the
issues that are raised, and in some jurisdictions around the world there has
been movement to formalize and even bring in regulation for data ethics in
some form at national and supranational levels. Computer science and busi-
ness management courses may teach modules on ethics, but often these
courses fail to provide the tools necessary for information management pro-
fessionals to work from first principles and apply robust ethical concepts in
the execution of our day-to-day roles. This can result in the ‘law of unintended consequences’ being invoked, with technologies or analytics capabilities being deployed in the real world without an appropriate analysis of the ethical implications or impacts of the processing activity in question.
The tools and technologies we have at our disposal as data management
professionals have the potential to bring benefit or cause harm to people. All
data that is processed, with very few exceptions, impacts on people in some
way. Whether it is data that allows us to identify a person and make a determination about their eligibility for a loan, or data that trains an artificial
intelligence system that provides sentencing recommendations to judges, or
whether it is data about the performance of a car’s engine in environmental
impact tests, the outcomes that result from the processing of that data affect
people through an impact on privacy, a potential for bias in decision making, and an impact on governmental policies and climate change investments. With the incredible increase in computing power, data-gathering capabilities and machine-learning technologies, the potential for harm to be
caused to individuals or to groups of society, either by accident or by inten-
tional design, is significant. With significant power comes significant risk.
Because of this, it is more important than ever that people who work with
data – whether we call ourselves information management professionals,
researchers, data scientists, knowledge workers, developers – have an ap-
propriate grounding in ethics so that decisions that are made in relation to
the management and use of information are based on sound ethical princi-
ples, and so that potential consequences and harms arising from our deci-
sions are taken into account in making these decisions.
As the potential fields of application for information technologies and
data capture and analytics technologies continues to expand, information
management practitioners need to be better equipped to extrapolate ethical
issues from first principles. When you learn a programming language you
usually start with some common core fundamentals such as variable defini-
tion and <If><Then><Else> logic. To start to get to grips with ethics in the
context of data management you need to go back to some basic ethical
concepts to give a common foundation from which to build your under-
standing.
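The programming analogy can be made concrete. The snippet below is a minimal, illustrative sketch of the two fundamentals mentioned, variable definition and <If><Then><Else> logic, using Python and an invented loan-eligibility rule (the figures and the threshold are purely hypothetical, not a real lending criterion):

```python
# Variable definition: naming a value so it can be reasoned about.
applicant_income = 30000   # hypothetical applicant data
loan_amount = 12000        # hypothetical requested loan

# <If><Then><Else> logic: even this toy rule embeds design assumptions
# (here, that income alone is a fair basis for an eligibility decision).
if applicant_income * 0.4 >= loan_amount / 3:
    decision = "refer to underwriter"
else:
    decision = "decline"

print(decision)
```

Just as these two fundamentals underpin every program, the basic ethical concepts introduced in this chapter underpin the decisions that a rule like this encodes.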
The evolution of technology has gone hand in hand with the evolution
of ethics since the earliest times. Much of that parallel evolution has cen-
tred on the discussion and debate about the ways in which evolving tech-
nologies might best be applied, and some early discussions of the ethics of
technological development centred on new technology for recording and
disseminating information.
Plato, writing in the Phaedrus in 370 BC (Plato, 2009), recounted Socrates’
view of the new information management technology that was emerging in
Greece at that time, writing: ‘This discovery of yours will create forgetfulness
in the learners’ souls, because they will not use their memories; they will trust
to the external written characters and not remember of themselves’.
Socrates was describing here the impact on memory and the transfer of
information in an oral tradition. His warning was that this new-fangled
writing would create the illusion of memory but not the reality of recall and
understanding. Later in this dialogue between Socrates and Phaedrus, Socrates
says that, with the advent of written characters, people ‘will appear to be
omniscient and will generally know nothing’.
In recent decades, we’ve seen a small example of the displacing of memory
skills onto technology with the development of mobile phone technology.
Depending on how old you were when you got your first mobile phone, you
may have a mental ‘contacts list’ of phone numbers that you memorized.
Today with smartphones that easily have memory for hundreds of numbers,
many of us have ‘outsourced’ that memory activity to our phones, integrated
with a much larger list of addresses, email addresses and other information
about people in our contacts list. Daragh can remember the contact numbers
of people he knew before he got his first mobile phone in university, even if he
hasn’t dialled the number in decades, but he struggles to remember newer
numbers. Katherine has a different but related memory issue more directly
related to data management; she struggles to remember who some of the peo-
ple in her phone contacts list are and how she knew them. At a much larger
scale, the illusion of memory described in Plato might also be seen as a wide-
ranging consequence of the primacy of the written word as a way of encoding
memory over the past several thousand years. In many ways, cultures with
long-developed literary traditions forgot the value of oral traditions in the
encoding and transmission of memory, dismissing deep historical and scien-
tific knowledge of indigenous peoples as ‘merely’ myths and legends. More
recently, researchers and scientists have begun to identify the scientific and
historical knowledge in indigenous oral traditions (Terry et al, 2021).
Socrates’ words could be easily adapted to any of the emergent technologies in data management and would still be as relevant and provocative of
debate as they were when first recorded 2,500 years ago. At their heart is a
fundamental truth that in any technology there are both benefits and risks.
Jump forward a couple of millennia, and philosophers are still talking
about the meaning of our relationships to technology. Martin Heidegger
argued that modern technology allowed us to have new relationships with
the world that were not previously possible (Heidegger, 1977). Heidegger
considered the ways of thinking that lie behind technology and saw technol-
ogy as a means to an end, not an end in and of itself. But the advent of
technologies changed the relationships between people and the other objects
in nature that we interact with and, through the operation of technology, a
new nature of a thing can be revealed. For example, the use of mining tech-
nology can turn a beautiful hillside into a tract of land that produces coal or
iron ore. He also gave the example of technology being used to manipulate
uranium to produce nuclear energy, but highlighted that this process can have either a destructive or a beneficial use.
When discussing technology, we often use similar phrases interpreted in
a way that suggests that ‘technology is ethically neutral’ and that ‘it just
depends on how you use it’. But this interpretation does not consider the
many decision points that go into the design and development of any tool or
technology. At each stage, there is an assumption made as to what some-
thing should be and how something should work. Many of these decisions
have ethical impacts, and some tools are designed to be much better at producing some outcomes than others.
For an example of technology much less fraught with explosive potential
than nuclear energy, think about a pair of scissors. A tool as simple as scis-
sors requires a number of technologies and manufacturing capabilities, and
decisions that have ethical impacts. The mining and metallurgical technolo-
gies that create the metal for the blades, the plastics moulding technologies
that produce the handles, and the manufacturing technologies that assemble
the scissors, all have their impacts on people and the environment they are
in. This is the greater social and environmental context of decisions and
ethical impacts that surround the design decision for a simple tool. However,
the design of the simple tool itself also can affect the dignity of the people
who use them. Scissors aren’t a neutral technology if you are a left-handed
person using a pair of scissors made with the design assumption that the
user is right-handed. The assumptions underlying the decisions we make as
to what information should be included in a data set or design specification,
how technology should work, what is a ‘good’ result, and the characteristics
of our default ‘user’ can have direct effects on people and their experience of
the tool. Whether scissors are designed as an ergonomic tool to be used by a
right-handed person, a left-handed person, or both, as an ambidextrous set
of scissors, can have an impact on individuals in terms of the usability of the
scissors. The thinking and considerations that are applied to the design and
application of technology affect (in this case in a literal way) how objects in
the real world are manipulated.1
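The same point about default design assumptions applies directly to software. The sketch below is hypothetical (the settings class and function names are invented for illustration): default parameter values quietly encode who the ‘default user’ is, just as a pair of scissors encodes a right-handed one.

```python
from dataclasses import dataclass

# Hypothetical UI-settings sketch: each default below plays the same role
# as the right-handed assumption moulded into a pair of scissors.
@dataclass
class UiSettings:
    handedness: str = "right"   # assumption: the default user is right-handed
    font_size_px: int = 12      # assumption: the default user has typical vision
    language: str = "en"        # assumption: the default user reads English

def primary_button_side(settings: UiSettings) -> str:
    # Place the main action under the dominant thumb.
    return "right" if settings.handedness == "right" else "left"

print(primary_button_side(UiSettings()))                   # the unexamined default
print(primary_button_side(UiSettings(handedness="left")))  # usable only because it is overridable
```

The ethical work here is not in the branch itself but in noticing that the assumption exists and making it explicit and overridable.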
Unfortunately, it’s not just the aching arm of a left-handed paper cutter
that might be manipulated by technology. The potential for people to be
manipulated using technology is also a concern, and one that is increasingly
topical given the growing concerns about the abuse of social media and as-
sociated technologies to influence people’s moods, market products and ser-
vices, and influence elections. While there are potentially significant benefits
from the technologies in question, they create significant ethical questions
about their impact on individual choice and agency.
In the same time frame as Heidegger, Mario Bunge argued that ‘the tech-
nologist must be held not only technically but also morally responsible for
whatever he designs or executes: not only should his artifacts be optimally
efficient but, far from being harmful, they should be beneficial, and not only
in the short run but also in the long term’ (Bunge, 1977). This echoes the
sentiment in Recital 4 of the EU General Data Protection Regulation, which
says that ‘the processing of personal data should be designed to serve mankind’ (European Union, 2016).

Privacy and the environment: two examples of ethical questions in data
If you have heard people talking about the capabilities of data or boosting
digital transformation, big data, the adoption of AI in organizations or many
other data topics in the past decade or so, you will probably have heard some-
one say that ‘data is the new oil’. We dislike this metaphor for many reasons.
The problems with the ‘oil’ metaphor are many, and it highlights two different
areas of ethical issues or ethical risk in data that we will introduce in this
chapter as examples of the range of ethical questions you may encounter.

The evolution of privacy and technology


One of the great risks identified in the application of modern data management
capabilities and analytical tools is violation of the fundamental human right to
privacy. However, this is not the first time that new developments and applica-
tions of technology have raised concerns about the preservation of the right to
privacy in the face of the danger of its dissolution by use of technology.
Samuel D Warren and Louis D Brandeis founded their argument for a
‘right to privacy’ in the light of developments in business and technology.
They referred to the basic principle in common law ‘that the individual shall
have full protection in person and in property’, stating that ‘it has been
found necessary from time to time to define anew the exact nature and ex-
tent of such protection’ (Warren and Brandeis, 1890) and arguing that
changes in political, social and economic realities reveal the need for new
legal recognition of human rights. They argued that ‘recent inventions and
business methods call attention to the next step which must be taken for the
protection of the person, and for securing to the individual what Judge
Cooley calls the right “to be let alone”.’
Warren and Brandeis conceptualized privacy as a basic fundamental right
to the development of the human as an autonomous individual with thoughts
and feelings. Essentially, they argued for the preservation of human dignity
in two forms: individual autonomy and the development of personality, and
the preservation of public face. In doing this, they presented two models for
understanding privacy that reflect the different ways privacy rights are ap-
proached in the United States and in Europe. While in the United States case
law regards privacy with a focus more on the idea of privacy as related to
intellectual property or as a ‘liberty right’, Warren and Brandeis also clearly
linked the ‘right to privacy’ to first and fourth amendment rights, with a
clear emphasis on privacy rights as fundamental to the dignity of the indi-
vidual. One can read this as an argument for privacy as a ‘personality right’.
Bart van der Sloot has identified a clear link between the conceptualization
of privacy rights in a European context to the German concept of
‘Persönlichkeitsrecht’, referring to Article 2, paragraph 1 of the German
Constitution, which states that ‘Everyone has the right to the free development
of his personality insofar as he does not violate the rights of others or offend
against the constitutional order or the moral code’ (van der Sloot, 2015). In
this, ‘privacy’ is not just a negative right to be ‘left alone’, but a positive right
to be free to develop one’s personality as an autonomous human.
Building on the concepts of privacy as a right related to human dignity as
Warren and Brandeis framed it, Stanley I Benn defined privacy in the context
of respect for the person as an autonomous individual or ‘chooser’.
Essentially, Benn framed the violation of privacy as a failure to respect per-
sonhood (Benn, 1971; Hudson and Husack, 1979). This human rights focus
brings us back to first principles, with an understanding that privacy as a
right upholds the treatment of a human as an autonomous individual, a
‘chooser’ who must be treated as an end, not just a means. The conceptual-
ization of the individual as ‘chooser’ directly relates to the need to be able to
actively and knowingly consent to the processing of one’s information and
the purposes for which it is processed.
In the wake of human rights violations perpetrated in the years leading
up to, during and after the Second World War, European nations have
adopted a strong fundamental rights approach to privacy, regarding privacy
as a necessary right fundamental to the respect for human dignity. This fundamental rights-based focus is reflected both in the institution of an overarching data protection directive, and in Articles 7 and 8 of the Charter of Fundamental Rights of the European Union, which has binding treaty power.
This rights-based understanding of privacy has a deep history in European
philosophy and ethics, which are based in philosophical understandings of
personhood and the individual, including Immanuel Kant’s formulations of
the categorical imperative.2 In tracing back our understanding of privacy to
first principles, we may uncover the foundations of an ethical framework for
new developments in technology and actions.
This ethical approach ultimately finds expression in many of the funda-
mental principles of data privacy and data protection laws, which attempt
to provide a legal framework to give effect to the ethical values of privacy as
a defined right.

Data ethics and environmental concerns


The impacts on people’s privacy from data processing is one of the topics
often most closely linked to data ethics, but the ethical issues in information
management are much broader than a single issue. Even the European for-
mulation of data protection rights broadens the questions of the impact of
processing data about people, from impacts on privacy to impacts on other
fundamental rights and freedoms. Another focus for questions and concerns
in data ethics can also be seen in the problems with ‘data is the new oil’ as a
slogan. One of Daragh’s responses to the ‘new oil’ cliché has been to point out that maybe hitching your wagon to the images of oil spills and climate disaster associated with fossil fuel in the 21st century, and comparing the risks of data breaches to the risks of toxic waste spills, isn’t the best strategy.
These analogies raise questions of the effects of our actions on the environ-
ment, another key area of ethical risk in data and information management
that has come under increasing attention in recent years.

Volkswagen’s ‘dieselgate’ scandal


In 2015, a scandal rocked the automobile industry when it was revealed that
Volkswagen had installed ‘defeat devices’ in over 11 million vehicles over
the past eight years, software that could detect whether its diesel cars were
running under test conditions rather than normal operating conditions for
road driving, and adjust the engine’s performance under test conditions, es-
sentially to cheat emissions tests. The environmental impact of this was that
these cars were generating up to 40 times the acceptable pollution emission
limits for diesel vehicles. The fallout of the scandal resulted in financial penalties of more than $30 billion, criminal penalties including the imprisonment of one of VW’s senior executives, and the firing or suspension of several
others. A longer-term result of the scandal was a strategic pivot on
Volkswagen’s part to focus on electric vehicles, and a resulting overhaul of
its operational structures (Daiss, 2019; US EPA, nd). The impact of the
scandal extended beyond a single company, though. The scandal brought
scrutiny to the standards in manufacturing of the entire industry, and in
2016 the International Organization for Standardization (ISO) and the
International Automotive Task Force (IATF) introduced a new revised auto-
motive industry standard for quality management systems to introduce eth-
ics requirements for the first time (Boler, 2016).
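The essence of a defeat device can be reduced to a single conditional. The sketch below is emphatically not Volkswagen’s actual software; the function names and detection signals are invented for illustration. It shows how one branch on inferred test conditions turns an engine-control program into an instrument of regulatory deception:

```python
# Schematic illustration only: a branch on data that infers "am I being
# tested?" changes what the regulator's instruments will measure.

def looks_like_emissions_test(steering_angle_deg: float,
                              speed_matches_standard_cycle: bool) -> bool:
    # Bench tests hold the steering wheel still while following a
    # standardized speed profile, a detectable data signature.
    return steering_angle_deg == 0.0 and speed_matches_standard_cycle

def engine_calibration(steering_angle_deg: float,
                       speed_matches_standard_cycle: bool) -> str:
    if looks_like_emissions_test(steering_angle_deg, speed_matches_standard_cycle):
        return "low-emissions mode"   # the ethical failure lives in this branch
    return "normal mode"

print(engine_calibration(0.0, True))    # on the test bench
print(engine_calibration(12.5, False))  # on the road
```

Nothing in the code is technically sophisticated; the harm lies entirely in the design decision the conditional expresses.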
Why are we telling a story about the car industry in a book on data eth-
ics? Data processing doesn’t happen in a vacuum. There are always choices
being made, and even the most abstract theoretical calculations can often
result in real-world effects. In the case of Volkswagen, you may be asking
how this is a ‘data’ question. The decisions leading up to the scandal in-
volved ethical questions relating to the use of data analytics in regulatory
reporting, and the result was the direct impacts of the emissions released on
the environment and the global climate crisis. We will discuss this more in-
depth as a case study in Chapter 4. For the moment, the software, comput-
ing and data analytics involved in the modern automobile industry are an
example of how the ethics of data processing and environmental concerns
are linked.

Digital waste and the carbon cost of processing data


In the 1990s, boosters hyped the ‘paperless office’ as the way of the future
and an environmental solution reducing the amount of paper waste in land-
fills. Decades later, while offices still aren’t ‘paperless’, we have also come to
a greater realization that, in the words of Gerry McGovern (2020), ‘digital
is physical’ and the energy cost of every action we take on computers is
significant. One could argue that the ‘paperless office’ has instead become an
office of digital waste, one of the forms of waste produced by a digital econ-
omy. Often when we focus on the environmental cost of waste, we focus on
the more tangible and visible effects of electronic waste, the discarded de-
vices used to process data. Research from the World Economic Forum
(2019) has identified e-waste as ‘the fastest growing waste stream in the
world’ and digital waste is also increasing exponentially.
The ecological impact of e-waste from the technology we use to process
data is inextricably linked to ethical questions regarding the effect on the
environment of ‘digital’ or data processing. But with the focus of this book
geared specifically towards data ethics or ethical information management,
our focus is more on the less tangible – but no less significant – impact of the
carbon costs of energy-intensive data processing. Think about how much
energy it takes to create, replicate, store, access and process this data in different ways. Because we don’t often see the physical storage of our ‘paperless’ data environment, it’s easy to have a hoarding or wasteful attitude towards this data. In the words of Gerry McGovern (2020), ‘Digital encourages extreme waste and an extreme waste mindset’.
The exponential increase in our use of energy over the centuries is mir-
rored by the exponential increase in the amount of data in existence. In 1981,
Buckminster Fuller described the amount of time across history that it took
for the information accumulated and distributed in the world to double.
Before 1900, it took approximately 100 years. By the end of the Second
World War, information was doubling every 25 years. By 1982, he projected
it would double every two years. A 2006 report from IBM reflected Fuller’s
discussion of the knowledge doubling curve, projecting that by 2010 the
world’s information base would double every 11 hours. IBM’s report, titled
‘The toxic terabyte’, highlighted a risk of Moore’s law regarding the expo-
nential growth in capacity of computer chip design and the ‘data explosion’
it enabled, identifying ‘the toxic terabyte’ as a ‘threat to business efficiency’,
which would require ‘changes in business and organizational behaviour’ as
well as technical tools to enable information life cycle management (IBM,
2006). This report identified the ‘data explosion’ as primarily an information
management problem and highlighted that ‘data-dumping’ behaviour is also
an environmental threat, asking ‘Who is – or should be – beating the drum for a “greener” approach to data creation and storage?’ The issues highlighted
in the 2006 report have been intensified by the last decade’s focus on ‘big
data’ and maximalist approaches to computing and analytics. Estimates are
that up to 90 per cent of data we store is essentially waste. In many cases, big
data (one of the biggest information management buzzwords of 2014–2016)
exemplifies this. Daragh’s perspective of big data from working in the tele-
communications industry and managing ‘big data’ before it became a buz-
zword has been that big data information management is simply information
management at scale, and that the same management and governance should
apply. We’ve observed a strong trend towards larger and larger data lakes
and larger, more energy-intensive automated tools to search through the un-
structured and structured big data to sort it.
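Fuller’s ‘knowledge doubling curve’ is simply exponential growth, and its implications can be checked with back-of-the-envelope arithmetic. The figures below are illustrative only, derived from the doubling intervals quoted above:

```python
# Exponential growth under repeated doubling: volume = base * 2**doublings.
def volume_after(doublings: int, base: float = 1.0) -> float:
    return base * (2 ** doublings)

# Doubling every 2 years (Fuller's 1982 projection), over one decade:
print(volume_after(10 // 2))         # a 32x increase in 10 years

# Doubling every 11 hours (the 2006 IBM projection for 2010), over one week:
print(volume_after((7 * 24) // 11))  # a 32,768x increase in 7 days
```

The contrast between the two results is the point: once doubling intervals shrink from years to hours, storage and energy demands stop being a background cost and become a governance problem in their own right.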
Our turn to the ‘digital’ has resulted in a significant amount of energy use
going to the processing of data in various ways. It has been estimated that
internet usage alone consumes 1–2 per cent of global energy supply (Kiernan,
2021; Obringer et al, 2021). This puts the internet’s energy consumption on a par with that of the world’s top five energy-consuming countries.

Environmental impacts of AI technologies


As various AI solutions become integrated into our day-to-day use of data
and technology, a number of researchers in computer science have begun to
sound the alarm on the carbon cost involved in the processing power re-
quired for computing, warning that the computing power required to train
and develop models may be a ‘significant contributor to climate change’.
The carbon footprint of data processing is a consideration for all comput-
ing, but the energy costs of deep learning and training large models are
particularly notable. The power requirements for the computations involved
in machine learning are significant. These costs and the expense of training
models also affect the accessibility of resources to engage in research.
This aspect of the environmental concerns in data ethics has challenged
the business models of some of the biggest players in the tech world. This
was brought to public attention well beyond the usual audience for research
in AI ethics, when the publication of a paper by a group of researchers from
academia and tech ‘On the dangers of stochastic parrots: Can language
models be too big?’ (Bender et al, 2021) coincided with Google precipi-
tously firing two of its star researchers in AI ethics who were co-authors on
the paper, raising questions of ‘ethics washing’ and corporate ‘capture’ of
academic research. This influential paper raises a number of ethical ques-
tions on the environmental and societal implications of the trend in NLP
research to move towards ever more massive language models.

Environmental impacts of data centres


The increase in computing capacity and maximalist approach to computing
can be seen in the growth of data centres globally. Cloud computing has
taken over, for what are in many cases very sensible reasons relating to ef-
ficiency, economies of scale and the convenience of offerings, as well as the
complexities of maintaining separate servers with the capabilities of cloud
service providers. The data centre gives physical form to a ‘service’ sold on a metaphor of intangibility. It’s a truism often ignored that ‘the cloud is just other people’s computers, or servers’, and one of the most visible manifestations of the energy cost of computing and data processing is the data centre – if you are located near one. Otherwise, the physical reality, the environmental costs and the trade-offs involved in running and cooling the data centres that provide computing ‘as a service’ remain obscured, even intangible.
The building and use of data centres involves trade-offs, balancing potential harms and benefits to communities and the environment. The balancing of these trade-offs can be extremely contentious in regions that host
many data centres. While cloud computing can be potentially much more
energy-efficient than the equivalent individual hard drives and servers that
the cloud services replace, the energy costs are often immense, and the ben-
efits of those efficiencies may not be seen directly by the communities bear-
ing the costs. In an article on the environmental impact of Ireland’s strategic
promotion of being the ‘data capital of Europe’, Aoife Kiernan (2021) de-
scribes some of the tensions between the ‘major impact on the environment’
of hosting one of the biggest hubs of data centres in Europe and some potential mitigations to the energy costs. For example, cooling systems that
increase efficiency by using sensors to determine where energy would best be
spent to cool the system, or integrated development of a data centre that
uses excess heat in a community district heating system to heat buildings in
the area. However, the concentration of data centres in a location that has
been considered ideal due to climate, resources and infrastructure has
strained the country’s energy grid to its limits and impacted the country’s
ability to meet climate targets.
The environmental and ethical concerns and trade-offs regarding the re-
source consumption of data centres are an example of questions in data
ethics that require engaging with structurally at political and community
planning levels, not simply at a level of individual or organizational decision
making. It also illustrates how organizational decision making may exter-
nalize costs or impacts, affecting individuals and communities that might
not have been considered as stakeholders. This will require nuanced discus-
sion, as decisions like Singapore’s 2019 ban on building new data centres
may only displace effects and do not address the demand for data processing
power (Mah, 2021).

Some proposed approaches to mitigate environmental harms

While the environmental concerns raised by our usage of data are massive
and may seem intractable, significant energy is being devoted to finding
ways to address the harms to the environment caused by data use. The fol-
lowing are a few examples of ways in which researchers and practitioners
are seeking to mitigate the carbon impact of data.

Carbon tracking and reporting


Awareness and understanding of the often invisible costs of data processing
is foundational to changing behaviour, and many people are working on
solutions to help track, report and communicate these costs to help people
make decisions. Projects such as the Carbon Literacy Project (carbonliter-
acy.com) and carbon calculators that measure the footprint of AI or other
activities can, with varying accuracy, inform people on the carbon costs of
projects and support them in reducing emissions.
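At their core, most such calculators reduce to energy used multiplied by the carbon intensity of the electricity grid. The sketch below is a minimal illustration; the intensity figure is an assumed placeholder, where real calculators use regional, time-varying grid data:

```python
# kg CO2e = power draw (kW) x running time (h) x grid intensity (kg CO2e/kWh).
GRID_INTENSITY_KG_PER_KWH = 0.4   # assumed average; varies widely by grid and hour

def carbon_kg(power_draw_kw: float, hours: float,
              intensity: float = GRID_INTENSITY_KG_PER_KWH) -> float:
    """Rough estimate of emissions for running a workload."""
    return power_draw_kw * hours * intensity

# e.g. a hypothetical 10 kW server rack running continuously for 30 days:
print(carbon_kg(10, 30 * 24))   # about 2,880 kg CO2e under these assumptions
```

Even a crude estimate like this makes an otherwise invisible cost discussable, which is the behavioural point of carbon tracking and reporting.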
miettiessään hän painalsikin liipasinta, ikäänkuin näyttääkseen: juuri
noin!

Lukusalissa oli paljo muitakin ylioppilaita rauhallisesti lukemassa


kuka sanomia, ken mitäkin.

Yht'äkkiä kuuluu paukaus ja Florian Guldén peittyy ruudin savuun.


Kaikki hypähtävät seisomaan. Savun hiukan hajotessa näkevät he
Florianin vetävän taskustaan revolveria ja pakenevat suin päin ulos
salista.

Florian kauhistuu. Hänkö on ampunut ja tässä rauhoitetussa


paikassa.
Ei, se on mahdotonta. Mutta tuossahan on revolveri ihan omassa
kädessä.
Ja reisi tuntuu kostealta ja niin kummalliselta. Mitä? vertako?

Florian tuskistuu. Muistamatta revolveriaan nostaa hän molempia


käsiään otsaansa kohti. Oikea käsi on vielä samassa asennossa
kuin sen lauaistessa revolveria taskussa. Aseensuu osuu juuri
ohimon kohdalle, vähän viistoon ylös päin. Etusormi vavahtaa ja
piipusta tuiskahtaa luoti luusta sisään.

Se teki tehtävänsä, vaikka olikin lennähtänyt käskemättä. Ase


putosi raukeasta kädestä, pää retkahti eteen päin.

Kun hyvään aikaan ei enää kuulunut mitään, rohkastuivat


muutamat ulos paenneista ja palasivat katsomaan tuota mieletöntä,
joka muka itsemurhan paikakseen valitsi juuri ylioppilasten lukusalin.

Onneton oli tapaturman uhrina kuoleman teossa! vähän vielä


vääntelehti ja vavahteli ruumis.

Lääkärin apua oli aivan käsillä, vaan turha vaiva. Nutun


rintataskusta pilkisti paperin nurkka. Toivoen siitä saatavan selitystä
kamalaan tekoon, otettiin se käsille ja luettiin.

Se oli tuo Florianin kirjeen suunnitelma isällensä. Hän siinä kuvaili


tilansa tukaluutta ja arvaeli siihen joutumisensa syitä sekä lausui
lopuksi:

"Jos olisin nuorempana päässyt johonkin käsin kosketeltavaan


työhön,
sen toki olisi täytynyt totuttaa tosi elämään, vaan nyt häilyn ilmassa.
Isä, voi miksi miksi et ajoissa pannut minua jonkin käsityön oppiin?
Ehkäpä se ei vielä nytkään ole liian myöhäistä."
VASARAKAUPPA.

I.

Hiljakseen vieriskeli markkinoilta toisille kuljeksivan kauppiaan


raskas tavarakuorma kauniina syyspäivänä illan suussa pitkin
maantietä, joka hyvästä täytteestä kovana kumisi alla kuin kallio.
Toisissa, omissa kärreissään vähän keveämmän kuorman lisänä
istui itse kauppias, teräväsilmäinen, nuorenpuolinen mies vielä
nuoremman vaimonsa kera. Hevosien paluuttajaksi mukaan lähtenyt
poika ajoi varsinaista kuormaa.

"Kuulepas, Mari", virkkoi kauppias erästä pitkää mäkeä noustessa,


"ei tämä markkinoilla vaelteleminen enää oikein lyö leiville. Liian
monta on ruvennut samaa neuvoa pitämään."

"Mitäpä meillä on muutakaan neuvoa. Vielä ovat ansiot liian pienet


ajatella yhteen paikkaan asettumista. Paljo noita on kauppiaita
kaupungeissakin."

"Eihän meidän olekaan pakko asettua kaupunkiin."

"No mutta, Jaakko, vielä vähemmin kannattaa ruveta maalle


asettumaan. Johan nyt on maakauppiaitakin melkein joka kylässä.
Ja vaikeaksi käy vähillä varoilla kilpailla entisten kanssa."

"On tällä kulmalla vielä paljo sellaisiakin seutuja, joista on


likimmäisiin kauppiaihin hyvä heilaus matkaa. Katselehan vähän
ympärillesi esimerkiksi tänä iltana, kun tästä ensimmäinen kylä tulee,
johon muuten yöksikin jäämme."

Siihen pysähtyi pakina, kun lähdettiin alamäkeä laskemaan ja se


antoi kylliksi hevosen ohjaamistyötä. Eipä puhe jatkunut sitte entistä
tasaisemmallakaan tiellä, sillä luonto alkoi isonlaisen salon jälkeen
muuttua ja vetää puoleensa huomiota. Vasemmalta päin aukesi
muutamassa tien mutkassa eteen suuri järvi ja sen rannalta näkyi
loitompaa jo viljeltyjä seutuja. Hevosetkin siitä vilkastuivat ja läksivät
halukkaammin taloja tavoittamaan.

Tie kohosi vähitellen harjanteelle eli kylänseljälle ja kohta vierivät


kuormat ensimmäisten mökkien ohitse, jopa viimein muutamain
talojenkin, jotka sekä ulkoasultaan että puhtaudeltaan olivat tavallisia
itäsuomalaisia: eivät kehua kannattaneet, mutta toimeen niissä
tultiin, ehkä välistä huonommastikin.

Sen sijaan näkyi pelloilla tiheässään ruiskekoja sekä ohra- ja


kauranärtteitä, jopa joitakuita ylivuotisiakin todistamassa, että
Harjulan kylässä ei ainakaan leivän puutteessa eletty.

"Katsos noita!" huomautti kauppias vaimollensa. "Onko köyhyyttä


vai mitä arvelet?"

"Eivätpä näytä viljan puolesta köyhiltä. Mutta miten heillä rahaa


lienee?" kysyi Mari, tajuten täydellisesti miehensä ajatukset, että
juuri Harjula hänen mielestään varsin hyvin soveltui kauppiaan
koetustantereksi.
"Miss' on ruista, siin' on aina rahaakin", vakuutti Jaakko lyhyeen ja
käänsi jo hepoansa erääsen pihaan, johon kyytipoika oli edeltä
kuorminensa kadonnut karjotan nurkan ta'itse.

Jättäen pojan kuorman vartiaksi riensi Jaakko vaimoineen


porstuaan, jonka molemmilla puolilla oli isot tuvat. Oikean puolisesta
kuuluvat ihmisäänet neuvoivat sisään neuvottelijoita kääntymään
sinne päin, ja Jaakko entiseltä muistilta pian löysi oven, vaikka
ikkunattomassa porstuassa jo olikin melkein pilkko pimeä, kun vielä
oven alapuoliskokin oli kiinni.

"Hyvää iltaa! terve taloon!" toivotti Jaakko.

Isäntä, roteva, kesk'ikäinen mies, Matti Vanhanen vastasi:

"Tulkaa tervennä! Kas, Kauppinenhan se onkin, eikä Jaakko enää


näy olevankaan yksinään."

"Ikäväksipä tuo alkoi aika yksin käydä. Tämä on vaimoni, ja nyt


olen näyttelemässä vähän maailmaa hänellekin."

"No, höllittäkäähän nuttuja!"

"Teemme, teemme sen työn, jos yösijaa talosta annetaan. Vaan


ensin: Vieläkö se, teidän liiterinne on lukollinen ja sopisiko sinne
meidän kuormamme? niitä on kaksikin."

"Ei lukosta juuri kehumista, vaan säilyy täällä kuormat vaikka


maantielläkin. Saatetaanhan nuo sentään ajaa sinne, jos sattuisi
yöllä sadetta tulemaan."

"Pankaas, hyvä emäntä, vettä ja hiiliä samovaariin; sill'


aikaapahan lämpiää, kuin käyn kuormat korjaamassa. Minä tuon
sieltä tieheiniä, älkää panko omianne."

Isäntä läksi Jaakon kanssa toimittamaan suojaa sekä kuormille


että myöskin hevosille, joiden muusta hoidosta sai poika pitää
huolen.

Emäntä otti uunilta teekeittiön, kävi täyttämässä sen vedellä ja


ryhtyi hiiliä sytytellessään pakinoimaan tupaan jääneen Marin
kanssa.

"Mistä se Kauppinen onkaan kotoisin? Ei tuota ole tullut kysytyksi,


vaikka on hän jo monesti käynyt meillä."

"Viipurin puolestahan me olemme molemmat."

"Oletteko jo kauankin olleet naimisissa?"

"Vasta neljä kuukautta."

"Eikö se alinomainen matkaaminen väsytä? En minä vain jaksaisi


sitä kestää. Ei rauhaa milloinkaan."

"En tiedä ajan pitkään. Nyt ei vielä ole väsyttänyt. Onpa hauska
vähän liikkuakin."

Jaakon ja isännän tulo keskeytti puhelun, vaikka eiväthän he olisi


sitä kieltäneet; muuten vain naiset luonnostansa puhelevat
mieluisemmin keskenään ja miesten läsnä ollessa supajavat, ell'ei
heiltä erittäin kysytä.

Jaakko kantoi eväskorin penkille pöydän päähän, joka, ollen


veistetty kahdesta suuresta juurikasta, seisoi keskellä peräseinää,
leveämpi pää ikkunaa, kapeampi ovea kohti. Avattuaan korin kaivoi
hän siitä ensin esille teetä ja sokuria sekä tyydytti talon pikku pojan
uteliaita silmäyksiä nostamalla aika ripun vesirinkelejäkin pöydälle.
Vesi herahti pikku Matille suuhun, mutta oi hän sentään julennut
pyytää, vaikka kyllä kovin mieli teki.

"Maltahan, kunnes teevesi kiehuu, saat sitte sinäkin", lohdutteli


Jaakko pienokaista.

"Onko tässä muita kyliä likellä?" kysyi hän melkein samaan


jatkoon isännältä.

"On tässä kyliä montakin, vaikk'ei aivan tien varrella; eivät ne juuri
näy maantielle."

"Miten varakasta tämä seutu on? Tässä kylässä kyllä näkyy


olevan kasattu kekojakin."

"Onhan tuota tultu toimeen. Tässä kylässä ei ole yhtään kitutaloa,


ja eläjiä ne ovat muutkin. Hyvät on maat meillä; jos keitä elos rupeaa
huononemaan, niin on syy työn teossa eikä maissa."

"Niin, niin, elää ahkera huonommallakin paikalla, laiska kituu


missä hyvänsä."

"Ei sitte saattane olla ketään, joka haluaisi myödä taloansa?" jatkoi
Jaakko, vähän aikaa vaiti oltuaan ja mietiskeltyään, miten paraiten
saisi tiedustelluksi niin paljon kuin mahdollista paikkakunnan oloja.

"Ei ainakaan tässä kylässä kenenkään mieli tehne luopua leivästä,


tuskinpa muissakaan näillä seuduin."

"Kun kaikki ovat varakkaita, miksi eivät rakenna itselleen parempia


kartanoita? Vai ovatko metsät huonot?"
"Eipä moisissa vikaa; kyllä niissä rakennustarpeita olisi toisellekin
antaa, mutta ei tule tehdyksi; aika menee niin tarkoilleen maatyön
tekoon."

"Ja rahdin vetoon. Eikö niin?"

"Käydään vähin rahdissakin, vaan ei muut paljon kuin köyhimmät


talot. Ne, ne siellä hyöstyttävät vieraita pihoja ja jättävät omat työnsä
tekemättä. Vaan sieltähän ne sentään hankkivatkin rahaa kaikkiin
maksuihinsa."

"Mistäs muut saavat?"

"Voista enimmäkseen."

Taas pysähtyi puhe.

Teetä juodessaan korvattomasta kahvikupista ja lohkolaitaiselta


teevadilta, jollaisia talon molemmat toisetkin parit olivat, katseli
Kauppinen niitä hyvin pitkään ja kysäsi sitte hetkisen päästä:

"Tottahan sentään vouraamalla saisi jonkun maatilkun, jos ken


haluaisi kartanon paikaksi?"

"On tässä maata sen verta liikenemään, mutta kukas tänne suotta
rupeaisi kartanoja rakentelemaan? Ei täällä kannata muiden elää
kuin oikean talonpojan, eikä talonpoika tule toimeen paljailla
kartanoilla, hän tarvitsee paljon muutakin."

"Olisiko esimerkiksi teillä sellaista joutavaa kartanon paikkaa,


tynnyrinalan verta?"
"Mitä, aiotteko te jättää matkustelemiset? Vai mitä varten sitä niin
kyselette?"

"Sepähän sitte nähtäisiin. Muuten vain kysyn."

"On tässä puolen virstan päässä sievä mäenkunnas, jota itsekin


olen katsellut kartanon paikaksi. Mutta eipä suinkaan minulla enää
saane muutetuksi, kun ei tullut nuorempana tehdyksi. Siihen näkyy
järvikin niin hyvästi, vaikk'ei tähän. Ja maat siinä ympärillä ovat ihan
yhtä hyvät kuin tässäkin."

"Pitäisipä oikein käydä katsomassa."

"Tänä iltana ei näe enää, vaan huomeisaamuna sopii käydä."

"Olisipa lähdettävä aikaisin edelleenkin."

"No, ette ainakaan pimeässä vielä lähde. Se on ihan tien varrella."

Seuraavana aamuna Kauppinen kävi Vanhalan isännän kanssa


ennen hevosten valjastusta mäenkunnasta katsomassa, mielistyi
siihen ja vourasi sen ynnä tynnyrinalan peltomaata viidestäkolmatta
markasta vuodelta, josta hinnasta myöskin kauppiaan elukkain piti
saaman käydä yhteisellä laitumella talon karjan kanssa.

Kun vourakirjat oli tehty, joista kiireessä jäi pois ajan määrä, vaikku
Kauppinen oli mielessään aikonut viittäkymmentä vuotta, läksi hän
kuormillensa ajamaan, luvaten markkinoilta palatessaan tarkemmin
määrätä kartanoiden sijan ja sopia kivijalkain teosta jo samana
syksynä ennen maan jäätymistä. Matti Vanhanen lupasi erittäin
sovitusta maksusta toimittaa kiviä paikalle jo siksi ja sitte työn aikana
lisää tarpeen mukaan.
Hyvillään näin oivallisesta kaupasta läksi Matti pellolle maantien
varrelle syyskyntöä jatkamaan.

Päivällisajan lähestyessä astui tietä myöten Kylänpään Sakari,


toisen talon isäntä, kirves takana vyön alla.

"Jumal'avuks'!" toivotti hän Matille, joka juuri painalsi aatran


syvempään kiinni pientaren laitaan ja ryhtyi päästämään hevosta
aisoista samalle neliölle haeskelemaan niukkoja ruohon korsia.

"Jumal' ann' apua!" vastasi hän. "Mistäs sinä kaalat?"

"Kävinpähän tuolla pieleksiä korjailemassa; olivat vähän


kallistuneet; pojan veitikat ovat kiskoneet pois tukia. Jospa minä ne
saisin kiinni!"

"Mistäpä heidät arvaat? Jätä rauhaan."

"Täytyyhän jättää pakostakin. — Hyvinhän sillä Kauppisella oli


vahvat kuormat."

"Niin oli. Kohta hänestä tulee meidän kylän asukas", ilmoitti Matti
riemumielellä uutisensa.

"Hm, vai niin!"

"Mitä vai niin ja hm?"

"Niin tuota, minä en olisi hänelle toki taloani antanut."

"No, eihän hän minunkaan taloani saanut; ainoastaan yhden


tynnyrin alan joutavaa maata."
"Katsohan vain, eikö vie taloasi kokonaankin kuin entinen piru,
joka sai miestä sormesta kiinni ja vei sitte koko miehen."

"Älä hätäile, hyvä naapuri, onhan sitä nyt minullakin sentään vähä
järkeä; en minä rupea taloani syömään enkä hävittämään."

"Hyvähän olisi, jos niin kävisi. Mutta ole varoillasi, ja muista minun
sanani, jos hullusti käypi."

"Pannaan, pannaan mieleen. Mutta pidä vain itse varasi. Ei


meidän talomme kaltainen toki tyhjästä häviä."

"Hyvä olisi! hyvä olisi!" jupisi Sakari vielä mennessään ja pudisteli


päätänsä.

Matti oli tällä välin jo noussut aidan yli maantielle ja astui kotiinsa
päivälliselle.

"Kateeksi taisi vähän pistää Kylänpään Sakarille", kertoi Matti


syödessään kotiväelle, "kun ei Kauppinen osunut vouraamaan
häneltä kartanon sijaa. Hyvin näytti tyytymättömältä, kun kerroin
hänelle kaupat."

"Mitäpä tuossa tuon vertaisessa on kadehtimistakaan", arveli


emäntä.

"Sitähän minäkin itsekseni ihmettelin, vaan siltä se kuitenkin


kuului. Ja vielä hän varoitteli minua, että en koko taloani menettäisi.
Jopa nyt jotakin! Vai luuleeko hän nyt yksin olevansa kaiken
maailman viisas sen tähden, että ennen pikku poikana kierteli
isävainajansa kanssa voisaksana! Eipä se hänelle itselleen
luonnistanutkaan, vaikka isänsä kyllä kostui."
Tuskin ehti Vanhalan perhe nousta ruoalta, kuin Sakari astui
tupaan. Kaikkein silmät suurenivat, niin oudolta tuntui nyt tutun
naapurin tulo. Tervehdittyä tarjosi Vanhalan isäntä
tupakkakukkaroansa, jossa oli kotikasvuista ja Venäjän lehtiä
sekaisin hakattuna.

Sakari pisti piippuun ja istui kotvasen ääneti penkillä. Viimein, kun


ei Matti alottanut puhetta, virkkoi hän:

"Minä vieläkin ajattelen sitä Kauppisen tuloa meidän kyläämme.


Etkö sinä voisi ennemmin antaa Notkonmäen kumpua torpan
paikaksi jollekin hyvälle työmiehelle kuin Kauppiselle?"

"Ei se nyt enää peräydy, kun on kerran tehty kirjat."

"Hm! vai niin! Sepä oli paha, se!"

"Minkä tähden? Ja mitä sinulla on Kauppista vastaan? Eikö hän


ole rehti mies?"

"No, enhän minä hänestä mitään pahaa tiedä erittäin, mutta en


erikoista hyvääkään."

"Mitäs sitte valittelet taikka oikeastaan tarkoitat kaivelemisellasi?


Sano suoraan!"

"Tulee puoti liian lähelle kyläämme. Se minusta on pahasti, hyvin


pahasti."

"Mikäpä vaara siinä sitte olisi?"

"Kun on puoti lähellä, tulee siitä aina otetuksi tavaraa enempi kuin
etäältä haetuksi, ja se on vaarallista, se, sanon minä. Mutta koska ei
enää käy auttaa, niin olkoon sitte."

Ja hyvästeltyään läksi Sakari, päätänsä vieläkin pudistellen.

II.

Lähempänä talvea Kauppinen sitte saapui vaimoinensa Harjulaan ja


asettui Vanhalan toiseen tupaan. Viisi kuormaa oli hänellä
mukanansa kauppatavaraa, vaan ei mitään huonekaluja eikä muuta
omaksi tarpeeksi varustettua. Mari oli tosin tahtonut hankkia
kaupungista niitäkin, mutta Jaakolla oli toiset tuumat, että ehtiihän
huoneet sisustaa sittekin, kuin ne ensin saadaan rakennetuksi.

Nyt heti ryhdyttiin työhön, ja pian tuli kivijalat tehdyksi; mutta kun
Harjulan kylän miehet, joita yksinomaan oli tässä pohjatyössä, tulivat
Kauppiselta rahaa pyytämään, ei sitä häneltä koskaan irronnut.
"Ottakaa
tavarassa!" vastasi hän aina lyhyeen.

"Eipä minulla ole tavaran tarvista", koetti monikin vastustaa.

"No, tulkaahan sentään katsomaan," houkutteli Kauppinen, "eikö


mitään löytyisi, jota talossa tarvitsisitte." ja kun miehet kerran
astuivat tavaravaraston ääreen, alkoi hän näytellä ja kehua:

"Katsokaas, miten hyviä piippuja, eivätkä maksa kuin puoli


kolmatta markkaa, vaan nämä helapiiput neljä. Ja tässä oivallista
nahkaa, eikös kelpaa tehdä saappaiksi!"

Samaan tapaan kävi hän varastonsa alusta loppuun saakka, eikä


aikaakaan, niin joku aina suostui ottamaan naulan muka parasta
kahvia, kirjavata saippuaa tai muuta rihkamaa.
Pieni riippuvaaka seisoi pöydällä tuvassa oven suussa, jossa
Kauppisen puhtaimmat tavarat olivat, jota vastoin peräpuoli oli
hänellä ainoana asuinhuoneena. Oli vähän omituista, että tuota
vaakaa ei koskaan nähty seisomassa tyhjiltään tasapainossa,
ehkäpä sen tähden, että siinä tavallisesti aina oli parinaulainen paino
toisella laudalla silloinkin, kuin ei ollut mitään punnittavana.

Astuipa kerran Kylänpään Sakari tuohon kauppatupaan. Harvoin


oli häntä tie tänne vetänyt ja vielä harvemmin Kauppisen työhön, jota
hän oli ainoastaan kerran ohi astuessaan pysähtynyt katsomaan.
Sen tähden Kauppinen nyt vähän oudostui, katsoi pitkään Sakaria ja
pikku kansipyttyä, joka hänellä oli kädessä, ja kysäsi:

"No, mitäs pytyssä on? Ruvetaankos kauppojen tekoon?"

"Olisihan tässä vähä voita, jos hinnoilla sovitaan."

"Miks'ei sovita. Jos otatte tavarassa, maksan minä 70 penniä;


rahassa vain 60."

"No, kahvia minä siitä ottaisin. Punnitkaahan voi ensin."

Kauppinen nosti pytyn vaa'an ilmassa riippuvalle laudalle, joka


siitä painui alas, pani sitte sysäämällä lisäksi vastapainoa ja sai
painopuolen kolahtamaan pöytään, vaikka tavarapuoli heti jälleen sai
voiton. Mutta Kauppinen olikin sukkela ja samaa vauhtia nosti pois
vastapainoa, kuin oli sen pannutkin. Kaadettuaan sitte voin paperille,
viedäkseen perästä päin aittaan, punnitsi hän tyhjän pytyn ja sitä
tehdessä pyttypuoli vaa'asta kerran kolahti ja painopuoli jysähti
pöytään.

"Neljännestä vaille neljä naulaa!" julisti hän päätökseksi.


"Ei se nyt tainnut olla oikein", arveli Sakari. "Se juuri kotona
punnittiin ja siinä oli täyteen neljä naulaa."

"No, tässähän voi on paljaltaan; punnitaan uudestaan", myönsi


Kauppinen äkeissään, asetti voin laudalle ja antoi vaa'an nyt
itsekseen asettua tasapainoon. "Katsokaa itse! neljä luotia vaille
neljä naulaa."

"Kummapa se! Mutta pankaahan nyt sama paino kahvia ja sokuria


yhteensä."

"Ei siitä niin paljon saa, kahvi on kalliimpaa."

"Minä maksan lisää rahassa, pankaa vain!"

Kauppinen punnitsi paksuun paperiin sokuria ja tarpeettoman


isoon paperipussiin kahvia, kehoitellen vielä Sakaria katsomaan,
miten runsaasti hän pani. Mutta isännän huomaamatta olikin hän
siirtänyt painot sille puolelle, jossa voi oli äsken ollut, joten siis
myötävä tavara annettiin toisella vaakalaudalla, kuin ostettava
otettiin.

Sakari maksoi hinnan eroa 30 penniä naulalta ja läksi.

"En minä käsitä, mitä tuolla Kylänpään Sakarilla on mielessä


minua vastaan", lausui Kauppinen ihmettelyänsä vaimolleen. "Aina
hän vain katsoo niin karsaasti eikä tee kauppaa muiden ihmisien
tavalla."

Mari ei tuohon ehtinyt mitään vastata, kun silloin juuri tuli toisia
ostajia.
Vähän ajan päästä astui Sakari uudelleen tupaan, puntari toisessa
ja äsken ostetut tavarat toisessa kädessä. Hän tahallansa oli
satuttanut tulonsa niin, että muita syrjäisiä ei ollut läsnä.

"Kyllä nyt on tapahtunut erehdys", alkoi Sakari. "Tällä samalla


puntarilla minä itse lähtiessäni punnitsin voin, ja sitä oli runsaasti
neljä naulaa. Te siitä saitte neljä luotia vaille. Sitte käskin teitä
panemaan yhtä paljon kahvia ja sokuria, kun luulin, että, jos
punnitsimet ovatkin erilaiset, niin tottahan kuitenkin saan täältäkin
neljä naulaa oman puntarini mukaan. Mutta katsokaa itse, nämähän
painavat vain neljännestä vaille neljä."

"Katsoittehan te itse, kuin punnitsin?"

"Kyllä minä katsoin ja katson nytkin tästä."

"Syyttäkää sitte itseänne."

"Eipä niinkään. Tämä on kruunattu puntari ja voi on vielä tuossa;


punnitaanpas uudelleen. Tässä on rievunkaistale." Ja Sakari nosti
voin paperineen rievulle, solmitsi sen päät ja pisti puntariin.
"Katsokaas nyt, onhan tässä neljä naulaa."

"Niinpä näkyy olevan ja vielä paperinkin paino."

"No, antakaa sitte tällä puntarilla minulle myöskin kahvit ja sokerit."

"Saatte neljännesnaulan lisäksi."

"En minä huoli teidän sokuritoppa-paperistanne."

"Maksaapa se paperi minullekin."


"Sen tähden pitäkääkin se omananne. Minulla on oma
kahvipussini, ja se painaa kaksi luotia."

Sakari kaatoi kahvit kauppiaan pussista omaansa, pisti pussin


puntariin ja vaati lisää. Kauppias vastahakoisesti totteli, kunnes
puntari näytti kaksi naulaa kaksi luotia. Sitte Sakari sitoi pussin
koskea kiinni, pisti sokuripalaset suupuoleen ja sai siihenkin lisää.

"Jos tahdotte meidän kylässä ruveta kauppaa pitämään, niin


jättäkää pois vehkeileminen! sanon minä ajoissa. Kohtuullista voittoa
minä en kiellä ottamasta, mutta rehellinen peli. Ja nyt hyvästi tällä
kertaa! Laittakaa vaaka kuntoon toiseksi kerraksi. Arvaattehan, että
minä en pidä tätä asiaa salassa, eikä se keino siis auta. Hyvästi!"

"Pahaapa näkyy Sakarilla tosiaankin olevan mielessä meitä


kohtaan", vastasi Mari nyt vasta miehensä äskeiseen ihmettelyyn.

Kauppinen, mitään virkkamatta, päästi vaa'an seljän toisesta


päästä pois pienen lyijymurikan. Hänen kyllä mieli paloi halusta
kurittaa tuota häiritsijää, mutta mahdoton oli päästä häneen kiinni,
ainakaan vielä nyt.

Vanhalan isäntä, saatuaan aikaa myöten kuulla tämän tapauksen,


muisti itsekseen jotain samanlaista vajautta ja alkoi vähän katua
kauppaansa, että oli Kauppisen ottanut maallensa, mutta toivoi
hänen kuitenkin tästä jo viisastuvan. Sitä paitsi tuntui niin mukavalta
saada ihan ilmaiseksi tuo viisikolmatta markkaa vuodessa, vieläpä
lisää työn teosta nyt rakennusaikana, jopa hirsistäkin, joita oli
luvannut omasta metsästään itse vetää Kauppiselle. Tosin häntä jo
oli vähän kyllästyttänyt kiven veto, kun Kauppinen oli kiviä moittinut
huonoiksi ja sen tähden maksanut vähemmän, kuin oli luvannut;
vaan olihan siitä sentään ollut ansiota, vaikkapa vähempikin. Eikä
Vanhanen luullut Kauppisen toki osaavan hirsiä halventaa. Vielä nuo
olivat luvut pitämättä töistä ja muista saamisista sekä otoista
Kauppisen kanssa, joka oli aina vastannut, että ehditäänhän vielä
katsella; mutta Vanhasen muistaen piti siellä olla koko joukko rahaa
tallessa; joutipahan säästyä, kunnes tarve tuli.

Joulun jälkeen alkoi sitte hirren veto, kuin Kauppinen ensin kävi
noutamassa lisäksi tavaraa koko kymmenen kuorman. Vanhanen
veti vetämistään määrän mukaan, kunnes hirsikasa kasvoi aika
suureksi, ja pyysi sitte Kauppista mittaamaan.

"Ehditäänhän nuo katsoa", yritti Kauppinen taaskin sanomaan,


mutta läksi kuitenkin, kun Vanhanen huomautti, että pian sekautuvat
hänen puunsa toisten tuomiin, ja kahden kesken he astuivat
hirsiläjän luo.

"Eihän tuo ole viisisylinen", alkoi Kauppinen moittia, näyttäen


erästä, joka oli hiukan lyhempi toisia sitä runsaampimittaisia; "eikä
tämä täyteen kymmentuumainen, ja tämäkin on vajaa."

Vanhanen oudostui. Hän oli itse metsässä mitannut joka puun;


mitenkä ne nyt olivat vajaat?

"Kyllä pitää mitata uudestaan. En minä usko muuta kuin omia


silmiäni."

"No, en minä vielä tähän asti ole pyrkinyt ketään pettämään; mutta
mitattakoon!" taipui Vanhanen. "Minä vain käyn noutamassa rengin
avuksi liikuttelemaan."

Vanhanen läksi ja palasi tuokion kuluttua rengin sekä Kylänpään


Sakarin kanssa, joka viimeksi mainittu oli sattunut tiellä toisiin, ja
Vanhanen oli siitä pyytänyt häntä mukaan vain huvin vuoksi.

Kauppinen koetti milloin lyhyyden, milloin vääryyden ja milloin


hienouden tai muun syyn nojalla hyljätä joka toista puuta. Kyllä hän
tietysti olisi nekin pitänyt, mutta melkein ilmaiseksi, vaikka sovittu
hintakaan ei ollut suurempi kuin yksi markka kokonaisesta
viisisylisestä puusta.

"Jos nämä eivät kaikki kelpaa täydestä hinnasta, niin minä sanon,
että lyhyeen teidän kaupitsemisenne loppuu!" uhkasi suoravainen
Sakari.

"Kahden kauppa, kolmannelle korvapuusti", mörähti Kauppinen


äkäisesti, mutta taipui kuitenkin ottamaan kaikki ja hillitsi mielensä,
vaikka kyllä kynsiä kihelmöittikin antamaan aika korvapuustia tuolle
kiusan hengelle, joka aina oli hänen tiellänsä.

Ja tiellä Sakari tosiaankin oli Kauppisella kaikessa, kuin vähänkin


vääryyteen vivahti ja tuli hänen tiedoksensa. Hän aina meni omasta
takaa koettelemaan ja antoi aika nuhteet, jopa sopivia uhkauksiakin,
milloin ei kuulijoita sattunut olemaan läsnä. Eikä Kauppinen
puolestaan uskaltanut tehdä vastoin Sakarin viisaita, vaikka
kiusallisiakin neuvoja, kun pelkäsi ehkä kinastellen tulevan ilmi
asioita, joita ei olisi ollut aivan helppo selvitellä.

Olipa kuitenkin yksi kohta, johon ei Kylänpään Sakarin


varovaisuus ulottunut vaikuttamaan: luvun teko Kauppisen ja
kyläläisten välillä. "Eihän tuolla mitään kiirettä ole", vastasi
Kauppinen jokaiselle, antoi vain kaikille tarvitsevaisille nyt jo
rahaakin ja otti heillä voita ja muuta tavaraa. Niinpä Vanhalan
isäntäkin sai rahaa, milloin vain tahtoi, kunnes viimein ei enää
muistanutkaan niiden summaa, ja vielä vähemmin hän muisti muita

You might also like