Ceglowski - The Moral Economy of Tech.1607
http://idlewords.com/talks/sase_panel.htm
The Moral Economy of Tech
by Maciej Ceglowski
This is the text version of remarks I gave on June 26, 2016, at a panel on the
Moral Economy of Tech at the SASE conference in Berkeley. The other panel
participants were Kieran Healy (whose remarks are here), Stuart Russell and
AnnaLee Saxenian. We were each asked to speak for ten minutes, to an audience
of social scientists.
I am only a small minnow in the technology ocean, but since it is my natural
habitat, I want to make an effort to describe it to you.
As computer programmers, our formative intellectual experience is working with
deterministic systems that have been designed by other human beings. These can
be very complex, but the complexity is not the kind we find in the natural world.
It is ultimately always tractable. Find the right abstractions, and the puzzle box
opens before you.
The feeling of competence, control and delight in discovering a clever twist that
solves a difficult problem is what makes being a computer programmer
sometimes enjoyable.
But as anyone who's worked with tech people knows, this intellectual background
can also lead to arrogance. People who excel at software design become
convinced that they have a unique ability to understand any kind of system at all,
from first principles, without prior training, thanks to their superior powers of
analysis. Success in the artificially constructed world of software design promotes
a dangerous confidence.
Today we are embarked on a great project to make computers a part of everyday
life. As Marc Andreessen memorably frames it, "software is eating the world".
And those of us writing the software expect to be greeted as liberators.
Our intentions are simple and clear. First we will instrument, then we will
analyze, then we will optimize. And you will thank us.
But the real world is a stubborn place. It is complex in ways that resist
abstraction and modeling. It notices and reacts to our attempts to affect it. Nor
can we hope to examine it objectively from the outside. Those who run the
surveillance apparatus understand its capabilities in a way the average citizen
does not. My greatest fear is seeing the full might of the surveillance apparatus
unleashed against a despised minority, in a democratic country.
What we've done as technologists is leave a loaded gun lying around, in the hopes
that no one will ever pick it up and use it.
CONCLUSION
The first step towards a better tech economy is humility and recognition of limits.
It's time to hold technology politically accountable for its promises. I am very
suspicious of attempts to change the world that can't first work on a local scale. If
after decades we can't improve quality of life in the places where the tech elite
actually lives, why would we expect to make life better anywhere else?
We should not listen to people who promise to make Mars safe for human
habitation, until we have seen them make Oakland safe for human habitation. We
should be skeptical of promises to revolutionize transportation from people who
can't fix BART, or have never taken BART. And if Google offers to make us
immortal, we should check first to make sure we'll have someplace to live.
Techies will complain that trivial problems of life in the Bay Area are hard
because they involve politics. But they should involve politics. Politics is the thing
we do to keep ourselves from murdering each other. In a world where everyone
uses computers and software, we need to exercise democratic control over that
software.
Second, the surveillance economy is way too dangerous. Even if you trust
everyone spying on you right now, the data they're collecting will eventually be
stolen or bought by people who scare you. We have no ability to secure large data
collections over time.
The goal should be not to make the apparatus of surveillance politically
accountable (though that is a great goal), but to dismantle it. Just like we don't let
countries build reactors that produce plutonium, no matter how sincere their
promises not to misuse it, we should not allow people to create and indefinitely
store databases of personal information. The risks are too high.
I think a workable compromise would be to allow many kinds of data collection,
but strictly limit what anyone is allowed to store or sell.
More broadly, we have to stop treating computer technology as something
unprecedented in human history. Not every year is Year Zero. This is not the first
time an enthusiastic group of nerds has decided to treat the rest of the world as a