
Opinion

Relying on Algorithms and Bots Can Be Really, Really Dangerous


By Clive Thompson 03.25.13 6:30 AM

Machines can make decisions. That doesn't mean they're right. Illustration: Kronk

So you can't wait for a self-driving car to take away the drudgery of driving? Me neither! But consider this scenario, recently posed by neuroscientist Gary Marcus: Your car is on a narrow bridge when a school bus veers into your lane. Should your self-driving car plunge off the bridge, sacrificing your life to save those of the children? Obviously, you won't make the call. You've ceded that decision to the car's algorithms. You'd better hope you agree with its choice.

This is a dramatic dilemma, to be sure. But it's not a completely unusual one. The truth is, our tools increasingly guide and shape our behavior, or even make decisions on our behalf. A small but growing chorus of writers and scholars think we're going too far. By taking human decision-making out of the equation, we're slowly stripping away deliberation: moments where we reflect on the morality of our actions.



Not all of these situations are so life-and-death. Some are quite prosaic, like the welter of new gadgets that try to nudge us into better behavior. In his new book To Save Everything, Click Here, Evgeny Morozov casts a skeptical eye on this stuff. He tells me about a recent example he's seen: a smart fork that monitors how much you're eating and warns you if you're overdoing it. Fun and useful, you might argue. But for Morozov, tools like the fork reduce your incentive to think about how you're eating, and about the deeper political questions of why today's food ecosystem is so enfattening. "Instead of regulating the food industry to make food healthier," Morozov says, "we're giving people smart forks."

Or as Evan Selinger, a philosopher at Rochester Institute of Technology, puts it, tools that make hard things easy can make us less likely to tolerate things that are hard. Outsourcing our self-control to digital willpower has consequences: Use Siri constantly to get instant information and you can erode your ability to be patient in the face of incomplete answers, a crucial civic virtue.

Things get even dicier when society at large outsources its biggest moral decisions to technology. For example, some police departments have begun using PredPol, a system that mines crime data to predict future criminal activity, guiding police to areas they might otherwise overlook. It appears to work, cutting some crimes by up to 27 percent. It lets chronically underfunded departments do more with less. But as Morozov points out, the algorithms could wind up amplifying flaws in existing law enforcement. For example, sexual violence is historically underreported, so it can't as easily be predicted. Remove the deliberation over what police focus on and you can wind up deforming policing. And doing more with less, while a worthy short-term goal, lets politicians dodge the political impact of their budgetary choices.

And this, really, is the core of the question here: Efficiency isn't always a good thing. Tech lets us do things more easily. But this can mean doing them less reflectively too. We're not going to throw out all technology, nor should we. Efficiency isn't always bad. But Morozov suggests that sometimes tools should do the opposite: they should introduce friction. For example, new parking meters reset when you drive away, so another driver can't draft off of any remaining time. The city makes more money, obviously, but that design also compels your behavior. What if a smart meter instead offered you a choice: gift the remaining time to the next driver, or to the city? This would foreground the tiny moral trade-offs of daily life, city versus citizen.

Or consider the Caterpillar, a prototype power strip created by German designers that detects when a plugged-in device is in standby mode. Instead of turning off the device, a traditional efficiency move, the Caterpillar leaves it on but starts writhing. The point is to draw your attention to your power usage, to force you to turn the device off yourself and meditate on why you're using so much.


These are kind of crazy, of course. They're not tools that solve problems. They're tools to make you think about problems, which is precisely the point.
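A minimal, hypothetical sketch of the underreporting problem in the PredPol example above: two districts with identical real crime levels but very different reporting rates. A naive predictor trained only on reported incidents ends up steering most of the patrols to the better-reported district. The districts, figures, and patrol rule here are invented for illustration; this is not PredPol's actual method.

# Toy sketch (assumed figures, not PredPol's algorithm): a predictor trained
# on *reported* incidents inherits the reporting bias in its training data.

true_incidents = {"district_a": 100, "district_b": 100}    # identical real crime levels
reporting_rate = {"district_a": 0.90, "district_b": 0.30}  # e.g. sexual violence is badly underreported

# The only data the algorithm ever sees:
reported = {d: true_incidents[d] * reporting_rate[d] for d in true_incidents}

# A naive "predictive policing" rule: allocate patrols in proportion to reported crime.
total_reported = sum(reported.values())
patrol_share = {d: reported[d] / total_reported for d in reported}

print(reported)      # {'district_a': 90.0, 'district_b': 30.0}
print(patrol_share)  # district_a gets 75% of the patrols, district_b only 25%,
                     # even though the underlying crime levels are identical.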



52 comments


Trout007 · a month ago

This article is so funny. Take these two sentences: "Instead of regulating the food industry to make food healthier" and "Outsourcing our self-control to digital willpower has consequences." Isn't the first one just outsourcing your self-control to "political willpower"? That has even worse consequences. At least with the digital kind, you can choose whether to use it.

the_rat > Trout007 · a month ago

Dude, you took the words right off my keyboard! Isn't this an article about giving up choice to robots? What kind of a philosopher misses the point this badly? He complains about a device that leaves the choice to eat or not in our hands, and offers up government regulation of our eating habits as a better choice?

Dave O'Connor > Trout007 · a month ago

I agree with the general idea of the article but the sentences you quote are exactly what I was going to comment on.

Joaquim Ventura > Trout007 · a month ago

Exactly! I'm always amazed at how people mistrust machines, which have transparent (if open-source) rule sets and logic and give you nothing more and nothing less than what you put in (unless you want them to give you something emergent), yet put their faith in politicians and regulators who, on a good day, are just humans with human flaws and, on a bad day, are outright crooks. At least I can trust my smart fork not to be lobbied.

simonsmicrophone > Joaquim Ventura · a month ago

Yes, very true. However, your smart fork might be lobbied, because it's not open and it's connected to some corporation's network of dietary AI. Your average plod will buy it and worship it, not realizing they've sold part of their freedom for some coolness. I'm not sure it's actually cool; it's just what I've been told by the masses.



